Neural emulations are considered safe when below natural error rate

Our brains change from day to day. Sleep, experiences, learning, and many other processes influence our brains, and arguably our minds as well. One might argue that a neural emulation whose error rate falls below this natural baseline of day-to-day variation in human life is sufficiently "accurate" to be deployed. This is loosely analogous to how self-driving cars are considered safe once they are at least as safe as human drivers. However, one could also very well argue that it is precisely in this unpredictable deviation that human nature resides. This conflict has been briefly explored in Greg Egan's short story "Learning to Be Me".