Certainly the 100% deterministic version would be more user friendly, but it's not possible in
an actual system implemented in our universe.
After all, an asteroid could hit you, a freak run of simultaneous bit flips could defeat any level of redundancy, and so on.
Pure mathematics can be deterministic; implemented systems can't be, not completely. The functional question is: does it converge fast enough, to a high enough probability, that the risk of the convergence being false is minor and buried under all the externalities?
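
To make "fast enough" concrete, here's a rough back-of-the-envelope sketch. All the numbers are illustrative assumptions, not taken from any real system: suppose each extra round of checking halves the chance of false convergence, and compare the residual risk to an assumed hardware fault rate.

    # Illustrative only: how many rounds until the false-convergence risk
    # is buried under an assumed external hardware fault rate?
    per_round_false_rate = 0.5      # assumption: each round halves the doubt
    hardware_fault_rate = 1e-15     # assumption: rough per-operation fault rate

    rounds = 0
    false_convergence = 1.0
    while false_convergence > hardware_fault_rate:
        rounds += 1
        false_convergence = per_round_false_rate ** rounds

    print(f"after {rounds} rounds, false-convergence risk ~ {false_convergence:.1e}")
    # -> after 50 rounds, false-convergence risk ~ 8.9e-16

Under those (made-up) assumptions, a few dozen rounds already push the probabilistic risk below the background noise of hardware failure, at which point it's no longer the dominant source of error.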
It may be easier to explain something when its platonic ideal is deterministic, but since the actual implementation can't be completely deterministic, those explanations are inherently misleading.
