I was playing around with some basic physics simulations and noticed that changing the timescale gives radically different outcomes. I imagine this is because of floating-point precision problems.
Anyone know of any work-arounds or resources related to this topic?
I'd really like to experiment with a replay system that offers varying speed options and relies on rerunning the simulation.
The results are deterministic for any given time scale, but not across different time scales.
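For what it's worth, a minimal sketch of what I mean (a hypothetical damped spring integrated with explicit Euler, not my actual simulation): each run is perfectly repeatable for a fixed step size, but two different step sizes drift apart, since integration error depends on the step size on top of any float rounding.

```python
def simulate(dt, total_time=10.0, k=10.0, c=0.5):
    """Explicit Euler integration of a damped spring (unit mass)."""
    x, v = 1.0, 0.0              # initial position and velocity
    for _ in range(int(total_time / dt)):
        a = -k * x - c * v       # spring force plus damping
        x += v * dt              # position uses the old velocity
        v += a * dt
    return x

coarse = simulate(0.01)          # large timestep
fine = simulate(0.0001)          # small timestep
print(coarse, fine)              # same physics, different final positions
```

Running `simulate(0.01)` twice gives bit-identical results, which matches what I'm seeing: deterministic per timescale, divergent across timescales.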