Stepping the Simulation

DavidArppe
Posts: 4
Joined: Fri Oct 10, 2014 2:49 am

Stepping the Simulation

Post by DavidArppe »

Hi, my name is David. I have been using Bullet for a school project since September.
I have been stepping the simulation with a variable time step. I use the difference between the current system time and the system time from the previous frame. This works really well, but the simulation is jerky. It stutters, and I don't know how to fix that.
I tried using a fixed time step when I realized my physics was running at different speeds on different computers. Occasionally the game goes into slow motion, and this only happens with the fixed timestep. Aside from that, however, it is really smooth.
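In case it helps, my variable-timestep loop looks roughly like this (a simplified sketch with illustrative names like gameIsRunning, updateGame and render, not my exact code; dynamicsWorld is my btDiscreteDynamicsWorld):

    #include <chrono>

    // Measure the wall-clock time elapsed since the previous frame and feed it to Bullet.
    auto previous = std::chrono::high_resolution_clock::now();
    while (gameIsRunning)
    {
        auto now = std::chrono::high_resolution_clock::now();
        float deltaTime = std::chrono::duration<float>(now - previous).count();
        previous = now;

        // Variable time step: as far as I understand, with maxSubSteps = 0 Bullet
        // uses deltaTime as-is instead of quantizing it into fixed 1/60 s substeps.
        dynamicsWorld->stepSimulation(deltaTime, 0);

        updateGame(deltaTime);
        render();
    }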

I want to know whether I am stepping the simulation wrong, or whether anyone knows of a method to fix this. I read an article here: http://gafferongames.com/game-physics/f ... -timestep/

The description of 'Free the Physics' sounds like what I am looking for, but when I get to the part explaining what to do with the remaining time, it mentions interpolating between physics states. I don't know how to do that, or anything similar, in Bullet.

Can someone help me step my game so it doesn't go into slow motion or stutter?
Thanks!

Edit: I am programming in C++, using Visual Studio 2013
Basroil
Posts: 463
Joined: Fri Nov 30, 2012 4:50 am

Re: Stepping the Simulation

Post by Basroil »

DavidArppe wrote: I have been stepping the simulation with a variable time step. I use the difference between the current system time and the system time from the previous frame. This works really well, but the simulation is jerky. It stutters, and I don't know how to fix that.
I tried using a fixed time step when I realized my physics was running at different speeds on different computers. Occasionally the game goes into slow motion, and this only happens with the fixed timestep. Aside from that, however, it is really smooth.
Here are a few questions that might help pinpoint the issue:
  • Is your game rendering at ~60 fps or much slower?
  • Is your simulation step taking longer than your frame time?
  • Are you using MLCP solvers?
  • Are you using AI or heavy scripting "inside" the simulation loop?
Given that your fixed-timestep physics runs at different rates on different computers, it definitely sounds like something is taking longer than 16 ms on the slower computers.
DavidArppe
Posts: 4
Joined: Fri Oct 10, 2014 2:49 am

Re: Stepping the Simulation

Post by DavidArppe »

Hi Basroil. I hope I can answer all of your questions:
1) My FPS is constant, 60 frames a second.
2) My simulation doesn't always take longer than my frame time, but the game does go into slow motion occasionally. Either that or it stutters, and if it is the simulation, it is only happening on and off.
3) I am not using MLCP solvers, but should I look into that?
4) I am not doing a lot of heavy scripting inside of my loops.

Is there a way I can ask Bullet how long its calculations are taking?
Thank you!
Basroil
Posts: 463
Joined: Fri Nov 30, 2012 4:50 am

Re: Stepping the Simulation

Post by Basroil »

DavidArppe wrote: Is there a way I can ask Bullet how long its calculations are taking?
You can use the btQuickprof code to take a look at what takes time, though if you want more control over how the timing works you might just need to do it manually (or use an IDE-side profiler).
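For example, something along these lines dumps Bullet's built-in profiling tree (a minimal sketch; dynamicsWorld and deltaTime are assumed to come from your own loop, and Bullet must be built without BT_NO_PROFILE for the profiler to record anything):

    #include "LinearMath/btQuickprof.h"

    // Step the world as usual...
    dynamicsWorld->stepSimulation(deltaTime);

    // ...then periodically print the profiling tree that Bullet builds internally.
    static int frameCount = 0;
    if (++frameCount % 60 == 0)        // roughly once per second at 60 fps
        CProfileManager::dumpAll();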
drleviathan
Posts: 849
Joined: Tue Sep 30, 2014 6:03 pm
Location: San Francisco

Re: Stepping the Simulation

Post by drleviathan »

By default Bullet implements the "final step" as mentioned in that article. It uses a fixed substep with an accumulator for the spare fraction of the substep that didn't fit. This means that when the deltaTime argument to the stepSimulation() call is less than the fixed timestep, the physics engine may not even step -- it may just accumulate the time and step on a subsequent call once enough time has accumulated.

On the other side, when very large deltaTime values are passed (multiples of the fixed substep), Bullet will take at most maxSubSteps substeps per call to stepSimulation(). This is necessary to avoid the "spiral of death" that your article mentions. When the maxSubSteps limit kicks in, the physics engine runs slower than real time and is said to be "losing time".
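In pseudocode, the accumulator logic inside stepSimulation() works roughly like this (a paraphrased sketch, not the verbatim Bullet source; accumulator plays the role of Bullet's internal m_localTime):

    // Paraphrased sketch of Bullet's fixed-substep accumulator (not verbatim source).
    int stepWithAccumulator(float deltaTime, float fixedTimeStep, int maxSubSteps,
                            float& accumulator, void (*singleStep)(float))
    {
        accumulator += deltaTime;                              // add this frame's wall-clock time
        int numSubSteps = int(accumulator / fixedTimeStep);    // whole substeps that now fit
        accumulator -= numSubSteps * fixedTimeStep;            // keep only the spare fraction

        // Clamp to avoid the "spiral of death": substeps beyond maxSubSteps are
        // simply dropped, which is where the engine "loses time".
        int clamped = (numSubSteps > maxSubSteps) ? maxSubSteps : numSubSteps;
        for (int i = 0; i < clamped; ++i)
            singleStep(fixedTimeStep);                         // one fixed-size physics substep
        return numSubSteps;
    }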

The default settings for btDiscreteDynamicsWorld are as follows:

fixedTimeStep = 1/60 sec
maxSubSteps = 1
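These are just the default arguments of stepSimulation(); its declaration looks approximately like this (check btDynamicsWorld.h in your Bullet version):

    virtual int stepSimulation(btScalar timeStep,
                               int maxSubSteps = 1,
                               btScalar fixedTimeStep = btScalar(1.) / btScalar(60.)) = 0;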

In this configuration the physics engine won't lose time as long as the average deltaTime supplied is less than 1/60th of a second. When the average is larger, the accumulator will eventually grow to a total step greater than 2/60 sec; the simulation will then take only the single (max) substep and reset the accumulator to zero -- the simulation just lost time. When this happens often, the physics simulation will appear to be running slow.

If the physics engine is the source of the slowdown then this is a good thing: you don't want it to walk the spiral of death. However, if the slowdown is caused by something else, which is likely when the simulation is simple (Bullet is fast), you should increase the maxSubSteps value to 2, 3, or 4, depending on how low an FPS your game may run at. Given a large enough maxSubSteps and an inexpensive simulation, Bullet will happily compute the real-time positions of objects and the simulation will not appear to run slow even when the FPS is low.

If your game is running at a steady 60 FPS as you claim then I would suggest you bump maxSubSteps to 2 instead of 1. This will prevent a lost substep when the frame rate happens to dip slightly (say to 59 FPS), which would eventually accumulate a total timeStep of 2/60ths of a second.
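Concretely, that amounts to a call along these lines (dynamicsWorld and deltaTime being your own world pointer and measured frame time):

    // Allow up to two 1/60 s substeps per call so an occasional long frame
    // is fully consumed instead of being dropped as lost time.
    dynamicsWorld->stepSimulation(deltaTime, 2, btScalar(1.) / btScalar(60.));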

You might expect your simulation to glitch when Bullet happens to take two substeps instead of one; however, you're probably using a MotionState class to harvest the transforms, which means Bullet is actually providing an interpolated transform that takes the accumulator's unstepped time into account. This makes for a smoother visual result when Bullet occasionally needs to take an extra step to keep up with real time.
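If you are not already doing it that way, the usual pattern is to give each rigid body a motion state and read the graphics transform from it; a minimal sketch (assuming a dynamicsWorld already exists, with illustrative shape and mass values):

    #include <btBulletDynamicsCommon.h>

    // Create a dynamic body whose graphics transform is harvested through a motion state.
    btCollisionShape* shape = new btSphereShape(btScalar(0.5));
    btScalar mass(1.0f);
    btVector3 localInertia(0, 0, 0);
    shape->calculateLocalInertia(mass, localInertia);

    btDefaultMotionState* motionState = new btDefaultMotionState(
        btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo info(mass, motionState, shape, localInertia);
    btRigidBody* body = new btRigidBody(info);
    dynamicsWorld->addRigidBody(body);

    // After each stepSimulation() call, read the interpolated transform for rendering.
    // It includes the accumulator's unstepped time, so it looks smoother than
    // body->getWorldTransform() when Bullet takes an extra substep.
    btTransform graphicsTransform;
    body->getMotionState()->getWorldTransform(graphicsTransform);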

My guess as to what is happening in your game: you're measuring an FPS of 60 on average, but occasionally there is a very large frame time -- something in your game is blocking. That would cause the simulation to lose a lot of time at once, and the effect would be a stutter.

If the simulation is running slow but you really are calling stepSimulation() at 60 FPS, with an accurately measured deltaTime of about 1/60th of a second, then something very odd is going on.
DavidArppe
Posts: 4
Joined: Fri Oct 10, 2014 2:49 am

Re: Stepping the Simulation

Post by DavidArppe »

Thanks, everybody. I increased the substeps and now the game runs more smoothly. I also used a highpass filter to smooth a curve of the past frame-time history for the variable time step. This let me give Bullet a better estimate of how long the frame will take. I read about it in an article, but I can't seem to find it again.
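For anyone finding this thread later, the smoothing idea looks roughly like this; it's a simplified exponential-smoothing sketch, not the exact filter from the article, and the 0.1f constant is purely illustrative:

    // Smooth the measured frame time before handing it to Bullet so that a single
    // long frame does not cause a visible jump in the step size.
    float smoothDeltaTime(float measuredDt)
    {
        static float smoothedDt = 1.0f / 60.0f;         // start at the nominal frame time
        smoothedDt += 0.1f * (measuredDt - smoothedDt); // blend in the new measurement
        return smoothedDt;
    }

    // Usage each frame:
    //   dynamicsWorld->stepSimulation(smoothDeltaTime(deltaTime), 2);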