How does the size of objects affect bepu efficiency?

chebob69
Posts: 40
Joined: Thu Aug 16, 2012 7:27 pm

How does the size of objects affect bepu efficiency?

Post by chebob69 »

Is bepu scale-invariant, such that the only relevant factor is the relative sizes of colliding objects? For example:

In scene A I have x cubes of length/width/height 1 interacting with a terrain where each terrain square is 3x3.
In scene B I have x cubes of length/width/height 0.1 interacting with a terrain where each terrain square is 0.3x0.3

Will bepu have a preference between the two scenes, or will they be equivalent in every way and run at the same speed?
Norbo
Site Admin
Posts: 4929
Joined: Tue Jul 04, 2006 4:45 am

Re: How does the size of objects affect bepu efficiency?

Post by Norbo »

Is bepu scale-invariant, such that the only relevant factor is the relative sizes of colliding objects?
The short answer is 'scale matters.'

On a high conceptual level, the core algorithms are not aware of anything but relative values. In terms of implementation, though, there are a bunch of tuning variables which make parts of the engine prefer a certain 'scale interpretation' by default. A 'scale interpretation' is the location of a sliding window along the axis of scale.

The default tuning values put this window around 0.5 to 10 units. In that range, things behave very solidly. Outside of the range, it's generally still acceptable, but drops off gradually. The further away from the optimal range and the closer the inspection, the worse it gets.

Generally, if objects get too small for the scale interpretation, you'll notice degraded contact stability as they approach the contact management tuning thresholds. Once those thresholds are passed (particularly CollisionDetectionSettings.ContactMinimumSeparationDistance, which defaults to 0.03), stability can drop extremely quickly. Performance can sometimes increase marginally in excessively small simulations, because the tuning factors cause algorithms to terminate earlier than they should, harming quality.
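To make the proportionality concrete, here's a minimal sketch (plain Python arithmetic, not the BEPUphysics API) of shrinking that contact threshold along with the simulation; the 0.03 default and the ContactMinimumSeparationDistance name come from the post above, while the helper function itself is hypothetical:

```python
# Default ContactMinimumSeparationDistance cited above.
DEFAULT_CONTACT_MIN_SEPARATION = 0.03

def scaled_contact_min_separation(scale):
    """Shrink the contact threshold proportionally with object scale.

    At 0.1x scale, 0.1-unit cubes would otherwise sit near the
    default 0.03 threshold and lose contact stability.
    """
    return DEFAULT_CONTACT_MIN_SEPARATION * scale

# A 10x smaller simulation wants a 10x smaller threshold:
print(scaled_contact_min_separation(0.1))  # approximately 0.003
```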

Objects which are too large for the window typically degrade more gracefully. If the default scale interpretation is in use and everything in the simulation is around 5-100 units, things typically still work fairly reasonably. Even 50-1000 is still functional, though not optimal. Performance can decrease with excess size because the epsilons are much smaller relative to the objects, preventing algorithms from finishing quickly.

It's important to note that the scale interpretation should be chosen mostly by the expected scale of interaction and inspection. If a player character is 1000 units tall, the player is probably primarily interacting with objects ranging from 250 to 5000 units, so the scale interpretation window should be centered around that range: a 500x multiplier applied to the default scale interpretation.
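As a rough illustration of that arithmetic (a hedged Python sketch, not part of BEPUphysics; the 0.5-10 window bounds and the 250-5000 interaction range are taken from the post above), the multiplier can be derived by comparing the geometric center of the default window with that of the expected interaction range:

```python
import math

# Default tuning window from the post: roughly 0.5 to 10 units.
DEFAULT_WINDOW = (0.5, 10.0)

def scale_multiplier(expected_min, expected_max):
    """Multiplier that recenters the default window on the expected range."""
    default_center = math.sqrt(DEFAULT_WINDOW[0] * DEFAULT_WINDOW[1])
    target_center = math.sqrt(expected_min * expected_max)
    return target_center / default_center

# A 1000-unit-tall character interacting with 250-5000 unit objects:
print(scale_multiplier(250.0, 5000.0))  # ~500, matching the estimate above
```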

Because large objects degrade slowly and player interaction matters most, it's usually okay to have pretty big objects in a simulation, even outside the window, when the player perceives them as huge. If they are perceived as huge, chances are they move slowly relative to their size (gravity is small relative to their size) and interactions are less stressful. Objects perceived as tiny (i.e. objects which are very small compared to the magnitude of their motion) are much harder to handle in comparison.

Check out the BEPUphysicsDemos ConfigurationHelper.ApplyScale method to see the tuning variables. I'd recommend looking at the development fork's version; it has been significantly improved. The ScaleDemo shows behavior with different sizes.

The absolute values of masses also matter. SolverSettings.DefaultMinimumImpulse allows constraints to early out. Heavy objects tend to early out more slowly, and light objects faster. If masses tend to differ significantly from the values used in the demos (around 1-20 units for the most part), scaling the DefaultMinimumImpulse proportionately could help stability or performance.
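The same proportional reasoning can be sketched in a couple of lines (plain Python, not the engine's API; the 1-20 demo mass range and the SolverSettings.DefaultMinimumImpulse name come from the post above, while the helper and the placeholder threshold value are hypothetical):

```python
def scaled_minimum_impulse(default_minimum_impulse, typical_mass,
                           demo_typical_mass=10.0):
    """Scale the solver's early-out impulse threshold with typical mass.

    demo_typical_mass is a rough midpoint of the ~1-20 unit masses
    used in the demos; both arguments are illustrative values.
    """
    return default_minimum_impulse * (typical_mass / demo_typical_mass)

# If typical masses are ~100 units (10x the demo midpoint), scale
# whatever the engine's current default is by the same 10x factor
# (0.001 here is a placeholder, not the real default):
print(scaled_minimum_impulse(0.001, 100.0))
```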

So:
In scene A I have x cubes of length/width/height 1 interacting with a terrain where each terrain square is 3x3.
In scene B I have x cubes of length/width/height 0.1 interacting with a terrain where each terrain square is 0.3x0.3

Will bepu have a preference between the two scenes, or will they be equivalent in every way and run at the same speed?
Scene A will likely behave better with default settings. As discussed above, there are more issues at play than just the absolute size, but there aren't many cases where 0.1-size blocks will behave better than 1.0-size blocks given the default scale interpretation. Performance will be very similar, though Scene B may actually have a slight speed advantage due to excessively early termination and bad behavior. Using ConfigurationHelper.ApplyScale(0.1) will make Scene B behave like Scene A behaves with default settings.

(At some point I need to polish up and put all of this in a piece of documentation on codeplex!)
chebob69
Posts: 40
Joined: Thu Aug 16, 2012 7:27 pm

Re: How does the size of objects affect bepu efficiency?

Post by chebob69 »

Lol thanks for the encyclopedic answer Norbo. I'll make sure I play things safe!