So even with the physics updating stopped, could it be that external interaction with the physics makes updating it take longer?
I don't know the implementation, so I can't say. I wouldn't rule it out.
The main problem is that the progressive slowdown is only seen on the Xbox; it does not happen on the PC. Therefore, the profiling tools that could give me percentage breakdowns aren't available. We have some form of profiling in the game that times sections, but it can only give out the ms data I posted first, not the percentage of time spent in each method.
The PC is an order of magnitude faster than the Xbox, so the progressive slowdown might still be present, just much better hidden. It may not be easy to reproduce it that way, though.
Unless there are platform-specific codepaths, running the CLR Profiler against the PC build will tell you if there are allocation issues.
For checking what's going on on the Xbox directly, the XNA Framework Remote Performance Monitor might help; it can tell you about allocations and GC counts as well.
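If the Remote Performance Monitor isn't convenient, a common trick for detecting collections from inside the game itself (on the Xbox's compact framework, which lacks GC.CollectionCount) is to watch a WeakReference. This is just a minimal sketch; the class name is mine:

// Hypothetical GC watcher: when the probe's target is collected,
// a garbage collection must have occurred. Call Update once per frame.
public class GcWatcher
{
    private WeakReference probe = new WeakReference(new object());
    public int Collections { get; private set; }

    public void Update()
    {
        if (!probe.IsAlive)
        {
            Collections++;
            probe = new WeakReference(new object());
        }
    }
}

Drawing the count on screen makes it easy to see whether collections line up with the slowdown.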
Ok, the main problem is just that I can't really glean any information from profiling: having not written the physics engine, how long it takes in certain parts doesn't mean anything to me. The ms data above was from the slowed-down period.
Unfortunately, my familiarity with the engine is not going to help much in identifying the problem. It's already relatively clear that it's not a standard expected performance problem sourced in high polygon objects or something trivial like that. That leaves things I can't really predict.
Fortunately, you don't need to know much about how the engine is implemented to flag something as strange.
Here are the rough guidelines to follow for the relative physics times:
1 ) Is any one stage greater than 40%? Small probability of something weird, unless there's a reasonable explanation (simulation type).
2 ) Is any one stage greater than 60%? Probably something weird going on, unless you specifically designed your simulation to stress that stage. For example: a world full of only hundreds of static objects will stress the broad phase, but not much else since no collisions or solving will happen.
3 ) Is any one stage greater than 90%? Almost certainly some shenanigans going on.
4 ) Most simulations see either the narrow phase or solver taking the top spot, both hanging around 20-40%.
5 ) Simulations with few dynamic objects and many static objects, like yours, put relatively more pressure on the broad phase.
6 ) Simulations with lots of dynamic objects floating in space will also pressure the broad phase more than the other systems, since sustained collisions are rare.
If something violates the above expectations, or if the physics time is just massive in absolute terms (say, growing to 100ms or something similarly ridiculous and obviously wrong), then it would be worth taking a closer look at the parts of the simulation. A quick automated check along those lines is sketched below.
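For instance, a tiny checker like this could flag a suspicious distribution automatically; the thresholds just mirror the guidelines above, and none of this is an existing BEPUphysics API:

using System;
using System.Collections.Generic;

public static class StageCheck
{
    // Flags any stage whose share of total physics time crosses the
    // rough 40/60/90% thresholds described above.
    public static void Report(Dictionary<string, double> stageMs)
    {
        double total = 0;
        foreach (double ms in stageMs.Values)
            total += ms;
        if (total <= 0)
            return;

        foreach (KeyValuePair<string, double> stage in stageMs)
        {
            double percent = 100 * stage.Value / total;
            if (percent > 90)
                Console.WriteLine(stage.Key + ": " + percent.ToString("F0") + "% - almost certainly shenanigans");
            else if (percent > 60)
                Console.WriteLine(stage.Key + ": " + percent.ToString("F0") + "% - probably something weird, unless the simulation is designed to stress it");
            else if (percent > 40)
                Console.WriteLine(stage.Key + ": " + percent.ToString("F0") + "% - possibly weird, look for a reasonable explanation");
        }
    }
}

Feeding it the broad phase, narrow phase, and solver times from your in-game timers each frame would surface a violation the moment the slowdown kicks in.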
Until then, though, I would suggest broadening the investigation.
1 ) Assume to start with that the problem is not primarily caused by any external library, be it .NET or BEPUphysics or XNA. It may involve those external libraries, but the core problem is likely elsewhere.
2 ) Attempt to reproduce the problem on the PC where you have better tools. Most problems transfer from PC to Xbox360 in the absence of platform-specific code; the PC is just better at hiding them.
3 ) Use every profiler you can to gather information. Learn what each one can do. Use both performance profilers and memory profilers.
4 ) Attempt to reproduce an isolated version of the problem. Disable chunks of code or whatever else allows you to narrow it down.
5 ) Rigorously record the incremental changes and their results. If something doesn't make sense, make sure to note the confusion; it's a hint!
6 ) If disabling something helps but you do not know the mechanism by which it helps, re-enable it and disable a smaller portion. Keep searching deeper, narrowing the causes.
7 ) While going deeper, always keep in mind you might only be looking at one tendril of the problem. The monster may have multiple limbs that you need to cut off before you can fully understand why it is happening.
8 ) Make sure you have detailed and helpful information. The more, the better. When you hit a tantalizing data point, go deeper. For example, in the earlier post, total physics and update time were similar (within a factor of 2), but the max Update time was 8x higher. That top-level data point alone isn't enough to know what's wrong; it just hints that something isn't quite right.
9 ) Spending a couple of hours to create a live visualization tool that graphs time spent in various targeted sections of code might save you hours of labor hunting through raw data; a starting point is sketched below. Measure, remeasure, reremeasure.
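As a starting point for that kind of tooling (points 8 and 9), something like the following tracks a rolling average and max per section, so an "8x max" anomaly like the one above stands out immediately. The class is my own sketch, not an existing XNA or BEPUphysics utility:

using System;
using System.Collections.Generic;
using System.Diagnostics;

// Hypothetical per-section timer. Wrap a code section with Begin/End
// each frame, then display AverageMs and MaxMs (or graph the history).
public class SectionTimer
{
    private readonly Stopwatch watch = new Stopwatch();
    private readonly Queue<double> history = new Queue<double>();
    private const int HistoryLength = 120; // about two seconds at 60fps

    public double AverageMs { get; private set; }
    public double MaxMs { get; private set; }

    public void Begin()
    {
        watch.Reset();
        watch.Start();
    }

    public void End()
    {
        watch.Stop();
        history.Enqueue(watch.Elapsed.TotalMilliseconds);
        if (history.Count > HistoryLength)
            history.Dequeue();

        double sum = 0, max = 0;
        foreach (double sample in history)
        {
            sum += sample;
            if (sample > max)
                max = sample;
        }
        AverageMs = sum / history.Count;
        MaxMs = max;
    }
}

The history queue doubles as the data for a live graph: render one bar per sample with a SpriteBatch, and a sudden divergence between AverageMs and MaxMs becomes visible the instant it happens rather than hours later in a log.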
Good luck!