Can you clarify what you mean by separated?
The phone's sensors can be used to determine the direction of gravity separately from all other accelerations. If only the direction of gravity is used, then shaking effects will not be taken into account; it basically acts as an attitude/orientation detector. This sensor-separated gravity and nongravity acceleration is what the second demo below uses.
However, as I also explain later (the first demo below), the reason why it doesn't seem to work as expected is probably just scale.
I believe it has to do with the fact that in real life the box is moving and in my app, only gravity is changing.
It should not matter if the accelerometer is providing correct values.
The BEPUphysics Space is considered to be following the device. So, within the simulation, all values are local to the device. This means the mesh representing the world is static and unmoving (because it is, in fact, unmoving relative to the device).
So with a real box, if your frame of reference is that of the person shaking the box, you do indeed see the box moving. But imagine you had a camera attached to the inside of the box. The box would be stationary and the contents of the box would appear to be flying all over the place from changing accelerations. The source of those accelerations is the person-space gravity and shaking transformed into the box's local space.
It's a shift in perspective, but it is equivalent.
The issue may simply be that the forces exerted by shaking are too small to register at the low scales involved. If the gravity is set to the acceleration scaled by 500, you can start to see objects behave more like you'd expect. You can pile them into a corner and then launch them up and out with a quick shake. That motion still happened at lower gravity scales; it just resulted in a tiny amount of displacement.
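For reference, the core of that raw-input approach is tiny. It's roughly something like this (not the demo's exact code; the space field, the 500 scale, and any axis flipping needed to match your world axes are assumptions on my part):

Accelerometer accelerometer;
Vector3 latestAcceleration;
const float gravityScale = 500f;

void StartAccelerometer()
{
    accelerometer = new Accelerometer();
    // The event fires on a background thread, so just cache the value here
    // and consume it during the game's update.
    accelerometer.CurrentValueChanged += (s, e) => latestAcceleration = e.SensorReading.Acceleration;
    accelerometer.Start();
}

void UpdatePhysics(float dt)
{
    // The accelerometer reports in the device's local axes, which is the same frame
    // the Space lives in, so it can be used directly as gravity.
    space.ForceUpdater.Gravity = latestAcceleration * gravityScale;
    space.Update(dt);
}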
I made a little demo in the BEPUphysicsPhoneDemo to demonstrate this:
The above pure raw-input method does have side effects. Everything is consistent, but maintaining that consistency requires some really high relative gravity. It's almost as if there are tiny objects in a phone-sized box.
It uses a 120 Hz update rate to help with stability. This isn't really feasible for larger simulations on the phone, and I assume you'd prefer more controllable motion, where the gravity is more manageable but shaking still has a significant effect. This would break the consistency, so we need another approach.
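As an aside, that higher rate is just a time step setting. Roughly, assuming BEPUphysics v1 property names and an XNA GameTime:

// Halving the default 1/60 s internal step roughly doubles the simulation cost,
// which is why this doesn't scale well to bigger phone simulations.
space.TimeStepSettings.TimeStepDuration = 1f / 120f;
space.TimeStepSettings.MaximumTimeStepsPerFrame = 4; // cap catch-up work per frame

// Passing the elapsed time lets the Space take as many internal 1/120 s steps as it needs.
space.Update((float)gameTime.ElapsedGameTime.TotalSeconds);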
So, what we'd like to do is separate the gravity direction (or just as usefully, orientation) from the raw acceleration data, leaving us with the remaining non-earth-gravity accelerations. Then, we could scale gravity a bit to make it just fast enough for the gameplay and separately scale the nongravitational accelerations to a greater degree to introduce usable shakes and kicks.
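In code, the goal is something like this (the scale values are made-up tuning knobs, and the separated gravityDirection and shakeAcceleration fields are filled in by the Motion sketch at the end of this post):

// Scale the known gravity direction just enough for the gameplay, and scale the
// leftover 'shake' acceleration much harder so it has a visible effect.
const float gameplayGravityScale = 30f;
const float shakeScale = 200f;

space.ForceUpdater.Gravity =
    gravityDirection * gameplayGravityScale +
    shakeAcceleration * shakeScale;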
With a single datapoint from absolutely raw linear accelerometer input that includes some unknown gravity direction, it is not feasible to know exactly what part of the acceleration is due to shaking and what part is due to earth's gravity. So we have to cheat.
The 'simple shake gesture recognizer' approach I mentioned earlier is one option. A simple implementation that just watches magnitude could determine when a sufficiently large shake in the gravitational direction occurred, at which point some random impulses could be applied. A bit more work could be put in to extract some rough direction, perhaps based on the gravity readings from previous frames. If you notice a strong change in direction between frames, you can assume it's a shake attempt. This would require extra care with thresholds and smoothing to avoid introducing shakes due to sensor noise.
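Here's a bare-bones sketch of that recognizer; the threshold and smoothing values are placeholders that would need tuning against real sensor noise:

Vector3 smoothedAcceleration;
Vector3 previousSmoothed;
const float smoothing = 0.3f;       // 0 = use raw samples, 1 = ignore new samples
const float shakeThreshold = 0.6f;  // change between frames, in g's

// Call once per frame with the raw accelerometer reading.
bool DetectShake(Vector3 rawAcceleration, out Vector3 shakeDirection)
{
    previousSmoothed = smoothedAcceleration;
    smoothedAcceleration = Vector3.Lerp(rawAcceleration, smoothedAcceleration, smoothing);

    // A large change in the smoothed reading between frames is assumed to be a shake attempt.
    Vector3 change = smoothedAcceleration - previousSmoothed;
    if (change.Length() > shakeThreshold)
    {
        shakeDirection = Vector3.Normalize(change);
        return true;
    }
    shakeDirection = Vector3.Zero;
    return false;
}

When it returns true, you could loop over the dynamic entities and apply impulses roughly along shakeDirection, or just random ones as mentioned above.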
The most robust way to separate gravity from nongravity is to leverage additional sensors. If we assume the phone supports Motion, all the work is done for us. I made another demo showing this:
It works on my old Samsung Focus as expected; most of the other old phones should be able to run it too. Newer phones with fancier sensors will get higher quality data. If you encounter a phone which doesn't support Motion, the shake gesture recognition approach could be used as a backup.
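For completeness, the Motion-based separation boils down to something like this (not the demo verbatim, just the shape of it; the handler fires on a background thread, so the values get cached and consumed during the game's update):

Motion motion;
Vector3 gravityDirection;
Vector3 shakeAcceleration;

void StartMotion()
{
    if (!Motion.IsSupported)
    {
        // Fall back to the shake gesture recognizer approach here.
        return;
    }
    motion = new Motion();
    motion.CurrentValueChanged += (s, e) =>
    {
        // Gravity is the gravity direction in device space; DeviceAcceleration is
        // what's left over once gravity is removed (the shaking we care about).
        gravityDirection = e.SensorReading.Gravity;
        shakeAcceleration = e.SensorReading.DeviceAcceleration;
    };
    motion.Start();
}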