Using CoreMotion, we can easily measure the pitch of an iPhone in real time, but it requires the phone to be stationary. When the phone is in motion (e.g. mounted in a vehicle), the sensor fusion algorithm starts to drift because of accelerometer noise (from vibrations and constant acceleration and deceleration) and is ultimately unable to measure gravity correctly. The gravity vector from the accelerometer is the critical reference for pitch and roll.
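For context, my current setup is roughly the following (just a minimal sketch; the update interval and reference frame are arbitrary choices for illustration, not a tuned configuration):

```swift
import CoreMotion

// Sketch of the basic CoreMotion pitch readout. Works fine while stationary,
// but the fused gravity estimate drifts once the vehicle accelerates,
// brakes, or vibrates.
let motionManager = CMMotionManager()

func startPitchUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0   // illustrative rate
    motionManager.startDeviceMotionUpdates(using: .xArbitraryZVertical,
                                           to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }
        // attitude.pitch is derived from the fused gravity estimate.
        let pitchDegrees = motion.attitude.pitch * 180.0 / .pi
        print("Pitch: \(pitchDegrees)°")
    }
}
```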
What is a good strategy or implementation to overcome this challenge? Will I have to rewrite the sensor fusion algorithm with some dynamic features that use other heuristics to separate user acceleration from gravity?
PS: I tried fusing GPS data to measure the slope of the road, but the accuracy was very poor (a sketch of that attempt follows below). I also considered using the altimeter, but that would still require GPS position or speed to compute the slope, and both are inaccurate.
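For reference, the GPS slope attempt was roughly along these lines (a sketch only; the accuracy setting and the 5 m distance threshold are illustrative values, not a tested implementation):

```swift
import Foundation
import CoreLocation

// Estimate road slope as rise over run between consecutive GPS fixes.
// In practice the altitude noise dominates, which is why the result was poor.
final class GPSSlopeEstimator: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var previous: CLLocation?

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        defer { previous = current }
        guard let previous = previous else { return }

        let run = current.distance(from: previous)       // horizontal metres
        let rise = current.altitude - previous.altitude  // vertical metres
        guard run > 5 else { return }                    // skip tiny, noisy steps

        let slopeDegrees = atan2(rise, run) * 180.0 / .pi
        print("Slope ≈ \(slopeDegrees)°, vertical accuracy \(current.verticalAccuracy) m")
    }
}
```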