ArduSub: Using IMU for horizontal positioning


Is there a way to use only the on-board IMUs for (rough) positioning in the xy-plane? So far I've tried a few approaches with no luck. AP_InertialNav::get_position() returns (0, 0, z). I've concluded that this is because NavEKF2_core::getPosNED(…) detects no GPS or optical-flow sensor, so NavEKF2_core::validOrigin is false. I also tried watching the values of NavEKF2_core::outputDataNew.position, but the x and y elements don't react at all when I move the ROV (they stay within ±1).



Hi Magnus,

You can get a rough position from inertial data only. Keep in mind that it will drift quite a bit. @jaxxzer has spent some time testing this and has some videos on YouTube of the BlueROV reacting to position changes with only inertial sensing.

I’ll point him towards this post.


Hi Magnus. Unfortunately, the position estimate from the internal sensors is not accurate. I have one video where I briefly demonstrate the ROV reacting appropriately while trying to hold its position. That test had two purposes: first, to find out whether the autopilot could hold position with only the onboard sensors (it can't), and second, to translate the roll and pitch outputs from the navigation controller into forward and lateral outputs in ArduSub. The video I posted demonstrates that the second point works; I took that clip during a few seconds when the autopilot was behaving nicely and the drift was small. What the video does not show is the much longer time I spent resetting the origin over and over, only for the ROV to immediately zoom off in a random direction. In general, I found that the circle of error grows so quickly that the position estimate becomes unusable within seconds of resetting the origin.


In order to enable the position hold mode I had to make some pretty hefty hacks and disable several checks, such as ekf.position_ok(), which always returns false with no GPS or optical-flow sensor attached. I did not check what the relative position output of the autopilot was at that point, so I don't know whether the calls you mentioned would change once it is forced into a GPS-enabled mode with no GPS attached. One thing you can look at for now is the GPS location messages: they carry vx, vy, and vz fields representing the velocity estimate in the global NED frame. Mine usually shows a velocity of at least 10 cm/s while the ROV is sitting still.
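To make that last point concrete, here is a small Python sketch (the function names are mine, for illustration only). It assumes the message being read is GLOBAL_POSITION_INT, whose vx/vy/vz fields are int16 values in cm/s in the NED frame, and shows how a small spurious velocity estimate translates into position drift:

```python
# Hedged sketch (names are mine, not ArduSub's): GLOBAL_POSITION_INT reports
# the EKF velocity estimate as int16 fields vx/vy/vz in cm/s, NED frame.

def ned_velocity_ms(vx_cms, vy_cms, vz_cms):
    """Convert GLOBAL_POSITION_INT velocity fields (cm/s) to m/s."""
    return (vx_cms / 100.0, vy_cms / 100.0, vz_cms / 100.0)

def drift_after(seconds, speed_ms):
    """Position error accumulated if a spurious velocity persists."""
    return speed_ms * seconds

# A stationary ROV reporting 10 cm/s of spurious velocity drifts about
# 6 m over one minute:
vx, vy, vz = ned_velocity_ms(10, 0, 0)
print(drift_after(60.0, vx))  # → 6.0
```

So even a modest velocity error on a stationary vehicle quickly swamps any useful position estimate, which matches the behavior described above.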

There may be room for improvement here, but I have not looked into it, as I plan on having a GPS for my application.


If you would like to know more about how to do these kinds of tests, please join the conversation on Gitter.




Thanks for the helpful information! I was hoping it would be somewhat useful in a short time frame but oh well, back to the drawing board.


Could you point me to how to extract a rough position estimate using the IMU from the ROV's data stream?

I want to look at some post processing solutions using the IMU relative velocities and other external positioning information.

Some source code (or a paper) showing the IMU-to-NEU calculation would be appreciated.

You can find the IMU data in the SCALED_IMU message. The maximum output frequency of this message is 50 Hz. The same information is logged at 400 Hz to the SD card on the autopilot. An extended Kalman filter is usually used to convert the data from all of the sensors into a position estimate.
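To show why a naive IMU-only estimate drifts so fast, here is a minimal dead-reckoning sketch (not ArduPilot's EKF, and the function name is mine). It double-integrates SCALED_IMU-style body-frame accelerations, which are int16 values in milli-g per the MAVLink message definition, and deliberately ignores attitude rotation and bias estimation, which is exactly what a real filter has to handle:

```python
# Minimal dead-reckoning sketch (illustration only, not ArduPilot's EKF):
# double-integrate SCALED_IMU-style accelerations (xacc/yacc in milli-g)
# into an xy position. No attitude rotation, no bias estimation.

G = 9.80665  # m/s^2 per g

def integrate(samples_mg, dt):
    """samples_mg: iterable of (xacc, yacc) in milli-g. Returns (x, y) in m."""
    vx = vy = x = y = 0.0
    for ax_mg, ay_mg in samples_mg:
        ax = ax_mg * G / 1000.0  # milli-g -> m/s^2
        ay = ay_mg * G / 1000.0
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y

# One second of a constant 2 milli-g accelerometer bias at 50 Hz already
# gives about 1 cm of position error, and the error grows quadratically
# with time:
print(integrate([(2, 0)] * 50, 0.02))
```

A constant bias produces quadratic position error, so a few milli-g of uncorrected bias puts the estimate meters off within a minute.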

A post was split to a new topic: Increase MAVLink IMU data rate

Has anyone yet written a Python script that implements a Kalman filter using the stock sensors of the BlueROV?