Integrating external sensors into ArduSub with MAVLink

Hello,

I’m implementing external sensors on the BlueROV2, with the intention of providing it with the inputs necessary to use the position hold, auto, and guided features.

I have not been able to find documentation that clearly describes which MAVLink messages should be used for this purpose. However, I believe the following messages are relevant:

  • VISION_POSITION_DELTA
  • VISION_POSITION_ESTIMATE
  • GLOBAL_VISION_POSITION_ESTIMATE

Even if the messages above are the appropriate ones, I’m not sure whether they are ALL the viable messages, or whether they are the BEST options.

So with this in mind, I have the following questions:

  1. What are ALL the relevant MAVLink messages for integrating external sensors into ArduSub?

  2. If some messages overlap, which seems to be the case with VISION_POSITION_ESTIMATE and GLOBAL_VISION_POSITION_ESTIMATE, which message should be used in which scenario?

  3. If some messages are redundant, which might be the case with VISION_POSITION_DELTA and VISION_POSITION_ESTIMATE, will there be a problem if two or more redundant messages are used at the same time?

  4. VISION_POSITION_ESTIMATE and GLOBAL_VISION_POSITION_ESTIMATE both take the same arguments; however, their descriptions differ slightly for the x, y, and z arguments, where one is a global and the other a local position. What is the difference between global and local position?
    a. Are they referring to different coordinate systems, i.e. ECEF or NED?
    b. Do their origins differ; that is, where does x, y, z = 0.0 refer to in either case?

I hope to get a good overview of which MAVLink messages are available for external sensor integration. Thank you for any help regarding this.

Hi @martinbj,

There is relevant documentation in ArduPilot’s developer wiki for providing position estimates through MAVLink.

It is admittedly somewhat sparse, and hasn’t been fine-tuned for ArduSub-specific usage, but it’s at least a starting point, and you’re welcome to ask follow-up questions here (or in the more general ArduPilot forum) for details you’re not clear on.

Other sensors are typically integrated through other means, like as an External AHRS.

To start with some context: the autopilot uses an Extended Kalman Filter (EKF) for estimating its state, including its position. For resolution and intuitiveness reasons, the internal position estimate is kept in a local, cartesian coordinate system (so forwards, sideways, and upwards make sense as directions, and don’t need to be converted into geodesics or spherical coordinates). It’s therefore best if position estimates can be determined and provided in that local frame of reference, rather than converting to and from global coordinates at every point.
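
For a concrete picture of what providing a local-frame estimate looks like, here’s a hedged pymavlink sketch (the helper names are mine, and `master` is assumed to be an existing `mavutil` connection to the vehicle):

```python
import time

def make_covariance(sigma_xy, sigma_z, sigma_att):
    """Build the 21-element array used by VISION_POSITION_ESTIMATE:
    the row-major upper triangle of the 6x6 pose covariance
    (x, y, z, roll, pitch, yaw), here with zero cross-correlations."""
    cov = [0.0] * 21
    diag = [sigma_xy**2, sigma_xy**2, sigma_z**2,
            sigma_att**2, sigma_att**2, sigma_att**2]
    idx = 0
    for row in range(6):
        # Diagonal entries land at packed indices 0, 6, 11, 15, 18, 20.
        cov[idx] = diag[row]
        idx += 6 - row
    return cov

def send_local_estimate(master, x, y, z, roll, pitch, yaw, cov):
    """Send one VISION_POSITION_ESTIMATE in the local frame
    (x forward/North, y right/East, z down, angles in radians)."""
    master.mav.vision_position_estimate_send(
        int(time.time() * 1e6),  # usec timestamp
        x, y, z, roll, pitch, yaw,
        covariance=cov)
```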

The ArduPilot MAVLink positioning docs I linked to mention:

  1. GPS_INPUT is not recommended
    • I assume this is because of the conversion inefficiencies/inaccuracies, as well as the latencies involved with sending critical state components via the telemetry system
  2. GLOBAL_VISION_POSITION_ESTIMATE is also not recommended
    • I expect this is because it uses the same X/Y/Z terminology as the local one, and MAVLink is unclear as to the difference between them, so ArduPilot just treats them identically (in which case the local one is clearer as to what is actually being interpreted)

I don’t believe that should be an issue, but you would need to be careful of differences in reference frame, especially if you’re sending information from multiple independent sensors/estimates through a single interface (e.g. MAVLink) because then any discrepancies make the whole interface seem unreliable to the EKF, rather than allowing it to prioritise whichever one seems more consistent over time / with the inertial sensors.

As mentioned in 2., ArduPilot treats them identically.

It does seem worth mentioning the origin behaviour though, whereby the position estimate is determined relative to the EKF’s configured origin, which is set either

  1. automatically, using the first GPS data after booting, or
  2. in the absence of a GPS system, can be set using MAVLink
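
A hedged sketch of option 2, again assuming pymavlink and an existing `master` connection (helper names are mine): SET_GPS_GLOBAL_ORIGIN takes latitude/longitude in degE7 and altitude in millimetres.

```python
import time

def origin_fields(lat_deg, lon_deg, alt_m):
    """Scale a geodetic origin into SET_GPS_GLOBAL_ORIGIN units:
    latitude/longitude in degE7, altitude (AMSL) in millimetres."""
    return (int(round(lat_deg * 1e7)),
            int(round(lon_deg * 1e7)),
            int(round(alt_m * 1000)))

def set_ekf_origin(master, lat_deg, lon_deg, alt_m):
    """Set the EKF origin over MAVLink when no GPS is present."""
    lat, lon, alt = origin_fields(lat_deg, lon_deg, alt_m)
    master.mav.set_gps_global_origin_send(
        master.target_system, lat, lon, alt,
        int(time.time() * 1e6))
```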

As a minor additional note, the MAVLink specification includes optional fields, and sometimes ArduPilot doesn’t implement/handle them.


Thank you Elliot for such an extensive reply.

We just did some testing with VISION_POSITION_DELTA and got some varying results.

We do not have access to a 6x6 covariance matrix for our position estimates; we only have a figure of merit, which is a single value.

Therefore, we are not able to fill the covariance field in this message properly, and so we experimented with it.

In one test, we set all the values to a very small number, 0.00001, effectively telling the controller that the position estimate is great. In this case the ROV performed very well in auto mode.

In another test, we left the covariance as unknown (that is, with the first element set to NaN), which is more correct in our scenario, since we do not have access to this information. In this case the ROV performed significantly worse, mainly due to it switching between EKF3 lanes. It would switch back and forth between “EKF3 lane switch 0” and “EKF3 lane switch 1”, and as a result the ROV’s position (and, I believe, attitude) would visibly change on the map in QGroundControl, and it would really struggle to navigate to any of the waypoints.
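
For clarity, the two covariance fills we tried look roughly like this (a sketch, assuming a message carrying the standard 21-element packed covariance array, e.g. VISION_POSITION_ESTIMATE):

```python
import math

NCOV = 21  # packed upper triangle of the 6x6 pose covariance

def confident_covariance(value=1e-5):
    """Variant 1: claim a (possibly over-)confident estimate by
    filling every element with a tiny value."""
    return [value] * NCOV

def unknown_covariance():
    """Variant 2: the MAVLink convention for 'covariance unknown' --
    set the first element to NaN."""
    cov = [0.0] * NCOV
    cov[0] = float("nan")
    return cov
```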

So, first of all, is there a good way to use our figure of merit value in this case? That would allow us to at least input some indication of the position estimate, and hopefully have it performing properly without “lying” about the covariance.

And if this is not possible, is it then possible to leave the covariance field as unknown and at the same time prevent the ROV from lane switching?

In case this is relevant, we have been operating with the following settings:
EK3_SRC1_POSXY = ExternalNav
EK3_SRC1_VELXY = ExternalNav
EK3_SRC1_POSZ = Baro
EK3_SRC1_VELZ = None
EK3_SRC1_YAW = ExternalNav

EK3_SRC2_YAW = Compass (moved the compass from SRC1 to SRC2, since I believe it will fall back to the compass for yaw in case the ExternalNav estimates are not present)
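
In case it’s useful, this is a hedged sketch of pushing those settings over MAVLink with PARAM_SET via pymavlink (the enum values are taken from the ArduPilot parameter documentation; verify them against your firmware version):

```python
# EKF3 source selections, as (param name -> enum value).  Enum values
# are from the ArduPilot parameter docs (6 = ExternalNav, 1 = Baro for
# POSZ / Compass for YAW, 0 = None) -- verify for your firmware.
EK3_SOURCES = {
    b"EK3_SRC1_POSXY": 6.0,  # ExternalNav
    b"EK3_SRC1_VELXY": 6.0,  # ExternalNav
    b"EK3_SRC1_POSZ": 1.0,   # Baro
    b"EK3_SRC1_VELZ": 0.0,   # None
    b"EK3_SRC1_YAW": 6.0,    # ExternalNav
    b"EK3_SRC2_YAW": 1.0,    # Compass
}

def apply_ekf_sources(master):
    """Push the selections above via PARAM_SET ('master' is assumed
    to be an existing pymavlink mavutil connection)."""
    from pymavlink import mavutil  # deferred, so the dict is usable without it
    for name, value in EK3_SOURCES.items():
        master.mav.param_set_send(
            master.target_system, master.target_component,
            name, value,
            mavutil.mavlink.MAV_PARAM_TYPE_REAL32)
```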