How can I get the exact speed of my ROV?

Hello, I need to move the ROV forward or sideways by an exact distance, so I need to get the exact speed of the ROV. I used the following code, and it looks like “groundspeed” is the speed of the ROV, but the ROV is still, so there must be something wrong with the value. How do I get the exact speed of the ROV?

# Import mavutil
from pymavlink import mavutil

# Create the connection
# From topside computer
master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')

while True:
    msg = master.recv_match()
    if not msg:
        continue

    # VFR_HUD carries the autopilot's groundspeed/heading/throttle estimates
    if msg.get_type() == 'VFR_HUD':
        print("Message: %s" % msg)

The terminal output is as follows:

[terminal output screenshot]

Thank you for all your answers!

Hi @robotmark,

groundspeed is the autopilot’s best estimate of what its current speed is relative to the ground, based on its sensor inputs. The internal sensors (e.g. accelerometers, gyroscopes, magnetometer) in the autopilot’s IMU (inertial measurement unit) can provide a rough estimate, but additional accuracy requires sensors that are better suited to that kind of measurement. As some examples,

  • a DVL can measure speed relative to the ground directly
  • USBLs and similar systems measure position directly, relative to an external reference
  • scanning sonars (like our Ping360) and other imaging sonars can also help with local positioning
  • echo-sounders (like our Ping Sonar) can be used to measure distance to an obstacle or wall in a particular direction
  • computer vision can sometimes be used for positioning around detectable objects of known shape+size / spacing
  • and so forth

Note that accurate positioning underwater is a challenging problem, and there’s no “one size fits all” solution that is affordable and suits every situation (e.g. see this recent discussion on USBL prices). If it’s important to your application then you’ll need to determine a suitable option given your particular constraints.
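That said, if a rough result is acceptable, you can dead-reckon a distance estimate by integrating the autopilot’s existing groundspeed estimate over time. Here’s a minimal sketch (using the same topside UDP connection as your code above; note that the result inherits all of the groundspeed estimate’s inaccuracy, and the error grows as it accumulates):

import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')

distance = 0.0  # accumulated distance estimate (m)
last_time = None
while True:
    msg = master.recv_match(type='VFR_HUD', blocking=True)
    now = time.time()
    if last_time is not None:
        # groundspeed is in m/s, so speed * elapsed time approximates
        # the distance covered since the previous message
        distance += msg.groundspeed * (now - last_time)
    last_time = now
    print('estimated distance travelled: %.2f m' % distance)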


Thank you for your reply; I would like to ask one more question. I want to read the yaw angle of the ROV relative to the Earth. I used the following code, but the yaw angle only varies from 0° to 360° in whole-degree (integer) steps. How can I get a more accurate value?

# Import mavutil
from pymavlink import mavutil

def read_attitude_continual(run):
    # Create the connection
    # From topside computer
    master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
    while run:
        msg = master.recv_match()
        if not msg:
            continue
        # VFR_HUD.heading is the current heading in integer degrees (0-360)
        if msg.get_type() == 'VFR_HUD':
            print(msg.heading)

if __name__ == '__main__':
    read_attitude_continual(True)

The terminal output is as follows:

[terminal output screenshot]

Thank you for your reply.

VFR_HUD.heading is in degrees, so it’s not possible to get a more precise value with that message. GLOBAL_POSITION_INT.hdg is in centidegrees, but may require a GPS to be connected.
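As a minimal sketch (assuming the same topside UDP connection as your earlier code), reading that heading and converting from centidegrees to degrees could look like:

from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')

while True:
    msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
    # hdg is a uint16 in centidegrees (0-35999); 65535 means "unknown"
    if msg.hdg != 65535:
        print(msg.hdg / 100.0)  # heading in degrees, 0.01-degree resolution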

I’m not certain if there’s some other way, so I’ve asked internally and will get back to you.

Note that precision and accuracy are not the same thing. I’m not sure how accurate the magnetometers (compass) are on a Pixhawk, so it may be the case that you can get or calculate a more precise value from the magnetometer+gyroscope readings, but that doesn’t guarantee that it will actually be accurate to that level of precision. The same applies for any other sensor reading.

1 Like

Thank you again for your answer. I have also seen someone on the forum mention combining gyroscopes and magnetometers before.

I believe that doing so would probably improve accuracy and reliability, but I don’t know how to combine these two data sources for analysis or calculation. Could you give me a demonstration?

Thank you very much.

Following up on this, I’ve been told that ATTITUDE.yaw should also be measured from north, and that’s a float value so should have as much precision as the autopilot’s EKF state estimator provides (even if some of that precision is false/inaccurate). Note that it’s measured in radians rather than degrees, so you’ll need to scale it for comparable values.
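As an example, here’s a minimal sketch of reading ATTITUDE.yaw and converting it to a 0-360° heading for comparison with VFR_HUD.heading (assuming the same topside UDP connection as before):

import math
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')

while True:
    msg = master.recv_match(type='ATTITUDE', blocking=True)
    # ATTITUDE.yaw is a float in radians, in the range -pi..+pi
    yaw_deg = math.degrees(msg.yaw) % 360  # map to 0-360 degrees
    print(yaw_deg)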

A naive estimation is possible from only magnetometer values by assuming the strongest field strength points towards north, but that value is quite subject to noise from the surrounding electronics, and has a potentially slow response time when the vehicle is rotating. Compensating for rotation with gyroscope readings gets complicated quite fast.
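As a rough illustration of that naive approach (not a recommendation), the sketch below computes a heading from the raw body-frame magnetometer values in the RAW_IMU message. It assumes the standard forward-right-down body frame, only holds while the vehicle is level, and ignores hard/soft-iron distortion and magnetic declination:

import math
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')

while True:
    msg = master.recv_match(type='RAW_IMU', blocking=True)
    # xmag/ymag are raw body-frame field components; with x forward and
    # y right on a level vehicle, heading (clockwise from north) is:
    heading = math.degrees(math.atan2(-msg.ymag, msg.xmag)) % 360
    print(heading)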

Since the autopilot is already doing those calculations for you, it’s likely best to use its results rather than trying to do your own calculation, so I’d recommend using ATTITUDE.yaw, as above 🙂


Thank you for your answer. I just tried this method, but why does the yaw angle keep changing when the ROV is stationary? It keeps getting smaller over time. My understanding is that it should stay around a fixed value.

# Import mavutil and plotting libraries
from pymavlink import mavutil
import matplotlib.pyplot as plt
import time

def DO_ATTITUDEFilter(msg_type):
    # Create the connection
    # From topside computer
    master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
    x_time = []
    y_yaw = []
    plt.ion()  # enable interactive plotting once, before the loop
    while True:
        msg = master.recv_match()
        if not msg:
            continue
        if msg.get_type() == msg_type:
            print(msg.yaw)
            x_time.append(time.time())
            y_yaw.append(msg.yaw)
            # Redraw the yaw-vs-time plot with the latest sample
            plt.clf()
            plt.xlabel("time")
            plt.ylabel("yaw")
            plt.plot(x_time, y_yaw)
            plt.pause(0.1)

DO_ATTITUDEFilter('ATTITUDE')

Here’s what I drew with Matplotlib:
[Figure_2: plot of yaw over time]

Thank you for your answer!

Have you considered a DVL, or are you constrained to built-in sensors?

Until you go to a fiber-optic gyro IMU and/or have some direct environmental feedback, I’m curious how much more accuracy you could (or should) expect.


Thank you for your answer. Since my budget is limited, DVL equipment is too expensive. The accuracy can only be improved by using the ROV’s own sensors or by adding other, cheaper sensors.

I’m not sure if or where ArduSub utilizes an EKF, or what level of inertial control is being used when navigating. Most of my experience with Ardu- flight control is with ground vehicles and traditional helicopters, where there are multiple layers of filtering before the controllers are tuned for navigational control. I hope I am saying that right, but my terminology is most likely off a bit.

Nevertheless, what I’m wondering is if you could take some of the post-filtered data that would normally be fed into an attitude controller and report that out, versus the raw data. It would have to be a bit more accurate, as I believe the entire point is to filter out most of what you are experiencing.

If I had to do a software only solution (sort of like Stanford did for Stanley in the DARPA challenges), this may be a place to start. In fact, it might be interesting to put a second IMU (like a Cube) onboard a BROV2, but with an attitude controller that is a bit more matured, like ArduCopter. Then, take a look at measured vs commanded data, and especially some of the more calculated parameters, and see how different it is than raw IMU from the hourglass Pixhawk.

Now that I think of it, the Cube alone would be a fairly significant upgrade on the IMU accuracy and drift. I’m sure someone has put a Cube in a BROV2 by now…

No matter what, the water around the vehicle is always moving and causing attitude and position effect on the vehicle. Even if the vehicle is perfectly stationary in the water column, if that water is moving, so is the vehicle…

It may help to re-calibrate your sensors, but in general inertial positioning systems tend to struggle when no motion is occurring, because sensor noise will often have some bias that can be challenging to fully account for in the sensor fusion algorithm.
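If you mainly want a steadier reading to display or log, one option (which smooths the noise but does not fix the underlying drift) is to low-pass filter the yaw yourself. Here’s a minimal sketch, where smooth_yaw and alpha are hypothetical names for illustration, using an exponential moving average that handles the -pi/+pi wrap-around:

import math

def smooth_yaw(prev, new, alpha=0.1):
    # Exponential moving average of a wrapped angle (radians).
    # alpha is the smoothing factor in (0, 1]: smaller = smoother but laggier.
    # Filtering the wrapped difference avoids jumps at the -pi/+pi crossing.
    diff = math.atan2(math.sin(new - prev), math.cos(new - prev))
    return prev + alpha * diff

To use it, feed each new ATTITUDE.yaw sample through smooth_yaw together with the previous filtered value.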

ArduSub 4.0 uses EKF2, and ArduSub 4.1 (currently still beta, but planned for stable release soon) uses EKF3. Dev docs are something we’re planning to do more of, but in the meantime the ArduCopter EKF docs are the most relevant 🙂