Autonomous ROV / Dead Reckoning

Hello Blue Robotics Community,

I am currently working on a project to develop an autonomous underwater ROV using the BlueROV1 frame. Here’s an overview of my setup and progress so far:

  1. Hardware Setup:

    • ROV: Custom-built with 6 thrusters achieving 6-DOF control, based on the BlueROV1 frame.
    • Companion Computer: Raspberry Pi 4 Model B with BlueOS installed.
    • Flight Controller: Pixhawk 2.4.8 with ArduSub firmware.
    • Ground Station: Running QGroundControl on Ubuntu 22.04 LTS with a PS5 controller for manual control.
  2. Progress So Far:

    • I have successfully connected the ROV to QGroundControl, and I can manually control it using the PS5 controller.
    • The camera feed is visible on QGroundControl.
  3. Goal:
    My objective is to enable autonomous navigation using dead reckoning. I plan to send destination values through ROS 2 (Humble) to maneuver the ROV autonomously in real time.

  4. Software Configuration:

    • ROS 2 Humble is installed on my ground station computer.
    • MAVROS and MAVROS Extras have been installed.
    • I aim to use MAVROS to handle MAVLink communication and publish velocity commands for autonomous movement.
  5. Current Challenges:

    • Integrating MAVROS with the Pixhawk via the companion computer (Raspberry Pi).
    • Setting up a ROS 2 node to send velocity commands to the ROV for autonomous navigation.
    • Ensuring proper communication between ROS 2, MAVROS, and QGroundControl.
  6. Questions:

    • Is there a standard way to handle dead reckoning with ArduSub without using USBL/DVL, just using the relative position of the ROV?
    • Can you point me to any existing GitHub repositories I could use as a starting point for implementing autonomous navigation on my ROV?
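The dead-reckoning idea in item 3 can be sketched in plain Python before any ROS 2 wiring is involved: integrate commanded body-frame velocity and heading over fixed timesteps to get a position estimate. This is a minimal illustration only; the function name, timestep, and 2-D simplification are my own assumptions, not part of ArduSub or MAVROS.

```python
import math

def dead_reckon(commands, dt=0.1):
    """Integrate body-frame velocity commands into an x/y position estimate.

    `commands` is a list of (forward_speed_mps, yaw_rad) tuples, one per
    timestep of length `dt` seconds. Returns the estimated (x, y) track.
    """
    x, y = 0.0, 0.0
    track = [(x, y)]
    for speed, yaw in commands:
        # Rotate the body-frame forward velocity into the world frame
        # and integrate over one timestep.
        x += speed * math.cos(yaw) * dt
        y += speed * math.sin(yaw) * dt
        track.append((x, y))
    return track

# Example: drive forward at 0.5 m/s at yaw = 0 for 10 s (100 steps of 0.1 s).
track = dead_reckon([(0.5, 0.0)] * 100, dt=0.1)
print(track[-1])  # estimated position after 10 s: about (5.0, 0.0)
```

In a real system the same loop would consume IMU/compass heading and commanded (or measured) velocity, and the estimate would be what you compare waypoints against.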

Any guidance or suggestions on achieving autonomous navigation using MAVROS and dead reckoning would be greatly appreciated.

Thank you.


Q: Is there a standard way to handle dead reckoning with ArduSub without using USBL/DVL, just using the relative position of the ROV?

Short Answer: No, not in a real-world scenario. You won’t know the ROV’s position without some sort of external reference sensor. USBL, DVL, vision-based localization, underwater motion capture, or even a “GPS on a stick” are common ways to provide the flight controller (Pixhawk/Navigator) with position data. Otherwise, the autopilot has no way to estimate its position or navigate to a specific waypoint.
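A quick way to see why pure dead reckoning fails without an external reference: any constant velocity bias (an uncompensated current, a small IMU error) integrates into position error that grows without bound. The numbers below are illustrative, not from any real sensor.

```python
def position_error(bias_mps, dt, steps):
    """Accumulated position error from integrating a biased velocity estimate."""
    err = 0.0
    errors = []
    for _ in range(steps):
        err += bias_mps * dt  # each timestep the bias adds more error
        errors.append(err)
    return errors

# A modest 5 cm/s bias over a 60 s run (600 steps of 0.1 s):
errors = position_error(bias_mps=0.05, dt=0.1, steps=600)
print(errors[-1])  # ~3 m of drift after one minute, and still growing
```

An external fix (USBL, DVL, vision) is what bounds this error; without one, the estimate only gets worse with time.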

Q: Can you point me to any existing GitHub repositories I could use as a starting point for implementing autonomous navigation on my ROV?

Simulation as an Alternative:
For testing autonomy without the cost and complexity of real-world sensors, you can leverage simulation environments. Gazebo, Webots, Unity, or Unreal Engine (UE4/5) can be paired with ROS + ArduSub to provide “synthetic” positioning and sensor data. This lets you experiment with navigation algorithms and understand the control loops—though it won’t perfectly match real-world uncertainties.

You can try installing ROS and ArduSub SITL and integrating them with Gazebo, UE4/5, etc.
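A rough sketch of that setup, assuming an ArduPilot development environment is already installed (the commands and the UDP port are the common defaults; verify them against the current ArduSub and MAVROS documentation for your versions):

```shell
# 1. Run ArduSub software-in-the-loop (from an ArduPilot dev environment):
sim_vehicle.py -v ArduSub --console --map

# 2. In another terminal, bridge MAVLink into ROS 2 with MAVROS.
#    SITL exposes MAVLink over UDP on port 14550 by default:
ros2 run mavros mavros_node --ros-args -p fcu_url:=udp://:14550@
```

With both running, MAVROS topics (state, setpoints, etc.) become available in ROS 2 and QGroundControl can connect to the same SITL instance.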

In simulation, you can feed perfect or noisy position data into the flight controller for testing.
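To get a feel for what "noisy position data" buys you, here is a minimal sketch of fusing simulated noisy position fixes with a drifted dead-reckoned estimate using a simple complementary (alpha) filter. All the numbers, names, and the filter choice are illustrative assumptions; a Kalman filter (which the autopilot's EKF implements) would weight the measurement adaptively instead of with a fixed alpha.

```python
import random

def fuse(dr_estimate, measurement, alpha=0.2):
    """Blend a dead-reckoned estimate with a noisy position fix.

    alpha is the fixed trust placed in each measurement; higher alpha
    means faster correction but more measurement noise in the output.
    """
    return (1 - alpha) * dr_estimate + alpha * measurement

random.seed(0)
true_x = 10.0      # "ground truth" position from the simulator
estimate = 12.0    # dead-reckoned estimate that has drifted 2 m off
for _ in range(50):
    fix = true_x + random.gauss(0.0, 0.5)  # noisy simulated position fix
    estimate = fuse(estimate, fix)
print(estimate)  # pulled back toward the true position, roughly 10 m
```

The same pattern, with a proper estimator, is what the real sensors (USBL/DVL/vision) enable in the water.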

In real life, consider adding a USBL/DVL or vision-based system for underwater positioning.


Spending time researching and experimenting is part of the process. Start small with simulations, then gradually incorporate real-world sensors for a robust autonomous ROV system.
