BlueROV2 (ROV) to AUV

Hi :slight_smile: My team and I are doing a project for university that is basically to change the BlueROV2 so that it becomes autonomous. Our main goal is for it to go from point A to point B without any joystick control (just an initial input). Later on, we would also like it to receive that input (the mission) over WiFi. We would also like to save the video stream to an SD card so the user can retrieve it later.
We have a system concept more or less defined, but it would be very helpful if someone has done anything similar to this…
Any help would be amazing! Thank you :blush:


The autopilot communicates with the MAVLink protocol. You can use MAVProxy as a development tool, and as an introduction to poking around what the autopilot can do.
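For instance, on the topside computer you could attach MAVProxy to the vehicle's MAVLink stream (a hedged sketch; the UDP port assumes the standard companion setup that also feeds QGroundControl):

```shell
# Listen for the MAVLink stream the companion computer forwards over the tether
# (14550 is the usual QGroundControl port; adjust if your setup differs)
mavproxy.py --master=udpin:0.0.0.0:14550
```

From the MAVProxy prompt you can then inspect parameters, request message streams, and send commands interactively.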

Our topside software, QGroundControl, can save the video stream directly to the hard drive (with a tether), but it is also possible to save the video locally on the vehicle.
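As a hedged sketch of the local-recording option (this assumes a GStreamer install on the companion computer and the camera exposed as /dev/video0; the caps and output path are placeholders for your setup):

```shell
# Save the camera's H.264 stream straight to a file on the vehicle.
# -e makes GStreamer finalise the MP4 container properly on Ctrl-C.
gst-launch-1.0 -e v4l2src device=/dev/video0 \
  ! video/x-h264,width=1920,height=1080,framerate=30/1 \
  ! h264parse ! mp4mux ! filesink location=/home/pi/dive.mp4
```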

Hi @ericagomes
We’ve been tossing around the AUV idea for a few years now. At this point in time, the only solution is to use a Water Linked positioning system with a tether. That will drive waypoint missions.

The community lacks affordable underwater wireless positioning systems. Here’s a bit of a discussion: ROV and AUV Localization an Introduction - #6 by kevink

Hi @kevink. Thank you for your answer… We saw that, but it is too much for this kind of project, so our idea is to use the sensors that already exist in the BlueROV2 to get some kind of relative position. For now we will only be working in a water tank, so we think it will be enough. Can you maybe tell me how we can obtain the sensor values?

Thank you for your answer @jwalser! Our idea for now is to connect an Odroid to the Raspberry Pi. On the Odroid we are thinking about installing QGroundControl and then sharing the Odroid “desktop” with a ground computer so the user can reach it. Maybe this way we can avoid dealing too much with MAVLink, because it looks a bit too complicated. We would only need to export the signals from the sensors and create control signals or something like that in QGroundControl. Do you think this will work?

@ericagomes I’m not much of an expert on getting sensor values out of the MAVLink messages, but you can try reading through this documentation: MAVLink Interface — Dev documentation
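As a hedged sketch of reading those messages with pymavlink (the connection string and the freshwater density are assumptions for a tank setup, and `pressure_to_depth` is a hypothetical helper, not part of any library):

```python
# Read attitude and pressure from ArduSub over MAVLink and derive depth.
try:
    from pymavlink import mavutil   # pip install pymavlink
except ImportError:                 # lets the pure helper below run anywhere
    mavutil = None

FRESHWATER_DENSITY = 997.0  # kg/m^3 (assumption; ~1025 for seawater)
G = 9.81                    # m/s^2

def pressure_to_depth(press_abs_mbar, surface_mbar=1013.25):
    """Convert absolute pressure (mbar, as in SCALED_PRESSURE2) to depth in metres."""
    return (press_abs_mbar - surface_mbar) * 100.0 / (FRESHWATER_DENSITY * G)

def stream_sensors(url='udpin:0.0.0.0:14550'):
    """Print attitude and depth; run on a machine that sees the MAVLink stream."""
    master = mavutil.mavlink_connection(url)
    master.wait_heartbeat()
    while True:
        msg = master.recv_match(type=['ATTITUDE', 'SCALED_PRESSURE2'], blocking=True)
        if msg.get_type() == 'ATTITUDE':
            print('roll %.2f pitch %.2f yaw %.2f' % (msg.roll, msg.pitch, msg.yaw))
        else:
            print('depth %.2f m' % pressure_to_depth(msg.press_abs))

# On the topside or companion computer: stream_sensors()
```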

The compasses and gyros inside the Pixhawk won’t be accurate enough to get a reliable position; it’s been tried. About the only sensor that may work in very shallow water is a PX4Flow sensor, and I think that has been tried without good results.


Hi @ericagomes,

To get the data you’ll need to use MAVLink. How are you planning to solve this by adding the Odroid?
Like @kevink said, it’s not possible to use the onboard sensors to estimate the ROV’s odometry.
If you are using a water tank, the best way to solve the position problem is to use a camera and a marker (QR code/ArUco) above the ROV.
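Once the marker has been detected in the image (e.g. with OpenCV's aruco module), a simple pinhole back-projection turns its pixel position into tank coordinates. A minimal sketch, assuming the camera-to-marker distance is known (depth sensor plus camera height); the intrinsics `fx`, `fy`, `cx`, `cy` are placeholder values that should come from your own camera calibration:

```python
# Back-project a detected marker centre to X/Y offsets in the tank.

def pixel_to_tank_xy(u, v, dist_m, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Pixel (u, v) at known camera-to-marker distance dist_m (metres)
    -> (x, y) offsets from the camera's optical axis, in metres.
    fx, fy, cx, cy are placeholder intrinsics; calibrate your camera."""
    x = (u - cx) * dist_m / fx
    y = (v - cy) * dist_m / fy
    return x, y
```

For example, `pixel_to_tank_xy(640, 360, 2.0)` returns `(0.0, 0.0)`: the marker sits directly on the camera axis.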

If you want to use:

  • MATLAB, there are some packages that provide a MAVLink abstraction.
  • ROS, take a look at MAVROS.
  • Python/C++/etc: MAVLink directly :slight_smile:

I’m also working on making my ROV autonomous.

The first step is to format an SD card with Raspbian and get SSH working over WiFi. You probably want to do this with a separate Raspberry Pi first, since getting an SD card in and out of the BlueROV2 is a royal pain. For supplying wall power during development, I used a micro-USB elbow adaptor (80 cents) and a hand-soldered assembly with a male USB A and micro B adaptor connected through a Blue Robotics switch. SparkFun sells easy-to-solder USB connectors. The assembly is just long enough to reach out of the enclosure, but short enough to fit inside when not in use. I also found that the Sony CP-ELSVP battery fits perfectly in the enclosure, allowing the Raspberry Pi to be powered without the main battery. I found this useful for test dives.

If you want to record video, use a UHS card. I’m using a Lexar 1000x and it works fine. Even so, when you implement your video recorder, make sure you buffer everything to memory first and then write it to the SD card. I experienced write latency of up to 4 seconds, enough for the camera API to start dropping frames.
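The buffer-then-write idea can be sketched like this (a minimal illustration, not the camera API itself; the queue size is a guess at a few seconds of H.264):

```python
# Decouple frame capture from SD-card writes: the capture loop only touches
# an in-memory queue, and a background thread absorbs the write latency.
import queue
import threading

def start_writer(path, frame_queue):
    """Drain encoded frames from frame_queue to path until a None sentinel."""
    def run():
        with open(path, 'wb') as f:
            while True:
                chunk = frame_queue.get()
                if chunk is None:   # sentinel: stop and close the file
                    break
                f.write(chunk)
    t = threading.Thread(target=run, daemon=True)
    t.start()
    return t

frames = queue.Queue(maxsize=256)   # bounded, so a stalled write can't eat all RAM
# capture loop: frames.put(encoded_frame, block=False) -> drop frames if full
```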

If you manage to fit the Odroid inside the enclosure, let me know how you did it; I’m still using the Raspberry Pi. My aim is to navigate using the camera. So far I’ve sped up ORB_SLAM to run at 15 frames per second, but I now have heating issues due to the tight space. Part of the code is published at GitHub - 0xfaded/pislam: Real-time feature extraction on the Raspberry Pi 3. The full sped-up SLAM will be released soon.

Good luck.

We are doing a project to make our recently purchased BlueROV2 autonomous, with image capture, then processing, and then navigation based on the results.
At present, we are able to control the BlueROV2 through the QGroundControl joystick, and additionally we can command the ROV through MAVProxy (arm throttle, rc 3 1600, etc.) on the RPi.
Now we are planning to write a Python script on the RPi attached to the Pixhawk (ArduSub) through USB. Can you send some command lines to move the BlueROV2 (arming/disarming, forward, reverse, lateral right/left, depth hold mode, etc.) in Python? We would like to run this Python script on RPi boot-up.

Any help would be amazing! Thank you
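A hedged pymavlink sketch of those commands (the serial device name is an assumption for a Pixhawk on USB, and `pwm`/`drive` are hypothetical helpers; the channel mapping follows the ArduSub docs: 3 = throttle, 4 = yaw, 5 = forward, 6 = lateral):

```python
# Arm, enter depth hold, and drive the BlueROV2 with RC overrides.
import time

try:
    from pymavlink import mavutil   # pip install pymavlink
except ImportError:                 # lets the pure helper below run anywhere
    mavutil = None

def pwm(value):
    """Map a normalised command in [-1, 1] to an RC pulse width (1100-1900 us)."""
    return int(1500 + 400 * max(-1.0, min(1.0, value)))

def drive(master, forward=0.0, lateral=0.0, throttle=0.0, yaw=0.0):
    """Send one RC override; ArduSub: ch3 throttle, ch4 yaw, ch5 forward, ch6 lateral."""
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        1500, 1500, pwm(throttle), pwm(yaw), pwm(forward), pwm(lateral), 0, 0)

def main():
    master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)  # Pixhawk over USB
    master.wait_heartbeat()
    master.set_mode('ALT_HOLD')       # depth hold mode
    master.arducopter_arm()
    for _ in range(50):               # gentle forward for ~5 s
        drive(master, forward=0.3)
        time.sleep(0.1)               # overrides need regular refreshing
    drive(master)                     # back to neutral
    master.arducopter_disarm()

# On the RPi: call main(), e.g. from /etc/rc.local or a systemd unit at boot.
```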

Take a look here:

Hi,

Could you please give me a quick hint on how to save the video locally on the vehicle?

Cheers

Hi,

Probably this is what you are looking for:

Thank you, that’s exactly it