Visual Inertial Odometry on BlueROV2

Hi there,
I’m a student working on my thesis, which involves implementing visual-inertial odometry on the BlueROV2. I’m very new to this, so please bear with me. I’ve got the monocular USB camera that came with the ROV, as well as a stereo machine-vision camera pair from Deep Water Exploration.

How do I get frames from the cameras, as well as the IMU data, off the ROV? I’ve noticed the Log Browser in BlueOS, which seems to record data; however, it’s not clear to me when it starts recording or what the reference time is. Is it the time the ROV turned on? I also noticed that Cockpit can record video streams. Is there a way to get them as frames, or should I manually convert the videos to frames? And how do I get the data from the machine-vision cameras? Any help would be greatly appreciated.

Hi @ethanfaraday -

Where do you plan for the visual-inertial odometry processing to occur? If topside, on a computer that likely has an Nvidia GPU, you’ll be receiving the video streams via UDP or RTSP - you’d have to ask Deep Water Exploration about their software approach from there. You can also receive information from the vehicle via the Mavlink2Rest interface, using a Python script running on your topside computer - check out the JSON payloads available at the various endpoints linked under Mavlink2Rest in Available Services (vehicleIP/mavlink2rest/xxx)
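If it helps, here’s a minimal sketch of pulling telemetry from Mavlink2Rest using only the Python standard library. The vehicle address and the exact endpoint layout here are assumptions - compare them against the endpoints your own BlueOS instance lists under Available Services:

```python
import json
import urllib.request

# Default BlueOS vehicle address - an assumption, adjust to your setup
VEHICLE = "http://192.168.2.2/mavlink2rest"

def message_url(name, base=VEHICLE, system=1, component=1):
    """Build the Mavlink2Rest endpoint for one MAVLink message type."""
    return f"{base}/mavlink/vehicles/{system}/components/{component}/messages/{name}"

def fetch_message(name):
    """Fetch the latest copy of a message as a dict (raises on network errors)."""
    with urllib.request.urlopen(message_url(name), timeout=2) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # RAW_IMU carries raw accelerometer / gyroscope / magnetometer readings
    imu = fetch_message("RAW_IMU")
    print(imu["message"]["xacc"], imu["message"]["xgyro"])
```

You can swap "RAW_IMU" for "ATTITUDE", "SCALED_IMU2", or whatever other message names your endpoint listing shows, and poll in a loop to log at a fixed rate.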

Cockpit is ground control station (GCS) software, and so isn’t particularly relevant for your application. Once your software is outputting position estimates, you should be able to stream those to the autopilot via MAVLink messages, and have the vehicle autopilot use them for navigation / position holding.

Hi Tony, thank you for getting back to me. I aim to do post-processing on my computer at this stage, so I’m really just looking to use the BlueROV2 as a data collection tool at the moment. Simply put, I just want to be able to retrieve the IMU data, onboard camera data, and the Deep Water Exploration machine vision camera data.

I think retrieving the vehicle’s information via the Mavlink2Rest interface is the way to go.

For the onboard camera data, how do I record frames from the stream? In Cockpit, there’s a record-stream button; I can then download that video stream and convert it to frames. Is there an easier way to do this?

I’ve managed to connect and retrieve the ROV’s IMU data through the Mavlink2Rest interface, so thank you for that. I’m now struggling with receiving the video streams via UDP. I can see that the ROV is sending data, and that the endpoint is my computer (with the right IP address and port); however, when I try to receive it, it just loads endlessly.

Hi @ethanfaraday -

For post-processing, pulling the data from the .BIN log and processing videos recorded with Cockpit (or onboard the vehicle via the DashCam extension) is your best bet. Mavlink2Rest is only useful for real-time data.
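For the .BIN side, here’s a sketch using pymavlink’s DataFlash log reader, assuming you’ve downloaded the log from the BlueOS Log Browser. The field names (TimeUS, GyrX, AccX, …) follow ArduPilot’s IMU log message; it’s worth dumping a few messages from your own log to confirm what’s actually in it:

```python
def us_since_boot_to_s(time_us):
    """DataFlash TimeUS timestamps are microseconds since autopilot boot."""
    return time_us / 1e6

def read_imu_messages(path):
    """Yield (t_seconds, gyro_xyz, accel_xyz) tuples from a .BIN DataFlash log."""
    from pymavlink import mavutil  # pip install pymavlink

    log = mavutil.mavlink_connection(path)
    while True:
        msg = log.recv_match(type="IMU")
        if msg is None:  # end of log
            break
        yield (us_since_boot_to_s(msg.TimeUS),
               (msg.GyrX, msg.GyrY, msg.GyrZ),
               (msg.AccX, msg.AccY, msg.AccZ))

if __name__ == "__main__":
    for t, gyro, accel in read_imu_messages("flight.BIN"):
        print(f"{t:.3f}s gyro={gyro} accel={accel}")
```

Since the timestamps are relative to boot, you’ll need a common event (e.g. a distinctive motion visible in both the video and the IMU trace) to align the log with your camera frames.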

Recording the stream directly requires VLC, and generally this will work better if you configure your video stream to use RTSP rather than UDP, under Video Streams. VLC can open the network stream and also save it to a file - however, Cockpit is likely to provide better recordings, especially if you use the standalone Electron app version rather than the extension! You can add another video-recording widget to capture the second stream, and even map a joystick button to starting/stopping recording of all streams.
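If you’d rather script the frame capture than go through VLC, a sketch with OpenCV reading the RTSP stream looks like this. The port and stream name here are assumptions - copy the exact RTSP URL shown on your Video Streams page:

```python
def rtsp_url(vehicle_ip, stream_name, port=8554):
    """RTSP endpoint as served by the vehicle (port/path are assumptions)."""
    return f"rtsp://{vehicle_ip}:{port}/{stream_name}"

def save_frames(url, out_dir, every_n=1):
    """Grab frames from the stream and save every Nth one as a numbered PNG."""
    import os
    import cv2  # pip install opencv-python

    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(url)
    i = saved = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:  # stream ended or dropped
            break
        if i % every_n == 0:
            cv2.imwrite(os.path.join(out_dir, f"frame_{saved:06d}.png"), frame)
            saved += 1
        i += 1
    cap.release()
    return saved

if __name__ == "__main__":
    n = save_frames(rtsp_url("192.168.2.2", "video_stream_0"), "frames", every_n=5)
    print(f"saved {n} frames")
```

The same function works on a recorded Cockpit video file - pass the file path instead of the URL - which may be the more reliable route for post-processing.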

One more thing @ethanfaraday -

I’ve just learned that the latest version of Cockpit saves additional useful data to the EXIF data of captured snapshots - which can occur on a configurable interval - see the snapshot trigger control (select it instead of video record) and check the saved image file properties!
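A quick sketch for pulling that EXIF data back out of the saved snapshots with Pillow - which specific tags Cockpit actually writes is something to verify on your own images:

```python
def read_snapshot_exif(path):
    """Return {tag_name: value} for the EXIF fields in an image file."""
    from PIL import ExifTags, Image  # pip install Pillow

    with Image.open(path) as img:
        exif = img.getexif()
        return {ExifTags.TAGS.get(tag_id, tag_id): value
                for tag_id, value in exif.items()}

if __name__ == "__main__":
    for tag, value in read_snapshot_exif("snapshot.jpg").items():
        print(tag, "=", value)
```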