ROS integration with BlueROV2 Navigator

Hi,
I am currently trying to integrate ROS with BlueOS using the BlueOS ROS extension.
However, I am not able to read optical and acoustic data from the tilt camera and the Ping360, respectively.
If I understand correctly, BlueOS forwards this data via the UDP protocol, which is not supported by the BlueOS ROS extension.
Is there a way to deactivate this protocol without shutting down the BlueOS Docker container, or, alternatively, is it possible to read the UDP data from within the BlueOS ROS extension?
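For context, this is roughly what I have been trying on the onboard PC: since BlueOS bridges serial devices to UDP, I would expect each datagram to carry Blue Robotics ping-protocol frames. A minimal sketch in Python (the port 9092 and the idea that the bridge pushes datagrams without a prior request are assumptions on my part; the actual host/port come from the BlueOS Bridges configuration):

```python
import socket
import struct

def parse_ping_frame(buf):
    """Parse one Blue Robotics ping-protocol frame.

    Frame layout (little-endian):
      'B' 'R' | payload_len:u16 | message_id:u16 | src_id:u8 | dst_id:u8
      | payload | checksum:u16 (sum of all preceding bytes, mod 2**16)
    Returns (message_id, payload), or None if the buffer is not a valid frame.
    """
    if len(buf) < 10 or buf[:2] != b"BR":
        return None
    payload_len, message_id = struct.unpack_from("<HH", buf, 2)
    end = 8 + payload_len
    if len(buf) < end + 2:
        return None
    (checksum,) = struct.unpack_from("<H", buf, end)
    if checksum != sum(buf[:end]) & 0xFFFF:
        return None
    return message_id, buf[8:end]

def read_ping_udp(port=9092):
    # Assumed port: BlueOS exposes bridged serial devices on a UDP port
    # set in the Bridges page -- adjust to your setup. The Ping360 may also
    # require request messages before it streams data back.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        datagram, _addr = sock.recvfrom(4096)
        frame = parse_ping_frame(datagram)
        if frame is not None:
            message_id, payload = frame
            print(f"message {message_id}: {len(payload)} payload bytes")
```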

Moreover, I have a question about the new Sonoptix ECHO Multibeam Imaging Sonar. Its specifications report that “Sonar output is video only. Access to raw echo data is not supported”. I was wondering whether it is possible to access the sonar image through the ROS framework (e.g. for image-processing algorithms), or whether the output is intended only for visualization in the dedicated web application?

Hi Francesco!

First, unfortunately the ECHO only has video output available at this time. The manufacturer is working on an update that may allow some amount of raw data output, but it is not yet available.

As for the ROS extension, I’m not super familiar with it. I’ll see if anyone can provide you more information!

Hi Francesco - just to clarify - the ECHO video output is available both on the web interface and via the API, as a video stream that a visual pipeline may be able to run on.

Hi Antony, thank you for the clarifications. I still have a couple of questions.
Since the goal is to use the sonar images in real time for CNN-based AI algorithms, I would like to read the images (as an array of values) in real time directly on the PC onboard the BlueROV, via a serial or UDP/TCP connection. Is there anything available that allows the sonar images to be read via the API directly on the onboard PC? Additionally, would it be possible to interface with the sonar to request changes to its operating parameters, such as range and frequency?

Hi Francesco,

Right now the ROS extension is in its initial state. It provides the basics for ROS access and experimentation, but sadly it does not yet have camera or Ping integration, or any integration besides MAVLink.
I do hope that the community will help us improve it, or use it as a template / example to create their own ROS extensions.

For Ping360 integration, you can check this awesome driver created by Centrale Nantes Robotics.
https://wiki.ros.org/ping360_sonar
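If it helps: once that driver is running, the sonar image arrives as a standard sensor_msgs/Image, whose raw mono8 byte buffer is straightforward to unpack into an array of intensity values. A minimal sketch (the topic name /ping360_node/sonar/images is an assumption — check `rostopic list` and the driver documentation for the real one):

```python
def decode_mono8(data, width, height, step):
    """Unpack a sensor_msgs/Image 'mono8' buffer into a row-major 2D list.

    Each image row occupies `step` bytes in `data`; only the first `width`
    bytes of each row are pixel intensities (0-255).
    """
    return [list(data[r * step : r * step + width]) for r in range(height)]

def main():
    # ROS wiring is kept inside main() so the decode helper above stays
    # importable on machines without a ROS installation.
    import rospy
    from sensor_msgs.msg import Image

    def on_image(msg):
        pixels = decode_mono8(msg.data, msg.width, msg.height, msg.step)
        rospy.loginfo("sonar frame %dx%d, max intensity %d",
                      msg.width, msg.height,
                      max(max(row) for row in pixels))

    rospy.init_node("sonar_listener")
    # Topic name is an assumption -- adjust to whatever the driver publishes.
    rospy.Subscriber("/ping360_node/sonar/images", Image, on_image)
    rospy.spin()

if __name__ == "__main__":
    main()
```

From there, the 2D intensity list can be fed straight into a NumPy array or a CNN input tensor.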

Hi Patrick, thank you very much for your suggestions.
Can you provide some information about the Sonoptix ECHO Multibeam Imaging Sonar?
Would it be possible to process the output image as an array of values and send a request to change the operating parameters?

Hi Francesco,

According to the user guide, you should be able to interface with it using the REST API and access the video directly with WebRTC.
It’s necessary to investigate the API to see whether it returns the data over a WebSocket, or something else that is more ROS-friendly; otherwise it’ll be necessary to deal with the WebRTC video directly.