Hi. I am looking to implement SLAM on the BlueROV2 using the Ping360 sidescanning sonar, camera, and IMU, but I am a bit unsure where to start. I began by making my own service and building a socket connection to receive the sonar data as well as creating a MavlinkMessenger instance to get orientation and camera data. The code base is pretty large and I am having a bit of trouble figuring out what features I need to develop myself and what is already provided in the code base. Any suggestions or tips would be much appreciated.
Hi @andrewkwolek -
Welcome to the forums, that sounds like a cool project!
There is very little available to get you started on SLAM specifically within the Blue Robotics “code base.” The Ping360 is a scanning sonar, not a sidescan!
Generally, recording video and sonar data, then processing it on a topside computer to make whatever map you’re trying to create, is likely the best approach. You can record video within QGC or Cockpit, and PingViewer log files will contain the Ping360 data. Mavlink messenger won’t have any camera data for you, but if knowing the orientation of the camera is important, the vehicle orientation + camera tilt position should be able to provide it.
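As a trivial sketch of that last point (the function name and sign conventions here are made up for illustration, not from any particular API):

# Hypothetical sketch: camera pitch in the world frame is just the
# vehicle's pitch plus the camera tilt angle (degrees, nose-up positive).
def camera_pitch_deg(vehicle_pitch_deg: float, tilt_deg: float) -> float:
    return vehicle_pitch_deg + tilt_deg

# e.g. vehicle pitched 5 degrees up, camera tilted 30 degrees down:
print(camera_pitch_deg(5.0, -30.0))  # -> -25.0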
Is your goal autonomous navigation, or just creation of a map?
Hi @tony-white. Thanks for the response. I think for now I would like to just try to build a map, but a stretch goal would be to do it autonomously using a frontier based navigation algorithm. I’ve been trying to understand the code base for the last few weeks and although I think I’m getting a good grasp of things, it is still pretty daunting. I don’t have a ton of experience with networking and Docker so trying to understand that while applying it to the BlueOS code has been tough.
When you say “recording video and sonar data”, are you implying that the processing would be done after the flight? My hope was that it would build a map during flight.
Also, for testing purposes, what do you normally do? I’ve been building a docker container and running it on my laptop, but that abstracts away a lot of the other components. I also flashed BlueOS to a Raspberry Pi 4B, but I’m not sure how to create my own image to flash on it from my forked repo.
Thanks!
Hi @andrewkwolek -
Have you got any SLAM mapping software tools in mind? That seems like the more daunting code base to me!
Yes, I think it’s good to walk before you run. If you can process logged video and sonar data (that is time-sync’d of course) after the mission into a map, then the natural progression is moving faster and doing it in (near) real time! You could maybe capture such data with a group that operates the hardware in your area, and simulate for many hours to perfect your approach, before wow’ing them in the field with the result! I’m aware of several projects out there doing this general kind of thing, maybe they would be kind enough to share feedback with you.
The whole point of BlueOS is to make development easy! You can definitely get code working on your local machine, and if it’s in a Docker container, building that as an extension is a cinch. Check out the documentation and let us know where you get stuck!
I typically edit some Python code for the backend, and then do the HTML front-end - Vue 2 can make a nice, pretty interface easy! Once you have a GitHub Action set up to build this code as a Docker image, you can manually install it - the first time is a bit of a hassle to enter everything in, but updating is quick in the future: just click edit on the extension, then save, and it will pull the image from Docker Hub (assuming the one there is newer!)
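For a sense of scale, the Python backend can start as small as this sketch (FastAPI and port 9050 are just example choices here, not requirements):

# Minimal extension-backend sketch: a FastAPI app served by uvicorn.
from fastapi import FastAPI
import uvicorn

app = FastAPI()

@app.get("/status")
def status() -> dict:
    # The Vue front-end (or curl) can poll this endpoint.
    return {"status": "ok"}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=9050)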
What repo did you fork? I typically use this example to get started. You can see this simple PingViewer extension for reference (from this guide), but more relevant is this video recorder extension I’ve just been working on. As of today it’s working well enough to share - apologies for any rough edges! It will only work if you have deleted the default low-light USB camera stream from the Video Streams page, which would make driving your ROV tough (no camera feed in Cockpit), as both processes can’t access the stream at the same time. A more competent software engineer could likely get around this by writing code to interact with the Mavlink camera manager, but as this was developed for an autonomous, tether-free application, that wasn’t necessary.
In the meantime it sounds like you need to get a Navigator and camera hardware, if not a whole ROV to really get started?
@tony-white I wasn’t sure exactly how to start developing so I was looking at creating a new service within BlueOS or building an extension. I saw it is recommended to build a service within BlueOS and then make it into an extension after verifying it works within BlueOS first. The repo I have forked is the master BlueOS repo.
I do have a BlueROV2 that my Masters program purchased so I have the full hardware to work with. As far as SLAM goes, I was looking to just build something from the ground up as a learning experience. My only experience is using the NAV2 stack in ROS2, but that kind of abstracts away a lot of the low level details.
Hi @andrewkwolek -
Did you fork the entire BlueOS repo? That shouldn’t be necessary- there are lots of different examples for extensions, and you shouldn’t need to modify anything in BlueOS core…
Maybe ORB-SLAM, LSD-SLAM, or PTAM would be good software packages to investigate?
Graph-based SLAM tools include g2o, GTSAM, and Ceres Solver.
I hadn’t heard of any of these, but the large language models I asked had!
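For a flavor of the graph-based ones, here’s a tiny 2D pose-graph sketch using GTSAM’s Python bindings - all the poses and measurements are made up:

# Tiny 2D pose-graph SLAM sketch with GTSAM (pip install gtsam).
import gtsam
import numpy as np

graph = gtsam.NonlinearFactorGraph()
noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))

# Anchor the first pose at the origin.
graph.add(gtsam.PriorFactorPose2(1, gtsam.Pose2(0, 0, 0), noise))
# Odometry factors: move 1 m forward, twice.
graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(1, 0, 0), noise))
graph.add(gtsam.BetweenFactorPose2(2, 3, gtsam.Pose2(1, 0, 0), noise))

# Initial guesses, deliberately a bit off.
initial = gtsam.Values()
initial.insert(1, gtsam.Pose2(0.1, -0.1, 0.05))
initial.insert(2, gtsam.Pose2(1.2, 0.1, -0.05))
initial.insert(3, gtsam.Pose2(2.1, -0.1, 0.02))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result)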
@tony-white
Yeah, I forked the entire BlueOS repo. I definitely need to understand how to communicate with the blueos-core Docker container. I understand that the services are located at specified ports, but I am not entirely familiar with how to establish a connection to those ports to access the API, or with how my backend should communicate with the BlueOS backend to get the required data from Mavlink.
Hi Andrew -
BlueOS has different services for different things. The Mavlink camera manager handles streaming. MavlinkServer (new) or Router/MavP2P route mavlink messages. The Autopilot container holds ArduSub or ArduRover. Mavlink2Rest makes mavlink information available on a web interface. The available services page lists many of these, and the docs detail their function.
You shouldn’t need to build your own BlueOS version though, just extensions to run within it!
Got it. I’ve looked at all that and have a decent understanding. Is there a way to visualize the endpoints that are available when running the BlueOS container? I think that would be helpful for understanding everything that is available to work with.
Hi @andrewkwolek -
I’m not sure what you mean by visualizing the endpoints, are you trying to understand how the vehicle works by default? The table linked provides the port numbers and descriptions for each service… if you go to Available Services and click the link for Mavlink2Rest, you’ll see all the raw autopilot telemetry that is making it into the vehicle .BIN logs (found under Vehicle Logs) - this telemetry can also be accessed by your own code via this interface or pymavlink.
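For example, a request along these lines should pull the latest ATTITUDE message out of Mavlink2Rest from your own code - the host, port, and path are what I’d expect on a default install, so verify them against your vehicle’s service page:

# Sketch: fetch the latest ATTITUDE message from Mavlink2Rest.
# Host, port, and path are assumptions for a default BlueOS install.
import requests

url = "http://192.168.2.2:6040/mavlink/vehicles/1/components/1/messages/ATTITUDE"
msg = requests.get(url, timeout=2).json()
print(msg["message"]["roll"], msg["message"]["pitch"], msg["message"]["yaw"])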
I just meant being able to see all of the API endpoints for each service, but I see I can access that through the documentation when running BlueOS. I started writing the service and am working on getting all of the data from Mavlink.
Hey @tony-white!
I made some progress with the service. Mostly working on getting the data together right now. I was able to test the service with the robot and it is able to successfully retrieve data.
I do have a few questions regarding the Ping360. In the ping-python repo, there are a few examples for creating a ping instance, but it seems like this would be how you would set it up separately from BlueOS. Since it appears to be properly set up through BlueOS already, is there a way that I can just get the data through an API call or something? Maybe I could get it through the PingViewer application, but I worry about latency discrepancies if I’m retrieving data through too many calls.
One other thing. My SCALED_IMU2 data is receiving 0s on the acc and gyro readings even when I move the ROV, however the RAW_IMU data gives me nonzero readings on those values. Any idea why that may be the case?
You can check out what I have so far here.
Thanks!
Hi @andrewkwolek -
PingViewer can’t run onboard the vehicle, so it’s not likely a good approach for anything but logging data on a connected topside computer. This specific example should have the bones of what you need to ingest the data into your extension! Ping-python can be used separately from BlueOS, but there’s no reason it can’t run within an extension onboard - that’s kind of the whole idea! There have been quite a few posts on the forum working through parsing the Ping360 data if you get stuck.
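As a rough sketch of that ingestion (the UDP host and port are assumptions - the Ping service page in BlueOS shows the actual bridge for your device):

# Rough Ping360 polling sketch with ping-python
# (pip install bluerobotics-ping).
from brping import Ping360

ping = Ping360()
ping.connect_udp("192.168.2.2", 9092)  # port is an assumption - check BlueOS
if not ping.initialize():
    raise RuntimeError("Failed to initialize Ping360")

# Step through a full revolution (400 gradians) and collect intensities.
for angle in range(400):
    response = ping.transmitAngle(angle)
    if response is not None:
        intensities = bytearray(response.data)  # one byte per range sample
        # ...feed (angle, intensities) into your mapping pipeline...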
A good check for parameters is to navigate to Available Services and check out the link to Mavlink2Rest, as mentioned previously. SCALED_IMU2 seems to only hold x, y, z magnetometer data. It’s likely best to use AHRS2, as this contains the already filtered/processed orientation data rather than the raw values.
If you’re running BlueOS beta, you can switch to MavlinkServer (under Mavlink Endpoints, at the top) and have a much more powerful view of the data.
Thank you! This is very helpful. The reason I asked is because I was looking at the Mavlink documentation and the services documentation and it seemed that there should be acc and gyro data in the SCALED_IMU2 message so I was confused why there wasn’t.
@patrickelectric @tony-white I’ve been using the Mavlink2Rest tool in order to get data for GPS, IMU, etc. I’m seeing that the IMU data is only being updated at a frequency of 2Hz, which is pretty slow for implementing SLAM. Is there a way that I can change the update frequency to be faster, or connect directly to the mavlink server via UDP?
I will continue building the service with the current method of data retrieval, but I fear the update rate may be way too low to properly localize the ROV.
EDIT: I found the parameter that controls the frequency in autopilot parameters! That being said, my question still remains about a direct UDP connection. Thanks!
Hi Andrew-
Please share your code - it’s definitely possible to retrieve this data at vastly faster rates, but tough to say why you’re not achieving them without taking a look! The Mavlink2Rest page or the MavlinkServer page in Available Services will show you the default rate the messages are being updated at. The EKF loop runs at 400Hz I believe?
@patrickelectric may be able to comment on how to access data directly from MavlinkServer
It is, however, worth mentioning that integrating the motion data to determine position is not likely a viable approach regardless, as the IMU used produces too noisy a data stream.
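That said, if you need a particular message faster once you’ve found the stream-rate parameters, you can also request it directly - a pymavlink sketch, with the message ID and rate as examples only:

# Sketch: ask the autopilot to send RAW_IMU at 50 Hz via
# MAV_CMD_SET_MESSAGE_INTERVAL (the interval is in microseconds).
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

master.mav.command_long_send(
    master.target_system,
    master.target_component,
    mavutil.mavlink.MAV_CMD_SET_MESSAGE_INTERVAL,
    0,                                       # confirmation
    mavutil.mavlink.MAVLINK_MSG_ID_RAW_IMU,  # message ID to adjust
    20000,                                   # 20,000 us -> 50 Hz
    0, 0, 0, 0, 0,
)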
@tony-white This is how I am trying to establish the UDP connection. You can also see my repo here: GitHub - andrewkwolek/BlueOSSLAM
import asyncio
import os
import sys

from uvicorn import Config, Server

# MavlinkUDPProtocol, data_manager, app, DOCKER_HOST, and logger are
# defined elsewhere in the repo.

async def listen_udp():
    # Listen for UDP packets on this address and port
    listen_address = (DOCKER_HOST, 14550)
    loop = asyncio.get_running_loop()

    # Set up the UDP endpoint to listen for incoming datagrams;
    # create_datagram_endpoint returns a (transport, protocol) pair.
    transport, protocol = await loop.create_datagram_endpoint(
        # Use a lambda to pass data_manager to the protocol
        lambda: MavlinkUDPProtocol(data_manager),
        # Bind to DOCKER_HOST (e.g. 0.0.0.0 for all interfaces)
        local_addr=listen_address,
    )
    logger.info(f"Listening for UDP packets on {listen_address}")

    # The event loop handles the datagrams asynchronously;
    # block here forever so the endpoint stays open.
    await asyncio.Future()

async def start_services():
    # Run the UDP listener as a background task
    asyncio.create_task(listen_udp())
    # Run the uvicorn server in this coroutine
    config = Config(app=app, host="0.0.0.0", port=9050, log_config=None)
    server = Server(config)
    await server.serve()

if __name__ == "__main__":
    logger.debug("Starting SLAM.")
    if os.geteuid() != 0:
        logger.error(
            "You need root privileges to run this script.\n"
            "Please try again, this time using **sudo**. Exiting."
        )
        sys.exit(1)
    asyncio.run(start_services())
I ran the same code on the actual robot and it worked, so I believe there is some issue with the Docker container when I run it with SITL. I don’t believe the Mavlink data gets broadcast on port 14550 in SITL, because QGC doesn’t receive any data either when running the Docker container.
Hi,
You should be able to receive data from the surface computer if the IP is configured correctly; by default, BlueOS connects to a UDP server on 14550. You can also connect to the vehicle’s UDP server on 14550, but for that you should enable the GCS Server Link endpoint.
SITL only connects via TCP on port 5760 - for more info, check here.
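For example, with pymavlink the two connection strings would look roughly like this (IPs and ports assume the defaults):

# Connection sketch with pymavlink (pip install pymavlink).
from pymavlink import mavutil

# Real vehicle: listen for the UDP stream BlueOS sends to the topside.
vehicle = mavutil.mavlink_connection("udpin:0.0.0.0:14550")

# SITL: it only serves MAVLink over TCP on port 5760.
sitl = mavutil.mavlink_connection("tcp:127.0.0.1:5760")

vehicle.wait_heartbeat()
print("Heartbeat from system", vehicle.target_system)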
mavlink-server provides the same API as mavlink2rest, so it’s possible to receive data via UDP (raw mavlink) as with other routers, or directly via the websocket.
The websocket message frequency will be the same as the frequency via the UDP channel; in both cases it’ll be necessary to configure the stream rate parameter to get IMU data at higher frequencies.
For a minimal websocket example, you can check it here.
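A sketch of that approach, assuming the default port and a filter query parameter (the exact path can differ between versions, so verify it against your vehicle’s service page):

# Websocket sketch for mavlink2rest / mavlink-server
# (pip install websockets). URL path, port, and filter are assumptions.
import asyncio
import json
import websockets

async def main():
    uri = "ws://192.168.2.2:6040/ws/mavlink?filter=ATTITUDE"
    async with websockets.connect(uri) as ws:
        # Print each ATTITUDE message as it arrives.
        async for raw in ws:
            print(json.loads(raw))

asyncio.run(main())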
@patrickelectric I tried making a new set of endpoints for my extension using the Mavlink Endpoints service. I used the same IPs as GCS but changed the port from 14550 to 14555. Unfortunately, the connection is not working and my service is not getting any data. When I set it to port 14550 I can receive data, but then I can’t run QGC because it is listening on the same port.
EDIT: I am getting some strange behavior even when I run my service on port 14550. Sometimes I start it and it will receive data, and other times it won’t make the connection…