I’m a new user of the Blue Robotics Blue Boat platform, and we are working on integrating MOOS-IvP into our setup. MOOS-IvP provides a decision-making framework specialized for marine tasks, built on a lightweight publish-subscribe middleware, which we believe can greatly enhance our autonomous navigation capabilities.
Our specific goal is to integrate an OAK-D camera to perform object detection and avoidance. We have seen some discussions on the forum regarding the OAK-D camera extension for BlueOS, but we have already built an object avoidance application in MOOS with the OAK-D camera and would like to integrate it on the Blue Boat.
One potential solution I’m considering is running MOOS-IvP on a separate Raspberry Pi connected to the OAK-D camera for object detection. This setup could handle the object avoidance logic and send high-level commands (like speed and course) via UART to the Navigator, which would then control the boat. However, we haven’t fully explored this solution yet, and I’m unsure about any potential challenges or limitations.
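As a rough illustration of what I have in mind for the high-level command path (the `$AVCMD` sentence format and the avoidance rule here are made up for the sketch, not part of MOOS-IvP or the Navigator firmware):

```python
def avoidance_command(bearing_deg, range_m, cruise_speed=1.5, safe_range_m=10.0):
    """Return (speed_mps, course_offset_deg) for one detected obstacle."""
    if range_m > safe_range_m:
        return cruise_speed, 0.0            # obstacle far away: hold course
    # Slow down proportionally and steer away from the obstacle's side.
    speed = cruise_speed * range_m / safe_range_m
    offset = -30.0 if bearing_deg >= 0 else 30.0
    return speed, offset

def to_sentence(speed, course_offset):
    """Pack the command as an NMEA-style sentence with an XOR checksum,
    suitable for a UART link (message name $AVCMD is hypothetical)."""
    body = f"AVCMD,{speed:.2f},{course_offset:.1f}"
    checksum = 0
    for c in body:
        checksum ^= ord(c)
    return f"${body}*{checksum:02X}\r\n"

# Obstacle 5 m ahead, slightly to starboard: slow down and turn away
print(to_sentence(*avoidance_command(10.0, 5.0)))
```

The Pi running MOOS-IvP would write sentences like this to the serial port, and whatever listens on the Navigator side would translate them into actuator commands.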
Ideally, it would be great to run both MOOS-IvP and the OAK-D integration directly on the Blue Boat’s Raspberry Pi that’s already running BlueOS, so we don’t need an additional Raspberry Pi. However, I’m not sure how feasible this is regarding performance or compatibility.
I would appreciate any guidance, opinions, or examples of similar integrations. Thanks in advance for your help!
Hi @andycyl -
Welcome to the forums! I’ve heard of MOOS before, in the context of government projects and AUV control - it has been around quite a long time, right?
If you’re able to docker-ize your existing MOOS-based code, you could certainly deploy it within BlueOS as an extension. When set up properly, it should be able to access the Oak-D camera via USB, and communicate with the Autopilot via Mavlink2Rest or pymavlink.
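As a rough sketch of the pymavlink side (the UDP port, the MANUAL_CONTROL message choice, and the axis scaling here are assumptions for illustration - RC override or guided-mode setpoints are alternatives depending on your vehicle setup):

```python
def to_manual_control(forward, turn):
    """Map normalized forward/turn in [-1, 1] to MANUAL_CONTROL
    axis values in [-1000, 1000]."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return int(clamp(forward) * 1000), int(clamp(turn) * 1000)

def send_manual_control(forward, turn):
    """Connect to the autopilot and send one MANUAL_CONTROL message."""
    from pymavlink import mavutil  # imported here so the sketch loads without pymavlink
    # BlueOS exposes MAVLink over UDP; this endpoint is an assumption.
    master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
    master.wait_heartbeat()
    x, r = to_manual_control(forward, turn)
    # x = forward axis, r = yaw axis; z=500 as neutral throttle here
    master.mav.manual_control_send(master.target_system, x, 0, 500, r, 0)
```

For example, `send_manual_control(0.5, -0.2)` would request roughly half speed with a gentle turn.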
Hi @tony-white,
I’ve successfully installed the OAK-D Video Streams extension developed by @williangalvani on my BlueOS system. However, I’m encountering some issues, and I’m not sure if I’m starting the OAK-D camera correctly. Here are the details of my setup and the issues I’m facing:
OAK-D Detection:
I ran the command lsusb, and the OAK-D camera is correctly detected:
Bus 002 Device 006: ID 03e7:f63b Intel Myriad VPU [Movidius Neural Compute Stick]
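For reference, a quick cross-check from Python (assuming the depthai package is installed, as the extension uses it) can also list connected OAK devices:

```python
def list_oak_devices():
    """Return the MX IDs of connected OAK devices, or [] if the
    depthai package is not available in this environment."""
    try:
        import depthai as dai
    except ImportError:
        return []
    return [d.getMxId() for d in dai.Device.getAllAvailableDevices()]

print(list_oak_devices())
```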
Video Stream Errors:
When I go to the Video Streams section in BlueOS, I encounter several errors.
Clarification Request:
Should the video stream appear under the Redirect section in BlueOS if everything is set up correctly?
If not, where exactly should the stream be accessible, and what’s the proper workflow to start the OAK-D camera within BlueOS?
Everything looks correct there! The video streams should be available in Cockpit, you’ll need to add a new video stream and select either the Oak RGB or Stereo Disparity. You can see how that’s done in this recent video, here!
Thank you for your previous instructions! I’m happy to confirm that I’ve successfully streamed the video to Cockpit.
Now, I’m working on enhancing the OAK-D extension’s functionality to run an object detection model and overlay bounding boxes directly onto the video stream. My goal is to stream this processed video to Cockpit, QGC, or a web interface just for visualization.
I have a few questions and would appreciate your insights:
Latency:
I’ve noticed that the RGB video stream to Cockpit experiences an approximate 2-second latency. Is this behavior expected? If not, what strategies do you suggest for reducing the latency?
Video Encoding with Overlays:
Since the current RTSP server in the OAK-D extension uses H.264 encoding, I’m concerned that overlaying bounding boxes may require switching the format to MJPEG (as it’s frame-based). Here are my key questions:
Is it possible to retain H.264 encoding with bounding box overlays, so the current streaming logic can be kept?
If switching to MJPEG becomes necessary, do Cockpit or QGC support this format?
I look forward to your advice on the best approach.
Thank you again for your support!
I assume this has to do with the encoding step; BlueOS and Cockpit are already tuned to keep latency as low as we can. I also do not recall having that much latency in my tests, but I might be misremembering.
Again, not super familiar with the oak-d environment, but I assume you should be able to add the overlays before encoding, no?
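Something like this, in principle - draw the boxes into the raw frame and then hand the annotated frame to the existing H.264 pipeline unchanged. The frame size, coordinates, and `draw_box` helper below are made up; in the real pipeline you’d draw on the numpy frame (e.g. with `cv2.rectangle`) before it reaches the encoder:

```python
def draw_box(frame, x1, y1, x2, y2, color=(0, 255, 0)):
    """Draw a 1-pixel rectangle outline on a frame (rows of RGB tuples)
    in place, then return it."""
    for x in range(x1, x2 + 1):
        frame[y1][x] = color
        frame[y2][x] = color
    for y in range(y1, y2 + 1):
        frame[y][x1] = color
        frame[y][x2] = color
    return frame

# Toy 640x480 black frame standing in for the camera frame
frame = [[(0, 0, 0)] * 640 for _ in range(480)]
draw_box(frame, 100, 50, 300, 200)
# frame now carries the overlay and can be H.264-encoded as before
```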
Alternatively you could send it in a different data channel, maybe? Then overlay on top of the video with a custom cockpit widget.
I think QGC supports MPEG, but not JPEG. I’ve played “HTTP Motion JPEG” in Cockpit before by using an Image widget pointing to the correct URL, maybe that is an option?
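As a sketch of that Motion JPEG option - a multipart/x-mixed-replace stream that an Image widget or a browser can point at; the boundary name and the framing details here are illustrative:

```python
BOUNDARY = b"frame"

def mjpeg_chunk(jpeg_bytes):
    """Wrap one encoded JPEG frame as a multipart/x-mixed-replace chunk."""
    headers = (b"--" + BOUNDARY + b"\r\n"
               b"Content-Type: image/jpeg\r\n"
               b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n\r\n")
    return headers + jpeg_bytes + b"\r\n"

# The HTTP response itself would carry
#   Content-Type: multipart/x-mixed-replace; boundary=frame
# and then write mjpeg_chunk(frame) for every frame produced.
```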