No, we used the code out of the box. We tested running it on a Linux machine with Ubuntu 22.04. The Ubuntu machine has plenty of resources but has very bad latency; we were getting about 1 frame every 4-5 seconds. I will add that we also tried changing the code after hitting latency problems with the given code, and couldn’t get that to perform any better.
Hi @ssicari,
In this case, it appears to be a problem with the libav installed on your machine. I would recommend replacing avdec_h264 with another H264 decoder.
On my machine I have the following:
libav: avdec_h264: libav H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 decoder
nvcodec: nvh264dec: NVDEC H.264 Decoder
vulkan: vulkanh264dec: Vulkan H.264 decoder
On my machine, libav works without problems.
You can test the pipeline using the following command:
gst-launch-1.0 udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! avdec_h264 ! queue ! autovideosink
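If libav keeps misbehaving on your machine and you have an NVIDIA GPU with the GStreamer nvcodec plugin installed, a sketch of the same pipeline with the decoder swapped for nvh264dec would be (untested on your setup, so adjust as needed):
# requires an NVIDIA GPU and the GStreamer nvcodec plugin
gst-launch-1.0 udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! h264parse ! nvh264dec ! queue ! videoconvert ! autovideosink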
Thank you, that works now. Weirdly enough, the same code that was giving me problems also works now without being changed. Perhaps it was related to our camera having problems due to the resistors being pushed against the camera mount. I fixed that last week and it all seems to be working now.
Hi @qimg -
Try that with an H264 stream, not MJPG!
Is BlueOS not able to receive RTSP with MJPG? I want to add a video URL in Cockpit, can that work?
Hi @qimg -
BlueOS and Cockpit do not support MJPG video currently. An H264 stream should work just fine though!
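If you want to sanity-check the H264 path without a camera, a sketch like this sends a test pattern as RTP/H264 to port 5600, which QGC can pick up as a UDP video source (the IP is only a placeholder; use your topside computer's address):
# 192.168.2.1 is a placeholder - point it at your topside computer
gst-launch-1.0 videotestsrc is-live=true ! video/x-raw,width=1280,height=720,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000 key-int-max=30 ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.2.1 port=5600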
So, I am also trying to integrate an MJPG stream.
I’m actually trying to troubleshoot another issue, but I’m currently away from our ROV. My goal is to integrate an off-the-shelf camera with a ROV clone (just the Pi4+Navigator). In my infinite wisdom, I left the spare BR low-light camera in our lab and won’t have it for the next month of testing. Thus, I assumed any USB camera would suffice.
When I plug the camera into the USB 3.0 port on the Pi, I only see two options for the stream encoding.
MJPG allows 30fps, while YUYV only allows 1fps for me. It appears that I can create a UDP stream: sometimes the stream is created and I can view it in VLC using the SDP link, but opening the plain UDP address in VLC shows nothing.
More often than not, after restarting the Pi, the stream works for the first minute or two (in VLC), then cuts out and is no longer accessible from VLC. At no point is the stream visible in QGC or Cockpit. Also, the BlueOS video streams page often deletes the video stream automatically, or shows a timeout or “no video streams available” error.
My question: is the MJPG-encoded camera able to be displayed on QGC? Based on the fact I can sometimes see it on VLC, I assume video is being sent, but I can’t quite tell.
It does work with QGC if you use RTSP instead of UDP for that. Be aware, though, that a 30fps 1080p MJPG stream can use up to around 50Mbps of bandwidth.
Here’s an example, using our Low-Light HD USB Camera:
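The rough shape of it: create the camera’s MJPG stream in the BlueOS video streams page with an RTSP endpoint rather than UDP, then point the QGC video source at that RTSP URL. To check the stream outside of QGC, a sketch like this should play it (the IP, port, and path below are placeholders; copy the actual RTSP URL that BlueOS reports for your stream):
# replace the URL with the RTSP URL shown by BlueOS for your stream
gst-launch-1.0 rtspsrc location=rtsp://192.168.2.2:8554/video_stream_0 latency=0 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink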