Interfacing Cameras with Companion Computer

Hi there, Eliot! Our team is currently setting up the companion computer, and we have also successfully routed our Arduino to the topside computer for our sensors. We are now trying a USB camera on the companion, and we're quite stumped. The web interface System tab shows our USB camera, and it also appears in the active services:

However, the Camera tab does not show our camera in the dropdown:

Do you know what seems to be the problem with this?

Additionally, do you recommend using a Raspberry Pi camera or a USB camera?
Lastly, is QGroundControl the place where we would watch the stream? We are currently confused as to where we could watch the video stream.
Thanks so much!

Hi @yvesyves,

Auto Streaming (H264)

Companion is set up to auto-stream the first H264-encoded video source that it detects (an encoding that’s efficient for streaming). The expected H264 camera output is shown in our Software Components diagram :slight_smile:

Non-H264 Cameras

By the looks of things the HP KQ246AA doesn’t support H264 output, which means to get it to stream video to the topside would require encoding to a streaming format on the Raspberry Pi. That’s technically possible to do, but it’s quite resource-heavy so you likely would have quite a bit of latency, wouldn’t be able to achieve the full framerate, and the performance of the rest of the companion software would likely also suffer.
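
To make the tradeoff concrete, a software transcode on the Pi would look something like the sketch below. This is only an illustration, not a supported Companion configuration: the device path, resolution, and the topside IP placeholder are assumptions, and `x264enc` runs entirely on the Pi's CPU, which is exactly the load problem described above.

```shell
# Hypothetical sketch: transcode an MJPEG camera to H264 in software on the Pi.
# TOPSIDE_IP, /dev/video0, and 1280x720@30 are assumptions for illustration.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! image/jpeg,width=1280,height=720,framerate=30/1 \
    ! jpegdec ! videoconvert \
    ! x264enc tune=zerolatency speed-preset=ultrafast \
    ! rtph264pay config-interval=10 pt=96 \
    ! udpsink host=TOPSIDE_IP port=5600
```

Expect high CPU usage and a reduced framerate with a pipeline like this, for the reasons above.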

Available Options

Because of those limitations, companion isn’t set up to automatically stream from non-H264 cameras. Our Low-Light HD USB Camera “has an onboard H.264 compression chip so that all of the video compression is done onboard and doesn’t place much load on the main computer.” Other H264 cameras should also work well automatically, and Raspberry Pi cameras can also be used (there’s some special hardware on the Raspberry Pi that’s specifically able to do the encoding efficiently for them - unfortunately it can’t be used for other cameras).

Other Streaming Formats

Some cameras have other streaming-compatible output formats available, such as MJPEG or H265. From a network data transfer standpoint those should also be fine for companion to use, but some modifications/additions to the current companion software would be necessary to stream them.
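
As a rough illustration of the MJPEG case, a pipeline along these lines could send the camera's JPEG frames over RTP without any transcoding (so the Pi stays lightly loaded). The device path, resolution, and topside IP placeholder are assumptions, and as noted above the current companion software wouldn't set this up or display it automatically:

```shell
# Hypothetical sketch: stream MJPEG directly over RTP, no re-encoding on the Pi.
# TOPSIDE_IP and /dev/video0 are assumptions for illustration.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! image/jpeg,width=1280,height=720,framerate=30/1 \
    ! jpegparse ! rtpjpegpay \
    ! udpsink host=TOPSIDE_IP port=5600
```

The receiving side would then need a matching JPEG depayloader/decoder rather than the default H264 one.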

IP Cameras (Bypass Companion)

Yet another alternative is using IP cameras, which connect directly to an ethernet network so companion doesn’t even need to know they exist. The easiest way to connect an IP camera is via a network switch, such as our recently released Ethernet Switch, which mounts directly over the Fathom-X. Other network switches can of course also be used, or routers, but they tend to be larger and/or generally more difficult to set up in an ROV enclosure.

Accessing the Stream

When the companion computer is streaming it does so over UDP to the topside computer's IP at port 5600, as shown in the Camera page in your second screenshot. QGroundControl is set to look for a stream at that port by default, but if you want you can instead access an sdp file that can be opened with VideoLAN (VLC), or you can view it through any other application that's capable of receiving and displaying an H264 stream over UDP (see here for a list of a few options).
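
For reference, one other way to view the default stream is with gstreamer itself from the topside command line. This is a sketch assuming gstreamer (with the libav plugins for `avdec_h264`) is installed on the topside computer:

```shell
# Hypothetical sketch: receive and display the default H264/RTP stream on the
# topside with gstreamer instead of QGC or VLC.
gst-launch-1.0 udpsrc port=5600 \
    ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 \
    ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
```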

As a side note, is there a reason this was posted privately? While I understand some information is sensitive and shouldn’t/can’t be shared with the general community, your post doesn’t seem to contain anything like that, and others who are interested in the topic could likely benefit from both the question and response. If there’s no particular reason that it’s private please let me know and I’ll convert it to a public post :slight_smile:

Hi Eliot, again thanks for this!

I think that would be the case. We'll try to find another camera with these capabilities. I will also look into the low-light camera that you have linked.

I see. We'll be looking into this as soon as we acquire the proper camera.

Apologies for this. It slipped my mind when I was about to ask the question. Yes, this may be converted to a public post. Again, I am always appreciative of your immediate response to our questions. Thank you very much and have a great day!

Great, thanks!

I’ve made it public, and added some section headings to my response, along with a link to a list of a few known video receiving softwares from a previous post :slight_smile:

No worries - glad I was able to help :slight_smile:

Thanks - you too!

Hi there, Eliot! We have since acquired a new camera to try. Unfortunately, the issue is still the same, with the Camera tab not being able to detect the camera, although the System tab shows the video device and video as an active service. I suspect that the camera we bought still doesn't support H.264. However, we have seen from the forums that the feed may be converted from one format to H.264, as shown here. We have tried the code, and while we are not sure if it works, it is definitely better than our previous attempts, which only showed the pipeline either erroring out or closing. The code that we used is:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,type=video,framerate=30/1 ! jpegdec ! videoscale ! videoconvert ! x264enc tune=zerolatency ! rtph264pay config-interval=10 pt=96 ! udpsink host= port=5600

and produced an output like this (although this shows only the start and the end since it is too long):

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = "image/jpeg\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ type\=\(string\
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "image/jpeg\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ type\=\(s
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)1:4:0:0\,\ framerate\=\(fra
Redistribute latency...
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 1874450511
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 7397

We also tried to do this through VLC and created an .sdp file which contains the following code:

m=video 5600 RTP/AVP 96
c=IN IP4
a=rtpmap:96 H264/90000

However, VLC just opens, loads, and does nothing. We are currently trying to watch the stream from the camera on the companion computer from the topside. Do you know what could be the problem with what we did? Thanks!

Generally the available output encodings are provided in the marketing materials of the camera (e.g. on the webpage if you bought it online). If it doesn't offer an H264-encoded output then it won't work by default with Companion.
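
If the marketing materials are unclear, you can also ask the camera itself. A sketch, assuming the camera is a standard V4L2 device at /dev/video0 and the `v4l-utils` package is installed on the Pi:

```shell
# List the pixel formats, resolutions, and framerates the camera reports.
# Look for "H264" (works with Companion by default) or "MJPG" entries.
v4l2-ctl --device=/dev/video0 --list-formats-ext
```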

While it’s possible to use gstreamer to read in one encoding and convert to another one, that decoding and encoding process then needs to happen on the Raspberry Pi, which uses up significant CPU, and generally has quite poor output framerates.

That command tries to get a 720p30 JPEG-encoded stream from the device at /dev/video0, decodes it, re-encodes it with H264, and then sends it to whatever device has the IP address given to udpsink. If that address matches your topside computer then I assume it should at least kind of work, provided your camera supports 720p30 JPEG-encoded output and is the device at /dev/video0, although there may be other gstreamer details that I'm not aware of that could stop it from working as expected.

I would also note that our companion computer repeatedly tries to connect to cameras if it can’t find a valid one, so you may need to close the video screen session (screen -S video -X quit) if you’re not overwriting/using it to run your alternative code.

VLC can only display a stream that it receives - the same applies to gstreamer. If you're using the default .sdp settings then you should expect VLC and gstreamer to be pretty much equivalent (at least in terms of showing the stream if it exists) :slight_smile:

Hi Eliot. What you sent has been noted. We'll look into it again and find more appropriate hardware, and see how we can make it work. Thanks again for the help!

Hi there, Eliot! For now we are just trying to get a video stream to the topside, although not necessarily in QGroundControl. GStreamer has already been installed on the topside, as well as the plugins required for OBS Studio. For sending the video stream, this is the command that we used on the companion:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! "image/jpeg,width=800, height=600,framerate=30/1" ! rtpjpegpay ! udpsink clients=

and we were able to get this output from the terminal:

The LED on our camera has also lit up, giving us a sign that the camera is working.

On the receiver side, we have used this code on the OBS plugin Gstreamer Source:
udpsrc port=5600 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! video

However, there is no video output in OBS Studio after clicking Apply or OK, and we are currently stumped as to why.

On a side note, we have also tried the conversion of MJPEG to H264 and have seen the camera's LED light up, as well as an increase in the Raspberry Pi's CPU usage. However, upon starting QGroundControl to check, we were met by errors such as these:
Would you have any idea as to why this has happened?

We have also tried another pipeline which looks like it plays a little nicer:

gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=800,height=600,framerate=30/1 ! jpegparse ! rtpjpegpay ! udpsink host= port=5600

and had a better looking output in the Raspberry Pi terminal:

On the receiving end, this is what we used:

gst-launch-1.0 udpsrc address= port=5600 ! application/x-rtp, encoding-name=JPEG,payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink

However, there is still no output video or window displayed, and this is all we get from the cmd:

Any help provided is highly appreciated. Thanks!

I’d recommend using a different port (e.g. 5610) just to make sure there aren’t any conflict issues with companion’s default streaming functionality or QGC’s receiving.

Better to try things one at a time/in isolation. First try receiving with gstreamer from the command line (e.g. use the gst-launch-1.0 program and output to autovideosink) to confirm whether your pipeline is correct and that the stream is reaching your computer, and separately check that obs-gstreamer is working properly with a test pipeline (e.g. videotestsrc ! video).
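
As a sketch of that isolation process (the port, topside IP placeholder, and resolution are assumptions, and this removes the camera from the equation entirely by using videotestsrc):

```shell
# Step 1 (topside): confirm local gstreamer display works at all.
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink

# Step 2 (topside): listen on the chosen port with caps matching the sender.
# rtpjpegpay uses payload type 26 by default.
gst-launch-1.0 udpsrc port=5610 \
    ! application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,payload=26 \
    ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink

# Step 3 (companion): send a synthetic JPEG stream to the topside.
# TOPSIDE_IP is an assumption - substitute your topside computer's address.
gst-launch-1.0 videotestsrc \
    ! video/x-raw,width=800,height=600,framerate=30/1 \
    ! jpegenc ! rtpjpegpay ! udpsink host=TOPSIDE_IP port=5610
```

If the test pattern shows up in step 2's window, the network path and receive pipeline are fine and the problem is isolated to the camera source or the OBS plugin.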

When I was working with obs-gstreamer previously, I found that it uses a different version of gstreamer to QGC (MinGW instead of MSVC), which can cause some compatibility issues if you're building both. I'm not sure if you're also building QGC, but given it's showing errors about a MinGW gstreamer, this is potentially a related issue.

Thanks again, Eliot! We’ll try what you suggested and come back to you for updates. :grinning:

Hi, Eliot! I’m happy to tell you that it works now! Thanks for your inputs, they really helped a lot to make it work!

Great to hear! :smiley:

Glad I was able to help :slight_smile:
