How to stream another camera's video

Hello, I want to stream a USB camera simultaneously with the Pi camera. I know that isn't possible in QGroundControl; I just want to stream it via OpenCV, like the example in the ArduSub documentation, so I can do some image processing on it.
I think this is more of a gstreamer question, but I came here to see if there is any solution besides writing a new script to stream from another camera.
Thanks in advance


I would like to see a walkthrough for this also. I use Ethernet cameras and they work great, but I need a second USB camera.

There isn’t a plug-and-play method of adding an extra camera stream, but it’s also not crazily difficult if you’ve done some programming before, and especially not if you’ve worked with gstreamer before. The following assumes you’re using a non-networked camera (e.g. a USB camera or something using the CSI bus (e.g. RPi camera)).

Getting an H264 Stream

Blue Robotics USB cameras already come with an h264-encoded stream as an access option, so if you’re using one of them or another similar camera then you can skip this step. RPi cameras are a bit of an outlier because the camera itself doesn’t send h264, but the RPi is set up to provide an h264 interface to it so it can be used as though it does. The only thing to note there is that the RPi has to work a bit harder when using the RPi camera, which might slow some things down a bit, although for the most part it shouldn’t be an issue because most of those extra operations happen on the GPU.

All other cameras will need to have their stream converted into an h264 stream by gstreamer, which is slow and not recommended - better to use a different camera if possible.
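
If you do end up needing to software-encode a non-h264 camera, a pipeline along these lines can do it. This is only a sketch - the device path, resolution, framerate, and x264enc settings are placeholder assumptions, and you'll likely need to keep them low for the RPi's CPU to keep up:

  gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=15/1 ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1000 ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=5601

On an RPi it may also be worth trying a hardware encoder element (e.g. omxh264enc or v4l2h264enc, depending on your OS and gstreamer build) in place of x264enc, if one is available.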

Forwarding the Stream over UDP (Gstreamer)

Getting the Device ID

I’d suggest using gst-inspect-1.0 or gst-device-monitor-1.0 to find the device id of the cameras you’ve connected. This lets you tell gstreamer which device you’re trying to connect to.
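
For example (the /dev/video2 path below is just an illustration - substitute whichever ids show up on your system, and note that v4l2-ctl comes from the v4l-utils package):

  gst-device-monitor-1.0 Video/Source
  v4l2-ctl --list-devices
  v4l2-ctl --list-formats-ext -d /dev/video2   # check whether the device offers an H264 format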

Note that each camera often appears as a pair of ids, and audio is often set up as a video device when available. To get the correct ids:

  1. Plug in any cameras/audio devices that will be connected during operation
  2. Turn on the ROV/RPi
  3. Check the available ids and write down the ones with an h264 option
  4. Once the pipeline is set up (next section), create your parameter files to specify the relevant id, and swap the ids as required to get the cameras aligned to the desired ports

Setting up the pipeline

Existing pipeline

It’s relevant here to look at how the Blue Robotics companion computer sets up the existing pipeline.

The file .companion.rc is run on startup. Video is started on line 18 (if relevant, audio is started on line 22). Line 18 starts a ‘screen session’ (basically opens a new terminal and gives it a name so you can access it again later), and runs a script called streamer.py, which starts the video stream and monitors it - restarting the stream at most 5 seconds after it fails. Note that stream failures are rare, so most of the time this is just chilling and making sure the stream is still available.
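
As a side note, standard screen commands let you inspect those sessions on the RPi if you're debugging:

  screen -ls       # list the named screen sessions
  screen -r NAME   # attach to the session called NAME (Ctrl-A then D detaches again)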

To actually start the video stream, streamer.py runs a bash script start_video.sh, using either parameters that are passed in as command-line arguments, or by reading them in from a file vidformat.param. start_video.sh is basically set up to

  1. parse the input parameters
  2. check if those parameters are for a valid h264-encoded camera
  3. if not, try to find a valid h264-encoded camera
  4. start a gstreamer stream with the specified camera and parameters (an example of such a pipeline is below)
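
The pipeline that step 4 ends up launching has roughly the following shape (a sketch - the capitalised $PLACEHOLDERS stand in for the parsed parameters, and the exact elements may differ between companion versions):

  gst-launch-1.0 -v v4l2src device=$DEVICE do-timestamp=true ! video/x-h264, width=$WIDTH, height=$HEIGHT, framerate=$FPS/1 ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=5600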

Your pipeline

To set up your own pipeline, you can either piggy-back off the existing scripts with modifications to handle your extra camera(s), or copy the relevant/desired components of them and make that run through .companion.rc. The main important factor is making sure the different pipelines can’t get confused with each other. You’ll want a separate parameter file for each camera, and you’ll need to make sure all streams go to a unique port so they’re not competing for the same receiver on the other end.
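
As a concrete example, if the main stream is going to the default port 5600, a second h264-capable camera can be sent with essentially the same pipeline but a different device id and a unique port (the device path and port here are just examples - use whatever ids you found earlier and a port your receiver will listen on):

  gst-launch-1.0 -v v4l2src device=/dev/video4 do-timestamp=true ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=5601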

One way of doing the piggy-back approach replaces the file gstreamer2.param in the home directory (/home/pi/) with gstreamer2_BRF.param for the Front camera and gstreamer2_BRT.param for the Top camera (both cameras were Blue Robotics (BR) cameras). It has the downside that if one stream fails then both/all of them are restarted, but it was simpler to program at the time than properly handling individual streams in a scalable manner would have been (although if you implement that feature it’d be great to make a pull request to the Blue Robotics companion repo). Given the low frequency of stream failures it likely isn’t much of a problem.

EDIT: there’s now another approach covered here, which keeps the cameras separated, and makes it easier to add additional streams.

Receiving the stream

QGroundControl is currently only set up to receive a single video stream. For adding one or more extra cameras, the following are some possible ways of receiving the incoming UDP stream(s):

  • obs-gstreamer
    general instructions
    1. install Open Broadcast Studio (OBS)
    2. install gstreamer
      • read the prebuilt section of the obs-gstreamer README first - it provides some useful, simpler install links
      • read the normal gstreamer install instructions for extra requirements like updating PATH environment variable
    3. install the obs-gstreamer plugin
      1. download the latest release (.zip) from the releases
      2. move the plugin file to the obs-studio plugins folder, e.g. C:\Program Files\obs-studio\obs-plugins\64bit\
    4. Open OBS and add a Gstreamer Source, and use udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! video. for an h264 stream on port 5600 (see here for an option with audio)
  • OpenCV with ffmpeg/gstreamer (Python, C++, Java, etc.) - see the sketch after this list
  • VideoLan (VLC), possibly with a VLC Mosaic
  • gstreamer command-line interface
  • ffmpeg command-line interface
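
For the OpenCV option, a minimal receiving sketch in Python looks something like the following. It assumes your OpenCV build has gstreamer support, and the port and caps need to match whatever you used on the sending side:

  import cv2

  # Receive the RTP/h264 stream on UDP port 5600 and hand decoded BGR frames to OpenCV
  pipeline = (
      "udpsrc port=5600 ! application/x-rtp, payload=96 "
      "! rtph264depay ! h264parse ! avdec_h264 "
      "! videoconvert ! video/x-raw, format=BGR ! appsink drop=true sync=false"
  )

  cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
  if not cap.isOpened():
      raise RuntimeError("Failed to open stream - check gstreamer support and the port number")

  while True:
      ok, frame = cap.read()
      if not ok:
          break
      # ... do your image processing on 'frame' here ...
      cv2.imshow("camera", frame)
      if cv2.waitKey(1) == 27:  # Esc to quit
          break

  cap.release()
  cv2.destroyAllWindows()

For a quick check without writing any code, the gstreamer command-line equivalent is gst-launch-1.0 udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink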

Well, I streamed two cameras at the same time with gstreamer, using a command that I found in the script start_video.sh.
I streamed the RPi camera with the script, and the other one with the gstreamer command: gst-launch-1.0 -v v4l2src device=/dev/video4 do-timestamp=true ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=2001
But the Raspberry Pi can’t handle streaming both cameras simultaneously - the quality of both cameras quickly decreased to the degree that I can’t see anything from them, whereas the quality is good when it is just one camera.
I think this is because the RPi encodes the video before sending, so I’m looking into whether it’s possible to stream raw video, so the RPi can stream both cameras without a problem.
My cameras are a USB low-light digital camera and an Arducam (CSI port).
Thank you very much for your answer.

H264 is a particularly efficient format for streaming, so streaming ‘raw video’ would unfortunately be significantly worse/slower, even factoring in the regained time from not having to perform encoding. I’d suggest you try to

  • use the companion web interface to check the network speed, without the RPi camera connected - it should be >50Mbps, and if it’s lower than that then you might have issues with your tether
  • reduce the framerate or resolution of one or both cameras (reductions on the RPi camera should have a larger effect) - see the example after this list
  • swap out the RPi camera for another Blue Robotics camera
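
As an example of the framerate reduction, it's just a change to the caps in the gst-launch command you posted (the values here are arbitrary - pick ones your camera actually supports, which v4l2-ctl --list-formats-ext will tell you):

  gst-launch-1.0 -v v4l2src device=/dev/video4 do-timestamp=true ! video/x-h264, width=640, height=480, framerate=15/1 ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=2001

For the RPi camera the equivalent change would go through its parameter file (e.g. vidformat.param) rather than the command line.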

Both download and upload are above 80 Mbps.
Will try the other solutions, thank you very much.

Any updates on this “issue” now in 2023, regarding multiple video streams at once in QGroundControl?

Unfortunately back then I didn’t reach a solution for this, so we settled for one camera. We were using an RPi 3B if I remember correctly, so I don’t know if the current RPi 4 or 5 would be capable of this or not. You should look into the Jetson Nano as well, if it is affordable.

Hi @WilliamDomben, welcome to the forum! :slight_smile:

QGroundControl still only supports viewing one camera at a time, although if you have multiple cameras set up in BlueOS then it will at least allow you to switch between them (with the caveat that switching cameras also ends any active video recording).

This isn’t something Blue Robotics staff are likely to work on, because our control station development efforts have shifted to Cockpit, which already has support for multiple video streams (as long as you have sufficient networking bandwidth to send them to the surface). You’re welcome to try out Cockpit if you want to, although be aware it’s not yet officially released, so support will be limited, and there may be some missing features.

Something like this may also be of interest to consider.


Thank you for the reply! Currently got 4 cameras up and running with the Cockpit Extension, works very well.

William

