Hello, I want to stream a USB camera simultaneously with the Pi camera. I know that isn't possible in QGroundControl; I just want to stream it via OpenCV, like the example in the ArduSub documentation, so I can do some image processing on it.
I think this is more of a gstreamer question, but I came here to see if there is a solution besides writing a new script to stream from the other camera.
Thanks in advance.
I would like to see a walk-through for this as well. I need a second USB camera. I use Ethernet cameras, which work great, but I need a USB camera too.
There isn't a plug-and-play method of adding an extra camera stream, but it's also not crazily difficult if you've done some programming before, and especially not if you've worked with gstreamer before. The following assumes you're using a non-networked camera, e.g. a USB camera or something using the CSI bus (like the RPi camera).
Getting a H264 Stream
Blue Robotics USB cameras already come with a h264-encoded stream as an access option, so if you're using one of them or another similar camera then you can skip this step. RPi cameras are a bit of an outlier because the camera itself doesn't send h264, but the RPi is set up to provide a h264 interface to it so it can be used as though it does. The only thing to note there is that the RPi has to work a bit harder when using the RPi camera, which might slow some things down a bit, although for the most part it shouldn't be an issue because most of those extra operations happen on the GPU.
All other cameras will need to have their stream converted into a h264 stream by gstreamer, which is slow and not recommended - better to use a different camera if possible.
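If you do end up needing that conversion, it's just extra encoding elements in the gstreamer pipeline. As a rough sketch only (the device path, host, port, and x264enc settings below are illustrative placeholders, not tuned values):

```python
# Illustrative only: software-encode a raw (non-h264) camera to h264 and send it
# over UDP. Device path, host, port, and encoder settings are placeholders, and
# the encoding has a noticeable CPU cost on a RPi.
import subprocess

pipeline = (
    "v4l2src device=/dev/video0 "
    "! video/x-raw,width=640,height=480,framerate=30/1 "
    "! videoconvert "
    "! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1500 "
    "! h264parse ! rtph264pay config-interval=10 pt=96 "
    "! udpsink host=192.168.2.1 port=5601"
)
subprocess.run(["gst-launch-1.0", "-v", *pipeline.split()])
```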
Forwarding the Stream over UDP (Gstreamer)
Getting the Device ID
I'd suggest using gst-inspect-1.0 or gst-device-monitor-1.0 to find the device ids of the cameras you've connected. This lets you tell gstreamer which device you're trying to connect to.
Note that each camera often appears as a pair of ids, and audio is often set up as a video device when available. To get the correct ids (there's also a short script sketch after this list):
- Plug in any cameras/audio devices that will be connected during operation
- Turn on the ROV/RPi
- Check the available ids and write down the ones with a h264 option
- Once the pipeline is set up (next section), create your parameter files to specify the relevant id, and swap the ids as required to get the cameras aligned to the desired ports
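As an example, here's a minimal sketch (in Python, shelling out to gst-device-monitor-1.0) of picking out the devices that advertise a h264 format. The exact output layout and property names can vary between gstreamer versions, so treat the parsing as a starting point:

```python
# Rough sketch: list video sources via gst-device-monitor-1.0 and print the
# device paths that advertise a h264 format. Output layout and property names
# can differ between gstreamer versions - adjust the parsing as needed.
import subprocess

output = subprocess.run(
    ["gst-device-monitor-1.0", "Video/Source"],
    capture_output=True, text=True, check=True,
).stdout

for block in output.split("Device found:")[1:]:
    if "video/x-h264" not in block:
        continue  # this device doesn't offer a h264-encoded stream
    for line in block.splitlines():
        if "device.path" in line:
            print("h264-capable device:", line.split("=", 1)[1].strip())
            break
```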
Setting up the pipeline
Existing pipeline
It's relevant here to look at how the Blue Robotics companion computer sets up the existing pipeline.
The file .companion.rc is run on startup. Video is started on line 18 (and, if relevant, audio is started on line 22). Line 18 starts a "screen session" (basically opens a new terminal and gives it a name so you can access it again later), and runs a script called streamer.py, which starts the video stream and monitors it - restarting the stream at most 5 seconds after it fails (a rough sketch of that behaviour follows the list below). Note that stream failures are rare, so most of the time this is just chilling and making sure the stream is still available.
To actually start the video stream, streamer.py runs a bash script, start_video.sh, using either parameters that are passed in as command-line arguments, or by reading them in from the file vidformat.param. start_video.sh is basically set up to:
- parse the input parameters
- check if those parameters are for a valid h264-encoded camera
- if not, try to find a valid h264-encoded camera
- start a gstreamer stream with the specified camera and parameters
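For a sense of what the monitoring part looks like, here's a very rough sketch of the restart behaviour described above - it is not the actual streamer.py, and the script path is illustrative only:

```python
# Very rough sketch of the "start, monitor, restart" behaviour described above -
# not the real streamer.py. The script path is illustrative only.
import subprocess
import time

START_VIDEO = ["/home/pi/companion/scripts/start_video.sh"]  # illustrative path

while True:
    stream = subprocess.Popen(START_VIDEO)
    stream.wait()  # blocks until the stream process exits (i.e. the stream fails)
    time.sleep(5)  # wait ~5 seconds, then start the stream again
```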
Your pipeline
To set up your own pipeline, you can either piggy-back off the existing scripts with modifications to handle your extra camera(s), or copy the relevant/desired components of them and make that run through .companion.rc. The most important factor is making sure the different pipelines can't get confused with each other. You'll want a separate parameter file for each camera, and you'll need to make sure all streams go to a unique port so they're not competing for the same receiver on the other end.
This is one way of doing the piggy-back approach. It replaces the file gstreamer2.param from the home directory (/home/pi/) with gstreamer2_BRF.param for the Front camera, and gstreamer2_BRT.param for the Top camera (both cameras were Blue Robotics (BR) cameras). It has the downside that if one stream fails, both/all of them are restarted, but it was simpler to program at the time than properly handling individual streams in a scalable manner would have been (although if you implement that feature it'd be great to make a pull request to the Blue Robotics companion repo). Given the low frequency of stream failures, it likely isn't much of a problem.
EDIT: there's now another approach covered here, which keeps the cameras separated, and makes it easier to add additional streams.
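For illustration (separate from the approaches linked above), here's a minimal sketch of the "one pipeline per camera, each on its own port" idea. The device paths, caps, and ports are placeholders - swap in the ids found earlier and ports that match your receiver(s):

```python
# Minimal sketch: launch two independent h264 camera streams, each on its own
# UDP port. Device paths, caps, and ports are placeholders - use the ids found
# with gst-device-monitor-1.0 and ports that match your receiver(s).
import subprocess

def start_stream(device, port, host="192.168.2.1", width=1920, height=1080, fps=30):
    pipeline = (
        f"v4l2src device={device} do-timestamp=true "
        f"! video/x-h264,width={width},height={height},framerate={fps}/1 "
        "! h264parse ! queue ! rtph264pay config-interval=10 pt=96 "
        f"! udpsink host={host} port={port}"
    )
    return subprocess.Popen(["gst-launch-1.0", "-v", *pipeline.split()])

streams = [
    start_stream("/dev/video2", 5600),  # e.g. front camera
    start_stream("/dev/video4", 5601),  # e.g. second camera
]
for stream in streams:
    stream.wait()
```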
Receiving the stream
QGroundControl is currently only set up to receive a single video stream. For adding one or more extra cameras, the following are some possible ways of receiving the incoming UDP stream(s):
- obs-gstreamer - general instructions:
  - install Open Broadcast Studio (OBS)
  - install gstreamer
    - read the prebuilt section of the obs-gstreamer README first - it provides some useful, simpler install links
    - read the normal gstreamer install instructions for extra requirements, like updating the PATH environment variable
  - install the obs-gstreamer plugin
    - download the latest release (.zip) from the releases
    - move the plugin file to the obs-studio plugins folder, e.g. C:\Program Files\obs-studio\obs-plugins\64bit\
  - open OBS, add a Gstreamer Source, and use udpsrc port=5600 ! application/x-rtp, payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! video. for a h264 stream on port 5600 (see here for an option with audio)
- OpenCV with ffmpeg/gstreamer (Python, C++, Java, etc.) - see the sketch after this list
- VideoLan (VLC), possibly with a VLC Mosaic
- gstreamer command-line interface
- ffmpeg command-line interface
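Since OpenCV was the original goal, here's a minimal sketch of receiving one of the UDP streams with OpenCV's gstreamer backend (this assumes an OpenCV build with gstreamer support, and the port is whichever one you chose for that camera's udpsink):

```python
# Minimal sketch: receive a h264 UDP stream in OpenCV via the gstreamer backend.
# Requires OpenCV built with gstreamer support; change the port to match the
# camera's udpsink.
import cv2

pipeline = (
    "udpsrc port=5600 "
    "! application/x-rtp,payload=96 "
    "! rtph264depay ! h264parse ! avdec_h264 "
    "! videoconvert ! video/x-raw,format=BGR "
    "! appsink drop=true sync=false"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Could not open stream - check gstreamer support and the port")

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    # ... do your image processing on `frame` here ...
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```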
Well, I streamed two cameras at the same time with gstreamer, using a command that I found in the script start_video.sh.
I streamed the RPi camera with the script, and the other one with the gstreamer command: gst-launch-1.0 -v v4l2src device=/dev/video4 do-timestamp=true ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=2001
However, the Raspberry Pi can't handle streaming both cameras simultaneously: the quality of the cameras quickly degraded to the point that I can't see anything from them, whereas the quality is good when it is just one camera. I think this is because the RPi encodes the video before sending, so I'm looking into whether it's possible to stream raw video, so the RPi can stream both cameras without problems.
My cameras are a USB low-light digital camera and an Arducam (CSI port).
Thank you very much for your answer.
H264 is a particularly efficient format for streaming, so streaming "raw video" would unfortunately be significantly worse/slower, even factoring in the time regained from not having to perform encoding. I'd suggest you try to:
- use the companion web interface to check the network speed, without the RPi camera connected - it should be >50Mbps, and if it's lower than that then you might have issues with your tether
- reduce the framerate or resolution of one or both cameras (reductions on the RPi camera should have a larger effect/significance)
- swap out the RPi camera for another blue robotics camera
Both download and upload are above 80 Mbps.
I will try the other solutions, thank you very much.
Any updates on this "issue" now in 2023? Regarding multiple video streams at once in QGroundControl.
Unfortunately, back then I didn't reach a solution for this, so we settled for one camera. We were using an RPi 3B if I remember correctly; I don't know whether the current RPi 4 or 5 would be capable of this or not. You should look into the Jetson Nano as well, if it is affordable.
Hi @WilliamDomben, welcome to the forum!
QGroundControl still only supports viewing one camera at a time, although if you have multiple cameras set up in BlueOS then it will at least allow you to switch between them (with the caveat that switching cameras also ends any active video recording).
This isn't something Blue Robotics staff are likely to work on, because our control station development efforts have shifted to Cockpit, which already has support for multiple video streams (as long as you have sufficient networking bandwidth to send them to the surface). You're welcome to try out Cockpit if you want to, although be aware it's not yet officially released, so support will be limited, and there may be some missing features.
Something like this may also be of interest to consider.
Thank you for the reply! Currently got 4 cameras up and running with the Cockpit Extension, works very well.
William