Integration of MJPG Format Camera

Hey all,
I’ve been doing some research trying to get a camera I have working. This particular camera only outputs the MJPG pixel format. Is there any chance I can integrate it with the existing BlueROV2 software? I’ve read around the forum and it seems to be a common issue, as QGroundControl apparently does not support anything other than H264. Has anyone had success with MJPG? Is there a known workaround? Or am I out of luck?

Jake

Hi Jake,

Yes, it’s possible; the only problem is the encoding that’ll be necessary to convert MJPG to H264.
Here is an example of how to create a GStreamer pipeline from MJPG to H264:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,type=video,framerate=30/1 ! jpegdec ! videoscale ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.146 port=5600

Try to read up on GStreamer and figure out the best configuration for your camera.
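
For example, you can check which formats, resolutions, and framerates your camera actually advertises (assuming v4l-utils is installed on the Pi) with:

v4l2-ctl --device=/dev/video0 --list-formats-ext

and then use those values in the image/jpeg caps of the pipeline above.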

Note: the encoding process will use 100% of the CPU and may result in an unstable system and low FPS.
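
If the CPU load becomes a problem, one thing to try (just a sketch, adjust to what your camera supports) is to request a lower resolution and framerate in the caps so the software encoder has less work to do:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=15/1 ! jpegdec ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast ! rtph264pay ! udpsink host=192.168.1.146 port=5600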

Also, take a look here:

If you are running on a Raspberry Pi I strongly suggest that you use the OMX encoder, as it can do hardware encoding on the GPU. This won’t be enough to stream 1080p at any decent framerate, but you can tweak the resolution and framerate and maybe get something acceptable.

You can get it with apt install gstreamer1.0-omx, then replace x264enc with omxh264enc (you might need to tweak other things in the pipeline).
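
For example, the pipeline from earlier could look roughly like this with the hardware encoder (untested sketch; caps and encoder properties may need tweaking for your camera):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! jpegdec ! videoconvert ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.1.146 port=5600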

I’ve used this for streaming an analog thermal camera in another application; since that camera had a small resolution (around 640x480) and framerate (9 fps), the CPU usage was negligible.


@patrickelectric great! I’m having a slight issue getting the images on the topside machine now. Here are my GStreamer parameters, can you see any issue? The error messages are pretty vague: “GLib.Error: gst_parse_error: could not link videoconvert0 to videoconvert1 (3)”

udpsrc port=4777 ! application/x-rtp, payload=96 ! rtph264depay ! h264parse ! x264enc ! decodebin ! videoconvert ! video/x-raw,format=(string)JPEG ! videoconvert ! appsink emit-signals=true sync=false max-buffers=2 drop=true

That is because of the caps between your two “videoconvert” nodes: JPEG isn’t a valid video/x-raw format, so videoconvert0 can’t be linked to videoconvert1 (and you don’t need x264enc on the receiving side at all).
Try something simpler like this:

gst-launch-1.0 -v udpsrc port=4777 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false

That did the trick! I’m also looking into your earlier post. I installed omx and tried using omxh264enc, and got the following: “WARNING: erroneous pipeline: no element “omxh264enc””

Any idea what’s up? Did I drop the ball on the install?

Looks like something is broken:
https://www.raspberrypi.org/forums/viewtopic.php?t=192855

The package linked there seems to work; try this:

sudo apt remove gstreamer1.0-omx*
wget http://steinerdatenbank.de/software/gstreamer1.0-omx-rpi_1.10.5-2+rpi+patches_armhf.deb
sudo dpkg -i gstreamer1.0-omx-rpi_1.10.5-2+rpi+patches_armhf.deb

Then run gst-inspect-1.0 | grep omx

You should see something like this:

pi@raspberrypi:~ $ sudo gst-inspect-1.0 | grep omx
omx:  omxmpeg2videodec: OpenMAX MPEG2 Video Decoder
omx:  omxmpeg4videodec: OpenMAX MPEG4 Video Decoder
omx:  omxh263dec: OpenMAX H.263 Video Decoder
omx:  omxh264dec: OpenMAX H.264 Video Decoder
omx:  omxtheoradec: OpenMAX Theora Video Decoder
omx:  omxvp8dec: OpenMAX VP8 Video Decoder
omx:  omxmjpegdec: OpenMAX MJPEG Video Decoder
omx:  omxvc1dec: OpenMAX WMV Video Decoder
omx:  omxh264enc: OpenMAX H.264 Video Encoder
omx:  omxanalogaudiosink: OpenMAX Analog Audio Sink
omx:  omxhdmiaudiosink: OpenMAX HDMI Audio Sink