HD USB Webcam connection

Hi All

I'm new here and building an ROV, originally based on OpenROV, but I have since changed the direction of the project to use ArduSub and QGC. I have a Logitech C920 webcam, which has a great picture. I am fairly new to the Raspberry Pi and can't find any forum threads on how to get it to work.

I have burned the ArduSub image to an SD card and can log into the Raspberry Pi using a small monitor and keyboard. I understand that the build uses gstreamer. Any ideas where to look for how to test the picture, or how to configure the RPi to output the correct format?

From what I gather it is an H264 webcam, and it's powered through a USB powered hub to prevent brownouts.

Would it be worth it, or easier, to just buy a normal RPi camera?

Thanks

Star

You will need to construct a gstreamer pipeline to stream H264 encoded video to UDP port 5600 of your surface computer.

You will first need to find out which device on the RPi corresponds to the H264 video output of the webcam. Here is an example of how to do this on a laptop with a built-in webcam (/dev/video0) and a connected USB webcam exposing two output devices (/dev/video1 and /dev/video2). The output shows that the H264-capable device we want to read from is /dev/video2.

jack@jack-Q502LA:~$ ls /dev/video*
/dev/video0  /dev/video1  /dev/video2
jack@jack-Q502LA:~$ v4l2-ctl --list-formats --device 2
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'H264' (compressed)
	Name        : H.264

jack@jack-Q502LA:~$ v4l2-ctl --list-formats --device 1
ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'MJPG' (compressed)
	Name        : MJPEG

	Index       : 1
	Type        : Video Capture
	Pixel Format: 'YUYV'
	Name        : YUV 4:2:2 (YUYV)

Then you need to launch a gstreamer pipeline using the selected video device as input. It will look something like this, but you may need to change some things (particularly in the caps field, surrounded by double quotes) depending on your hardware:

gst-launch-1.0 -v v4l2src device=/dev/video1 do-timestamp=true ! queue ! "video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1" ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.2.1 port=5600

Finally, once you have a video stream working to your satisfaction, you will need to modify the startup scripts to launch your modified pipeline instead of the default raspivid pipeline.
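As a rough sketch of what that change looks like (the exact script name and location depend on your ArduSub companion image version, so the file and the raspivid command shown here are assumptions, not the actual script contents):

```shell
#!/bin/sh
# Hypothetical excerpt of the RPi's video startup script.

# Original raspivid-based pipeline, commented out:
# raspivid -n -o - -t 0 -w 1920 -h 1080 -fps 30 -b 2000000 | \
#   gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
#   udpsink host=192.168.2.1 port=5600

# Replacement: read H264 directly from the USB webcam's device node
# (here assumed to be /dev/video2) and stream it to the surface computer.
gst-launch-1.0 -v v4l2src device=/dev/video2 do-timestamp=true ! queue ! \
  "video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1920, height=(int)1080, framerate=(fraction)30/1" ! \
  h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=192.168.2.1 port=5600 &
```

The surface-side receiver (QGC on 192.168.2.1, port 5600) stays the same; only the source end of the pipeline changes.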

If doing this sort of thing or referring to the gstreamer API documentation is out of your comfort zone, then I would suggest buying a Raspberry Pi camera.

-Jacob

Hi Jacob

Many thanks for the reply. I have picked apart your post and googled more on gstreamer pipelines.

After executing ls /dev/video* I found /dev/video0 (the USB webcam); I also confirmed it by typing lsusb.

v4l2-ctl --list-formats --device 0 gives me the MJPG pixel format, not the H264 I was looking for…
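For anyone following along, a quick way to check every video device in one go (a small sketch; assumes the v4l-utils package is installed on the RPi):

```shell
# List the pixel formats offered by each /dev/video* device,
# so you can spot which one (if any) advertises H264.
for dev in /dev/video*; do
  echo "== $dev =="
  v4l2-ctl --list-formats --device "$dev"
done
```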

I found this to stream MJPG:

$ gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=800,height=600,framerate=30/1 ! jpegparse ! jpegdec ! xvimagesink

I have combined the two… as a newbie to all this, it's the best I can do:

gst-launch-1.0 -v v4l2src device=/dev/video0 do-timestamp=true ! queue ! "video/jpeg, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1080, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1" ! rtpjpegpay config-interval=1 pt=96 ! udpsink host=192.168.2.1 port=5600

I get an erroneous pipeline syntax error.

From what I've learnt so far: can QGC decode MJPG? Can the codec be sent inside the pipeline?

Many thanks for your time

Star

QGC only handles H264-encoded streams, so no MJPEG support. As far as I know, that webcam is capable of H264 output; it may take some probing to get it to play nicely, though.
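A sketch of what that probing might look like (the device number is an assumption; check yours first with v4l2-ctl --list-devices):

```shell
# Ask the webcam to switch its capture format to H264 at 1080p,
# then read the format back to verify the change took effect.
v4l2-ctl --device /dev/video0 \
  --set-fmt-video=width=1920,height=1080,pixelformat=H264
v4l2-ctl --device /dev/video0 --get-fmt-video
```

If the second command reports H264, the v4l2src pipeline from my earlier post should work against that device.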

-Jacob