Custom GStreamer pipeline in BlueOS

How can I edit / customize the GStreamer pipeline in BlueOS? I need to rotate the stream by 90° for a specific job, and in the old Companion I could add a videoflip element, for instance.

Hi @VMartini,

As far as I’m aware that’s not available via the frontend (Video page of the web interface), so if it’s possible at all it would need to be via terminal / editing some files. I’ve asked the software team for confirmation and will get back to you.

Out of interest, what kind of camera stream are you using? Video processing requires an unencoded stream, so for efficiency and latency reasons this kind of thing is generally done via camera settings (like with a Raspberry Pi camera), or at the display side, in which case the GStreamer pipeline is usually irrelevant. If you're feeding a raw source into GStreamer and using it to do your encoding, or just sending an unencoded stream, then I suppose this kind of processing makes sense, but that's CPU- and/or bandwidth-heavy, and would generally be avoided unless you've got processing power and tether bandwidth to spare.
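To illustrate the raw-source case, here's a rough sketch of where a videoflip element could sit in a vehicle-side pipeline (the device path, resolution, host, and port are placeholders for your setup, not BlueOS defaults):

```shell
# Sketch: flip a raw camera source on the vehicle *before* encoding.
# videoflip operates on raw video, so it must come before x264enc.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,framerate=30/1 \
  ! videoflip method=clockwise \
  ! videoconvert \
  ! x264enc tune=zerolatency \
  ! rtph264pay config-interval=10 pt=96 \
  ! udpsink host=192.168.2.1 port=5600
```

Note that the software x264enc step here is exactly the CPU cost I mentioned, which is why this approach is usually avoided on the vehicle.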

If gstreamer is taking in an encoded stream, decoding it, processing it, and re-encoding it then that’s a fair amount of redundant work, and would add some unnecessary latency.
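As a sketch of that round trip (ports and host are placeholders), the "redundant work" looks like this:

```shell
# Sketch: decode -> flip -> re-encode on the vehicle.
# The depay/decode and re-encode stages exist only to let videoflip
# touch raw frames, which is the redundant CPU work and added latency.
gst-launch-1.0 udpsrc port=5600 \
  ! application/x-rtp,encoding-name=H264,payload=96 \
  ! rtph264depay \
  ! avdec_h264 \
  ! videoflip method=clockwise \
  ! videoconvert \
  ! x264enc tune=zerolatency \
  ! rtph264pay pt=96 \
  ! udpsink host=192.168.2.1 port=5601
```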

Following up: I've been told that's not currently possible with the existing camera manager.

Hi @EliotBR ,

I’m using the BR2 standard USB camera. For a specific job, the camera has to be flipped 90° to a portrait orientation, so I wanted to flip the video stream to avoid the pilot having to physically flip his screen (or his neck :rofl:). I was wondering how I could do that from the subsea side, so I wouldn’t need to run any code topside. I’m not sure what the extra processing load on the RPi would be to do that.

As I believe it won’t be a regular operation, for now I’m happy to create a second stream in BlueOS and flip the video when decoding on the display side. The pilot can use this second window for his orientation, and leave QGC in the background.
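In case it's useful to others, the display-side flip for that second stream could be sketched like this (the port is a placeholder, and avdec_h264 could be swapped for a hardware decoder on your display machine):

```shell
# Sketch: receive the second H264 stream topside and flip it at display time.
# The flip happens after decoding, so the vehicle does no extra work.
gst-launch-1.0 udpsrc port=5601 \
  ! application/x-rtp,encoding-name=H264,payload=96 \
  ! rtph264depay \
  ! avdec_h264 \
  ! videoflip method=clockwise \
  ! videoconvert \
  ! autovideosink sync=false
```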

Haha, fair enough :stuck_out_tongue:

That makes sense, but because our camera does H264 compression on the camera itself, it would be quite processing intensive to then decode and re-encode that stream on the Raspberry Pi in order to rotate the video in the middle, and would likely add quite a bit of latency as well.

Fair enough. It should be reasonably simple to add the pipeline component you want to an OBS gstreamer stream or whatnot. That said, good display software would make that easy for you without needing to mess around with video stream elements, and this kind of quality of life feature is something we’re definitely keeping in mind.

If it’s just for one job, you may be able to just rotate the physical USB webcam in the mount… It’s held in with 4 mounting screws, but I believe they are in a symmetrical square arrangement? It might be weird driving when not in the orientation you expect, but maybe that’s when you just use your neck? Always nice to change the hardware instead of the software!

Hi @tony-white

Yes, that’s the idea, to physically rotate the camera in the mount. But with that alone, the operator wouldn’t be able to pilot the ROV, which is why I needed to rotate the video as well. The solution I found is to create a second stream of the same video, which I rotate via software topside and display in another window. It worked fine for me this way.