BlueOS Mavlink-Camera-Manager CPU Overload

Background

Our team has developed a pair of 1000 meter rated ROVs, mostly using Blue Robotics components. The ROV tether was supplied by TE Rochester (P/N A320327). The steel armored cable has a single 39 Ohm coax. It works well with the Fathom-X, and we regularly see data rates of 5+ Mbps. This is enough bandwidth for two video streams and an Oculus sonar set to low resolution. One of the video streams is set to 320x240 15 FPS and is used to monitor the release of the umbilical. The other camera is set to 640x480 15 FPS.

We initially believed the cable's throughput was the limiting factor for video resolution. With only 5+ Mbps to work with, we expected limitations. We ordered a set of RAK WisLink PLC LX200V30 EVBs. These are similar to the Fathom-X but have 2.5 times the data rate. After testing them, we are now seeing 12-15+ Mbps data rates. We expected we could then increase the camera resolution to 1080p, but this was not the case. Further investigation revealed that the problem is in the Mavlink-Camera-Manager.

The Mavlink-Camera-Manager is used to configure and establish video streams from the ROV. It runs on the Raspberry Pi and is able to keep up at low resolutions, say 640x480 15 FPS, but anything higher drives the Raspberry Pi to 400% processor usage. This causes the camera to lock up. Again thinking this was caused by the cable, we eliminated it from the setup and connected an Ethernet cable directly to the Raspberry Pi. Nothing changed.

Setup:

We are running BlueOS version 1.4.0. When the camera resolution is set to 640x480 15 FPS it works OK; the CPU % hovers around 60-80%. As long as it stays below 100%, the stream does not lock up. At about 200% there are brief hesitations, but it recovers. Above that, it locks up and often does not resume unless it is rebooted in some fashion.

Questions:

  1. In the Stream Creation window, what does it mean to disable Mavlink? When I disable it, the processor load decreases somewhat, so the video continues to work without locking up at 1280x720 15 FPS. However, vigorous movement of the camera still causes the CPU % to climb and eventually lock up the camera.

  2. What is the Mavlink-Camera-Manager "default settings BlueROV UDP - tcpout 127.0.0.1:5777" entry telling me? The TCP camera settings are set to 192.168.2.3:8554.

  3. Can somebody please respond and tell us what CPU % you are seeing on your ROV related to the Mavlink-Camera-Manager, and what resolution your camera is set to? CPU usage is found in System Information.

  4. Is there someplace that provides more documentation about the camera settings? I’d like to try some options other than RTSP or UDP. I do not think this will solve the problem, but it would be interesting to try.

We have two ROVs: one has a Pi 3 and the other a Pi 4. Both have the same issues. One has a 16 GB memory card, the other a 32 GB. I have multiple spare cameras and have swapped them out; no joy.

Possible solutions: We are investigating IP based cameras, there are some new ones on the market that look interesting. I’ll report back after we have tested them. We have an old Axis IP Encoder that works great but it has more latency than we would like.


Dunno if this helps, but I’m getting 100 Mbps over 4000 ft of 90 ohm cable-TV coax using off-the-shelf Ethernet-over-coax adapters. $75 a pair?
In the ROV, the RPi connects to a 4-port hub, and my stripped-down security system DVR connects to the hub as well. The hub connects to the coax adapter. This allows me 8 HD camera feeds.
So my camera feeds have nothing to do with my control system, other than sharing bandwidth through the coax adapter.

Hello, @pacific1875, would you mind telling us:

  1. what USB camera you are using?
  2. what is your GCS, or how are you receiving the video?

Below is a table of tests from today with the BR camera on a Pi3, running on BlueOS 1.4.0 (bullseye).

Note: in order to have a consistent throughput in this test, the camera was set to manual exposure mode with high levels of lighting and very low exposure.

For UDP, the video was being received with:

gst-launch-1.0 -vc udpsrc do-timestamp=true port=5600 ! application/x-rtp,payload=96 ! rtph264depay ! h264parse ! openh264dec ! videoconvert ! fpsdisplaysink sync=false

For RTSP, the video was being received with:

gst-launch-1.0 -vc rtspsrc location=rtsp://192.168.2.4:8554/0 ! application/x-rtp,payload=96 ! rtph264depay ! h264parse ! openh264dec ! videoconvert ! fpsdisplaysink sync=false

When testing with automatic exposure (the camera default), the received FPS had a direct correlation to the illumination levels of the scene, varying the bandwidth from 4 to 12 Mbps on a 1080p 30fps stream. The RTSP/UDP had the same results.

Now, going specifically over each of your points:

  1. The disable mavlink option prevents the Mavlink Camera Manager from advertising the added camera over mavlink communication, meaning that mavlink GCSs won’t know about that camera.
  2. The Mavlink Camera Manager ‘default settings’ CLI option loads a pre-baked profile for all USB cameras it finds at startup: the BlueROV UDP profile creates a 1080p 30 FPS H264 UDP stream for each available USB camera.
  3. Unfortunately, the CPU usage shown in System Information can fluctuate quite heavily due to how the Linux kernel values are sampled by our software, especially when recently opened. It’s a known limitation, and we are working on improvements. The CPU percentage varies between 0 and 400%, the latter meaning that all four cores of the Raspberry Pi are fully utilized. A better approach is to open the user terminal, run htop, and observe the CPU usage from there.
  4. I believe we don’t have in-depth documentation about video streams, but here are some quick words that might help:

Mavlink Camera Manager uses the GStreamer library to get video frames from a given source, like a USB or an IP camera. The video frames can be output from the camera in different encoding formats; the formats MCM supports are H264, H265, MJPG/MPEG, and “raw” YUYV. For each group of those frames, it first assembles them into RTP packets, and then sends them through some network protocol. Today, the supported network streaming protocols are UDP, RTSP, and WebRTC.
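As a rough sketch of what such a pipeline looks like (an illustration of the technique, not MCM’s actual internal code), the gst-launch-1.0 equivalent for an H264-capable USB camera streamed over UDP would be something like this; the device path and destination address are assumptions to adjust for your setup:

```shell
# Illustrative sender pipeline: H264 frames from a USB camera, packed into
# RTP, sent over UDP. /dev/video2 and 192.168.2.1 are assumed values.
gst-launch-1.0 -v \
  v4l2src device=/dev/video2 \
  ! 'video/x-h264,width=1920,height=1080,framerate=30/1' \
  ! h264parse \
  ! rtph264pay config-interval=10 pt=96 \
  ! udpsink host=192.168.2.1 port=5600
```

The receive-side counterpart is the udpsrc pipeline shown earlier in this post.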

By default, all of those network streaming protocols send the same RTP-encoded UDP packets, with some communication channels beyond the main UDP stream to help control the stream. Additionally, Mavlink Camera Manager also allows RTSP via TCP or HTTP-tunneled transports. In practice, for most uses, the main difference between UDP and RTSP is that UDP always consumes bandwidth, even when there are no clients, while RTSP waits for a client to connect before it starts sending packets, so it has some overhead but provides better handling of resources.
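For example, to test RTSP with the RTP packets carried over TCP rather than UDP, GStreamer’s rtspsrc element exposes a protocols property. This sketch reuses the receive pipeline from the tests above, with the same assumed stream address and path:

```shell
# Receive the RTSP stream with RTP interleaved over TCP instead of UDP.
# Address 192.168.2.4 and stream path /0 are assumed, as in the tests above.
gst-launch-1.0 -v rtspsrc location=rtsp://192.168.2.4:8554/0 protocols=tcp \
  ! rtph264depay ! h264parse ! openh264dec \
  ! videoconvert ! fpsdisplaysink sync=false
```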

The third protocol, WebRTC, is the same web-based technology that Google Meet and Zoom (and basically all video chat apps) use for low-latency, real-time video/audio streaming. This protocol was added to Mavlink Camera Manager to allow web-based GCSs, like Cockpit, to receive video from the vehicle.

Now, about the camera controls, it depends on their source.

For USB cameras, Mavlink Camera Manager supports some camera configurations using a Linux API called Video4Linux (V4L2). Those configurations are provided by each camera. Most cameras allow brightness, contrast, hue, and other basic image configurations, and a few also support encoding configurations, like switching the bitrate mode between variable and constant bitrate. The Blue Robotics camera doesn’t expose encoding configurations.
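If you want to see exactly which controls (and which ranges) your camera exposes through V4L2, the v4l2-ctl tool from the v4l-utils package will list them. The device path and the brightness value here are assumptions for illustration:

```shell
# List all V4L2 controls the camera advertises, with ranges and current values
v4l2-ctl -d /dev/video0 --list-ctrls

# Set one of them, e.g. brightness (only valid if the camera listed that control)
v4l2-ctl -d /dev/video0 --set-ctrl brightness=64
```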

For IP cameras, Mavlink Camera Manager speaks the ONVIF protocol to discover, authenticate, and manage the camera streams, and soon, to control their exposed video and image parameters as well.


Possible solutions: We are investigating IP based cameras, there are some new ones on the market that look interesting. I’ll report back after we have tested them. We have an old Axis IP Encoder that works great, but it has more latency than we would like.

Yes, while this can be a solution, and MCM can be integrated (in more than one way) with an IP camera, the challenge is exactly that: screen-to-screen latency. Our USB-to-UDP pipeline via tether fluctuates between 128 and 172 ms at 1080p 30fps H264, while a general-purpose IP camera with the same settings can easily range from 300 to 500 ms. The entire MCM processing pipeline on a Raspberry Pi 3 adds around one frame of latency (~33.3 ms) for the H264 1080p@30fps video from our USB camera; the rest is distributed across (1) the sensor latency, (2) the video encoding done inside the camera, (3) the USB communication, (4) the network, (5) the topside computer’s decoding process, and (6) the monitor latency.
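For a sense of scale on that number: one frame of buffering at 30 fps corresponds to a frame period of 1000/30 ms, which a quick one-liner confirms:

```shell
# Frame period in milliseconds at 30 frames per second
awk 'BEGIN { printf "%.1f ms\n", 1000 / 30 }'
# prints 33.3 ms
```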

From here, I’ll be waiting for more information from your side:

  1. CPU usage read from htop.
  2. Information about your GCS.
  3. System logs (you can send them to my inbox).
  4. A graph from our BlueOS Network Test tool.

Thanks!


Hi,

Thank you for your response. I’ll respond on the forum this evening after I have had a chance to do some testing. To start, I’m using a Blue Robotics USB camera and BlueOS version 1.4.0. I use both the browser and the application; I have not seen any performance difference between them.

-Jeff


“The ROV tether was supplied by TE Rochester (P/N A320327).” If this is not a trade secret, what is the cost of this cable?
Will this work better with the Raspberry Pi 5?

Hi @Pierre021 -

@pacific1875 may share the price, but generally Pi4 vs Pi5 won’t matter, as an ethernet connection is provided to either via whatever adapter is used to send signals over the co-ax cable. In this case, that seems to be the RAK WisLink PLC LX200V30 EVB.

The image below contains the speed test. The connection direct via the 1000 m steel armored cable averages around 27.9 Mbps. At the end of the cable there is a deployment cage with a 70 m tether to the ROV. The ROV is released from the cage at depth. This 70 m cable and the connection through a Blue Robotics switch reduce throughput by about 5 Mbps. I suspect we could recover this by swapping the Fathom-X for a WisLink on the 70 m tether, but it does not seem like there’s much to gain.

Per previous posting, we are using BlueOS and Cockpit.

htop CPU usage, read while the video image is frozen.

Cable Information

Regarding some of the questions posted about the Rochester cable: first, it is absolutely indestructible; we have wrapped it in tight knots, crushed it, and generally abused it for six years. It’s great. The cost was around $6-7K for 1000 m including shipping from the east coast. That was five years ago; I have no idea what the cost is now, but in the past they have had it in stock. It is probably not the ideal cable because there is a lot of attenuation above 1-2 MHz, but it has worked for our needs. We use it for a side scan and/or magnetometer, and when needed we use it to deploy the ROV to depths of up to 1000 m. The cable is small in diameter; with a 200 kg depressor we can tow the sidescan in almost 900 m water depth. The manufacturer’s literature suggests they have other cables that might do better at higher frequencies. If we ever replace this cable, we will investigate other options that might provide higher data rates.