BlueOS Mavlink-Camera-Manager CPU Overload

Hello, @pacific1875, would you mind telling us:

  1. what USB camera you are using?
  2. what is your GCS, or how are you receiving the video?

Below is a table of tests from today with the Blue Robotics camera on a Raspberry Pi 3 running BlueOS 1.4.0 (Bullseye).

Note: to keep the throughput consistent in this test, the camera was set to manual exposure mode, with high levels of lighting and a very low exposure time.
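
For reference, this manual-exposure setup can be reproduced from the BlueOS terminal with v4l2-ctl. This is only a sketch: /dev/video2 is a placeholder for your camera’s device node, and the control names and ranges vary per camera and kernel version.

v4l2-ctl -d /dev/video2 --set-ctrl exposure_auto=1
v4l2-ctl -d /dev/video2 --set-ctrl exposure_absolute=50

On most UVC cameras, exposure_auto=1 selects manual mode and exposure_absolute is the exposure time in 100 µs units (newer kernels rename these controls to auto_exposure and exposure_time_absolute).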

For UDP, the video was being received with:

gst-launch-1.0 -v udpsrc do-timestamp=true port=5600 ! application/x-rtp,payload=96 ! rtph264depay ! h264parse ! openh264dec ! videoconvert ! fpsdisplaysink sync=false

For RTSP, the video was being received with:

gst-launch-1.0 -v rtspsrc location=rtsp://192.168.2.4:8554/0 ! application/x-rtp,payload=96 ! rtph264depay ! h264parse ! openh264dec ! videoconvert ! fpsdisplaysink sync=false

When testing with automatic exposure (the camera default), the received FPS correlated directly with the illumination level of the scene, with the bandwidth varying from 4 to 12 Mbps on a 1080p 30 fps stream. RTSP and UDP showed the same results.

Now, going specifically over each of your points:

  1. The ‘disable mavlink’ option prevents the Mavlink Camera Manager from advertising the added camera over MAVLink, meaning that MAVLink GCSs won’t know about that camera.
  2. The Mavlink Camera Manager ‘default settings’ CLI option applies a pre-baked profile to every USB camera it finds at startup: the BlueROV UDP profile creates a 1080p 30 FPS H264 UDP stream for each available USB camera.
  3. Unfortunately, the CPU usage shown on the System Information page can fluctuate quite heavily due to how our software samples the Linux kernel counters, especially right after the page is opened. It’s a known limitation, and we are working on improvements. The CPU percentage varies between 0 and 400%, the latter meaning that all four cores of the Raspberry Pi are fully utilized. A better approach is to open the user terminal, run htop, and observe the CPU usage from there.
  4. I believe we don’t have in-depth documentation about video streaming, but here are some quick words that might help:

Mavlink Camera Manager (MCM) uses the GStreamer library to get video frames from a given source, such as a USB or an IP camera. A camera can output its frames in different encoding formats; the ones MCM supports are H264, H265, MJPG/MPEG, and “raw” YUYV. MCM assembles those frames into RTP packets and then sends them over a network protocol. Today, the supported streaming protocols are UDP, RTSP, and WebRTC.
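
For a concrete picture, here is a minimal send-side sketch in the spirit of what MCM builds for a USB camera that outputs H264 natively. This is an illustration, not MCM’s actual internal pipeline; /dev/video2 and the topside address 192.168.2.1 are assumed for the example:

gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=192.168.2.1 port=5600

Here rtph264pay does the RTP packing; the choice of streaming protocol mostly changes what happens after the RTP payloader.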

By default, all of those streaming protocols send the same RTP-encoded UDP packets, with some extra communication channels beyond the main UDP stream to help control the stream. Additionally, Mavlink Camera Manager also allows RTSP via TCP or HTTP-tunneled transports. In practice, for most uses, the main difference between UDP and RTSP is that UDP always consumes bandwidth, even when there are no clients, while RTSP waits for a client to connect before sending packets, so it has some overhead but handles resources better.
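
As an example of the TCP option, the receiving side can force RTSP over TCP with rtspsrc’s standard protocols property (same stream address as in the test above):

gst-launch-1.0 -v rtspsrc location=rtsp://192.168.2.4:8554/0 protocols=tcp ! rtph264depay ! h264parse ! openh264dec ! videoconvert ! fpsdisplaysink sync=false

This trades some latency and overhead for reliable delivery when plain UDP packets are being dropped on your network.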

The third protocol, WebRTC, is the same web-based technology that Google Meet, Zoom, and basically all video chat apps use for low-latency, real-time video/audio streaming. It was added to Mavlink Camera Manager to allow web-based GCSs, such as Cockpit, to receive video from the vehicle.

Now, about the camera controls: what is available depends on the camera’s source.

For USB cameras, Mavlink Camera Manager supports some camera configurations through a Linux API called Video4Linux (V4L2). The available configurations are reported by each camera. Most cameras expose brightness, contrast, hue, and other basic image controls, and a few also expose encoding controls, like switching between variable and constant bitrate. The Blue Robotics camera doesn’t expose encoding controls.
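
To see which controls your specific camera reports, you can run v4l2-ctl (from the v4l-utils package) in the BlueOS terminal; /dev/video2 below is a placeholder for your camera’s device node:

v4l2-ctl --list-devices
v4l2-ctl -d /dev/video2 --list-ctrls

The second command prints every V4L2 control the camera exposes, with its current value and valid range, which is essentially the set of controls MCM can offer for that camera.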

For IP cameras, Mavlink Camera Manager speaks the ONVIF protocol to discover, authenticate, and manage the camera streams, and soon it will control their exposed video and image parameters too.


> Possible solutions: We are investigating IP-based cameras; there are some new ones on the market that look interesting. I’ll report back after we have tested them. We have an old Axis IP encoder that works great, but it has more latency than we would like.

Yes, while this can be a solution, and MCM can be integrated (in more than one way) with an IP camera, the challenge is exactly that: screen-to-screen latency. Our USB-to-UDP pipeline via tether fluctuates between 128 and 172 ms at 1080p 30 fps H264, while a general-purpose IP camera with the same settings can easily range from 300 to 500 ms. The entire MCM processing pipeline on a Raspberry Pi 3 adds around 1 frame of latency (1/30 s ≈ 33 ms) for the H264 1080p@30fps video from our USB camera, and the rest is spread across (1) the sensor latency, (2) the video encoding done inside the camera, (3) the USB communication, (4) the network, (5) the topside computer decoding process, and (6) the monitor latency.

From here, I’ll be waiting for more information from your side:

  1. CPU usage read from htop.
  2. Information about your GCS.
  3. System logs (you can send them to my inbox).
  4. A graph from our BlueOS Network Test tool.

Thanks!
