ArduSub submodules

I was keen to take a look at how the ArduSub development was progressing, but keep hitting this error when compiling:

fatal: reference is not a tree: 3b960549601cc22b7bbaa89257833f8d9053e491
Unable to checkout '3b960549601cc22b7bbaa89257833f8d9053e491' in submodule path '…/modules/PX4Firmware'

I’m using the method described here:

but also hit the same error when using Windows to build.

Is there another step I’m missing somewhere?


Sorry, I made a mistake in a commit this morning which broke the submodule in git. Rusty or I will let you know when this is resolved, I’m looking into it now.

The issue should be resolved.

In the git root directory, do

'git submodule update'
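If a plain 'git submodule update' still complains about missing references, the fuller form below usually sorts it out. These flags are standard git, not ArduSub-specific; '--recursive' matters here because PX4Firmware itself contains nested submodules.

```shell
# From the repository root: fetch any submodules that were never cloned
# (--init) and recurse into nested submodules (--recursive).
git submodule update --init --recursive
```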

Thanks Jacob

All compiled and uploaded now.

Might be a bit off topic now, but what’s the standard approach to teleoperating the PixHawk running ArduSub?

Are you running a RPi as a companion computer, connected to the PixHawk via mavlink, with Mavros providing the link back to the surface computer and joystick? Or something simpler?

That is how I do it. Rusty has been using qgroundcontrol to send the joystick commands to the pixhawk. Rusty's approach doesn't require a RPi or ros, and is easier to set up and run. I use the approach you mentioned because my tether is an ethernet cable, which obviously won't plug into the pixhawk.


To use Rusty’s approach, you will need to download a daily build of qgc.

To use my approach, you will need to start mavros with this command, instead of using the apm.launch file in the bluerov ros package.

rosrun mavros mavros_node _fcu_url:=/dev/ttyACM0:115200 _gcs_url:='udp://:14556@' _system_id:=255
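Once mavros is up, a quick way to confirm the serial link to the flight controller is to inspect the state topic (this assumes a standard mavros install; the topic name is the mavros default):

```shell
# Print one message from /mavros/state; a 'connected: True' field
# means mavros has established the MAVLink connection to the Pixhawk.
rostopic echo -n 1 /mavros/state
```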

I also modified the bluerov ros package to work for this. That can be found in the Hexa branch here:

I think the BR guys are working on making similar modifications to their own repository.




I’ll give the ROS method a try as I’m using an ethernet tether and have already started down the ROS route.

Great, I haven’t tested the pixhawk in the water with this method, but indications on my desk are that it is working. I will let you know when I have put it in the water. This is still a work in progress.

Also the controls with my Hexa branch are different:

Left stick: forward and yaw

Right stick: roll and pitch (disabled at the moment)

Triggers: climb/descend

Bumpers: strafe

A: arm

B: disarm

Scratch that: disarming does not work right now, so I would not try to use this while the pixhawk is connected to motors. I'll update you when it's sorted.

There is still some work to be done using ros with the pixhawk. I had it running well with the apm, but upon closer inspection, I am seeing some issues on the pixhawk that will need to be worked out. I have fixed the arming issue, but inputs are being dropped unless they are constantly changing.

Andrew, I have finally performed several test flights with the pixhawk on the BlueROV. You should pull the changes made in ArduSub over the last few days into your local repository. The pixhawk requires some changes to PID coefficients to perform optimally. I am still fine tuning those parameters, they will go up when I am finished. I am interested in hearing how your tests go and where you are at with this, as well as any feedback.


Thanks Jacob,

I’ll pull down the latest changes and let you know how things go.

I’m running a little behind where I wanted to be with the build, so still running bench tests. My custom frame is complete, just waiting for my thrusters to arrive then I can get it into the pool for testing and tuning.



I experimented with a few connection setup options last night to see how they'd work. Using qgroundcontrol on a laptop, connected to a RPi running Mavproxy, with the RPi connected to the Pixhawk (running ArduSub) via serial, seems to work well and integrates the RPi video stream directly into the qgroundcontrol HUD.
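For anyone replicating this, the Mavproxy side can be started along these lines (a sketch only: the serial device, baud rate, and laptop IP below are illustrative assumptions, and 14550 is the default UDP port qgroundcontrol listens on):

```shell
# On the RPi: bridge the Pixhawk's serial link to the surface laptop over UDP.
mavproxy.py --master=/dev/ttyACM0,115200 --out=udpout:192.168.2.1:14550
```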

So I think I’ll run my first pool tests with that setup once the thrusters arrive (hopefully in the next couple of days).


That’s great! Can you please describe the steps you took to get the video stream to display in qgc? This has been on my ToDo list.

No problem

I fairly closely followed the instructions from here:

I’m running Ubuntu on both the RPi and the Laptop, so followed that path.

Then, instead of running the streaming script in the instructions, I ran the following altered script to stream from the RPi camera:

raspivid -t 0 -h 720 -w 1080 -rot 180 -fps 24 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! video/x-h264,width=1080,height=720,framerate=24/1 ! h264parse ! rtph264pay ! udpsink host=XXX.XXX.XXX.XXX port=5000

where XXX.XXX.XXX.XXX is the ip address of the laptop where qgroundcontrol is running.
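If you want to check the stream independently of qgroundcontrol, a receive-side pipeline can be run on the laptop. This is a sketch using standard gstreamer elements; the caps string must match what the sender's rtph264pay produces, and avdec_h264/autovideosink are common defaults rather than anything required by qgc:

```shell
# On the laptop: receive, depayload, decode, and display the
# RTP/H.264 stream arriving on UDP port 5000.
gst-launch-1.0 -v udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
    ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```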

The other important step is to make sure you're running a version of qgroundcontrol that supports video streaming - the pre-compiled versions don't seem to, so I've installed the development build from here:

Hope this helps! Let me know if any of the steps don't make sense.



Hi, I've built qgc from source after installing gstreamer, and ran the command you suggested from the raspberry pi. The output on the pi is

Setting pipeline to PAUSED …
Pipeline is PREROLLING …
How do I open the video in qgc?

I’ve found the video window in qgc. It says no video, though. I am connected to the pixhawk via raspberry pi, and raspicam is running. Do you have to create an additional comm link in qgc for the video feed? Or do you need to run an additional terminal command on the laptop to start the feed?


No additional comm link is needed in qgc so long as the right target ip address is set in the command run on the RPi. I'll double check I haven't missed any steps in the notes above when I get back from work later today.

Got it! I had taken part of your command out on initial testing, and had to put it back in to make it work. The command you posted is fine. Thanks!

Excellent! Glad to hear the video feed is working