Hi
I’m currently attempting to build an AUV for a university project. I’m new to this area so some help with a few questions would be great.
A bit of background: the goal is autonomous movement around a glass tank that's roughly 1.5 m x 4 m x 1 m. We currently have two ROVs to work with: a BlueROV1 and a smaller custom-built ROV with four T100 thrusters.
We also have a Raspberry Pi 3B or an Arduino Mega to run it.
We're currently using the Pi for motor control and plan to use it for object detection with OpenCV.
So my questions are:
The Pi has problems with PWM control (it can only adjust in 5% steps, as far as we've found), so we want to get a flight controller to streamline control, gyro integration, compass, etc. What flight controllers work well with the Pi? I noticed the Pixhawk is on your site, but is it hard to interface with the Pi for object detection? The Navio2 and PXFmini were other options I saw. The size of the tank also means we need very fine motor control.
Does anyone have any ideas for close-range detection of the glass walls? We've tried ultrasonic sensors, but nearly all of them have a minimum range of 20 cm in air, which comes out to close to 1 m in water (far too large for the tank). Laser detection was another idea, but we're struggling with what to use and how to start.
Any help with these questions would be greatly appreciated.
Right now we are using the Pixhawk (running ArduSub) to control and stabilize the ROV, and a Raspberry Pi (running our companion software) that handles the camera stream and the bidirectional MAVLink communication layer between the Pixhawk and the surface computer.
You can read more about it at https://www.ardusub.com/software/components.html.
To communicate with and control the Pixhawk, you can use pymavlink; this Python library abstracts the MAVLink protocol for you. The camera can be connected to the Raspberry Pi, and it's possible to use OpenCV to access the camera and do the object detection, then pymavlink to send high-level controls to the Pixhawk.
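As a rough sketch of that pymavlink flow (the serial path, baud rate, and control values here are illustrative assumptions, not companion defaults; adapt them to your setup):

```python
# Minimal pymavlink sketch: connect to the Pixhawk and send one
# MANUAL_CONTROL message. Paths and values are examples only.

def clamp_axis(value, low=-1000, high=1000):
    """Clamp a MANUAL_CONTROL axis value to the range MAVLink expects."""
    return max(low, min(high, value))

def send_forward_command():
    # Imported here so the helper above works without pymavlink installed
    from pymavlink import mavutil

    # Serial path is an example; check /dev/serial/by-id/ on your Pi
    master = mavutil.mavlink_connection('/dev/ttyACM0', baud=115200)
    master.wait_heartbeat()  # block until the Pixhawk is talking to us

    # x/y/r are -1000..1000; for ArduSub, z is 0..1000 with 500 = neutral
    master.mav.manual_control_send(
        master.target_system,
        clamp_axis(200),  # x: gentle forward
        0,                # y: no lateral motion
        500,              # z: neutral throttle
        0,                # r: no yaw
        0)                # buttons: none pressed

# Call send_forward_command() on the Pi with the Pixhawk attached.
```

Note that ArduSub treats stale pilot input as a failsafe condition, so in practice you would send MANUAL_CONTROL in a loop at a regular rate (a few Hz or more) rather than once.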
So I've received the Pixhawk and I'm currently installing and setting it up; I just had a couple of questions. For the time being we're not implementing avoidance, just creating a basic control program.
Does anything need to be changed to allow a Pixhawk-to-Pi connection via USB? As in, Pixhawk and Pi settings.
Would I be better off using a standard version of Raspbian with a GUI to program, rather than the supplied one? Or is it better to program on another computer connected to the Pi? I'm not great at command-line programming, and pymavlink looks rather complicated.
Thanks in advance
By default you should be able to connect with it. The SERIAL0_ parameters control the serial interface connected to the USB port; usually the parameters are a 115200 baud rate and the MAVLink2 protocol.
We usually suggest the companion image; there is already a lot of information on this forum about how to modify and use it.
You can connect to the Raspberry Pi via SSH with your file manager and use a normal IDE on your desktop. Besides that, if you are aiming to do avoidance algorithms or any basic autonomous logic, pymavlink or ROS are the best approaches.
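If you want to confirm those SERIAL0_ settings from a script, here is a hedged pymavlink sketch. The port path is an example, and the baud decode table is my reading of ArduPilot's encoded SERIALn_BAUD values (e.g. 115 means 115200), so verify it against the parameter documentation for your firmware:

```python
# Sketch: read SERIAL0_BAUD over pymavlink to confirm the USB link
# settings. The decode table below is an assumption based on
# ArduPilot's encoded baud values; check your firmware's docs.

BAUD_CODES = {9: 9600, 19: 19200, 38: 38400, 57: 57600, 115: 115200}

def decode_serial_baud(code):
    """Map an ArduPilot SERIALn_BAUD code to an actual baud rate."""
    return BAUD_CODES.get(int(code))

def read_serial0_baud(port='/dev/ttyACM0'):
    # Imported here so decode_serial_baud works without pymavlink installed
    from pymavlink import mavutil
    master = mavutil.mavlink_connection(port, baud=115200)
    master.wait_heartbeat()
    # Ask the autopilot for one named parameter (-1 = look up by name)
    master.mav.param_request_read_send(
        master.target_system, master.target_component,
        b'SERIAL0_BAUD', -1)
    msg = master.recv_match(type='PARAM_VALUE', blocking=True, timeout=5)
    return decode_serial_baud(msg.param_value) if msg else None
```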
@Josh
We are in the same boat. I am currently refurbishing an ROV as part of my Bachelor's thesis, here in Spain. I have been struggling to get a trustworthy Pixhawk (not a bad copy). Are you based in Europe? May I ask where you got yours?
We have set up the Pi and the Pixhawk as described above, and I'm currently using MAVProxy to look at the signals being returned. But they don't make sense, and I believe it's due to the high packet loss. What can cause this?
It's currently connecting through ACM, but the device it connects to changes (ACM0, ACM1, ACM2, ACM3 so far). Would it be best to connect another way?
Also, the Pi tries to connect to a network for 2 minutes on startup; how do we disable this?
Thanks again for the help
Can you share your code? How can it “make no sense”? Can you share the output?
You can connect to the Pixhawk with the /dev/serial/by-id/ path; here it appears as /dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00.
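To avoid the shifting ACM numbers entirely, here is a sketch that resolves the stable by-id symlink at runtime. The 'PX4'/'Pixhawk' substring match is an assumption about how your board enumerates (based on names like the usb-3D_Robotics_PX4 one above):

```python
# Sketch: find the Pixhawk via its stable /dev/serial/by-id/ symlink
# instead of a /dev/ttyACMn name that can change between boots.
import glob

def pick_pixhawk_port(candidates):
    """Return the first serial path that looks like a Pixhawk, else None."""
    for path in sorted(candidates):
        # Substring match is an assumption; check your actual device name
        if 'PX4' in path or 'Pixhawk' in path or 'ArduPilot' in path:
            return path
    return None

def connect_by_id():
    from pymavlink import mavutil
    port = pick_pixhawk_port(glob.glob('/dev/serial/by-id/*'))
    if port is None:
        raise RuntimeError('No Pixhawk-looking serial device found')
    master = mavutil.mavlink_connection(port, baud=115200)
    master.wait_heartbeat()
    return master
```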
Are you running the companion image? It should not take that long to start. Do you have an Ethernet cable connected?
Can you share the dmesg output? You can run dmesg | curl -F 'f:1=<-' ix.io and send the resulting link, or create a file with dmesg >> /tmp/output.txt and send that.
When we start MAVProxy using:
mavproxy.py --master=/dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00 --console
The code output we’re getting is:
"
online system 1
Mode MANUAL
APM: ArduSub V3.5.3 (ad81760b)
DEPTH TN2_FUNCTIONy@INS_ACC_BODYFIXa~GPS_SAVE_CFGfence breach
0?CamTiltCa~|>>Z :sDW
G{qOw>J>>Z0?CamTiltCa;RNGFND2_POS_Y 0K5?Lights1LiO[ lGE z/J hpWQ 6=KDSERVO2_MAXnBTet1
PiMA)?CamTiltCaU8InputHold# 5l G$c^=?CamTiltCa445(B?CamTiltCa ’
:. K?Lights1Lif AEV_ID2*^UInputHoldS
"
This appears on startup of the MAVProxy console and continues constantly, with only some readable words.
It also reports packet loss at a bit over 50%. The connection is a direct USB link between the two.
That connection path works perfectly, thanks.
With regards to the connection, we don't have an Ethernet connection, as there isn't a topside computer. Everything is being programmed on the Pi.
So we need to disable it looking for a connection.
Are you using our companion image? Do you have QGC connected to the ROV?
If you are, we have a series of services running in the background that use the serial port to communicate with the Pixhawk; you need to kill these services (run screen -ls to see what is running) or create a pymavlink script.
If you aim to use pymavlink, this part is important:
# Create the connection
# If using a companion computer
# the default connection is available
# at ip 192.168.2.1 and the port 14550
# Note: The connection is done with 'udpin' and not 'udpout'.
# You can check at http://192.168.2.2:2770/mavproxy that the communication for 14550
# uses 'udpbcast' (client) and not 'udpin' (server).
# If you want to use QGroundControl in parallel with your python script,
# it's possible to add a new output port at http://192.168.2.2:2770/mavproxy as a new line.
# E.g: --out udpbcast:192.168.2.255:yourport
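Following those comments, here is a minimal pymavlink sketch that listens as the 'udpin' server side. The host and port follow the companion defaults quoted above; everything else is illustrative:

```python
# Sketch: on the companion Pi, listen as a UDP server ('udpin') on the
# port the companion's MAVProxy broadcasts to (14550 by default here).

def endpoint(host='0.0.0.0', port=14550):
    """Build the pymavlink 'udpin' connection string (listen/server side)."""
    return 'udpin:%s:%d' % (host, port)

def wait_for_heartbeat():
    # Imported here so endpoint() works without pymavlink installed
    from pymavlink import mavutil
    master = mavutil.mavlink_connection(endpoint())
    master.wait_heartbeat()  # blocks until the broadcast stream arrives
    print('Heartbeat from system %d' % master.target_system)
    return master
```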
Yes, we are using the companion image, just modified to have a GUI and some other things installed.
It isn't running QGC, as I've seen most people struggle to get it running on the Pi; the only time we have used it is to modify the Pixhawk settings directly from a laptop.
With regards to the ports, under
192.168.2.2:2770/mavproxy
it's currently set to:
--out udpin:localhost:9000
--out udpbcast:192.168.2.255:14550
Should this be changed to
--out udp:localhost:9000
or should we add the line
--out udpbcast:192.168.2.255:9000
screen -ls shows 7 processes running: wldriver, nmearx, filemanager, commrouter, webterminal, webui, and mavproxy.
Which need to be killed?
# If you want to use QGroundControl in parallel with your python script,
# it's possible to add a new output port at http://192.168.2.2:2770/mavproxy as a new line.
# E.g: --out udpbcast:192.168.2.255:yourport
You should only kill the mavproxy service if you want to do small tests with the Pixhawk. Keep in mind that by doing so, nothing else will be able to connect to the Pixhawk, and a reboot of the system will be necessary to run the service again.