I am looking at incorporating two peripheral devices into my system, an altimeter and scaling lasers.
Browsing through the forum, I have found a few similar topics with some ideas for implementation, but I haven't seen much in the way of specific details. I will separate the two items out below and see if I have it straight what is needed for each.
Altimeter: Using a Tritech Micron sounder. All I am looking for is the number displayed, with no system control from it (though an alarm would be nice).
I have seen some others using this, @jwalser. As I understand it, ArduSub will take this in as a rangefinder and display it in the main HUD. What is the best/easiest method for getting the RS232 output into the system: USB-to-serial into the RPi, or an RS232 level shifter into a Pixhawk serial/GPS input? And then what is needed to read the measurement out of the data string?
Scaling laser: Using an Outland Technology laser system.
I believe all I will need to control this is a relay/MOSFET activated from one of the Pixhawk outputs (Main or Aux?) and a controller button mapped to turn it on/off. Am I missing anything there?
We need a program to read the data from the altimeter and then pass it to the autopilot.
The simplest approach would be to connect the altimeter to the Pi and have a Python script parse the altimeter data, pack it into the MAVLink DISTANCE_SENSOR message, and pass it to the autopilot via MAVProxy, which is already running on the Raspberry Pi.
The relay/button sounds like a fine way to control a laser.
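For the ArduSub side of that, a sketch of the parameters involved (the names are from the ArduPilot/ArduSub parameter list, but the exact pin numbers and button-function values depend on your firmware version, so verify them against your own parameter list):

```
RELAY_PIN       # set to the pin number of the AUX output driving the MOSFET gate
BTN5_FUNCTION   # example: map joystick button 5 to a "relay 1 toggle" function
```

With something like that in place, pressing the mapped button toggles the AUX pin and the MOSFET switches the laser supply.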
Thanks.
Should be simple enough to get the RS232 to 3.3V TTL for the Pi.
However, software is not my forte. I can work my way through some C code, Arduino programming, and bash scripting, but I have not done much with Python. Would you happen to have any sample code you could share to get me pointed in the right direction?
Again, thanks.
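To sketch the Python side: assuming the sounder emits one ASCII reading per line, e.g. `1.234m` (the units suffix and line ending here are assumptions, so check the Micron manual for its actual output format), a small parser could convert each raw serial line into the centimetre value that the MAVLink DISTANCE_SENSOR message expects:

```python
def parse_reading(line):
    """Convert one raw serial line like b'1.234m\r\n' to centimetres (int).

    Returns None for blank or unparseable lines so the caller can skip them.
    """
    text = line.decode('ascii', errors='ignore').strip().rstrip('m')
    if not text:
        return None
    try:
        metres = float(text)
    except ValueError:
        return None
    return int(round(metres * 100))  # DISTANCE_SENSOR distances are in cm

# With pyserial this would be fed from the port inside the send loop, e.g.:
#   distance = parse_reading(ser.readline())
```

Skipping `None` results means a serial timeout or a garbled line simply drops that sample instead of sending a bogus distance.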
Sorry for my ignorance (this is my first time dealing with mavlink).
Currently, for initial testing, I am trying to get it to display a static value that I assign in the .py file, before adding the serial read.
I have set RNGFND_TYPE to MAVLink in Setup > Parameters and restarted the system.
I am using PuTTY to SSH into the Pi and running "python readAltimeter.py":
# Import mavutil
from pymavlink import mavutil
# Import serial to access the altimeter
import serial
# Import time to query system time
import time

# Open the serial port to the altimeter: 9600,8,N,1 with a 1 s timeout
ser = serial.Serial('/dev/ttyUSB0', baudrate=9600, timeout=1)

# Create the connection. Default connection for the companion computer:
#master = mavutil.mavlink_connection('udp:192.168.2.1:14550')
# This endpoint is created with the
# '--out udpin:localhost:9000' option with MAVProxy
master = mavutil.mavlink_connection('udpout:localhost:9000')

# Wait for a heartbeat before sending commands
master.wait_heartbeat()

# Configure the autopilot to use the MAVLink rangefinder; the autopilot
# will need to be rebooted after this to use the updated setting
master.mav.param_set_send(
    1,               # target system
    1,               # target component
    b"RNGFND_TYPE",
    10,              # "MAVLink"
    mavutil.mavlink.MAV_PARAM_TYPE_INT8)

# Minimum and maximum valid measurements (cm) that the autopilot should
# use. By default they were 10 and 40, respectively.
min_dist = 0
max_dist = 65535  # the DISTANCE_SENSOR fields are uint16, so 65535 at most
sensor_type = mavutil.mavlink.MAV_DISTANCE_SENSOR_UNKNOWN
sensor_id = 1
orientation = mavutil.mavlink.MAV_SENSOR_ROTATION_PITCH_270  # downward facing
covariance = 0

tstart = time.time()
while True:
    # Read from the serial device; a static value for initial testing,
    # to be replaced with a parsed ser.readline()
    distance = 56
    # Pack the DISTANCE_SENSOR message and send it
    master.mav.distance_sensor_send(
        int((time.time() - tstart) * 1000),  # time since script start (ms)
        min_dist,
        max_dist,
        distance,
        sensor_type,
        sensor_id,
        orientation,
        covariance)
    # Sleep before reading the serial port again
    time.sleep(0.5)
Before running this script, do I need to run "mavproxy.py --udpout:localhost:9000"?
Is there anything else I am missing on this?
Thanks again.
I just tried looking into http://192.168.2.2:2770/mavproxy and nothing shows up.
I can at least see the pages on http://192.168.2.2:2770/camera or /routing.
I am running the version that was shipped about 2 weeks ago (QGroundControl version v3.2.4-BlueRobotics-Rev4).
I have tried running the above python script with master = mavutil.mavlink_connection('udp:192.168.2.1:14550'),
but it states that it cannot create the connection.
I am trying to get the RPi connected so that I can update the companion software. 192.168.2.2:2770 does not show any networks to connect to, but I was able to configure it from the SSH command line.
However, it still does not give me any options on the /system page for download. I can see where to upload manually, but I thought I remembered that I need to use the Blue Robotics version and not the one from ArduSub, correct?
Thanks.
It looks like the issues I was having with accessing the companion web interface were due to using IE (at least the version installed with Win7). Chrome works much better.
Thanks Pete.
I have actually used those exact lasers on a separate project. The main issue I have had with them in the past is keeping them parallel. I ended up 3D printing a bracket out of Tech-G and tapping some adjustment screws directly into it.
Incorporating the MOSFET circuit to activate the lasers with a button push was pretty straightforward.
Jacob @jwalser,
The script is working well now, as is the companion web interface.
We found that the script was hanging while waiting for the heartbeat. Once that line was commented out, it ran as expected. What would be the main concern about not waiting for the heartbeat, and any ideas as to why it would hang there?
Thanks.
The reason is that the link on port 9000 is one-directional, input-only (the MAVProxy option being used here is --out udpin). This means messages from the autopilot, including heartbeats, are never forwarded to your program.
If you want bi-directional communication, use udpout or just udp.
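The pairing can be summarized like this (a sketch only; the rest of the MAVProxy command line is whatever the companion image already runs):

```
# One-way: MAVProxy listens on 9000, the script can only send into it.
#   MAVProxy side:  --out udpin:localhost:9000
#   Script side:    mavutil.mavlink_connection('udpout:localhost:9000')
#   Result: DISTANCE_SENSOR messages reach the autopilot, but
#   wait_heartbeat() blocks forever because nothing comes back.
#
# Two-way: MAVProxy sends to 9000, the script listens and can reply.
#   MAVProxy side:  --out udpout:localhost:9000
#   Script side:    mavutil.mavlink_connection('udpin:localhost:9000')
#   Result: the script receives heartbeats, so wait_heartbeat() returns.
```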
Hi all
I have almost the same task: an ultrasonic sensor sending distance measurements to QGC.
Thanks to all of you, and @jwalser … I have some things to study already!
As far as I understand, @RyanC's problem was not the RPi image he used, but the "waiting for a heartbeat" python code.