Trying to extend my BlueROV2 to switch between receiving control inputs from the Xbox controller and running autonomously from a Python script on the Pi.
Has someone done this before? Is all of the work for this done on the companion computer, or does the Pixhawk have to get involved?
Please correct me if I am wrong, but I imagine having a Python script on the companion computer that (if there is no user input from the Xbox controller) will autonomously run a mission that I programmed.
That’s likely to be challenging to do with a very smooth transition, because the control station software running on your topside computer (e.g. QGroundControl) continuously sends joystick (MANUAL_CONTROL) messages to the vehicle to tell it the current state of the joysticks. You’ll need a way to turn that stream off at the same time as you start sending new commands from the program on the onboard computer (RPi).
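For reference, sending those MANUAL_CONTROL messages from a pymavlink script on the onboard computer looks roughly like this - note that the connection endpoint here is an assumption, and depends on how MAVLink is routed by your companion software:

```python
from pymavlink import mavutil

# Assumed endpoint - depends on how MAVLink is routed on your onboard computer
master = mavutil.mavlink_connection('udpin:0.0.0.0:9000')
master.wait_heartbeat()

# MANUAL_CONTROL convention in ArduSub: x/y/r are -1000..1000, z is 0..1000 (500 = neutral)
master.mav.manual_control_send(
    master.target_system,
    500,  # x: half forward
    0,    # y: no lateral motion
    500,  # z: neutral vertical (no depth change input)
    0,    # r: no yaw
    0,    # buttons bitfield
)
```

These need to be sent regularly (like QGC does) for the vehicle to keep responding to them.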
What kind of autonomous behaviour are you after, though? ArduSub has some position-based autonomous commands built in, so the mission behaviour you want may already be available if you have a sensor providing a position estimate.
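As a rough illustration (definitely not a drop-in solution), commanding one of those built-in position-based behaviours with pymavlink could look something like the sketch below. It assumes GUIDED mode, an armed vehicle, and a valid position estimate, and the endpoint and offsets are placeholders:

```python
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:9000')  # assumed endpoint
master.wait_heartbeat()

# GUIDED mode needs a position estimate (e.g. from a DVL or USBL system)
master.set_mode('GUIDED')
master.arducopter_arm()

# Ask the autopilot to move 2 m forward of the current position, holding depth
master.mav.set_position_target_local_ned_send(
    0,                                          # time_boot_ms (not used)
    master.target_system,
    master.target_component,
    mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,  # offsets relative to the vehicle body
    0b0000111111111000,                         # type_mask: use only x/y/z position
    2, 0, 0,                                    # x, y, z offsets (m)
    0, 0, 0,                                    # vx, vy, vz (ignored)
    0, 0, 0,                                    # ax, ay, az (ignored)
    0, 0,                                       # yaw, yaw_rate (ignored)
)
```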
@EliotBR So you are saying the topside joystick messages would override the commands I would send from the Pi? When you say it’s sending them to the vehicle, do you mean they get forwarded by the Pi to the Pixhawk?
The behavior I am after is identifying an object using the camera, then (at least at first) completing a circle around the object while keeping it in the center of the camera’s FOV and maintaining the same depth, then surfacing.
The motivation of the project is to experiment with different patterns of motion around an underwater object to minimize time taken while producing a video that can be used to create a 3D model from the images (this part is done after the fact).
Not so much that they’d override, but rather that there’s no way to tell the autopilot to prefer control from one source over another when it’s receiving conflicting control commands. You could potentially create a proxy program on the RPi to do that for you (or on your topside computer if both QGC and your control program are running on the topside), but that may not be trivial, especially if you’re wanting to maintain the existing communication latency.
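To make the proxy idea a bit more concrete, here’s a very rough one-directional sketch with pymavlink - the ports and the switching condition are placeholders, and telemetry flowing the other way would need equivalent handling:

```python
from pymavlink import mavutil

# Placeholder endpoints - these depend on how MAVLink is routed on your vehicle
from_topside = mavutil.mavlink_connection('udpin:0.0.0.0:14551')    # QGC -> proxy
to_autopilot = mavutil.mavlink_connection('udpout:127.0.0.1:9000')  # proxy -> autopilot

autonomous = False  # hypothetical flag, toggled by whatever starts/stops your mission

while True:
    msg = from_topside.recv_match(blocking=True)
    if msg is None or msg.get_type() == 'BAD_DATA':
        continue
    # While the onboard program is in control, drop the topside joystick stream
    # so the autopilot only receives MANUAL_CONTROL from one source
    if autonomous and msg.get_type() == 'MANUAL_CONTROL':
        continue
    # Forward the original bytes unchanged to avoid re-encoding issues
    to_autopilot.write(msg.get_msgbuf())
```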
Yes, and telemetry from the autopilot (running on the Pixhawk) also gets forwarded by the onboard computer (RPi) to the topside computer.
Fair enough. Logic-wise that should be reasonably straightforward with a combination of the existing depth hold mode and a process like the one described here. The control switchover is a separate issue though, as discussed above.
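Purely as a sketch of that logic (assuming pymavlink, with placeholder ports and gains, and a stand-in vision function for the object’s offset from the image centre - the actual detection and tuning are up to you):

```python
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpin:0.0.0.0:9000')  # assumed endpoint
master.wait_heartbeat()

# Depth hold ("ALT_HOLD" in ArduSub) makes the autopilot manage depth, so the
# script only has to command yaw and lateral motion
master.set_mode('ALT_HOLD')
master.arducopter_arm()

def object_offset_x():
    """Placeholder: return the detected object's horizontal offset from the
    image centre, normalised to -1..1 (your vision code goes here)."""
    return 0.0

K_YAW = 400  # proportional yaw gain - would need tuning on the vehicle

while True:
    # Yaw to keep the object centred, strafe sideways to orbit it,
    # and leave z neutral so depth hold maintains the current depth
    yaw = int(max(-1000, min(1000, K_YAW * object_offset_x())))
    master.mav.manual_control_send(
        master.target_system,
        0,    # x: no forward/backward input
        300,  # y: constant lateral thrust to circle the object
        500,  # z: neutral vertical
        yaw,  # r: yaw correction from the camera offset
        0,    # buttons
    )
    time.sleep(0.1)  # send at a steady rate so the autopilot keeps responding
```

Surfacing at the end could then be as simple as switching to a suitable mode and commanding upward thrust, or disarming once you’re at the surface.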