Has anybody here ever tried to use an optical flow sensor for position hold over ground? I found some very old posts suggesting this, but no reports of any experiments or successes.
I'm considering trying it, but some hints or experiences would be a great starting point.
I’m not aware of anyone actively pursuing this, and past discussions indicate it’s not expected to work well in water.
Performance expectations aside, most current DVL integrations use the autopilot's optical flow functionality, so it should at least be possible to pass any estimates you generate to the autopilot.
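For reference, the relevant message is OPTICAL_FLOW. Here's a rough (untested) pymavlink sketch of sending one; the connection string is a placeholder, and I believe ArduPilot's MAVLink flow backend reads the integer flow_x/flow_y fields, so those are the ones filled in here:

```python
# Untested sketch: forwarding your own flow estimates to the autopilot
# as OPTICAL_FLOW MAVLink messages via pymavlink.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpout:192.168.2.1:14550')  # example endpoint
master.wait_heartbeat()

def send_flow(flow_x_dpix, flow_y_dpix, quality, ground_distance_m):
    """flow_x/y: accumulated flow since the last message; quality: 0-255."""
    master.mav.optical_flow_send(
        int(time.time() * 1e6),              # time_usec
        0,                                   # sensor_id
        int(flow_x_dpix), int(flow_y_dpix),  # accumulated flow
        0.0, 0.0,                            # flow_comp_m_x/y (legacy, unused here)
        quality,                             # 0 means invalid
        ground_distance_m)                   # -1 if unknown
```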
I’ve recently written an optical flow extension with some help from Sanket Sharma. I’ve blogged about it here: BlueOS Extension for Optical Flow and Precision Landing - Blog - ArduPilot Discourse, but it only works with an ethernet-connected camera gimbal that provides an RTSP stream. It could theoretically be extended to a webcam, though.
Thanks @EliotBR and @rmackay9 for pointing me to this valuable information and the optical flow code for BlueOS. I like the idea of being able to use a custom down-looking camera (which I have in my UW vehicle anyway) and optimize the code to my needs instead of using a COTS optical flow sensor. Since I don’t have the option to mount the camera on a gimbal, I wonder if it would be possible to use AP’s live IMU data to compensate for image changes which are not related to lateral movement (like roll, pitch and yaw) in an advanced OF algorithm.
Yes, by default AP subtracts the rotation measured by the gyros from the flow values it receives. One problem, though, can be timing mismatches between the IMU and flow values, especially if the vehicle is very active (like multicopters are).
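As a rough illustration of the idea (not the actual AP code; the names here are made up), and of why the timing matters:

```python
# The gyro rates must be averaged over exactly the same interval as the
# image pair that produced the flow; any mismatch shows up directly as an
# error in the compensated result.
class FlowCompensator:
    def __init__(self):
        self._sum_x = self._sum_y = 0.0
        self._count = 0

    def push_gyro(self, gx, gy):
        """Call at IMU rate with body rates in rad/s."""
        self._sum_x += gx
        self._sum_y += gy
        self._count += 1

    def compensate(self, flow_rate_x, flow_rate_y):
        """Call once per flow measurement (rad/s); subtracts the mean
        body rate over the interval, leaving translation-induced flow."""
        if self._count == 0:
            return flow_rate_x, flow_rate_y
        mean_x = self._sum_x / self._count
        mean_y = self._sum_y / self._count
        self._sum_x = self._sum_y = 0.0
        self._count = 0
        return flow_rate_x - mean_x, flow_rate_y - mean_y
```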
BTW, optical flow also requires a sonar or lidar so that the optical flow algorithm knows how far away the scene is. Just like looking out the window of a train, flow values are much lower for objects that are far away.
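In code terms the scale correction is just the range multiplied in (a toy example, not the actual implementation):

```python
def flow_to_velocity(flow_rate_rad_s, distance_m):
    """Convert rotation-compensated angular flow (rad/s) into
    ground-relative velocity (m/s) using the rangefinder distance.
    The same angular flow maps to a larger velocity the further
    away the scene is - the train-window effect."""
    return flow_rate_rad_s * distance_m
```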
Ah, I was not aware that AP already takes care of this. Yes, timing might be an issue, and also the behavior of the optical flow algorithm during yaw. Probably it would be more reliable to address the yaw compensation directly in the OF code?
Sure, optical flow only works when the distance is known. I intend to use a green lidar from a COTS rangefinder, hook it up to the Raspberry Pi via USB and feed the value to the Pixhawk (corrected for the refractive index of water) via a BlueOS extension.
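Something like this is what I have in mind for the extension side (untested sketch; the serial port, parsing and MAVLink endpoint are placeholders, and the n ≈ 1.33 correction assumes a time-of-flight device calibrated in air, where light travels faster, so the reported range in water is too long by roughly that factor):

```python
# Read the lidar over USB serial, correct the range for water, and
# forward it to the Pixhawk as a DISTANCE_SENSOR message.
import time
import serial
from pymavlink import mavutil

N_WATER = 1.33
port = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)          # placeholder
master = mavutil.mavlink_connection('udpout:192.168.2.1:14550')  # placeholder
master.wait_heartbeat()
boot_time = time.time()

while True:
    line = port.readline()
    if not line:
        continue
    reported_m = float(line.decode().strip())   # placeholder parsing
    corrected_cm = int(reported_m / N_WATER * 100)
    master.mav.distance_sensor_send(
        int((time.time() - boot_time) * 1000),  # time_boot_ms
        10, 1000,                               # min/max distance (cm)
        corrected_cm,                           # corrected distance (cm)
        mavutil.mavlink.MAV_DISTANCE_SENSOR_LASER,
        0,                                      # sensor id
        mavutil.mavlink.MAV_SENSOR_ROTATION_PITCH_270,  # facing down
        0)                                      # covariance (unknown)
```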
Probably my UW application is less timing-critical than using it in a copter like you did. So maybe I should just begin with some experiments…
Hi @rmackay9 -
Would your extension, perhaps with some tweaks, be able to parse an RTSP video stream coming from a USB camera in BlueOS? An ExploreHD is easy to set up as a downward-facing camera on an ROV, and I prefer the RTSP stream for Dashcam extension recording.
A Ping single-beam sonar also gives pretty good altitude measurements, already available in the autopilot via the distance sensor interface…
I’d be happy to set up an ROV and record some logs/videos, which should be roughly synchronized since Dashcam recording starts when the vehicle is armed?
The yaw compensation is a real problem, actually. Even on a multicopter, so far I’ve been keeping the yaw movement of the vehicle to a minimum because I have seen some significant estimation issues when the vehicle yaws. I hope to come back and address this in the future. I’m not sure whether I’ll try to handle this in the optical flow algorithm (running on the RPi) or in the autopilot.
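If it does end up on the RPi side, one option might look something like this (just an untested idea with made-up names): for a down-looking camera, yaw produces a rotational flow field about the principal point, so the contribution predicted from the gyro z-rate can be subtracted per feature before averaging.

```python
import numpy as np

def remove_yaw_flow(points_px, flows_px, yaw_rate_rad_s, dt, center_px):
    """points_px, flows_px: Nx2 arrays of feature positions and measured
    pixel displacements over dt; center_px: principal point (cx, cy)."""
    r = np.asarray(points_px, dtype=float) - np.asarray(center_px, dtype=float)
    # a small yaw of angle w*dt moves each point tangentially around the
    # centre: (x, y) -> approximately (x, y) + w*dt * (-y, x)
    # (the sign depends on camera mounting and image-axis conventions)
    yaw_flow = yaw_rate_rad_s * dt * np.stack([-r[:, 1], r[:, 0]], axis=1)
    return np.asarray(flows_px, dtype=float) - yaw_flow
```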
If an RTSP stream is available then I think the extension will work as-is. The extension allows the user to specify the RTSP URL and FOV of the camera.
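For anyone pointing it at their own camera, those two inputs boil down to something like this (example URL and FOV; the linear pixel-to-radian conversion is a small-angle approximation):

```python
import math
import cv2

RTSP_URL = 'rtsp://192.168.2.2:8554/video_stream'  # placeholder URL
HFOV_DEG = 80.0                                    # placeholder camera FOV

# OpenCV can open an RTSP URL directly
cap = cv2.VideoCapture(RTSP_URL)
ok, frame = cap.read()
if ok:
    width_px = frame.shape[1]
    rad_per_px = math.radians(HFOV_DEG) / width_px
    # a feature moving dx pixels between frames dt seconds apart then
    # corresponds to an angular flow of about dx * rad_per_px / dt (rad/s)
    print(f"approx. {rad_per_px:.2e} rad per pixel")
```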
I’ve been working on optical-flow-based drift hold to cancel out surge and sway for the last two weeks. In theory it’s churning out sensible values, but I’m just at the stage of mapping the output to the thrusters so I can test armed. So no idea if this will work yet, but it’s been a journey so far!
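For context, the thruster-mapping step is roughly this shape (an untested sketch of one possible approach, not finished code; the gain and the velocity source are placeholders):

```python
# Simple P controller on flow-derived body velocities, mapped to ArduSub
# surge/sway via MANUAL_CONTROL (x/y in -1000..1000, z ~500 is neutral).
from pymavlink import mavutil

master = mavutil.mavlink_connection('udpout:192.168.2.1:14550')  # example
master.wait_heartbeat()

KP = 400.0  # stick units per m/s - needs tuning in the water

def hold_position_step(vx_m_s, vy_m_s):
    """vx/vy: body-frame velocities estimated from optical flow."""
    surge = int(max(-1000, min(1000, -KP * vx_m_s)))  # oppose the drift
    sway = int(max(-1000, min(1000, -KP * vy_m_s)))
    master.mav.manual_control_send(
        master.target_system,
        surge, sway,
        500,   # neutral throttle
        0,     # no yaw command
        0)     # no buttons
```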