I am looking forward to being able to actually put my AUV in the water and say “go that way at these ESC revs, for 30 seconds” and see how repeatable it is and how it scales with battery voltage.
I suspect that it will be good enough, but if it’s not then I’ll replace the depth pinger with a DVL.
If it’s relevant, I’ve had a similar experience with our 148 Wh LiPo packs, as long as they have the leads covered (e.g. with some tape over them), they’re in boxes, and there are no more than 2 per passenger.
I guess that could be more complicated if the battery in question isn’t commercially available, as people tend to be suspicious of anything homemade…
Thanks for the shout-out
For those not aware, BR have made one with broader coverage, which is actually getting updates, if that’s of interest. I’m not sure how best to reconcile the two options: I like the transparency of git for tracking changes over time, but understandably a comparative resource is most useful when it’s comprehensive and up to date. As I mentioned in a recent talk, “open source is not sufficient”…
I should likely try to link to the spreadsheet in the places that currently link to the GitHub file, if only to make sure people are aware of it.
Some really interesting insights and conversations in this thread - it’s always fascinating and exciting when the community comes together to discuss ideas, tradeoffs, and potential innovations.
I’m also looking at this, but here is the information I have so far.
For an autopilot, I think ArduPilot’s Sub firmware should be OK (ArduPilot Sub — Sub documentation). I have used ArduPilot for an octocopter and it was good, with good support (many users). I’m not sure how stable the “Sub” part is, as I haven’t personally experimented with it.
PX4 is also an open-source autopilot, but I’m less familiar with its capabilities.
ROS/ROS 2 can be used to add higher-level capabilities on top of ArduPilot or PX4.
I’ve been working on an AUV running ArduSub, controlled via a Lua script in conjunction with BlueOS! A full guide is coming before the end of the year… but you can check out the code here (HAUV.Lua). The repository also holds an extension that is used to record video (triggered by the script) and telemetry data to a .ass subtitles file…
I talked to WaterLinked and they said the error is in velocities only (x, y, z)… so “distance traveled” can be very accurate with no INS. Positional error, on the other hand, will depend greatly on the quality of the attitude sensor, i.e. what the orientation of the vehicle is when traveling at those x, y, z velocities… integrated over time, of course.
So, as I suspected, a good INS will be needed for low error in long-range navigation.
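For intuition, here is a minimal dead-reckoning sketch (Python/numpy, purely illustrative - not WaterLinked’s code) of why the attitude quality ends up dominating: the body-frame DVL velocities get rotated into the world frame and integrated, so a constant yaw bias rotates the whole track and the position error grows with distance travelled even if the speeds are exact.

```python
import numpy as np

def body_to_world(roll, pitch, yaw):
    """Rotation matrix from body frame to world (NED) frame, ZYX convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    return np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr               ],
    ])

def dead_reckon(samples, dt):
    """Integrate body-frame DVL velocities into a world-frame position.

    samples: iterable of (vx, vy, vz, roll, pitch, yaw), angles in radians.
    """
    position = np.zeros(3)
    for vx, vy, vz, roll, pitch, yaw in samples:
        v_world = body_to_world(roll, pitch, yaw) @ np.array([vx, vy, vz])
        position += v_world * dt
    return position

# Example: 100 s straight run at 1.5 m/s, with and without a 1 deg yaw bias.
truth = dead_reckon([(1.5, 0, 0, 0, 0, 0)] * 1000, dt=0.1)
biased = dead_reckon([(1.5, 0, 0, 0, 0, np.radians(1.0))] * 1000, dt=0.1)
print(np.linalg.norm(truth - biased))  # ~2.6 m of error over a 150 m run
```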
I have been pondering this a bit, and I think there is a bit more to it (yep, better heading accuracy helps heaps).
But take a theoretical, say, 100 m run as a “standard” measure:
Using the Blue Robotics Navigator with its inbuilt MEMSIC MMC5983MA 3-axis magnetic sensor: heading accuracy ±1.0° (per the datasheet note, 1.0° typical is achievable when using MEMSIC’s proprietary software or algorithm).
This gives, over the theoretical 100 m, a cross-track accuracy of ±1.745 m.
Looking at improved accuracy from the PNI RM3100 as an example (roughly ±0.1° heading accuracy),
this is, over the theoretical 100 m, ±0.174 m.
For the DVL, 1% of the velocity (assume 1.5 m/s for the 100 m run, i.e. ±0.015 m/s over the 66.6 second theoretical run) equates to ±1 m (so broadly 5 times worse than the mid-range “compass”).
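For anyone who wants to reproduce the arithmetic, here is the back-of-envelope model in code form (Python, just the geometry, no filtering; the ±0.1° figure for the RM3100 is the value implied by the ±0.174 m result):

```python
import math

RUN = 100.0              # m, the theoretical "standard" run
SPEED = 1.5              # m/s, assumed cruise speed
DURATION = RUN / SPEED   # ~66.7 s

def cross_track_error(heading_error_deg, distance=RUN):
    """Lateral offset after travelling `distance` with a constant heading error."""
    return distance * math.tan(math.radians(heading_error_deg))

def dvl_drift(velocity_error_frac, speed=SPEED, duration=DURATION):
    """Drift from a constant fractional velocity error over the run time."""
    return velocity_error_frac * speed * duration

print(cross_track_error(1.0))   # ~1.745 m  (MMC5983MA-class, +/-1.0 deg)
print(cross_track_error(0.1))   # ~0.174 m  (RM3100-class, +/-0.1 deg)
print(dvl_drift(0.01))          # ~1.0 m    (DVL at 1% of 1.5 m/s for ~66.7 s)
```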
Waterlinked documentation is a little lacking as to its accuracy claims. I strongly suspect the ±1% and the ±0.1% are NOT of the measured value (1.5 m/s in this assumption) BUT of the full-scale range (somewhere I think I saw 4.5 m/s, but I can’t lay my hands on it).
Bang for $’s, improved heading, pitch, and yaw from “mid range” sensors is the way to go, but the errors in the DVL shouldn’t be underestimated.
I also still suspect that, since the DVL’s accuracy improvement from ±1% to ±0.1% is software only (no hardware change), it is just a better filtering algorithm.
Don’t forget the accuracy of the accelerometer input to the orientation problem, as well as the relative misalignment of the sensors with respect to each other.
The orientation of the 3D accelerometer is used to transform the 3D magnetic sensor measurement from a body-centric frame of reference to a world-centric frame of reference. And the 3D magnetic field measurement presumes that you have an accurate 3D magnetic field reference vector, most likely provided by the World Magnetic Model, which is itself only an estimate.
Certainly in my part of the world the magnetic field is more vertical than horizontal, so even being a little bit out with the “tilt” measurement causes a larger error in the horizontal magnetic direction estimate.
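As a rough numeric illustration (Python, using a standard roll/pitch de-rotation in a NED body frame; the 70° dip is just a representative high-latitude value, and this is a sketch rather than any particular product’s algorithm), a steep field turns a small tilt error into a much larger heading error:

```python
import numpy as np

def tilt_compensated_heading(mag_body, roll, pitch):
    """Rotate a body-frame magnetometer reading level using the accelerometer's
    roll/pitch estimate, then take the heading from the horizontal components."""
    mx, my, mz = mag_body
    xh = (mx * np.cos(pitch)
          + my * np.sin(roll) * np.sin(pitch)
          + mz * np.cos(roll) * np.sin(pitch))
    yh = my * np.cos(roll) - mz * np.sin(roll)
    return np.degrees(np.arctan2(-yh, xh))

# Body is level and pointing along the horizontal field (true heading = 0 deg),
# but the field has a steep ~70 deg dip, as at high latitudes.
dip = np.radians(70.0)
mag = np.array([np.cos(dip), 0.0, np.sin(dip)])   # body-frame (NED) reading

print(tilt_compensated_heading(mag, roll=0.0, pitch=0.0))
# 0.0 deg with a perfect tilt estimate
print(tilt_compensated_heading(mag, roll=np.radians(1.0), pitch=0.0))
# ~2.7 deg heading error from just 1 deg of roll error (roughly tan(dip) * tilt error)
```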
Putting all this together, I think the most important thing is that the direction estimate is consistent. If you think you turn left by 15 degrees, you should turn left by 15 degrees, and it should not matter that the body has a 2 degree roll stbd side down or a 0.5 degree nose up because the body balance isn’t perfect. Absolute accuracy would achieve this of course, but it also presumes that the field reference is correct.
Hi @Scott_W (and all others interested in AUVs and BlueOS) -
I’ve got a couple of extensive guides in the works, but this video serves as a good teaser for the content! (also available on my personal channel) While the profiling AUV that is the focus of this effort isn’t a traditional AUV running horizontal mapping transects, it’s not a huge stretch to reach that stage if the autopilot can be fed a position, from a DVL for example. Happy to answer questions, and they may help keep me from forgetting anything important to include in the write-up!
Just a question as to your logic for why you chose to go the Lua script direction rather than an Auto mode. Not trying to say what’s the right direction, just more what you saw as the advantages or disadvantages of one over the other.
Another point of interest about the Starlink: it has a built-in NTP server, so Cerulean Surveyor data can be precisely time-synced to external data sources in post processing even if not logged in the same file stream. Granted, usually not an issue in a BlueBoat scenario.
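As a sketch of what that sync can look like (Python with ntplib; the router address is an assumption - substitute whatever your Starlink router answers NTP on), you can just log the local clock offset alongside your data and shift the other streams onto the same timebase in post:

```python
import ntplib

# Query the router's NTP server (address is a placeholder) and record the
# local clock offset, so other data streams can be aligned in post processing.
client = ntplib.NTPClient()
response = client.request("192.168.1.1", version=3)
print(f"local clock offset: {response.offset:+.3f} s")
```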
The AUV only travels vertically, and has no acoustic positioning sensor (DVL, USBL, or LBL). As a result, Auto mode isn’t really an option! The Lua script is an easy way to expand autopilot functionality without having to modify the firmware itself. You could accomplish similar things with Python scripts and MAVLink commands, but this requires considerably less setup!
For more traditional, horizontally navigating AUVs, ArduSub’s Auto mode is definitely the right approach - but that position sensor is going to be a (moderately expensive) requirement! Lua scripts may still be useful in that application for other purposes, like controlling a payload.
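For a concrete sense of the Python/MAVLink alternative, here’s roughly what a simple scripted sequence looks like with pymavlink (the UDP port, mode, and control values are placeholders for illustration, not what HAUV.Lua does):

```python
import time
from pymavlink import mavutil

# Connect to the autopilot's MAVLink stream (BlueOS typically forwards UDP 14550).
master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

# Switch to depth hold and arm - the same kind of sequencing a Lua script
# would do on the autopilot itself, here driven from a companion computer.
master.set_mode("ALT_HOLD")
master.arducopter_arm()
master.motors_armed_wait()

# Command a gentle descent for ~10 s via MANUAL_CONTROL
# (ArduSub z axis: 0-1000, 500 is neutral, below 500 is down).
for _ in range(100):
    master.mav.manual_control_send(master.target_system, 0, 0, 300, 0, 0)
    time.sleep(0.1)

master.arducopter_disarm()
```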
My little AUV is largely about vertical dives too, and the Ping echosounder is a fundamental part of not hitting the bottom.
It looks like you are not using a sounder. If this is the case, is there a particular reason, or is it just a matter of keeping things as simple as possible?
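If it helps anyone, here’s the general shape of that bottom-avoidance gate as a sketch (Python with the ping-python library; the port, baud, and thresholds are placeholders, not my actual dive logic):

```python
from brping import Ping1D

# Ping1D sonar on the companion computer's serial port (port/baud are placeholders).
ping = Ping1D()
ping.connect_serial("/dev/ttyUSB0", 115200)
if not ping.initialize():
    raise RuntimeError("failed to initialize Ping1D")

MIN_ALTITUDE_MM = 2000   # stop descending within 2 m of the bottom
MIN_CONFIDENCE = 80      # ignore low-confidence returns

def safe_to_descend():
    """Return True only when the sonar confidently reports enough clearance."""
    data = ping.get_distance()
    if data is None or data["confidence"] < MIN_CONFIDENCE:
        return False                        # no trustworthy reading: play it safe
    return data["distance"] > MIN_ALTITUDE_MM

# The dive loop checks safe_to_descend() before each descent step and
# switches to hold/ascend once it returns False.
```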
That’s correct, no sounder, but you could definitely use one and avoid striking the bottom directly! I chose not to, as the maximum depth I was targeting (400 m) was deeper than the Ping echosounder’s max depth specification.
I do hope to use a stereocamera (1000m rated) to recognize the bottom and trigger hovering when a desired offset is achieved, but this is a longer term goal!