Hello Blue Robotics community,
I’m excited to share two emerging Seattle Aquarium conservation research projects with you all, centered on using a customized BlueROV2 to conduct standardized benthic surveys in the nearshore subtidal along the Olympic Peninsula and within Puget Sound, Washington, USA. Given the logistical limitations of conducting scientific SCUBA surveys (there’s only so much “ground” we as divers can cover carrying life support equipment on our backs!), our overarching objective is to expand the spatial extent across which we survey species abundance and distribution for kelp, invertebrates, and fishes, particularly within kelp forests.
A major challenge is operating a tethered vehicle within a canopy-forming kelp forest. However, thanks to the small size, high maneuverability, and responsive handling of the BlueROV2, we’ve developed methods to (1) survey 100 m straight into a kelp forest, (2) turn the ROV around and locate the tether, and (3) carefully follow the tether back out of the kelp forest, which requires maneuvering left/right of, or under/over, bull kelp stipes (see here for examples of us getting the vehicle out of bull kelp forests). Through careful coordination between the tether handler and ROV pilot, we’ve even developed methods of putting tension on the tether and then exerting directional force via the ROV, all of which “creates an opening” when the tether is lodged in a thick tangle of bull kelp stipes (see this short video).
We have one GoPro 10 facing forward and one GoPro 10 facing downward mounted on a Payload Skid (neither of which is wired into the ROV, i.e., we turn them on at the beginning and let them run). We use a Ping Sonar Altimeter to keep our imagery at a consistent scale (we hold the ROV 1 m above the benthos). We’re also using the WaterLinked Underwater GPS with the U1 locator to provide a geospatial record of our surveys. We have four lights facing downward (our primary illumination) and two facing forward. See here, here, & here for examples of our benthic transects via the downward-facing GoPro, and here, here, & here for examples via the forward-facing GoPro.
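To make the “1 m above the benthos” idea concrete, here is a toy sketch (not our actual pilot workflow, which is flown manually) of how an altimeter reading could feed a simple proportional altitude hold. With real hardware you would read `distance` and `confidence` from the Ping via Blue Robotics’ `brping` Python library; the constants and function name below are purely illustrative.

```python
# Hypothetical proportional altitude-hold sketch for a Ping altimeter.
# The Ping reports distance in mm and a 0-100% confidence value.

TARGET_MM = 1000      # hold 1 m above the benthos
KP = 0.001            # proportional gain (made-up tuning for illustration)

def vertical_command(distance_mm: int, confidence: int) -> float:
    """Return a vertical thrust command in [-1.0, 1.0].

    Positive = thrust down (ROV is too high); negative = thrust up.
    Low-confidence sonar returns are ignored (command 0.0).
    """
    if confidence < 50:
        return 0.0
    error_mm = distance_mm - TARGET_MM
    cmd = KP * error_mm               # simple P controller
    return max(-1.0, min(1.0, cmd))   # clamp to thruster range
```

In practice you would add filtering and integral/derivative terms, but the sketch shows the basic mapping from altimeter error to a vertical thrust demand.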
On the analytical side, we’re using open-source AI programs to extract metrics of abundance and percent-coverage from our imagery. Specifically, we’re using VIAME (see here) for object detection, and CoralNet (see here) for metrics of percent-coverage. See the attached image for an example of using the ROV imagery within the respective GUIs, including an example of mapping an ROV survey atop open-source bathymetry. In short, our workflow involves extracting stills from the 4K GoPro video, annotating those images in VIAME and CoralNet, then appending those community data back to the original ROV telemetry file (containing the Ping data and GPS coordinates), such that we have a spatially explicit record of species abundances/percent-coverage along with all ROV metadata.
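The last step of that workflow, joining per-image annotation data back onto the telemetry log, can be sketched as a nearest-timestamp join. The field names below (`"t"`, `"kelp_pct"`, etc.) are illustrative placeholders, not the actual schema in our repositories:

```python
# Hypothetical sketch: attach VIAME/CoralNet annotation results to the
# ROV telemetry row with the nearest timestamp, so each record carries
# GPS coordinates, altitude, and community data together.
from bisect import bisect_left

def join_to_telemetry(telemetry, annotations):
    """telemetry:   list of dicts with a numeric "t" key, sorted by "t"
    annotations: list of (t, counts_dict) pairs from image annotation

    Returns copies of the nearest telemetry rows with counts merged in.
    """
    times = [row["t"] for row in telemetry]
    joined = []
    for t, counts in annotations:
        i = bisect_left(times, t)
        # choose the nearer of the two neighboring telemetry rows
        if i > 0 and (i == len(times) or t - times[i - 1] <= times[i] - t):
            i -= 1
        row = dict(telemetry[i])
        row.update(counts)
        joined.append(row)
    return joined

# Example with fabricated data:
telemetry = [
    {"t": 0,  "lat": 47.600, "lon": -122.340, "alt_m": 1.0},
    {"t": 10, "lat": 47.601, "lon": -122.341, "alt_m": 1.1},
]
annotations = [(9, {"kelp_pct": 40, "urchins": 3})]
result = join_to_telemetry(telemetry, annotations)
```

A production version would handle clock offsets between the GoPros and the topside computer, but the core idea, aligning imagery-derived data with telemetry by time, is the same.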
Having developed methods to conduct standardized benthic surveys within kelp forests, it is our hope that these methods will give researchers and those engaged in conservation an additional tool in their long-term benthic monitoring toolkit. As we are big proponents of open-access research, we’ve created two GitHub repositories where we’re maintaining information:
- Seattle_Aquarium_ROV_development houses general information, the exact hardware/software we’re using, and video from the ROV.
- Seattle_Aquarium_ROV_telemetry_imagery_analysis is slightly more in the weeds, with several documents and a variety of code for specific tasks.
By no means will I claim our analytical approach is THE way; rather, it is simply A way . . . keep in mind we’re field biologists (not computer engineers!), so we’re doing the best we can to adopt and implement the technology and software you all have worked so hard to make available.
There are a variety of features and customizations we want to develop further, some of which I’ve seen posts about, and some of which I haven’t come across any mention of before. I’ll save all that for another day, however. Thank you, Blue Robotics, for making your technology modular and accessible . . . I’m really excited to develop these survey methodologies further; I genuinely believe this type of technology, together with the parallel advances in open-source AI, has the potential to change the way we survey and understand ecological health and resilience. Please let me know if you have any questions, comments, or ideas.
Take good care,
Zach