Update on research using BlueROV2 for kelp forest surveys

Hello Blue Robotics community,

It’s been a long while since I wrote the original post on this thread about our methods of using a BlueROV2 to conduct benthic surveys within kelp forests, and our methods have evolved quite a bit since then, so I figured I’d share an update. I’m tagging @M.williams, @Reidtosa, and @Clyde on the Aquarium team (shoutout to all of them!), as this whole body of work is very much a team sport, and we couldn’t do it without each of them! :slight_smile:

Our ultimate objective: increase the spatial extent (seafloor area) across which we gather data about the benthos (with derived metrics on substrate type, algae, invertebrates, and fishes). More data = a greater understanding of the ecological processes structuring coastal ecosystems = more information to increase the efficacy of coastal conservation, restoration, and management.

To pursue that end, we’ve adapted the BlueROV2 framework to gather high-resolution downward-facing photos and forward-facing video while conducting standardized ROV surveys. I’ll briefly touch on a few aspects of our research:

  • We no longer extract still images from 4K downward-facing video, as the extracted frames suffer a noticeable loss of image quality. Instead, we shoot 27.3MP photos directly (see images below) using a GoPro 12 with custom settings via GoPro Labs; we batch edit the resulting raw .GPR photos in Adobe Lightroom Classic.

  • Shooting these photos with properly dialed-in GoPro settings (shutter speed, ISO, etc.) requires A LOT of light; we use four Kraken Solar Flare 18,000-lumen lights stepped down to 80% (14,400 lumens per light, 57,600 lumens total).

  • To maintain a consistent scale in our downward-facing imagery, we use surftrak (in ArduSub 4.5) to lock the ROV’s altitude at 0.8m above the seafloor (via Water Linked’s DVL-A50); we motor forward at approximately 0.10m/s. (A minimal mode-switch sketch is included after this list.)

  • Our primary ROV is now powered by two 200Ah lithium-ion batteries aboard the Seattle Aquarium’s vessel, via the Outland Technology Power Supply.

  • Our positioning and navigation have greatly improved thanks to sensor fusion. Our topside Advanced Navigation GNSS Compass provides GPS position and compass heading, which are fed to Water Linked’s UGPS via the BlueOS extension WL_UGPS_External. The data from all these sensors (along with those from the DVL and the sensors within the Navigator) are fused via ArduSub’s EKF (see the GPS_INPUT sketch after this list for the underlying mechanism).

  • To generate data from our survey imagery, we’re using CoralNet-Toolbox (which itself utilizes Ultralytics AI/ML) to produce percent-cover metrics for aggregate categories (red, brown, and green algae, substrate types, colonial sponges, etc.) and abundance metrics for individually conspicuous species (sea stars, crabs, fishes, etc.). You can see some image patches below for a subset of our percent-cover categories. We’re very pleased with CoralNet-Toolbox; one of our trained YOLO-11 models has an overall accuracy of 91.5% after only ~6,500 manual annotations across 25 percent-cover categories (a minimal training/inference sketch follows this list).

  • To draw inferences from these data, we’re utilizing a variety of approaches, including basic visualization (kernel densities), dimensionality reduction (non-metric multidimensional scaling, NMDS; a sketch follows this list), a predictive framework for our percent-cover data via generalized linear mixed-effects models (GLMMs), and broader spatial analyses via a bull kelp habitat suitability model.

  • We have a variety of research projects and collaborations underway. One that we’re especially excited about is a team-up with University of Washington (UW) and the Applied Physics Lab (APL), where a group of UW students are working with Dr. Aaron Marburg and the Aquarium as part of a UW Industry Capstone Program. The students are developing a visual odometry (VO) system using stereo downward-facing cameras, on-board compute power, and open-source VO/SLAM algorithms. It is our hope that low-cost, open-source VO systems might reduce the barrier to entry for new entities wanting to get up and running in the ROV space.

  • Finally, we are striving to keep as much information as possible in the public domain, both to pursue repeatable, open-source research and to make it easier for other entities to get similar low-cost ROV research programs up and running. You can therefore access a suite of GitHub repositories here on our main GitHub landing pad. On that note, we couldn’t do this work without the support of volunteers, and if you have any software chops and want to get involved, you can access a couple of “1-pager” project descriptions here.
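
For anyone who wants to tinker with the pieces mentioned above, here are a few minimal, hedged sketches. First, engaging surftrak programmatically: this is just one way to do it (the usual route is a joystick button or QGroundControl), and it assumes pymavlink plus a typical BlueOS UDP MAVLink endpoint. The fallback mode number 21 is what I’d expect for SURFTRAK in ArduSub 4.5, but verify against your firmware.

```python
# Sketch: switch ArduSub into SURFTRAK mode via pymavlink.
# Assumptions: BlueOS-style MAVLink endpoint on UDP 14550; ArduSub 4.5.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()

# Look up SURFTRAK's custom-mode number by name; fall back to 21
# (its expected number in ArduSub 4.5) if pymavlink is older.
mode_id = master.mode_mapping().get("SURFTRAK", 21)

# Request the mode change; surftrak then holds the rangefinder (DVL)
# altitude captured at the moment the mode engages.
master.mav.set_mode_send(
    master.target_system,
    mavutil.mavlink.MAV_MODE_FLAG_CUSTOM_MODE_ENABLED,
    mode_id,
)
```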
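
Next, positioning: the WL_UGPS_External extension does this plumbing for us, but the general mechanism for injecting an external fix into ArduSub’s EKF is the MAVLink GPS_INPUT message (with the autopilot configured to accept a MAVLink GPS, i.e. GPS_TYPE = MAV). Everything below is a placeholder illustration, not our production code.

```python
# Sketch: feed an external position fix to ArduSub's EKF via GPS_INPUT.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # assumed endpoint
master.wait_heartbeat()

# Example fix -- placeholder values, not real survey data.
lat_deg, lon_deg, alt_m = 47.6062, -122.3321, 0.0

# We only have a position, so tell the EKF to ignore the velocity fields.
ignore = (
    mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_VEL_HORIZ
    | mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_VEL_VERT
    | mavutil.mavlink.GPS_INPUT_IGNORE_FLAG_SPEED_ACCURACY
)
master.mav.gps_input_send(
    int(time.time() * 1e6),  # time_usec
    0,                       # gps_id
    ignore,                  # ignore_flags
    0, 0,                    # GPS week time (0 = let the autopilot timestamp)
    3,                       # fix_type: 3 = 3D fix
    int(lat_deg * 1e7),      # lat, degE7
    int(lon_deg * 1e7),      # lon, degE7
    alt_m,                   # altitude (MSL), m
    1.0, 1.0,                # hdop, vdop
    0.0, 0.0, 0.0,           # vn, ve, vd (ignored)
    0.0,                     # speed accuracy (ignored)
    1.0, 1.0,                # horizontal / vertical accuracy, m
    10,                      # satellites_visible
)
```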
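
For the imagery-to-data step: CoralNet-Toolbox wraps Ultralytics under the hood, so the core train/predict cycle for patch-based percent-cover classification looks roughly like this. The checkpoint, dataset layout, and hyperparameters here are illustrative assumptions, not our actual configuration.

```python
# Sketch: train and run a YOLO11 classification model with Ultralytics.
from ultralytics import YOLO

# Start from a pretrained YOLO11 classification checkpoint.
model = YOLO("yolo11n-cls.pt")

# Train on image patches organized in ImageNet-style folders:
#   dataset/train/<class_name>/*.jpg, dataset/val/<class_name>/*.jpg
model.train(data="dataset", epochs=100, imgsz=224)

# Classify a new patch; report the top-1 class and its confidence.
result = model("patch.jpg")[0]
print(result.names[result.probs.top1], float(result.probs.top1conf))
```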
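
And for the NMDS piece: this is often done in R (e.g., vegan::metaMDS), but here’s a rough Python equivalent on a sites-by-categories percent-cover matrix, using Bray-Curtis dissimilarities and scikit-learn’s non-metric MDS (scikit-learn >= 1.2). The data are random placeholders, not survey results.

```python
# Sketch: NMDS ordination of percent-cover data (Bray-Curtis + non-metric MDS).
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
cover = rng.random((20, 25))  # 20 survey segments x 25 cover categories

# Pairwise Bray-Curtis dissimilarities between segments.
dissim = squareform(pdist(cover, metric="braycurtis"))

# Non-metric MDS: preserve the rank order of dissimilarities in 2-D.
nmds = MDS(
    n_components=2,
    metric=False,
    dissimilarity="precomputed",
    n_init=10,
    random_state=0,
    normalized_stress=True,
)
coords = nmds.fit_transform(dissim)
print("stress:", nmds.stress_)  # lower stress = better 2-D representation
```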

I do have one robotics question: does anyone have experience or knowledge integrating very bright (>8,000 lumen) lights into a BlueROV2? We currently power our ROV from the Seattle Aquarium vessel’s power supply and want to explore options for driving more powerful lights. Check out this post for more information.

Anyways, I just figured I’d reach out and share this work with you all.

Thanks for everything that you do, Blue Robotics—and the broader community here on the forums—to facilitate a more accessible landscape of robotics!

All the best,

Zach
