Retrieve Ping Sonar data for analysis

Hello,

I’m using my BlueROV2 equipped with a Ping Sonar Altimeter & Echosounder. I also attached a GoPro camera to film the bottom.

The Ping Sonar works really well for “real-time” information, which is very useful, but I would also love to retrieve the bathymetry data and use it for analysis. Ideally, I want a distance value for each image of the bottom I analyse.

Has somebody done this yet?

I have found the documentation for ping-python, but I cannot seem to convert the log file (in binary format) into something I can use (like a .csv). Also, I’m really not a coder.

Can somebody help me with this?

Cheers,

Lea

Hi @katlea,

We have a sensor log decoding example in the Ping Viewer repository (since Ping Viewer creates those logs, and it defines the format).

If you’re specifically after just the distance measurements you could do something like the following (requires downloading “decode_sensor_binary_log.py”, and running this in your own python file from the same directory):

from decode_sensor_binary_log import PingViewerLogReader
from datetime import datetime, timedelta, time
from pathlib import Path
import csv

# TODO: put in the path to the log file you want to process
#  e.g. "~/My Documents/PingViewer/Sensor_Log/something.bin"
logfile = Path("/path/to/file.bin")

def to_timedelta(time_str: str) -> timedelta:
    ''' Returns a time delta from an iso-format time string. '''
    delta = time.fromisoformat(time_str)
    return timedelta(hours=delta.hour, minutes=delta.minute,
                     seconds=delta.second, microseconds=delta.microsecond)

log = PingViewerLogReader(logfile)
outfile = Path(logfile.stem).with_suffix(".csv")
# ideally it would be good to localise this to your timezone,
#  but in this case it shouldn't cause issues unless you were
#  recording the sonar data at the time of a daylight savings
#  changeover or something
start_time = datetime.strptime(logfile.stem, "%Y%m%d-%H%M%S%f")

with outfile.open("w", newline="") as out:  # newline="" avoids blank rows on Windows
    csv_writer = csv.writer(out)
    csv_writer.writerow(("timestamp", "distance [mm]", "confidence [%]"))
    for timestamp, message in log.parser():
        # convert the 'duration since scanning started' into a local-time timestamp
        #  the .replace here ensures cross-compatibility between operating systems
        timestamp = start_time + to_timedelta(timestamp.replace('\x00',''))
        csv_writer.writerow((timestamp, message.distance, message.confidence))

If you’re interested in analysing (or just extracting) the full profiles themselves then the code will need to be more involved, and a csv format may not be suitable. This thread may be of interest.

I’m planning to make a detailed post-processing example, including syncing Ping Viewer log data with other things like telemetry logs and video, but unfortunately I haven’t had the time to do so yet.

Out of interest, how are your videos/frames saved? You’ll need some way to align the video with the sonar scan data, and depending on what you actually want to do with it you may need to

  • use a program to extract video frames and
    • include the corresponding distance+confidence in each filename
    • write the distance+confidence as text on each frame (which could be saved individually or as a new video)
  • write the distance+confidence information into a subtitle file, which could then be played together with the video (without permanently modifying the frames)
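
As a rough sketch of the subtitle-file option above (assuming the distance csv produced by the script earlier in this post, plus a video start time that you determine yourself), something like the following can generate an SRT file with one cue per sonar reading:

import csv
from datetime import datetime

# assumed inputs - adjust to your own filenames and your video's start time
DISTANCE_CSV = "distances.csv"
VIDEO_START = datetime(2021, 1, 1, 12, 0, 0)

def srt_time(seconds: float) -> str:
    ''' Formats an offset in seconds as an SRT timestamp (HH:MM:SS,mmm). '''
    ms = round(seconds * 1000)
    hours, ms = divmod(ms, 3_600_000)
    minutes, ms = divmod(ms, 60_000)
    secs, ms = divmod(ms, 1000)
    return f"{hours:02}:{minutes:02}:{secs:02},{ms:03}"

with open(DISTANCE_CSV) as f:
    rows = list(csv.DictReader(f))

with open("distances.srt", "w") as srt:
    index = 1
    # each reading is displayed until the next one arrives
    for row, next_row in zip(rows, rows[1:]):
        start = (datetime.fromisoformat(row["timestamp"]) - VIDEO_START).total_seconds()
        end = (datetime.fromisoformat(next_row["timestamp"]) - VIDEO_START).total_seconds()
        if end <= 0:
            continue  # this reading finished before the video started
        text = f"{row['distance [mm]']} mm ({row['confidence [%]']}% confidence)"
        srt.write(f"{index}\n{srt_time(max(start, 0.0))} --> {srt_time(end)}\n{text}\n\n")
        index += 1

Most players (VLC, for example) will load the .srt automatically if it’s given the same base name as the video file.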

Note also that

  • the video and sonar likely won’t have the same update rate, so you’ll need some way to get the nearest value for each frame
  • it may be relevant to correct for tide levels in the distance values, especially for inspections that are happening during a changing tide
  • if your video is from a reasonably steady boat you may be able to stitch together the frames into one large image of the full inspected area, and potentially plot the bathymetry results on top of that (that would likely be somewhat involved, but is definitely an interesting application)
    • alternatively it may be possible to use photogrammetry to create a 3D model, which could be cross-referenced with the sonar data

Hi @EliotBR ,

Thanks for your response!

I tried your code, but the “datetime.strptime…” doesn’t work at all for me. But I used your code and mixed it with mine, and I ended up with something I can use :smiley: Thank you!!

I’m actually doing this now. I’m trying to gather all the data from the BlueROV sensors and sync everything for analysis.

Actually, I’m extracting images from the video, and I plan to annotate each image (for analysing the biodiversity of the bottom). In the end, I aim to have a dataframe with the names of the images and all their associated sensor data, and to use this as a metadata file for my biodiversity annotations, with column names like: imagename, time, altitude, distance_to_bottom, yaw, lat, long, temperature, etc…

Yes, that is one of the problems I encountered, but it’s not impossible to resolve. I have more telemetry and ping data than images, so each image has a value! I don’t really know how to explain it, but I made it work for me.

That is exactly what I’m planning to do next! Actually, my goal is to create small “maps” of the biodiversity of the sea bottom. I’m using Agisoft Metashape to do this and I have good results with my image alignment. I have not yet tried to add the sonar data, but I’m getting there!

Thanks a lot for your help and tips

Lea


That’s confusing to me, unless you’ve renamed your log files? I’ve used that in a previous project, and also confirmed it was working before I posted it…

Hmmmmmmmm…
I’ve got working code, but need to refactor it to strip out some stuff that I can’t share. Given you’re actively working on this, I might see if I can prioritise that and get the example out (although it’s very early Saturday morning now, so won’t be for a day or few).

Curious what kind of approach you’re taking for this - is it manual? Every nth frame? I’ve got some code that extracts the sharpest frame from each predefined interval (which I’ve previously used for stitching), but that may not apply to your use-case if you’re selecting frames at more random intervals by what you know to be of interest/relevance to the use case. I’ve also previously considered augmenting the ‘sharpest’ selection approach with something that factors in vibration/acceleration data from the telemetry.
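
For reference, one common sharpness proxy (not necessarily what that code uses) is the variance of the Laplacian, e.g. via OpenCV. A minimal sketch, assuming frames have already been extracted into per-interval folders (the paths are placeholders):

from pathlib import Path
import cv2  # pip install opencv-python

def sharpness(image_path: Path) -> float:
    ''' Higher variance of the Laplacian generally means a sharper image. '''
    gray = cv2.imread(str(image_path), cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

# assumed layout: one folder of candidate frames per time interval
interval_dir = Path("frames/interval_000")
best = max(interval_dir.glob("*.jpg"), key=sharpness)
print(f"sharpest frame in {interval_dir}: {best}")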

Sounds like a good candidate for a database, although a simple datastore like a csv file could also potentially be sufficient, depending on your storage and processing/querying requirements.

Awesome - exciting stuff! :smiley:

I’m using ffmpeg for that, and I extract one image every 10 seconds, so as long as the ROV doesn’t stop I’m sure I don’t annotate the same animal/algae twice!
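
For reference, the kind of ffmpeg command used for this fixed-interval extraction (the filenames here are placeholders, and the frames directory needs to exist beforehand) is something like
ffmpeg -i dive_video.mp4 -vf fps=1/10 frames/frame_%05d.jpg
where fps=1/10 outputs one frame every 10 seconds.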

That would be awesome! I’ll be following this closely.

Cheers

Lea

Indeed, my bad, I did something weird with renaming. Now it works! Thanks!


Progress is here :slight_smile:

Feel free to raise issues for anything you think would be useful in an example like this, and/or just things that would be useful for your particular use-case (worst that can happen is I decide something is out of scope - best that can happen is it gets implemented :man_shrugging:).

Ended up having to sort out some other things, so not as far in as I wanted to be, but I’ve transferred over the ‘telemetry log(s) → csv → dataframe’ part and the initial Python and library requirements. I’m hoping to do the Ping Sonar component tomorrow, and possibly some initial video stuff as well (e.g. extract the sharpest frame at rough time intervals, and ideally also extract frames at specified travel distances).

At this stage I’m working with the equipment and data I’ve got on hand at my desk, but if you happen to have a telemetry log (.tlog), some sonar data (.bin), and some video that you’re happy to share then that could be useful, especially for proper testing of things like positioning data and your video format (assuming it’s not just the timestamp-in-filename that QGC uses).

If you are able to share some data your timezone would also be useful for alignment, since telemetry is saved in UTC while sonar data and QGC video are saved with local time stamps.

I’ve been struggling to make time to fit this in, but basic ping sonar distance extraction is now available, including timezone localisation and csv output.

I’m of course planning to make one or more examples of doing something meaningful with data alignment, but in the interim it’s possible to align telemetry and the sonar distance csvs by their timestamps using pandas.
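
For example, a minimal sketch of that interim approach (the filenames and the telemetry column names are assumptions, and will depend on how you exported your telemetry):

import pandas as pd

# assumed filenames and column names - adjust to match your own exports
telemetry = pd.read_csv("telemetry.csv", parse_dates=["timestamp"])
sonar = pd.read_csv("distances.csv", parse_dates=["timestamp"])

# merge_asof requires both frames to be sorted by the key column
telemetry = telemetry.sort_values("timestamp")
sonar = sonar.sort_values("timestamp")

# attach the nearest-in-time sonar reading to each telemetry row,
#  ignoring matches more than 2 seconds away
combined = pd.merge_asof(telemetry, sonar, on="timestamp",
                         direction="nearest", tolerance=pd.Timedelta("2s"))
combined.to_csv("telemetry_with_distance.csv", index=False)

The same nearest-match idea works for attaching a reading to each extracted video frame, given a column of frame timestamps.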

Hope your project is going well :slight_smile:

Hi Eliot! Dredging up this old thread to see if your scripts would need some rework for .bin log files from a BlueBoat, vs. .tlog from older systems. There’s not a lot out there on converting between the two… but I’m hoping to combine the ping log and vehicle log into a .csv of latitude, longitude, and depth to then make some interpolated maps in QGIS.
Thanks!

OK, using pymavlink I got the .bin vehicle log outputting to csv fine with
mavlogdump.py --planner --format csv --types POS log_name.BIN > temp.csv

But when running the other sonar parser, something is not quite right. Is the intention of this repository to do the syncing of the ping timestamp to the GPS location timestamp, and I’m just using it wrong?

...
...
  File "pandas/_libs/tslibs/strptime.pyx", line 150, in pandas._libs.tslibs.strptime.array_strptime
ValueError: time data 'harbormouthsurvey7.1_ping2' does not match format '%Y%m%d-%H%M%S%f' (match)

Thanks!

Working with the sonar log via
python decode_sensor_binary_log.py ping2log.bin
works, and generates a playback of data for each ping. It also creates a small (~10 kB) .pyc file in a __pycache__ folder in the same directory.

Header: start_1: 66 start_2: 82 payload_length: 526 message_id: 1300 src_device_id: 1 dst_device_id: 0
Payload:
  - distance: 1882
  - confidence: 0
  - transmit_duration: 60
  - ping_number: 23604
  - scan_start: 0
  - scan_length: 2265
  - gain_setting: 4
  - profile_data_length: 500
  - profile_data: ['0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xfd', '0xbb', '0xb6', '0xc1', '0xc8', '0xc0', '0xd8', '0xde', '0xd2', '0xee', '0xdc', '0xdb', '0xd6', '0xd5', '0xc1', '0xc4', '0xb5', '0x8f', '0x91', '0x78', '0x78', '0x79', '0x83', '0x79', '0x8d', '0x90', '0x7f', '0x90', '0x8a', '0x8f', '0x8d', '0x95', '0x8a', '0x9c', '0x99', '0x88', '0xa6', '0xa8', '0xb0', '0xb9', '0xbf', '0xab', '0xae', '0x9f', '0x7b', '0x70', '0x57', '0x52', '0x4b', '0x41', '0x35', '0x34', '0x31', '0x27', '0x27', '0x1b', '0x19', '0x19', '0x1e', '0x1f', '0x2d', '0x32', '0x34', '0x3e', '0x41', '0x43', '0x3f', '0x3f', '0x33', '0x38', '0x37', '0x32', '0x3c', '0x47', '0x50', '0x52', '0x57', '0x4b', '0x50', '0x4d', '0x42', '0x3c', '0x31', '0x2c', '0x24', '0x21', '0x17', '0x1b', '0x1e', '0x1c', '0x1b', '0x1a', '0x1b', '0x19', '0x1d', '0x19', '0x21', '0x23', '0x25', '0x25', '0x23', '0x24', '0x1e', '0x1c', '0x13', '0xf', '0xb', '0x4', '0x2', '0x0', '0x0', '0x0', '0x2', '0x5', '0x6', '0x9', '0xb', '0xa', '0x7', '0x5', '0x2', '0x1', '0x5', '0x7', '0x8', '0xb', '0xd', '0x13', '0x15', '0x1a', '0x1f', '0x22', '0x1b', '0x18', '0x14', '0xf', '0x10', '0x10', '0x12', '0x15', '0x17', '0x1b', 
'0x1e', '0x1c', '0x16', '0x13', '0x11', '0xe', '0xc', '0x8', '0x6', '0x4', '0x2', '0x4', '0x8', '0xb', '0xe', '0x12', '0x13', '0x11', '0x10', '0x10', '0x12', '0x11', '0x11', '0xc', '0x7', '0x1', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0x0', '0xff', '0x65', '0xca', '0xd7', '0xfd', '0xf2', '0xbd']
Checksum: 24701 check: 24701 pass: True

Hi @tony-white,

I also wanted to sync both logs (I have the BlueROV2, btw), as well as my video footage from the main camera and an added GoPro.
I do all of this by using “t0” = the moment the ROV hits the water; in the ping sonar log, that’s when the sonar starts logging actual data.
My solution is to convert both logs to a .txt or .csv, and then just use any coding language to create a new csv file that starts at this t0.

Hope this helps!

Hi Lea!
Thanks! I’m about at that point. I’ve got a 36-minute vehicle log and a 32-minute sonar log (I didn’t open Ping Viewer until the vehicle was in the water).
Of course, the sonar log starts from 0 and is in minute:second, while the telemetry log counts in microseconds from boot, and seems to be recorded at a higher rate.
Were you successful in using the provided scripts to interleave the data? Any additional code you can share? I plan to document the entire process, including the map creation in QGIS, once I get it happening smoothly…

I’ve got some initial results! Enough to think the workflow might not be too bad, but I’m going to collect and process some more data to verify and document. For now, a teaser…
[image: qgis_ping2]


Hi Anthony,

If you use QGroundControl from a computer, you can sync everything with the time from your device: use the time when the .bin file was created to sync with the timestamps of the tlog from QGC (which uses your computer’s time).
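
One caveat with using “the time when the .bin file was created”: the timestamp that’s easy to get cross-platform is the last-modification time (roughly when logging stopped), while true creation time is OS-dependent. A minimal sketch for checking a file’s timestamps (the path is a placeholder):

from datetime import datetime
from pathlib import Path

logfile = Path("/path/to/ping_log.bin")  # placeholder path
stat = logfile.stat()

# last modification time - for a log file this is roughly when logging *stopped*
print("modified:", datetime.fromtimestamp(stat.st_mtime))

# creation time is OS-dependent: st_birthtime on macOS/BSD, st_ctime on Windows
#  (on Linux, st_ctime is the last metadata change, not creation)
created = getattr(stat, "st_birthtime", stat.st_ctime)
print("created (approx.):", datetime.fromtimestamp(created))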

To convert .bin files I use this: wham/wham_code/3.1_ping/._bin_to_csv.ipynb at master · katzlea/wham · GitHub

To convert my tlog, you can use this:

Or you can just check the option in QGC to save the tlog as a .csv too, which is what I do now.

Then, to sync both files, I wrote some code in R (because that’s the language I’m familiar with as a biologist!). I do this by converting the .tlog Timestamp to POSIXct (a date-time format in R), doing the same for the .bin time (giving it the correct “start” time by looking up in my files when it was created), and then using a simple inner join in R that adds the sonar values to the tlog where necessary.
Here are some code snippets:

library(tidyverse)
library(lubridate)

campaign<-"campaign_name"
dive<-"dive_name"
square<-"square_nX"

#TLOG
tlog<-read.csv(paste0("wham_data/",campaign,"/",dive,"/",campaign,"_",dive,".csv"))
tlog$Timestamp<-as.POSIXct(tlog$Timestamp,tz="Antarctica/Rothera")

#PING
start_time_ping<-as.POSIXct("2023-03-07 14:18:34", #see time .bin file was created !! take care of time difference (-4)...
                            tz="Antarctica/Rothera") 

ping_raw<-read.csv(paste0("wham_data/",campaign,"/",dive,"/",campaign,"_",dive,"_sonar.csv"))
punix=start_time_ping+hms(ping_raw$time)
ping<-cbind(ping_raw,punix)
ping<-aggregate(ping,by=list(as.character(ping$punix)), FUN=last)

tlog$Timestamp <- lubridate::ymd_hms(tlog$Timestamp, tz = "Antarctica/Rothera")
ping$punix <- lubridate::ymd_hms(ping$punix, tz = "Antarctica/Rothera")
nav<-inner_join(tlog, ping, by=c("Timestamp" = "punix"))

## SAVE
save(nav,file=paste0("wham_data/",campaign,"/",dive,"/",campaign,"_",dive,"_",square,"-nav.RData"))

You could use the same logic in any coding language, I guess.
Hope this helps!

Lea


.tlog files are logs of the MAVLink message stream, whereas .bin log files are logs of the internal autopilot operation, so they’re technically logging different things (even though there’s generally significant overlap between the types of data being logged). There’s some additional description and explanation in the old ArduSub docs.

Unfortunately my code at the moment is only made for working with .tlog files, but I agree that supporting .bin logs would be useful (especially considering applications like an autonomously operating BlueBoat that may not always have a MAVLink connection to a control station computer).

It could also be helpful to provide a broader abstraction that accepts a log of either type and lets you specify the general kind of data desired, rather than needing to know the actual message names and whatnot to retrieve that data…

I’m currently working at reduced capacity for a few weeks during some travel, but this seems like some fun, so if I’m not able to make time for it in the immediate future then hopefully I can do so when I’m back home.


Wow @katlea - thanks for sharing in such detail! I love the use of R, and using device time to sync makes sense, especially in your paradigm.

Thanks @EliotBR for the clarification on log file types - I was indeed looking for the most reliable data stream, not limited by the cellular connection I use with the vehicle.

From talking with Rusty, I’ve set my next goal to develop a script, maybe with a BlueOS extension interface, that lets a user trigger a survey log (start/stop/pause/download). This log would be created on the vehicle by processing ping-protocol packets and MAVLink messages for position. This starts down the path that displaying data live on a map would require, but might it conflict with the existing processes that push ping data out over UDP?
In the meantime, I hope to collect some data via my initial manual approach soon! Will keep this topic active…

Thanks all!

Are you doing some kind of manual processing of the profile data, or are you just wanting the ping’s distance estimates? If the latter, the MAVLink driver built into BlueOS’s ping service (in recent BlueOS 1.1 beta releases) already sends those estimates via MAVLink (when the MAVLink output is enabled), in which case your extension could just fetch and log(/display) the data it wants from a custom MAVLink endpoint, or from the MAVLink2REST API.

The ping service only sends out data over UDP when it’s being requested by something like Ping Viewer, but even then the MAVLink driver functionality just leeches off that data stream for any messages that include distance estimates.
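
As a rough sketch of the “fetch from the MAVLink2REST API” option (this is not BlueOS’s own code, and the URL is an assumption - confirm the actual host/path and the vehicle/component IDs by browsing the MAVLink2REST interface on your vehicle):

import csv
import time
from datetime import datetime, timezone

import requests

# ASSUMED endpoint - verify it against your own vehicle's MAVLink2REST browser
URL = ("http://blueos.local/mavlink2rest/mavlink"
       "/vehicles/1/components/1/messages/DISTANCE_SENSOR")

with open("distance_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(("utc_time", "distance [cm]"))
    while True:
        response = requests.get(URL, timeout=2).json()
        # DISTANCE_SENSOR reports its reading in centimetres; the exact
        #  response layout may differ between versions, so inspect it first
        distance_cm = response["message"]["current_distance"]
        writer.writerow((datetime.now(timezone.utc).isoformat(), distance_cm))
        f.flush()
        time.sleep(0.5)  # ~2 Hz polling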

Great tips Eliot!
With some more help from ChatGPT, I moved logging to the vehicle via a Python script that pulls data from the REST interface at 2 Hz, syncing ping data (incl. confidence, etc.) and position data, along with whatever else I’d like! The next improvement is to do a bit of QC on the incoming data to make mapping even easier, and to turn it into an extension.

The limitation you mentioned, of the ping service only making data available on the REST interface when Ping Viewer is connected, is a bit of a hassle, as momentary losses in the telemetry connection shouldn’t impact this local log process in an ideal situation!
Maybe it is possible to create that UDP request locally? Or keep a local log script receiving data at all times via another method?

One test of the script so far - I sent my BlueBoat out on an 8 km, mostly unsupervised mission, monitoring from nearby via a 4G connection. I’m fleshing out a pretty easy, code-free process for visualization via the free QGIS software. It’s amazing how well the satellite imagery lines up with the mapped contour features (readings in m, 0.5 m contours).

@tony-white Looks like you are making great progress here. I am busy working on a similar project that is integrating a different type of echosounder. Happy to help.