Ping360 mapping in ROS

Hi there. I am trying to map our small water pool in ROS using the point cloud data published by a Ping360, but I have run into some trouble.
First, I downloaded the code from ping360_sonar/src/ping360_sonar at develop · CentraleNantesRobotics/ping360_sonar · GitHub and ran it in ROS as a driver for the Ping360. It works well, and I can get the angle and distance data from the topic in every frame.
Then I transform the angle and distance data into point cloud data in my ROS program, which is published and visualized in RViz; you can see the result below.

It performs reasonably, but the result is far from the real world. For comparison, I provide an image of the real pool and a point cloud scanned by a lidar.

I thought the difference was caused by sonar inaccuracy, but I soon noticed that the sonar achieves a really good result in Ping Viewer, which shows it can map the edge of the pool precisely.

Therefore, I realized that the difference may be caused by incorrect processing of the point cloud data.
So I wonder:

  1. How should the point cloud position be calculated from the distance and angle data published by the sonar? You can see my code below.
  2. Is the max angle of the Ping360 400 or 360? I set the max angle to 360, but the default setting is 400, which may influence the final point cloud positions.
  3. Where is the start angle (0 degrees) on the Ping360? Does it begin at 0 degrees every time I start it?

    Also, I have provided the code below; please check whether there are any errors. Thank you for your patient answers!
    Launch file to set the max angle and other parameters:

    Sonar mapping program to publish point cloud data:

    Topic from Ping360:

Hi @Alex001,

I don’t have experience with the ROS driver, so I’m not sure how it’s handling the Ping360’s profile data. If it isn’t accounting for changes to the vehicle orientation and position, then each output point will be relative to the Ping360’s pose at the time it was taken, so different points aren’t necessarily in the same coordinate system.

For reference, PingViewer corrects for Ping360 yaw rotation using the vehicle’s orientation data that it provides in the MAVLink telemetry stream.

The Ping360 outputs profile data that is relative to the device, and has no concept of

  • where the device is
  • how the device is oriented
  • which response strengths correspond to object reflections
  • the speed of sound in the transmit medium

all of which are relevant for generating a point cloud, and need to be determined from external data / provided by the user.
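As an illustration of the speed-of-sound point: the Ping360’s profile is a series of intensity samples, and each sample’s range depends on the configured sample period and the assumed sound speed in water. The sketch below is a minimal example of that two-way travel-time calculation, assuming the `sample_period` field is expressed in 25 ns increments as in the Ping protocol; the function name and default sound speed are my own choices, not from the driver.

```python
# Seconds per unit of the Ping360 sample_period field (Ping protocol: 25 ns ticks).
SAMPLE_PERIOD_TICK = 25e-9

def sample_range(sample_index, sample_period, speed_of_sound=1500.0):
    """Approximate range (metres) of one intensity sample.

    The echo travels out and back, so the one-way distance is half
    the total travel distance: index * period * c / 2.
    """
    travel_time = sample_index * sample_period * SAMPLE_PERIOD_TICK
    return travel_time * speed_of_sound / 2.0

# Example: sample 100 with sample_period = 80 ticks at 1500 m/s.
print(sample_range(100, 80))  # 0.15 m
```

If the driver already reports a distance per angle, this step is done for you, but getting the sound speed wrong scales every range (and hence the whole point cloud) by the same factor, which would make the pool look systematically too large or too small.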

The device_data message the Ping360 uses to communicate its profiles uses gradians for the transducer head angle. There are 400 gradians in a full circle, so most likely your code should be using 400 for the maximum angle, unless it’s doing some internal conversion between degrees and gradians.

The 0 angle for the Ping360 transducer head is at the back, where the cable enters the device body.
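Putting the two answers above together, a polar-to-Cartesian conversion for one profile reading might look like the sketch below. It is only a minimal example, not the driver’s actual code: the 400-gradian full circle is from the device_data message, while the `angle_offset_rad` parameter (to account for the 0-gradian direction being at the back of the device) and the `yaw` parameter (vehicle heading, needed to keep points in a common frame) are assumptions you would have to supply from your own setup.

```python
import math

GRADS_PER_REV = 400  # Ping360 device_data reports the head angle in gradians

def polar_to_point(angle_grad, distance, yaw=0.0, angle_offset_rad=0.0):
    """Convert one (gradian angle, range) reading to an (x, y) point.

    yaw:              vehicle heading in radians (external data).
    angle_offset_rad: rotation from the sonar's 0-gradian direction
                      (at the cable entry) to the vehicle's forward axis.
    """
    theta = angle_grad * 2.0 * math.pi / GRADS_PER_REV + angle_offset_rad + yaw
    return distance * math.cos(theta), distance * math.sin(theta)

# Example: 100 gradians is a quarter turn, so a 2 m return lands on the y-axis.
x, y = polar_to_point(100, 2.0)
print(round(x, 6), round(y, 6))  # 0.0 2.0
```

Using 360 as the maximum angle while the device steps in gradians would compress the scan and smear the pool edges, which matches the distortion described in the original post.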

Hi @Alex001, I’m also thinking of using a Ping360 for underwater mapping. Just wondering whether you have managed to improve the accuracy of the point clouds since your previous post. Any help would be appreciated! Thank you.