
Terrain Aided Navigation (TAN) Scenarios


TAN Development support features

This page introduces TAN development scenarios using the Dave project, incorporating various sensors and bathymetry models.

Table of Contents


Definitions

Interfaces

Resources

Feature scenarios

1. Dead Reckoning

1.1 Inertial Navigation System (Utilizing IMU/Pressure Sensors)

A simple example of dead reckoning, written by Brian, is available in the glider_hybrid_whoi repository. For details, see Standalone glider deadreckoning; the source code is at glider_deadreckoning. It is a standalone ROS node Python script using a simple algorithm (this). This example could be included at the end of this section, and the standalone package could also be brought into the Dave repository.
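
In the meantime, the hedged sketch below shows the general idea: integrate heading and speed for horizontal position and convert pressure to depth. It is illustrative only, not the glider_deadreckoning algorithm; the inputs and the 1-dbar-per-meter depth approximation are assumptions.

```python
# Minimal dead-reckoning sketch (not the glider_deadreckoning node itself).
# Assumes heading (rad), forward speed (m/s), and absolute pressure (dbar)
# are available from the IMU and pressure sensors; names are illustrative.
import math

def dead_reckon_step(x, y, heading_rad, speed_mps, dt):
    """Propagate the horizontal position estimate by one time step."""
    x += speed_mps * math.cos(heading_rad) * dt
    y += speed_mps * math.sin(heading_rad) * dt
    return x, y

def depth_from_pressure(pressure_dbar, surface_pressure_dbar=10.13):
    """Rough depth estimate: ~1 dbar of excess pressure per meter of seawater."""
    return max(0.0, pressure_dbar - surface_pressure_dbar)

# Example: a 1 m/s run heading due east for 10 s at 0.1 s steps.
x, y = 0.0, 0.0
for _ in range(100):
    x, y = dead_reckon_step(x, y, heading_rad=0.0, speed_mps=1.0, dt=0.1)
print(x, y)  # approximately (10.0, 0.0)
```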

1.2 Utilizing DVL Sensor

https://github.com/Field-Robotics-Lab/dave/wiki/DVL-Water-Tracking
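
The linked page describes the DVL water-/bottom-tracking plugin. As a rough illustration of how bottom-track velocity feeds dead reckoning, the hedged sketch below rotates a body-frame DVL velocity into the world frame using the IMU yaw and integrates it; the 2-D simplification and variable names are assumptions, not the plugin's interface.

```python
# Hedged sketch: dead reckoning from DVL bottom-track velocity.
# Assumes the DVL reports body-frame velocity (vx, vy) and the IMU provides yaw.
import math

def integrate_dvl(x, y, yaw_rad, vx_body, vy_body, dt):
    """Rotate body-frame DVL velocity into the world frame and integrate."""
    vx_world = vx_body * math.cos(yaw_rad) - vy_body * math.sin(yaw_rad)
    vy_world = vx_body * math.sin(yaw_rad) + vy_body * math.cos(yaw_rad)
    return x + vx_world * dt, y + vy_world * dt
```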

1.3 Ultra-Short Baseline

Ultra-Short Baseline (USBL) positioning is used to assist underwater navigation and positioning. A USBL system consists of one transceiver and one or more transponders (or beacons). Usually the transceiver is attached to either a stationary object or a mobile central node, whereas the transponders are attached to the objects being tracked. For more information, see this video.
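
As a rough sketch of the geometry, the transceiver measures slant range from the two-way acoustic travel time and the transponder's direction from phase differences across its closely spaced receive elements; the sound speed and angle conventions below are assumptions.

```python
# Hedged sketch of USBL positioning geometry: slant range from two-way travel
# time plus bearing/depression angles gives the transponder's relative position.
import math

SOUND_SPEED = 1500.0  # m/s, nominal seawater value (assumed)

def usbl_relative_position(two_way_travel_time_s, bearing_rad, depression_rad):
    """Return (north, east, down) of the transponder relative to the transceiver.
    bearing is measured from north, depression is the angle below horizontal."""
    slant_range = SOUND_SPEED * two_way_travel_time_s / 2.0
    horizontal = slant_range * math.cos(depression_rad)
    north = horizontal * math.cos(bearing_rad)
    east = horizontal * math.sin(bearing_rad)
    down = slant_range * math.sin(depression_rad)
    return north, east, down
```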

2. Positional Error Reset

2.1 Utilizing GPS

3. Feature-based Navigation

3.1 Utilizing Multibeam Echosounder for Forward-Looking Sonar (FLS) images

A Multibeam Echosounder (MBES; multibeam sonar) measurement can be used to obtain terrain datasets that correspond to multiple altimeter measurements at multiple locations. The multibeam sonar plugin is designed as a separate plugin that extends the capabilities of the Dave project. It requires an NVIDIA graphics card in order to exploit GPU parallel calculation. For the BlueView P900 sonar, which has 512 separate beams, the refresh rate of the generated sonar image can reach 10 Hz at a 10-meter range. Because the separate beams measure distances to the terrain simultaneously, multibeam sonars are well suited to bathymetric terrain navigation. Unlike lidar or radar sensors, the acoustic measurement includes strong correlations between beams in the form of speckle noise, which makes the final image much blurrier. This is a critical physical phenomenon for real-world sonar sensors, and it is included in the plugin through a physical model (the point scattering model).

Summary

  • Introduction of the MBES sensor plugin

    Previous sonar sensor plugins were based on image processing, translating each subpixel (point cloud) of the perceived image to resemble a sonar sensor, with or without sonar equations (Detailed Review for the previous image-based methods). Here, we have developed a ray-based multibeam sonar plugin that considers the phase and reverberation physics of the acoustic signals, providing raw sonar intensity-range data (the A-plot) using the point scattering model; see the sketch after this list. Physical characteristics including time and angle ambiguities and speckle noise are considered. The time and angle ambiguity is a function of the point spread function of the coherent imaging system (i.e., side lobes due to matched filtering and beamforming). Speckle is the granular appearance of an image caused by many interfering scatterers that are smaller than the resolution limit of the imaging system.

  • Features

    • Physical sonar beam/ray calculation with the point scattering model
      • Generating intensity-range (A-plot) raw sonar data
      • Publishes the data with UW APL's sonar image msg format
    • NVIDIA CUDA core GPU parallelization
      • 10 Hz refresh rate at 10 m range
  • Tutorials
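
As referenced above, the following hedged sketch illustrates the point scattering intuition behind the A-plot and speckle: many random-phase scatterers inside one range bin sum coherently, producing a noisy (Rayleigh-distributed) amplitude. It is a conceptual NumPy illustration, not the plugin's CUDA implementation; scatterer counts and distributions are assumptions.

```python
# Conceptual sketch of speckle in the point scattering model: sub-resolution
# scatterers return echoes with random phases, and their coherent sum yields
# a noisy intensity per range bin.
import numpy as np

rng = np.random.default_rng(0)

def range_bin_intensity(num_scatterers=200, mean_reflectivity=1.0):
    """Coherently sum random-phase scatterer contributions in one range bin."""
    amplitudes = rng.rayleigh(mean_reflectivity, num_scatterers)
    phases = rng.uniform(0.0, 2.0 * np.pi, num_scatterers)
    field = np.sum(amplitudes * np.exp(1j * phases))
    return np.abs(field) ** 2

# One beam's intensity-range profile ("A-plot") over 100 range bins.
a_plot = np.array([range_bin_intensity() for _ in range(100)])
```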

3.2 Utilizing DVL Sensor for Bathymetry Gradient Estimation

High-frequency seafloor features are generally not useful in TAN applications because they are subject to change in dynamic underwater environments. Seafloor gradients are more stable features that can be sampled using an AUV's DVL sensor. By comparing collected seafloor gradients with those of a known bathymetry map, gradient features can be matched to improve navigation beyond the DVL's traditional role of bottom tracking for dead reckoning.
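
As a rough illustration of gradient sampling, the hedged sketch below fits a plane through the four DVL beam footprints to estimate the local bottom slope; the Janus angle, beam layout, and frame conventions are assumptions rather than the simulated DVL's actual configuration.

```python
# Hedged sketch: estimating the local seafloor gradient from a DVL's four beam
# ranges via a least-squares plane fit through the beam footprints.
import math
import numpy as np

JANUS_ANGLE = math.radians(30.0)   # beam tilt from vertical (assumed)
BEAM_AZIMUTHS = [math.radians(a) for a in (45.0, 135.0, 225.0, 315.0)]

def seafloor_gradient(beam_ranges):
    """Fit z = a*x + b*y + c through the 4 beam footprints and return (a, b),
    the along-/cross-track bottom slopes in the vehicle frame."""
    pts = []
    for r, az in zip(beam_ranges, BEAM_AZIMUTHS):
        x = r * math.sin(JANUS_ANGLE) * math.cos(az)
        y = r * math.sin(JANUS_ANGLE) * math.sin(az)
        z = -r * math.cos(JANUS_ANGLE)
        pts.append((x, y, z))
    pts = np.array(pts)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs[0], coeffs[1]
```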

4. Terrain Feature Set

4.1 Incorporating Bathymetry Model

A bathymetry integration plugin included in the Dave project automatically spawns and removes bathymetry grids that have been converted in advance from high-resolution NOAA bathymetry data.

  • Features

    • Automatic spawn/remove with vehicle locations
    • Overlaps for mission continuity
  • Generating bathymetry tiles for import into the Gazebo world

    The bathymetry data must be converted before it can be imported by the plugin. A converter that produces bathymetry tiles from raw data downloaded from NOAA is available at Bathymetry converter; follow this tutorial. Any bathymetry data format that can be read by the GDAL library can be processed (see the sketch after this list).

  • Video tutorial link (2 min 50 sec)

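As referenced above, a minimal hedged sketch of reading a GDAL-compatible bathymetry raster in Python is shown below; the file name is a placeholder, and this is not the Bathymetry converter itself.

```python
# Hedged sketch: inspect a NOAA bathymetry raster with GDAL before tiling.
from osgeo import gdal
import numpy as np

dataset = gdal.Open("noaa_bathymetry.tif")      # placeholder; any GDAL-readable format
band = dataset.GetRasterBand(1)
depths = band.ReadAsArray().astype(np.float64)  # elevation/depth grid

# The geotransform maps pixel indices to geographic coordinates.
x_origin, x_res, _, y_origin, _, y_res = dataset.GetGeoTransform()
print(depths.shape, x_origin, y_origin, x_res, y_res)
```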

5. Evaluation

5.1 Extracting vehicle position using Gazebo service call

Every model spawned in the Gazebo world can be queried with the generic Gazebo service call get_model_state to obtain its position and orientation. Tutorials on how to use the service call are at ROS communication - Get Model State Example.
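
A minimal example of the service call in a Python node might look like the following; the model name "rexrov" is a placeholder for whatever vehicle model is spawned in the world.

```python
# Hedged example: query a model's ground-truth pose through the standard
# Gazebo ROS service /gazebo/get_model_state.
import rospy
from gazebo_msgs.srv import GetModelState

rospy.init_node("tan_eval_ground_truth")
rospy.wait_for_service("/gazebo/get_model_state")
get_model_state = rospy.ServiceProxy("/gazebo/get_model_state", GetModelState)

# "rexrov" is a placeholder model name; "world" is the reference frame.
state = get_model_state(model_name="rexrov", relative_entity_name="world")
rospy.loginfo("position: %s, orientation: %s",
              state.pose.position, state.pose.orientation)
```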
