November 9th, 2014
The investigation of underwater structures and natural features through Autonomous Underwater Vehicles (AUVs) is an expanding field with applications in archaeology, engineering, environmental sciences and astrobiology.
In most other settings, optical methods can be used to analyze the geometry of the obstacles and features surrounding an autonomous vehicle, but the optical properties of water make this impractical underwater. Sonar remains the best technology for underwater localization, navigation and scanning.
Sonar point clouds made up of hundreds of millions to billions of points are not uncommon. Highly interactive, immersive visualization gives researchers a tool for improving the quality of the final sonar-based data product.
We present a scalable toolkit for the processing and visualization of sonar point clouds on a cluster-based, large-scale immersive visualization environment. The cluster serves simultaneously as a parallel processing platform that performs sonar beam-tracing of the raw source data, and as the rendering driver of the immersive display.
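The beam-tracing stage described above lends itself naturally to data parallelism: each raw sonar return can be traced independently before the resulting points are handed to the renderer. The following is a minimal sketch of that idea, assuming a simple spherical beam model and a hypothetical `trace_beam`/`trace_all` interface; it is illustrative only, not the authors' actual pipeline.

```python
import math
from multiprocessing import Pool

def trace_beam(ping):
    """Convert one raw sonar return (range, azimuth, elevation) into an
    (x, y, z) point, assuming a simple spherical-to-Cartesian model
    (hypothetical; a real beam-tracer would account for sensor pose,
    sound-speed profile and refraction)."""
    r, az, el = ping
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

def trace_all(pings, workers=4):
    """Fan raw pings out across worker processes; on a cluster the traced
    points could then be streamed to the rendering nodes."""
    with Pool(workers) as pool:
        return pool.map(trace_beam, pings)

if __name__ == "__main__":
    # Synthetic pings: (range in m, azimuth in rad, elevation in rad)
    pings = [(10.0, 0.0, 0.0), (5.0, math.pi / 2, 0.0)]
    print(trace_all(pings))
```

In a cluster setting the process pool would be replaced by MPI ranks or equivalent, with each node tracing its partition of the raw data while also driving its tile of the display.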
Febretti, A., Richmond, K., Doran, P., Johnson, A., Parallel Processing and Immersive Visualization of Sonar Data (poster), IEEE Symposium on Large Data Analysis and Visualization (LDAV) 2014, Paris, France, November 9th, 2014.