
The many “eyes” of the MiniROV: Inside the mapping system

CoMPAS Lab engineers outfitted the MiniROV with a suite of advanced sensors and software to help the vehicle survey its surroundings and map complex seafloor terrain.
Image: Marike Pinsonneault © 2026 MBARI


Expedition log by MBARI Graduate Fellow Javiera Fuentes Guíñez

Exploring the deep sea requires more than just a single camera. In such a dark and complex environment, robots rely on multiple sensors working together to understand their surroundings.

Members of the expedition team observe live footage and data from the MiniROV’s camera systems and sonar systems in a portable control van on the deck of the ship. Image: Marike Pinsonneault © 2026 MBARI

On this expedition, the MiniROV carries a suite of instruments that act as its many “eyes” and “ears,” allowing it to perceive the seafloor.

The vehicle carries two stereo camera systems, enabling it to reconstruct the three-dimensional structure of its environment. One camera system is oriented downward, mapping the seafloor beneath the vehicle. The other looks forward, allowing the robot to scan vertical features such as rocky walls and reef structures.
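In stereo vision, depth follows from the disparity between matched pixels in the two cameras: the farther apart a feature appears in the left and right images, the closer it is. As a minimal illustration (the function name and the example focal length and baseline are hypothetical, not MiniROV calibration values):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Classic pinhole stereo relation: depth z = f * B / d,
    where f is focal length (pixels), B the camera baseline (meters),
    and d the disparity (pixels) of a matched feature."""
    return focal_px * baseline_m / disparity_px

# A feature with 70 px disparity, f = 1400 px, 10 cm baseline:
print(stereo_depth(70, 1400, 0.1))  # 2.0 meters away
```

Repeating this over every matched pixel pair yields a dense 3D point cloud of the scene.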

To help measure the shape of the seafloor with high precision, each camera system is paired with a laser line projector. When the laser hits the terrain, it creates a visible line that the cameras observe. By analyzing the deformation of that line across the surface, the system can estimate the fine-scale geometry of the seabed.
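The geometry behind this laser-line technique can be sketched as a ray-plane intersection: each pixel where the cameras see the laser defines a ray out of the camera, the calibrated laser sheet is a plane, and their intersection is a 3D point on the seabed. A simplified sketch, assuming a pinhole camera and a pre-calibrated laser plane (all names and numbers here are illustrative, not the actual PoMS implementation):

```python
import numpy as np

def triangulate_laser_points(pixels, f, plane_point, plane_normal):
    """Intersect camera rays through laser-line pixels with the laser sheet.

    pixels: (N, 2) array of (u, v) laser-line image coordinates,
            relative to the principal point.
    f: focal length in pixels.
    plane_point, plane_normal: the calibrated laser sheet, expressed
            in the camera frame (meters).
    Returns an (N, 3) array of 3D points in the camera frame.
    """
    pixels = np.asarray(pixels, dtype=float)
    # Ray direction for each pixel (camera looks along +z).
    dirs = np.column_stack([pixels[:, 0] / f, pixels[:, 1] / f,
                            np.ones(len(pixels))])
    n = np.asarray(plane_normal, dtype=float)
    p0 = np.asarray(plane_point, dtype=float)
    # Ray-plane intersection: t = (n . p0) / (n . d); point = t * d.
    t = (n @ p0) / (dirs @ n)
    return dirs * t[:, None]
```

Sweeping the vehicle forward while solving this for every frame builds up a fine-scale profile of the terrain, line by line.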

But vision alone is not enough in the ocean. Light fades quickly underwater, and sometimes particles can obscure what cameras see. To compensate for poor visibility, the CoMPAS Lab’s Portable Mapping System (PoMS) also relies on acoustic sensors, which work similarly to hearing.

A multibeam sonar (Norbit) measures the echo of sound waves bouncing off the seafloor. This produces wide swaths of bathymetric data, allowing the robot to map large areas of terrain. An imaging sonar (Oculus) provides a different acoustic perspective. It generates detailed acoustic images of the surroundings, helping the MiniROV detect obstacles and navigate even in poor visibility. Together, these acoustic sensors allow the vehicle to “listen” to the shape of the seafloor.
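Each multibeam sounding reduces to simple geometry: half the two-way travel time multiplied by the speed of sound gives the slant range to the bottom, and the beam's steering angle resolves that range into depth and across-track distance. A toy sketch of that conversion (the function name and the constant sound speed are illustrative assumptions; real multibeam processing applies measured sound-velocity profiles and ray bending):

```python
import numpy as np

def beam_soundings(travel_times, beam_angles, sound_speed=1500.0):
    """Convert two-way travel times and beam steering angles into
    per-beam depth and across-track distance.

    travel_times: two-way travel time in seconds, one per beam.
    beam_angles: steering angle from vertical in radians, one per beam.
    sound_speed: assumed speed of sound in seawater (m/s).
    """
    r = sound_speed * np.asarray(travel_times) / 2.0  # one-way slant range
    angles = np.asarray(beam_angles)
    depth = r * np.cos(angles)          # vertical component
    across_track = r * np.sin(angles)   # horizontal component
    return depth, across_track
```

With hundreds of beams fanned out across the track, one ping yields a whole swath of such soundings.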

 

To map the seafloor, we also need to know where the vehicle is and how it is moving. The MiniROV carries an inertial navigation system that measures orientation and motion, along with a Doppler velocity log (DVL) that estimates velocity relative to the seafloor. By combining these measurements, the vehicle can maintain accurate estimates of its position while surveying the ocean floor.
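In its simplest form, that combination is dead reckoning: rotate the DVL's body-frame velocity into a world frame using the INS heading, then integrate over time. A simplified two-dimensional sketch with hypothetical names (a real navigation system fuses these sensors with filtering rather than raw integration):

```python
import numpy as np

def dead_reckon(start_xy, headings, body_velocities, dt):
    """Integrate DVL body-frame velocities rotated by INS headings.

    start_xy: initial (east, north) position in meters.
    headings: heading in radians (0 = north, clockwise) per step, from the INS.
    body_velocities: (N, 2) surge/sway velocities (m/s) from the DVL.
    dt: timestep in seconds.
    Returns an (N+1, 2) array of (east, north) positions.
    """
    pos = [np.asarray(start_xy, dtype=float)]
    for psi, (u, v) in zip(headings, body_velocities):
        # Rotate body-frame velocity (surge u, sway v) into east/north.
        east = u * np.sin(psi) + v * np.cos(psi)
        north = u * np.cos(psi) - v * np.sin(psi)
        pos.append(pos[-1] + dt * np.array([east, north]))
    return np.array(pos)
```

Small errors accumulate over time with pure integration like this, which is one reason navigation filters blend in additional measurements whenever they are available.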

All of these sensors generate enormous amounts of data. Two onboard computers—Atuncita and Atuncito—process this information in real time using software developed by MBARI’s CoMPAS Lab. From synchronizing sensors to processing imagery and acoustic data, they help transform raw measurements into maps and scientific observations.

The MiniROV has multiple ways of perceiving the underwater world. Some sensors see with light, others sense with sound, and others help the robot understand its own motion. Together, they form a coordinated sensing system—giving the MiniROV the many “eyes” it needs to explore the deep sea.

Team

Collaborators

Jong Kuk Hong (Korea Polar Research Institute), Young Keun Jin (Korea Polar Research Institute), Tae Siek Rhee (Korea Polar Research Institute), Scott Dallimore (Geological Survey of Canada), Mathieu Duchesne (Geological Survey of Canada)