ROV Video Transects

MBARI’s remotely operated vehicles (ROVs) frequently perform benthic video transects, where the ROV flies in a straight line just above the seafloor over kilometer distances. The main video camera records all of the benthic animals seen, providing valuable information to benthic ecologists on what species are present and in what numbers.

While identifying and counting the animals in the video is relatively straightforward, determining their size is much more difficult. Although the camera's zoom angle can be calculated, the distance between the camera and the animals is not known, so accurate size measurements can't be made.

Parallel Lasers

The first approach to solving this problem was to mount two parallel lasers on the camera, aimed in the direction that the camera is looking. Because the distance between the lasers is known (typically 29 cm), the two laser dots on the seafloor provide a size reference to which seafloor animals can be compared.
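The scale conversion this technique relies on can be sketched in a few lines. The pixel coordinates below are hypothetical; only the 29 cm laser separation comes from the text.

```python
# Sketch of sizing an animal from two parallel laser dots.
# The pixel coordinates are hypothetical illustration values.
import math

LASER_SEPARATION_CM = 29.0   # known spacing between the parallel lasers

def pixel_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def measure_cm(obj_a, obj_b, dot1, dot2):
    """Convert a pixel-space length to centimeters using the laser dots
    as a scale bar. Only valid for objects lying on the line between the
    dots, since the seafloor tilt is unknown."""
    cm_per_pixel = LASER_SEPARATION_CM / pixel_distance(dot1, dot2)
    return pixel_distance(obj_a, obj_b) * cm_per_pixel

# Dots 290 px apart give 0.1 cm/px; a 115 px object is then 11.5 cm.
size = measure_cm((100, 100), (215, 100), (400, 300), (690, 300))
```

The simplicity of this conversion is exactly why the technique fails off the laser line: a single scale factor can only be correct where the seafloor distance matches that of the dots.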

There are, however, significant limitations to this technique. Because the seafloor is tilted at an unknown angle with respect to the camera, only animals lying directly between the laser dots can be accurately sized. In addition, many animals are much longer than they are wide and are more difficult to size if they aren’t oriented on the line between the two laser dots. Due to these issues, many animals seen in the video can’t be measured.

Helical Lasers

We developed Laser Measure to address these limitations. Instead of just two parallel lasers, Laser Measure uses three lasers aimed in a helical pattern to create three laser dots on the seafloor. If we assume that the seafloor is flat, as it is for most of the areas we study with video transects, then the locations of the three laser dots in the video images can tell us the distance and the orientation (i.e. tilt) of the seafloor. Given these parameters, we can measure the size of any object in the image regardless of its location or orientation in the image.
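Under the flat-seafloor assumption, the underlying geometry can be sketched as follows: the plane through the three laser dots yields both the camera-to-seafloor distance and the tilt. The 3D dot positions below are hypothetical; the real system recovers them from the calibrated laser geometry and the dots' image locations.

```python
# Sketch: distance and tilt of a flat seafloor from three laser dots.
# Camera at the origin, looking along +z. Dot positions are hypothetical.
import math

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def seafloor_plane(p1, p2, p3):
    """Plane through three laser-dot positions. Returns the unit normal,
    the seafloor distance along the optical axis, and the tilt (degrees)
    of the seafloor relative to the image plane."""
    n = cross(sub(p2, p1), sub(p3, p1))
    norm = math.sqrt(sum(c * c for c in n))
    n = tuple(c / norm for c in n)
    if n[2] < 0:                 # orient the normal away from the camera
        n = tuple(-c for c in n)
    tilt = math.degrees(math.acos(n[2]))
    # Where the optical axis (x = y = 0) meets the plane n . X = n . p1:
    d = sum(a * b for a, b in zip(n, p1)) / n[2]
    return n, d, tilt

# A floor 2 m away, sloping 0.1 m per meter in x:
n, d, tilt = seafloor_plane((0.0, 0.0, 2.0), (1.0, 0.0, 2.1), (0.0, 1.0, 2.0))
# d == 2.0 m, tilt ~ 5.71 degrees
```

Because the plane is fully determined by the three dots, any pixel in the image can afterward be mapped to a point on the seafloor, which is what makes measurements possible anywhere in the frame.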

The system must be calibrated each time the three lasers are mounted on the ROV’s main camera. Once the ROV is on the seafloor, a calibration target with a checkerboard pattern is held in front of the camera in various positions by the ROV’s robotic manipulator. This calibration allows us to determine two sets of parameters: first, the intrinsic parameters such as the camera’s focal length and field of view, both of which change with the camera’s zoom; and second, the extrinsic parameters, which are the locations in 3D space of the target and the laser dots falling upon it. The intrinsic and extrinsic parameters in turn allow us to determine the precise locations and pointing directions of the three lasers. The calibration procedure takes about 15 minutes and is performed just before the video transect is started.
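One piece of this procedure can be sketched in code: once the checkerboard extrinsics give the target's pose at several positions, each laser's dot on the target becomes a known 3D point, and a line fitted through those points recovers that laser's origin and pointing direction. This is a simplified sketch with hypothetical, noise-free measurements; a real implementation would use a least-squares line fit.

```python
# Sketch: recovering one laser's ray from 3D dot positions measured on
# the calibration target at different distances. The dot positions are
# hypothetical; real ones come from the checkerboard extrinsics.
import math

def fit_laser_ray(points):
    """Fit an origin and unit direction through roughly collinear 3D
    points. With noise-free points, the span from first to last dot is
    the exact direction; noisy data would need a least-squares fit."""
    n = len(points)
    origin = tuple(sum(p[i] for p in points) / n for i in range(3))
    d = tuple(points[-1][i] - points[0][i] for i in range(3))
    norm = math.sqrt(sum(c * c for c in d))
    return origin, tuple(c / norm for c in d)

# Dots seen at target distances of 1, 2, and 3 m along one laser:
origin, direction = fit_laser_ray([(0.1, 0.05, 1.0),
                                   (0.1, 0.10, 2.0),
                                   (0.1, 0.15, 3.0)])
```

Repeating this for each of the three lasers pins down the helical geometry that the measurement step relies on.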

After the ROV dive, the calibration video is analyzed to generate a calibration file, which the Laser Measure software then uses to make measurements on still images from the video transect. The system has been tested in Monterey Bay; by measuring artificial animals of known size, we see measurement errors of 10–15%. We are working to improve the calibration and analysis algorithms to reduce this error further.
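The measurement step itself amounts to back-projecting each clicked pixel onto the fitted seafloor plane and taking 3D distances there. Below is a minimal pinhole-camera sketch of that idea, not the actual Laser Measure code; the intrinsics and plane parameters are hypothetical.

```python
# Sketch: measure between two image points by intersecting their camera
# rays with the seafloor plane (pinhole model, camera at the origin).
# fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
# All numeric values below are hypothetical.
import math

def pixel_to_seafloor(u, v, fx, fy, cx, cy, normal, plane_d):
    """Intersect the ray through pixel (u, v) with the plane
    normal . X = plane_d. Assumes the ray is not parallel to the plane."""
    ray = ((u - cx) / fx, (v - cy) / fy, 1.0)
    t = plane_d / sum(n * r for n, r in zip(normal, ray))
    return tuple(t * r for r in ray)

def measure(p_px, q_px, fx, fy, cx, cy, normal, plane_d):
    a = pixel_to_seafloor(*p_px, fx, fy, cx, cy, normal, plane_d)
    b = pixel_to_seafloor(*q_px, fx, fy, cx, cy, normal, plane_d)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Level floor 2 m away, 1000 px focal length: 57.5 px spans 0.115 m.
size_m = measure((500.0, 500.0), (557.5, 500.0),
                 1000.0, 1000.0, 500.0, 500.0, (0.0, 0.0, 1.0), 2.0)
```

Because every pixel gets its own ray-plane intersection, the measurement stays correct even when the object is far from the laser dots or the floor is tilted, which is the key advantage over the parallel-laser scale bar.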

Engineer Paul McGill checks the Laser Measure calibration. The three lasers are mounted on the body of the ROV’s camera and move with the camera.
A computer simulation of the Laser Measure system pointed at the seafloor. The three lasers are aimed in a helical (twisting) pattern. One of the lasers is green to keep track of which laser produces which dot on the seafloor.
The Laser Measure software in use. Once the three laser dots (center of image) are designated, size measurements can be made anywhere in the image. The crab at the upper right is shown to have a carapace that is 11.5 cm across.

Additional Information

Dorado-class AUV: a flooded-fairing design, with the electronics, payload, and batteries housed in sealed glass spheres or separate metal pressure housings.

  • Length: 19.5 ft
  • Mass: 1350 lbs (mass at recovery with entrained water: xxxx lbs)
  • Weight in water: -4 lbs (4 lbs buoyant)
  • Propulsion and control: single ducted propeller on a 2-axis gimbal providing rudder and elevator function, oil-compensated
  • Batteries: 14.5 kWh Li-ion batteries in 3 glass spheres
  • Navigation: RDI Workhorse 1200 kHz in water-track mode and VectorNav VN100 MEMS compass
  • Acoustic comms: Sonardyne AvTrak 6 tracking beacon with SMS feature
  • Camera: 2K video camera system in an underwater housing with an aspherical dome
  • Lights: 4x DSP&L lights with up to 9000 lm each
  • Bioacoustics: downward- and forward-looking acoustic transducers (200 and 333 kHz)
  • Water sensors: Seabird conductivity, temperature, and oxygen sensors, as well as a transmissometer

MBARI has been running video transects in the midwater of Monterey Bay for the last 25 years. These efforts have historically involved ROVs that were manually driven to a particular depth and through the transects, requiring a whole day of operations on a large ship with a work-class ROV to cover just a few transect lines. In 2014 the i2map project was started to move the task of recording these video transects to an AUV. At that point, MBARI had been deploying deep-water AUVs for at least 15 years, so a new nose payload with a high-resolution camera and a solid-state recorder was built to be tested on an existing AUV.

The advantage of using an AUV for video transects is not limited to the reduced work effort. Because the vehicle operates unsupervised for around 24 hours after deployment, it can run many more transects than would be possible in a typical day of ROV operations, including transects all the way through the night.

The i2map transects start at 25 m and stretch all the way down to 1000 m. The maximum depth the vehicle can operate at is 1500 m.

In 2018 the i2map vehicle was updated with an additional sensor suite as part of MBARI's interest in bioacoustic research, which uses sonar to detect and even identify organisms in the water column by their acoustic signatures. The vehicle now carries a set of forward- and downward-looking sonars operating at around 200 kHz and 333 kHz.

Combining these instruments on the camera-equipped vehicle has allowed researchers to see the influence that the vehicle, and especially its lights, has on the animals around it. The acoustic system has a much greater range than the camera, which enables the study of avoidance and attraction behavior. Furthermore, the vehicle repeats transects with the lights and camera turned off entirely, which can tell researchers whether the animals are reacting to the vehicle's noise or simply to its bright lights.
