Machine learning

MBARI was founded by David Packard to be an alternative to the traditional academic oceanographic research institution. The mission of MBARI is to achieve and maintain a position as a world center for advanced research and education in ocean science and technology, and to do so through the development of better instruments, systems, and methods for scientific research in the deep waters of the ocean.

MBARI’s video assets

The Monterey Bay Aquarium Research Institute’s (MBARI) remotely operated vehicles (ROVs) have been collecting deep-sea video in the Monterey Bay region and other Pacific waters since 1988. To date, approximately 25,000 hours of video tape have been collected and professionally annotated using MBARI’s Video Annotation and Reference System (VARS). These annotations, currently totaling over 6.2 million, provide an established baseline for research and present a perspective into the biodiversity of the deep sea that no other data set can rival.

An ever-increasing number of camera deployments on MBARI platforms are collecting a variety of visual data at an astonishing rate. The expectation from scientists, management, board members, policy makers, and the public is that these videos should be instantly accessible and swiftly and accurately processed and analyzed for science, policy, management, and educational uses. Unfortunately, the technology, infrastructure, and practical tools available to the science community have not caught up with these expectations. This gap creates a daunting challenge.

Machine learning goals

Commercially driven advances in imaging technology, spanning collection, storage, distribution, manipulation, and, of particular interest, automated detection, tracking, and classification of objects, continue to be integrated into the marine science community in innovative and effective ways. And yet, because the real-world visual data collected for science differ so drastically from the imagery found in popular culture (e.g., cats, cars, faces, shoes) that drives the most significant advances in artificial intelligence (AI), the potential for AI to enable groundbreaking exploration and discovery in science has not yet been fully realized.

At MBARI, significant progress has been made in generating data sets to drive machine-learning models from our existing, expertly curated video data with a minimal amount of additional processing. Our engineers also continue to adapt new deep-learning techniques for processing deep-sea video data. We have already demonstrated success classifying objects in still images. Our current focus has shifted toward improving video understanding: object tracking, classification and counting, measurement, behavior analyses, and habitat surveys. Our goal is to develop a suite of video analysis tools relevant to ocean sciences and conservation. We see unlimited potential for analyzed imagery in the ocean domain: basic science exploration and discovery, time-series studies, marine protected area assessment and monitoring, fisheries and aquaculture management, subsea industrial process control and compliance testing, and more.
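To make the classification workflow concrete, the toy sketch below shows the simplest possible version of the idea: turn expertly annotated imagery into labeled feature vectors, then assign new observations to the nearest class. All names, labels, and numbers here are hypothetical illustrations; MBARI's actual pipeline uses deep-learning models trained on VARS-curated video, not this nearest-centroid stand-in.

```python
# Toy sketch of classification from labeled feature vectors.
# Hypothetical data: real systems extract features with deep networks
# trained on expertly annotated frames.
from collections import defaultdict
from math import dist  # Euclidean distance, Python 3.8+

def train_centroids(samples):
    """Average the feature vectors for each label into one centroid."""
    sums = defaultdict(list)
    counts = defaultdict(int)
    for features, label in samples:
        if not sums[label]:
            sums[label] = list(features)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], features)]
        counts[label] += 1
    return {label: [v / counts[label] for v in vec]
            for label, vec in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is closest to the features."""
    return min(centroids, key=lambda label: dist(centroids[label], features))

# Hypothetical 2-D "features" for two deep-sea taxa.
training = [
    ([0.1, 0.9], "siphonophore"),
    ([0.2, 0.8], "siphonophore"),
    ([0.9, 0.1], "rockfish"),
    ([0.8, 0.2], "rockfish"),
]
centroids = train_centroids(training)
print(classify(centroids, [0.15, 0.85]))  # → siphonophore
```

The design point is the data flow, not the classifier: the curated annotations supply the labels, so the same structure scales from this two-line centroid model up to the convolutional networks used in practice.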

MBARI seeks to strategically partner with leaders in industry and academia in mutually beneficial ways. MBARI can offer unique data sets and challenging objectives that help advance AI technologies. In turn, these partnerships would leverage large teams of dedicated researchers, algorithms, storage, computing infrastructure, and distribution mechanisms in the commercial realm and have the potential to produce the sea change needed for processing extensive amounts of complex visual and associated environmental data critical for understanding the health of our ocean.
