AVED: Automated Visual Event Detection
Project Manager: Duane Edgington
Lead Scientist: Bruce Robison
Lead Engineer: Danelle Cline
The project seeks to develop and apply automated image-processing technology for event detection and recognition of biological species. The applications for these technologies include:
- Laboratory analysis of underwater video.
- Real-time analysis of video from cabled observatory cameras.
- Real-time analysis of video gathered by remotely operated vehicles (ROVs).
- Systems to be deployed on autonomous underwater vehicles (AUVs).
We will apply selected technologies to solve real scientific and operational problems. The initial goal was to develop AUV systems that could detect an organism and take an image of it; eventually, the system will identify organisms in real time. We have since expanded the vision to related problems that have fewer technical constraints than AUVs but comparable impact and importance for MBARI programs, science, and discovery.
The biomimetic (neuromorphic) saliency model of visual attention (implemented in software by Itti et al.) is used as the processing approach for detecting visual events. The model is augmented with selected techniques from machine vision: object extraction, size estimation from single-camera imaging, and efficient software implementation, including deployment on compute clusters. The system is coupled with state-of-the-art object recognition methods to create a complete, end-to-end automated image-processing system, which will eventually analyze and classify bioluminescent zooplankton. The science application focus is on midwater and benthic animals, including those that use bioluminescence in the deep sea.
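As a rough illustration of the saliency-based detection step, the sketch below builds a toy single-channel, center-surround saliency map and thresholds it to flag candidate visual events. This is a hypothetical simplification for exposition only: the actual model of Itti et al. (the iLab Neuromorphic Vision C++ Toolkit) uses Gaussian pyramids over multiple feature channels (intensity, color, orientation, motion) plus normalization and winner-take-all stages, none of which are reproduced here. All function names below are illustrative, not AVED APIs.

```python
import numpy as np

def saliency_map(frame, levels=4):
    """Toy center-surround saliency on a single intensity channel.
    Hypothetical simplification of the Itti et al. model; assumes the
    frame's height and width are divisible by 2**(levels - 1)."""
    pyramid = [frame.astype(float)]
    for _ in range(levels - 1):
        h, w = pyramid[-1].shape
        # 2x2 mean pooling as a crude stand-in for Gaussian blur + subsampling
        pyramid.append(pyramid[-1].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    base = pyramid[0]
    sal = np.zeros_like(base)
    for coarse in pyramid[1:]:
        factor = base.shape[0] // coarse.shape[0]
        # upsample the coarse ("surround") level by pixel repetition,
        # then accumulate the center-surround contrast
        up = np.kron(coarse, np.ones((factor, factor)))
        sal += np.abs(base - up)
    return sal / (sal.max() or 1.0)  # normalize to [0, 1]

def detect_events(frame, thresh=0.5):
    """Return (row, col) pixel coordinates whose saliency exceeds thresh."""
    ys, xs = np.where(saliency_map(frame) > thresh)
    return list(zip(ys.tolist(), xs.tolist()))

# Usage: a bright blob on a dark background is flagged as a salient event.
frame = np.zeros((32, 32))
frame[10:14, 10:14] = 255.0
events = detect_events(frame)
```

In the real pipeline, the thresholded salient regions would then feed the object-extraction and recognition stages rather than being reported as raw pixel lists.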