Automated visual event detection
Project Manager/Lead Engineer: Duane Edgington
Lead Scientist: Bruce Robison
In this project we develop and apply technology to automatically process images for event detection and for recognition of target biological species. Applications for these technologies include 1) systems to be deployed on autonomous underwater robotic vehicles (AUVs), 2) lab analysis of underwater video, 3) real-time analysis of video from cabled observatory cameras, and 4) real-time analysis of video gathered by ROVs. Our initial driving vision was to develop imaging systems for AUVs that could detect an organism in passing and photograph it. Eventually, we hope to be able to identify organisms autonomously, in real time. We have expanded the vision to related problems that have fewer technical constraints than AUVs while still having large impact and importance to MBARI programs, science and discovery.
Progress to date: In 2003 we developed a process flow that enables automated comparison of automated event detection with human annotation. We demonstrated algorithms to separate “interesting” events from “boring” events detected by the saliency algorithm, and we are applying supervised learning to tune the selection of “interesting” events by comparing results with human annotation.

Development goals for 2004: We will address the requirements for an automated annotation assistant, and will interface and integrate with VARS (MBARI's video annotation and reference system). We hope to adapt a variation of the system to the analysis of bioluminescent animals. We will also continue developing and improving the core system.
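To illustrate the general idea, the following is a minimal sketch of the two steps described above: flagging salient events in a video sequence and scoring the detections against human annotations. It uses simple frame differencing as a stand-in attention cue (the project's actual saliency algorithm is more sophisticated), and the function names, thresholds, and frame-index matching scheme are all hypothetical choices for this example, not part of the deployed system.

```python
import numpy as np


def detect_salient_events(frames, threshold=25.0, min_area=50):
    """Flag frame indices where the change from the previous frame is large.

    Frame differencing is a simple stand-in for a saliency algorithm:
    an event is declared when at least `min_area` pixels change by more
    than `threshold` intensity levels between consecutive frames.
    """
    events = []
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float))
        if np.count_nonzero(diff > threshold) >= min_area:
            events.append(i)
    return events


def compare_with_annotations(detected, annotated, tolerance=2):
    """Score detected event frames against human-annotated event frames.

    A detection counts as correct if it falls within `tolerance` frames
    of some annotation; returns (precision, recall).
    """
    hit = lambda x, ys: any(abs(x - y) <= tolerance for y in ys)
    matched_det = [d for d in detected if hit(d, annotated)]
    matched_ann = [a for a in annotated if hit(a, detected)]
    precision = len(matched_det) / len(detected) if detected else 0.0
    recall = len(matched_ann) / len(annotated) if annotated else 0.0
    return precision, recall


# Tiny synthetic example: a bright "organism" appears in frame 4 only,
# so both its appearance (frame 4) and disappearance (frame 5) register
# as events relative to the preceding frame.
frames = [np.zeros((64, 64), dtype=np.uint8) for _ in range(10)]
frames[4][10:30, 10:30] = 200
events = detect_salient_events(frames)          # [4, 5]
precision, recall = compare_with_annotations(events, annotated=[4])
```

Supervised tuning, as described above, would then amount to adjusting parameters such as `threshold` and `min_area` (or a learned classifier over richer features) to maximize agreement with the human annotations.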