Gregory Dudek, Ph.D.
Data collection using both human-in-the-loop and
automated summarization on the Aqua amphibious vehicle.
Wednesday — October 5, 2011
Pacific Forum — 3:00 p.m.
We have been developing a multi-vehicle system for marine applications. This encompasses a swimming vehicle that can go underwater, a surface vehicle, and an airborne vehicle that can overfly the target environment. Our underwater vehicle is unusual in that it uses legs for propulsion and can thus also walk on land, even on rugged terrain. Much of our work is aimed at building an assistant for a human diver, and thus we have looked at various issues related to human-robot interaction, especially in an underwater environment where traditional communication modalities are not available. The presentation will briefly consider the vehicle design and behavior, and then address how a human operator can interact with the vehicle for data collection tasks. The primary mode of human-robot interaction in this context is via a visual language designed for this task.
I will also discuss work in progress towards allowing our robotic vehicle to automatically collect data for later use. We are interested not only in automatic data collection, but also in the selection and screening of important events for human attention. In this context, we have focused on the problem of extracting maximally interesting images or sub-sequences from a data log collected by the robot. In particular, we are interested in a terse "Navigation Summary" that provides a synopsis of where the robot has been and what it has seen. This is computed using a range of image-domain features and the notion of Bayesian or Set Theoretic Surprise to provide an extremely terse summary of a potentially substantial data collection run. I will show a few examples of this using data from different domains. This is work with my graduate students, particularly Junaed Sattar and Yogesh Girdhar.
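The surprise-based selection described above can be illustrated with a minimal sketch. This is not the speaker's actual method (which operates on image-domain features); it only shows the general idea, under the simplifying assumption that each observation is a single scalar feature: maintain a running Gaussian model of what has been seen so far, and score each new observation by the KL divergence between the model before and after the update. Observations whose surprise exceeds a threshold are kept for the summary. The function names and the threshold value are illustrative, not from the talk.

```python
import math

def kl_gaussian(mu0, var0, mu1, var1):
    """KL divergence KL(N(mu0, var0) || N(mu1, var1)) between 1-D Gaussians."""
    return math.log(math.sqrt(var1 / var0)) + (var0 + (mu0 - mu1) ** 2) / (2 * var1) - 0.5

def surprise_summary(features, threshold=0.5):
    """Return indices of observations that most change the running model.

    A running Gaussian (mean/variance, updated incrementally) summarizes
    everything seen so far; the "surprise" of a new observation is the KL
    divergence between the model before and after incorporating it.
    """
    selected = []
    mean, var, n = features[0], 1.0, 1
    for i, x in enumerate(features[1:], start=1):
        n += 1
        new_mean = mean + (x - mean) / n
        new_var = var + ((x - mean) * (x - new_mean) - var) / n
        new_var = max(new_var, 1e-6)  # keep the variance strictly positive
        if kl_gaussian(mean, var, new_mean, new_var) > threshold:
            selected.append(i)
        mean, var = new_mean, new_var
    return selected
```

For example, a log that is mostly uniform with one anomalous frame yields a one-image summary: `surprise_summary([0.0]*20 + [50.0] + [0.0]*20)` selects only the index of the outlier, since the repeated observations quickly stop being surprising.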
Next: October 12 — Kimberly Halsey, PhD