On Thursday, April 21, 15:15–17:00 in NI:C0626, Jeff Pelz visits the Department of Computer Science to give a seminar titled "Measuring complex behavior: Tools to analyze mobile observer's gaze".
Mature data-analysis tools are available to researchers using laboratory-based eye tracking systems, but new wearable eye trackers are creating huge data sets that are incompatible with those tools. The new systems can monitor complex behaviors in natural environments that were inaccessible to earlier eye tracking systems, whose inherent constraints limited environment, movement, and behavior.
I will describe approaches to measuring observer behavior in these unconstrained environments, using methods from machine vision (e.g., multiview geometry and SLAM) to code gaze targets spatially, and methods based on semantic labeling that do not depend on fixed 3D spatial locations. The latter approach allows coding of dynamic scenes without explicit object tracking and is more flexible and extensible than object-based coding schemes.
The speaker is the Frederick Wiedman Professor of Imaging Science and Co-director of the Multidisciplinary Vision Research Laboratory at the Rochester Institute of Technology (RIT) in Rochester, NY, USA. He received a Ph.D. in Brain and Cognitive Science from the University of Rochester, where he began his work in gaze and behavior in the 1990s. His research has focused on the development and application of robust wearable eye tracking systems that allow the study of complex behavior in natural environments, and on data-analysis tools to handle the resulting datasets.
The Multidisciplinary Vision Research Laboratory (MVRL) is a collaborative group of faculty, staff, and students at RIT representing a broad range of disciplines with a common interest in developing and using eye tracking tools to explore complex behavior. The MVRL brings together researchers from Imaging Science, Computer Science, Psychology, Linguistics, Engineering, and Health Sciences.