Figure 1 Probability distributions over facial features.
Figure 2 The Wollaston effect. The perceived gaze direction differs in the two images even though the eyes are identical.
Opengazer is an open source application that uses a webcam to estimate the gaze point for human-computer interaction. The philosophy behind Opengazer is that gaze tracking should be done with cheap, off-the-shelf cameras, so as to make it available to as wide a user base as possible. The main application of this project is low-cost accessibility for the AEGIS project.
This page describes my own work on this project. The main Opengazer project page lives here. My line of work follows the gaze tracking pipeline: feature tracking, head pose estimation, gaze point estimation. I am specifically interested in Bayesian methods for this computer vision problem.
The first stage of the pipeline requires locating various useful facial features (Figure 1). To solve this problem in the Bayesian framework, a generative model is built over the image data around the features of interest, and the posterior distribution over feature locations is inferred from measurements made on the image. My main contribution is the use of linear filters to model image data for real-time Bayesian feature localisation. Feature tracking is then possible by incorporating a motion model.
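The idea above can be sketched as follows: a linear filter response over the image is treated as a log-likelihood, combined with a prior over locations, and normalised into a posterior. This is a minimal illustrative sketch, not Opengazer's actual code; the function name, arguments, and the particular likelihood choice are assumptions.

```python
import numpy as np

def feature_posterior(image, filt, prior):
    """Posterior over a facial feature's location from a linear filter.

    image: 2-D grayscale array; filt: 2-D linear filter (e.g. a learned
    eye-corner template); prior: prior probabilities over the valid
    filter positions. Names and interfaces here are illustrative only.
    """
    fh, fw = filt.shape
    H = image.shape[0] - fh + 1
    W = image.shape[1] - fw + 1
    # Linear filter response at every candidate location ("valid" correlation).
    resp = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            resp[y, x] = np.sum(image[y:y + fh, x:x + fw] * filt)
    # Treat the response as a log-likelihood, combine with the prior,
    # and normalise to obtain a proper posterior over locations.
    log_post = resp + np.log(prior + 1e-12)
    log_post -= log_post.max()            # numerical stability before exp
    post = np.exp(log_post)
    return post / post.sum()
```

In practice the correlation would be computed with an FFT or an optimised library routine rather than explicit loops, but the Bayesian combination step is the same.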
My current work focuses on probabilistic head pose estimation. This is motivated by the fact that perceived gaze direction depends on head pose, a phenomenon known as the Wollaston effect (Figure 2), so head pose must be known to determine the gaze point. The goal is to find a probabilistic estimate of the head pose parameters (location and rotation) relative to the camera, using the facial feature location estimates. By incorporating a probabilistic motion model that constrains the inter-frame change in head pose parameters, we can track head pose through a video sequence.
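The tracking scheme described above can be illustrated with a simple linear-Gaussian filter over the pose parameters: a random-walk motion model inflates the uncertainty between frames, and each frame's feature-based pose measurement is fused back in. This is a hedged sketch under strong simplifying assumptions (independent dimensions, scalar variances), not the project's actual model; all names and parameters are hypothetical.

```python
import numpy as np

def track_pose(measurements, meas_var, motion_var, init_pose, init_var):
    """Gaussian head-pose tracking with a random-walk motion model.

    measurements: per-frame 6-vectors (x, y, z, yaw, pitch, roll)
    estimated from facial feature locations. Simplified illustrative
    sketch: one shared scalar variance per step, linear-Gaussian update.
    """
    mean = np.asarray(init_pose, dtype=float)
    var = float(init_var)
    track = []
    for z in measurements:
        # Predict: the motion model constrains inter-frame change by
        # adding only a bounded amount of process noise.
        var = var + motion_var
        # Update: fuse the prediction with this frame's measurement.
        gain = var / (var + meas_var)          # Kalman gain (per dimension)
        mean = mean + gain * (np.asarray(z, dtype=float) - mean)
        var = (1.0 - gain) * var
        track.append(mean.copy())
    return track
```

A smaller `motion_var` trusts the motion model more, smoothing the track at the cost of lagging behind fast head movements.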
[Wollaston, 1824] W.H. Wollaston. On the apparent direction of eyes in a portrait. Philosophical Transactions of the Royal Society of London, 114:247–256, 1824.