Although Pursuits enables interaction without calibration, the interaction is limited to moving objects. For eye tracking to work with static content, users first need to perform a calibration, that is, a procedure that maps the tracker's eye data to gaze locations on the display. Traditionally, the user must stare at static dots that appear at key locations on the display (for example, the corners). This can be straining, because our eyes naturally move every 200-300 ms and are not used to holding still on a target for a second or more.

To remedy this problem, we developed a calibration procedure that uses Pursuits. Instead of static targets, a single dot moves along the border of the screen. The system detects when the user follows the moving dot and collects the corresponding eye data. Besides relying on natural eye movements and being less straining to the eyes, Pursuit calibration is very robust, because the system only collects data points while the user is actually looking at the moving dot, never while they are looking away.

Because it is based on natural eye movements, Pursuit calibration also enables us to calibrate users who are not aware they are being calibrated. The calibration procedure can blend into an application instead of feeling like a separate step. The following video illustrates this.




Papers:

Pursuit Calibration: Making Gaze Calibration Less Tedious and More Flexible,
K. Pfeuffer, M. Vidal, J. Turner, A. Bulling and H. Gellersen, Proc. of UIST 2013, October 2013.
PDF | DOI | Video