EyeSyn

Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition


Abstract

Recent advances in eye tracking have given birth to a new genre of gaze-based context sensing applications, ranging from cognitive load estimation to emotion recognition. To achieve state-of-the-art recognition accuracy, a large-scale, labeled eye movement dataset is needed to train deep learning-based classifiers. However, due to the heterogeneity in human visual behavior, as well as the labor-intensive and privacy-compromising data collection process, datasets for gaze-based activity recognition are scarce and hard to collect. To alleviate the sparse gaze data problem, we present EyeSyn, a novel suite of psychology-inspired generative models that leverages only publicly available images and videos to synthesize a realistic and arbitrarily large eye movement dataset. Taking gaze-based museum activity recognition as a case study, our evaluation demonstrates that EyeSyn can not only replicate the distinct patterns in the actual gaze signals that are captured by an eye tracking device, but also simulate the signal diversity that results from different measurement setups and subject heterogeneity. Moreover, in the few-shot learning scenario, EyeSyn can be readily incorporated with either transfer learning or meta-learning to achieve 90% accuracy, without the need for a large-scale dataset for training.
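To make the few-shot transfer learning setting concrete, the sketch below shows one plausible two-stage pipeline: pretrain a gaze-activity classifier on a large synthetic set, then fine-tune it on a handful of real recordings. This is a minimal illustration, not the authors' implementation; the network architecture, sequence length, class count, and the random tensors standing in for EyeSyn-generated and real gaze data are all hypothetical placeholders.

```python
import torch
from torch import nn

# Hypothetical setup: gaze sequences of (x, y) coordinates, 4 activity classes.
SEQ_LEN, N_CLASSES = 512, 4

class GazeClassifier(nn.Module):
    """Small 1D-CNN over (x, y) gaze coordinate sequences (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, N_CLASSES)

    def forward(self, x):  # x: (batch, 2, SEQ_LEN)
        return self.head(self.features(x).squeeze(-1))

def train(model, xs, ys, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(xs), ys).backward()
        opt.step()

# Stage 1: pretrain on an arbitrarily large synthetic set. Random tensors are
# used here as stand-ins; EyeSyn would supply realistic synthesized sequences.
syn_x = torch.randn(256, 2, SEQ_LEN)
syn_y = torch.randint(0, N_CLASSES, (256,))
model = GazeClassifier()
train(model, syn_x, syn_y, epochs=20, lr=1e-3)

# Stage 2: fine-tune on a handful of real labeled recordings (few-shot),
# with a smaller learning rate to preserve the pretrained features.
real_x = torch.randn(8, 2, SEQ_LEN)
real_y = torch.randint(0, N_CLASSES, (8,))
train(model, real_x, real_y, epochs=10, lr=1e-4)
```

The same pretrained model could instead serve as the initialization for a meta-learning method, the other few-shot route the abstract mentions; the key idea in both cases is that synthetic data absorbs the bulk of the training burden.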
