Relating Human Gaze and Manual Control Behavior in Preview Tracking Tasks with Spatial Occlusion

Abstract

In manual tracking tasks with preview of the target trajectory, humans have been modeled as dual-mode "near" and "far" viewpoint controllers. This paper investigates the physical basis of these two control mechanisms and studies whether the estimated viewpoint positions represent those parts of the previewed trajectory that humans use for control. A combination of human gaze and control data is obtained through an experiment comparing tracking with full preview (1.5 s), occluded preview, and no preview. System identification is applied to estimate the two look-ahead time parameters of a two-viewpoint preview model. Results show that humans often focus their gaze around the model's near-viewpoint position, and seldom at the far viewpoint. Gaze measurements may therefore augment control data for the online identification of preview control behavior, to improve personalized monitoring or shared-control systems in vehicles.
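To illustrate the class of model the abstract refers to, the following is a minimal sketch of a two-viewpoint preview controller: feedback on the tracking error at a "near" look-ahead point, plus feedforward on the target rate at a "far" look-ahead point. The controlled element (a single integrator), the trajectory, and all gains and look-ahead times here are hypothetical placeholders, not the model structure or parameter values identified in the paper.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 10.0, dt)

# Previewed target trajectory (an arbitrary multisine for illustration)
target = np.sin(0.5 * t) + 0.3 * np.sin(1.7 * t)
target_rate = np.gradient(target, dt)

tau_near, tau_far = 0.2, 0.9   # look-ahead times [s] (hypothetical values)
K_near = 2.0                   # near-viewpoint error-feedback gain (hypothetical)

def look_ahead(signal, i, tau):
    """Sample of the previewed signal tau seconds ahead of index i (clamped)."""
    j = min(i + int(round(tau / dt)), len(signal) - 1)
    return signal[j]

y = 0.0                        # controlled-element output (integrator: y_dot = u)
output = np.empty_like(t)
for i in range(len(t)):
    # Near viewpoint: close the loop on the previewed error.
    # Far viewpoint: feed the previewed target rate forward.
    u = (K_near * (look_ahead(target, i, tau_near) - y)
         + look_ahead(target_rate, i, tau_far))
    y += u * dt
    output[i] = y

rms_err = np.sqrt(np.mean((output - target) ** 2))
rms_tgt = np.sqrt(np.mean(target ** 2))
```

With this structure, the combined feedforward and feedback action keeps the tracking error well below the target's own RMS amplitude; in the paper, the two look-ahead times are not fixed a priori but estimated from measured control data via system identification.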
