(Mis-)use of standard Autopilot and Full Self-Driving (FSD) Beta

Results from interviews with users of Tesla's FSD Beta


Abstract

Tesla's Full Self-Driving (FSD) Beta program introduces technology that extends the operational design domain of standard Autopilot from highways to urban roads. This research conducted 103 in-depth semi-structured interviews with users of Tesla's FSD Beta and standard Autopilot to evaluate the impact on user behavior and perception. It was found that drivers became complacent over time with Autopilot engaged, failing to monitor the system and engaging in safety-critical behaviors, such as hands-free driving enabled by weights placed on the steering wheel, mind wandering, or sleeping behind the wheel. Drivers' movements of eyes, hands, and feet became more relaxed as they gained experience with Autopilot engaged. FSD Beta, as unfinished technology, required constant supervision, which increased driver stress and mental and physical workload, as drivers had to be constantly prepared for unsafe system behavior (the system doing the wrong thing at the worst time). The hands-on-wheel check was not considered necessarily effective for driver monitoring or for guaranteeing safe use. Drivers adapt to automation over time and engage in potentially dangerous behaviors. Some behavior appears to be a knowing violation of intended use (e.g., weighting the steering wheel), while other behavior reflects misunderstanding or lack of experience (e.g., using Autopilot on roads it was not designed for). As unfinished Beta technology, FSD Beta can introduce new forms of stress and can be inherently unsafe. We recommend that future research investigate to what extent these behavioral changes affect accident risk and whether they can be alleviated through driver state monitoring and assistance.