Open Access · Journal Article · DOI

What Attracts the Driver's Eye? Attention as a Function of Task and Events

Yke Bauke Eisma, +2 more
11 Jul 2022
Vol. 13, Iss. 7, pp. 333-333
TLDR
In this article, the authors explore how drivers of an automated vehicle distribute their attention as a function of environmental events and driving task instructions, concluding that eye movements are strongly situation-dependent, with areas of interest (windshield, mirrors, and dashboard) attracting attention when events occurred in those areas.
Abstract
This study explores how drivers of an automated vehicle distribute their attention as a function of environmental events and driving task instructions. Twenty participants were asked to monitor pre-recorded videos of a simulated driving trip while their eye movements were recorded using an eye-tracker. The results showed that eye movements are strongly situation-dependent, with areas of interest (windshield, mirrors, and dashboard) attracting attention when events (e.g., passing vehicles) occurred in those areas. Furthermore, the task instructions provided to participants (i.e., speed monitoring or hazard monitoring) affected their attention distribution in an interpretable manner. It is concluded that eye movements while supervising an automated vehicle are strongly ‘top-down’, i.e., based on an expected value. The results are discussed in the context of the development of driver availability monitoring systems.
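The attention-distribution measure described in the abstract lends itself to a simple per-AOI aggregation. Below is a minimal sketch (not the authors' analysis code) of how raw gaze samples could be binned into areas of interest and summarized per task instruction; the column names, AOI bounding boxes, and file name are illustrative assumptions only.

import pandas as pd

# Hypothetical AOI bounding boxes in screen pixels: (x_min, y_min, x_max, y_max).
# These values are placeholders, not the coordinates used in the study.
AOIS = {
    "windshield":   (0, 0, 1920, 700),
    "left_mirror":  (0, 700, 300, 900),
    "right_mirror": (1620, 700, 1920, 900),
    "dashboard":    (700, 800, 1220, 1080),
}

def label_aoi(x, y):
    # Return the first AOI whose bounding box contains the gaze point.
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "other"

def aoi_shares(samples):
    # Proportion of gaze samples falling in each AOI, split by task instruction
    # (e.g., speed monitoring vs. hazard monitoring).
    samples = samples.copy()
    samples["aoi"] = [label_aoi(x, y) for x, y in zip(samples["x"], samples["y"])]
    counts = samples.groupby(["task", "aoi"]).size().rename("n").reset_index()
    counts["share"] = counts["n"] / counts.groupby("task")["n"].transform("sum")
    return counts

# Example usage (hypothetical CSV with columns x, y, task):
# gaze = pd.read_csv("gaze_samples.csv")
# print(aoi_shares(gaze))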

Citations
Journal ArticleDOI

Studying Driver’s Perception Arousal and Takeover Performance in Autonomous Driving

TL;DR: Zhang et al. analyzed the change in drivers' perception levels and its influence on takeover performance during autonomous driving, finding that male drivers have higher perception levels than female drivers and that drivers prioritize leisure tasks over professional ones.
Journal ArticleDOI

The Effect of Multifactor Interaction on the Quality of Human–Machine Co-Driving Vehicle Take-Over

TL;DR: In this article, the effects of interactions among non-driving-related tasks, takeover request time, and takeover mode on takeover performance in human-machine cooperative driving in a highway environment were investigated.
Journal ArticleDOI

Where drivers are looking at during takeover: Implications for safe takeovers during conditionally automated driving.

TL;DR: In this paper, a study involving 27 subjects was conducted on a high-fidelity driving simulator with a Stewart motion platform with six degrees of freedom; the subjects' maneuvers were recorded by the simulator and their eye gazes by Smart Eye Pro and Smart Recorder.
References
Posted Content

YOLOv4: Optimal Speed and Accuracy of Object Detection

TL;DR: This work uses new features (WRC, CSP, CmBN, SAT, Mish activation, Mosaic data augmentation, DropBlock regularization, and CIoU loss) and combines some of them to achieve state-of-the-art results: 43.5% AP on the MS COCO dataset at a real-time speed of ~65 FPS on a Tesla V100.
Journal ArticleDOI

When does age-related cognitive decline begin?

TL;DR: Results from three methods of estimating retest effects in this project converge on a conclusion that some aspects of age-related cognitive decline begin in healthy educated adults when they are in their 20s and 30s.
Journal ArticleDOI

Spatial ability for STEM domains: Aligning over 50 years of cumulative psychological knowledge solidifies its importance.

TL;DR: The importance of spatial ability in educational pursuits and the world of work was examined in this article, with particular attention devoted to STEM (science, technology, engineering, and mathematics) domains.
Journal ArticleDOI

Where we look when we steer

TL;DR: It is found that drivers rely particularly on the 'tangent point' on the inside of each curve, seeking this point 1–2 s before each bend and returning to it throughout the bend; this work examines how that information is used.
Journal ArticleDOI

Strategies of Visual Search by Novice and Experienced Drivers

TL;DR: In this paper, six novice drivers drove a 2.1-mi neighborhood route and a 4.3-mi freeway route while their eye movements (including blinks and glances to the vehicle's mirrors and speedometer) were videotaped.
Trending Questions (1)
How do eye movements differ between drivers of autonomous vehicles and manual vehicles?

The provided paper does not compare eye movements between drivers of autonomous vehicles and manual vehicles. The paper focuses on how drivers of an automated vehicle distribute their attention based on environmental events and driving task instructions.