How drivers respond to visual vs. auditory information in advisory traffic information systems
Minjuan Wang, Yuan Liao, Sus Lundgren Lyckvi and Fang Chen
Behaviour and Information Technology, 2020, vol. 39, issue 12, 1308-1319
Abstract:
To date, many efforts have been made to explore how advisory information can support drivers' decision-making. Previous studies have mainly focused on a single modality, e.g. the visual, auditory or haptic modality. In contrast, this study compares data from two simulator studies with 50 participants in total, in which the visual and the auditory modality, respectively, were used to present the same type of advisory traffic information under the same driving scenarios. We thereby compare the effect of these two modalities on drivers' responses and driving performance. Our findings indicate that modality significantly influences drivers' behaviour patterns: visual information helps drivers drive more accurately and efficiently, whereas auditory information supports quicker responses. This suggests potential benefits in applying both modalities in tandem, as they complement each other. Accordingly, we present several design recommendations for advisory traffic information systems.
Full text: http://hdl.handle.net/10.1080/0144929X.2019.1667439 (text/html; access restricted to subscribers)
Persistent link: https://EconPapers.repec.org/RePEc:taf:tbitxx:v:39:y:2020:i:12:p:1308-1319
Ordering information: http://www.tandfonline.com/pricing/journal/tbit20
DOI: 10.1080/0144929X.2019.1667439