/PRNewswire/ — The integration of automated vehicles into urban mobility promises several benefits, including increased safety, reduced traffic congestion, and enhanced accessibility. Automated vehicles also allow drivers to engage in non-driving related tasks (NDRTs), such as relaxing, working, or watching multimedia en route. However, widespread adoption is hindered by passengers’ limited trust in these vehicles. Explanations of automated vehicle decisions can foster trust by giving passengers a sense of control and reducing negative experiences, but to be effective, these explanations must be informative, understandable, and concise.
Existing explainable artificial intelligence (XAI) approaches cater primarily to developers, focusing on high-risk scenarios or exhaustive explanations that may be unsuitable for passengers. Passenger-centric XAI models must instead capture what information passengers need, and when they need it, in real-world driving scenarios.
To address this gap, a research team led by Professor SeungJun Kim from the Gwangju Institute of Science and Technology (GIST), South Korea, investigated the explanation demands of automated vehicle passengers under real-road conditions. The team then introduced a multimodal dataset, called TimelyTale, which includes passenger-specific sensor data …