You’re the One Whom I’m Talking To: The Role of Contextual External Human-Machine Interfaces in Multi-Road User Conflict Scenarios

Authors: Kang, Y., Park, J., Hwang, S., Seong, M., Kim, G., and Kim, S.
Venue: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), September 2025


🚗 Project Overview

This study investigates how contextual external human-machine interfaces (eHMIs) can improve safety and communication clarity in multi-road user conflict scenarios involving pedestrians, cyclists, and drivers.
In a VR-based multi-agent simulation (N=42), three types of contextual eHMIs (Whom, When, and Where) were compared against a No eHMI/No Context baseline; the three cue types are summarized in the list below, followed by an illustrative sketch.

  • Whom eHMI specifies the intended recipient (e.g., “You may go”) and achieved the shortest reaction times and highest ratings for clarity and trust.
  • When/Where eHMIs provide timing and stopping-location cues, improving overall comprehension and perceived safety.
  • No eHMI/No Context led to lower performance and user confidence.
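To make the three cue types concrete, the sketch below models them as simple message templates. It is a minimal illustration under assumed names (CueType, RoadUser, compose_message) and assumed message wording; it is not the interface logic or code used in the study.

    # Minimal illustrative sketch (hypothetical; not the study's implementation):
    # models the Whom / When / Where cue types as eHMI message templates.
    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional


    class CueType(Enum):
        WHOM = auto()   # addresses a specific road user ("You may go")
        WHEN = auto()   # conveys when the vehicle will act
        WHERE = auto()  # conveys where the vehicle will stop
        NONE = auto()   # baseline: no contextual information


    @dataclass
    class RoadUser:
        user_id: str
        role: str  # e.g., "pedestrian", "cyclist", "driver"


    def compose_message(cue: CueType,
                        recipient: Optional[RoadUser] = None,
                        eta_s: Optional[float] = None,
                        stop_point: Optional[str] = None) -> str:
        """Return the eHMI text for a given contextual cue type."""
        if cue is CueType.WHOM and recipient is not None:
            # Recipient-specific cue: removes ambiguity about who is addressed.
            return f"{recipient.role.capitalize()} {recipient.user_id}: you may go"
        if cue is CueType.WHEN and eta_s is not None:
            return f"Stopping in {eta_s:.0f} s"
        if cue is CueType.WHERE and stop_point is not None:
            return f"Stopping at {stop_point}"
        return ""  # baseline: nothing displayed


    # Example: a 'Whom' cue aimed at one pedestrian in a multi-user conflict.
    print(compose_message(CueType.WHOM, RoadUser("P1", "pedestrian")))
    # -> "Pedestrian P1: you may go"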

Results demonstrate that recipient-specific (Whom) contextual cues reduce ambiguity and accelerate decision-making in complex traffic interactions.

