This chapter explores the specificities of Human–Machine Interaction (HMI) in the context of autonomous vehicles (AVs). It examines how HMI in autonomous vehicles differs fundamentally from traditional car dashboards. With the human driver no longer actively operating the vehicle, a central question arises: *how should AI-driven systems communicate effectively with passengers, pedestrians, and other road users?*
HMI in AVs extends far beyond the driver’s dashboard. It defines the communication bridge between machines, people, and infrastructure — shaping how autonomy is perceived and trusted. Effective HMI determines whether automation is experienced as *intelligent and reliable* or *opaque and alien.*
Traditional driver interfaces were designed to support manual control. In contrast, autonomous vehicles must communicate *intent*, *status*, and *safety* both inside and outside the vehicle. The absence of human drivers requires new communication models to ensure safe interaction among all participants.
This section addresses the available communication channels and discusses how these channels must be redefined to accommodate the new paradigm. Additionally, it considers how various environmental factors—including cultural, geographical, seasonal, and spatial elements—impact communication strategies.
A key concept in this transformation is the Language of Driving (LoD) — a framework for structuring and standardizing how autonomous vehicles express awareness and intent toward humans (Kalda et al., 2022).
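One way to picture an LoD-style scheme is as a small, structured vocabulary of vehicle "utterances." The sketch below is purely illustrative — the intent names, audiences, and `LoDMessage` type are assumptions for this example, not part of the framework described by Kalda et al. (2022):

```python
from dataclasses import dataclass
from enum import Enum, auto


class Intent(Enum):
    """Hypothetical intents an LoD-style vocabulary might standardize."""
    YIELDING = auto()
    PROCEEDING = auto()
    WAITING = auto()
    HANDING_OVER = auto()  # requesting human takeover


class Audience(Enum):
    """Who the message is addressed to."""
    PASSENGER = auto()
    PEDESTRIAN = auto()
    OTHER_VEHICLE = auto()


@dataclass(frozen=True)
class LoDMessage:
    """One structured utterance: what the vehicle intends, for whom, how urgently."""
    intent: Intent
    audience: Audience
    urgency: float  # 0.0 (informational) .. 1.0 (safety-critical)

    def render(self) -> str:
        # A display layer would map this to lights, sounds, or text;
        # here we just produce a readable summary.
        return f"{self.intent.name} -> {self.audience.name} (urgency={self.urgency:.1f})"


msg = LoDMessage(Intent.YIELDING, Audience.PEDESTRIAN, 0.3)
print(msg.render())  # YIELDING -> PEDESTRIAN (urgency=0.3)
```

Separating the structured message from its rendering is the point: the same intent can then be expressed through different channels (external displays, audio, cabin screens) without changing the underlying vocabulary.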
Understanding how humans perceive the world is crucial for autonomous vehicles to communicate effectively. Human perception is multimodal — combining sight, sound, motion cues, and social awareness. By studying these perceptual mechanisms, AV designers can emulate intuitive human signals, such as gradual deceleration that communicates yielding.
Such behaviorally inspired signaling helps AVs become socially legible, supporting *shared understanding* on the road.
Driving is a social act. Culture, norms, and environment shape how humans interpret signals and movements. Autonomous vehicles may need to adapt their communication style — from light colors and icons to audio tones and message phrasing — depending on cultural and regional expectations.
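Adapting light colors, tones, and phrasing to regional expectations can be modeled as a simple profile lookup. The region codes and signal values below are invented placeholders, a minimal sketch of the idea rather than real localization data:

```python
# Hypothetical per-region communication profiles; all values are illustrative.
SIGNAL_PROFILES = {
    "EU": {"yield_color": "cyan", "tone_hz": 440, "phrase": "Please cross"},
    "US": {"yield_color": "white", "tone_hz": 523, "phrase": "Safe to cross"},
}


def signal_profile(region: str, default: str = "EU") -> dict:
    """Select the communication profile for a region, with a fallback default."""
    return SIGNAL_PROFILES.get(region, SIGNAL_PROFILES[default])


print(signal_profile("US")["phrase"])   # Safe to cross
print(signal_profile("JP")["phrase"])   # Please cross (falls back to EU default)
```

In practice such profiles would need to be grounded in local regulation and empirical studies of how road users in each region actually interpret colors, tones, and wording.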
Research explores whether AVs could adopt human-like communication methods, such as digital facial expressions or humanoid gestures, to support more natural interactions in complex social driving contexts.
Modern HMI systems increasingly rely on artificial intelligence, including large language models (LLMs), to process complex situational data and adapt communication in real time.
The evolution toward AI-mediated interfaces marks a shift from fixed UI design toward *conversational and contextual* vehicle communication.
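As a minimal sketch of contextual communication, the function below maps a situation snapshot to a passenger-facing message. A simple rule-and-template stand-in is used here in place of an actual LLM call; the field names (`pedestrian_crossing`, `eta_minutes`) are assumptions for this example:

```python
def compose_update(situation: dict) -> str:
    """Phrase a passenger-facing status update from a situation snapshot.

    A rule-based stand-in for the LLM that a production system might use
    to generate conversational, context-aware messages.
    """
    if situation.get("pedestrian_crossing"):
        return "Pausing briefly: a pedestrian is crossing ahead."
    if situation.get("eta_minutes") is not None:
        return f"On route; arriving in about {situation['eta_minutes']} minutes."
    return "All systems normal."


print(compose_update({"pedestrian_crossing": True}))
print(compose_update({"eta_minutes": 7}))
```

The shift the text describes is exactly this: instead of a fixed dashboard icon, the interface selects or generates phrasing from live context — with an LLM replacing the hand-written rules.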