Human–Machine Interface and Communication


This chapter explores the specificities of Human–Machine Interaction (HMI) in the context of autonomous vehicles (AVs). It examines how HMI in AVs differs fundamentally from the traditional car dashboard. With the human driver no longer actively operating the vehicle, a central challenge arises: *how should AI-driven systems communicate effectively with passengers, pedestrians, and other road users?*

HMI in AVs extends far beyond the driver’s dashboard. It defines the communication bridge between machines, people, and infrastructure — shaping how autonomy is perceived and trusted. Effective HMI determines whether automation is experienced as *intelligent and reliable* or *opaque and alien.*

Changing Paradigms of Communication

Traditional driver interfaces were designed to support manual control. In contrast, autonomous vehicles must communicate *intent*, *status*, and *safety* both inside and outside the vehicle. The absence of human drivers requires new communication models to ensure safe interaction among all participants.

This section surveys the available communication channels and discusses how they must be redefined to fit the new paradigm. It also considers how environmental factors, including cultural, geographical, seasonal, and spatial elements, shape communication strategies.

A key concept in this transformation is the Language of Driving (LoD) — a framework for structuring and standardizing how autonomous vehicles express awareness and intent toward humans (Kalda et al., 2022).

Human Perception and Driving

Understanding how humans perceive the world is crucial for autonomous vehicles to communicate effectively. Human perception is multimodal — combining sight, sound, motion cues, and social awareness. By studying these perceptual mechanisms, AV designers can emulate intuitive human signals such as:

  • Body orientation or focus of attention.
  • Gesture and trajectory anticipation.
  • Subtle speed or direction changes as non-verbal cues.

Such behaviorally inspired signaling helps AVs become socially legible, supporting *shared understanding* on the road.

Cultural and Social Interactions

Driving is a social act. Culture, norms, and environment shape how humans interpret signals and movements. Autonomous vehicles may need to adapt their communication style — from light colors and icons to audio tones and message phrasing — depending on cultural and regional expectations.

Research explores whether AVs could adopt human-like communication methods, such as digital facial expressions or humanoid gestures, to support more natural interactions in complex social driving contexts.

The Role of AI in Communication

Modern HMI systems increasingly rely on artificial intelligence, including large language models (LLMs), to process complex situational data and adapt communication in real time. AI enables:

  • Context-aware dialogue systems for passengers and operators.
  • Adaptive message prioritization based on urgency and environment.
  • Natural language explanations of AV behavior (e.g., *“Slowing down for crossing pedestrian”*).

The evolution toward AI-mediated interfaces marks a shift from fixed UI design toward *conversational and contextual* vehicle communication.

Figure: Example of multimodal HMI used in TalTech autonomous shuttle research (source: Kalda et al., 2022).
