While the previous section described the foundations and goals of HMI, this section focuses on **how autonomous vehicles communicate with various stakeholders** and through which modes.
These interactions can be categorized by user type, purpose, and proximity.

===== 1. Passenger Communication =====

The **vehicle–passenger interface** supports comfort, awareness, and accessibility. It replaces the human driver’s social role by providing:

  * Visual or auditory cues explaining system decisions (e.g., “Yielding to pedestrian”).
  * Clear indications of route, stops, and operational mode.
  * Options for emergency stop, help request, or trip feedback.

Passenger communication must balance automation with reassurance. In an Estonian field study (Kalda, Sell & Soe, 2021), over 90% of first-time AV users reported feeling safe and willing to ride again when the interface clearly explained the vehicle’s actions.
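
As an illustration only, the sketch below shows one way such explanatory cues could be represented in software. The message fields, the ''Modality'' names, and the ''announce'' helper are hypothetical assumptions for this sketch, not part of any specific AV platform.

<code python>
# Hypothetical sketch of passenger-facing HMI messages (not a real AV API).
from dataclasses import dataclass
from enum import Enum, auto


class Modality(Enum):
    VISUAL = auto()    # on-board screen
    AUDITORY = auto()  # spoken announcement


@dataclass
class PassengerMessage:
    text: str          # explanation shown or spoken to passengers
    modalities: tuple  # which channels should carry the message
    priority: int = 0  # higher values interrupt lower ones


def announce(message: PassengerMessage) -> None:
    """Route one message to every requested modality (placeholder output)."""
    for modality in message.modalities:
        print(f"[{modality.name}] {message.text}")


# Example: explain a yielding manoeuvre and show the next stop.
announce(PassengerMessage("Yielding to pedestrian", (Modality.VISUAL, Modality.AUDITORY), priority=2))
announce(PassengerMessage("Next stop: University Library", (Modality.VISUAL,)))
</code>
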
===== 2. Pedestrian Communication =====

The **vehicle–pedestrian interface (V2P)** substitutes for human cues such as eye contact or gestures.
The //Language of Driving// (Kalda et al., 2022) proposes using standardized visual symbols, light bars, or projections to express intent:

  * Green arrows — invitation to cross.
  * White pulses — awareness of pedestrian presence.
  * Red cross — do not cross / vehicle in motion.

Pedestrian communication must remain **universal and intuitive**, avoiding dependence on text or language comprehension.
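
One way to keep such intent signalling auditable is to hold the mapping from vehicle intent to external signal in a single table, as in the minimal sketch below. The intent names and the ''show_signal'' stub are illustrative assumptions, not part of the Language of Driving specification.

<code python>
# Illustrative mapping from vehicle intent to external (eHMI) light signals.
# The intent states and signal descriptions are assumptions for this sketch.
from enum import Enum, auto


class Intent(Enum):
    YIELDING = auto()         # vehicle has stopped and invites crossing
    PEDESTRIAN_SEEN = auto()  # vehicle acknowledges a detected pedestrian
    DRIVING = auto()          # vehicle is in motion, do not cross


SIGNALS = {
    Intent.YIELDING: "green arrows",
    Intent.PEDESTRIAN_SEEN: "white pulses",
    Intent.DRIVING: "red cross",
}


def show_signal(intent: Intent) -> str:
    """Return the external signal for the current intent (placeholder for a light-bar driver)."""
    return SIGNALS[intent]


print(show_signal(Intent.YIELDING))  # -> green arrows
</code>
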
===== 3. Safety Operator and Teleoperation =====

At current autonomy levels (SAE L3–L4), a **safety operator interface** remains essential.
Two variants exist:

  * **Onboard HMI:** allows manual control, displays alerts, and ensures quick handover.
  * **Teleoperation station:** enables remote monitoring and control via secure networks.

Teleoperation acts as a //bridge// between human oversight and full autonomy — essential for handling ambiguous traffic or emergency scenarios.
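
A central design question for both variants is who currently holds control authority and how handover happens. The sketch below is a simplified, hypothetical state machine for that handover; real deployments add timeouts, operator authentication, and fail-safe stops.

<code python>
# Simplified, hypothetical control-authority handover between the automation,
# an onboard safety operator, and a remote teleoperator.
from enum import Enum, auto


class Authority(Enum):
    AUTONOMOUS = auto()
    ONBOARD_OPERATOR = auto()
    REMOTE_OPERATOR = auto()


# Allowed transitions; e.g. in this sketch remote control is only granted from autonomous mode.
TRANSITIONS = {
    Authority.AUTONOMOUS: {Authority.ONBOARD_OPERATOR, Authority.REMOTE_OPERATOR},
    Authority.ONBOARD_OPERATOR: {Authority.AUTONOMOUS},
    Authority.REMOTE_OPERATOR: {Authority.AUTONOMOUS, Authority.ONBOARD_OPERATOR},
}


def request_handover(current: Authority, requested: Authority) -> Authority:
    """Grant the request if the transition is allowed, otherwise keep the current authority."""
    if requested in TRANSITIONS[current]:
        return requested
    return current


state = Authority.AUTONOMOUS
state = request_handover(state, Authority.REMOTE_OPERATOR)  # remote operator takes over
state = request_handover(state, Authority.AUTONOMOUS)       # hands control back
print(state.name)  # -> AUTONOMOUS
</code>
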
===== 4. Maintenance and Diagnostics Interface =====

A dedicated **maintenance interface** enables technicians to safely inspect and update the vehicle:

  * Sensor and actuator diagnostics.
  * Log analysis and system replay.
  * Secure firmware updates and access control.

Such interfaces ensure traceability, reliability, and compliance with safety regulations.
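
For traceability, every diagnostic action can be captured as a structured, timestamped record tied to the technician who performed it. The sketch below shows one possible shape for such a record; the field names are assumptions, not a standardized diagnostics schema.

<code python>
# Hypothetical structure for a traceable maintenance/diagnostic record.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class DiagnosticRecord:
    vehicle_id: str
    component: str   # e.g. "front_lidar"
    status: str      # e.g. "ok", "degraded", "fault"
    technician: str  # who performed the check (access control / audit trail)
    timestamp: str


def new_record(vehicle_id: str, component: str, status: str, technician: str) -> DiagnosticRecord:
    return DiagnosticRecord(
        vehicle_id, component, status, technician,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )


record = new_record("shuttle-03", "front_lidar", "degraded", "tech.jane")
print(json.dumps(asdict(record), indent=2))  # serialized for logging or replay
</code>
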
===== 5. Fleet Manager Interface =====

Fleet-level interfaces provide centralized control and analytics for multiple vehicles.
They support:

  * Mission planning and route monitoring.
  * Predictive maintenance using vehicle telemetry.
  * Integration with smart city and MaaS platforms.

These tools operate mainly over remote communication channels, relying on secure data infrastructure.
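
Predictive maintenance at fleet level often reduces to comparing incoming telemetry against thresholds across many vehicles. The sketch below illustrates that idea with invented telemetry fields and limits; it is not tied to any particular fleet-management product.

<code python>
# Illustrative fleet-level check: flag vehicles whose telemetry suggests maintenance.
# Telemetry fields and thresholds are invented for this sketch.
FLEET_TELEMETRY = {
    "shuttle-01": {"battery_health_pct": 91, "brake_wear_pct": 35},
    "shuttle-02": {"battery_health_pct": 78, "brake_wear_pct": 72},
    "shuttle-03": {"battery_health_pct": 85, "brake_wear_pct": 40},
}

THRESHOLDS = {"battery_health_pct": ("min", 80), "brake_wear_pct": ("max", 70)}


def needs_maintenance(telemetry: dict) -> bool:
    """Return True if any telemetry value crosses its configured limit."""
    for field, (kind, limit) in THRESHOLDS.items():
        value = telemetry[field]
        if (kind == "min" and value < limit) or (kind == "max" and value > limit):
            return True
    return False


flagged = [vid for vid, t in FLEET_TELEMETRY.items() if needs_maintenance(t)]
print(flagged)  # -> ['shuttle-02']
</code>
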
===== 6. Direct vs. Remote Communication =====

Autonomous vehicle interaction can be divided into **direct** (local), **remote** (supervisory), and **service-level** (asynchronous) communication:

^ Type ^ Example ^ Key Features ^
| **Direct (Local)** | Passenger, pedestrian, or on-site operator | Low latency, physical proximity, immediate feedback. |
| **Remote (Supervisory)** | Teleoperation or fleet control | Network-based, high security, possible latency. |
| **Service-Level (Asynchronous)** | Maintenance, updates, diagnostics | Back-end communication; focuses on reliability and traceability. |
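
One practical consequence of this split is that messages are routed and secured differently depending on the channel. The sketch below simply encodes the table's three categories as data; the latency and transport attributes are illustrative, not measured requirements.

<code python>
# The three communication categories from the table above, encoded as data.
# Attribute values are qualitative illustrations only.
from dataclasses import dataclass


@dataclass(frozen=True)
class Channel:
    name: str
    latency: str    # qualitative latency expectation
    transport: str  # how the link is realized
    example: str


CHANNELS = [
    Channel("direct (local)", "low", "in-vehicle / line of sight", "passenger or pedestrian HMI"),
    Channel("remote (supervisory)", "network-dependent", "secure mobile network", "teleoperation, fleet control"),
    Channel("service-level (asynchronous)", "not time-critical", "back-end API", "diagnostics, firmware updates"),
]

for ch in CHANNELS:
    print(f"{ch.name}: latency={ch.latency}, transport={ch.transport}, e.g. {ch.example}")
</code>
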
===== 7. Design Principles for Effective Communication =====

To ensure that human–machine communication is intuitive and safe, several universal design principles apply:

  * **Transparency:** reveal intent and system state clearly.
  * **Consistency:** uniform behavior across environments.
  * **Accessibility:** accommodate diverse users and abilities.
  * **Multimodality:** combine light, sound, and motion cues.
  * **Security and privacy:** protect both human and machine data.

When applied systematically, these principles make autonomous systems understandable, predictable, and trustworthy.
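
Consistency and multimodality in particular can be checked mechanically: every intent the vehicle can express should have a cue in every supported modality, so no user group is left without information. The sketch below illustrates such a coverage check; the intent, modality, and cue names are hypothetical.

<code python>
# Consistency/multimodality coverage check (illustrative): every intent should
# have a cue in every modality. Names below are invented for this sketch.
INTENTS = ["yielding", "driving", "stopping"]
MODALITIES = ["light", "sound", "text"]

CUES = {
    ("yielding", "light"): "green arrows",
    ("yielding", "sound"): "soft chime",
    ("yielding", "text"): "You may cross",
    ("driving", "light"): "red cross",
    ("driving", "sound"): "motion hum cue",
    ("driving", "text"): "Vehicle moving",
    ("stopping", "light"): "white pulses",
    ("stopping", "sound"): "deceleration tone",
    # ("stopping", "text") deliberately missing so the check fires below
}

missing = [(i, m) for i in INTENTS for m in MODALITIES if (i, m) not in CUES]
print(missing)  # -> [('stopping', 'text')]
</code>
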
----

**References**

Kalda, K.; Pizzagalli, S.-L.; Soe, R.-M.; Sell, R.; Bellone, M. (2022). //Language of Driving for Autonomous Vehicles.// Applied Sciences, 12(11), 5406. https://doi.org/10.3390/app12115406

Kalda, K.; Sell, R.; Soe, R.-M. (2021). //Use Case of Autonomous Vehicle Shuttle and Passenger Acceptance.// Proceedings of the Estonian Academy of Sciences, 70(4), 429–435. https://doi.org/10.3176/proc.2021.4.09
  