| < | < | ||
| </ | </ | ||
| + | |||
| + | In Figure 1, the main software layers and their functions are depicted. | ||
| + | |||
| + | **Hardware Abstraction Layer (HAL)** | ||
| + | The HAL provides standardised access to hardware resources. It translates hardware-specific details (e.g., sensor communication protocols, voltage levels) into software-accessible APIs. | ||
| + | This functionality typically includes: | ||
| + | * Managing device drivers for LiDARs, radars, cameras, IMUs, and other devices needed for the autonomous system. | ||
| + | * Monitoring power systems and diagnostics, | ||
| + | * Providing time-stamped sensor data streams. The real-time or time-stamped data is the essential part of any data-driven system to ensure proper work of time-series analysis algorithms including deep learning methods. | ||
| + | * Handling real-time control of actuators (motors, servos, brakes). | ||
| + | HAL ensures portability — software modules remain agnostic to specific hardware vendors or configurations ((Lee, E. A., & Seshia, S. A. (2020). Introduction to Embedded Systems: A Cyber-Physical Systems Approach (3rd ed.). MIT Press.)). | ||
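
To make the HAL idea concrete, here is a minimal sketch of a vendor-neutral, time-stamping driver interface in Python. All names (''SensorDriver'', ''FakeImuDriver'', ''read_raw'') are hypothetical illustrations, not part of any specific HAL.

<code python>
# Minimal HAL sketch (hypothetical names): every driver hides its bus
# protocol behind the same read() API and attaches a timestamp in one place.
import abc
import time
from dataclasses import dataclass

@dataclass
class SensorSample:
    timestamp: float   # seconds from a monotonic clock
    values: tuple      # raw measurement vector

class SensorDriver(abc.ABC):
    """Vendor-neutral interface the layers above code against."""

    @abc.abstractmethod
    def read_raw(self) -> tuple:
        """Vendor-specific bus access (SPI, CAN, UDP, ...)."""

    def read(self) -> SensorSample:
        # Time-stamping happens inside the HAL, so every layer above
        # sees a uniformly stamped, ordered data stream.
        return SensorSample(timestamp=time.monotonic(), values=self.read_raw())

class FakeImuDriver(SensorDriver):
    """Stand-in for a real IMU driver; returns a fixed reading."""
    def read_raw(self) -> tuple:
        return (0.0, 0.0, 9.81)  # placeholder accelerometer values

if __name__ == "__main__":
    imu: SensorDriver = FakeImuDriver()
    print(imu.read())  # SensorSample(timestamp=..., values=(0.0, 0.0, 9.81))
</code>

Swapping ''FakeImuDriver'' for a real vendor driver changes nothing above the HAL, which is exactly the portability property described here.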
| + | |||
| + | **Operating System (OS) and Virtualisation Layer** | ||
| + | The OS layer manages hardware resources, process scheduling, and interprocess communication (IPC) as well as real-time operation, alert and trigger raising using watchdog processes. Here, data processing parallelisation is one of the keys to ensuring resources for time-critical applications. | ||
| + | Autonomous systems often use: | ||
| + | * Linux (Ubuntu or Yocto-based) for flexibility. | ||
| + | * Real-Time Operating Systems (RTOS) like QNX or VxWorks for safety-critical timing. | ||
| + | * Containerization (Docker, Podman) for software isolation and modular deployment. | ||
| + | Time-Sensitive Networking (TSN) extensions and PREEMPT-RT patches ensure deterministic scheduling for mission-critical tasks ((Baruah, S., Baker, T. P., & Burns, A. (2012). Real-time scheduling theory: A historical perspective. Real-Time Systems, 28(2–3), 101–155)). | ||
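
As a small illustration of deterministic scheduling on a PREEMPT-RT Linux kernel, the sketch below moves the calling process to the ''SCHED_FIFO'' real-time policy using Python's standard ''os'' module. The priority value is an arbitrary assumption; the call is Linux-specific and needs root or ''CAP_SYS_NICE''.

<code python>
# Hypothetical sketch: promote this process to the SCHED_FIFO real-time
# scheduling class (priorities 1-99, higher preempts lower).
import os

def make_realtime(priority: int = 80) -> None:
    param = os.sched_param(priority)
    os.sched_setscheduler(0, os.SCHED_FIFO, param)  # 0 = calling process

if __name__ == "__main__":
    try:
        make_realtime()
        print("SCHED_FIFO priority:", os.sched_getparam(0).sched_priority)
    except PermissionError:
        print("raising the scheduling class needs root or CAP_SYS_NICE")
</code>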
| + | |||
| + | **Middleware / Communication Layer** | ||
| + | The middleware layer serves as the data backbone of the autonomy stack. It manages communication between distributed software modules, ensuring real-time, reliable, and scalable data flow. IN some of the mentioned architectures middleware is the central distinctive feature of the architecture. | ||
| + | Popular middleware technologies: | ||
| + | * ROS 2 (Robot Operating System 2): Uses DDS (Data Distribution Service) for modular publish–subscribe communication. | ||
| + | * DDS (Data Distribution Service): Industry-standard middleware for real-time, QoS-based data exchange. | ||
| + | * CAN, Ethernet, MQTT: Used for in-vehicle and external communication. | ||
| + | * Logging and telemetry systems (ROS bags, DDS recorders). | ||
| + | * Fault detection and recovery mechanisms. | ||
| + | * QoS (Quality of Service) configuration for bandwidth and latency management. | ||
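
The sketch below shows what such a QoS configuration looks like in practice with ROS 2 (''rclpy''): a publisher that trades delivery guarantees for latency via a best-effort, keep-last profile. The topic name, message type, rate, and QoS values are illustrative choices, not prescribed by ROS 2.

<code python>
# Minimal ROS 2 publisher with an explicit QoS profile (illustrative values).
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, QoSReliabilityPolicy, QoSHistoryPolicy
from std_msgs.msg import String

class TelemetryPublisher(Node):
    def __init__(self):
        super().__init__('telemetry_publisher')
        qos = QoSProfile(
            reliability=QoSReliabilityPolicy.BEST_EFFORT,  # prefer freshness
            history=QoSHistoryPolicy.KEEP_LAST,
            depth=10,                                      # buffer 10 messages
        )
        self.pub = self.create_publisher(String, 'vehicle/telemetry', qos)
        self.timer = self.create_timer(0.1, self.tick)     # 10 Hz

    def tick(self):
        msg = String()
        msg.data = 'heartbeat'
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(TelemetryPublisher())

if __name__ == '__main__':
    main()
</code>

A sensor stream would typically use ''BEST_EFFORT'' as above, while safety-relevant commands would use ''RELIABLE'' delivery instead.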
| + | |||
| + | **Control & Execution Layer** | ||
| + | The control layer translates planned trajectories into actuator commands while maintaining vehicle stability. | ||
| + | It closes the feedback loop between command and sensor response. | ||
| + | Key modules: | ||
| + | * Low-level control: PID, LQR, or MPC controllers for steering, throttle, braking, or propulsion. | ||
| + | * High-level control: Converts trajectory plans into real-time setpoints. | ||
| + | * State estimation: Uses Kalman filters or observers to correct control deviations. | ||
| + | Safety-critical systems often employ redundant controllers and monitor nodes to prevent hazardous conditions ((Broy, M., et al. (2021). Modeling Automotive Software and Hardware Architectures with AUTOSAR. Springer)). | ||
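
To illustrate low-level control, here is a minimal discrete PID controller; the gains, time step, and toy plant are assumptions for demonstration, and a production controller would add saturation limits, anti-windup, and derivative filtering.

<code python>
# Minimal discrete PID sketch for one actuation channel (e.g., speed).
class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        error = setpoint - measurement
        self.integral += error * self.dt                  # accumulate error
        derivative = (error - self.prev_error) / self.dt  # rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

if __name__ == "__main__":
    pid = PID(kp=1.2, ki=0.1, kd=0.05, dt=0.01)  # 100 Hz control loop
    speed = 0.0
    for _ in range(5):
        command = pid.update(setpoint=10.0, measurement=speed)
        speed += 0.1 * command                   # toy plant response
        print(f"command={command:.2f}  speed={speed:.2f}")
</code>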
| + | |||
| + | **Autonomy Intelligence Layer** | ||
| + | This is the core of decision-making in the stack. It consists of several interrelated subsystems: | ||
| + | |||
| + | ^ Subsystem ^ Function ^ Example Techniques / Tools ^ | ||
| + | | Perception | Detect and classify objects, lanes, terrain, or obstacles. | CNNs, LiDAR segmentation, | ||
| + | | Localization | Estimate position relative to a global or local map. | SLAM, GNSS, Visual Odometry, EKF. | | ||
| + | | Planning | Compute feasible, safe paths or behaviours. | A*, D*, RRT*, Behavior Trees. | | ||
| + | | Prediction | Provide the environmental behaviour forecast. Usually, it provides an internal dynamics forecast as well. | Recurrent Neural Networks, Bayesian inference. | | ||
| + | | Decision-making | Choose actions based on mission goals and context. | Finite State Machines, Reinforcement Learning. | | ||
| + | |||
| + | These components interact through middleware and run either on edge computers (onboard) or cloud-assisted systems for extended processing ((LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444.)). | ||
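
To illustrate the state-estimation entries in the table, the following is a toy one-dimensional Kalman filter (with ''numpy'') that fuses noisy position fixes, such as GNSS, with a constant-velocity motion model. All matrices and measurements are made-up values, not tuned for a real vehicle.

<code python>
# Toy linear Kalman filter: state = (position, velocity), measure position.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition model
H = np.array([[1.0, 0.0]])             # we observe position only
Q = np.eye(2) * 1e-3                   # process noise covariance
R = np.array([[0.5]])                  # measurement noise covariance

def kf_step(x, P, z):
    # Predict with the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Correct with the measurement.
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

x, P = np.zeros((2, 1)), np.eye(2)
for z in (1.0, 1.9, 3.1):              # noisy position fixes
    x, P = kf_step(x, P, np.array([[z]]))
print(f"position ~ {x[0, 0]:.2f}, velocity ~ {x[1, 0]:.2f}")
</code>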
| + | |||
| + | **Application & Cloud Layer** | ||
| + | At the top of the stack lies the application layer, which extends autonomy beyond individual vehicles: | ||
| + | * Fleet management (monitoring, | ||
| + | * Over-the-air (OTA) updates for software and firmware. | ||
| + | * Cloud-based simulation and analytics. | ||
| + | * Data collection for machine learning retraining. | ||
| + | Frameworks like AWS RoboMaker, NVIDIA DRIVE Sim, and Microsoft AirSim bridge onboard autonomy with cloud computation. | ||
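
As a sketch of how a vehicle might report status to a fleet-management backend, the example below publishes a JSON telemetry message over MQTT, one of the protocols listed earlier. The broker address, topic layout, and vehicle ID are placeholders, and the code assumes the ''paho-mqtt'' 1.x client API.

<code python>
# Hypothetical fleet-telemetry uplink over MQTT (placeholder endpoint).
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("fleet-broker.example.com", 1883)  # placeholder broker
client.loop_start()                               # background network thread

telemetry = {
    "vehicle_id": "AV-042",                       # hypothetical vehicle ID
    "timestamp": time.time(),
    "battery_pct": 87.5,
    "mode": "autonomous",
}
client.publish("fleet/AV-042/telemetry", json.dumps(telemetry), qos=1)

client.loop_stop()
client.disconnect()
</code>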
| + | |||
| + | ===== Data Flow in the Autonomy Software Stack ===== | ||
| + | |||
| + | Autonomy systems rely on data pipelines that move information between layers in real time. | ||
| + | |||
| + | <figure Data Flow in an Autonomy Software Stack > | ||
| + | {{ : | ||
| + | < | ||
| + | </ | ||
| + | |||
| + | Each stage includes feedback loops to ensure error correction and safety monitoring ((Thrun, S. (2010). Toward robotic cars. Communications of the ACM, 53(4), 99–106)) ((Raj, A., & Saxena, P. (2022). Software architectures for autonomous vehicle development: | ||
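
One minimal way to picture this pipeline in code: each stage consumes the previous stage's time-stamped output from a queue, and a watchdog check implements the safety-monitoring feedback by rejecting data that has grown too stale. Stage names and the staleness bound are illustrative assumptions.

<code python>
# Sketch of a layered data pipeline with a staleness watchdog.
import queue
import time

MAX_AGE_S = 0.2  # illustrative safety bound on perception age

sensor_q: "queue.Queue[tuple]" = queue.Queue()
percept_q: "queue.Queue[tuple]" = queue.Queue()

def sense():
    # HAL/OS layers: emit a time-stamped raw frame.
    sensor_q.put((time.monotonic(), "raw_lidar_frame"))

def perceive():
    # Intelligence layer: turn raw data into obstacles, keep the stamp.
    stamp, frame = sensor_q.get()
    percept_q.put((stamp, f"obstacles_from({frame})"))

def plan_and_act():
    # Control layer: refuse to act on stale data (feedback/safety path).
    stamp, obstacles = percept_q.get()
    if time.monotonic() - stamp > MAX_AGE_S:
        print("stale perception -> commanding safe stop")
    else:
        print("planning against", obstacles)

sense()
perceive()
plan_and_act()
</code>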
| + | |||
| + | ===== Example Implementations ===== | ||
| + | |||
| + | **ROS 2-Based Stack (Research and Prototyping)** | ||
| + | * Used in academic and industrial R&D. | ||
| + | * Flexible and modular, ideal for simulation and experimental platforms. | ||
| + | * Integration with Gazebo, RViz, and DDS middleware. | ||
| + | |||
| + | **AUTOSAR Adaptive Platform (Automotive)** | ||
| + | * Industry-grade framework for production vehicles. | ||
| + | * Service-oriented architecture with real-time OS and safety mechanisms. | ||
| + | * Supports ISO 26262 compliance and multi-core systems. | ||
| + | |||
| + | **MOOS-IvP (Marine Autonomy)** | ||
| + | * Middleware focused on marine robotics. | ||
| + | * Behaviour-based architecture with mission planning (IvP Helm). | ||
| + | * Optimised for low-bandwidth communication and robustness ((Benjamin, M. R., Curcio, J. A., & Leonard, J. J. (2012). MOOS-IvP autonomy software for marine robots. Journal of Field Robotics, 29(6), 821–835)). | ||
| + | |||
| + | **Hybrid Cloud-Edge Architectures** | ||
| + | * Combine onboard autonomy with cloud processing (for model training or high-level optimisation). | ||
| + | * Used in large-scale fleet operations (e.g., logistics robots, aerial mapping). | ||
| + | * Requires secure communication channels and data orchestration ((Wang, L., Xu, X., & Nee, A. Y. C. (2022). Digital twin-enabled integration in manufacturing. CIRP Annals, 71(1), 105–128)). | ||
| + | |||
| + | ===== Layer Interaction Example – Autonomous Vehicle ===== | ||
| + | |||
| + | <figure Simplified Interaction Example | ||
| + | {{ : | ||
| + | < | ||
| + | </ | ||
| + | |||
| + | This closed-loop data exchange ensures real-time responsiveness, | ||
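
The toy loop below condenses the interaction in the figure into a few lines: perception produces an obstacle distance, planning derives a target speed, and a deliberately simplified proportional-only controller actuates. Every number and rule is an illustrative assumption.

<code python>
# Toy closed-loop tick: perceive -> plan -> control -> actuate.
def perceive() -> float:
    return 12.0                          # metres to nearest obstacle (stub)

def plan(distance: float) -> float:
    return min(10.0, distance - 5.0)     # keep a 5 m buffer (toy rule)

def control(target: float, speed: float) -> float:
    return 0.5 * (target - speed)        # proportional-only stand-in

speed = 0.0
for _ in range(3):                       # three control ticks
    target = plan(perceive())
    speed += control(target, speed)      # actuation integrates the command
    print(f"target={target:.1f} m/s  speed={speed:.2f} m/s")
</code>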
| + | |||
| + | |||
| + | |||