en:ros:simulations:iseauto (revision 2021/04/07 10:57 by raivo.sell)
====== Self-driving Vehicle ======
Self-driving vehicles can fall into many different categories and levels of automation, as described in previous chapters. One of the higher-level self-driving vehicles available nowadays is a Level 4 self-driving robot bus or Automated Vehicle (AV shuttle), also called a last-mile autonomous vehicle. The following describes an AV shuttle designed and developed at Tallinn University of Technology in cooperation with industrial partners in Estonia.
===== Vehicle Description =====
The iseAuto project was a cooperation between industry and university with a range of objectives on both sides as well as a very practical outcome. The project started in June 2017, when TalTech and Silberauto Estonia agreed to jointly develop a self-driving bus, which had its public demonstration in September 2018. The purpose from the company's side was to get involved with self-driving technology, to stay aware of the future of the automotive industry, and to gain experience in manufacturing a special-purpose car body, as that is one of the main activities of one of the company's branches.
{{ : }}
**Vehicle specifications**
  * Main engine power 47 kW
  * Battery pack 16 kWh
**Sensors:**
  * Front-top LiDAR Velodyne VLP-32
  * Back-top LiDAR Velodyne VLP-16
  * Side LiDAR RoboSense
  * Front-bottom LiDAR RoboSense 16
  * Cameras
{{ : }}
The iseAuto 3D model and its Lidar sensors are illustrated in the above figure. The Velodyne VLP-32 and VLP-16 are installed at the top front and top back of the vehicle, respectively. Furthermore, RoboSense 16 Lidars are mounted at the sides and at the front bottom of the vehicle.
**Software:**
===== Simulating the Self-driving Vehicle =====
TalTech iseAuto has a simulation model which can be run in the Gazebo simulator.
{{ : }}
On the other hand, CARLA and SVL are modern open-source simulators based on the game engines Unreal and Unity, respectively.
{{ : }}
The above figure shows a full map of the simulation workflow and the relation between Autoware and the simulator. Vehicle 3D models are prepared in several steps.
The 3D mesh model of the vehicle should be imported into Unity to define physics components such as the collider and wheel actuation, and to assign other features such as lights and materials for appearance. Finally, the built vehicle exported from Unity is used in the simulator. Later, all the sensor configurations are defined via a JSON file inside the simulator.
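To illustrate what such a sensor configuration might look like, the sketch below builds a small JSON fragment in the general style of SVL-type sensor definitions. All names, topics, and parameter values here are hypothetical placeholders, not the actual iseAuto configuration.

```python
import json

# Hypothetical sensor configuration in the spirit of an SVL-style JSON file.
# Sensor names, topics, and transform values are illustrative only.
sensors = [
    {
        "type": "Lidar",
        "name": "FrontTopLidar",  # corresponds to the Velodyne VLP-32 position
        "params": {"LaserCount": 32, "RotationFrequency": 10,
                   "Topic": "/points_raw"},
        "transform": {"x": 0.0, "y": 2.3, "z": 1.5,
                      "pitch": 0, "yaw": 0, "roll": 0},
    },
    {
        "type": "Lidar",
        "name": "BackTopLidar",  # corresponds to the Velodyne VLP-16 position
        "params": {"LaserCount": 16, "RotationFrequency": 10,
                   "Topic": "/points_raw_back"},
        "transform": {"x": 0.0, "y": 2.3, "z": -1.5,
                      "pitch": 0, "yaw": 180, "roll": 0},
    },
]

# Serialize to the JSON text the simulator would read.
config_json = json.dumps(sensors, indent=2)
print(config_json)
```

The point of keeping sensors in a separate JSON file is that lidar counts, topics, and mounting transforms can be changed without rebuilding the Unity vehicle asset.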
==== Virtual Environment Creation ====
In order to create a virtual environment as a testbed for simulating the autonomous vehicle, the aerial mapping approach was used to map the desired environment.
| + | |||
{{ : }}
| + | |||
Taking aerial photos is one of the most important steps in the mapping process, as it significantly affects the outcome and the amount of work needed to process the images. The images taken are georeferenced by the drone: the onboard IMU records the orientation of each picture so that the pictures can later be stitched together and used for photogrammetric processing. Third-party software aligns the captured pictures and creates a dense point cloud from them. Once the dense point cloud is created, the points must be segmented and classified in order to separate unwanted objects and vegetation from the point cloud data. The below figure shows the three main steps to generate the Unity terrain from geospatial data.
{{ : }}
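The segmentation and classification step described above can be sketched in a very reduced form: separating near-ground returns from above-ground returns (vegetation, buildings) by a height threshold. This is only a conceptual illustration, assuming points as plain (x, y, z) tuples; real photogrammetry pipelines use far more robust classification methods.

```python
# Minimal sketch of point-cloud classification: split points into ground
# and above-ground sets by height. Illustrative only, not the actual
# photogrammetric processing used in the project.

def classify_points(points, ground_level=0.0, tolerance=0.3):
    """Split (x, y, z) points into ground and above-ground lists."""
    ground, above = [], []
    for x, y, z in points:
        if z <= ground_level + tolerance:
            ground.append((x, y, z))
        else:
            above.append((x, y, z))
    return ground, above

# A tiny hypothetical cloud: two low (ground) points, two high ones.
cloud = [(0.0, 0.0, 0.1), (1.0, 2.0, 0.2), (1.5, 2.1, 3.4), (2.0, 0.5, 5.0)]
ground, above = classify_points(cloud)
print(len(ground), len(above))  # prints "2 2"
```

In a real pipeline the "ground" set would feed the Unity terrain mesh, while the above-ground points would be inspected and cleaned (e.g. removing vegetation) before modelling static obstacles.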
Now the final map built in Unity is ready to be loaded into the SVL simulator.
==== Run the Simulation ====
{{ : }}
The above-left figure shows the vehicle in the simulation environment.
{{ : }}
==== Summary ====
Simulation is an increasingly important aspect of developing robots and intelligent vehicles such as self-driving cars. Its importance grows in parallel with the complexity of control algorithms and the integration of mechatronic systems in the vehicle. Simulation drastically reduces development cost and increases the overall safety of the end result. Early prototypes could easily cause accidents if testing were done directly in real traffic; but if thousands of miles are first driven in simulation, and most of the software bugs and electromechanical incompatibilities are eliminated there, it is much safer to start open-road tests. Therefore, simulation is practically a must in today's self-driving vehicle development.