====== Outdoor Mobile Robot ======
Outdoor mobile robot UKU is a mid-sized self-driving vehicle for educational and research purposes. A //Gazebo// simulator model has been created for experimenting with the UKU robot. This allows anyone, no matter where they are, to program the robot; if the program runs correctly in the simulator, it can then simply be run on the real robot.
**LiDAR**
The UKU's main sensor is a lidar. Lidars are very good sensors for perceiving the surroundings.
^ Velodyne VLP-16 Specification ^^
^ Spec. ^ Value ^
| **Channels** | 16 |
| **Measurement Range** | up to 100 m |
| **Range Accuracy** | ±3 cm (typical) |
| **Rotation Rate** | 5–20 Hz |
Using its laser beams, the lidar obtains a point cloud of the surroundings.
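To make the idea of a point cloud concrete, here is a minimal sketch (not code from the UKU project) of how a single VLP-16-style return — a measured range plus the beam's azimuth and elevation angles — maps to a 3D point. The channel-to-elevation mapping below is the sorted list of the sensor's published beam angles, not the sensor's internal firing order:

```python
import math

# The VLP-16's 16 channels cover -15 to +15 degrees of elevation
# in 2-degree steps (a published property of this sensor).
VLP16_ELEVATIONS_DEG = [-15 + 2 * ch for ch in range(16)]

def beam_to_point(distance_m, azimuth_deg, channel):
    """Convert one lidar return to a Cartesian (x, y, z) point.

    distance_m  -- measured range along the beam
    azimuth_deg -- horizontal rotation angle of the beam
    channel     -- which of the 16 lasers fired (0..15)
    """
    elevation = math.radians(VLP16_ELEVATIONS_DEG[channel])
    azimuth = math.radians(azimuth_deg)
    horizontal = distance_m * math.cos(elevation)  # projection onto the ground plane
    x = horizontal * math.cos(azimuth)
    y = horizontal * math.sin(azimuth)
    z = distance_m * math.sin(elevation)
    return (x, y, z)

# One full rotation of all 16 beams yields thousands of such points,
# which together form the point cloud.
print(beam_to_point(10.0, 0.0, 7))
```

Repeating this conversion for every beam and every azimuth step of a rotation is what produces the cloud shown below.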
| {{ : | {{ : | ||
The //Velodyne VLP-16// is a comparatively expensive sensor; cheaper alternatives exist: //RPlidar A1// costs around 100 € and //RPlidar A3// around 600 €. Lidars differ mainly in their resolution and measurement frequency.
| + | {{ : | ||
| - | {{ : | + | //RPlidar// has a 360 ° field of view but has only one channel. Therefore, the output of the Lidar is two-dimensional. Such Lidars are a good choice for navigating indoors-where walls and other vertical objects are obstructed. |
| - | {{ : | + | |
| - | //RPlidar// has a 360 ° field of view but has only one channel. Therefore, the output of the lidar is two-dimensional. Such lidars are a good choice for navigating indoors where walls and other vertical objects are obstructed. | + | {{ : |
| - | {{ : | + | In the Gazebo simulator, it is also possible to simulate Lidars. Many Lidar manufacturers have published Lidar models (URDF) on the Internet, which makes adding a Lidar to their robot model much easier. The official //Velodyne VLP-16// URDF model is also used in the UKU robot simulation. |
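For illustration, this is roughly how such a published model is included in a robot description with //xacro//. This is a generic sketch, not the actual UKU description: the package (''velodyne_description'' from the velodyne_simulator project), the link name, and the mounting origin below are assumptions that must match your own robot model:

```xml
<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="uku_example">
  <!-- Assumption: the velodyne_description package is available
       in the workspace. -->
  <xacro:include filename="$(find velodyne_description)/urdf/VLP-16.urdf.xacro"/>

  <link name="base_link"/>

  <!-- Mount the simulated VLP-16 on base_link; the origin values
       are placeholders, not the real UKU mounting position. -->
  <xacro:VLP-16 parent="base_link" name="velodyne"
                topic="/velodyne_points" hz="10" samples="440">
    <origin xyz="0 0 0.6" rpy="0 0 0"/>
  </xacro:VLP-16>
</robot>
```

With an include like this, the simulated lidar publishes its point cloud on the given topic inside Gazebo, just as the real sensor would on the robot.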
===== UKU Simulation =====

The necessary models and libraries for experimenting with the robot have been created to simulate the robot in //Gazebo//. For convenience, everything needed is provided in a single repository.
Let's start by cloning the repository:

  $ sudo apt install git
  $ git clone http://
Install the required libraries and compile:

  $ rosdep install --from-paths src --ignore-src -r -y
  $ catkin_make
  $ source devel/setup.bash
So that we don't have to source the workspace in every new terminal window, we can add the corresponding ''source'' command (with the full path to ''devel/setup.bash'') to the //.bashrc// file.
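For example, assuming the workspace was cloned and built under ''~/catkin_ws'' (an assumed path — use the actual location of your workspace), the line can be appended like this:

```shell
# ~/catkin_ws is an assumed example path; replace it with wherever
# you cloned and built the workspace.
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
```

Every new shell will then source the workspace automatically.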
  $ roslaunch uku_vehicle_gazebo uku_vehicle.launch
The //Gazebo// simulator and //Rviz// should now open, and the UKU robot should be visible in the //Gazebo// virtual world. Using the //Gazebo// interface, we can add objects to the virtual world.
| {{ : | {{ : | ||
A robot remote control unit is also included with the workspace. Launch the remote control node in another window:
  $ rosrun ackermann_drive_teleop keyop.py
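The teleop node drives the vehicle with Ackermann-style commands, i.e. a speed and a steering angle rather than independent wheel velocities. As a hedged illustration of the underlying bicycle-model geometry — the wheelbase value below is a made-up placeholder, not the real UKU dimension — the relation between steering angle and turning radius can be sketched as:

```python
import math

WHEELBASE_M = 1.2  # assumed placeholder, NOT the real UKU wheelbase

def steering_angle_for_radius(radius_m, wheelbase_m=WHEELBASE_M):
    """Bicycle-model steering angle (rad) that yields the given turning radius."""
    return math.atan(wheelbase_m / radius_m)

def turning_radius_for_angle(steering_rad, wheelbase_m=WHEELBASE_M):
    """Inverse relation: turning radius produced by a given steering angle."""
    return wheelbase_m / math.tan(steering_rad)

# Steering angle needed to drive a circle of 5 m radius,
# and the radius recovered from that angle.
angle = steering_angle_for_radius(5.0)
print(math.degrees(angle))
print(turning_radius_for_angle(angle))
```

This is why an Ackermann command message carries a steering angle: together with the vehicle's wheelbase it fully determines the turning circle.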
The launch file also started the //Rviz// visualization tool. In the //Rviz// window we can see the 3D lidar point cloud and the robot model.
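On the robot this cloud arrives as a ROS ''sensor_msgs/PointCloud2'' message; a typical first processing step is to find the nearest obstacle. The following is a simplified, self-contained sketch over a plain list of points, ignoring ROS message parsing entirely; the height-band limits are illustrative values, not tuned parameters from the UKU project:

```python
import math

def nearest_obstacle(points, min_z=0.1, max_z=2.0):
    """Return the smallest horizontal distance to any point whose height
    falls inside [min_z, max_z], or None if no such point exists.

    points -- iterable of (x, y, z) tuples in the robot's frame.
    The z-band filter discards ground returns below min_z and
    overhanging structures above max_z.
    """
    best = None
    for x, y, z in points:
        if min_z <= z <= max_z:
            d = math.hypot(x, y)  # planar distance from the robot
            if best is None or d < best:
                best = d
    return best

cloud = [(5.0, 0.0, -0.05),   # ground return, filtered out
         (3.0, 4.0, 0.5),     # obstacle 5 m away
         (1.0, 1.0, 1.0)]     # obstacle about 1.41 m away
print(nearest_obstacle(cloud))
```

A real node would run this kind of filter on every incoming cloud and, for example, slow the vehicle down when the result drops below a safety threshold.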
| {{ : | {{ : | ||