{{ :
The most recent and most developed version of the TurtleBot is version 3. TurtleBot3 is made up of modular plates, so users can customize its shape. It is available in three types: the small-size Burger and the medium-size Waffle and Waffle Pi. TurtleBot3 consists of a base, two Dynamixel motors, a 1,800 mAh battery pack, a 360-degree LIDAR, a camera (a RealSense camera for the Waffle kit, a Raspberry Pi Camera for the Waffle Pi kit), an SBC (single-board computer: Raspberry Pi 3 or Intel Joule 570x), and a hardware mounting kit that attaches everything together and allows adding further sensors. TurtleBot3 was released in May 2017.
{{ :
| + | |||
===== Sensors =====
The main sensor of the TurtleBot is the 360 Laser Distance Sensor LDS-01. The LDS-01 is a 2D laser scanner capable of sensing 360 degrees that collects a set of data around the robot to use for SLAM (Simultaneous Localization and Mapping), as shown in the figure below.
{{ :
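Once a robot (real or simulated) is running, the raw laser data can be inspected directly from a terminal; the topic name /scan is the TurtleBot3 default and is assumed here:

# Print a single LaserScan message from the LDS
$ rostopic echo -n 1 /scan
# Check how fast the scan topic is being published
$ rostopic hz /scan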
The 3D camera is one of the most versatile robot sensors. One output of a 3D camera is a 2D camera image, which means that various object recognition algorithms can be used. Many machine vision libraries are available for ROS. One of the most widely used and versatile is [[https://
Objects will be identified and a box will be drawn around them, with the name of the object type identified:
===== TurtleBot3 Simulation =====
Everything you need to simulate a TurtleBot robot in Gazebo is freely available on the Internet.
==== Install Dependent ROS 1 Packages ====
First of all, you need to install some dependencies. These are based on Ubuntu 18.04 and ROS Melodic.
$ sudo apt-get install ros-melodic-joy ros-melodic-teleop-twist-joy \
==== Install Simulation Package ====
The TurtleBot3 Simulation Package requires //
$ cd ~/
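A typical sequence for this step, assuming a standard catkin workspace at ~/catkin_ws and the ROBOTIS-GIT/turtlebot3_simulations repository, looks roughly like this:

# Clone the simulation package into the workspace (workspace path and branch name are assumptions)
$ cd ~/catkin_ws/src/
$ git clone -b melodic-devel https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
# Build the workspace and source the new packages
$ cd ~/catkin_ws && catkin_make
$ source ~/catkin_ws/devel/setup.bash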
==== Set TurtleBot3 Model Name ====
Set the default TURTLEBOT3_MODEL name to your model. Enter the below command in a terminal.
In case of TurtleBot3 Burger:
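A typical way to do this, with burger as the value (use waffle or waffle_pi for the other models), is:

# Set the model for the current shell
$ export TURTLEBOT3_MODEL=burger
# Optionally make the setting permanent
$ echo "export TURTLEBOT3_MODEL=burger" >> ~/.bashrc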
{{ :
In case you need to build your own environment, you can do that directly in the Gazebo Simulator.
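One way to start with a blank world and add objects by hand is the empty-world launch file of the turtlebot3_gazebo package (a sketch, assuming the model name was set as above):

# Launch Gazebo with the TurtleBot3 in an empty world
$ roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch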
{{ :
In the Gazebo Simulator, you can add simulation objects from the menu by clicking on the desired object. You'll also find tools for moving, enlarging, and rotating objects in the same place.
{{ :
Next, we try to control the robot remotely.
==== Operate TurtleBot3 ====
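The keyboard teleoperation node is started in a new terminal; it is the same launch file that is used again in the SLAM section below:

# Keyboard teleoperation; keep this terminal focused so the key presses are registered
$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch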
Using the keyboard, we should now see how the TurtleBot moves in the simulation.
===== Visualize Simulation Data (RViz) =====
RViz visualizes published topics while the simulation is running. You can launch RViz in a new terminal window by entering the below command.
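With the TurtleBot3 simulation packages, this is typically the preconfigured RViz launch file (the exact launch file name is an assumption here):

# Open RViz with a configuration for the topics published by the simulation
$ roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch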
**Run SLAM Node**
Open a new terminal and run the SLAM node. Gmapping is used as the default SLAM method.
$ roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:
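With the default Gmapping method spelled out, the call would typically look like this:

# Launch the SLAM node using Gmapping
$ roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:=gmapping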
**Run Teleoperation Node**
Open a new terminal with Ctrl + Alt + T and run the teleoperation node. Move the robot and take a look at RViz: you are scanning the area with the laser sensor and creating a map out of it.
$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
==== Navigation Simulation ====
Just like the SLAM in the Gazebo simulator, you can select or create various environments and robot models in a virtual Navigation world. However, a proper map has to be prepared before running the Navigation.
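The map is built in the SLAM section above and is usually saved with the map_server package; a typical invocation (the output path ~/map is only an example) is:

# Save the map built during SLAM to ~/map.yaml and ~/map.pgm
$ rosrun map_server map_saver -f ~/map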
**1. Launch Simulation World**
Stop all previously launched files and running nodes with "Ctrl + C". The same Gazebo world as in the previous SLAM section is used for the Navigation:
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch
**2. Run Navigation Node**
$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/
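Assuming the map was saved to the home directory as map.yaml (as in the map_saver example above), the complete command would typically look like this:

# Start the navigation stack (AMCL, move_base) and RViz with the saved map (path is an assumption)
$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml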
**3. Estimate Initial Pose**
Initial Pose Estimation must be performed before defining the destination for autonomous navigation, as this process initializes the AMCL parameters that are critical for Navigation. TurtleBot3 has to be correctly located on the map, with the LDS sensor data neatly overlapping the displayed map.
3.1. Click the 2D Pose Estimate button in the RViz menu.
{{ :
3.2. Click on the map where the actual robot is located and drag the large green arrow toward the direction where the robot is facing.
{{ :
You can repeat steps 3.1 and 3.2 until the LDS sensor data is overlaid on the saved map.
{{ :en:
NB! 2D Pose Estimate actually publishes the position and direction to the /initialpose topic, so you can also publish it from the command line or a script.
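For example, a rough command-line sketch for publishing an initial pose (the pose values are placeholders and depend on where the robot actually is on your map):

# Publish one PoseWithCovarianceStamped message on /initialpose (placeholder pose at the map origin)
$ rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
'{ header: { frame_id: "map" }, pose: { pose: { position: { x: 0.0, y: 0.0, z: 0.0 }, orientation: { x: 0.0, y: 0.0, z: 0.0, w: 1.0 } } } }'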
**4. Set Navigation Goal**
4.1. Click the 2D Nav Goal button in the RViz menu.
{{ :en:
4.2. Click on the map to set the destination of the robot and drag the green arrow toward the direction where the robot will be facing.
{{ :en:
  * This green arrow is a marker that can specify the destination of the robot.
  * The root of the arrow is the x, y coordinate of the destination, and the direction of the arrow sets the orientation θ.
  * As soon as x, y, θ are set, TurtleBot3 will start moving to the destination immediately.
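Just like the initial pose, a goal can also be sent from the command line instead of RViz; a sketch assuming move_base's standard /move_base_simple/goal topic (coordinates are placeholders):

# Send a navigation goal one meter ahead along the map's x axis
$ rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped \
'{ header: { frame_id: "map" }, pose: { position: { x: 1.0, y: 0.0, z: 0.0 }, orientation: { x: 0.0, y: 0.0, z: 0.0, w: 1.0 } } }'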
{{ :en: