Main

Wheel Robot1 with ROS2 Humble - SLAM Demo

I started exploring ROS 2 (Robot Operating System 2) in 2023, hoping to use it to scale up my robotic systems and to incorporate the SLAM (simultaneous localization and mapping) libraries developed by the ROS community. Nothing fancy this time: just a standard differential drive wheel robot with a caster. The robot controller is a custom-designed board based on a dsPIC33CK256MP506 microcontroller, and a Raspberry Pi 4B (4 GB RAM) serves as the application processor, running ROS2 Humble LTS on Ubuntu Linux.

Code and design files: https://github.com/fabiankung/ROS2_WheelRobot_V1

Some resources I referred to while building this robot:
- General ROS2 concepts, including those needed for navigation (Humble): https://docs.ros.org/en/humble/index.html
- Learning Navigation2: https://navigation.ros.org/
- Learning URDF: http://wiki.ros.org/urdf/XML/model
- Sharing data over the network in ROS2: https://articulatedrobotics.xyz/ready-for-ros-2-networking/
- TF2 for ROS2: https://articulatedrobotics.xyz/ready-for-ros-6-tf/
- Official TF2 tutorial: http://wiki.ros.org/tf2/Tutorials
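The forward kinematics of a differential drive base like this can be sketched as follows. This is only an illustration, not the actual controller firmware; the wheel radius and track width are made-up placeholder values, not the robot's real dimensions.

```python
# Differential drive forward kinematics (sketch only).
WHEEL_RADIUS = 0.03  # m -- assumed placeholder value
TRACK_WIDTH = 0.15   # m, distance between the two wheels -- assumed placeholder

def body_twist(w_left, w_right, r=WHEEL_RADIUS, b=TRACK_WIDTH):
    """Convert wheel angular velocities (rad/s) into the robot body's
    linear velocity v (m/s) and angular velocity w (rad/s, CCW positive)."""
    v_left = r * w_left    # linear speed of left wheel contact point
    v_right = r * w_right  # linear speed of right wheel contact point
    v = (v_right + v_left) / 2.0  # forward speed of the chassis center
    w = (v_right - v_left) / b    # yaw rate from the wheel speed difference
    return v, w
```

With equal wheel speeds the robot drives straight (`w == 0`); with opposite speeds it spins in place (`v == 0`). The microcontroller would typically run the inverse of this mapping, turning a commanded twist into wheel speed setpoints.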

fkungms

9 days ago

The robot and the computer are connected through a local Wi-Fi network. For the first part, I'm going to create a map of the environment; the robot will be controlled manually through a Bluetooth link. Now I'm going to fire up all the nodes on the robot, as well as the navigation and slam_toolbox nodes. You can see that I also fire up RViz, the graphical user interface for monitoring the state of the robot. White pixels indicate areas that are unoccupied, black means there is an obstacle, and gray means unknown. As the robot moves around, the data from the Lidar is used to build up the map. Now the robot is going back to its initial position, and the map is complete.

For the second part, I'm going to let the robot move autonomously, guided by the map and the data from the Lidar. On the top RViz display you can see small green arrows: these are the locations and orientations we want the robot to pass through, which we call waypoints. By comparing the features obtained from the Lidar with the features in the map, the robot uses probabilistic estimation to make a best guess of its position and orientation within the environment.
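The white/black/gray coloring described above reflects the ROS `nav_msgs/OccupancyGrid` message, whose cells hold -1 for unknown or a value from 0 to 100 giving the probability that the cell is occupied; RViz renders low values white (free), high values black (occupied), and -1 gray (unknown). A minimal sketch of classifying cells, with illustrative (not official) threshold values:

```python
def classify_cell(value, occupied_threshold=65, free_threshold=25):
    """Classify one OccupancyGrid cell by the ROS convention:
    -1 = unknown, 0..100 = probability (in percent) the cell is occupied.
    The two thresholds here are illustrative choices."""
    if value < 0:
        return "unknown"   # rendered gray in RViz
    if value >= occupied_threshold:
        return "occupied"  # rendered black
    if value <= free_threshold:
        return "free"      # rendered white
    return "unknown"       # in-between values: not confidently free or occupied
```

SLAM (here slam_toolbox) fills this grid from the Lidar scans during mapping, and the same grid later serves as the reference map that the Lidar features are matched against during autonomous navigation.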

Comments

@Nidhifactfiles

Can you please make a detailed explanation of V2P1B?

@ErkanUnal.

Have you ever worked with a BLDC motor?