I really love this project and the way you approach this! This is really encouraging for people who want to get into stuff like this. Please don't stop this series any time soon, we need more people like this on YouTube and in general. :)
Thanks! I've been a bit busy over the summer (here in Aus) but there is plenty more coming this year :D
I'm still early in the tutorial series but can't wait to reach this stage. Thank you so much for guiding us through it!
I just found your channel and I love everything you do. I am building an AR4 robot and want to write the software for it myself and your channel is just what I need for this, thank you so much!
Can't believe I've made it this far :) Thanks for the great tutorial!
I would love to see more about ROS-Arduino serial communication and integrating an ultrasonic (ping) sensor into the robot.
Thank you so much for your amazing content!! I really appreciate it!! A video on the robot localization package in the future would be awesome!
Thanks! It is on my maybe-to-do list so...stay tuned?
15:02 Will there be an AMCL tuning guide? Love your videos; they helped me create my ROS robot, but it jumps around a lot when using Nav2.
This is a really cool tutorial on ROS2 Navigation and thanks for your valuable time. Can you also make a video where there are multiple robots in the same environment and all of them can be controlled via namespaces?
Thank you for the amazing tutorials. I would love to see a tutorial on simulating multiple robots in gazebo.
Thank you so much for these videos. Really helpful & inspiring.
Thanks!
THANK YOU SO MUCH ! WAS WAITING FOR THIS FOR A LONG TIME!
Sir, you are really such an inspiration to us ❤
Nice project, really helpful for people who want to learn ROS/robotics. Keep it up!!! I hope the next project is about a robot arm.
Great video! Thank you so much!
Thanks so much for this series, you explain things very well.
Thanks!
Thanks and have a great New Year!
Can you make a video of the robot using GPS instead of SLAM for the map, and using lidar to avoid obstacles?
Or can you give me tips on how to do this?
Love your video so much! Thank you!
Thank you! Very helpful
Sir, nice explanation.
Which SLAM technique is used here?
GMapping, Hector mapping, Karto mapping, or Cartographer?
As a hobbyist who is looking to incorporate non-GPS navigation, are these tools (Nav2, ROS, etc.) developed enough in 2023 that they're a robust layer of abstraction, and not an endless time-sink of environment/toolchain issues?
For me it's showing that the nav2_recoveries package was not found. I have sourced both the workspace and ROS, and I am using ROS 2 Humble. Btw, thanks for these amazing videos ❤
Is it possible to apply a different path-tracking algorithm than Nav2's?
I have a slight confusion here: for this tutorial I'm using the keyboard with /diff_cont/cmd_vel_unstamped rather than a joystick to move the robot. So what should I do to make things work?
Did you find an answer, @afiqariffin7859?
Well done 👏
Thanks!
Thanks for the twist_mux idea! I tried to do emergency control with the joystick via code, but it doesn't work; the Nav2 cmd_vel would just take over, lol. Will try the twist_mux package ASAP.
It's always great to find one little gem that helps :)
Do you have any plans to add the OAK-D Lite instead of the Pi camera?
I watched all the videos and they're really helpful. However, it seems like the entire ROS system is limited to Wi-Fi range. Is there a way to control a robot that is, say, a kilometer away? I have used WifiBroadcast to create a bidirectional link between a ground station and a robot to transmit data up to 2 km LOS (using two Raspberry Pi controllers), but there is no ROS involved there. Any suggestions for getting long-distance connections?
Yes. Beaglebone has a new board out that uses sub gigahertz radio and will reach one kilometer. It is not expensive.
@russhall1097 Thank you for the reply. Do you have a reference for it?
Hey, is there something similar for doing navigation based on a 3D map (like for a drone or an underwater vehicle)?
Thank you for your amazing robot 👍🙌
Guys, do you know if the ~$10 AliExpress lidar sensors (the ones for vacuums) are good for this purpose?
Hey, I really enjoy your videos!!!
I have a question about the goal pose: is it possible to send it not from RViz but from the terminal?
Yes it is! I think it might be a PoseStamped on the topic /goal_pose? You can do that with ros2 topic pub ...
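Something like the line below should be close. This is untested and from memory, so double-check the exact message fields; the frame and coordinates are just placeholders:
# example only: publish a single goal pose in the map frame
ros2 topic pub --once /goal_pose geometry_msgs/msg/PoseStamped "{header: {frame_id: 'map'}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}"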
Hi, thank you for these tutorials.
I wanted to know if it's possible to save the target and just launch it from the terminal, and not have to go into RViz to set up a goal?
Please, I use ROS 2 via conda; how do I add the twist_mux package?
I faced a problem with the map: while the robot starts to move, there is a difference between the lidar map and the SLAM map. How can I make them synchronized and shouted repeatedly?
I really love this project. I've now got it mostly completed, but I have one problem when trying to map with the real bot: at that point I changed odom to map, and when I try to run it, the map also moves. What can I do?
I have an issue with this tutorial, on the part about adding components to RViz2. I can add the first map and it works fine, but the next one doesn't; it keeps saying "No map received". The same applies to my cameras and images: at first they worked fine, but now they keep saying "Error subscribing: Empty topic", and yet all the sensors are still configured the same way you did in the previous tutorials. So please help me out here, I've really tried looking for the problem but I really have no clue. When you do things in your videos they work, but they just seem not to work for me.
For the case of the map, it won't even let me select a topic for it to subscribe to; the same applies to the images and camera.
Hey Josh,
Just wanted to ask one question. Is there a way we can provide the goal position in the form of (x, y, z) coordinates, instead of telling it where to go by clicking on the desired destination inside the SLAM map? Does Nav2 provide any functionality for that?
Absolutely! I'm a little rusty, but I think you should be able to publish a message to /goal_pose with ros2 topic pub. I just took a quick look and it looks like they have a ros2 action for it as well.
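From memory, the action route is roughly the line below. Treat it as a sketch and confirm the action name and goal layout with ros2 action list and ros2 interface show, since I'm quoting Nav2's NavigateToPose interface from memory and the coordinates are made up:
# example only: send a NavigateToPose goal to Nav2 from the terminal
ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose "{pose: {header: {frame_id: 'map'}, pose: {position: {x: 2.0, y: 1.0, z: 0.0}, orientation: {w: 1.0}}}}"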
Thanks so much
How does the actual robot know where the ROS 2 navigation stack is on the network, and vice versa? In ROS 1 there was a ROS master, but in ROS 2 there is nothing like that. So what needs to be configured?
Hello sir, I use your repo as an example and I followed all your steps, but my robot can't drive straight when navigating; it just zig-zags as it moves. Can you help me check my robot, sir?
I am doing everything in simulation. My robot is not navigating autonomously after I set the goal location. Is there anything to change in my code? I have commented out twist_mux and the joystick.
I don't have a robot. Is there a way to visually represent the robot in Unity and then use SLAM and ROS?
Did you use a Kalman filter in your tutorial?
Not in this tutorial but maybe in a future one :)
@ArticulatedRobotics Can you tell me what type of filter you used?
@ahmedwael9638 It's a probabilistic localization algorithm called AMCL, which uses a particle filter to estimate the robot's pose. An EKF is totally different from this.
How do I do object detection using ROS 2 Foxy?
How do I test a real Velodyne instead of the simulated one? I mean, which configuration files and/or launch files, URDF, etc. must be changed? I still don't have the robot, but I would like to test whether my VLP-16 can map and visualize its surroundings in RViz (the black and white map). I was already able to see the LaserScan data from my Velodyne on the /scan topic, but it dies when setting the 2D pose along with Nav2. Any hints?
Is twist_mux like rosserial in ROS 1?
No, twist_mux multiplexes several velocity-command topics into a single output, choosing between them by priority. For rosserial, ROS 2 uses the micro-ROS agent instead.
How do I see the velocity?
I am using a keyboard instead of a joystick; what should I do in twist_mux?
Replace the joystick entry with a keyboard entry and set the topic value to /cmd_vel. However, AFAIU, since both navigation and the keyboard publish to the same topic, a twist_mux is not necessary.
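If you do still want the mux (so Nav2 and the keyboard can share control with priorities), a parameters sketch could look roughly like the YAML below; the topic names are assumptions loosely based on this series, so rename them to match your setup:
# hypothetical twist_mux parameters, adjust topic names to your robot
twist_mux:
  ros__parameters:
    topics:
      navigation:
        topic: cmd_vel        # velocity commands coming from Nav2
        timeout: 0.5
        priority: 10
      keyboard:
        topic: cmd_vel_key    # remap your keyboard teleop output here
        timeout: 0.5
        priority: 100         # higher value wins, so the keyboard overrides Nav2
You then run twist_mux with that params file and, I believe, remap its default cmd_vel_out output to whatever your controller listens on (e.g. /diff_cont/cmd_vel_unstamped).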
What OS do you have installed on the Raspberry Pi? Can you make a tutorial on installing ROS 2 on Raspberry Pi OS/Raspbian?
Hi, my current setup is detailed in this tutorial th-cam.com/video/2lIV3dRvHmQ/w-d-xo.html
I usually use Ubuntu MATE. These tutorials have been on 20.04/foxy but I will very soon be upgrading to 22.04/humble and releasing a video detailing any changes
While installing ROS 2 Foxy I'm getting the error "N: repository package not found".
Can you pleassseee help me with this?
Can you implement this using only sonars?
No, they would help but don't give enough data
Does this work with the iRobot Create 2?
Hey there! Working through this tutorial I get Nav2 up and running, but the robot spins relatively fast while moving incredibly slowly, so much so that it barely looks like it's moving. I think it's because my robot is much larger than what's expected in the params folder, so I followed your instructions to copy over the bringup folders and modify them. However, after modifying them, Nav2 aborts bringup when I change the launch location, the step where instead of putting nav2_bringup you put articubot_one. That step does not work for me. Does anyone here have any help or tips? Thanks!
Hello from Korea! I want to run SLAM on my Raspberry Pi 4 and develop a self-driving car with 2D mapping, so I installed Ubuntu 22.04 on the Pi 4, but it is hard to get SLAM running :( I installed ROS 2 Humble on it, and I bought a lidar sensor.
Could you recommend your videos for a beginner?
Thank you.
So good. Slam it!
Yeah!
Are you using ROS2?
Good day. Do you have an email address? How can I write to you?
Hi, many thanks for the tutorials, very clear; they helped me a lot in starting to understand ROS 2.
Unfortunately, I keep getting an error once I change Slam Toolbox to work in localization mode: a saved map is not loading and I keep getting a new map visualization. Moreover, after I changed RViz to work with the map, I get the following error. Any idea? I did not find a hint on Google:
"
Vertex Program:rviz/glsl120/indexed_8bit_image.vert Fragment Program:rviz/glsl120/indexed_8bit_image.frag GLSL link result :
[rviz2-1] active samplers with a different type refer to the same texture image unit
"
I managed to make it work. However, sometimes I need to restart the launcher because of [ERROR] [bt_navigator-4]. This happened when I set a 2D goal pose:
[ERROR] [bt_navigator-4]: process has died [pid 206597, exit code -11, cmd '/opt/ros/foxy/lib/nav2_bt_navigator/bt_navigator --ros-args -r __node:=bt_navigator --params-file /tmp/tmpfvwd9zhh -r /tf:=tf -r /tf_static:=tf_static'].
May I know what this is?
Exact same error I am experiencing.
Did you by chance solve this problem?