Edit the source, add your blob tracking algorithm and I will test it!
Tactical comment for greater exposure. Thanks for the video!
This is awesome, thanks for sharing :)
Thanks a lot for such awesome contents
Thank you for your tutorial! I was able to modify it in a few places to control my TurtleBot Burger!
Sir, please upload a video on path planning and obstacle avoidance using a lidar sensor.
really awesome. Thanks a lot!!!
Any chance that you might review Jetson Nano?
You are the best!
Thanks for sharing. Perfect!
Hi Tiziano, thanks for sharing. If I wanted to design a drone that autonomously tracks the ball, like the car in the video, how should I design the controller part? Can you give me some advice, or do you have any recommended docs?
There are a few examples out there; for instance, there is a follow-red-ball project on ArduPilot.
I'm gunna get it, I'm gunna get it!!!
I am trying to get it !!!!!
Is it possible that, instead of the keyboard node, the /dk_llc_3 node subscribes to a topic published by the three ultrasonic sensors? So that while tracking the blob, the car also avoids obstacles?
If it's possible, how would I do that?
My software's only purpose is to provide an example. Make the main loop also subscribe to the ultrasonic topic, then create a repulsive vector that you sum with the commanded one.
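A rough sketch of that idea in rospy, assuming the three sensors publish sensor_msgs/Range on /ultrasonic/left, /ultrasonic/center and /ultrasonic/right, and that the blob tracker's command arrives as a Twist on /cmd_vel_blob (all the topic names here are placeholders, not the ones from the repo):

#!/usr/bin/env python
# Sketch only: mix a repulsive term from the ultrasonic ranges into the commanded velocity.
# Topic names and the sensor_msgs/Range message type are assumptions to adapt.
import rospy
from sensor_msgs.msg import Range
from geometry_msgs.msg import Twist

SAFE_DIST = 0.5   # [m] start repelling below this range
GAIN      = 1.0   # strength of the repulsive term

class ObstacleMixer(object):
    def __init__(self):
        self.ranges = {'left': 10.0, 'center': 10.0, 'right': 10.0}
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/ultrasonic/left',   Range, self.range_cb, callback_args='left')
        rospy.Subscriber('/ultrasonic/center', Range, self.range_cb, callback_args='center')
        rospy.Subscriber('/ultrasonic/right',  Range, self.range_cb, callback_args='right')
        rospy.Subscriber('/cmd_vel_blob', Twist, self.blob_cb)

    def range_cb(self, msg, side):
        # keep the latest reading from each sensor
        self.ranges[side] = msg.range

    def blob_cb(self, cmd):
        out = Twist()
        out.linear.x = cmd.linear.x
        out.angular.z = cmd.angular.z
        # repulsive vector: steer away from the closer side, slow down for a frontal obstacle
        if self.ranges['left'] < SAFE_DIST:
            out.angular.z -= GAIN * (SAFE_DIST - self.ranges['left'])
        if self.ranges['right'] < SAFE_DIST:
            out.angular.z += GAIN * (SAFE_DIST - self.ranges['right'])
        if self.ranges['center'] < SAFE_DIST:
            out.linear.x *= self.ranges['center'] / SAFE_DIST
        self.cmd_pub.publish(out)

if __name__ == '__main__':
    rospy.init_node('obstacle_mixer')
    ObstacleMixer()
    rospy.spin()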
Great tutorial as always. Do you think it would be possible to run ArUco marker tracking and blob tracking on the same Raspberry Pi?
Give it a shot; otherwise, the beauty of ROS is that you can distribute the computing among multiple boards.
After watching the video,
I realised
how simple life can be,
especially with YouTube.
Hi, could you guide me on what changes are required for person tracking and chasing?
Have you thought about a number plate reader with OpenCV?
Hi sir, I want to build this kind of project, but the objective is to follow a human. Can I use this code, and which parts do I need to change? Thank you.
Well, first you need to detect humans, then you publish the position on a topic, and from there it should be the same.
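For the detection step, OpenCV's built-in HOG people detector could stand in for the colour filter; the sketch below publishes the centre of the first detection as a Point, the same kind of message the blob node produces. The camera topic and /person/point are only guesses, adapt them to your setup.

#!/usr/bin/env python
# Sketch: detect a person with OpenCV's HOG detector and publish the detection centre.
import rospy, cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from geometry_msgs.msg import Point

bridge = CvBridge()
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
point_pub = None

def image_cb(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(rects) == 0:
        return
    # publish the centre of the first detection, like the blob node does for the ball
    x, y, w, h = rects[0]
    point_pub.publish(Point(x + w / 2.0, y + h / 2.0, 0.0))

if __name__ == '__main__':
    rospy.init_node('person_detector')
    point_pub = rospy.Publisher('/person/point', Point, queue_size=1)
    rospy.Subscriber('/camera/image_raw', Image, image_cb)
    rospy.spin()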
@Tiziano Fiorenzani Thanks! Right now I'm trying to apply my own HSV filter for a yellow object, because I don't have a blue ball ;) Which code do I have to edit? Is it blue_min and blue_max in the blob_detector.py node?
I show a script that you can use to tune the HSV filter
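As a rough starting point before tuning, yellow usually sits around hue 20-35 in OpenCV's 0-179 HSV scale, so swapping the thresholds to something like the values below should get close; the exact numbers still need to be tuned with the script.

# Sketch: rough HSV thresholds for a yellow object (values are starting points only)
import cv2

yellow_min = (20, 100, 100)
yellow_max = (35, 255, 255)

frame = cv2.imread('test_frame.jpg')              # any saved frame with the object
hsv   = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
mask  = cv2.inRange(hsv, yellow_min, yellow_max)  # white where the colour matches
cv2.imwrite('mask.jpg', mask)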
@@prandtlmayer Thanks a lot!! My camera can now detect my ball :) Now I have to edit the dk_llc node, because I don't have an Adafruit control board but an L298 H-bridge motor driver. Any tips on that?
@@sonnchenundhomie Hi, any luck with the L298 H-bridge?
I can't find the package for raspicam_node. Isn't it included in the repo?
It is not; you install it as a ROS package.
Great tutorial, but do you have any Gazebo simulation scenarios with blob detection and TurtleBot following?
I am going to focus on Gazebo and ROS in the coming months.
Sir, I am trying an RPLIDAR A1 and want to add it to my drone, but I get many failures. Can you help me? I want to apply your Pi and ultrasonic obstacle avoidance and OpenCV projects to my autonomous drone. Please help me, sir.
I want to make a bird tracker camera, to record birds flying in slow motion using a secondary camera and a pan-tilt system. I know some Arduino and PLC programming (including artificial vision using CheckOptic, so I know some concepts) and some other languages like JavaScript and PHP... just basic stuff. This tutorial seems too hard for me. Do you know how I could better understand how to make this project? It might help me with mine.
This tutorial is not made for your purpose; it only shows how to interface ROS and OpenCV. Anyhow, first focus on the recognition and tracking algorithm. You need a beefy board for that, something optimized for running neural networks.
Wow
thanks for your video.
If I'm using a USB webcam on video1 instead of video0, how do I do that?
I do not know, maybe it does not have the drivers? Anyway, with the wrong webcam, performance might be very slow.
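If the webcam really is on /dev/video1, and assuming the node grabs frames with cv2.VideoCapture rather than raspicam_node, pointing OpenCV at index 1 is usually enough. A minimal check:

# Sketch: open the second video device and grab one frame to confirm it works
import cv2

cap = cv2.VideoCapture(1)       # index 1 -> /dev/video1
if not cap.isOpened():
    raise IOError('Cannot open /dev/video1')
ok, frame = cap.read()
print(ok, None if frame is None else frame.shape)
cap.release()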
@@prandtlmayer dk_llc_3-3 process has died __name:=dk_llc_3 __log:= ((my file))
What's wrong??
Hi Tiziano, your tutorials are really useful, but I have a question: can Python be used with only the cv2 library, without cv_bridge, for ROS applications? I don't really understand what cv_bridge is doing in the program; if you could explain that to me I would be very grateful. Thanks
That is simply a bridge between OpenCV and ROS. It translates input/outputs into publish/subscribe
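A minimal example of what that looks like in a node (topic names are placeholders): cv_bridge turns the incoming sensor_msgs/Image into a plain numpy array for cv2, and back again on the way out.

#!/usr/bin/env python
# Sketch: cv_bridge converting between ROS Image messages and OpenCV images
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()
pub = None

def image_cb(msg):
    cv_image = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')   # ROS image -> OpenCV/numpy
    gray = cv2.cvtColor(cv_image, cv2.COLOR_BGR2GRAY)               # any cv2 processing you like
    pub.publish(bridge.cv2_to_imgmsg(gray, encoding='mono8'))       # OpenCV -> ROS image

if __name__ == '__main__':
    rospy.init_node('cv_bridge_demo')
    pub = rospy.Publisher('/image_gray', Image, queue_size=1)
    rospy.Subscriber('/camera/image_raw', Image, image_cb)
    rospy.spin()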
I'm getting "no transform between frames /map and scanmatcher_frame available after 20.004533 seconds of waiting". This warning only prints once. Can I get some help?
That happened to me only when I played data back from a rosbag, because I forgot to set the simulation time. It could be related to the time, but I am not sure.
How can I make a face-tracking drone, maybe with DroneKit or ROS?
I'd start with a face tracking rover... Less chance to get hurt
How do I install the raspicam node on my Raspberry Pi?
It comes in the Ubuntu image I talk about in the first video
[dk_llc_3-5] process has died [pid 14557, exit code 1]. Please, what's wrong??
Not possible to say from that
Damn, I thought that ball was levitating.