We need more people like this
@JestEr yep
@JestEr true dude.
demand education reform then. half these schools are still using common core material
This is why we will die to Skynet..
@@kingmasterlord yeah finally someone that agrees with me. Everyone should be taught differently, more efficiently. I told this to my teacher and got suspended lol. Government, come on, do something
cool, really useful to have a robot-agnostic navigation system. I imagine if you ever revisit making a biped, this along with the face-tracking Nerf turret would lay the foundations for our first robot overlords.
This is so true. I'm 100% sure anyone who saw this comment wishes they were the one who said it, like me.
I feel you, but Boston Dynamics may already have a few foundations in mind...
Very clever. I didn't think such accurate mapping was possible without wheel odometry.
Love it. FWIW, the reason most people keep the LIDAR low is for eye-safety
Really great job, Now my robot navigates without wheel odometry perfectly.
I'm so happy you picked up ROS along the way :)
as always, excellent video James! 👌😎 amazing how much work went into this
I didn't think I would see you here, but then again it makes sense
Great content, so valuable. Maybe we'll all own a robot in the future thanks to initiatives like yours. If not, robots will only belong to big corporations, like today (yes, an ATM is already a robot). The implications are HUGE
He is smart, his brain must have so many wrinkles
4 at least
Nuh uh, Einstein only had 3 folds, this guy has 2
Was that a hint at another Colin Furze Collaboration?
Really!!!! James and Colin building an autonomous tank with a cannon! This is how we get robot world domination, people!!!
Fantastic engineering detail, Thanks James.
Really good video. My FRC team is looking at implementing this exact Lidar unit into our robot this year.
Hello James !
Thanks for sharing this
If you'd make an online course about robotics or 3D design I'd totally buy it!
I'd pay good money for it too
I suggested this change on the last video, good thing it worked
That is sooo cool! I'm betting he'll add it to a skateboard too. I'd love to see that in one of the Star Wars robots!
Colin Furze controlling an autonomous robot that James will build in the future: At my signal, unleash hell !
James, if you are using the RealSense camera, that laser is overkill. You can get the distance to the walls/obstacles simply from the depth map produced by the camera, if you don't have a SLAM mapping module in ROS that can use the camera directly.
That will also save you quite a bit of current (those LIDARs are pretty power hungry).
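For anyone wanting to try this, a minimal sketch of the idea in ROS — the topic name and the 16-bit-millimetre encoding are assumptions based on typical RealSense depth streams, and the off-the-shelf `depthimage_to_laserscan` package does the full job properly:
```python
# Hedged sketch: read one row of a RealSense-style depth image as a crude
# range array, instead of (or alongside) the LIDAR. Topic name and encoding
# are assumptions; depthimage_to_laserscan is the proper tool for this.
import numpy as np
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def on_depth(msg):
    depth = bridge.imgmsg_to_cv2(msg, desired_encoding="16UC1")       # depth in mm
    mid_row = depth[depth.shape[0] // 2].astype(np.float32) / 1000.0  # metres
    mid_row[mid_row == 0.0] = np.inf   # zero means "no return" on RealSense
    rospy.loginfo("nearest obstacle in view: %.2f m", float(np.min(mid_row)))

rospy.init_node("depth_row_ranges")
rospy.Subscriber("/camera/depth/image_rect_raw", Image, on_depth)
rospy.spin()
```
The trade-off, as noted further down the thread, is field of view: a depth camera only covers roughly 90° in front, while the spinning LIDAR sees 360°.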
the camera already looks like eyes and the lidar resembles a tophat ... maybe that could be a nice theme for the "module"
Nice progress! For robots without wheel encoders, I recommend the hector packages and robot_localization. You will get a better, steadier odom :)
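As a rough illustration of the robot_localization side, a hedged sketch of an EKF setup fusing a scan-matcher pose with an IMU — the topic names and boolean fusion masks are assumptions, and in practice these parameters live in a YAML file loaded by the launch file before `ekf_localization_node` starts:
```python
# Hedged sketch of robot_localization EKF parameters for a robot without
# wheel encoders. All topic names and masks are illustrative assumptions.
import rospy

rospy.init_node("ekf_params", anonymous=True)

ns = "/ekf_localization"
rospy.set_param(ns + "/frequency", 30)
rospy.set_param(ns + "/two_d_mode", True)       # planar indoor robot

# Fuse x, y and yaw from a laser scan matcher (e.g. hector_mapping's pose)
rospy.set_param(ns + "/odom0", "/scanmatch_odom")
rospy.set_param(ns + "/odom0_config",
                [True,  True,  False,    # x, y, z
                 False, False, True,     # roll, pitch, yaw
                 False, False, False,    # linear velocities
                 False, False, False,    # angular velocities
                 False, False, False])   # linear accelerations

# Fuse orientation and yaw rate from an IMU
rospy.set_param(ns + "/imu0", "/imu/data")
rospy.set_param(ns + "/imu0_config",
                [False, False, False,
                 True,  True,  True,     # roll, pitch, yaw
                 False, False, False,
                 False, False, True,     # yaw rate
                 False, False, False])
```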
You are a great engineer!
Suggestion: it may be better to place the sensors near the centre of rotation, particularly on OpenDog, where you could remake the front panel to have a hole for them. This way there will be less noise, as they won't move as much as on the top of the robot. You will lose some of the rear FOV, but that can be fixed by adding more sensors, or just rotating the robot itself like a real dog would have to. Hope this helps!
Don't forget to write a nice URDF for your robots 😀. Nice work
Yes, it's mostly proof of concept for now.
Very good work
Very nicely done!
"The 3 Laws of robotics prevent a robot from harming a human."
*only programs 2 laws into robot*
I can't belive this information is free :D
we do like your kitchen. awesome video
Nice build! The laser module looks kind of funny though, maybe having a rotating mirror would be better
You should look into using some sort of graph SLAM with the LIDAR. I didn't realize that all of your position data came from the wheel encoders, as those lose tracking over time, especially at higher speeds/accelerations.
You should probably put the lidar on a servo or stepper motor so that you can keep the lidar facing one way the whole time when the robot turns. I noticed that when it turns it messes with the whole map, because the lidar isn't turning exactly the amount it's programmed to turn; it could be a little faster or slower. If you put it on a servo or motor, though, you could get rid of that problem.
I still think it needs to be on a gimbal that self-levels. If you can get it working without one, maybe add a head that can look around so the robot can get more data about the environment?
I never realized ROS is that useful before
Would using nylon fasteners be an option to reduce weight? Or are they too weak? Another option would be to machine fasteners out of aluminium
What about building the Lidar onto a platform at an angle, so as it moves you can scan the whole environment in 3D? Do you think you could get this to work? :)
hello from Qazaqstan
I wish the Lidar module used in the new iPhone/iPad could be used independently for projects like this.
But where would we put the meter? /Tesla died broke
Can you put it on a drone please! Would be awesome to see what it's capable of, like navigating stairs, and whether combining it with other drone sensors like a barometer, optical flow and ultrasound would make it a beast!
Very beautiful work!! I notice you don't use belts for transmission any more... good choice... belts will break after a few years whether you use the robot or not... gears are stronger and will break only after much more time, and only if they were used... it would be great if you used gears (3D printed is better for makers XD) in all your future projects...
Great video!
Do you have a link to the joysticks that you used for the controller? It would really help me with my robot
Good video
So cool
I would be interested to see how a robot performs with a neural network for a brain, where you simply give it a mission (for example, go to this place), attach a laser to it and then let it learn to move around.
Excellent job; any opportunity to learn from you is welcome
Would using a gimbal (or 2) (DIY or purchased) help with stability for the camera & sensors on the dog!?
That was just my immediate thought, to maintain stability and try to curb/minimise other issues like sensor drift. All theoretical, of course, but still worth trying if resources are available.
Very Good
That is pretty neat! Any idea where I can find code to run a ROS node on a 65C02? I have an old Maxx Steele robot from the 80's that would make good use of the platform!
Also, your "target" head looks like Ensign Ro from Star Trek The Next Generation.
Thought this could be ideal for the little ROS project I'm building, but it looks like the T265 is discontinued and there's no new model coming out. Such a shame.
Hey, with a 3D camera like the Intel T265 that you have, you should be able to get your map data without the additional lidar.
If the camera doesn't give you the depth info in ROS on its own, you should be able to get it from the raw footage of the camera with a little help from OpenCV.
The T265 is tracking only. I have the depth camera, but the LIDAR is 360°.
James, I think you underestimate the quality of your leg odometry and the sensor fusion algorithm that comes with the T265. Each joint is directly coupled with an encoder you can read, and all the wobble will be picked up by the encoder. The only issue would be if the leg slips on the floor, but that issue is still present on wheeled robots.
Can this scale up to my lawn tractor? I have a 1 Acre lot with lots of bushes and trees to avoid.
Hello, nice video! What version of ROS do you use?
Can't you build a module under the dog that can act as legs the robot can rest on? I mean that the legs could extend or retract, via a servo or geared motor, when the dog is supposed to stand still. Sorry for my terrible English, I'm from Germany.
Love that you open-source your code
Your amcl seems to have an uncertainty that is way too big. I think this is probably causing the ghosting, because the laser scans are recorded at the wrong positions by the local costmap. Try to tune amcl to rely more on the odometry by turning down the `odom_alpha` parameters, especially `odom_alpha1`.
And you must set `odom_model_type` to `omni-corrected`
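For reference, a hedged sketch of the tuning being suggested — the 0.1 values are illustrative (the amcl defaults are 0.2), and since amcl reads its parameters at startup, these would normally go in the launch file rather than a script:
```python
# Hedged sketch of the AMCL tuning described above. Lower odom_alpha*
# values mean amcl assumes less noise in the odometry, i.e. trusts it more.
import rospy

rospy.init_node("amcl_tuning", anonymous=True)

rospy.set_param("/amcl/odom_alpha1", 0.1)  # rotation noise from rotation
rospy.set_param("/amcl/odom_alpha2", 0.1)  # rotation noise from translation
rospy.set_param("/amcl/odom_alpha3", 0.1)  # translation noise from translation
rospy.set_param("/amcl/odom_alpha4", 0.1)  # translation noise from rotation
rospy.set_param("/amcl/odom_alpha5", 0.1)  # translation noise (omni models only)

# Motion model suited to odometry that can move in any direction,
# such as the T265's visual tracking
rospy.set_param("/amcl/odom_model_type", "omni-corrected")
```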
I will be honest, if I hadn't used ROS myself before, I might have had trouble following what exactly you were doing in this video.
Your laser scan seems to distort when the robot is rotated rapidly. I wonder if that comes from a synchronization/latency issue between the tracking camera updates vs the laser scans. I'd be interested to know what's going on if you figure it out.
Also, why flat head screws?!
I probably drove too fast when I made the map; the laser is also the cheapest one available. The screws are countersunk?
Working backwards - could you use the movement data from the RealSense to help stabilise the dog?
that Nerf turret is exactly the same technology as Iron Man's shoulder guns that shoot hostage-taking terrorists
Super!
Can you use a laser mouse setup for tracking on the big dog to calculate its position?
you might try using ultrasound in conjunction with the lidar?
Ultrasonic is pretty bad in comparison to lidar.
Ultrasonic can't detect soft stuff like pillows, or walls at an angle, due to reflection of the echo; same with round objects like poles. The measuring is also pretty slow due to the speed of sound (at ~343 m/s, a 4 m max range means a ~23 ms round trip, so roughly 40 readings per second per sensor), and using multiple sensors is difficult because their pings interfere.
@@jonasstahl9826 it might give a useful second opinion on the lidar ghost images. Ultrasound is quite good at showing where things aren't
Awesome 👌
Amazing work James !
From the video I guess you didn't install the Jetson Nano Developer Kit image on your Jetson Nano but instead used a normal Ubuntu, is that right? And is there a reason for that?
Best of luck with your upcoming projects :)
Next thing you know, he'll have his whole house operated by robots
What lidar sensor did you use?
Amazing project! I'm new to ROS and I have a quick question: how do you visualize rviz remotely?
you just need to set the ROS_MASTER_URI, ROS_IP and ROS_HOSTNAME environment variables to point at the instance that's running roscore - check out the networking section of the ROS docs for more details.
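A hedged sketch of that setup, written in Python for illustration — these are normally `export` lines in each machine's ~/.bashrc, and the addresses below are placeholders:
```python
# Point this machine's ROS nodes at a remote roscore. Addresses are
# placeholders; the variables must be set before any node (e.g. rviz) starts.
import os
import subprocess

env = dict(os.environ)
env["ROS_MASTER_URI"] = "http://192.168.1.10:11311"  # machine running roscore
env["ROS_IP"] = "192.168.1.20"                       # this machine's own IP
env["ROS_HOSTNAME"] = "192.168.1.20"                 # usually ROS_IP *or* this

# rviz launched with this environment now connects to the remote master
subprocess.run(["rosrun", "rviz", "rviz"], env=env)
```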
Cool!
Amazing😱😱😱❤️
Please do a project using the OAK-D
11/10, not mad just disappointed
@jamesbruton I am looking for your Git repo with the CAD and such for the orange robot in this video but can't seem to find it. Could you please give me a direct link to this robot? Thank you in advance.
James, so can you make the same map with a D415?
In the year 2020, James' robots achieved self-awarene.....
Apologies for the Rowan Atkinson death joke.
How does one get into robotics? Any online courses you'd recommend?
You should make a robot that both navigates your house and shoots everyone it sees so you just have an assassin robot trying to shoot everyone in the house
A cool implementation/toy using this would be an RC car that you can drive, which would sense if you were going to crash and prevent it, either by steering away from walls or stopping the car. Would make a fun Tesla-autodriving RC car :P
Flat head screws.....WHY!!!
Because I have some
I don't really understand the need for the Arduino; the Jetson Nano has enough GPIO pins to control the motors and read the encoders. Is there some latency issue that makes it better to use the Arduino, or are you just more comfortable having it control the locomotion?
I know how to do it this way, plus it can be bolted onto any Arduino robot with differential drive.
@@jamesbruton I see, makes sense. I'm planning my own autonomous robot build with the Jetson Nano right now, so I was curious. I suppose there could be an added benefit of not having motors directly controlled by scheduled processes too.
With this project, is it possible to make the T265 follow a human? If so, how?
What program do you use to make the data visible on the PC, and can I use this data on a Raspberry Pi to navigate??
Now build a hand and drop that on there. ThingBot.
Can you make it smaller?
I sure hope your mannequin isn't named Sarah Connor
Why not make Real Steel Atom robots??
One question: in a video you published 5 weeks ago you said you just use code from prebuilt examples. Isn't that boring af, to just copy and paste?
I mean, the spirit of tinkering is building it yourself and exploring new areas. That's what makes it fun. I know you can't build everything yourself, but I think just relying on already-finished projects with well-defined documentation is like letting the robot be built by someone else. There is no part of yourself in it.
Don't get me wrong, I know you probably put many hours of work into it. But trying things, and maybe failing at new things nobody has done before, is not a bad thing, because if nobody had failed at a new topic in the first place there would be no evolution; everyone would just be relying on things made in the past by someone else.
Nice, first 350 xD
self-excited auto-mapping, "getting to know"
you could do automatic or scheduled actions on detected objects, like (waterproofed) watering of some plants, if they're found and their state is recorded
why does that sound like an old ribbon printer
spray watering might be the best
I wish Lidar was more accessible.
show !!!
road to 100 trillion subs 😀😀😉😉😉
👍✨
I did the 101st like
What books do you read to understand Arduino?
The Adafruit Arduino tutorials are pretty good.
Thanks James
Attach it to that mantis robot :)
It would probably work, although it would need to be somewhere with a 360 view without the legs in the way.
@@jamesbruton i don't think I'm ready for that kind of scary haha
Imagine just seeing that patrolling the streets.
hi
Hello
Intel L515 = lidar with no moving parts.