This project is open source and you can find the files on my GitHub: github.com/XRobots/ReallyUsefulRobot Patrons and YouTube Channel Members can get my videos up to a week early!
Here we go. One more AI system th-cam.com/video/3UaH4J91dTE/w-d-xo.html
Fantastic
Good work - it will be of use for lonely elderly people if it can open the door or fetch a glass of water or medicine.
Another good reason for having an Arduino microcontroller between the Jetson and the motors is that a microcontroller is much more reliable than a computer with an O/S. You can wire the Arduino to a few bump sensors, and use its watchdog timer, so that it can safely halt the robot if it hits anything or if the Jetson stops communicating. I have done this on all my robots as the higher level functions are more prone to fail.
Or just write better software, like we do in all sorts of mission-critical hardware such as guidance computers and flight management systems.
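The safety pattern described above - the microcontroller latches the motors off if the main computer stops sending heartbeats - can be sketched in a few lines. This is a hypothetical illustration in Python (the class and parameter names are mine; real firmware would be Arduino C++ and would also arm the hardware watchdog timer):

```python
import time

class CommWatchdog:
    """Latches the motors off if no heartbeat arrives within `timeout` seconds."""

    def __init__(self, timeout=0.5, now=time.monotonic):
        self.timeout = timeout
        self.now = now                    # injectable clock, eases testing
        self.last_heartbeat = now()
        self.motors_enabled = True

    def feed(self):
        # Call whenever a valid command arrives from the main computer.
        self.last_heartbeat = self.now()

    def check(self):
        # Call every control-loop iteration; once tripped it stays off,
        # like a hardware E-stop, until an explicit reset.
        if self.now() - self.last_heartbeat > self.timeout:
            self.motors_enabled = False
        return self.motors_enabled
```

The same `check()` call is the natural place to OR in bump-sensor inputs as well.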
With each new episode I get more and more in over my head.
You have slowly gone from a mechanical genius to a full-stack-integrated mad man...
As always, thank you for sharing James!
-J
Thanks - it's mostly code I didn't write, but it's complicated enough for me!
@@jamesbruton Hey James, software engineer here - you gotta start somewhere! You seem to know what the code does, and that's like 70% of the battle. Soon enough you'll be able to write your own code whether you like it or not :P
Your content delivery is simply brilliant. Each video is packed with a plethora of information.
I recommend you make a tutorial series on ROS for us
Perhaps James could also include his keys to productivity because he seems to complete at least one mini-project per week and major projects monthly. He really is one of the most incredibly inspirational and undoubtedly fantastic examples of human that we have - the ultimate professor who freely provides instruction and source material.
Seconded. I cannot find any ROS tutorials which I could properly follow.
@@ishraqhasan Refer to The Construct
This is exactly the type of robot I’m looking to build soon! Thanks for paving the road for us!
I have no idea what you are talking about most of the time but really enjoy your content.
Really appreciate your open source efforts James!
Awesome video. Wrapping the whole stack from the microcontroller to path planning - what a ride!
THANK YOU VERY MUCH - this part 2 and part 1 are very helpful videos for anyone starting with ROS, because even once you have started there is a steep learning curve and many pitfalls where one can get stuck.
I hope you add an IMU in the future and integrate it into the odom messages as well.
THANKS.
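For fusing an IMU into the odometry, the usual route in ROS is the `robot_localization` package's EKF node, which takes wheel odometry and IMU topics and publishes a fused odom. A sketch of the config is below - the topic names and the choice of which fields to fuse are assumptions for illustration (the boolean rows are x/y/z position, roll/pitch/yaw, linear velocities, angular velocities, accelerations):

```yaml
# ekf_localization_node parameters (sketch; tune for your robot)
frequency: 30
odom0: /odom                        # wheel odometry topic (assumed name)
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,  # trust x/y velocity from the wheels
               false, false, true,   # and yaw rate
               false, false, false]
imu0: /imu/data                     # IMU topic (assumed name)
imu0_config: [false, false, false,
              false, false, true,    # fuse absolute yaw from the IMU
              false, false, false,
              false, false, true,    # and its yaw rate
              false, false, false]
```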
I can't wait to see the robot dogs doing mapping and navigation!!!
Awesome build - this reminds me of that robot you had a couple of years ago, with the PC inside.
12:15 I can only imagine that you programmed the vision system with your socks. Imagine it showing "my master" and boxing your foot when that sock comes into view.
Very impressive. So many names and acronyms haha. You're doing a good job of convincing me that I was right to avoid ROS as long as I can get away with it (although I'll have to go there eventually for one project). Still, I've gone from knowing nothing about it (other than "it's complicated") to knowing at least something about how it does stuff, which is good, so thanks! I can see why it has its place in the world, as clearly stuff can get complex fast and everyone would end up reinventing the wheel if it didn't exist.
Love the USB breakout board mount on the orange/blue bot. Finally, a use for those damn purge blocks! LOL
Great video James!
Thanks!
wow, this is getting sophisticated !
there is more to go yet!
Super inspiring James! Thank you! 👍
Took a class in university where we learned the fundamental algorithms behind SLAM: path planning methods and mapping, and Kalman/particle filtering for localization. If you have the time, I highly recommend building it out from the ground up - you learn so much about the software side without using ROS! My favorite part? You can write it all in Python 😆
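The 1-D case of the Kalman filtering mentioned above really does fit in a few lines, and writing it by hand teaches a lot. A minimal sketch - the noise variances `q` and `r` here are made-up values, not tuned for any real sensor:

```python
def kalman_step(x, p, z, q=0.01, r=0.1):
    """One predict/update cycle of a 1-D Kalman filter.

    x, p: current state estimate and its variance
    z:    new measurement
    q, r: process and measurement noise variances (assumed values)
    """
    p = p + q                  # predict: uncertainty grows by process noise
    k = p / (p + r)            # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)        # update the estimate toward the measurement
    p = (1.0 - k) * p          # updated (reduced) uncertainty
    return x, p
```

Feeding it a stream of noisy range or odometry readings shows the estimate converging while the variance shrinks toward a steady state.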
Is the scan of the environment offset and blurring because the laser is rotating at an offset from the center of the laser device itself?
That is taken care of in the transform
Great, you've done a very good job, thank you. My version of the ROS robot didn't move as smoothly as this one.
It would be nice if you get the chance to explain some of the optimizations, the issues you faced, and how you solved them. For example, I wasn't able to get my robot to drive backwards until it reached a place big enough to spin. I also had a tough time getting it to rotate fast enough to keep in sync with the navigation stack...
I mostly used the parameters from the NOX Robot Project and TurtleBot, with just a bit of trial and error. I'm still trying to understand all of the parameters myself. I'll be uploading it all to GitHub soon.
Awesome project! Looking forward to the nexxt episode :)
super interesting and well explained!!! thank you sir
Some people wish for boats or big houses if they hit the lottery, but I wish I could work on a tracked humanoid project together with you. Thanks for sharing all your work with us.
Hello James! Great video as usual! I wanted to ask about the navigation stack. Are you running all the software (including the navigation stack) on the Xavier? From what I have heard, the navigation stack needs monstrous amounts of processing power.
The navigation part of it is running on a workstation - but it's a VM running under Windows so there are some issues. More on that in a couple of weeks.
I think yes. He runs the ROS launch file on the Jetson. There doesn't seem to be any ROS installed on the workstation. But it would be awesome if, sometime in the future, heavy calculations were outsourced to the workstation. ROS is definitely capable of doing that.
@@jamesbruton Oh ok. I wanted to check that. I have also heard that navigation is so heavy that it needs to run on a separate Pi or other PC.
marketing idea => put your humanoid robot on top of the base and have it do some tricks (rolling around while holding a service tray like a waiter, opening doors, using facial recognition to talk to you etc..)
Great Buzz potential!
In move_base, you can disable driving backwards; since your robot can't see backwards properly, I would recommend doing so.
I was considering it, but it seems to work ok, it's only about 5% blind spot
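If reversing does become a problem, a common way to forbid it is through the local planner's velocity limits - keep the sampled x velocities non-negative and disable the reversing escape maneuver. A sketch (parameter names are from the stock `base_local_planner`; the values are assumptions to tune per robot):

```yaml
TrajectoryPlannerROS:
  max_vel_x: 0.5
  min_vel_x: 0.1      # forward-only: never sample a negative x velocity
  escape_vel: 0.0     # escape maneuvers normally reverse; disable that too
```

In DWA-style planners the equivalent is keeping `min_vel_x` at zero or above, since a negative value is what permits reverse trajectories.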
When you said that it will be able to go up and down and pick up stuff, I get Rob for super smash vibes!!
Wow, this is awesome!! It would be cool if you had a Discord community. Also, thanks for the motivation - I picked up an older NVIDIA TX1 for pretty cheap last week to learn on. Thank you!
If you drop a sock on the floor, would it be able to navigate around it?
As long as it's higher than the laser
James, I think your content is unlike anything else out there. It’s truly top-notch and I really enjoy watching it. However, and it’s possible I’m the only one that notices, but there is a high-pitched hiss in your audio - I’m not certain what frequency but very high - that should be really easy to fix in post processing that would make it so much more enjoyable from an audio perspective. It’s not in every shot - for instance, it’s not in the screen recorded content for the most part, but when you are on screen, it is very noticeable.
Again, not trying to criticize at all. The content is fantastic. Just trying to bring some feedback about the audio
Woo awesome
Do you have a fuse in series with your batteries?
no!
Hi! Great work! Will you share the Xavier NX disk image?
20:14 Totally ignoring parameters ...............like a boss 😎
Something tells me this is all leading up to an autonomous openDog with arms and Nerf blasters.
Absolutely amazing. Brilliant.
Your E-stop method is nice to see. All too often I have seen e-stop buttons that are supposed to "stop all motion in the machine" but fail to actually cut the power to the motors moving it. You might want to consider a normally-open (NO) motor contactor instead of a relay if you go much higher in amperage and voltage. They are better than the automotive solenoid you showed.
CHAMP is a repo for navigation with legged robots. It actually has an openDog v2 example.
I've seen it, I'm just not ready for it yet. I think also getting accurate Odometry data would be hard because the legs are so springy in real life
I love this video. ROS is so freakin' awesome.
thanks for the video.
Your mechanical designs are really improving - nice platform design :)
I'm building a 6-DOF robot arm at the moment. Seeing this, I'm thinking about using ROS. Have you heard of the ROSCube-X? It seems to be a nice system.
Good job on the software! Running code wins.
Perhaps the lidar is presuming deployment in a downward-facing installation (e.g. under aircraft).
As it has several different batteries you should integrate the chargers into the robot and have a dock that it can drive over and get power.
I'd like to, but that is a whole other consideration
Omg thank you so much for the reply @James Bruton
So there is going to be an arm on it, but maybe you could also put a spare motor on it for attachments like a blender (it's called Really Useful, so it should be able to do a lot of stuff, right?)
I love your Ros video...
Hi, awesome project! Have you ever considered or tried using PlatformIO? It's a VSCode extension for programming microcontrollers. Seeing people use the Arduino IDE for big projects like this just makes me feel... impressed :)
Is it possible to use an STM32 Blue Pill instead of the Teensy...?
Probably.
How will the robot go automatically without the arrows? How do you save the different spots/locations?
Thank you so much for your videos, I find your work very interesting. Would it be possible to have more information on the batteries - did you use 22.2 V? I have looked, but I cannot find 24 V.
Yes it's a 6S LiPo, full charge is around 25V.
@@jamesbruton I'm not very expert in batteries, but would it be possible to choose one with 20,000 mAh? Or would a less powerful one be better suited?
Can you tell me what the angular precision of a robot like this is?
so... ultimately this is gonna be a giant tea/coffee fetcher? ;-) great work, this is impressive for real! :-)
Super late, but could you use a high performance mouse sensor for aiding accurate floor movement data?
Thank you so much for this helpful video. I have a question: how did you manage to filter the laser angle so it doesn't scan the stick behind it? My robot has a front opening but no back opening, so whenever I set a 2D goal it just stands there because it detects its own body as an obstacle. How do I filter the laser angle to cover the front only? I am using an RPLiDAR A1.
Minimum range for the scanner is 15cm so it ignores the stick anyway
@@jamesbruton I have managed to filter out the unwanted laser range and keep the readings in front only. My problem now is that it keeps drawing the two wheels as an obstacle on my map - can you please help me overcome this?
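One way to do the angular filtering discussed above, if the ROS `laser_filters` package's angular-bounds filter isn't an option, is to overwrite out-of-sector readings with infinity before republishing the scan, so the costmap ignores them. A sketch - the function name is mine, and it assumes the scan's zero angle points straight ahead, which depends on how the lidar is mounted:

```python
import math

def keep_front_sector(ranges, angle_min, angle_increment, half_fov_deg=90.0):
    """Replace readings outside the front sector with inf.

    ranges:          list of range readings from one scan
    angle_min:       angle of ranges[0] in radians
    angle_increment: angle between consecutive readings in radians
    half_fov_deg:    keep readings within +/- this many degrees of dead ahead
    """
    half_fov = math.radians(half_fov_deg)
    out = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        # normalize to [-pi, pi] so wrap-around scans compare correctly
        angle = math.atan2(math.sin(angle), math.cos(angle))
        out.append(r if -half_fov <= angle <= half_fov else float("inf"))
    return out
```

The wheels-in-the-map problem is slightly different: narrowing the sector (or raising the scanner's minimum range) so the wheels fall outside it is the same trick, just with tighter bounds.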
I'm also working on a ROS robot currently. What nodes are you running on what hardware? I saw in another comment that you were running navigation (move_base and amcl?) on a remote workstation. Was this because the Xavier couldn't handle it? My RPi 4 doesn't seem to be able to keep up with amcl and move_base well enough to stay localized, so I get real funky behavior.
It probably could, I just like having the config files for the navigation stack on a machine that's in front of me.
@James Bruton Gotcha. I've gotten really used to using ssh and bash to do everything, so mostly I'm working over ssh and only using my workstation for rviz and roscore. You can also launch nodes on a remote machine from a launch file. I think I'm going to try running navigation on my workstation and see if the quality increases.
Imagine adding some of these features to the openDog robot. Would be so sick.
Hello James, your work is really inspiring. How did you learn this all? How did you learn being a mechanical engineer?
Great video, very cool
Do you ever need to replace your batteries? You seem to use the same batteries often, so I guess you just take good care of them.
Yep, just keep them on storage charge when they aren't in use
This is great. I just started building based on your design. I have an Ender 3, and the build volume doesn't allow me to print the parts at full size. Do you have any recommendations? If I scale the model to 85%, I can print it on the Ender 3.
Amazing
Does anyone know the reference for the voltmeters?
I just got them from eBay, but it was about 2 years ago and I've been reusing them ever since
I have a question, what does the laser scanner do with mirrors?
gets confused I should think
So amazing , I wanna build a robot like this
Question:
Isn't publishing to topics an overhead? I understand the reason for a distributed, event-based system, but wouldn't it make the system less real-time?
Sir, a request: please show us how to incorporate an IMU sensor (maybe the MPU-6050) into our robots using ROS.
Why aren't linear and rotational brushes made?
(Edit) The rotational ones are made.
Can I get technical support and ask some questions?
Hi there! I was thinking of bringing an old robovac that I have (the mainboard is busted) back to life with an Arduino.
Is it possible to do with just two Arduino Unos and a motor shield (with navigation and mapping tech)?
Perhaps you could make a video of bringing one back to life too.
Sir, is it possible to control the FlySky transmitter from your laptop in order to control a drone with a PC? Please make a video on it.
Hey, can you build a vacuum-cleaning bot using regular objects, without complex 3D-printed parts or complex electronics?
A vacuum cleaner without complex electronics is...a vacuum cleaner. :-)
I know that the king of AI is you. And I am jealous of that 😀😀🤴🤴
TBH, James is still a padawan, but he did just build his first light sabre.
Which program do you use for the robot's navigation?
The entire video is about that - it's ROS
Have you got a plan for it to avoid driving under tables and such?
Probably manually change the map, and add the table as a no-go area. As well as chairs.
It would probably avoid the distance between the legs, You can always edit the map to block them out though
What if you put an ultrasonic sensor facing straight up(towards the ceiling), just a little bit in front of the aluminum extrusion so that when it gets "dangerously too close" to an object it can find another path.
@@finnsuchara1992 Might be triggered by the arm when that goes on.
@@queabbs Right, that would be sub optimal.
I think you can get way more accurate odometry using Google Cartographer. I've checked your previous YouTube videos about navigation using the RPLiDAR. I can show you how to configure it without needing IMU readings as well, if you want. I'm not 100% sure it gives better results, but it'll be interesting to compare the two outputs. Also, your Arduino code will become way shorter, as it'll only have to control the motors. You could even omit the serial node altogether and control it directly from the Jetson, although I'm still exploring this last part...
Great video. Are you going to use the Xbox Kinect as the eyes of your robot? The guy from Stuff Made Here made a basketball hoop that helps you score points using the Kinect sensor.
It may get an Intel Realsense camera at some point
You should have it try navigating a corn maze kind of thing.
Why can't I find the robot? I am looking for the autonomous A57 personal robot.
This robot reminds me of the Stretch RE1 from Hello Robot.
Cool robot. Just to nitpick a bit: this is not AI but rather SLAM. ROS defaults to gmapping for mapping and localization, and trajectory rollout and dynamic window approaches for path planning.
it's a weak AI ;-)
Question: your prior project piggybacked on the NOX Arduino project. This time around it seems that you have built your own stack from fundamentals. I have been trying to follow your ROS projects but now suddenly I'm lost. How do I set up a navigation stack on my own, and not just git clone and catkin_make an already published project like Linorobot, Husky, or TurtleBot? I know there is a ROS tutorial on the navigation stack and I tried following it without success. There are so many moving pieces: catkin_make won't compile unless I have all the nodes set up properly, and to verify individual nodes I need to run catkin_make. It's a chicken-and-egg problem, I reckon. Do you have a video on how to accomplish this mammoth task? Or what resources should I refer to?
I published my whole navigation config on GitHub, but I pretty much did the whole thing on a clean build with no previous robot's config on it.
@@jamesbruton Thanks for replying. I am trying to figure out how to make a clean build, and this link wiki.ros.org/navigation doesn't help much in setting one up... Do you have some references with more details on how to actually set up all the dependencies etc. and make it all work?
Would very much like to see what he looks like now
You do a good job of pacing the videos: not "get it over with" slow, but a nice "maybe I'll stay a while" slow.
Really amazing !
thanks!
If the laser doesn't stay still, where am I going wrong?
Is your 3D printer a normal printer?
Yep - I have a 1.2mm nozzle/toolhead option though which is what the larger parts are printed with.
James, you could bring this to investors and start a company. I've seen startups with way less get millions in seed funding.
I don't think you understand his stance on open robotics projects.
Not selling to investors
Can you please explain the Teensy 4.1 vs Arduino?
Amazing!
Hi James, great work! Can you please do a ROS2 version ASAP, it's quite stable now. &/OR Nvidia Isaac
Can it avoid moving obstacles?
Yes, as quickly as the map can update
Very Good!
Question for you, James:
Roughly how much of maximum effort are your motors using?
My problem I'm trying to solve: I'd like to make a similar robot, but one that is self-balancing. It needs to be quite fast and have sufficient torque to lift itself up in the event of a crash, but I can't necessarily afford the most expensive motors. And I've got little intuition for units of torque.
Thank you for your videos: They've been much more useful to me than TheConstruct's in learning ROS conceptually.
Hardly any. Check out: th-cam.com/video/zlucjoYLGi8/w-d-xo.html - it's pretty much the same setup but on 48v
@@jamesbruton Derp, forgot you did that. But in your videos I don't remember you showing a self-recovery. After a couple of falls, I assumed you manually lifted Sonic back up. I'm interested in sufficient "motor oomph" to lift Sonic from his face to his "feet." (But thank you!) At least watching your Sonic robot gives me a place to start from!
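On the torque-intuition question in this thread: a rough starting point is that drive force at the ground equals torque divided by wheel radius, so accelerating mass m at rate a needs about m·a·r of total wheel torque. A back-of-envelope sketch - the numbers and the margin factor are assumptions, and a self-balancer needs extra headroom for the balancing and self-righting on top of this:

```python
def wheel_torque(mass_kg, accel_mps2, wheel_radius_m,
                 n_driven_wheels=2, margin=2.0):
    """Rough per-wheel torque in N*m to accelerate a wheeled robot.

    Back-of-envelope only: ignores balancing dynamics, rolling resistance,
    and gearbox losses; the margin factor is a crude allowance for those.
    """
    force = mass_kg * accel_mps2             # F = m*a, total drive force
    torque_total = force * wheel_radius_m    # tau = F*r at the wheels
    return margin * torque_total / n_driven_wheels
```

For example, a hypothetical 20 kg robot on 0.1 m radius wheels accelerating at 1 m/s² comes out to about 2 N·m per wheel with the 2x margin - a quick way to sanity-check motor datasheets.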
Make a ROS tutorial series
nice.
Can you imagine: all that, and it's not even at the level of fetching you a beer.
Replacing a human for even a simple task is such an immense step.
R.U.R. -- I see what you did there! :-)
It will be very useful.
You use very expensive components for what you're building. I guess if they are already lying around or being donated it makes sense, but a gearhead motor like the 37D form factor from Pololu, with an integrated encoder, would be cheaper and let you spend time on the challenging parts like writing software. Or just go big and use a Hayabusa motor? I've got one that balances on two wheels and I'd be happy to donate my code if you like. Maybe we could trade for some of those sweet, sweet brushless motors.
That's awesome dude =D