From Idea to Prototype: The Jugglebot Journey (so far)

  • Published 13 Jun 2024
  • Welcome to an exciting journey through the world of Jugglebot! This video presents the talk I recently gave, covering everything you've ever wanted to know about this juggling robot project. It's a unique chance to learn about the engineering challenges, design choices, and progress we've made so far in bringing this idea to life.
    This talk is a great place to start if you're new to the channel or if you want a comprehensive look at the Jugglebot story. We delve deep into the design process, the components, the software, and the many trials and tribulations that have led us to where we are today.
    So sit back, relax, and join me on this exciting journey. And don't forget to leave your questions, suggestions, or any other thoughts in the comments section. Enjoy!
    ##### Luke Burrage's video on Juggling Robots vs. Automaton #####
    • Juggling ROBOT or NOT???
    00:00 - Start
    00:27 - RRS Talk
    22:00 - Q&A
  • Science & Technology

Comments • 36

  • @harrisonlow · 1 year ago +5

    I am thrilled to share this talk with all of you. It's been quite the journey designing and building Jugglebot, and I appreciate every bit of support, feedback, and curiosity you've shown.
    If you're new here or if you want a comprehensive understanding of the project, this talk is for you! I delve into the entire process, design choices, and the progress we've made so far.
    I'm always looking forward to hearing your thoughts, so please don't hesitate to leave your comments and questions below. I'll try my best to answer them all!
    Thanks again for being part of this journey, and happy watching! 🎪🎭

  • @qwertyuiop-ux6jk · 1 year ago +2

    You're a man of everything... Great video, and one of the most interesting projects I've ever seen, dude!

  • @stefanguiton · 1 year ago

    Excellent showcase!

  • @joeidoni · 9 months ago +1

    I was thinking about your actuators last night and realized that instead of PTFE tube, bicycle cable housing is a perfect replacement. It's designed for tension and won't kink.

    • @harrisonlow · 9 months ago +2

      Cheers for the suggestion! As it happens, I've actually redesigned the actuators to not need the bowden tubes at all 😁 Video hopefully coming soon 😊

  • @petersvideofile · 1 year ago

    This is really great!! Can't wait to see the update when you get your carbon fiber rods :)

    • @harrisonlow · 1 year ago

      Cheers! Hopefully they're not too far off 😊

  • @CrashAndRebuild · 1 year ago

    I love this plan and had been wanting to make a juggling robot for years now, but never found the time to prioritize the project. I look forward to watching your progress; thank you for sharing the talk. You mention wanting to feed siteswap notation to the machine; do you have a plan to handle "1's" (essentially a zero-airtime throw passed directly sideways from one hand to the other, for any non-juggling nerds reading this comment)? I can't think of a great way to get a Stewart platform to pull off a 1 unless you add some special dedicated mechanism just for them. It makes sense to just ignore them to start; there are plenty of patterns without 1's, but there are some fun patterns you'd be missing too. The current hand mechanism would also make multiplexing impossible, I'd think, but that seems like a MUCH lower priority.

    • @harrisonlow · 1 year ago +1

      Thanks!
      You're the first person to notice that Jugglebot won't be able to do 1 throws! I'm just ignoring them for now 😂. It would be neat to include them, though doing so in the way humans do would be very challenging. One option would be to extend the duration of a "beat" so that the 1 spends a little more time in the air, though the patterns would get insanely big by doing that.
      Same thing re. multiplexing. I have toyed with the idea of putting multiple "hands" on one "arm", but the throws would need to be *very* accurate for that to work
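
For anyone curious what "ignoring the 1s" could look like in practice, here is a minimal sketch of scanning a vanilla siteswap string and rejecting patterns the hardware can't throw. The parse_siteswap/is_throwable helpers and the single-digit-pattern assumption are illustrative only, not Jugglebot's actual planner.

```python
# Sketch: scan a vanilla (single-digit, async) siteswap string and flag
# patterns the current hardware can't throw because they contain 1s.

def parse_siteswap(pattern: str) -> list[int]:
    """Convert a siteswap string like '531' into a list of throw heights."""
    throws = [int(ch) for ch in pattern if ch.isdigit()]
    if not throws or sum(throws) % len(throws) != 0:
        raise ValueError(f"'{pattern}' is not a valid siteswap (average must be a whole number)")
    return throws

def is_throwable(pattern: str) -> bool:
    """True if the pattern avoids 1-throws (direct hand-to-hand passes)."""
    return all(t != 1 for t in parse_siteswap(pattern))

for p in ["531", "423", "97531"]:
    print(p, "ok" if is_throwable(p) else "contains a 1-throw, skipping")
```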

  • @lzrstar3574 · 9 months ago

    For CAN, consider using a star configuration where each component connects to a central node instead of creating a loop; it makes it much easier to find problems.

    • @harrisonlow · 9 months ago

      Interesting - I didn't know that was possible! Definitely looks superior to a linear configuration for my application. I'll do some more reading on this, cheers 😊

  • @ramtinnazeryan · 1 year ago +2

    Hi. I've done my PhD on computer vision and I'm kinda concerned about the speed at which you want to track the balls using this technique. Although the camera can go to 100+ fps, the image processing part, as far as I know, will take a lot of time, sometimes more than a second. While I'd love to see your solution succeed at this, perhaps locating the other bot using CV and then calculating the trajectory of the ball after the throw from its angle and velocity (no CV there) is a faster and more feasible solution. I finished my body-tracking paper based on a post-processing algorithm; my work takes about 1.3 seconds per frame to track a body joint, so implementing it in live action would have been impossible. For context, I was using an 8th-gen Intel Core i7 CPU for it. The feature/blob detection part itself would cost you about 0.3 s, and then the trajectory calculations, and whether you're tracking one ball or more, will make the situation harder and harder.

    • @harrisonlow · 1 year ago

      Interesting! I'll have to run some tests to see how quickly I can get the data out. From the testing I've done so far (example gif at 15:40 in this video) I was definitely getting a lot more than 1 frame per second, though I didn't measure the latency. My feeling was that it wasn't _that_ bad, but I definitely need to do some more quantitative testing.
      Could you elaborate on what you're suggesting instead? I may be misinterpreting, but it sounds like you're suggesting to go entirely off of the throw pos/vel? It would be awesome if that worked, but I don't trust the hardware to make *perfect* throws, so I figured the CV + kalman filter would be a good way to compensate for any hardware issues
      Thanks for the input 😊
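
As a rough illustration of the "CV + Kalman filter" idea mentioned above, here is a minimal sketch of a ballistic Kalman filter: a gravity-only constant-acceleration model over 3D position measurements, plus a helper that solves for the landing point. The state layout, update rate, and noise values are placeholder assumptions, not numbers from the project.

```python
import numpy as np

# Sketch: ballistic Kalman filter for a thrown ball.
# State x = [px, py, pz, vx, vy, vz]; measurements are 3D positions from CV.
# The timestep, noise levels, and gravity-only dynamics are assumptions.

DT = 0.01                                      # 100 Hz update rate (assumed)
G = np.array([0.0, 0.0, -9.81])                # gravity along -z

F = np.eye(6)                                  # state transition
F[0:3, 3:6] = DT * np.eye(3)

H = np.hstack([np.eye(3), np.zeros((3, 3))])   # we only measure position
Q = 1e-4 * np.eye(6)                           # process noise (assumed)
R = 4e-4 * np.eye(3)                           # ~2 cm measurement noise (assumed)

def predict(x, P):
    """Propagate the state one timestep under gravity."""
    x = F @ x
    x[3:6] += G * DT
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Fuse one 3D ball-position measurement z from the tracker."""
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

def predicted_landing(x, catch_height=0.0):
    """Solve the ballistic arc for where/when the ball reaches catch_height."""
    pz, vz = x[2], x[5]
    disc = vz**2 + 2 * 9.81 * (pz - catch_height)
    if disc < 0:
        return None
    t = (vz + np.sqrt(disc)) / 9.81
    return x[0:3] + x[3:6] * t + 0.5 * G * t**2, t
```

In use, each camera frame would feed a triangulated ball position into update(), with predict() run between frames and predicted_landing() used to aim the catching hand.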

    • @ramtinnazeryan · 1 year ago +1

      @@harrisonlow I don't wanna sound pessimistic, and don't get me wrong, I can't wait to see you succeed at this. But to share my (however limited) experience: among robustness, accuracy, and speed, you gotta pick two in CV :D. HSL filters are fast, but then you have to start blob analysis to get rid of anything with the same hue in the frame (which can sometimes even come from reflections of the ball's own colour). From what I understood, you were using a perspective-n-point solution for depth calculation. In my work I didn't get promising accuracy on depth measurement, and I was using checkerboard patterns, which have more data points. With stereo vision you have better depth estimation, I think.
      Now here is what I think: use CV to precisely locate the receiver robot in 3D space, since it's stationary, and you'll save a lot of time on your setup (a point for robustness). Then, as you understood correctly, try to estimate the trajectory of the throw using the orientation and velocity of the thrower robot. If you can't trust the hardware, perhaps a second measurement using a laser and photodetector across the diameter of the exit hole can confirm your velocity measurements. I was thinking the time during which the exiting ball blocks the light can be related to its speed. For the axis orientation of the exiting ball, I think you can get sufficient feedback from the robot itself.
      Cheers..
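
The beam-break speed estimate suggested here is simple enough to write down: if a ball of diameter d blocks the laser/photodetector for Δt seconds, its exit speed is roughly d / Δt. A tiny sketch, with a made-up 75 mm ball diameter and timing value:

```python
# Sketch of the beam-break speed estimate suggested above:
# a ball of diameter d blocking a laser/photodetector for dt seconds
# is travelling at roughly v = d / dt along the throw axis.

BALL_DIAMETER_M = 0.075   # assumed 75 mm juggling ball

def exit_speed(block_time_s: float) -> float:
    """Estimate launch speed from how long the ball blocked the beam."""
    return BALL_DIAMETER_M / block_time_s

# e.g. a 15 ms blockage -> ~5 m/s launch speed
print(f"{exit_speed(0.015):.2f} m/s")
```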

    • @5eurosenelsuelo · 1 year ago +1

      If you don't trust the hardware to give you the exact throw you need but computer vision proves too slow for your project you could add simple sensors to determine the trajectory and speed of the ball when leaving the hand.
      For example, for speed you could put a laser, similar to punching machines. For the angle of the throw there must also be a simple sensor that gets the job done, but it doesn't come to mind right now.
      That way you could adjust the receiving hand to the actual shot that happened instead of the shot that should have happened but the hardware failed to reproduce exactly.
      It's actually a little like computer vision but instead of checking each frame during the trajectory you just do a check during launch. Since no additional unknown forces should be involved, the trajectory can be described just with that.
      You'd still be vulnerable to wind and such, but that was kind of out of scope from the beginning and shouldn't matter much anyway, given the shape and weight of the balls, as you said.
      As a last comment, I'll add that I don't think this needs to be an either/or scenario. You could implement both solutions simultaneously so that the bot has as much information as possible. That might let you use fewer frames in the analysis of the trajectory; just a couple of pictures should be enough to confirm the ball's actual trajectory vs. the trajectory predicted from the launch conditions.
      Congratulations on your work so far. It looks impressive. Just the things learned along the way already make it worth it.

    • @harrisonlow · 1 year ago +1

      @@ramtinnazeryan I'm not sure I'll have as much computation to do per frame as it sounds like you did; I'm going to make sure that there's nothing in the frame that has the same colour as the ball, so the HSL + circle detection process should be quite quick. And as you say, stereo vision is decent at depth estimation (exactly how decent is something I need to test).
      I really like the idea of a laser at the outlet of the hand! Very clever. Do you have any experience with sensors like that? Are there any specific sensors you'd recommend?
      I have already been considering using an IMU on the top of Jugglebot's "arm", but I've never used IMUs before so my expectations are fairly low. I'm keen to be proven wrong though once everything is working!
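
For a sense of what the "HSL + circle detection" per-frame cost looks like, here is a minimal OpenCV sketch of colour thresholding plus largest-blob detection with a timer around it. The HSV bounds, minimum area, and webcam source are placeholder assumptions, not the tracker actually used on Jugglebot.

```python
import time
import cv2
import numpy as np

# Sketch: colour-threshold + contour (blob) ball detection per frame,
# with timing to check it stays well under the camera's frame period.
# HSV bounds and minimum area are placeholder values for an orange ball.

LOWER_HSV = np.array([5, 120, 120])
UPPER_HSV = np.array([20, 255, 255])
MIN_AREA_PX = 50

def find_ball(frame_bgr):
    """Return (x, y, radius) of the largest ball-coloured blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # OpenCV 4.x signature: returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    biggest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(biggest) < MIN_AREA_PX:
        return None
    (x, y), radius = cv2.minEnclosingCircle(biggest)
    return x, y, radius

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # any test camera
    ok, frame = cap.read()
    if ok:
        t0 = time.perf_counter()
        result = find_ball(frame)
        dt_ms = (time.perf_counter() - t0) * 1000
        print(f"detection: {result}, took {dt_ms:.1f} ms")
    cap.release()
```

The timer is the important part: measuring this on the actual Jetson at the real camera resolution is what settles the latency question, rather than guessing from desktop numbers.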

    • @harrisonlow · 1 year ago +1

      @@5eurosenelsuelo Very good suggestions! Have you had any experience with sensors like the laser punching machine one you described? Are there any specific sensors you'd recommend?
      I agree that "sensor fusion" would be the most robust approach, and I hope to get Jugglebot to that point eventually! Having an IMU on the "cup" part of the hand could be super informative, and I've considered adding a pressure sensor to the catching area to measure how smooth the catches are*
      * This may not seem that useful, but something I've found tremendously helpful in my own juggling is to pay full attention to making the catches as quiet as possible (the robot analogy would be as low a contact force as possible). This can only happen if the pattern is super smooth, so I think it works as a good proxy for the "quality" of the pattern. Not sure how transferable that idea is to the robot, but it could be fun to explore!
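
If the pressure-sensor idea ever goes in, the "quiet catches" proxy could be as simple as logging the peak contact force per catch. A toy sketch, with the sampled force trace and the quality threshold entirely made up:

```python
import numpy as np

# Toy sketch: score catch "quietness" from a pressure-sensor trace.
# The sample trace and the force threshold are invented values.

def catch_peak_force(force_trace_n: np.ndarray) -> float:
    """Peak contact force (N) seen during one catch window."""
    return float(np.max(force_trace_n))

def is_smooth_catch(force_trace_n: np.ndarray, max_force_n: float = 5.0) -> bool:
    """A catch counts as 'quiet' if its peak force stays under the threshold."""
    return catch_peak_force(force_trace_n) < max_force_n

# e.g. a simulated catch sampled at 1 kHz
trace = np.array([0.1, 0.4, 2.8, 4.1, 3.0, 1.2, 0.3])
print(catch_peak_force(trace), is_smooth_catch(trace))
```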

  • @TheOnlyRiceman · 1 year ago +2

    Hi Harrison, I love this project. I'm a second-year eng student in Canberra with a little bit of experience with object tracking and the Nano. If you need help with setting up your Nano or CV ball tracking, I'd be happy to help (although it looks like you're all over it).
    Also, Seeed Studio ship to Australia and they're pretty quick. I just got my hands on an Orin Nano for a new project. My Orin Nano can track an object at a network speed of 400 FPS.
    Here's a project I did over the summer break:
    th-cam.com/video/B8tGweROs4o/w-d-xo.html

    • @harrisonlow · 1 year ago +1

      That's a pretty neat project!
      Cheers for the offer to help! I may have to take you up on it if I find myself lost in Linux 😂
      That's very good to know re. Seeed. I've just been checking the NVIDIA product page for updates, and it's only showing edomtech as a supplier for Australia, and their page just has an "inquiry" button... I may end up getting one sooner than I thought! Big thanks for that tipoff ☺
      What eng are you studying? Seems like you've got a lot of comp skills, so mechatronic/software?

  • @wildgophers91 · 1 year ago +1

    smdh still think it should be a "posse" of jugglebots and not a troupe

    • @harrisonlow · 1 year ago +2

      Hehe I'm still open to changing. Just seeing how different options feel 😉

    • @wildgophers91 · 1 year ago

      @@harrisonlow Troupe is 100% the right one. When I saw that on the slide it was like, "ohhh of course" it just _fits_. But in my heart it will always be a posse