Hand Tracking & Gesture Control With Raspberry Pi + OpenCV + Python

  • Published 4 Nov 2024

Comments • 139

  • @sumedh1586
    @sumedh1586 2 years ago +7

    Finally, the tutorial I've been waiting for the most. Just love it

  • @jeffschroeder4805
    @jeffschroeder4805 2 years ago +11

    I am amazed at all the effort that you must have put into these projects. Thank you so much.

    • @Core-Electronics
      @Core-Electronics  2 years ago +1

      My pleasure mate 🙂 hopefully you try some out for yourself!

  • @pileofstuff
    @pileofstuff 2 years ago +6

    That's clever.
    I didn't realize a Pi had enough grunt to do live image recognition like that.

    • @Core-Electronics
      @Core-Electronics  2 years ago +1

      Yeah, it's pretty impressive. Obviously a much beefier computer will get faster frame rates and be able to track more objects simultaneously, but for getting started in computer vision systems it's a great starting point.

    • @stinger220
      @stinger220 4 months ago

      @@Core-Electronics doesn't work.

  • @adamboden766
    @adamboden766 1 year ago +3

    This is absolutely bonkers!
    The project is incredible, and I'm blown away by the quality of the tutorial. Keep up the awesome work man!

    • @Core-Electronics
      @Core-Electronics  1 year ago

      I think so too 🙂 and cheers mate, very kind words! I will keep my head down and knock out some more.

  • @LizzyTheLizard
    @LizzyTheLizard 2 years ago +1

    this in VR would be epic

  • @asirisudarshana536
    @asirisudarshana536 2 years ago +1

    Keep it up man, you are awesome and simple

  • @TheLiquidMix
    @TheLiquidMix 2 years ago

    Awesome dude, I just got my Pi 4 from you guys today.

  • @AnthonielPinnock-ci3rd
    @AnthonielPinnock-ci3rd 8 months ago

    I'm creating an invention and I really need the hand gestures, so thanks 🙏 😊

  • @schrenk-d
    @schrenk-d 2 years ago +2

    Very nice.
    You should give this a try with a Coral TPU, which I'm pretty sure works nicely with a Pi and OpenCV.

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Using a Coral TPU definitely provides a significant boost to performance for all computer vision tasks when utilising a Raspberry Pi. Definitely something worth doing 😊

  • @adrienguidat6805
    @adrienguidat6805 2 years ago +4

    Damn, feels just like Tony Stark, awesome thx!

  • @mathkidofmemes6129
    @mathkidofmemes6129 2 years ago +3

    Will a Raspberry Pi 3 work? It seems that the Raspberry Pi 4 is in very short supply, so I was wondering if a Raspberry Pi 3 has enough power. I won't be using the High Quality Camera but the V2 camera module. I also want to connect this to an Arduino Uno, will this be possible?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      I reckon hand and finger recognition may be asking too much of a Raspberry Pi 3, but I'd love to be proven wrong. And there are lots of ways to hook up a Raspberry Pi to send information/instructions to an Arduino, so that won't be an issue; they are also both conveniently powered from 5 volts (just keep in mind the Pi's GPIO logic is 3.3 V, so a USB serial link is the easiest connection).
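      For illustration, a rough sketch of sending a hand-tracking result to an Arduino Uno over USB serial with pyserial (the port name, baud rate and message format here are assumptions, not part of the tutorial):
      import serial
      import time
      arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)  # port name varies - check with: ls /dev/tty*
      time.sleep(2)                                             # give the Uno a moment to reset after the port opens
      fingers_up = 3                                            # e.g. a value produced by the hand-tracking script
      arduino.write(f"{fingers_up}\n".encode())                 # read it on the Uno side with Serial.parseInt()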

  • @zgryx8428
    @zgryx8428 2 years ago +1

    Thank you for the wonderful project and tutorial. I just want to ask how I can access those specific joints in the hand so that I can compare my own custom hand gesture?

  • @EronWahyu
    @EronWahyu 2 years ago +1

    Hi, thank you for the wonderful project. I just want to ask what method you used for this hand gesture recognition?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Cheers mate, MediaPipe and OpenCV working together are the packages that form the backbone of this machine-learned system.
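      For anyone curious, a minimal sketch of how those two packages fit together (this uses the standard MediaPipe Hands API, not the exact tutorial script):
      import cv2
      import mediapipe as mp
      mp_hands = mp.solutions.hands
      mp_draw = mp.solutions.drawing_utils
      cap = cv2.VideoCapture(0)                      # OpenCV grabs the camera frames
      with mp_hands.Hands(max_num_hands=2) as hands:
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))  # MediaPipe expects RGB
              if results.multi_hand_landmarks:
                  for hand in results.multi_hand_landmarks:
                      mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
              cv2.imshow("Hands", frame)
              if cv2.waitKey(1) & 0xFF == ord('q'):
                  break
      cap.release()
      cv2.destroyAllWindows()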

  • @JC-le4dy
    @JC-le4dy 1 year ago +1

    So I have a plan to create software that records the motion of your hand and essentially moves a prosthetic arm the same way. Do you think this software would be able to accomplish that goal? And I'm just wondering if you think this concept could match a hand's movement well enough. It's fine if you don't know :), I'm just curious. It's for a school project far off in the future, but I'm planning in advance because engineering is my passion, so I want to succeed while having fun.

    • @Core-Electronics
      @Core-Electronics  1 year ago

      This setup is definitely pushing the Raspberry Pi very hard. However, if you customise the main scripts so the Raspberry Pi doesn't have to print each frame to the screen or overlay the wireframe on the hand, there will be enough computing power inside an RPi 4 to do your task without lag 😊 (a rough sketch of that is below).
      Mind you, I have even done that! Just remembered - come check the Where to Now section in the main guide and you can see me controlling a hand - core-electronics.com.au/guides/raspberry-pi/hand-identification-raspberry-pi/
      And a good mate Garry edited the provided code here to control a Lego hand he created - th-cam.com/video/cadqkqh0zAY/w-d-xo.html
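      As a sketch of that idea (standard MediaPipe Hands calls with the drawing and preview removed; drive_prosthetic is a hypothetical stand-in for your own hardware code):
      import cv2
      import mediapipe as mp
      cap = cv2.VideoCapture(0)
      hands = mp.solutions.hands.Hands(max_num_hands=1)
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
          if results.multi_hand_landmarks:
              wrist = results.multi_hand_landmarks[0].landmark[0]   # landmark 0 = wrist
              drive_prosthetic(wrist.x, wrist.y)                    # hypothetical - no imshow/print, so the Pi only does tracking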

    • @JC-le4dy
      @JC-le4dy 1 year ago +1

      @@Core-Electronics Thank you very much! So essentially it could very well be possible, just would take a bit of tweaking. Also, I'll definitely check that out! Seems very neat. Thank you for your time in replying!

  • @adityajadhav4768
    @adityajadhav4768 1 year ago

    Thanks for the video, but can you make a pre-installed OpenCV and TensorFlow .img file or OS for Raspberry Pi?

  • @leonoliveira8652
    @leonoliveira8652 1 year ago

    Interesting, I wonder how much better this can get with dual cameras on the Pi 5 now.

  • @ajithsb1853
    @ajithsb1853 2 years ago +2

    👌👌excellent project, thanks for sharing

  • @ryandowney1391
    @ryandowney1391 1 year ago

    Would this model be plug and play with a Coral USB Accelerator, or are there other tasks when adding the Coral USB Accelerator?

  • @chapincougars
    @chapincougars 1 year ago

    Is this technology affected by a high volume of UV light (i.e. the light generated when arc welding), or would the camera require a lens to reduce the light? Also, could it track the light against a reference point to determine how fast the light is moving linearly (2D and 3D)?

  • @elvinmirzezade7997
    @elvinmirzezade7997 11 months ago +3

    Is it possible to make this on a Raspberry Pi Zero 2 W?

    • @ltd5480
      @ltd5480 1 month ago

      hmmmm

  • @Manaick007
    @Manaick007 1 year ago +1

    Heya! I've been trying to get this setup working with my Raspberry Pi 3B, running Buster (as mentioned in your article), along with an Arducam IMX519 (it's a third-party alternative to the native Pi cameras). The only issue is that my Arducam requires a Pi running Bullseye to work. Since it's been over a year since you put this video out, I was wondering if I could replicate your setup but switch Buster for Bullseye?

    • @Core-Electronics
      @Core-Electronics  1 year ago +1

      It would definitely be sweet to update all the AI guides for the newer Bullseye version. It has a very different software architecture to Buster, so I'm hesitant to take the plunge. I'm very sure you can make that camera work with Buster OS - here is a forum post explaining how - forum.arducam.com/t/16mp-autofocus-raspbian-buster-no-camera-available/2464/6

    • @Manaick007
      @Manaick007 1 year ago +1

      @@Core-Electronics appreciate the response. Will give this a spin 👍🏾

  • @yingwaisia2707
    @yingwaisia2707 1 year ago

    So nice. Thank you so much. I love it very much, great tutorial!!! Much appreciated!

  • @alenninan5524
    @alenninan5524 7 months ago

    Hi, can this method be used in a dark room where you will be on your couch?

  • @absolutllost
    @absolutllost 1 year ago

    If someone runs into the problem that some files in mediapipe cannot be found: try downgrading your mediapipe version. Uninstall the version you tried before, then install an older version as root (the releases can be found on PyPI). That fixed the problem for me.
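    A sketch of that downgrade (package name taken from the install commands elsewhere in this thread; put an actual older release from PyPI in place of X.Y.Z):
    sudo pip3 uninstall mediapipe-rpi4
    sudo pip3 install mediapipe-rpi4==X.Y.Z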

    • @rodrigoavilaengland
      @rodrigoavilaengland 11 months ago

      Hi! And what Python version did you use? 3.9? Thonny now only supports 3.8 and up..

  • @spaceminers
    @spaceminers 1 year ago

    I have paralysis in my fingers, but I can move my wrists, elbows and shoulders. My fingers and thumbs just kind of flop around as they are always in a relaxed state. Can I use this technology to recognize custom arm gestures while sitting in a wheelchair in front of the camera? I want to be able to use both arms to navigate and control things in the Metaverse as well as mechanical devices. I would just be interested in rotational and positional information from the wrist, elbow and shoulder joints to simulate a virtual joystick, for example.

    • @Core-Electronics
      @Core-Electronics  1 year ago

      These models are pre-baked to identify certain landmarks (finger poses) and return their status. It seems like you might be able to achieve the effect you're after by mapping them into the gestures you require. It sounds like an awesome project; perhaps it's better to take it to the forum where we can share screenshots, code snippets and other helpful resources: forum.core-electronics.com.au/
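      For a rough idea of how arm-only gestures could work, here is a sketch that swaps in MediaPipe's Pose model (an assumption - the video itself uses the Hands model) and treats the wrist's offset from the shoulder as a virtual joystick:
      import cv2
      import mediapipe as mp
      mp_pose = mp.solutions.pose
      cap = cv2.VideoCapture(0)
      with mp_pose.Pose() as pose:
          while True:
              ok, frame = cap.read()
              if not ok:
                  break
              results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
              if results.pose_landmarks:
                  lm = results.pose_landmarks.landmark
                  wrist = lm[mp_pose.PoseLandmark.RIGHT_WRIST]
                  shoulder = lm[mp_pose.PoseLandmark.RIGHT_SHOULDER]
                  joy_x = wrist.x - shoulder.x   # left/right of the shoulder = joystick x axis
                  joy_y = wrist.y - shoulder.y   # above/below the shoulder = joystick y axis
                  print(round(joy_x, 2), round(joy_y, 2))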

    • @spaceminers
      @spaceminers 1 year ago

      @@Core-Electronics so you’re saying that it doesn’t really matter what position my hands are in? It will record that as a custom gesture? Because there is no way I could make a ✌🏼 or the 🖕 unfortunately LOL or even a 🤛

    • @stinger220
      @stinger220 4 months ago

      Don't get your hopes up, this doesn't work and is more than painful to install.

  • @stevewang1061
    @stevewang1061 4 months ago

    Hi, is there any updated info now that the Raspberry Pi has been upgraded? Maybe an intro on the Raspberry Pi 5?

    • @Core-Electronics
      @Core-Electronics  4 months ago

      This guide doesn't yet work with the new Bookworm OS, and unfortunately the Pi 5 right now only works on Bookworm OS. We have some updates for these vision videos in the works though!

  • @RealRaven6229
    @RealRaven6229 1 year ago +1

    How do you get it so smooth? I'm doing my own project, and when I try to track a body, I'm getting maybe one frame every few seconds. I don't mind some frame rate drop but this is unusable. I increased swap size but this didn't seem to help.

    • @Core-Electronics
      @Core-Electronics  1 year ago

      Definitely best to use a Raspberry Pi 4 Model B running absolutely nothing else except the hand tracking system. You can also lower the preview window size, which will increase your frame rate if you're using an earlier Raspberry Pi (see the sketch below).
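      A small sketch of the preview-size trick (plain OpenCV calls; the 640x480 and 320x240 values are just example numbers):
      import cv2
      cap = cv2.VideoCapture(0)
      cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)     # ask the camera for smaller frames - fewer pixels to process
      cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
      ok, frame = cap.read()
      if ok:
          small = cv2.resize(frame, (320, 240))  # or shrink frames again before showing the preview window
          cv2.imshow("Preview", small)
          cv2.waitKey(0)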

  • @RixtronixLAB
    @RixtronixLAB 2 years ago +1

    Keep it up, nice video clip, thank you for sharing it :)

  • @cx3268
    @cx3268 2 years ago +1

    Hmm, maybe sign language to text or speech might be a good application?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Absolutely, it would be an awesome project.

  • @reyroelortiz7448
    @reyroelortiz7448 1 year ago

    Hi bro, can you help us with our project? I have a few questions.

  • @liampourliampouras3274
    @liampourliampouras3274 2 years ago +1

    Congratulations on your amazing work.
    I have followed the article's instructions, but unfortunately when I try to import mediapipe in the Python script an error appears:
    Traceback (most recent call last):
    File "", line 1, in <module>
    File "/usr/local/lib/python3.7/dist-packages/mediapipe/__init__.py", line 16, in <module>
    from mediapipe.python import *
    File "/usr/local/lib/python3.7/dist-packages/mediapipe/python/__init__.py", line 17, in <module>
    from mediapipe.python._framework_bindings import resource_util
    ImportError: libImath-2_2.so.23: cannot open shared object file: No such file or directory
    Any suggestions??

    • @Core-Electronics
      @Core-Electronics  2 years ago +1

      Heyya mate, thank you very kindly. My first two thoughts are whether you used the older 'Buster' Raspberry Pi OS, and whether you skipped one of the MediaPipe installation terminal commands. Type the following lines one by one into a new terminal to try to get MediaPipe working.
      sudo pip3 install mediapipe-rpi3
      sudo pip3 install mediapipe-rpi4
      sudo pip3 install gtts
      sudo apt install mpg321
      We have a heap of successful troubleshooting that you can find in the comment section of the full written-up article - core-electronics.com.au/guides/hand-identification-raspberry-pi/

  • @RamesTheGeneric
    @RamesTheGeneric 2 years ago +1

    Will this work on the 64-bit versions of Raspbian?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      No doubt the teams at OpenCV and Raspberry Pi are working furiously to reach full compatibility with the new OS versions. That hasn't happened yet, so until then I would recommend using the previous Raspberry Pi 'Buster' OS with this guide.

  • @rasmusbryld1832
    @rasmusbryld1832 1 year ago +1

    Have you guys had any problems installing mediapipe? I get an error when importing mediapipe in Python.

    • @Core-Electronics
      @Core-Electronics  1 year ago

      Come check the comments at the bottom of the full write-up. There's a solution for you down there in the forum section 😊

  • @ranidusoysa8789
    @ranidusoysa8789 2 years ago +1

    Hello, does this work without connecting to a PC?
    Is it possible to remove the connection to the PC and have it detect hands after coding?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Hey mate, this system is completely running on the Raspberry Pi 4 Model B. You can definitely run this system Headless (without a Monitor/Display) and still control hardware through hand signals.
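      A rough sketch of that headless idea (gpiozero and output pin 17 are assumptions for whatever hardware you attach; note there is no preview window at all):
      import cv2
      import mediapipe as mp
      from gpiozero import LED
      led = LED(17)                                      # example output pin - swap for your own hardware
      hands = mp.solutions.hands.Hands(max_num_hands=1)
      cap = cv2.VideoCapture(0)
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
          if results.multi_hand_landmarks:
              led.on()                                   # hand in view = output on
          else:
              led.off()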

  • @hibach9140
    @hibach9140 2 years ago +1

    Can you give us the source code for a virtual mouse using the Raspberry Pi 4 and the Raspberry Pi Camera V2? Thank you.

    • @Core-Electronics
      @Core-Electronics  2 years ago +1

      All the scripts used here can be downloaded from the bottom of the article page.
      I have yet to create a virtual mouse via computer vision using the Raspberry Pi (if I do I will tell you), but a video worth checking out on building this kind of system from first principles is this - th-cam.com/video/iBwMi9iDZmQ/w-d-xo.html

    • @hibach9140
      @hibach9140 2 years ago +1

      @@Core-Electronics thank you ☺

  • @reyroelortiz7448
    @reyroelortiz7448 2 years ago +1

    Can we use a Raspberry Pi 3?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      For this task I would recommend using a Raspberry Pi 3B+ or better. Otherwise the FPS is too slow for snappy responses.

  • @GGhost-q9w
    @GGhost-q9w 1 year ago +1

    Can we write this data to a .bvh file and import it into Autodesk Maya?

    • @Core-Electronics
      @Core-Electronics  1 year ago +1

      For sure, all the Python scripts are open source and you really can do anything software-wise with programming 🙂

    • @GGhost-q9w
      @GGhost-q9w 1 year ago +1

      @@Core-Electronics if possible, please make a video about it.

  • @aakashkoneru2001
    @aakashkoneru2001 1 year ago +1

    Hey, I have been trying to use hand gesture recognition as the base for my medical application, which is for patients to communicate with nurses or doctors, and I am trying to use this setup, but I am stuck. Can you please help?
    I am using a Raspberry Pi 3 Model B.

    • @Core-Electronics
      @Core-Electronics  1 year ago

      Love your project idea! And absolutely I can help. The best place to ask questions is at our forum here - forum.core-electronics.com.au/ - pop through some images of the hardware and screen grabs of any software issues and we'll sort it!

  • @Redbeef
    @Redbeef 2 years ago +1

    Hi Tim, how can I add thumb tracking to the Are Fingers Up or Down.py code? I've added the thumb ID (4) to the list, but I'm not sure what I need to adjust afterwards. Can you please assist? Thanks in advance!

    • @Core-Electronics
      @Core-Electronics  2 years ago +1

      Hey mate, you definitely can add location tracking of certain joints to | Are Fingers Up or Down.py |. If you have downloaded the code from the article, open up the | Simple-Hand-Tracker.py | script. In it you will see a commented-out section, which is the code I used to find the location of the index finger in the video.
      Copy that section across to your desired script and replace index finger ID number 8 with thumb ID number 4. Hope that helps!
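      For anyone who hasn't downloaded the scripts yet, the idea looks roughly like this (a fragment that slots inside a detection loop where results and frame already exist; the landmark numbering is MediaPipe's standard, not the tutorial's exact code):
      THUMB_TIP = 4                                              # 8 would be the index fingertip
      if results.multi_hand_landmarks:
          lm = results.multi_hand_landmarks[0].landmark[THUMB_TIP]
          h, w, _ = frame.shape
          print("Thumb tip at pixel", int(lm.x * w), int(lm.y * h))   # lm.x / lm.y are normalised 0-1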

  • @MattyEngland
    @MattyEngland 2 years ago +1

    Identify the joint in my hand?.... Lemon Kush

    • @Core-Electronics
      @Core-Electronics  2 years ago +2

      Our technology isn't quite there yet 😅

  • @mohammedasal2767
    @mohammedasal2767 15 days ago

    Hey, I have an 8x8 LED 5.5V MAX7219 matrix, and when I follow the guide for the GlowBit it doesn't work. Do you have a way I could use the 8x8 LED 5.5V MAX7219 matrix with the GlowBit code?

    • @Core-Electronics
      @Core-Electronics  13 days ago

      I'm sorry to say that isn't really possible. They are completely different technologies with different drivers. Our GlowBit library is designed to work with WS2812B LEDs.

  • @roracle
    @roracle 1 year ago

    How hard would this be to implement in GNOME to control the desktop? I have lots of ideas, but unfortunately I'm a visionary, not a programmer.

    • @madorart
      @madorart 8 months ago

      I'm doing it on Windows; the biggest problem is that I want the app to let people change gestures.

    • @roracle
      @roracle 8 months ago

      @@madorart Maybe have an options GUI, make it simple, add a number of different gestures, and maybe for each hand as well.
      If you could do this in GNOME, then you could get that "Minority Report" interface everyone's always referencing.

    • @madorart
      @madorart 8 months ago +1

      @@roracle Yeah, that's what I'm doing, using Flet for the GUI, and 16 gestures (all except the mouse and sound gestures are interchangeable using specific parameters such as hand orientation). In the future maybe I'll add more. My main focus right now is Windows, but maybe when it's finished I can try it on GNOME. It should work on Linux because the os library is compatible, but I might have to make some changes. If you'd like, I can let you know when it's finished.

    • @roracle
      @roracle 8 months ago

      @@madorart do you have a GitHub project page?

  • @samuelmarshall100
    @samuelmarshall100 2 years ago +1

    How would I stop the camera from lagging?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      If you want an instant speed boost using this hardware, consider checking out the Coral USB Accelerator.
      This video shows a nice comparison between using it and not using it for different machine-learned computer vision systems on a Raspberry Pi 4 Model B. Around the 6:30 mark is where you'd want to look. th-cam.com/video/7gWCekMy1mw/w-d-xo.html

    • @samuelmarshall100
      @samuelmarshall100 2 years ago +1

      @@Core-Electronics Thanks, mate

    • @samuelmarshall100
      @samuelmarshall100 2 years ago +1

      @@Core-Electronics It seems there's still a chip shortage and the Coral USB Accelerator is out of stock, and some people are selling them at a high premium ☹️

    • @samuelmarshall100
      @samuelmarshall100 2 years ago +1

      ​@@Core-Electronics I have looked at the Intel® Neural Compute Stick 2, would this be a good option?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Hey mate, don't know how good the Raspberry Pi support is for this but it is definitely in the right ballpark.

  • @indianaiscience3670
    @indianaiscience3670 2 years ago +1

    Sir, how do I get audio out of a Raspberry Pi 4... Please help me.

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Come create a forum topic if you need some expert help 😊 - forum.core-electronics.com.au/

  • @rho35100
    @rho35100 2 years ago +1

    Faaaaantastic!!

  • @w4tchtheDAWG
    @w4tchtheDAWG 2 years ago +1

    The download scripts are not available, can you send the link please?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Code should be available at the bottom of the article or in the comment section. If you can't see it, pop me a reply and we'll figure out what's happening.

  • @jaysoni9568
    @jaysoni9568 8 months ago +4

    Is anyone able to implement it in 2024?

    • @Maisonier
      @Maisonier 8 months ago +1

      +1, I also want to control it with voice and an assistant like ChatGPT.

    • @madorart
      @madorart 8 months ago +1

      I'm making an app that will allow people to control things like sound, mouse, drag, stopping video, etc. with gestures that you can swap for others that suit you more. It's my final-year project (TFG).

    • @ruban92
      @ruban92 8 months ago

      @@madorart How can I contact you? I need your help for my college project.

    • @stinger220
      @stinger220 4 months ago

      Didn't work even though I did everything correctly.

    • @SkylinesProductions
      @SkylinesProductions 20 days ago

      I got it working

  • @sightellaidglass4651
    @sightellaidglass4651 2 years ago +1

    Hi Tim, how can I fix this error?
    from mediapipe.python._framework_bindings import resource_util
    ModuleNotFoundError: No module named 'mediapipe.python._framework_bindings'

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Seems to me one of the hand tracking packages hasn't installed correctly. Are you using the older 'Buster' Raspberry Pi OS version? Doing so is important until all the packages are updated for the newer 'Bullseye' Raspberry Pi OS version.
      Come check the bottom of the full article, there's lots of troubleshooting to be found there. Also, if you need more troubleshooting help, pop a message over there as I can help you better there 😊 we'll get your system working.

    • @sedatdoganay4938
      @sedatdoganay4938 2 years ago

      @@Core-Electronics MediaPipe does not work on Raspbian Bullseye; it works with Raspbian Buster.

  • @dilaraburan1776
    @dilaraburan1776 2 years ago +1

    Hey mate, I ran the script yesterday and it worked well. Today nothing has changed; the window pops up but won't track my hand... it's also not showing any errors. I already reconnected the camera, updated my system, downloaded the script again and opened it again, but it won't work. Anybody have some ideas?

    • @Core-Electronics
      @Core-Electronics  2 years ago

      Double check for me that you are running the older 'Buster' Raspberry Pi OS. There is also a great resource of successful troubleshooting on our Core Electronics Forum in relation to OpenCV/machine-learned systems. Come write a post there if you keep running into issues and we'll work it out there 😊

    • @SwagatKumarIndia
      @SwagatKumarIndia 1 year ago

      Same with me. The code runs without any error but it does not track my hand. Running on Buster, Python 3.7.3, OpenCV 4.5.5.

  • @samuelmarshall100
    @samuelmarshall100 2 years ago +1

    For some reason when I run any of the scripts they don't seem to work properly; they just have a purple filter over the camera and nothing else. I also get this error code: [ WARN:0@4.204] global /home/pi/opencv/modules/videoio/src/cap_gstreamer.cpp (1405) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
    INFO: Created TensorFlow Lite XNNPACK delegate for CPU.

    • @Core-Electronics
      @Core-Electronics  2 years ago +1

      Hey mate, just to start with the simple stuff first, double check you have the camera enabled in the Raspberry Pi Configuration and that the camera is connected correctly to the CSI port. If you keep running into issues, pop me a message in the comment section of the article as I will be able to help you more easily there 🙂

    • @SwagatKumarIndia
      @SwagatKumarIndia 1 year ago

      The GStreamer warning will go away if you use the line:
      cap = cv2.VideoCapture(0, cv2.CAP_V4L2)

  • @사문이-k6h
    @사문이-k6h 5 months ago

    When I set up OpenCV, there is an error. Should I continue the process with the error?

    • @Core-Electronics
      @Core-Electronics  5 months ago +1

      Hey, if you are still having issues we have a forum topic specifically on this video that might have some helpful information. If you are having a unique issue, feel free to post in there, we have lots of maker eyes over there that can help!
      forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705/58

    • @사문이-k6h
      @사문이-k6h 5 months ago

      @@Core-Electronics Thank you so much! Your video is very helpful to Raspberry Pi users!

  • @rajasekhar5982
    @rajasekhar5982 1 year ago

    How can I learn Python like you?

    • @madorart
      @madorart 8 months ago

      W3Schools is a good beginner option.

  • @mohammedasal2767
    @mohammedasal2767 16 days ago

    Hey, I'm having a problem where, because the code is old, it doesn't work with the 3.7 version of Python in Thonny. Could you possibly make an updated version of the code, or give me a way to downgrade the version of Python I have?

    • @Core-Electronics
      @Core-Electronics  15 days ago

      Hey, we're currently updating our Raspberry Pi Computer Vision guides to be compatible with Raspberry Pi 5 and make them a little more future-proof against hardware and OS updates. We should have one on hand tracking and gesture control soon!

  • @igorgasparik4108
    @igorgasparik4108 3 months ago

    Hello, can you advise me please? I have done all the steps according to the tutorial, but when I run the program I get an error: TypeError: Descriptors cannot not be created directly.
    If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
    If you cannot immediately regenerate your protos, some other possible workarounds are:
    1. Downgrade the protobuf package to 3.20.x or lower.
    2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

    • @Core-Electronics
      @Core-Electronics  3 months ago +1

      If you aren't already, ensure that you are using Buster OS; this guide only works on that version. We also have a dedicated forum page for this topic with lots of maker eyes over there that can help, so feel free to post there with your code and setup!
      forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705/3
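      For reference, a sketch of the first workaround listed in that error message (the exact 3.20.x release to pin is up to you; pick one from PyPI):
      sudo pip3 install "protobuf==3.20.*"
      Or the second workaround, set before launching the script:
      export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python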

  • @spacetechnology9718
    @spacetechnology9718 4 months ago

    I need help with this step on the website: "If it fails at any point and you receive a message like | make: *** [Makefile:163: all] Error 2 | just re-type and enter the above line | make -j $(nproc) |". The same thing happens for me; I retyped that command repeatedly, but the same error comes up again and again 😢
    Please reply ASAP.

    • @Core-Electronics
      @Core-Electronics  4 months ago +1

      Sorry to hear you are having issues. We have a dedicated community forum post that might have some helpful information; if not, feel free to chuck a post there with your setup and problem, we have lots of maker eyes over there that can help!
      forum.core-electronics.com.au/t/hand-recognition-and-finger-identification-with-raspberry-pi-and-opencv/12705

    • @spacetechnology9718
      @spacetechnology9718 4 months ago

      @@Core-Electronics I have already posted there as well (58th post).

  • @carlinelectronic7692
    @carlinelectronic7692 2 years ago +1

    Awesome ❤️

  • @JenniferEliseAtchiso
    @JenniferEliseAtchiso 1 year ago

    A bit of trivia for you from an American Sign Language interpreter… The gesture you and many others use as 'Rock & Roll' really means 'Bullshit!' in American Sign Language. I laugh every time I see it being used.

  • @bridgetclinch3678
    @bridgetclinch3678 2 years ago +1

    Made the mistake of updating my WIP robot to Bullseye, so now I can move it around but can't use its camera in Python, doh.

    • @Core-Electronics
      @Core-Electronics  2 years ago

      I made a quick guide on how to deal with the new camera terminal commands for 'Bullseye' OS. It might come in handy for you - core-electronics.com.au/tutorials/raspberry-pi-bullseye-camera-commands.html

    • @bridgetclinch3678
      @bridgetclinch3678 2 years ago +1

      @@Core-Electronics Yeah, saw that, had a play; still waiting on Python libraries. Might format a new card and go back to Buster. I need to play with OpenCV too, as I have the Pi cam on the robot frame pointing forward and a USB cam on a servo pan-tilt for looking around. Still such a coding n00b.

    • @Core-Electronics
      @Core-Electronics  2 years ago

      I love the sound of that project! Excited to see what you come up with 😊 come make a post on our forum to show it off/if you need any help forum.core-electronics.com.au/

  • @rishabhkumar7405
    @rishabhkumar7405 7 months ago

    Is it possible to run this on an RPi Zero 2 W??

  • @shahedhaqqani9584
    @shahedhaqqani9584 1 month ago

    Hi, is this possible with a Raspberry Pi 3 B+?

    • @Core-Electronics
      @Core-Electronics  1 month ago

      This setup will work with a Pi 3B+, but it may be a very slow and laggy experience :(