Detecting Human Body Poses | Core ML | Deep Learning | Visualize the Detected Poses | PoseNet - iOS, Apple

  • Published on Feb 8, 2025
  • This video tutorial is about Detecting Human Body Poses | Core ML | Deep Learning | Visualize the Detected Poses, PoseNet in iOS, Apple.
    This project uses the latest versions as of 2021. If this video helped you, consider subscribing to this channel for more tutorials.
    👨‍💻 NOTE:- This video is meant for beginners to learn the basic concepts of pose detection using a Core ML model. If I got something wrong, please point it out in the comments so it can be clarified.
    👉 Download the sample code:- developer.appl...
    Tools: Xcode 12.5
    Apple Products.
    Language:- Swift 5
    👉 Let's connect via LinkedIn - / mohammed-azeem-azeez-5...
    👉 Let's connect via Twitter - / azeemohd786
    ☕ If you are satisfied with the video, "BUY ME A CUP OF COFFEE" via PayPal - www.paypal.me/...

Comments • 12

  • @nagarajan4743
    @nagarajan4743 3 years ago +1

    Thank you! How can we use pose detection in real time in our app? Could you please give an example scenario?

    • @swiftkatcodefactory
      @swiftkatcodefactory  3 years ago

      The video I published is about real-time tracking, right?

    • @nagarajan4743
      @nagarajan4743 3 years ago

      @@swiftkatcodefactory OK, yes. What is the use of detecting a pose?

    • @swiftkatcodefactory
      @swiftkatcodefactory  3 years ago

      @@nagarajan4743 Human Pose Estimation (HPE) is a way of identifying and classifying the joints in the human body. Essentially, it is a way to capture a set of coordinates for each joint (arm, head, torso, etc.), known as a key point, that together describe a person's pose. The connection between two of these points is known as a pair. To learn more, refer to viso.ai/deep-learning/pose-estimation-ultimate-overview/
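
      A minimal Swift sketch of reading those key points, assuming Vision's VNDetectHumanBodyPoseRequest (iOS 14+) rather than the exact PoseNet Core ML pipeline from the video; the 0.3 confidence cutoff is illustrative:

          import Vision
          import CoreGraphics

          // Run a body-pose request on an image and print every detected
          // joint (key point) with its normalized location and confidence.
          func printBodyPose(in cgImage: CGImage) throws {
              let request = VNDetectHumanBodyPoseRequest()
              let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
              try handler.perform([request])

              guard let observation = request.results?.first as? VNHumanBodyPoseObservation else { return }

              // Dictionary of joint name -> VNRecognizedPoint, covering all joint groups.
              let joints = try observation.recognizedPoints(.all)
              for (name, point) in joints where point.confidence > 0.3 {
                  // location is normalized to 0...1 with the origin at the lower left.
                  print(name.rawValue, point.location.x, point.location.y, point.confidence)
              }
          }

      For real-time tracking, the same request can be run on each camera frame by creating the VNImageRequestHandler from the frame's CVPixelBuffer instead of a CGImage.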

  • @paenoi3701
    @paenoi3701 1 year ago +1

    Hello, how can I check the confidence score and the x, y coordinates?

    • @swiftkatcodefactory
      @swiftkatcodefactory  1 year ago

      For the x and y positions, refer to the VNPoint initializer: developer.apple.com/documentation/vision/vnpoint/3548330-init. For the confidence score, apply Vision algorithms; refer to: developer.apple.com/documentation/vision/recognizing_objects_in_live_capture
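
      As a hedged sketch (assuming a Vision VNHumanBodyPoseObservation as in the sketch above; the .rightWrist joint and the 0.3 cutoff are only examples), the confidence and the normalized x/y can be read straight off the recognized point, and VNImagePointForNormalizedPoint converts them into pixel coordinates:

          import Vision
          import CoreGraphics

          // Read one joint's confidence and convert its normalized x/y into pixels.
          func rightWristPixelPoint(in observation: VNHumanBodyPoseObservation,
                                    imageWidth: Int, imageHeight: Int) throws -> CGPoint? {
              let wrist = try observation.recognizedPoint(.rightWrist)
              print("confidence:", wrist.confidence)          // per-joint score in 0...1
              guard wrist.confidence > 0.3 else { return nil }

              // location is normalized (origin at the lower left); map it to image pixels.
              return VNImagePointForNormalizedPoint(wrist.location, imageWidth, imageHeight)
          }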

  • @darranshivdat4786
    @darranshivdat4786 1 year ago

    How could I find the angles between the joints?

    • @swiftkatcodefactory
      @swiftkatcodefactory  1 year ago

      Refer to the documentation to learn more about specific joint axis connections: developer.apple.com/documentation/scenekit/scnphysicssliderjoint
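
      As a hedged alternative sketch (not from the video): once the joint key points are available, the interior angle at a middle joint such as the elbow can be computed directly with atan2; the CGPoint parameters stand in for the normalized joint locations from the pose observation:

          import Foundation
          import CoreGraphics

          // Interior angle in degrees (0...180) at `joint`, formed by the segments
          // joint->a and joint->b, e.g. the elbow angle from shoulder/elbow/wrist.
          func angle(at joint: CGPoint, between a: CGPoint, and b: CGPoint) -> Double {
              let angle1 = atan2(Double(a.y - joint.y), Double(a.x - joint.x))
              let angle2 = atan2(Double(b.y - joint.y), Double(b.x - joint.x))
              var degrees = (angle2 - angle1) * 180 / .pi
              if degrees < 0 { degrees += 360 }
              return degrees > 180 ? 360 - degrees : degrees
          }

          // Example: angle(at: elbow.location, between: shoulder.location, and: wrist.location)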

  • @mustafamiromerkalkan6584
    @mustafamiromerkalkan6584 2 years ago

    Hello, how can we detect positions like push-ups, etc., by using PoseNet?

    • @swiftkatcodefactory
      @swiftkatcodefactory  2 years ago

      It can be used the same way as in this video. For push-ups, change the camera position to a top view.
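
      A hedged sketch of one possible approach (not shown in the video): track the elbow angle on every frame, for example with the angle helper sketched above, and count a repetition when it crosses a "down" threshold and then an "up" threshold; the 90 and 160 degree values are illustrative:

          // Counts push-up repetitions from a per-frame elbow angle.
          struct PushUpCounter {
              private var isDown = false
              private(set) var reps = 0

              mutating func update(elbowAngle: Double) {
                  if elbowAngle < 90 {                    // arms bent: bottom of the push-up
                      isDown = true
                  } else if elbowAngle > 160, isDown {    // arms extended again: one full rep
                      isDown = false
                      reps += 1
                  }
              }
          }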

  • @slacker1014
    @slacker1014 3 years ago

    Is there any way to access specific joints/body parts?
    E.g., whenever the person raises their arm, the app will detect it.

    • @swiftkatcodefactory
      @swiftkatcodefactory  2 years ago

      Yes, there are ways to customise which joints/body parts you track.
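
      For example, a hedged sketch (again assuming Vision's VNHumanBodyPoseObservation; the joint names and the 0.3 cutoff are illustrative): compare the wrist key point with the shoulder key point. Vision's normalized coordinates put the origin at the lower left, so a larger y means higher in the frame:

          import Vision

          // True when the right wrist is detected above the right shoulder.
          func isRightArmRaised(_ observation: VNHumanBodyPoseObservation) -> Bool {
              guard
                  let wrist = try? observation.recognizedPoint(.rightWrist),
                  let shoulder = try? observation.recognizedPoint(.rightShoulder),
                  wrist.confidence > 0.3, shoulder.confidence > 0.3
              else { return false }

              return wrist.location.y > shoulder.location.y
          }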