Inference with SAHI (Slicing Aided Hyper Inference) using Ultralytics YOLOv8 | Episode 60

  • Published Oct 17, 2024

Comments • 32

  • @Sasha-n2x
    @Sasha-n2x 2 hours ago

    Wow, Episode 60 already! Given the technical innovations we're seeing with SAHI and tiled inference, what do you think are the potential environmental implications of this advanced AI application? Can it, for instance, help monitor wildlife in protected areas without too much footprint? 🌍 #AIForGood #SustainableTech

  • @m033372
    @m033372 3 months ago +1

    Awesome summary of SAHI with YOLO!

    • @Ultralytics
      @Ultralytics  3 months ago

      Thank you for the kind words! 😊 We're thrilled to hear you enjoyed the summary of SAHI with YOLOv8. If you have any questions or need further details, feel free to ask. For more in-depth information, you can check out our documentation docs.ultralytics.com/guides/sahi-tiled-inference/. Happy detecting! 🚀

  • @Smitthy-k9d
    @Smitthy-k9d 2 months ago

    Loving this SAHI breakdown! Quick question: how does SAHI compare in performance and efficiency with standard YOLOv8 inference for larger, high-res images? Anyone tested both side by side? 🧐

    • @Ultralytics
      @Ultralytics  2 months ago

      Great question! SAHI improves detection on large, high-res images by slicing them into smaller, manageable parts. This keeps peak memory usage low and preserves small-object detail that standard YOLOv8 inference can lose when a huge image is downscaled to the model's input size, though the slicing and stitching steps do add some processing overhead. For a detailed comparison, check out our guide: SAHI Tiled Inference docs.ultralytics.com/guides/sahi-tiled-inference/. 🖥️✨
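      To make the slicing idea concrete, here is a small, self-contained sketch of how an image can be divided into overlapping tiles. This is an illustration of the general technique, not SAHI's actual implementation; the function name and defaults are made up for the example:

```python
def slice_windows(img_w, img_h, slice_w=256, slice_h=256,
                  overlap_w_ratio=0.2, overlap_h_ratio=0.2):
    """Compute (x0, y0, x1, y1) tile windows covering the whole image."""
    step_w = int(slice_w * (1 - overlap_w_ratio))  # horizontal stride
    step_h = int(slice_h * (1 - overlap_h_ratio))  # vertical stride
    windows = []
    y = 0
    while True:
        y1 = min(y + slice_h, img_h)
        x = 0
        while True:
            x1 = min(x + slice_w, img_w)
            windows.append((x, y, x1, y1))
            if x1 >= img_w:
                break
            x += step_w
        if y1 >= img_h:
            break
        y += step_h
    return windows

# A 1000x1000 image with 256 px tiles and 20% overlap yields a 5x5 grid.
tiles = slice_windows(1000, 1000)
print(len(tiles))  # 25
```

      Each window would then be run through the detector independently, and the per-tile detections merged back into full-image coordinates.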

  • @LunaStargazer-v1s
    @LunaStargazer-v1s 3 months ago

    Your video sheds light on the fascinating world of SAHI and YOLOv8 with such clarity! I’m curious, what are the potential limitations or challenges one might face when implementing SAHI tiled inference in real-world applications, like autonomous driving or medical imaging? Do you foresee any controversial ethical implications arising from such advanced AI technologies, especially in terms of privacy or job displacement?

    • @Ultralytics
      @Ultralytics  3 months ago

      Thank you for your thoughtful comment! 😊 Implementing SAHI tiled inference in real-world applications can indeed present some challenges. For instance, in autonomous driving, the need for real-time processing might be hindered by the computational overhead of slicing and stitching images. In medical imaging, ensuring the accuracy and reliability of detections across slices is crucial to avoid misdiagnoses.
      Regarding ethical implications, privacy concerns are significant, especially when dealing with sensitive data like medical records or surveillance footage. Additionally, the potential for job displacement due to automation is a valid concern, necessitating a balanced approach to integrating AI technologies responsibly.
      For more on SAHI tiled inference, check out our detailed guide: SAHI Tiled Inference docs.ultralytics.com/guides/sahi-tiled-inference/.

  • @TheodoreBC
    @TheodoreBC 2 days ago

    So, bro, is SAHI just slicing the trail mix finer, or could this change how fast you can identify that grizzly behind the tree?

    • @Ultralytics
      @Ultralytics  2 days ago

      Hey! SAHI (Slicing Aided Hyper Inference) helps improve detection in large images by slicing them into smaller tiles. This can enhance accuracy and speed, especially for spotting hidden objects like that grizzly! 🐻 Check out more here: docs.ultralytics.com/guides/sahi-tiled-inference/

  • @Melo7ia
    @Melo7ia 2 months ago

    This is absolute fire! 🎸 Can you delve deeper into how SAHI handles overlapping regions in tiled inference? Just wondering if there might be any performance trade-offs. Rock on, Ultralytics team!

    • @Ultralytics
      @Ultralytics  2 months ago

      Thanks for the love! 🎸 SAHI handles overlapping regions by using smart algorithms to merge overlapping detection boxes during the stitching process. This ensures high detection accuracy without significant performance trade-offs. For more details, check out our guide: SAHI Tiled Inference docs.ultralytics.com/guides/sahi-tiled-inference/. Rock on! 🤘
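      As a rough illustration of that merging step, here is a simplified greedy non-maximum suppression in plain Python. It is a sketch of the general idea, not SAHI's exact postprocessing (SAHI offers several configurable merge strategies):

```python
def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def merge_detections(dets, iou_thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop overlapping duplicates."""
    dets = sorted(dets, key=lambda d: d[1], reverse=True)  # (box, score) pairs
    kept = []
    for box, score in dets:
        if all(iou(box, k[0]) < iou_thresh for k in kept):
            kept.append((box, score))
    return kept

# Two tiles see the same object near their shared boundary:
dets = [((100, 100, 200, 200), 0.9),   # from tile A
        ((105, 102, 198, 205), 0.8),   # same object seen by tile B
        ((400, 400, 450, 450), 0.7)]   # a distinct object
print(len(merge_detections(dets)))  # 2
```

      The duplicate from the neighboring tile overlaps heavily with the higher-scoring box, so only one detection per object survives the stitch.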

  • @mangaenfrancais934
    @mangaenfrancais934 23 days ago

    Hello, thanks for the video! Does it work with YOLOv8 segmentation?

    • @Ultralytics
      @Ultralytics  23 days ago +1

      Yes, SAHI works with YOLOv8 segmentation. You can find more details in the SAHI Tiled Inference Guide docs.ultralytics.com/guides/sahi-tiled-inference/. 😊

  • @AxelRyder-q1b
    @AxelRyder-q1b 1 month ago

    Heyo!!! Killer content as always! 🎬🔥 Quick q - How does SAHI perform vs. other inference methods when dealing with high-res aerial footage or large terrain images? Any constraints we should know 'bout? Can't wait to mess with this! 🚁📸

    • @Ultralytics
      @Ultralytics  1 month ago

      Hey there! Thanks for the love! 😊
      SAHI shines with high-res aerial footage and large terrain images by slicing them into smaller, manageable pieces, optimizing memory usage, and maintaining high detection accuracy. This makes it ideal for resource-constrained environments.
      Constraints to keep in mind:
      1. Overlap Configuration: Proper overlap settings are crucial to ensure no objects are missed at slice boundaries.
      2. Processing Time: While SAHI reduces memory load, it might increase processing time due to the slicing and stitching process.
      For more details, check out our guide: SAHI Tiled Inference docs.ultralytics.com/guides/sahi-tiled-inference/. Enjoy experimenting! 🚀
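      To get a feel for the second constraint, here is a rough, illustrative back-of-the-envelope calculation (not SAHI internals) of how the number of tiles, and hence the number of model calls, grows with the overlap setting:

```python
import math

def tile_count(img_size, slice_size=256, overlap=0.2):
    """Approximate number of tiles for a square image with a given overlap ratio."""
    step = slice_size * (1 - overlap)               # stride between tile origins
    per_axis = math.ceil((img_size - slice_size) / step) + 1
    return per_axis ** 2

# A 4000 px aerial image with 256 px slices:
print(tile_count(4000, overlap=0.0))  # 256 tiles
print(tile_count(4000, overlap=0.2))  # 400 tiles
```

      Going from 0% to 20% overlap raises the tile count from 256 to 400 here, roughly 56% more inference calls, which is the price paid for not missing objects at slice boundaries.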

  • @adelali3612
    @adelali3612 3 months ago +1

    I remember it was not working with some versions of YOLO; is it fixed?

    • @adelali3612
      @adelali3612 3 months ago

      I think it happened when I installed the latest version of SAHI.

    • @Ultralytics
      @Ultralytics  3 months ago +1

      Hi there! 😊 Thanks for your comment. To help you better, could you please specify which versions of YOLO and SAHI you were using when you encountered the issue? Also, make sure you're using the latest versions of `torch` and `ultralytics`. You can find more details in our documentation docs.ultralytics.com. If you still face issues, feel free to share more details! 🚀

  • @ShrirangKanade
    @ShrirangKanade 1 month ago

    Is it good for pupil detection?

    • @Ultralytics
      @Ultralytics  1 month ago

      Absolutely! YOLOv8 can be adapted for pupil detection with the right dataset and training. For more details on training custom models, check out our guide: docs.ultralytics.com/guides/model-training-tips/. If you have any specific questions, feel free to ask! 😊

  • @umeshshrestha9650
    @umeshshrestha9650 1 month ago

    Can you perform this with YOLOv9?

    • @Ultralytics
      @Ultralytics  1 month ago

      Absolutely! YOLOv9 is designed for high-performance object detection, offering significant improvements in efficiency and accuracy. You can train, validate, predict, and export YOLOv9 models using both Python and CLI commands. For more details, check out the YOLOv9 documentation docs.ultralytics.com/models/yolov9/. 🚀

    • @umeshshrestha9650
      @umeshshrestha9650 1 month ago

      @@Ultralytics Can you make a video about it?

    • @Ultralytics
      @Ultralytics  1 month ago

      Thanks for the suggestion! While we can't take specific requests for video content, we appreciate your feedback and will consider it for future content. Stay tuned to our channel for updates! 😊

  • @sergiyk1974
    @sergiyk1974 2 months ago +1

    `from sahi.predict import predict`

    • @Ultralytics
      @Ultralytics  2 months ago +1

      For using SAHI with YOLOv8-OBB, you can use the `get_sliced_prediction` method, which supports oriented bounding boxes. Here's a quick example:
      ```python
      from sahi.predict import get_sliced_prediction

      result = get_sliced_prediction(
          "path/to/your/image.jpeg",
          detection_model,
          slice_height=256,
          slice_width=256,
          overlap_height_ratio=0.2,
          overlap_width_ratio=0.2,
          perform_obb=True,  # Enable OBB
      )
      ```
      For more details, check out our guide on SAHI tiled inference: docs.ultralytics.com/guides/sahi-tiled-inference/

    • @sergiyk1974
      @sergiyk1974 2 months ago

      @@Ultralytics The `perform_obb` parameter is not recognized by `get_sliced_prediction`:
      ```
      result = get_sliced_prediction(
      ^^^^^^^^^^^^^^^^^^^^^^
      TypeError: get_sliced_prediction() got an unexpected keyword argument 'perform_obb'
      ```
      I have sahi 0.11.18.

    • @Ultralytics
      @Ultralytics  2 months ago

      It looks like the `perform_obb` parameter isn't recognized in your current SAHI version. Please ensure you have the latest versions of both `ultralytics` and `sahi`. You can update them using:
      ```bash
      pip install -U ultralytics sahi
      ```
      If the issue persists, please provide more details about the error or the specific use case. For further guidance, refer to our SAHI tiled inference documentation: docs.ultralytics.com/guides/sahi-tiled-inference/

    • @sergiyk1974
      @sergiyk1974 2 months ago

      @@Ultralytics I upgraded to latest ultralytics and sahi, but still getting the same error. Here are the versions I have:
      sahi 0.11.18
      ultralytics 8.2.75

    • @Ultralytics
      @Ultralytics  2 months ago

      Thanks for the details! It seems like the `perform_obb` parameter might not be supported in the current version of SAHI. Instead, you can manually handle the OBB predictions by processing the slices and then applying the OBB logic.
      Here's a workaround:
      1. Perform sliced inference without the `perform_obb` parameter.
      2. Post-process the results to handle OBB.
      For detailed steps, please refer to our SAHI tiled inference guide: docs.ultralytics.com/guides/sahi-tiled-inference/
      If you continue to face issues, please share more specifics about your use case, and we'll do our best to assist you!
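      The coordinate-shifting part of step 2 can be sketched as follows. This is a minimal illustration, assuming each oriented box comes back as a list of (x, y) corner points in tile coordinates; `obb_to_global` is a hypothetical helper, not part of the SAHI or Ultralytics APIs:

```python
def obb_to_global(tile_origin, polygons):
    """Shift per-tile oriented-box corner points into full-image coordinates."""
    ox, oy = tile_origin  # top-left of the tile within the full image
    return [[(x + ox, y + oy) for (x, y) in poly] for poly in polygons]

# An OBB predicted inside a tile whose top-left corner sits at (512, 256):
tile_origin = (512, 256)
local_obb = [[(10, 20), (60, 15), (65, 70), (15, 75)]]
print(obb_to_global(tile_origin, local_obb))
# [[(522, 276), (572, 271), (577, 326), (527, 331)]]
```

      After shifting every tile's boxes into the shared frame like this, a rotation-aware duplicate-removal pass over the combined list completes the stitch.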