Stable Diffusion on Low VRAM GPUs - Token Merging, Xformers and other Methods

  • Published Oct 3, 2024
  • How do you get Stable Diffusion to run on systems with less than 8GB of VRAM? What problems are there with Xformers? Why does token merging need careful handling?
    There are several methods reputed to let low-VRAM systems succeed with Automatic1111, but what are their benefits and drawbacks? This video is your complete guide.
    Course Discounts
    BEGINNER'S Stable Diffusion COMFYUI and SDXL Guide
    bit.ly/GENSTART - USE CODE GENSTART
    ADVANCED Stable Diffusion COMFYUI and SDXL
    bit.ly/RESTAD - USE CODE RESTAD
    Installation Guide and Stable Diffusion playlist
    bit.ly/SDPlay
    01:38 LOWVRAM and MEDVRAM
    03:50 Xformers and Token Merging
    05:40 Results Testing Xformers and Token Merging Individually
    10:01 Activating Token Merging
    10:42 Memory and Speed Performance comparison
    Join 🏆 this channel to get access to exclusive content and perks:
    bit.ly/XOVjoin
    Get Channel Merch ☕
    bit.ly/XOVstore
    #AIart
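
The switches covered in the video are passed to Automatic1111 through the COMMANDLINE_ARGS variable in the launcher script. A minimal sketch (pick --medvram or --lowvram depending on your card, not both; --xformers requires an NVIDIA GPU):

```shell
# webui-user.sh -- on Windows, edit webui-user.bat instead and use:
#   set COMMANDLINE_ARGS=--medvram --xformers

# --medvram : moves model components between VRAM and system RAM;
#             intended for cards with roughly 4-6 GB of VRAM
# --lowvram : far more aggressive offloading, much slower;
#             intended for cards with less than ~4 GB
# --xformers: enables xFormers memory-efficient attention (NVIDIA only)
export COMMANDLINE_ARGS="--medvram --xformers"
```

Token merging has no launch flag; as the 10:01 chapter shows, it is activated from the web UI settings (a "Token merging ratio" option under Optimizations in recent Automatic1111 builds).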

Comments • 18

  • @Mashup_MagiQ
    @Mashup_MagiQ 1 year ago +8

    🎩 Reasoning:
    The use of techniques like Xformers and token merging can help optimize the use of Stable Diffusion on systems with low VRAM. However, these techniques come with trade-offs in terms of image quality and rendering variations. It's important to understand these trade-offs and conduct tests to determine the best approach for your specific needs. 🤓👍
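
The trade-off this comment describes comes from token merging literally averaging similar tokens together before attention runs, so fine detail can shift between renders. A toy numpy sketch of the core idea (a simplified bipartite matching in the spirit of ToMe, not the actual tomesd implementation; `merge_tokens` and `r` are illustrative names):

```python
import numpy as np

def merge_tokens(tokens, r):
    """Merge the r most similar cross-set token pairs by averaging.

    Simplified sketch of token merging: split tokens into two
    alternating sets, match each token in set A to its most similar
    token in set B by cosine similarity, then fold the r best-matched
    A-tokens into their B partners. Output has r fewer tokens.
    """
    a, b = tokens[0::2], tokens[1::2]                 # bipartite split
    an = a / np.linalg.norm(a, axis=1, keepdims=True)
    bn = b / np.linalg.norm(b, axis=1, keepdims=True)
    sim = an @ bn.T                                   # cosine similarities
    best_b = sim.argmax(axis=1)                       # best partner for each A-token
    merge_idx = np.argsort(-sim.max(axis=1))[:r]      # r most similar A-tokens
    keep_idx = np.setdiff1d(np.arange(len(a)), merge_idx)
    merged_b = b.copy()
    for i in merge_idx:                               # average merged pairs
        merged_b[best_b[i]] = (merged_b[best_b[i]] + a[i]) / 2
    return np.concatenate([a[keep_idx], merged_b], axis=0)

tokens = np.random.rand(16, 8)    # 16 tokens, 8-dim embeddings
out = merge_tokens(tokens, r=4)
print(out.shape)                  # (12, 8): 4 tokens merged away
```

In Automatic1111 the equivalent knob is the token merging ratio: higher ratios merge more tokens, saving memory and time at the cost of fidelity, which is why the video recommends testing before committing to a setting.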

  • @alt666
    @alt666 1 year ago +5

    My laptop's 2060 runs Stable Diffusion like it's nothing. Seeing all the ways you can reduce VRAM makes me want to throw it on my Steam Deck just for fun.

    • @fadedninna
      @fadedninna 1 year ago +2

      The RTX series has tensor cores

    • @696canaI
      @696canaI 1 year ago +1

      RTX 4060 Ti here, it's not as fast as I need. I think I need a 4090

    • @DragonW3rrior94
      @DragonW3rrior94 1 year ago

      What laptop do you have?

    • @FrostyDelights
      @FrostyDelights 5 months ago

      @696canaI 4070 Ti here, it works well, I just want higher res and faster lol. I have 12 GB VRAM, but sometimes it slows my PC down and lags it out lol

  • @Lumbrax
    @Lumbrax 1 year ago +19

    I'm running on a 1 GB VRAM AMD card. I can't even use --xformers because AMD doesn't support it. I think I'm the definition of the bottom of the barrel lmao. Anyway, I'm living proof that SD runs even on 1 GB of VRAM

    • @Daniel_WR_Hart
      @Daniel_WR_Hart 1 year ago +1

      Are you stuck generating 100x100 icons?

    • @Btrik000
      @Btrik000 11 months ago

      I have 1 GB VRAM as well. What dimensions do you render your images at?

    • @Lairscapes-2023
      @Lairscapes-2023 7 months ago

      3 GB here. 448x448 seems to be as big as I can go, with some slight variation if I need to tweak the size lol

  • @MichaelCrawford-gl3pd
    @MichaelCrawford-gl3pd 19 days ago

    I generate 1024x1536 images no problem with ComfyUI on a 2 GB card

  • @teenudahiya01
    @teenudahiya01 1 year ago +3

    I can get an RTX 4090 and an RTX 3090 for the same price. Which should I buy? The RTX 4090 doesn't support NVLink/SLI, so I can't add a second RTX 4090 to raise the VRAM to 48 GB if the need arises in the near future for deep learning or AI purposes.
    What is your suggestion: should I go with the RTX 3090 or invest the money in an RTX 4090?

    • @erickromano5030
      @erickromano5030 1 year ago +2

      rtx 4090, no doubt

    • @aryansoni7968
      @aryansoni7968 1 year ago

      The 4090 is for gaming; if you want to game, then buy it

  • @SpacenSpooks
    @SpacenSpooks 1 year ago +1

    This has been a right head-scratcher to get this running on a $1000 Windows Surface! And it STILL isn't working. So disappointed with the performance of such an expensive computer (for me, anyway)
    Edit: I may have gotten it working, at 1 minute 25 seconds per render with a handful of prompts. I used --lowvram --opt-sdp-attention in the arguments!
    I suppose they don't call them arguments for nothing, huh? Let's hope I'm not speaking too soon! 😅

  • @davehedgehog9795
    @davehedgehog9795 11 months ago +1

    I have 12 GB VRAM and still get "not enough VRAM", even with a batch size of 1...

    • @TechPill_
      @TechPill_ 7 months ago

      8+ GB is currently a good enough amount of VRAM, but even people on 8 GB have started getting issues, since that's the minimum requirement