Stable Diffusion on Low VRAM GPUs - Token Merging, Xformers and other Methods
- Published Oct 3, 2024
- How do you get Stable Diffusion to run on systems with less than 8GB of VRAM? What problems are there with Xformers? Why does token merging need careful handling?
There are several methods reputed to let low-VRAM systems succeed with Automatic1111, but what are the benefits and drawbacks? This video is your complete guide.
Course Discounts
BEGINNER'S Stable Diffusion COMFYUI and SDXL Guide
bit.ly/GENSTART - USE CODE GENSTART
ADVANCED Stable Diffusion COMFYUI and SDXL
bit.ly/RESTAD - USE CODE RESTAD
Installation Guide and Stable Diffusion playlist
bit.ly/SDPlay
01:38 LOWVRAM and MEDVRAM
03:50 Xformers and Token Merging
05:40 Results Testing Xformers and Token Merging Individually
10:01 Activating Token Merging
10:42 Memory and Speed Performance comparison
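For reference, the switches covered in the video are passed to Automatic1111 through the `COMMANDLINE_ARGS` variable in `webui-user.bat` (Windows) or `webui-user.sh` (Linux/macOS). A minimal sketch of a low-VRAM setup is below; pick one of `--medvram`/`--lowvram`, and note that exact flag behaviour can change between webui versions, so check the wiki for yours:

```shell
# webui-user.sh -- low-VRAM launch options for Automatic1111
# --medvram  : splits model components across loads; moderate savings (~6-8 GB cards)
# --lowvram  : much more aggressive splitting; slower, but can fit ~4 GB cards
# --xformers : memory-efficient attention (NVIDIA GPUs only)
export COMMANDLINE_ARGS="--medvram --xformers"
```

Token merging itself is enabled from inside the UI (Settings > Optimizations in recent Automatic1111 builds) rather than via a launch flag, as shown at 10:01.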
Join 🏆 this channel to get access to exclusive content and perks:
bit.ly/XOVjoin
Get Channel Merch ☕
bit.ly/XOVstore
#AIart
🎩 Reasoning:
The use of techniques like Xformers and token merging can help optimize the use of Stable Diffusion on systems with low VRAM. However, these techniques come with trade-offs in terms of image quality and rendering variations. It's important to understand these trade-offs and conduct tests to determine the best approach for your specific needs. 🤓👍
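To make the trade-off concrete: token merging reduces attention cost by fusing redundant tokens before each transformer block, so some information is averaged away by design. The real implementation (the `tomesd` library) does bipartite soft matching on GPU tensors; the following is only a minimal pure-Python sketch of the idea, with invented helper names, showing how the most similar tokens get paired and averaged:

```python
import math

def cosine(a, b):
    # Cosine similarity between two token embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def merge_tokens(tokens, ratio=0.5):
    """Simplified bipartite soft matching: split tokens into alternating
    sets, pair each 'src' token with its most similar 'dst' token, and
    average away the top-scoring pairs."""
    dst = tokens[::2]          # even-indexed tokens: merge targets
    src = tokens[1::2]         # odd-indexed tokens: merge candidates
    r = int(len(src) * ratio)  # number of tokens to merge away

    # Find the best destination for each source token.
    matches = []
    for i, s in enumerate(src):
        scores = [cosine(s, d) for d in dst]
        j = max(range(len(dst)), key=lambda k: scores[k])
        matches.append((scores[j], i, j))

    # Merge the r most similar pairs by averaging into the dst token.
    matches.sort(reverse=True)
    merged_into = {i: j for _, i, j in matches[:r]}

    out = [list(d) for d in dst]
    for i, s in enumerate(src):
        if i in merged_into:
            j = merged_into[i]
            out[j] = [(a + b) / 2 for a, b in zip(out[j], s)]
        else:
            out.append(list(s))
    return out

tokens = [[1.0, 0.0], [0.99, 0.1], [0.0, 1.0], [0.1, 0.95]]
merged = merge_tokens(tokens, ratio=0.5)
print(len(tokens), "->", len(merged))  # 4 -> 3
```

The `ratio` controls how many tokens are merged away each step, which is exactly why quality degrades when it is pushed too high: more distinct tokens get averaged together, producing the rendering variations the video's tests demonstrate.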
My laptop's 2060 runs diffusion like it's nothing. Seeing all the ways you can reduce VRAM makes me want to throw it on my Steam Deck just for fun.
rtx series has tensor cores
RTX 4060 Ti here; it's not as fast as I need. I think I need a 4090
What laptop do you have?
@@696canaI A 4070 Ti. It works well, I just want higher res and faster speeds lol. I have 12 GB VRAM, but sometimes it slows my PC down and lags it out lol
I'm running on a 1 GB VRAM AMD card; I can't even use --xformers because AMD doesn't support it. I think I'm the definition of the bottom of the barrel lmao. Anyway, I'm living proof that SD even runs on 1 GB of VRAM
Are you stuck generating 100x100 icons?
I have 1 GB VRAM as well; what dimensions do you render your images at?
3 GB here. 448x448 seems to be as big as I can go, with some slight variation if I need to tweak the size lol
I generate 1024x1536 images no problem with ComfyUI on a 2 GB card
I can get an RTX 4090 and an RTX 3090 for the same price. Which should I buy? The RTX 4090 doesn't support NVLink/SLI, so I can't add a second 4090 later to raise the VRAM to 48 GB if I ever need that for deep learning or AI work.
What's your suggestion: should I go with the RTX 3090, or invest the money in the RTX 4090?
rtx 4090, no doubt
The 4090 is for gaming; if you want to game, then buy it
This has been a right head scratcher to get this running on a $1000 Windows Surface! And it STILL isn't working. So disappointed with the performance of such an expensive computer (for me, anyway)
Edit: I may have gotten it working, at 1 minute 25 seconds per render with a handful of prompts. I used --lowvram --opt-sdp-attention in the arguments!
I suppose they don't call them arguments for nothing, huh? Let's hope I'm not speaking too soon! 😅
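For anyone wanting to try that commenter's setup: those switches go in the `COMMANDLINE_ARGS` line of your `webui-user` file (a sketch, assuming a standard Automatic1111 install). `--opt-sdp-attention` uses PyTorch's built-in scaled-dot-product attention, which works as an alternative to xformers on hardware or builds where xformers isn't available:

```shell
# webui-user.sh -- the aggressive low-VRAM combo from the comment above
export COMMANDLINE_ARGS="--lowvram --opt-sdp-attention"
```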
I have 12 GB VRAM and still get "not enough VRAM", even with a batch size of 1...
8 GB used to be a good enough amount of VRAM, but even people on 8 GB have started to run into issues now that it's the minimum requirement