Who Says You Can't Run SDXL 1.0 on 8GB VRAM? Automatic1111 & ComfyUI

  • Published 8 Jul 2024
  • Who Says You Can't Run SDXL 1.0 on 8GB VRAM? Automatic1111 & ComfyUI. I've seen quite a few comments about people not being able to run Stable Diffusion XL 1.0 on RTX 3080s and RTX 4080s, and claims that you need a minimum of 12GB of VRAM, but that's just not the case. Fake news! hahaha! I wanted to do a quick demo using my RTX 3060 Ti with 8GB of VRAM and give you some tips that will hopefully help you too! We'll also look at how it runs on Automatic1111 v1.5.1 and ComfyUI. (A sketch of the webui-user.bat launch-flag tweak follows the links below.)
    Easy Standalone installer for Automatic1111 • Video
    Simple Basic ComfyUi Workflow Templates for SDXL (will add more soon!)
    -Basic SDXL workflow
    -Basic SDXL workflow + Upscale
    drive.google.com/drive/folder...
    Get the model here!
    SDXL Base Model huggingface.co/stabilityai/st...
    SDXL Refiner huggingface.co/stabilityai/st...
    SDXL supporting platforms
    Invoke AI github.com/invoke-ai/InvokeAI...
    ComfyUi github.com/comfyanonymous/Com...
    ComfyUI template for SDXL: save the image and drag it onto the interface, voila! comfyanonymous.github.io/Comf...
    Automatic1111 github.com/AUTOMATIC1111/stab...
    You can try SDXL 1.0 online here
    Playground Ai playgroundai.com/
    Dream Studio dreamstudio.ai/
    Clip Drop clipdrop.co/stable-diffusion
    Stable Foundation Discord / discord
    Stable Swarm github.com/Stability-AI/Stabl...
    API platform.stability.ai/docs/re...
    ⏲Time Stamps
    0:00 Automatic1111 Demo with 3060Ti
    1:32 ComfyUi Demo
    4:00 Possible solutions to run SDXL 1.0
    6:00 Civitai SDXL Models are out!
    *Disclaimer Affiliate Links Below*
    📸 Gear I use
    Sony A7C CAN amzn.to/3spWX8C
    Sony 35mm 1.8 CAN amzn.to/36Apekr
    OBS (FREE) obsproject.com/
    Editor Davinci Resolve www.blackmagicdesign.com/ca/p...
    Audio: Rode Podmic amzn.to/35sSnxv
    🔦 Find us on:
    Discord: / discord
    Instagram: / monzonmedia
    Facebook: / monzonmedia
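
    The tips in the video come down to a couple of Automatic1111 launch flags (--xformers, --medvram, --no-half-vae, which also come up repeatedly in the comments below). A minimal sketch of the webui-user.bat edit, assuming a default Windows install; treat the flag combination as a starting point and adjust it to your GPU:

    @echo off
    rem webui-user.bat in the stable-diffusion-webui folder
    set PYTHON=
    set GIT=
    set VENV_DIR=
    rem --xformers    enables memory-efficient attention
    rem --medvram     offloads parts of the model to system RAM (swap for --lowvram on ~4GB cards: slower, but avoids out-of-memory errors)
    rem --no-half-vae keeps the VAE in full precision, avoiding black or garbled SDXL outputs
    set COMMANDLINE_ARGS=--xformers --medvram --no-half-vae
    call webui.bat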

Comments • 232

  • @EskaronVokonen
    @EskaronVokonen 11 หลายเดือนก่อน +4

    Awesome! Thank you very much. I didn't reinstall, but just by adding --no-half-vae --medvram (I already had xformers), generation time went from 4-5 minutes for a 20-step image down to 45 seconds for a 40-step image. I upgraded from a GTX 1070 8GB to an RTX 3060 12GB earlier this summer, and the speed difference was night and day. So going back to 4-5 minutes per image with SDXL, I was just about to throw in the towel and go back to SD1.5

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Glad it helped! I think for most people having speed issues that's the key there. It runs a bit faster on ComfyUi but on A1111 it's definitely acceptable! Have fun!

    • @EskaronVokonen
      @EskaronVokonen 11 หลายเดือนก่อน +2

      @@MonzonMedia I can't stand the UI of ComfyUI, so I'm glad A1111 keeps up with updates

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Hahaha I hear it’s not for everyone. 👍🏼

  • @Lahouel
    @Lahouel 6 หลายเดือนก่อน +1

    Excellent work. 🏆🏆🏆🏆🏆

  • @AntonKorvin
    @AntonKorvin 11 หลายเดือนก่อน +1

    Very useful info, thank you for this tutorial!

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      You're welcome! Another thing to mention, system RAM is important. I find mine spikes to 24GB sometimes. May be part of the reason for others having issues as well.

    • @BensMiniToons
      @BensMiniToons 11 หลายเดือนก่อน

      @@MonzonMedia I had to upgrade my system RAM to 32GB or it would freeze my PC and take 4-7 minutes per image. Auto1111 has a memory leak when switching models on Windows 10. That may be the issue for some people saying it's so slow.

  • @shiraze01
    @shiraze01 11 หลายเดือนก่อน +1

    medvram was a game changer for me and pretty much solved the issue, thanks!

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Awesome! Glad it was helpful, enjoy!

  • @hleet
    @hleet 11 หลายเดือนก่อน +4

    Very nice workflow there for your ComfyUI! I wish ComfyUI had a kind of "5 favorites" workflow list that you could customize and switch between... but for now, I have to drag and drop workflows to start working with it ^^.
    I haven't tried this SDXL stuff yet, but with my 3070 I can generate 8 images in one batch with my other checkpoints. Just beware of the image resolution you input. I think it's a beginner's mistake to try to reach 2K resolution in one pass... which will give very wrong results, and maybe they'll think their graphics card is broken!! (lol). Upscaling is another culprit: doing way too much (over a 2.0 latent upscale) will bring your graphics card to its knees.
    Another thing: if you overclock your card, it seems that ComfyUI will stop working randomly. You have to redo the overclock if needed.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Appreciate the feedback! I did add a Google Drive link to download the workflow if you want to tweak it. There is also one for an upscaler that uses 4xUltraSharp. Yeah, I agree, I wish there was a more seamless way to switch workflows; maybe there is and I just don't know. I'm still learning and discovering things, but it's very fun for me and also helps with learning what happens under the hood! Good to know about the overclocking, I wasn't aware of that. I typically don't overclock my GPU since I don't play games much these days, and I have my whole system set up so temps stay low under load with a slight CPU overclock, but it never gets past 75°C when stressed. Keeps my system stable and it barely ever crashes.

  • @freedom4sweden58
    @freedom4sweden58 11 หลายเดือนก่อน +2

    I have an old GTX 1080 8GB, and SDXL is very slow in A1111, but in ComfyUI SDXL works like a charm

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Have you tried putting in the command arguments I mentioned? That should help speed things up in A1111. I do love using comfyui though 👍🏼

  • @MikevomMars
    @MikevomMars 11 หลายเดือนก่อน +12

    I am running SDXL on a *2060 SUPER 8GB* using ComfyUI - no problems at all. Using ComfyUI gives me 15x(!!!) more speed than using A1111, by the way, so I'd strongly recommend using Comfy when you are low on VRAM.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Awesome!! Yeah ComfyUi is a bit more optimized and so fun to learn. I love being able to create my own workflow! 👍🏼

    • @claffert
      @claffert 11 หลายเดือนก่อน +2

      I'm using SDXL under ComfyUI on a painfully old 1070 with 8GB. It's slow doing 1024x1024, but no problems with VRAM.

    • @4rrxw794
      @4rrxw794 11 หลายเดือนก่อน +1

      You mean 1.5x more speed?

    • @MikevomMars
      @MikevomMars 11 หลายเดือนก่อน

      @@4rrxw794 Nope, fifteen (15!) times faster. The task is to create a 1024x512 image, refine it, and upscale it using 4xUltraSharp. A1111 takes 4-5 minutes(!) on my machine using SDXL (if it does not freeze at all), while ComfyUI takes only ~20-30 secs. It seems that A1111's memory usage is not optimized at all, especially when using SDXL.

    • @rezhaknightwalker
      @rezhaknightwalker 11 หลายเดือนก่อน +1

      how about 6gb vram?

  • @terbospeed
    @terbospeed 11 หลายเดือนก่อน +11

    I'm running it on a 6GB 1060 with 16GB RAM. My machine is literally melting but that's what bleeding edge is all about. Using the refiner addon and tiled VAE, until a new server arrives.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +3

      Appreciate the feedback! With all the comments I've seen so far, I think it has to be user error for those experiencing issues with more-than-capable cards. BTW, I still have my 1060 3GB GPU that my wife uses for a simple setup for streaming shows and browsing ☺ The 1060 series has held its own!

    • @crobinso2010
      @crobinso2010 11 หลายเดือนก่อน

      @@MonzonMedia If they're using --lowvram instead of --medvram then it will run very slowly, but that's normal for that setting and avoids the out of memory errors.

    • @chove93
      @chove93 11 หลายเดือนก่อน

      Hi, how long does it take to generate an image with a 1060?

    • @alwega2923
      @alwega2923 11 หลายเดือนก่อน

      Would *really* like to know how you got it running; on my RTX 3060 with 6GB I cannot! Used all the optimizations I could find, still no luck

  • @TheZolon
    @TheZolon 11 หลายเดือนก่อน +1

    I am running the new SDXL 1.0 and some of the new checkpoints that use it in ComfyUI and Stable Diffusion Web UI with an RTX 3050 Mobile, only 4GB of VRAM. ComfyUI is pretty snappy, but I am still learning how to make custom workflows in it.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Yeah ComfyUi is great that way. Not sure if you saw but I left some simple workflows in the description, one of them is with an upscaler. I'm also working on an img2img one and inpainting. Still learning myself but I'm enjoying learning it!

  • @murphylanga
    @murphylanga 11 หลายเดือนก่อน

    Runs fine on my 3070 laptop with 8GB VRAM. I have CUDA 2.0, tiled VAE, and xformers in Automatic1111; ComfyUI set itself up correctly and so does Invoke AI. For Automatic1111 I recommend installing and turning on the refiner extension *wow* Now Automatic1111 is faster than ComfyUI.
    Only for my LoRA training I don't have a solution yet 😞

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Just found out about the extension yesterday and did a video on it as well. 👍Running great now!

  • @corejake
    @corejake 11 หลายเดือนก่อน +1

    I'm running Comfy on 6gb without any issues. Never had any issues, speed is top, no tweaking was done.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      The issue was more with automatic1111, comfyui is great! 👍🏼

  • @3diva01
    @3diva01 11 หลายเดือนก่อน +1

    Thankfully there's an addon now for Automatic1111 that lets you run the refiner automatically without having to switch to it.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Yes! I discovered that after this video and made a follow-up the same day hahaha! th-cam.com/video/zNZFelT_-Tk/w-d-xo.html Regardless, the extension is good but limited in control, whereas ComfyUI gives you more control.

  • @OmniSoundAI
    @OmniSoundAI 11 หลายเดือนก่อน +1

    The reason people have had long image generation times is because of a driver update. Some don't seem to have any issue, but with the newer drivers it takes ages, while downgrading to an older graphics driver works just fine. The only issue is that certain games like Diablo 4 require the newest driver.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Ah that’s a good point as well. Going to do a follow up video soon and mention this as a possibility. Appreciate the input! 🙌🏼

    • @TheSchwarzKater
      @TheSchwarzKater 11 หลายเดือนก่อน

      I'm in that situation right now. On my RTX 3080, driver 531.61 was running fine, but the new one made rendering take 10 times longer, and 536.67 is required for some games. I'm so lost on what to do now. I guess I get to switch drivers daily. oof

    • @OmniSoundAI
      @OmniSoundAI 11 หลายเดือนก่อน +1

      @@TheSchwarzKater Yeah, that's what I've been doing: switch to old drivers for using AI, then update for playing Diablo 4. It's a little annoying but not too bad.

  • @yiluwididreaming6732
    @yiluwididreaming6732 11 หลายเดือนก่อน +1

    Great workflow... thanks so much for sharing!!! SUBBED. BTW, definite preference for ComfyUI. So much quicker rendering and image processing. I think you might be right about extensions adding weight to AUTO and slowing the 'machine' down to a... putt putt... Is the SDXL model censored? Curious.....

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Hey there! Really appreciate the support, and thank you! The SDXL model is not censored; take a look at the available models and LoRAs on Civitai, as there are a few NSFW ones available. I'll be covering more ComfyUI stuff soon as I'm still learning new workflows, but I'd still like to share with all of you what I'm learning. 👍

  • @polymath_wtf
    @polymath_wtf 10 หลายเดือนก่อน +1

    Thx for the motivation to try again
    glhf

    • @MonzonMedia
      @MonzonMedia  10 หลายเดือนก่อน

      You're welcome! You might find this video interesting as well th-cam.com/video/C97iigKXm68/w-d-xo.html What issues are you having and which GPU?

    • @polymath_wtf
      @polymath_wtf 10 หลายเดือนก่อน +1

      thx for reminding me
      i have adhd :P

  • @TheCriticalMastermind
    @TheCriticalMastermind 11 หลายเดือนก่อน +1

    Love your videos, you are really teaching me so much. I started on Easy Diffusion, but I want to use this Stable Diffusion UI. Do you have a link showing how to install it, etc.?

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Hey bud, thanks so much for the positive feedback. So stable diffusion is the general "model" that all these platforms use. There are various choices depending on your set up. I'd watch this video first before you decide, th-cam.com/video/OeqpwYLMaN0/w-d-xo.html then you can tell me more about your computer specs so that I can recommend something. Perhaps we can converse about this in the comments of the linked video? Others may benefit from it as well. I have a few install videos too.

  • @didegng4job437
    @didegng4job437 5 หลายเดือนก่อน +1

    Can you please help solve the error
    "There is not enough GPU video memory available!" when using SDXL with an AMD GPU and ComfyUI

  • @synthoelectro
    @synthoelectro 11 หลายเดือนก่อน +1

    ComfyUI is running on my 4GB VRAM, 16GB RAM Intel i5 machine. Slow, yes, but anywhere from 4 mins down to 1.35 min sometimes. I heard that if you increase your virtual memory it works; that's the case on my setup.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Good to know! Appreciate the feedback.

    • @westingtyler1
      @westingtyler1 11 หลายเดือนก่อน

      slow, but that's nuts it works with only 4gb. someday we'll be running this at 60fps on our phones.

    • @synthoelectro
      @synthoelectro 11 หลายเดือนก่อน +2

      @@westingtyler1 it won't be long until my setup works well with it, fast, as someone will figure out a method.

    • @synthoelectro
      @synthoelectro 11 หลายเดือนก่อน +1

      @@MonzonMedia welcome

  • @solaieditz
    @solaieditz 11 หลายเดือนก่อน +3

    I’m running comfyui in my RTX 3060 even though I don’t have any problems it’s run butter smooth.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      I'm so addicted to ComfyUi. And yes it runs so nice! I know it may seem like more work to some but I love creating my own little workflows. I'm still learning but I'm really enjoying it!

    • @johannesson77
      @johannesson77 11 หลายเดือนก่อน +2

      Yep same here.

    • @solaieditz
      @solaieditz 11 หลายเดือนก่อน +1

      @@MonzonMedia I worked in Blender a lot, where nodes are everything, so ComfyUI doesn't look like a big deal to anyone who has seen a node-based workflow. And I have the same feeling: every time I find a new workflow myself, it's like I'm creating a new app for my own use, it's like god-like power, .......

    • @systemdersiebenwelten
      @systemdersiebenwelten 11 หลายเดือนก่อน +2

      Yes, it's not even easy to crash. It's crazy compared to Automatic1111 in terms of performance.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      @@systemdersiebenwelten Definitely more versatile and I like that you have complete control of the pipeline although it's not for everyone. I like to tinker so it's perfect! 😁

  • @BensMiniToons
    @BensMiniToons 11 หลายเดือนก่อน +1

    Added a refiner extension. Does all this without going to img2img

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      The day I released this video is when the refiner extension came out, so I made an update video that same day th-cam.com/video/zNZFelT_-Tk/w-d-xo.html 😆😊 This stuff moves too fast!

  • @yak2006
    @yak2006 11 หลายเดือนก่อน +1

    I'm using it with an RTX 3060M 6GB and 40GB of RAM, and it is sloooow to generate but works 😂 Using only the refiner at a max of 800x800 works too. Edit: now tried it in ComfyUI with the same hardware in a notebook and it works great, 1 min 20 s for 1024x1024 base+refiner. ComfyUI is better and faster for generating.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      ComfyUi will always be the better choice for generating images although not sure why you are experiencing slowness with A1111. If you already tried the command arguments, not sure what it could be? Besides, looks like node based UI's will be the way to go for a lot of people, including myself! 👍

  • @Xandercorp
    @Xandercorp 11 หลายเดือนก่อน +1

    I've tried it with a 3080 and did a fresh install, it just gets to 90-95% render and then the UI doesn't update to show the complete image, it just sort of hangs.
    The console shows that the task is finished but there's some miscommunication between the console and the UI. It also keeps the GPU at 100% after even though it's finished. I really don't think it's my setup but a bug that was introduced.
    Anyway, I also used ComfyUi and that is always smoother. I just have a hard time with being flexible with it, especially when wanting to add multiple things like ControlNets and Loras/extensions.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Interesting, I wonder if it’s a driver update? One of the other commenters mentioned the latest drivers are causing issues. Have you tried the command arguments?

    • @Xandercorp
      @Xandercorp 11 หลายเดือนก่อน +1

      @@MonzonMedia I've tried a bunch of things, command arguments included. "--no-half-vae" just gives me an error, something about running out of resources. I have the latest drivers, perhaps that's the reason. I'll check on that, but I honestly feel like A1111 is trying to do too many things. I saw there are also some other A1111 forks that have sort of split away from the original, and some users reported they worked better. I'll likely try Swarm as well. Would appreciate tutorials on that :)

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@Xandercorp What worked for me was using the arguments --xformers and --medvram; other than that my 3060 Ti works fine. The only other thing I can think of is system memory. From my tests it can peak at 24GB, so system RAM could be a factor if you have less than that. Fortunately I have 32GB, though it's not the fastest RAM out there. 16GB will struggle.

    • @Xandercorp
      @Xandercorp 11 หลายเดือนก่อน +1

      @@MonzonMedia Cheers, man. I've got 12GB VRAM and 32GB system RAM at 3600. I just tried --xformers and --medvram and it does the same, predictably. It renders the image to 100% in the console, but in A1111 it only goes up to 95%, and it keeps processing at maximum even after it's done.
      The error I got: "ERROR: Exception in ASGI application" and a bunch of text lines after. I'll find a way eventually, thanks though.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@Xandercorp Other than removing --xformers, removing extensions and trying a fresh install, it's beyond me man, sorry I couldn't help. It could be a number of things. I did remember an older post on this from github that you can read to see if any of the suggestions work. When you figure it out, please keep me updated. I'm always open to learning troubleshooting tips. github.com/AUTOMATIC1111/stable-diffusion-webui/issues/6835

  • @westingtyler1
    @westingtyler1 11 หลายเดือนก่อน +2

    I run SDXL just fine in A1111 on an RTX 2060 Super, which has 8gb vram. to generate a 768x1024 image (vertical portrait) takes about 20 seconds.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Yeah, seeing all the comments makes me wonder what is happening with other users who have bigger cards. Most likely user error?

    • @westingtyler1
      @westingtyler1 11 หลายเดือนก่อน +1

      @@MonzonMedia For me, I was getting an error in Automatic1111 until I forced the webui to update. After that it worked fine. I also disabled all addons except Roop.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      I suspect some of the extensions are causing these errors. I did the same but left the image gallery and ControlNet. Works flawlessly.

    • @westingtyler1
      @westingtyler1 11 หลายเดือนก่อน +1

      @@MonzonMedia Let me ask you this: does ControlNet work just fine with SDXL in A1111 at 1024x1024 resolution? I haven't tested it because I assumed it would not work.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Not at the moment that’s still to come. Haven’t heard any timelines either but I suspect it won’t be long.

  • @user-ph1ir8mb7w
    @user-ph1ir8mb7w 10 หลายเดือนก่อน +1

    How about ControlNet, guys, is it enabled in ComfyUI yet?

    • @MonzonMedia
      @MonzonMedia  10 หลายเดือนก่อน

      Yes, Canny and depth, which can be installed through the ComfyUI Manager. You can even download the models from there too. 👍

  • @SavetheRepublic
    @SavetheRepublic 10 หลายเดือนก่อน +1

    I have a 3070 and it takes a few minutes to generate one image. I'm not sure why.

    • @MonzonMedia
      @MonzonMedia  10 หลายเดือนก่อน

      Try these methods and see if it helps th-cam.com/video/7mlJQ6viH20/w-d-xo.htmlsi=jJksKD-Yzs7_ntj9

  • @Chris3s
    @Chris3s 11 หลายเดือนก่อน +1

    Is it possible to inpaint directly in ComfyUI? Haven't seen any info on that online

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Great question, and something I've been looking for as well. The process works, but I haven't yet seen how to actually inpaint within ComfyUI, and I'm not sure if it's possible. I've only seen methods where you create the mask outside of ComfyUI and bring in the alpha channel mask. If I find out, I'll definitely be creating a video on it!

    • @sidheart8905
      @sidheart8905 11 หลายเดือนก่อน +1

      @@MonzonMedia You can create a mask in ComfyUI itself; in the Load Image node you have a mask option

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Oh awesome! I'll have to check that out, I haven't tried it yet as I'm still working on some simple workflows for SDXL and SD1.5. Thanks for confirming! 👍

  • @olvaddeepfake
    @olvaddeepfake 11 หลายเดือนก่อน +1

    My problem with SDXL in Automatic1111 is switching models. It takes so long to switch, but generating an image doesn't take long at all.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Are you using an SSD or HDD? I noticed the initial load was a bit long but not that bad, and after that it started loading fine.

    • @olvaddeepfake
      @olvaddeepfake 11 หลายเดือนก่อน +1

      @@MonzonMedia I'm using an HDD. I do have an SSD, so I might try it on that drive

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@olvaddeepfake That may well be why. If you have the space, I'd suggest dedicating a whole SSD to all your Stable Diffusion stuff. I have a dedicated SSD just for models hahaha!

    • @olvaddeepfake
      @olvaddeepfake 11 หลายเดือนก่อน +1

      @@MonzonMedia Lol, ah yes, I will try it! One other thing: renders on Automatic1111 and ComfyUI both give me artefacts on the images, light faint colorful lines. I've seen people talking about a new VAE, but I don't know where to get that from or how to use it in ComfyUI.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@olvaddeepfake There is a new SDXL 1.0 model with the VAE baked in, try that one. Also, if you have the VAE dropdown installed, change it to None or Automatic. You can also download the separate VAE file here huggingface.co/stabilityai/sdxl-vae/tree/main and just place it into (wherever you have it installed) C:\A1111\stable-diffusion-webui\models\VAE
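
      As a concrete sketch of that last step, assuming the file in that Hugging Face repo is named sdxl_vae.safetensors and using the example install path above (adjust both for your setup), from a Windows command prompt:

      rem download the standalone SDXL VAE into A1111's VAE folder
      curl -L -o "C:\A1111\stable-diffusion-webui\models\VAE\sdxl_vae.safetensors" ^
        https://huggingface.co/stabilityai/sdxl-vae/resolve/main/sdxl_vae.safetensors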

  • @fredmcveigh9877
    @fredmcveigh9877 9 หลายเดือนก่อน +1

    All changed with Automatic1111 1.6, with the refiner now included. I don't like noodles.

  • @kirisuma2186
    @kirisuma2186 5 หลายเดือนก่อน +2

    seeing comfy doesn't let me feel comfy...

    • @kirisuma2186
      @kirisuma2186 5 หลายเดือนก่อน +1

      I apologize for the bad pun...

    • @MonzonMedia
      @MonzonMedia  5 หลายเดือนก่อน

      😂 Thanks for the laugh! I totally get it though, ironic isn't it? Once you wrap your head around the node based method it does get easier but definitely a big learning curve for most people.

  • @0A01amir
    @0A01amir 11 หลายเดือนก่อน +2

    It won't run on 4GB VRAM and 8GB system RAM in Auto1111, but it works with ComfyUI. I am using a GTX 970 4GB (3.5GB); it's a freeze fest in Auto1111 for hours, but it works in ComfyUI, though it takes ~40 min for a 20-step image to render, lol. SD 1.5 or other models are fine, 15 to 20 seconds.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Did you try putting in the command argument I mentioned in the webui-user.bat file? --medvram should help with the speed, although it will still be faster in ComfyUI.

    • @0A01amir
      @0A01amir 11 หลายเดือนก่อน +1

      @@MonzonMedia How did you read my comment? I can't even make sense of my own comment :))) but you got it right, thanks.
      Well, it's impossible to run SDXL (even models like DreamShaper's SDXL version) on that hardware; it's not even worth the time, so I'll stick to other models for now until maybe I upgrade my PC someday.
      I liked the WebUI's interface and img2img section, but man, that thing breaks itself even without doing anything to it. Yesterday I tried to run the WebUI but it got stuck at the "DiffusionWrapper has 859.52 M params" message for 10 min or so and I had to close it. I found a solution but it didn't help my case. I'm new to all of this; I recently learned how to do img2img in ComfyUI and it was OK, it's pretty similar to Blender's node system that I'm used to.
      And yes, I tried all of the commands, --lowvram --medvram --xformers, one by one and even combined or without any of them; it didn't help my case. Maybe in a month or so we'll get some updates and better tools, at least one can hope.
      Thanks for your time man

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      @@0A01amir I could piece it together hahaha! All good man. From my own experiments and findings, you have 2 limiting factors. One is your GPU VRAM, but again, in ComfyUI 4GB is enough. The other factor is system RAM: SDXL will often use 16GB or more, and with A1111 it might be even higher. Let's hope that down the road they make it more optimized and less system demanding. Thanks for sharing your experiences, it helps me learn how to help other people and what the limitations are for various systems. I appreciate it!

    • @0A01amir
      @0A01amir 11 หลายเดือนก่อน

      @@MonzonMedia Thank You

  • @niveZz-
    @niveZz- 11 หลายเดือนก่อน +1

    I have the exact same GPU and I thought I'd made a mistake not having enough VRAM, because Stable Diffusion didn't exist back then.
    And um, I also managed to generate a 300x300 image on a Snapdragon 855 CPU

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Hey at least it's running! That's a laptop?

    • @niveZz-
      @niveZz- 11 หลายเดือนก่อน

      @@MonzonMedia Um no, that's a phone (OnePlus 7 Pro) I installed Windows on.
      It's also possible on Android using Termux, but when I tried it the LMK (low memory killer) always stopped Stable Diffusion because I don't have enough RAM

  • @JanWischnat
    @JanWischnat 11 หลายเดือนก่อน +2

    I'm running sdxl on auto1111 on a 2070s without issues. Each (base-) image takes about 25 sec to render. I'm using --xformers and --medvram...

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Appreciate the feedback, honestly not sure why others are complaining it's not working or even taking like 4-6min to generate images. I suspect they are new and don't know how to use the arguments.

    • @westingtyler1
      @westingtyler1 11 หลายเดือนก่อน +1

      this is my situation exactly. 2060 super, about 20 seconds to render a 768x 1024 portrait.

    • @Queenbeez786
      @Queenbeez786 9 หลายเดือนก่อน

      @@MonzonMediai’ve heard id fire to graphics card. use the previous version 531 nvidia n it works nicely

  • @aravindleo10
    @aravindleo10 11 หลายเดือนก่อน +1

    Can I run SDXL 1.0 on my RTX 3050 4GB laptop?
    Which AI models can I run??

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Guess you will find out 😁 I know ComfyUI will work, but it's a bit more of an advanced UI. I would say take your pick of Invoke AI, Easy Diffusion, or Automatic1111, which I have installation videos for on my channel. SDXL relies on quite a bit of VRAM though. If you want to use something free just to try it out, check out playgroundai.com

  • @mupmuptv
    @mupmuptv 11 หลายเดือนก่อน +1

    Does it the same as invokerai

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      I'm sorry can you clarify your question? Is what the same in Invoke AI?

  • @apnavu007
    @apnavu007 หลายเดือนก่อน

    I'm thinking of buying a laptop with 8GB of VRAM. Will I be able to run a Stable Diffusion XL model smoothly?

    • @MonzonMedia
      @MonzonMedia  หลายเดือนก่อน +1

      For standard generations it's good enough but I always recommend people to get at least a GPU with 12GB of VRAM and minimum 32GB of system RAM along with ample storage, SSD if possible.

    • @apnavu007
      @apnavu007 หลายเดือนก่อน +1

      @@MonzonMedia Thank you so much 💕

  • @ashutoshnagrale4783
    @ashutoshnagrale4783 11 หลายเดือนก่อน +1

    I'm running it on my 3060, 6GB VRAM and 16GB RAM. It runs

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      What is your performance like in terms of generation times? I only ask because SDXL tends to use more system memory. If you upgrade your RAM to 32GB down the road it should make a difference.

    • @Dante02d12
      @Dante02d12 8 หลายเดือนก่อน

      @@MonzonMedia I have the same hardware. SDXL is unusable in Auto1111, even running lowvram argument. It fills up the RAM then uses more than the 6GB VRAM, it gets in the shared memory. Result : horrendous performance, the software estimates 30 minutes to generate the image. Unusable.
      ComfyUI is so memory efficient that SDXL is actually usable. A 1024x1024 picture renders in 1 minute the first time, then 40 seconds. The command prompt tells me it runs at a stable 1.76 s/it.
      All of that is without refiner. I still don't understand how to use that, lol.
      I can't say it's great quality though, my results are blurry, pixellated, paint-like despite using an anime model.
      I also have this issue with my 1.5 models on that interface, so I guess I should find a better workflow.
      For context, I'm using the SDXL Efficient nodes.

  • @Adventurerest
    @Adventurerest 11 หลายเดือนก่อน +1

    I'm running it with a 4GB 1050 Ti and it takes 7 minutes to render one image. Yeah, I'm using ComfyUI

  • @himars_equalizer5125
    @himars_equalizer5125 11 หลายเดือนก่อน +2

    I can run it on my 6Gb VRAM without any problem

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Nice! 3060? What are generation times like for 1024x1024?

  • @csharpner
    @csharpner 11 หลายเดือนก่อน +1

    I can't get it to run on my RTX 3090 w/24GB of VRAM and 128GB of system RAM! :(

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Really strange, and I've seen others say the same. Do you have a lot of extensions? Are your graphics card drivers updated? If you haven't already, try adjusting the webui-user.bat file as suggested in the video. Otherwise I'd try a fresh install with no extensions just to see if it can run.

    • @csharpner
      @csharpner 11 หลายเดือนก่อน

      @@MonzonMedia I have no extensions installed. I'm on Linux, so there's no webui-user.bat file. I'll see if there's an equivalent for Linux.
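
      For reference, the Linux equivalent of that file is webui-user.sh in the same stable-diffusion-webui folder; a minimal sketch assuming the standard repo layout, using the same flags discussed in the video:

      #!/bin/bash
      # webui-user.sh -- set the launch flags here, then start with ./webui.sh
      export COMMANDLINE_ARGS="--xformers --medvram --no-half-vae"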

  • @ManHobbyChannel
    @ManHobbyChannel 11 หลายเดือนก่อน +1

    I'm running SDXL with Invoke AI on my 3070 with 8GB of VRAM just fine.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Indeed! Have you tried the latest update with the refiner?

    • @ManHobbyChannel
      @ManHobbyChannel 11 หลายเดือนก่อน +1

      @@MonzonMedia Same night as SDXL 1.0 was released.
      It's better, but not perfect yet. It does some stuff that is too far away from the prompt.

  • @amkire65
    @amkire65 11 หลายเดือนก่อน +1

    I'm running it on both ComfyUI and Automatic1111 on a desktop computer with a 5GB Quadro P2000, so 8GB should be plenty... though it can be kinda slow.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Oh interesting! Curious to know how much system RAM you have? I find 32GB is enough but 16GB can be limiting. From my own tests I can get peaks of 24GB of system RAM which can account for some slowness.

    • @amkire65
      @amkire65 11 หลายเดือนก่อน +1

      @@MonzonMedia I've got 32GB of RAM. At the moment I'm running one of the (many) workflows from Civitai and yes, it seems to be using around 24GB of RAM. I'm using a slightly upgraded Z240 desktop, so the RAM in it isn't the most up to date. Still, upgrading the RAM is cheaper than upgrading the graphics card... if you don't mind images taking minutes rather than seconds to produce.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@amkire65 Aaaahhhh awesome, good to know and thanks for sharing. I just wanted to confirm with you all, but it seems like those factors are what's causing the slowness for some. But yes, upgrading system RAM can improve things greatly. I wonder if Stability will continue to optimize SDXL?

    • @amkire65
      @amkire65 11 หลายเดือนก่อน +1

      @@MonzonMedia Interesting that we're both using about the same amount of system RAM when we have different amounts of VRAM. I guess your system would be a little quicker than mine in that case. I think Stability will continue to improve it to a point; they improved 1.5 up to a certain point, then the AI community took over with textual inversions, better checkpoints, LoRA, LyCORIS, etc. Maybe they like to get it started and see where others will take it?

  • @brennt501
    @brennt501 11 หลายเดือนก่อน +1

    In my case it's running at 6 Gb of VRAM. I tried 1024.

  • @andysjourney2874
    @andysjourney2874 11 หลายเดือนก่อน +1

    I can run it on a GeForce 1660 Super at 768x768 resolution. SDXL models load in seconds. Can't wait for my new GPU!

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Appreciate the info bud! As I've said to the others, I'm kind of stumped as to why I'm seeing reports of people with top of the line GPU's not being able to run SDXL? What GPU are you getting next?

    • @andysjourney2874
      @andysjourney2874 11 หลายเดือนก่อน +1

      @@MonzonMedia I think it will be the ASUS TUF Gaming RTX 3060 OC. Don't have a lot of money, and it needs to work on my Windows 7:-) Thanks for your video!

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      It's a great card dude! 3060 is no slouch, works for me! Bruh but windows 7? hahaha! 😁☺

    • @andysjourney2874
      @andysjourney2874 11 หลายเดือนก่อน

      @@MonzonMedia Yeah, I know :-) I just love my Windows 7, I can't give it up! If I ever get the RTX 3090, I will most likely have one of the absolute best Windows 7 machines on planet earth.

    • @chove93
      @chove93 11 หลายเดือนก่อน +1

      Hi, how long does it take to generate an image with the 1660 (I have the same)? Did you try 1024?

  • @handsomelyhung
    @handsomelyhung 11 หลายเดือนก่อน

    Ive been using a 3060 in Stable Diffusion since Nov 2023 with Automatic 1111. Only problem ive had is images take a few minutes to generate.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      I’ve had mine since March 2022 roughly and it’s fine for my needs however the context of this video is for SDXL 1.0 many people saying it doesn’t work with this card which it clearly does. That being said, I really hope they start optimizing for regular users as gpu’s are so expensive! 😊

    • @liquidmind
      @liquidmind 11 หลายเดือนก่อน

      ""
      @handsomelyhung "Ive been using a 3060 in Stable Diffusion since Nov 2023 with Automatic 1111." So YOU COME FROM THE FUTURE? It's AUGUST 2023

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      👀😊 🔮

    • @handsomelyhung
      @handsomelyhung 11 หลายเดือนก่อน +1

      @@MonzonMedia lol i meant 2022

  • @proinfographic2022
    @proinfographic2022 11 หลายเดือนก่อน +1

    Can we sell the pictures from SDXL 1.0? I don't understand the license: CreativeML Open RAIL++-M

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      In general yes. 👍🏼

  • @hairy7653
    @hairy7653 11 หลายเดือนก่อน +1

    I've got a 3060 and 16GB RAM... it takes about 5 mins to load the refiner in Auto1111

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Where are your models located? On an SSD or HDD?

    • @hairy7653
      @hairy7653 11 หลายเดือนก่อน

      @@MonzonMedia SSD, does it sound abnormal? That was 16GB RAM, not VRAM. Changing models freezes the PC for about 5 min! Is it my setup?.. It's a fresh install with no plugins

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      @@hairy7653 Yeah way too long. When I first loaded it, it took a bit longer but not more than a minute. I've heard others having the same issue but it's mostly because their models are on mechanical hard drives. After it loads, does it generate an image normally though? If you are up for it, I would start with a fresh install, just save your models folder somewhere and bring them back in after. I have a feeling some extensions might be conflicting with it but I can't say for sure.

    • @hairy7653
      @hairy7653 11 หลายเดือนก่อน +1

      @@MonzonMedia Yeah, the images render well and fast-ish.. it's just the changing of models.. I'm going to install another version now.... thanks man

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@hairy7653 This is my favourite Automatic1111 installer and launcher, super easy! And it stays up to date! github.com/EmpireMediaScience/A1111-Web-UI-Installer

  • @justinwhite2725
    @justinwhite2725 11 หลายเดือนก่อน +1

    I run SDXL with 0 VRAM because I have an old AMD card that doesn't support ROCm.
    Runs just fine from the CPU, albeit slowly.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      ooohhhh I can imagine it would be pretty slow. Guess you have to make sure your prompts are solid! 👍

    • @diadetediotedio6918
      @diadetediotedio6918 11 หลายเดือนก่อน +1

      Wow, from the CPU? How slow is it?

  • @SamuraiG
    @SamuraiG 11 หลายเดือนก่อน +1

    Why is it that AI programs snub Radeon cards?

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Mainly because NVIDIA cards have CUDA cores that speed up compute processes. It's been that way for many years, especially when it comes to "graphics". You can run AMD in a Linux environment, but I can't tell you how well it runs as I've never tried it myself. But I feel for ya! It should be more widely accessible; there are very few people that even use Linux as an OS.

  • @ESGamingCentral
    @ESGamingCentral 11 หลายเดือนก่อน +1

    My 4090 has horrible performance, I don't know what's going on with Automatic1111

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      That's really strange for a 4090. Have you tried an older driver version? Do you have the latest A1111?

    • @ESGamingCentral
      @ESGamingCentral 11 หลายเดือนก่อน

      @@MonzonMedia Fresh install of A1111, SDXL model, 4090; I'm getting 2 s/it at 1024x1024

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Did you try putting in the command arguments I mention in the video? You probably don't need the others, just --xformers. How much system RAM? And are you using an SSD?

    • @ESGamingCentral
      @ESGamingCentral 11 หลายเดือนก่อน

      @@MonzonMedia --opt-sdp-no-mem-attention

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Try to remove that and just use --xformers

  • @manticoraLN-p2p-bitcoin
    @manticoraLN-p2p-bitcoin 11 หลายเดือนก่อน +1

    I'm running on 4GB in ComfyUI. 500 sec for the first generation; after that, the next images take about 75 to 235 sec to create. 3050 Ti Mobile

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Appreciate the info bud! Good to know 👍🏼

  • @sidheart8905
    @sidheart8905 11 หลายเดือนก่อน

    I am able to run it on an RTX 2060, 6GB VRAM. It takes 180 seconds for one image with 80 base and 20 refiner steps; with 20 and 5 I am able to run at 90 seconds

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Good to know it's working for a 2060 and 6GB too! Thanks for the feedback, very helpful! 🙌🏼

  • @fixelheimer3726
    @fixelheimer3726 11 หลายเดือนก่อน

    There is a refiner Extension for a1111

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Hahaha this video didn’t age well! Hahaha! I’m uploading a video just now on this! 🙌🏼👍🏼

  • @antoninsvrdlik7526
    @antoninsvrdlik7526 11 หลายเดือนก่อน +1

    I was able to run it on 1060 6GB but it is extremely SLOW

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      I think if you put in the command arguments in the webui-user.bat file mentioned in the video, it should help. Have you tried it?

  • @justinthehedgehog3388
    @justinthehedgehog3388 11 หลายเดือนก่อน +1

    I'm running it on an old GTX 1070

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Yes, others have confirmed too! How long is it taking for 1 1024x1024 image?

  • @liquidmind
    @liquidmind 11 หลายเดือนก่อน

    oh, but you CAN start at step 20 on automatic 1111 using CONTROL NET.....

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      There is actually an extension now! Uploading a video shortly! 🙌🏼

  • @TheSlaveKeyboardist
    @TheSlaveKeyboardist 11 หลายเดือนก่อน +1

    I manage to use it with only 6gb vram

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Awesome! Curious how much system RAM and how long for 1024x1024 for like 25-30 steps?

  • @CELLHOTAI
    @CELLHOTAI 11 หลายเดือนก่อน +1

    M1 still cant run 😭

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Unfortunately I can’t help with Mac since I’m in pc. Reddit may be a good place to check for solutions.

  • @alirezasadeghi3760
    @alirezasadeghi3760 7 หลายเดือนก่อน +1

    I run it on gtx 1060 6gb

    • @MonzonMedia
      @MonzonMedia  7 หลายเดือนก่อน

      Cool! what are your times like for a 1024x1024 image?

    • @alirezasadeghi3760
      @alirezasadeghi3760 7 หลายเดือนก่อน

      @@MonzonMedia Not great. 10 min for every single image

  • @aegisgfx
    @aegisgfx 11 หลายเดือนก่อน +2

    Are you the playground guy?

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +2

      👀 that would be me 😊 Man people may read this out of context 😂

    • @aegisgfx
      @aegisgfx 11 หลายเดือนก่อน +1

      @@MonzonMedia playground is so amazing, I hope you guys can keep it going and/or are making money there. The PG version of SDXL is just crazy good.

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      I appreciate that bud! We're definitely trying to create our own path there, some exciting things ahead! See ya over there soon!

  • @Yoshi92
    @Yoshi92 13 วันที่ผ่านมา +1

    You act like you don't know what's going on lol, yet you know exactly what's going on :D
    --medvram & --no-half-vae brought my SDXL generation time from 6 minutes down to 40 seconds. Thanks a lot mate! 🏆
    edit: 3070 Ti 8GB

  • @SHOCKxDART
    @SHOCKxDART 11 หลายเดือนก่อน +1

    3060? I've been running it on a 1060 XD and it works just fine

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      6GB? Nice! Curious how long a generation takes you for SDXL at 1024x1024? I honestly don't know what the issue is for some with 3090s and 4080s who say SDXL won't run for them. Great to see so many of you running it on these GPUs!

    • @SHOCKxDART
      @SHOCKxDART 11 หลายเดือนก่อน +1

      @@MonzonMedia It takes 2-3 mins for 1024x1024, but I found generating at 680x680 takes under a minute, and then using image-to-image with the refiner model to bring it to 1024x1024 gives the same detail but takes less time on lower-end GPUs

  • @allenmitchell3596
    @allenmitchell3596 9 หลายเดือนก่อน

    How do I set my video RAM to 8GB in ComfyUI?

    • @MonzonMedia
      @MonzonMedia  9 หลายเดือนก่อน +1

      no need to as it's already optimized.

  • @panman1964
    @panman1964 11 หลายเดือนก่อน +2

    A 3060?
    Meh, I happily use a 1080 Ti.
    You don't "need" a 30- or 40-series card; they're "nice to have" but not "need to have", especially if you're just tinkering

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      Good to know it can work on a 10 series card still 👍

    • @Dante02d12
      @Dante02d12 8 หลายเดือนก่อน

      The RTX card architecture makes AI calculations much more efficient and therefore drastically faster. You need RTX as much as you need electricity in real life: sure, you can get by without it, but it's an archaic way of life.
      That said, as long as it works for you...

    • @winstonwoof732
      @winstonwoof732 8 หลายเดือนก่อน

      @@Dante02d12 The big plus the 1080 Ti has is its 11GB of VRAM.
      For reference, a 30-step basic 1024x1024 SDXL image generation in ComfyUI takes approx 65 seconds, which, let's be honest, ain't the end of the world ;o)

  • @MonzonMedia
    @MonzonMedia  11 หลายเดือนก่อน +7

    If you want to install Automatic1111 easily, this is the best standalone installer that I've come across! th-cam.com/video/A2s9OnGacxA/w-d-xo.html. Also, just released is a new refiner extension for Automatic1111!

    • @travelpointone991
      @travelpointone991 11 หลายเดือนก่อน +1

      No worries about Vram etc.

    • @EskaronVokonen
      @EskaronVokonen 11 หลายเดือนก่อน +1

      Ah! The refiner extension! Great, I hated having to switch to img2img with all the extra manipulation and extra time with batches of images. I see you have a video about it, I'll check it out

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน +1

      @@EskaronVokonen Yes! It came out the same day I published this video, so I felt obligated to do another! hahaha! Enjoy, it's pretty easy to install, just like the other extensions. Although it doesn't give as much control as ComfyUI, it does the job. Also, I'm finding that with all the models and LoRAs coming out on Civitai, the refiner isn't necessary in most cases.

    • @EskaronVokonen
      @EskaronVokonen 11 หลายเดือนก่อน

      @@MonzonMedia Good to know, I was about to test DreamShaper XL. Thanks again and have a good day!

  • @posnake
    @posnake 11 หลายเดือนก่อน

    I use a RTX 2060s with no problem

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      Good to know, not sure why others are having issues. How long is it taking you roughly for a 1024x1024 image?

    • @posnake
      @posnake 11 หลายเดือนก่อน +1

      @@MonzonMedia like 10 seconds on comfyUI. 20 steps

    • @MonzonMedia
      @MonzonMedia  11 หลายเดือนก่อน

      @@posnake ComfyUi is great isn't it!

  • @MonzonMedia
    @MonzonMedia  9 หลายเดือนก่อน +1

    Hey everyone, I made a follow-up video on this to optimize A1111 for 8GB of VRAM or less: th-cam.com/video/7mlJQ6viH20/w-d-xo.html