Expand and Enhance: Perfect Outpainting with Flux

  • Published on Nov 11, 2024

Comments • 54

  • @ngelsoto
    @ngelsoto months ago

    Works like a charm. Good job! In the SDXL Outpaint section, you need to connect the Load Checkpoint VAE to the VAE Encode (for inpainting), otherwise you will get the "...4 channels but expected 16..." error.
    Cheers to everyone.
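
Side note on the channel-count error mentioned above: the SDXL UNet's input convolution expects 4-channel latents, while a Flux VAE encodes images into 16-channel latents, which is why VAE Encode (for inpainting) must receive the SDXL checkpoint's VAE. A minimal PyTorch sketch, with shapes taken from the error message rather than code from the workflow:

```python
# Reproduces the "expected input ... to have 4 channels, but got 16 channels" class of error.
import torch
import torch.nn as nn

# Stand-in for the SDXL UNet's first convolution (weight shape [320, 4, 3, 3]).
conv_in = nn.Conv2d(in_channels=4, out_channels=320, kernel_size=3, padding=1)

sdxl_latent = torch.randn(2, 4, 112, 187)   # what an SDXL VAE produces
flux_latent = torch.randn(2, 16, 112, 187)  # what a Flux VAE produces

print(conv_in(sdxl_latent).shape)           # works: torch.Size([2, 320, 112, 187])
try:
    conv_in(flux_latent)                    # RuntimeError: ... 4 channels, but got 16 channels
except RuntimeError as e:
    print(e)
```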

  • @baheth3elmy16
    @baheth3elmy16 months ago +1

    Oh wow, what a video. I know you have already provided the workflow, but out of respect and appreciation for your hard and wonderful work, I have to watch your videos till the end. I actually enjoy how you build the workflow; good education. Thanks a bunch!!!

  • @PixelArt_YW
    @PixelArt_YW months ago +2

    You can directly use alimama's Flux ControlNet inpainting to do outpainting; the results are also very good.

  • @dameguy_90
    @dameguy_90 months ago +1

    This is a really helpful video. If a pure Flux workflow that removes the SDXL model comes out, please make a tutorial like this for it too. ^^

  • @wellphoto3d
    @wellphoto3d months ago

    Thanks, CG Top! I've learned a lot from you!

  • @normanrichter2422
    @normanrichter2422 5 days ago

    WOW

  • @Alex_Niko_Y
    @Alex_Niko_Y months ago

    Thanks for the video! Been waiting a long time for Flux and outpainting :)

    • @CgTopTips
      @CgTopTips  months ago

      Thanks, it's a tricky method that I wanted to share.

  • @electronicmusicartcollective
    @electronicmusicartcollective months ago +2

    Good video, thanks, but please change the music to something more ambient or lo-fi for better focus on the topic. Peace.

  • @yklandares
    @yklandares months ago

    Thanks, bro, for the workflow, you're a cool dude.

  • @baheth3elmy16
    @baheth3elmy16 months ago +2

    Surprisingly, the Flux1.Dev.FP8 model gave me an error. You seem to have used the full FP8 checkpoint; I used the FP8 UNET model only and added the dual CLIP and VAE nodes. Also, I had to download the SDXL Union and then the Promax Union ControlNet to get the workflow to work. I added an image, prompted (A woman in a restaurant), changed the seed and fixed it. At the end, the image I input in the first group came out as-is in the last output; nothing happened to it and it wasn't outpainted. The image I used seems to have been too big: I used a 512x512 one and it worked. Thanks again.

    • @Dany-w3g
      @Dany-w3g months ago

      It doesn't work for me either. Maybe the secret is in his (flux dev + vae + clip) model, but he didn't give us a link to download that model, so it won't work for anyone.

    • @gardentv7833
      @gardentv7833 months ago +1

      I had to download the 17 GB model. It works.

    • @user-fo9ce3hr5h
      @user-fo9ce3hr5h months ago

      @@gardentv7833 Can you share controlnet-union-sdxl-1.0 and RealvizXL_V5_BakedVAE.safetensors so we can download them? And where in the ComfyUI folder should we put them?
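
On the image-size issue @baheth3elmy16 mentioned above (a large input failed, a 512x512 one worked), one hedged workaround is to downscale the input before feeding it to the workflow. A Pillow sketch; the 1024 px cap and file names are example values, not something specified in the video:

```python
# Shrink an oversized input image while keeping its aspect ratio.
from PIL import Image

img = Image.open("input.png")                # assumed input file
img.thumbnail((1024, 1024), Image.LANCZOS)   # no-op if the image is already smaller
img.save("input_resized.png")
```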

  • @gardentv7833
    @gardentv7833 months ago +1

    Nice work, thank you. I noticed quality loss in the face and deformed teeth; is there any way to repair this?

    • @bwheldale
      @bwheldale months ago

      I noticed that too, so I temporarily set the scale in the LayerUtility node from 0.5 to 1 and the quality improved, but then the outpainting area was reduced.

    • @user-fo9ce3hr5h
      @user-fo9ce3hr5h months ago

      @@bwheldale Can you share controlnet-union-sdxl-1.0 and RealvizXL_V5_BakedVAE.safetensors so I can download them?

  • @philipp9550
    @philipp9550 14 days ago

    I'm getting this error message from LayerUtility: ImageBlendAdvance V2:
    `np.asfarray` was removed in the NumPy 2.0 release. Use `np.asarray` with a proper dtype instead.
    But I have the newest ComfyUI installed.
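
For reference, `np.asfarray` really was removed in NumPy 2.0, so the error comes from the custom node rather than ComfyUI itself. Until the node is updated, the usual options are pinning numpy below 2.0 or patching the call, roughly as in this sketch:

```python
import numpy as np

x = [1, 2, 3]

# NumPy < 2.0 (removed in 2.0):
# arr = np.asfarray(x)

# NumPy >= 2.0 equivalent:
arr = np.asarray(x, dtype=np.float64)
print(arr)
```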

  • @yngeneer
    @yngeneer months ago

    Thanks a lot

    • @CgTopTips
      @CgTopTips  months ago

      🙏🙏

    • @user-fo9ce3hr5h
      @user-fo9ce3hr5h months ago

      Can you share controlnet-union-sdxl-1.0 and RealvizXL_V5_BakedVAE.safetensors so we can download them?

  • @tetsuooshima832
    @tetsuooshima832 months ago

    I have a few questions:
    1. What's the difference between using ImageBlendAdvance V2 and Pad Image for Outpainting (the local node)?
    2. Why do you invert the mask when there's already an invert_mask option on the node? Maybe show us the output to make it clearer what's happening.

  • @geekboystudio2405
    @geekboystudio2405 months ago +1

    I got this error from KSampler:
    unsupported operand type(s) for *: 'float' and 'NoneType'. What should I do now?

    • @LedroneMini
      @LedroneMini months ago

      Same for me. Maybe it's because I can't find Apply ControlNet (Advanced), so I used Apply ControlNet, but I guess there is a problem with the VAE.

  • @yngeneer
    @yngeneer months ago

    At 9:09 you stated that it is possible to select any SDXL model, so why do I get:
    ## Error Details
    - **Node Type:** KSampler
    - **Exception Type:** RuntimeError
    - **Exception Message:** Given groups=1, weight of size [320, 4, 3, 3], expected input[2, 16, 112, 187] to have 4 channels, but got 16 channels instead
    ## Stack Trace
    Can you help me a bit?

    • @yngeneer
      @yngeneer months ago

      Lol, it didn't take long and I figured it out myself... The problem was in the VAE. Because I used the UNET Flux, I served the Flux VAE via Anything Everywhere and the VAE Encode (for inpainting) received that, but it needs the SDXL VAE... so... that's it ;-)

  • @thesledge3481
    @thesledge3481 months ago +2

    This one breaks for me. The main error appears to be "Given groups=1, weight of size [320, 4, 3, 3], expected input[2, 16, 112, 187] to have 4 channels, but got 16 channels instead" in the KSampler of the SDXL Outpaint group.

    • @thesledge3481
      @thesledge3481 months ago

      I see the problem now: at 5:00 you fail to connect the Load Checkpoint VAE output to the VAE input on the VAE Encode (for inpainting) node.

    • @yngeneer
      @yngeneer months ago

      At least for me, the problem was in the VAE. Because I used the UNET Flux, I served the Flux VAE via Anything Everywhere and the VAE Encode (for inpainting) received that, but it needs the SDXL VAE... so... that's it ;-)

  • @Dany-w3g
    @Dany-w3g months ago

    WOW!

    • @CgTopTips
      @CgTopTips  months ago

      🙏

  • @erichearduga
    @erichearduga months ago +1

    That ComfyUI LayerStyle node pack breaks my ComfyUI every time: File "E:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")... This happens after it uninstalls a bunch of stuff, including Torch.
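
If the install really did replace Torch with a CPU-only build, a quick way to check, plus a typical reinstall command for the portable build (the cu121 index URL is an assumption about the local CUDA setup, not a universal fix):

```python
# Verify whether the installed torch build can see the GPU at all.
import torch

print(torch.__version__)          # a "+cpu" suffix indicates a CPU-only build
print(torch.cuda.is_available())  # False is what raises "Torch not compiled with CUDA enabled"

# Typical reinstall into ComfyUI portable's embedded Python (paths/CUDA version assumed):
#   python_embeded\python.exe -m pip install torch --index-url https://download.pytorch.org/whl/cu121
```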

  • @FilipBejtner-wg1oj
    @FilipBejtner-wg1oj months ago

    How much VRAM is needed?

    • @CgTopTips
      @CgTopTips  months ago

      Minimum 6 GB VRAM

  • @KasperskyGroup
    @KasperskyGroup months ago

    Hi Prof, there is an error in KSampler: mat1 and mat2 shapes cannot be multiplied (10528x16 and 64x3072)... why? Thanks in advance for all your projects; they are all the best.

    • @gastonboigues4124
      @gastonboigues4124 months ago

      The same thing happens to me.

    • @gastonboigues4124
      @gastonboigues4124 months ago

      It's working fine now!!

    • @KasperskyGroup
      @KasperskyGroup months ago

      @@gastonboigues4124 Hi, is it generating the expanded image correctly for you now? ... I only get a green background in the enhanced Flux nodes with the 0.95.

    • @gastonboigues4124
      @gastonboigues4124 months ago

      @@KasperskyGroup Yes, it worked fine for me after changing the ControlNet model.

    • @KasperskyGroup
      @KasperskyGroup months ago

      @@gastonboigues4124 Hi Gaston, can you send me the link to the ControlNet, please?
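
The mat1/mat2 error @KasperskyGroup reported above is a generic shape mismatch: some layer received features of the wrong width, and in this thread it went away after switching to the correct ControlNet model. A purely illustrative reproduction using the shapes from the error text (not the workflow's actual layers):

```python
# A linear layer expecting 64-dim inputs fed 16-dim features raises the same error.
import torch
import torch.nn as nn

layer = nn.Linear(in_features=64, out_features=3072)
wrong_input = torch.randn(10528, 16)  # 16 features where 64 are expected
try:
    layer(wrong_input)
except RuntimeError as e:
    print(e)  # mat1 and mat2 shapes cannot be multiplied (10528x16 and 64x3072)
```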

  • @MdNoman-tl6yf
    @MdNoman-tl6yf months ago

    Bro, can I use Flux with an RTX 4060 Ti 16 GB?

    • @erichearduga
      @erichearduga months ago +1

      I run it on an RTX 3060 with 12 GB. Both Dev and Schnell, 8- and 16-bit. Works fine. It takes maybe 30 seconds to do a regular generation; the more complex version of this workflow took about 2:00 using the Hyper 8-step LoRA.

    • @erichearduga
      @erichearduga months ago

      I just ran this workflow at 30 steps... 3:00 minutes... It used 50 GB of RAM and 11.4 GB of VRAM, with the 16-bit model and CLIP.

    • @tetsuooshima832
      @tetsuooshima832 months ago

      @@erichearduga haha that means the more RAM you have... the more RAM gets eaten xD I have 32GB, maybe I don't need to upgrade after all xDD

    • @sirjcbcbsh
      @sirjcbcbsh months ago

      Flux has so many models… and the GGUF ones. Which one would you guys suggest? I have an RTX 4070 Ti Super with 8 GB VRAM and 16 GB of memory. My laptop has an RTX 4060 with 8 GB VRAM and 16 GB of memory.
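
For the VRAM questions in this thread (the 6 GB minimum and the ~11.4 GB figure above are the commenters' numbers, not verified here), a quick way to see what a given card reports:

```python
# Print name, total VRAM, and currently allocated VRAM for the first CUDA device.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(props.name)
    print(f"total VRAM: {props.total_memory / 1024**3:.1f} GB")
    print(f"allocated:  {torch.cuda.memory_allocated(0) / 1024**3:.1f} GB")
```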

  • @therookiesplaybook
    @therookiesplaybook months ago

    Comfy is so unnecessarily complicated to complete a simple task.

    • @lordmo3416
      @lordmo3416 4 hours ago

      What is simpler?

  • @cjosejaen
    @cjosejaen months ago

    I cannot continue from the "Load Checkpoint" node
    ERROR: Could not detect model type of: B:\Data\Models\StableDiffusion\flux1-dev-fp8.safetensors
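
"Could not detect model type" from Load Checkpoint usually means the file is a UNet-only diffusion model rather than a full checkpoint that also bundles CLIP and VAE weights (the UNET-loader route described in the comments above is the workaround). A hedged diagnostic sketch, using the path from the comment, to see what the file actually contains:

```python
# List the top-level tensor key prefixes in a .safetensors file to see whether
# it bundles VAE/CLIP weights or only UNet weights.
from safetensors import safe_open

path = r"B:\Data\Models\StableDiffusion\flux1-dev-fp8.safetensors"
with safe_open(path, framework="pt", device="cpu") as f:
    prefixes = sorted({key.split(".")[0] for key in f.keys()})
    print(prefixes)
```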