How FAST is the RTX 4090 for 3D Animation + Rendering??

  • Published Jun 19, 2024
  • It really IS twice as fast as the RTX 3090! ...sometimes
    ► My New Maya for Animators 2023 Online Course - bit.ly/mayaforanimators
    ► Sir's Animation Mentoring + Courses - courses.sirwade.com/
    ► Summer Class Enrollment is Open - bit.ly/animbasics
    ► Support the Channel - / sirwade
    ► RTX 4090 Unboxing - • My Thoughts on the RTX...
    ► ProRigs Character Rigs - bit.ly/prorigs
    ► RTX 4090 Info - www.nvidia.com/en-us/geforce/...
    -- Sir Wade's Links --
    ► Twitter - / sirwadefx
    ► TikTok - / sirwade
    ► Instagram - / sirwadefx
    ► Twitch - / sirwadefx
    -- Resources --
    ► Animation Discord - / discord
    ► Animation Books - bit.ly/animationbooks
    ► My Workflow Gear: www.amazon.com/shop/sirwadefx
    ► Adobe Animation Software - bit.ly/adobecreatives
    ► Animation PC Builds: kit.co/SirWadeFX
    ► Splines Shirts + Sweatshirts: bit.ly/animshirts
    ► Resume & Cover Letter Workshop Replay - bit.ly/2HCKean
    ► Maya for Animators Workshop - bit.ly/mayaforanimatorsreplay
    ► Animation Apparel: bit.ly/animshirts
    -- This Video's Music --
    ► Musicbed - share.mscbd.fm/SirWadeFX
    Chapters:
    0:00 - 3 Questions to Answer
    0:58 - Blender Benchmarks
    1:41 - 16K Image Renders
    2:40 - Rendering Animated Frames
    3:25 - Why VRAM Matters
    4:16 - Hybrid GPU Rendering Complications
    5:18 - Rendering an Animated Short
    6:33 - Viewport Speed Test
    7:31 - 3DMark Raytracing Benchmark
    8:34 - Maya Arnold Render
    8:53 - Maya Viewport Rig Performance
    10:32 - Is it Worth It
    11:08 - Render Farm Cost Comparison
    12:16 - Weigh the Costs
    12:38 - Why RTX Matters
    -----------------------------------------------------------------------------------
    Thanks for watching!
    Subscriber Tracking - 201,305
    SEO Stuff: rtx 4090, rtx 4090 3d, nvidia, nvidia rtx 4090, 4090, 4090 blender, 4090 maya, c4d, houdini, 4090 3d animation, 4090 rendering, blender 4090 speed, 4090 optix, 4090 cycles, 4090 eevee, 4090 arnold, 4090 redshift, 4090 octane, 4090 3d rendering, 4090 for artists, 4090 review, 4090 benchmark, blender benchmark, 4090 vs 3090, how much faster is 4090, is 4090 better, is 4090 worth it, 4090 speed, 4090 examples, 4090fe, nvidia studio, rtx comparison, 30 series comparison, 3090 vs 3080
    #rtx4090 #4090 #nvidiastudio
    Music Sync ID: MB01SUCKLFYHRLN
  • Film & Animation

Comments • 717

  • @SirWade
    @SirWade  1 year ago +95

    What do you think of the results so far?? Would you get use out of these speeds? And is there anything you'd like to see me cover next time? :)

    • @NightVisionOfficial
      @NightVisionOfficial 1 year ago +2

      Well... I want to do more complex sculpting in Blender, and to learn UE5 faster, since my GPU is too slow for me to have any patience learning it :/. So... I guess it's a yes. Also, I was looking into Substance Painter, but I have many materials in Blender, so baking textures more quickly would be good!

    • @SSPBradley11
      @SSPBradley11 1 year ago +4

      If you run the tests again, I'd be interested in seeing how it handles particles for special effects.

    • @R1s1ngDaWN
      @R1s1ngDaWN 1 year ago

      I would love to see a review of how far the HIP backend has come. Can't wait for Radeon 7000, because Nvidia has a monopoly in Blender right now.

    • @adnanerochdi6982
      @adnanerochdi6982 1 year ago +4

      Of course this is a giant leap in performance gen-to-gen, and it is worth it for someone who needs to double their speeds right away. Thank you so much for the video, Wade, by the way. But the value proposition that render farms provide makes one think twice before making the purchase, especially if you have to upgrade the rest of your gear, like the PSU and motherboard. These are the main points for me, especially when a 3090 still provides some of the fastest viewport and rendering speeds.
      On a side note, and I think this is so important: 4090 owners can't double their VRAM with a second GPU the way 3090/Ti owners can, which is worth considering if you're already pushing the limits of 24 GB of video RAM. This alone makes me wonder whether we can expect something next year with NVLink and slightly faster speeds. It's really exciting that we can now take super-fast rendering for granted, and how cheap it has become. Thanks again, Wade; it was an exciting and complete video.

    • @garrygiomarelli3476
      @garrygiomarelli3476 1 year ago +6

      It would be interesting to see the 4090 vs. multiple 3090s combined in the same machine, and some more info on power-consumption comparisons when working on big projects.

  • @johntnguyen1976
    @johntnguyen1976 1 year ago +306

    Probably the most useful of the huge influx of 4090 videos happening right now... because all we ever get are stats and catering to gamers. Thanks for putting one out for the creatives!

    • @Oscar4u69
      @Oscar4u69 1 year ago +41

      I got bored of all the reviews talking only about games; that's just a waste of a GPU. Games don't need that much power; the real use for a GPU is in things like this.

    • @ishiddddd4783
      @ishiddddd4783 1 year ago +23

      @@Oscar4u69 If they're playing at 4K 100+ fps, it's not really a waste, especially since at the moment it's the only GPU that can do so natively in modern titles.

    • @Anti-FreedomD.P.R.ofSouthKorea
      @Anti-FreedomD.P.R.ofSouthKorea 1 year ago +5

      @@ishiddddd4783 But there aren't many great games right now that suit this performance. I would rather play GMod than any of the stuff made for 11- and 12-year-olds that's currently being marketed as "4K max settings".

    • @ishiddddd4783
      @ishiddddd4783 1 year ago

      @@Anti-FreedomD.P.R.ofSouthKorea OK, but that's you, and GMod runs at 4K on almost decade-old hardware.

    • @lavart7043
      @lavart7043 1 year ago

      How is it useful when no Radeon GPUs are included?

  • @feloi3033
    @feloi3033 1 year ago +19

    Only this guy can say, "Mom, I need a 4090 for homework."

    • @cryogenicheart2019
      @cryogenicheart2019 2 months ago +1

      Quickest way to get your parents to take you out of animation school and put you in a real university

    • @Libertystreet216
      @Libertystreet216 1 month ago

      I can say it as well.

  • @user-ik8vy1rg8f
    @user-ik8vy1rg8f 1 year ago +305

    Really cool that you got access to these cards like Digital Foundry / Gamers Nexus / et cetera. Those guys aren't focusing on the artist tools like you are, so it's really nice to see that aspect explored here! Definitely interested in Unreal / Houdini / DaVinci/Fusion.

    • @n00buo
      @n00buo 1 year ago +6

      Nvidia is desperate to sell this scam card; they'll send cards to anyone for a few bucks so they can lie to people.

    • @RazielXT
      @RazielXT 1 year ago +32

      @@n00buo The 4080 12GB is the scam card; the 4090 is a beast.

    • @n00buo
      @n00buo 1 year ago

      @@RazielXT The 3080 is the best card so far; the 4000 series can't compete with Ampere. They're just heaters for people with money.

    • @user9267
      @user9267 1 year ago +13

      @@n00buo
      4090 seems to be a pretty decent deal

    • @n00buo
      @n00buo 1 year ago +2

      @@user9267 HAHAHA, Nvidia has bots for YouTube comments; they know.

  • @JetCooper3D
    @JetCooper3D 1 year ago +56

    We work on Disney and Marvel films at Pinewood Studios. After doing similar tests, we changed our RTX 3090 cards over to 4090s. We stopped buying Quadro cards years ago.
    Great video - subscribed - thank you!

    • @nyahbinghiman5984
      @nyahbinghiman5984 1 year ago +1

      Why did you stop buying Quadros?

    • @ahmetkocoval1375
      @ahmetkocoval1375 11 months ago +1

      ​@@nyahbinghiman5984muchhhhhhhh dollars 😂

    • @ichisenzy
      @ichisenzy 11 months ago +4

      tell your boss to make actual good movies

    • @insertname7458
      @insertname7458 10 months ago

      Good job, g. I don't usually watch films or anything, but I appreciate y'all being able to make all that look realistic, even 10 or 20 years ago.

  • @mjlagrone
    @mjlagrone 1 year ago +93

    Yes please for the Part 2. I would especially like to see how it compares in Blender when you have a lot of hair and subsurface scattering! And maybe also with a giant pile of grass and other vegetation.

    • @SnakeTheCowboy
      @SnakeTheCowboy 1 year ago +4

      All hands up for more Blender testing!

  • @fxadam
    @fxadam 1 year ago +70

    Great video. Picked up the RTX 4090 today and it is incredible at hardware rendering in Arnold, Blender, Keyshot etc. Games are fun but this GPU is excellent for content creation.

    • @SW-fh7he
      @SW-fh7he 1 year ago +2

      How did you get it?

    • @fxadam
      @fxadam 1 year ago +2

      @@SW-fh7he Walked into Microcenter on launch day. They had plenty. They're sold out now, but they should have more shortly. Apparently Nvidia is sending out links to GeForce Experience users that will allow them to easily order a 4090 from Best Buy without having to deal with the bots that are slamming Best Buy right now.

    • @checkmymovie
      @checkmymovie 1 year ago +1

      I'm going to pick up mine today and build a whole new computer for Daz Studio, because I'm a character designer.

    • @zubairalam2795
      @zubairalam2795 10 months ago

      Also the PSU... :/ and the wattage is huge!!!

  • @KillahMate
    @KillahMate 1 year ago +121

    Note: if you're using Cycles (as opposed to Eevee) it's *all* raytraced. It's a path tracing renderer which means that every sample for every pixel has been raytraced, and has therefore gone through the RTX hardware pipeline - the only difference with reflective surfaces is how coherent the rays are. To test non-raytracing performance you'd need to use Eevee.

    • @Deezsirrer
      @Deezsirrer 1 year ago +6

      Note: scene layers are awesome; you can render Cycles and Eevee together.

    • @Pixel_FX
      @Pixel_FX 1 year ago +2

      Ray tracing and path tracing are two different things.

    • @KillahMate
      @KillahMate 1 year ago +7

      @@Pixel_FX They are two related things - one is a subset of the other. The important bit is that if you have an RTX GPU and Cycles is configured to make use of it via OptiX for hardware acceleration, then the Cycles path samples are being calculated on the GPU's RT cores. And since everything Cycles does is path samples, then _everything_ is rendered with RT cores.
      This is unlike most video games, which must run in real time and therefore only use RT when they have to, like for reflections and such, and never use RT to do path tracing because it's still too demanding and slow for real time.

    • @SirWade
      @SirWade  1 year ago +13

      I misspoke - I was talking about the shaders not being reflection / refraction-heavy in that scene. The scene didn't require much complex calculation compared to something like the Maya render later in the video

    • @BeheadedKamikaze
      @BeheadedKamikaze 1 year ago +14

      @@SirWade Diffuse lighting is *more* complex to calculate than specular reflections. As @KillahMate is trying to explain, this is how path tracing works - a diffuse surface is really just a crap-ton of reflections, all from different directions, and the colour is averaged over hundreds of samples until it becomes smooth. Whereas a specular shader reflects all the rays in more or less the same angle so it turns into a clean result much more quickly. You are getting confused with game rendering terminology. Path tracing is *all* reflections. 100%. It doesn't matter how many specular surfaces there are. And every single one of those rays is calculated using the RT cores.

  • @thevoid6756
    @thevoid6756 1 year ago +7

    The "Why VRAM Matters" chapter is the hidden gem of this video. Glad Paul recommended your channel.

  • @techdraconis
    @techdraconis 1 year ago +88

    I would love to see a part 2 with Unreal and Houdini.

  • @user-ly1en7kl2o
    @user-ly1en7kl2o 1 year ago +9

    Great that you're back; can't wait to see more of your animations.

  • @gcharb2d
    @gcharb2d 1 year ago +20

    That's why I got the 12 GB RTX 3060 instead of the 8 GB RTX 3070, a tad slower, but cheaper, and it handles larger scenes!
    Great video!

    • @MIchaelSybi
      @MIchaelSybi 1 year ago +3

      I got a GTX 680 with 4 GB instead of 2, and it served me several more years than it would have otherwise, as over time many programs listed 4 GB as a bare minimum.

  • @LiyoungMartin
    @LiyoungMartin 1 year ago +40

    Finally, an in-depth analysis of 4090 performance for 3D workflows!!! Could you pretty pleeeease (as you mentioned earlier in the video) do a separate video for Unreal Engine? Thanks!

    • @flyinggecko3322
      @flyinggecko3322 1 year ago +3

      Yes! Unreal, and an even further look into Blender; different types of viewport settings and final renders at 4K would be amazing!

    • @zubairalam2795
      @zubairalam2795 10 months ago

      That's what I'm looking for as well... thanks, mate.

  • @marcusolivix
    @marcusolivix 1 year ago +1

    Thanks! Finally a review for creators. ...And yeah! Please, a part two!!! It would be great to see how it performs in different render engines and different 3D software.

  • @hanselespinosa8918
    @hanselespinosa8918 1 year ago +6

    As an artist working with a 1080 Ti in the current year, I can't even fully grasp the number of creative decisions I could make with this card if I were able to afford it. I agree with the statement regarding the gaming community: the conversation about the 40 series ends the same way it does every year, with more frames equaling better performance and a better gaming experience. That's it. For the creative community, it means time budgets can be allocated differently. When you mentioned the difference of 9 hours: man, 9 hours for sound design or post-processing in general can make a huge difference.
    Really good review. First time checking out the channel; thanks for sharing.

  • @kellyshipman1341
    @kellyshipman1341 1 year ago +1

    Fantastic video! Would definitely like to see some more.

  • @DonC876
    @DonC876 1 year ago +10

    I think another angle on this is efficiency. If your computer does a lot of compute, that adds to the power bill fast. Just yesterday I saw a review where they tried to find the best efficiency by slightly underclocking (about 150 MHz) and undervolting, and they got the power consumption down almost to 3090 levels (roughly from 400 W to 300 W) with only a few percent of performance lost. That would mean you basically double your efficiency in that case. So there's another cost factor that can make the investment worthwhile even quicker.
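The undervolting savings described in that comment are easy to sanity-check with quick arithmetic. A minimal sketch in Python, where the 400 W/300 W figures come from the comment but the daily load hours and electricity price are illustrative assumptions:

```python
# Back-of-the-envelope yearly energy cost for a GPU under load.
# Power figures are from the comment above; hours and price are assumptions.
def annual_energy_cost(watts: float, hours_per_day: float,
                       price_per_kwh: float, days: int = 365) -> float:
    """Yearly electricity cost of a component drawing `watts` while active."""
    kwh_per_year = watts / 1000 * hours_per_day * days
    return kwh_per_year * price_per_kwh

stock = annual_energy_cost(400, hours_per_day=8, price_per_kwh=0.30)  # stock draw
tuned = annual_energy_cost(300, hours_per_day=8, price_per_kwh=0.30)  # undervolted
print(f"stock: {stock:.0f} EUR/yr, tuned: {tuned:.0f} EUR/yr, "
      f"saved: {stock - tuned:.0f} EUR/yr")
```

Under these assumed numbers the 100 W reduction saves on the order of 90 EUR per year of daily rendering; scale the inputs to your own usage.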

  • @theredredvideo4189
    @theredredvideo4189 1 year ago

    Hands down one of the most comprehensive and useful reviews / deep dives on the 4090. Subbed, liked, and please do a part 2 on C4D!

  • @AndrewTanielian
    @AndrewTanielian 1 year ago +1

    This is a great video! You explained all this very well.

  • @GmanGavin1
    @GmanGavin1 1 year ago +1

    Love the video format and the information in it. This is exactly what I'll send people whenever I have to explain why VRAM matters.

  • @TheRealLink
    @TheRealLink 1 year ago +1

    As someone dabbling a lot with Blender and doing some freelance work, your graphs were very helpful, whether brute-forced or RT native. Great explanations! Subbed.

  • @3dduff
    @3dduff 1 year ago +5

    Yes, please make a part 2. I'm a freelancer who uses Maya/C4D/Houdini and renders mostly with Redshift. I have my own five-system mini-farm stocked with 30xx GPUs. But a big part of my time is spent on simulations. Only a few situations exist where GPUs can speed up simulations, but I would very much like to see some of them run on a 4090.
    Great video as always; keep up the good work.

  • @maxrose8845
    @maxrose8845 1 year ago +2

    Love the focus on creators - not enough of that. You're the man Sir Wade!

  • @amineamamra2603
    @amineamamra2603 1 year ago +3

    The results are mind-blowing!!
    Would love a part two, testing maybe EmberGen and some Houdini and Karma GPU on it!!

  • @legendhasit2568
    @legendhasit2568 1 year ago +1

    Perfect video! Exactly the review I was looking for 👍 You have a new subscriber, keep it up 👍

  • @SATYAMKUMAROY
    @SATYAMKUMAROY 1 year ago +1

    Needed this review. Very good content.

  • @brabes76
    @brabes76 1 year ago

    Great review, thanks for sharing.
    I would also like to say: nice transition effect when you did the correction moment.

    • @SirWade
      @SirWade  1 year ago

      Thank you! :)

  • @pmAdministrator
    @pmAdministrator 11 months ago +1

    You're absolutely right! Thank you for the video. For us, who use these cards for work, these cards are INSANE!

  • @nerukas86
    @nerukas86 1 year ago +1

    Great video; that's what I call quality content! Thank you.

  • @SimonC021
    @SimonC021 1 year ago

    Thanks for this! Very tempted to get one.

  • @rcarter1690
    @rcarter1690 1 year ago +16

    Finally some real-world tests that really show why an animator would spend so much on a card like this. That 3-minute short test is the best I've seen; no other YouTubers seem to understand it. Thank you!

  • @henrybonney3208
    @henrybonney3208 1 year ago

    Thanks a lot, Sir... You're helping us.

  • @St1ngerGuy
    @St1ngerGuy 1 year ago +26

    Very interested in video export from Unreal Engine 5 using the Movie Render Queue in a raytracing-heavy scene. I have a 3090 right now and it does pretty well, but the 4090 looks like it leaps ahead by quite a bit. Thanks for putting this video together.

    • @bikboi3292
      @bikboi3292 1 year ago

      Do you regret buying the 3090?

    • @chillsoft
      @chillsoft 1 year ago +1

      I have two 3090s, so this card is moot for me. You can't pack two of these in a case to double up; no motherboard supports two because of how thick they are. I'm going to stay with my NVLinked 2x 3090s and skip this generation, unless I'm able to watercool them.

    • @zilverheart
      @zilverheart 1 year ago +1

      @@chillsoft There exists a water-cooled version of the 4090.

  • @kimmysander2447
    @kimmysander2447 1 year ago +2

    I'd love to see a part 2!

  • @susscrofastudio9171
    @susscrofastudio9171 1 year ago

    Your videos are really useful to me... Thank you for your review~~ Waiting for part 2.

  • @MattHalpain
    @MattHalpain 1 year ago +1

    Great video. Super awesome to see the 4090 from an artist point of view.

  • @munkmade
    @munkmade 1 year ago

    Awesome insight, thanks for this one!

  • @ClickFlint
    @ClickFlint 1 year ago +1

    This is the one video I've been looking for; everything you hear about the RTX 4090 is gaming, so great work, Sir Wade.

  • @flyinggecko3322
    @flyinggecko3322 1 year ago +1

    Great video! Please do take a look at Unreal, and an even further look into Blender; different types of viewport settings and final render times at 4K would be amazing!

  • @Bunderwahl
    @Bunderwahl 1 year ago +4

    Part 2 please! It would be really cool if you could include the 4080 and 7900 XTX, and even better if you could add video production and other creative applications!

  • @MikaiGamer1286
    @MikaiGamer1286 1 year ago

    Glad I found you with this; I hardly ever see 3D artists benchmark these cards, and I haven't done it in years now, even after graduating with a degree and everything, as there's no work in Florida for it.

  • @jooaquin
    @jooaquin 1 year ago

    Loved the review! Please do another one with C4D and Octane benchmarks!

  • @procrastinator24
    @procrastinator24 1 year ago +2

    Please, part 2! I'm looking at this card specifically for Blender and Unreal Engine :D Thanks so much for the content!

  • @IN1Studio
    @IN1Studio 1 year ago

    Excellent video, love the production quality, would like to see part 2 for Unreal Engine!

  • @enigmawstudios4130
    @enigmawstudios4130 1 year ago +2

    You're always the go-to for usable info on graphics cards. Everyone else is gaming.

  • @MrTomanonamous
    @MrTomanonamous 1 year ago

    This is really helpful, thank you! Everyone else just concentrates on games, and that's not what we need.

  • @BenJi2DxD
    @BenJi2DxD 1 year ago

    Thank you for this video, Sir Wade.

  • @Animationcafe
    @Animationcafe 1 year ago

    Thank you. Very helpful review.

  • @kamylo777
    @kamylo777 1 year ago

    I was really, really expecting this video of yours; awesome content :D
    I would like to see this card's performance using the new Karma XPU, now that you mention Houdini.

  • @blenderian.3d
    @blenderian.3d 1 year ago

    Nice review. Thanks for this.

  • @NayLinHtaik
    @NayLinHtaik 1 year ago

    Thank you so much !!! 😍

  • @mixtapechi
    @mixtapechi 1 year ago +31

    It's good to see someone making benchmarks on creative programs rather than games. Thanks!

  • @12463trf
    @12463trf 1 year ago

    I'm glad to find some viewpoints on this new card from the creative-workstation perspective. I'm not a professional, but I'm looking at doing 2D and 3D rendering eventually, and I've been trying to balance that with wanting to game on the same rig. So these reviews are very helpful.
    That price though is yeesh :(

  • @lsg2216
    @lsg2216 1 year ago

    Thank you, it works perfectly!

  • @gabrielmoro3d
    @gabrielmoro3d 1 year ago

    This is incredible and so thorough.

  • @farazshababi1
    @farazshababi1 1 year ago

    I love the comparison to render farms... I think a lot of people want to know if they should invest in the hardware themselves! Very helpful! THANKS SIR

  • @mechaboy95
    @mechaboy95 1 year ago +1

    You can use some of your PC's RAM as video memory if your GPU doesn't have enough to render a scene.
    It's in some Windows setting, but I'm not sure how much it helps.

  • @pariahgaming365
    @pariahgaming365 1 year ago +1

    I'm an animation student and I have a 3080 Ti. I just finished my first-ever shaded render in Maya. It was just 150 frames, but it smoked my M1 Mac mini. Once my workloads start getting super heavy later on, I'll definitely upgrade, but I should be good for now. Also, my CPU is a Ryzen 9 5900X with 32 GB of DDR4 RAM; I'm sure I'll be upgrading to either 64 or 128 GB in the near future.

  • @shitshow_1
    @shitshow_1 1 year ago

    Honestly, this is some of the best content I've watched. Thanks for posting :D

  • @geekydomstudios
    @geekydomstudios 1 year ago

    Please make a part 2! This was an amazing video. 🙂

  • @theshawnmccown
    @theshawnmccown 1 year ago +1

    These kinds of results are the selling point for me. It's really great value when it performs this well in work and play.

  • @luisa_fit_apps9947
    @luisa_fit_apps9947 1 year ago

    Great video, the program works great.

  • @jarabito001
    @jarabito001 1 year ago

    Looking forward to part 2!!

  • @danny3man
    @danny3man 1 year ago +1

    Very nice; I'm also interested in benchmarks using V-Ray, FStorm, or Octane (if they have a standalone benchmark).

  • @IndyStry
    @IndyStry 1 year ago +1

    Great video. I'm waiting to see this exact comparison for Unreal Engine 5.1 with a scene full of really heavy 8K textures that push the VRAM. Also, BE AWARE that the 4000-series cards don't support NVLink, in case someone is thinking of getting two (well, you physically won't be able to fit two anyway). This is why two A6000s or two 3090s might be a better bet for those looking to stack another card later down the line for more performance.

  • @jasonhoi85
    @jasonhoi85 1 year ago

    Thanks! This is the best benchmark for 3d artists.

  • @BOSSposes
    @BOSSposes 1 year ago

    I needed this video! Good content, pal.

  • @froggy3u
    @froggy3u 10 months ago +3

    At around 3:10 in the video (asking just in case): do you leave your viewport in rendered shading preview while rendering the scene?
    I learned this the hard way five years ago. I remember running out of memory on my 1070 while rendering a heavy scene with Cycles.
    I had set the viewport samples and the render samples to the same value, 4096, and was curious why it could use 6 GB of VRAM in viewport shading but not in the render.
    It turns out that while I was rendering the scene, the viewport and the render window were both using VRAM.
    I switched my viewport to the Image Editor layout, and it started rendering each frame in under 15 minutes instead of hitting an "out of memory" error.
    Nowadays, even on a 3090, I always turn on [temporary editors > image editor] in the settings and Ctrl+Space one of the windows for the render (so the other windows are inactive while I'm rendering).
    In my case it improved my renders 3-4x once I became aware that viewport shading also uses VRAM even while rendering.
    Frame 1: 18 s vs. 1 min 09 s (with viewport shading in the background) on the same scene settings from five years ago.
    Not sure if your case is the same as mine. Hope this info helps someone.

  • @joonglegamer9898
    @joonglegamer9898 1 year ago +1

    There's a lot more to consider here, especially for the average 3D modelling/animation enthusiast. If you look at your results from the Sprite Fright animated-frames production files, you can clearly see it's not "double the performance"; if anything it's barely 20 percent more with a 4090 than with a 3090. So if you're the average hobbyist, it might not be such a huge deal to miss out on 20 percent performance; it's certainly NOT twice the performance. And that brings me to another thing: cost. Here in Sweden these cards cost around 26K SEK, which translates to 2,321 USD. Most of us who bought the 3090 for around 2,000 USD might not be THAT motivated to junk our cards and pay an extra 2.3K to get the difference. In a professional STUDIO setting I totally get the value; even a 5 percent difference can make or break some larger budgets with time constraints. You also have to realize that DLSS isn't used everywhere; it's more of an interpolation thing, like those used in older television sets to draw the frames in between two extremes or rendered images.
    So, in short: I don't think you'll notice much difference when working with Blender Cycles in the viewport while rotating and inspecting the scene. The biggest major upgrade was in fact from the 1080 Ti to the 3090, where you went from choppy, slow movement to relatively real time. From 3090 to 4090, the difference is not as HUGE as you make it sound here.
    Also, in Blender, animation files (especially rigged ones) are very CPU-bound, and here it's actually better to have a better CPU.

  • @tonymoore3122
    @tonymoore3122 1 year ago

    Super helpful as I'm considering purchasing a 4090 or 4080. Thanks so much!

  • @whidzee
    @whidzee 1 year ago +7

    I'd love to see the performance differences between all the 40XX cards.

  • @mad_archviz6478
    @mad_archviz6478 1 year ago

    Thanks man, good video

  • @aroalien
    @aroalien 1 year ago

    Nice video, it works!

  • @martinvanstein.youtube
    @martinvanstein.youtube 1 year ago

    Great video ... well done!
    Honestly, it's very hard to find a good video on real card performance. A 3DMark or Cinebench score is nice, but it doesn't say anything; heck, even the ones where they show gameplay are ridiculous.
    What you did is amazing, as it provides some real-world context for what to expect when you fire up a render.
    And you're right about render farms.
    I'm not doing high-end stuff (mostly 3D work for marketing clients who don't have any content), but even in that context a render farm is too expensive.
    In one of my last projects I rendered a 250-frame clip of a doorknob with a slowly panning camera. Even with optimizing and compromising, it cost €1 per frame ... times 5 variations ... not really doable.
    For the time being I've switched to Redshift, as it renders much faster with no fireflies; it's a biased engine, so the GI and such look a bit flaky, but if and when some money comes in I'll definitely build a desktop system with this beast of a card in it.
    What kind of CPU are you running?
    Intel-based (Xeon or i9) or a Ryzen?
    Again, great video!!!
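The €1-per-frame example in that comment makes it easy to estimate when buying a card beats sending every frame to a farm. A rough sketch with hypothetical numbers (the card price and per-frame electricity cost are made up for illustration; only the ~€1/frame farm rate comes from the comment):

```python
import math

# Break-even point between renting a render farm and owning a GPU.
# All figures are hypothetical; plug in your own rates.
def break_even_frames(card_price: float, farm_cost_per_frame: float,
                      electricity_per_frame: float = 0.0) -> int:
    """Frames after which owning the card is cheaper than farming them all."""
    savings_per_frame = farm_cost_per_frame - electricity_per_frame
    return math.ceil(card_price / savings_per_frame)

# The doorknob clip: 250 frames x 5 variations at ~1 EUR/frame
farm_bill = 250 * 5 * 1.0
print(f"farm bill for that one job: {farm_bill:.0f} EUR")
print(f"frames to amortize a 2000 EUR card: {break_even_frames(2000, 1.0, 0.05)}")
```

With these assumptions, one job like that covers a large fraction of a card's price, which is the point the comment (and the video's render-farm chapter) is making.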

  • @christopherjunkins
    @christopherjunkins 1 year ago

    PART 2! This is awesome work!

  • @ChaosOver
    @ChaosOver 1 year ago +1

    In a lot of workflows the artist is the limiting factor, not the hardware, even in look dev. The real benefit will be in lighting and final rendering (if you're not working on complex shots that won't fit into your VRAM anyway).
    (Btw, if you're talking about profitability, power draw is an important factor. What's the min (idle/desktop), avg (working in the viewport), and max (rendering) power consumption? What's the mix (desktop, viewport, render) in real-world scenarios? How much does it draw on an example project per day, compared to other cards? And how much does the room temperature rise (believe me, it does)? How much power does the AC need to cool it? These factors also matter when calculating your costs and figuring out the best solution for you.)

  • @LordLab
    @LordLab 1 year ago +3

    It would be nice to see tests like this on new CPUs, like the AMD AM5 lineup and Intel 13th Gen.

  • @fabianoperes2155
    @fabianoperes2155 1 year ago

    Man, before I watch the video, I just wanted to say you are GORGEOUS! GOSH!!!

  • @sleepingonthecouch95
    @sleepingonthecouch95 1 year ago

    Great video as always, Sir.
    That part about VRAM is exactly what I experienced in the last few months: I can't render with my 3080, so I have to use the CPU instead, which slowed the whole project down. So I was deciding between two 3090s, since the 4000 series kills off NVLink. But man, that 4090 looks juicy, so I placed my order yesterday, haha. Furthermore, Nvidia said that PCIe 5.0 on the 4090 will be fast enough to connect GPUs without NVLink. Hope that day comes; a combination of two 4090s would be interesting.
    One more thing: at 4:45 you said you checked your notes about that rendering history. Could you briefly explain how you take notes on your work and how you manage them? That'd be super helpful for learning more about production beyond just the technical stuff.

  • @dominic.h.3363
    @dominic.h.3363 1 year ago +1

    VRAM was the reason I went with a 3060 instead of a 3070: whenever I'd need to waste tens of hours using CPU+RAM as a fallback, the few tens of minutes saved with the faster 3070 wouldn't be worth it. It was very hard to justify replacing a broken GPU and being voluntarily fleeced with an almost $1K 3060 back at the height of the cryptomining craze, but seeing how fast viewport rendering is with a 4090, I'm tempted to bite the bullet once currency exchange rates stabilize.
    This review was everything I wanted from a creator perspective but never got from the usual outlets. Thanks!

  • @o.b.a6035
    @o.b.a6035 1 year ago

    Loved the video, and I watched the last one on the 4090 😁😁😁 Gotta make a part 2 video!

  • @ivanoleaanimator
    @ivanoleaanimator 1 year ago

    I think there also needs to be a discussion of the power consumption of a machine using the 4090. Off the top of my head, I believe it ranges from around 350 watts up to 600 W, which is like running a handheld vacuum cleaner while you render, and half that at idle.

  • @SyntaxDomain
    @SyntaxDomain 1 year ago +1

    Great video! I'd love to see a Houdini and Substance Painter video as well, if you get a chance.

  • @Betoromero22
    @Betoromero22 1 year ago

    Brother!!! Finally I see a complete video focused on us creators (an architect, in my case). I just bought my 4090 card and wasn't very sure about it.
    I'd be immensely grateful if you could do a comparison with Lumion, rendering videos with all the effects enabled (reflections and lighting effects).
    Greetings from Mexico

  • @beintouch99
    @beintouch99 ปีที่แล้ว

    Thank you for sharing

  • @hardwire666too
    @hardwire666too ปีที่แล้ว +4

    I am so glad you talked about an actual animation rendering benchmark. I have gotten into countless arguments with people about how a single still frame tells me nothing about how a video card will perform for my needs as an artist. All that tells me is how well that card renders that one frame with the most optimal settings. It tells me basically nothing. People just DON'T understand that the settings used for one frame might not be great for the next. So one frame might render in 30sec, the next in 1min 30sec, and the frame after that might take 5min. That single-frame benchmark is utterly useless. So thank you. I feel vindicated. lol.
    Also, on the note of render farms: Blender has a fairly popular one called SheepIt, where you can use your own hardware to earn time on the render farm for your own projects.
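The commenter's point can be sketched numerically: extrapolating a shot's total render time from one fast frame badly underestimates it when per-frame complexity varies. The frame times below are invented to match the 30sec / 1min30sec / 5min pattern mentioned above:

```python
# Toy illustration: why a single frame is a poor benchmark for an animation.
# Frame times (in seconds) are invented for illustration.
frame_times = [30, 90, 300, 45, 120]

# Naive estimate: assume every frame renders as fast as the first one.
naive_estimate = frame_times[0] * len(frame_times)

# What the shot actually costs.
actual_total = sum(frame_times)

print(naive_estimate)  # 150
print(actual_total)    # 585
```

Here the single-frame extrapolation is off by almost 4x, which is exactly why whole-shot benchmarks are more informative for animators.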

    • @carlesv7219
      @carlesv7219 ปีที่แล้ว

      Imagine trying to tell people that rendering scenes has nothing to do with actual viewport performance while you're working on creating that scene. But people love simplifications and numbers, just to feel like they know something and made the right choice, even if they only use the card to play Minecraft.

    • @hardwire666too
      @hardwire666too ปีที่แล้ว

      @@carlesv7219 For real. I can deal with 10fps in the viewport; what I need is shorter render times to help me iterate faster. lol

  • @jamesquao1028
    @jamesquao1028 ปีที่แล้ว +1

    Thank you for making this video. I purchased a 4090 thinking that maybe I had overspent, but as a 3D visualiser, you've helped me justify, with a smile, that I made a great investment.

  • @darkveil07
    @darkveil07 ปีที่แล้ว

    thanks so much man

  • @jerghal
    @jerghal ปีที่แล้ว

    Interesting that rendering over the memory capacity took so much longer. What I'd like to see are Redshift benchmarks comparing normal rendering (within the 24GB memory budget) and 'Out of Core' rendering. Very curious what results that would give.

  • @albertmunoz9075
    @albertmunoz9075 ปีที่แล้ว +1

    Loved the video! Would love to see you test it with Houdini as well. And would also love a comparison with the 4080 when it comes out. :)

    • @houdinipdf8147
      @houdinipdf8147 ปีที่แล้ว +1

      Yes! Thank you, very instructive.
      I have to say I get tired of seeing gaming benchmarks in every video, when this card is, in my opinion, aimed at pros.
      Same request as Albert here; plus, I'd be curious to see a comparison of heavy FX sims, like the FLIP or Pyro solvers, on the 3090/4090 and 4080.
      If you can, it would be awesome to add the electricity consumption :)

  • @louixlinart
    @louixlinart ปีที่แล้ว +2

    This is such a helpful video. I'm in the midst of deciding whether I should upgrade from a 3060 to a 3090 or just take the big leap to a 4090. Plus, hearing it from someone who not only knows PC specs but also does 3D themselves is a lot more reliable than watching some random benchmark videos. Thank you for making this video. Cheers!

    • @hman6159
      @hman6159 ปีที่แล้ว

      What did you end up doing

    • @louixlinart
      @louixlinart ปีที่แล้ว +1

      @@hman6159 Nothing yet 😆 still saving HAHAHA

  • @MrWes-xe6cn
    @MrWes-xe6cn ปีที่แล้ว +1

    Very nice, thanks! BTW, will you be testing the Ryzen 9 7950X?

  • @otegadamagic
    @otegadamagic ปีที่แล้ว +3

    Man, thanks for being one of the very few to test for creators. Maybe do another one that also shows benchmarks for editing software like DaVinci, Premiere, and FCP.
    Cheers from Nigeria

    • @CreatorChaz
      @CreatorChaz ปีที่แล้ว +1

      A YouTuber named EposVox has a video that might be what you're looking for. I hope that helps.

    • @otegadamagic
      @otegadamagic ปีที่แล้ว +1

      @@CreatorChaz Yeah, thanks, I saw his before Sir Wade posted this one. It would be good to get more people doing these benchmarks so we can compare, I guess.

    • @CreatorChaz
      @CreatorChaz ปีที่แล้ว +1

      @@otegadamagic Yeah, it's kinda rough finding non-gaming benchmarks sometimes. I hope more people pop up in the space.

    • @otegadamagic
      @otegadamagic ปีที่แล้ว

      @@CreatorChaz Yeah, apparently Nvidia cares more about gamers than content creators. No wonder they mainly sent test units to gamers for review.

  • @nickcrofts1659
    @nickcrofts1659 ปีที่แล้ว

    Great video.
    To get around the VRAM being too limited on the 3070 for bigger scenes, I make the bucket size smaller; this seems to work almost every time. Sometimes I can also get away with just making sure my viewport is in wireframe mode before hitting render. I hope this tip helps anyone thinking of upgrading only because of this issue, and not because they want crazy lightning speeds from a 4090 :)

    • @simonzhang3D
      @simonzhang3D ปีที่แล้ว

      How does that work? It makes no sense to me why that would have anything to do with VRAM. It could maybe render faster, but I really don't understand the connection to VRAM.

    • @MIchaelSybi
      @MIchaelSybi ปีที่แล้ว

      @@simonzhang3D Buckets apportion memory

  • @gerasimosioardanitis5494
    @gerasimosioardanitis5494 ปีที่แล้ว +5

    Now, with the 4090 abandoning NVLink, I'm seriously considering skipping the 40-series and waiting for the 50-series in the hope they restore it. In the meantime, I'll seriously get my hands on two 3090s and add them, so I can take advantage of 24+24 = 48GB with NVLink.
    IMO, I avoid Quadros since I'm not a studio owner or anything. As an artist, I saw a lot of value in the RTX 2080s I had; now the 3090s I'm planning on, plus the 48GB with NVLink, are more than enough for my budget to make me happy and load my scenes and/or projects. I'm not getting worked up about whether it renders in 12h instead of 8h.
    As long as I can improve my workload at a reasonable cost, I'm happy. As I mentioned above, I invested in a good motherboard and a nice CPU that I believe serves 80% of the projects a Blender artist needs. So in the future I can add any 30- or 40-series GPUs, but I definitely won't spend 1600-2000€ (European prices) on a 4090 that gives me 1.7x a 3090 and is still stuck at 24GB.
    For the same money I can get two 3090s: an Asus here costs 1200€ incl. VAT, and excluding VAT it's 867€ × 2 = 1734€. Add an NVLink bridge at 125€, and that tops it out at roughly 1860€.
    I hope I don't sound arrogant or biased, but I can't see myself spending enormous amounts on Quadros.
    Farm rendering is still too expensive unless you're running a studio with lots of clients. If you're a solo artist, I don't see it as a solution for the time being. Maybe later there will be more competition in the market and prices will become more reachable, yes.
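A quick sanity check of the price arithmetic in the comment above (all figures are the commenter's, in euros; VAT and local pricing will vary):

```python
# Check the commenter's two-3090s-plus-NVLink price comparison.
# All prices are the commenter's quoted figures (EUR, excl. VAT).
price_3090_ex_vat = 867   # one Asus 3090, excluding VAT
nvlink_bridge = 125       # NVLink bridge

two_3090s = 2 * price_3090_ex_vat
total_with_nvlink = two_3090s + nvlink_bridge

print(two_3090s)           # 1734
print(total_with_nvlink)   # 1859
```

That lands just under the 4090's quoted 1600-2000€ European price range, which is the core of the commenter's argument for 48GB over raw speed.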

    • @Carlosmatos-nx4uc
      @Carlosmatos-nx4uc ปีที่แล้ว +4

      Finally, someone who isn't getting fooled by a shiny object. As you said, you can buy two 3090s for the price of one 4090. I have a 3090 myself and am working towards my second one. My biggest disappointment with the 4090 is that I was expecting it to be 32GB, not 24GB.

  • @Speed_Monger
    @Speed_Monger ปีที่แล้ว +2

    Finally someone showcasing what this card is actually meant for! Thanks

  • @TheSunMystic
    @TheSunMystic ปีที่แล้ว

    That helped me out a lot. Thanks. Time to buy the 4090 ;-)

  • @geoffreynganga4492
    @geoffreynganga4492 ปีที่แล้ว

    Works well!! THANK YOUUU

  • @Didjelirium
    @Didjelirium ปีที่แล้ว

    I cannot wait to try this card in Blender but for now the closest I got to a 4090 was by downloading a 3D model of it then zooming in on the details. XD

  • @JeanMarkTiconaLarico
    @JeanMarkTiconaLarico ปีที่แล้ว

    Nice and clear video. I'm using Blender and UE5 for architecture on a 3080 Ti, and it's enough for me. I'll maybe wait 1-2 years before updating my GPU, considering new processors can't fully handle all the power of the 4090.