How FAST is the RTX 4090 for 3D Animation + Rendering??

  • Published Dec 24, 2024

Comments • 786

  • @SirWade
    @SirWade  2 years ago +98

    What do you think of the results so far?? Would you get use out of these speeds? And anything you'd like to see me cover next time? :)

    • @NightVisionOfficial
      @NightVisionOfficial 2 years ago +2

      Well... I want to do more complex sculpting in Blender, and learn UE5 faster, since my GPU is too slow to have any patience learning it :/. So... I guess it's a yes. Also, I was looking into Substance Painter, but I have many materials in Blender, so baking textures quicker would be good!

    • @SSPBradley11
      @SSPBradley11 2 years ago +4

      If you complete the test again I'd be interested in seeing how it handles particles for special effects.

    • @rVox-Dei
      @rVox-Dei 2 years ago

      I would love to see a review of how far the HIP backend has come. Can't wait for Radeon 7000, because Nvidia has a monopoly in Blender rn

    • @adnanerochdi6982
      @adnanerochdi6982 2 years ago +4

      Of course this is a giant leap in perf gen-to-gen, and it is worth it for someone who needs to double their speeds right away. Thank you Wade so much for the video by the way. But the value proposition that render farms provide makes one rethink before making the purchase, especially if you have to upgrade the rest of your gear like the PSU and mobo. That's the main point for me, especially when a 3090 still provides some of the fastest viewport and rendering speeds.
      On a side note, and I think this is so important, 4090 owners can't double their VRAM with a second GPU like 3090/Ti owners can, which is worth considering if you're already pushing the limits of 24GB of video RAM. And that alone makes me wonder whether we can expect something next year with NVLink and slightly faster speeds. It's really exciting that we can now take super fast rendering for granted and how cheap it has become. Thanks again Wade, it was an exciting and complete video.

    • @garrygiomarelli3476
      @garrygiomarelli3476 2 years ago +6

      Would be interesting to see the 4090 vs multiple 3090s combined in the same machine, and some more info on power consumption comparisons when working on big projects.

  • @johntnguyen1976
    @johntnguyen1976 2 years ago +339

    Probably the most useful of the huge influx of 4090 videos happening right now... cuz all we ever get are stats and catering to gamers. Thanks for putting one out for the creatives!

    • @Oscar4u69
      @Oscar4u69 2 years ago +45

      I got bored of all the reviews talking only about games; that's just a waste of a GPU. Games don't need that much power, the real use for a GPU is in things like this

    • @ishiddddd4783
      @ishiddddd4783 2 years ago +24

      @@Oscar4u69 if it's to play at 4K 100fps+ it's not really a waste, especially since atm it's the only GPU that can do so natively with modern titles

    • @Anti-FreedomD.P.R.ofSouthKorea
      @Anti-FreedomD.P.R.ofSouthKorea 2 years ago +6

      @@ishiddddd4783 but there are not many great games right now that suit this performance. I would rather play GMod than any of the stuff made for 11-12 year olds that's currently being marketed as "4K max settings"

    • @ishiddddd4783
      @ishiddddd4783 2 years ago

      @@Anti-FreedomD.P.R.ofSouthKorea k, but that's you, and GMod runs at 4K on almost decade-old hardware

    • @lavart7043
      @lavart7043 2 years ago

      How is it useful when no Radeon GPUs are included?

  • @JetCooper3D
    @JetCooper3D 2 years ago +68

    We work on Disney and Marvel films at Pinewood Studios. After doing similar tests, we changed our RTX 3090 cards over to 4090s. We stopped buying Quadro cards years ago.
    Great video - subscribed - thank you!

    • @nyahbinghiman5984
      @nyahbinghiman5984 1 year ago +1

      Why did you stop buying Quadros?

    • @ahmetkocoval1375
      @ahmetkocoval1375 1 year ago +3

      @@nyahbinghiman5984 muchhhhhhhh dollars 😂

    • @ichisenzy
      @ichisenzy 1 year ago +9

      tell your boss to make actual good movies

    • @insertname7458
      @insertname7458 1 year ago

      good job g, i don't usually watch films or shi but i appreciate y'all being able to make all that look realistic even like 10-20 years ago

  • @雪鷹魚英語培訓的領航
    @雪鷹魚英語培訓的領航 2 years ago +312

    Really cool that you got access to these cards like Digital Foundry / Gamers Nexus / et cetera. Those guys aren't focusing on the artist tools like you are, so it's really nice to see that aspect explored here! Definitely interested in Unreal / Houdini / DaVinci / Fusion.

    • @n00buo
      @n00buo 2 years ago +5

      nvidia is desperate to sell this scam card, they'll send cards to anyone for a few bucks so they can lie to people

    • @RazielXT
      @RazielXT 2 years ago +35

      @@n00buo the 4080 12GB is a scam card, the 4090 is a beast

    • @n00buo
      @n00buo 2 years ago

      @@RazielXT 3080 is the best card so far, 4000 series can't compete with Ampere they're just heaters for tards with money

    • @user9267
      @user9267 2 years ago +13

      @@n00buo
      4090 seems to be a pretty decent deal

    • @n00buo
      @n00buo 2 years ago +2

      @@user9267 HAHHAHA nvidia got bots for youtube comments, they know.

  • @KillahMate
    @KillahMate 2 years ago +127

    Note: if you're using Cycles (as opposed to Eevee) it's *all* raytraced. It's a path tracing renderer which means that every sample for every pixel has been raytraced, and has therefore gone through the RTX hardware pipeline - the only difference with reflective surfaces is how coherent the rays are. To test non-raytracing performance you'd need to use Eevee.

    • @PoollShietz
      @PoollShietz 2 years ago +7

      Note: scene layers are awesome, you can render Cycles and Eevee together

    • @Pixel_FX
      @Pixel_FX 2 years ago +3

      Ray tracing and path tracing are two different things.

    • @KillahMate
      @KillahMate 2 years ago +8

      @@Pixel_FX They are two related things - one is a subset of the other. The important bit is that if you have an RTX GPU and Cycles is configured to make use of it via OptiX for hardware acceleration, then the Cycles path samples are being calculated on the GPU's RT cores. And since everything Cycles does is path samples, then _everything_ is rendered with RT cores.
      This is unlike most video games, which must run in real time and therefore only use RT when they have to, like for reflections and such, and never use RT to do path tracing because it's still too demanding and slow for real time.

    • @SirWade
      @SirWade  2 years ago +13

      I misspoke - I was talking about the shaders not being reflection / refraction-heavy in that scene. The scene didn't require much complex calculation compared to something like the Maya render later in the video

    • @BeheadedKamikaze
      @BeheadedKamikaze 2 years ago +14

      @@SirWade Diffuse lighting is *more* complex to calculate than specular reflections. As @KillahMate is trying to explain, this is how path tracing works - a diffuse surface is really just a crap-ton of reflections, all from different directions, and the colour is averaged over hundreds of samples until it becomes smooth. Whereas a specular shader reflects all the rays in more or less the same angle so it turns into a clean result much more quickly. You are getting confused with game rendering terminology. Path tracing is *all* reflections. 100%. It doesn't matter how many specular surfaces there are. And every single one of those rays is calculated using the RT cores.
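The convergence behaviour described in this thread can be sketched with a toy Monte Carlo experiment (plain Python with a made-up one-dimensional "light" function, not Blender's actual Cycles kernels): a near-specular lobe sees nearly the same light on every sample and converges almost immediately, while a diffuse lobe scatters rays over many directions and stays noisy until many samples are averaged.

```python
import random
import statistics

def sample_surface(roughness, n_samples, seed=0):
    """Average n_samples of simulated incoming light for one pixel.

    roughness near 0 is specular: every ray lands near the mirror
    direction, so all samples see almost the same light.  roughness
    near 1 is diffuse: rays scatter widely, so each sample sees a very
    different light value and the average stays noisy until many
    samples are taken.  The light field here is a made-up function.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        # jitter the reflected ray direction by the surface roughness
        angle = rng.uniform(-roughness, roughness)
        # toy incoming light: bright in one narrow direction, dim elsewhere
        light = 1.0 if abs(angle) < 0.1 else 0.1
        samples.append(light)
    return statistics.mean(samples)

# a near-specular pixel is already converged with very few samples...
specular_lo = sample_surface(0.05, 8, seed=1)
specular_hi = sample_surface(0.05, 512, seed=2)
# ...while a diffuse pixel still disagrees with its converged value at 8 samples
diffuse_lo = sample_surface(1.0, 8, seed=1)
diffuse_hi = sample_surface(1.0, 512, seed=2)

print(abs(specular_lo - specular_hi))  # essentially zero: converged
print(abs(diffuse_lo - diffuse_hi))    # larger: still noisy at 8 samples
```

Both pixel types are "all raytraced" in this model; they only differ in how many samples the average needs before it stops changing, which is the point being made about diffuse vs specular shaders.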

  • @thevoid6756
    @thevoid6756 2 years ago +9

    The "Why Vram Matters" chapter is like the hidden gem of this video. Glad Paul recommended your channel.

  • @fxadam
    @fxadam 2 years ago +74

    Great video. Picked up the RTX 4090 today and it is incredible at hardware rendering in Arnold, Blender, Keyshot etc. Games are fun but this GPU is excellent for content creation.

    • @SW-fh7he
      @SW-fh7he 2 years ago +2

      How did you get it?

    • @fxadam
      @fxadam 2 years ago +2

      @@SW-fh7he Walked into Micro Center on launch day. They had plenty. They're sold out now but they should have more shortly. Apparently Nvidia is sending out links to GeForce Experience users that will let them easily order a 4090 from Best Buy without having to deal with the bots that are slamming Best Buy right now.

    • @checkmymovie
      @checkmymovie 2 years ago +1

      I'm gonna pick up mine today and build a whole new computer for Daz Studio, because I'm a character designer.

    • @zubairalam2795
      @zubairalam2795 1 year ago

      Also the psu... :/ and the wattage is hugee!!!

    • @Citizen1482608
      @Citizen1482608 3 months ago

      No 4090 will live as long as a Quadro (an RTX A6000, for example) under stressful workloads with daily rendering, which is how I use them. One of the main reasons I was afraid to buy one.

  • @mjlagrone
    @mjlagrone 2 years ago +95

    Yes please for the Part 2. I would especially like to see how it compares in Blender when you have a lot of hair and subsurface scattering! And maybe also with a giant pile of grass and other vegetation.

    • @SnakeTheCowboy
      @SnakeTheCowboy 2 years ago +5

      All hands up for more Blender testing!

  • @techdraconis
    @techdraconis 2 years ago +92

    I would love to see a part 2 with Unreal and Houdini.

  • @froggy3u
    @froggy3u 1 year ago +3

    At around the 3:10 mark of the video - asking just in case:
    do you keep your viewport in rendered shading preview while rendering the scene?
    I learned this the hard way 5 years ago. I ran out of memory on my 1070 when rendering a heavy scene with Cycles.
    I had set the viewport samples and the render samples to the same value, 4096,
    and I was curious why it could use 6GB of VRAM in viewport shading but not in a render.
    Turns out that while I was rendering the scene, both the viewport and the render window were using VRAM at the same time.
    When I switched my viewport to an image editor layout, it started rendering each frame in under 15 minutes instead of throwing an "out of memory" error.
    Nowadays, even when using a 3090...
    I always turn on [temporary editor > image editor] in the settings and ctrl+space one of the windows for the render (so the other windows are inactive while I am rendering).
    In my case it improved my renders 3x to 4x once I became aware that viewport shading also uses VRAM even while I am rendering.
    Frame 1: 18s vs 1min09s (with viewport shading in the background) with the same scene settings from 5 years ago.
    Not sure if your case is the same as mine. Hope this info helps someone.

  • @feloi3033
    @feloi3033 2 years ago +43

    Only this guy can say, "mom, I need a 4090 for homework".

    • @cryogenicheart2019
      @cryogenicheart2019 8 months ago +2

      Quickest way to get your parents to take you out of animation school and put you in a real university

    • @trungtechandtoys
      @trungtechandtoys 5 months ago

      @@cryogenicheart2019 After graduating from an animation university, I found more interest in tech and PCs, not animation :)

  • @user-ly1en7kl2o
    @user-ly1en7kl2o 2 years ago +9

    Great you're back, can't wait to see more of your animations.

  • @gcharb2d
    @gcharb2d 2 years ago +23

    That's why I got the 12 GB RTX 3060 instead of the 8 GB RTX 3070, a tad slower, but cheaper, and it handles larger scenes!
    Great video!

    • @MIchaelSybi
      @MIchaelSybi 1 year ago +3

      I got a GTX 680 with 4 GB instead of 2, and it served me some more years than it would have otherwise, as many programs came to have 4GB as a bare minimum over time.

    • @Musaibavr
      @Musaibavr 3 months ago

      I bought an RTX 3060 12GB instead of an RTX 4060 8GB for 3ds Max.

  • @DonC876
    @DonC876 2 years ago +11

    I think another angle on this is efficiency. If you have your computer compute a lot, this adds to the power bill fast. Just yesterday I saw a review where they tried to find the best efficiency by slightly underclocking (about 150 MHz) and undervolting, and they got the power consumption down almost to 3090 levels (roughly from 400 W to 300 W) with only a few percent of performance lost. That works out to substantially better performance per watt. So there's another cost factor that can make that investment worthwhile even quicker.
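Putting rough numbers on that (hypothetical figures taken from the comment above: ~400 W stock vs ~300 W undervolted at ~5% performance loss):

```python
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance per watt (stock card = 1.0 performance)."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 400.0)  # stock 4090: ~400 W in that review
tuned = perf_per_watt(0.95, 300.0)  # undervolted: ~300 W, ~5% slower

# efficiency gain of the undervolted profile relative to stock
print(f"{tuned / stock:.2f}x performance per watt")  # prints "1.27x performance per watt"
```

So with those figures the tune works out to roughly a 27% efficiency gain, which compounds over long render jobs.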

  • @LiyoungMartin
    @LiyoungMartin 2 years ago +43

    Finally, an in-depth analysis of 4090 performance for 3d workflow!!! Could you pretty pleeeease (as you mentioned earlier in the video) do a separate video for unreal engine? Thanks!

    • @flyinggecko3322
      @flyinggecko3322 2 years ago +3

      Yes, Unreal and an even deeper look into Blender, like different viewport settings and final renders at 4K, would be amazing!

    • @zubairalam2795
      @zubairalam2795 1 year ago

      Thats what im looking for as well.. thnxx mate

  • @hanselespinosa8918
    @hanselespinosa8918 2 years ago +6

    As an artist working with a 1080 Ti in the current year, I don't even fully grasp the number of creative decisions I could make with this card if I were able to afford it. I agree with the statement regarding the gaming community: the conversation about the 40 series ends the same way it does every year, more frames equals better performance and a better gaming experience. That's it. For the creative community it means time budgets can be allocated differently. When you mentioned the difference of 9 hours, man, 9 hours for sound design or post-processing in general can make a huge difference.
    Really good review. First time I've checked out the channel, thanks for sharing.

  • @klaus6474
    @klaus6474 2 years ago +2

    I got the RTX S 5100
    CUDA cores: 41,984
    Boost clock: 2.3GHz
    Memory: 128GB GDDR6X
    Memory bus: 3760-bit
    Memory bandwidth: 2036GBps
    RT cores: 264 (3rd-gen)
    Tensor cores: 1042 (3rd-gen)
    NVLink SLI: No
    PCIe: Gen 5
    HDMI: 2.1
    HDCP: 2.3
    Display connectors: 2x HDMI 2.1, 4x DisplayPort 1.4
    Length: 15.3 inches
    Width: 6.0 inches
    Height: 4-slot
    Maximum GPU temp: 102
    Graphics card power: 460W
    Recommended power supply: 1000W
    Power connectors: 5x 8-pin (with supplied 26-pin adapter)

  • @pmAdministrator
    @pmAdministrator 1 year ago +2

    You're absolutely right! Thank you for the video. For us, who use these cards for work, these cards are INSANE!

  • @rcarter1690
    @rcarter1690 2 years ago +17

    Finally some real world tests that really show why an animator would spend so much on a card like this. That 3-minute short test is the best I've seen, something no other YouTubers seem to understand. Thank you!

  • @3dduff
    @3dduff 2 years ago +5

    Yes, please make a part 2. I am a freelancer who uses Maya/C4D/Houdini and renders mostly with Redshift. I have my own 5-system mini farm stocked with 30xx GPUs. But a big part of my time is spent on simulations. Only a few situations exist where GPUs can speed up simulations, but I would very much like to see some of these run on a 4090.
    Great video as always, keep up the good work.

  • @ChaosOver
    @ChaosOver 2 years ago +1

    In a lot of workflows the artist is the limiting factor, not the hardware - even in lookdev. The real benefit will be in lighting and final rendering (if you are not working on complex shots that won't fit into your VRAM anyway).
    (Btw, if you are talking about cost-effectiveness, the power draw is an important factor. So what's the min (idle/desktop), avg (working in the viewport), and max (rendering) power consumption? What's the mix (desktop, viewport, render) in real-world scenarios? How much does it draw on an example project per day? Compared to other cards? And how much does the room temperature rise (believe me, it does)? How much power does the AC need to cool it? These factors are also important to calculate your costs and figure out what's the best solution for you.)

  • @St1ngerGuy
    @St1ngerGuy 2 years ago +27

    Very interested in video export from unreal engine 5 using the movie render queue in a raytracing heavy scene. I have a 3090 right now and it does pretty good but the 4090 looks like it leaps ahead by quite a bit. Thanks for putting this video together.

    • @bikboi3292
      @bikboi3292 2 years ago

      do you regret buying the 3090?

    • @chillsoft
      @chillsoft 2 years ago +1

      I have two 3090s, so this card is moot for me. Can't pack 2 of these in a case to get double the performance; no motherboard supports two because of how thick they are. Gonna stay with my NVLinked 2x3090s and skip this generation unless I'm able to watercool them.

    • @zilverheart
      @zilverheart 2 years ago +1

      @@chillsoft there exists a watercooled version of the 4090

  • @pariahgaming365
    @pariahgaming365 2 years ago +1

    I'm an animation student and I have a 3080 Ti. I just finished my first ever shader render in Maya. It was just 150 frames but it smoked my M1 Mac mini. Once my workloads start getting super heavy later on, I'll definitely upgrade, but I should be good. Also my CPU is a Ryzen 9 5900X with 32 gigs of DDR4 RAM. I'm sure I'll be upgrading to either 64 or 128 gigs of RAM in the near future

  • @dominic.h.3363
    @dominic.h.3363 2 years ago +1

    VRAM was the reason I went with a 3060 instead of a 3070: whenever I would need to waste tens of hours using CPU+RAM as a fallback, it wouldn't be worth the few tens of minutes saved with the faster 3070. It was very hard to justify replacing a broken GPU and being voluntarily fleeced with an almost $1k 3060 back at the height of the cryptomining craze, but seeing how fast viewport rendering is with a 4090, I'm tempted to bite the bullet once currency exchange rates stabilize.
    This review was everything I wanted from a creator perspective but never got from the usual outlets. Thanks!

  • @joonglegamer9898
    @joonglegamer9898 2 years ago +1

    There's a lot more to consider here, especially for the average 3D modelling and animation enthusiast. If you look at your results from the animated frames of the Sprite Fright production files, you can clearly see it's not "double the performance"; if anything it's barely 20 percent more with a 4090 than with a 3090. So if you're the average hobbyist it might not be such a huge deal to miss 20 percent performance; it's certainly NOT twice the performance. And that brings me to another thing - cost. These cards in Sweden, where I live, cost around 26K SEK, which translates to 2321 USD. Most of us who bought the 3090 for around 2000 USD might not be THAT motivated to junk our cards and pay an extra 2.3K to get the difference. In a professional STUDIO setting I totally get the value; even a 5 percent difference can make or break some larger budgets with time constraints. You also have to realize that DLSS isn't used everywhere; that's more an interpolation thing, like those used in older television sets to draw the frames in between two extremes or rendered images.
    So in short - I don't think you will notice much difference when working with Blender Cycles in the viewport when rotating and inspecting the scene. The biggest major upgrade was in fact from the 1080 Ti to the 3090, where you went from choppy slow movements to relatively real time. From the 3090 to the 4090, that difference is not as HUGE as you make it sound here.
    Also in Blender, animation files (especially rigged ones) are very CPU bound, and here it's actually better to have a better CPU.

  • @Bunderwahl
    @Bunderwahl 2 years ago +4

    Part 2 please, would be really cool if you could include 4080 and 7900 XTX, and even better if you could add video production and other creative applications!

  • @whidzee
    @whidzee 2 years ago +8

    I'd love to see the performance differences between all the 40XX cards

  • @enigmawstudios4130
    @enigmawstudios4130 2 years ago +2

    You're always the go-to for usable info on graphics cards. Everyone else is gaming

  • @hardwire666too
    @hardwire666too 2 years ago +4

    I am so glad you talked about an actual animation rendering benchmark. I have gotten into countless arguments with people about how a single still frame tells me nothing about how a video card will perform for my needs as an artist. All that tells me is how well that card renders that one single frame with the most optimal settings - basically nothing. People just DON'T understand that the settings used for one frame might not be great for the next. So one frame might render in 30sec, but the next might take 1min30sec, and the frame after that might take 5min. That single-frame benchmark is utterly useless. So thank you. I feel vindicated. lol.
    Also, on the note of render farms, Blender has a fairly popular one called SheepIt where you can use your own hardware to earn time on the render farm for your own projects.

    • @carlesv7219
      @carlesv7219 2 years ago

      Imagine trying to tell people that rendering scenes has nothing to do with actual viewport performance while you're working on creating that scene. But people love to see simplifications and numbers just to think they know something and made the right choice, even if they only use it to play Minecraft.

    • @hardwire666too
      @hardwire666too 2 years ago

      @@carlesv7219 For real. It's like, I can deal with 10fps in the viewport; what I need is shorter render times to help me iterate faster. lol

  • @lhbbq
    @lhbbq 4 months ago +1

    @Sir Wade Neistadt - Thank you for sharing. I like the A6000, but is it unfair to compare it to a 4090? When you look at the A6000 at 300 watts and the 4090 at 450/600 watts, what kind of results would you get IF the A6000 had that kind of power? That would let you compare the cards almost apples to apples, up to 24GB of VRAM.
    Your thoughts?

  • @maxrose8845
    @maxrose8845 2 years ago +2

    Love the focus on creators - not enough of that. You're the man Sir Wade!

  • @mixtapechi
    @mixtapechi 2 years ago +30

    It's good to see someone making benchmarks on creative programs rather than games. Thanks!

  • @jamesquao1028
    @jamesquao1028 2 years ago +1

    Thank you for making this video. I purchased a 4090 thinking that maybe I had overspent, but as a 3D visualiser you have helped me justify, with a smile, that I made a great investment

  • @marcusolivix
    @marcusolivix 2 years ago +1

    Thanks! Finally a review for creators. ...and yeah! Please, a part two!!! It would be great to see how it performs in different render engines and different 3D software.

  • @Didjelirium
    @Didjelirium 2 years ago

    I cannot wait to try this card in Blender but for now the closest I got to a 4090 was by downloading a 3D model of it then zooming in on the details. XD

  • @TGA_anim
    @TGA_anim 2 years ago

    FINALLY, this is what I was looking for, not reviews that talk about video games and stuff. 3D is my thing

  • @zaydraco
    @zaydraco 2 years ago

    This is the first serious content creator review, not just for YouTubers

  • @TheRealLink
    @TheRealLink 2 years ago +1

    As someone dabbling a lot with Blender and doing some freelance work, your graphs were very helpful, whether brute-forced or RT native. Great explanations! Subbed.

  • @rnbpl
    @rnbpl 2 years ago +1

    5:05 when GPUs run out of VRAM, they can sometimes use system RAM (out-of-core) but then it slows down a lot. Maybe that's what happened

  • @GmanGavin1
    @GmanGavin1 2 years ago +1

    Love the video format and the information in it. This is exactly what I will send people whenever I have to explain why VRAM matters.

  • @Im_Ninooo
    @Im_Ninooo 2 years ago +3

    3:25 that's one of the reasons why I bought the 12GB model of the 3060, so that I wouldn't have to worry about running out of memory anymore (as I sometimes did with my 4GB 1050 Ti)

    • @macksnotcool
      @macksnotcool 2 years ago +3

      Wow, someone else who went from a 1050 to a 3060, nice

    • @wachocs16
      @wachocs16 2 years ago +1

      I do mostly CAD and 3D scanning and modeling (you don't need that much VRAM very often), but renders take a lot.
      It's a shame I only upgraded from a 1060 6GB to a 3070 8GB. I was really mad that no 3070 12GB model existed, and there was a big difference with the 3080 12GB

    • @Im_Ninooo
      @Im_Ninooo 2 years ago

      @@macksnotcool I've had a 750 Ti for years, then a friend gave me his old 1050 Ti which I used for a few months before upgrading. worth every cent.

    • @Im_Ninooo
      @Im_Ninooo 2 years ago

      @@wachocs16 yeah, that's why the 3060 was so appealing to me, it was reasonably priced and had more VRAM than a 3070 (which was actually quite expensive)

    • @CaptainScorpio24
      @CaptainScorpio24 2 years ago +1

      @@Im_Ninooo I too went from a 1060 6GB to an RTX 3070 8GB.
      I wanted the 3080, but it was power hungry and expensive too during the mining boom.
      My Cooler Master V650 watt 80 Plus Gold could handle up to a 3070 only. 😭

  • @mikechristiansen2000
    @mikechristiansen2000 2 years ago +1

    I would be interested in a Houdini 4090 benchmark with Mantra and Karma.

  • @soraaoixxthebluesky
    @soraaoixxthebluesky 2 years ago +1

    Getting into the 3D animation scene now, 24GB of VRAM doesn't seem ridiculous to me; in fact it's the opposite, 24GB looks fairly conservative.
    Now I know why people get mad when GPU manufacturers or vendors increase their prices: some people do all these things for a living, not for fun.

  • @SATYAMKUMAROY
    @SATYAMKUMAROY 2 years ago +1

    Needed this review. Very good content

  • @othoapproto9603
    @othoapproto9603 1 year ago

    It's 1/27/23. Just built a new PC with an AMD 7059x + 128GB RAM + RTX 4090, with a fully current Win 11 Pro OS and Nvidia drivers. I can't F12 render in Cycles or the viewport. I've tried too many things to list.

  • @fabianoperes2155
    @fabianoperes2155 2 years ago

    Man, before I watch the video, I just wanted to say you are GORGEOUS! GOSH!!!

  • @gerasimosioardanitis5494
    @gerasimosioardanitis5494 2 years ago +6

    Now with the 4090 abandoning NVLink I'm seriously considering skipping the 40 series and waiting for the 50 series, in the hope they restore it. In the meantime I will seriously get my hands on 2 3090s and add them so I can take advantage of the 24+24=48GB with NVLink.
    Imo I avoid Quadros since I'm not a studio owner or anything. As an artist I can see a lot of value in the RTX 2080s I had and the 3090s I'm now planning on; the 48GB with NVLink is more than enough for my budget to make me happy and load my scenes and/or projects. I'm not getting crazy about whether it will render in 12h instead of 8h.
    As long as I can improve my workload at a logical cost I am happy with it. As I mentioned above, I invested in a good motherboard and a nice CPU that I believe serves 80% of the projects a Blender artist needs. So in the future I can add any 30- or 40-series GPUs, but I definitely won't spend 1600-2000 (European prices) for a 4090 that gives me 1.7x of a 3090 and still get stuck with 24GB.
    For the same money I can get 2 x 3090s; an Asus here costs 1200 euros incl. VAT, and excluding VAT it's 867€ x 2 = 1734€, let's say roughly. Add an NVLink at 125€ and that tops it at about 1900€.
    I hope I don't sound arrogant or biased, but I can't see myself spending enormous amounts on Quadros.
    Farm rendering is still too expensive unless you're running a studio with lots of clients. If you're a solo artist I don't see it as a solution for the time being. Maybe later there will be more competition in the market and prices will become more reachable, yes.

    • @Carlosmatos-nx4uc
      @Carlosmatos-nx4uc 2 years ago +4

      Finally someone who isn't getting fooled by a shiny object. As you said, you can buy 2 3090s for the price of one 4090. I myself have a 3090 and am working towards my second one. My biggest disappointment with the 4090 is that I was expecting it to be 32GB, not 24GB.

  • @zombiecharger65
    @zombiecharger65 2 years ago

    Glad someone finally addressed VRAM and rendering. Everyone is always talking about gaming. The render time is a big thing, but I need the most VRAM that I can get, and Nvidia throttles that on most models.

  • @crisschan2463
    @crisschan2463 2 years ago +1

    It's weird that in your 3070 test the render only takes 4GB of VRAM. Why is that? Is Blender limiting the VRAM usage?
    Great content btw

    • @pinkmoon5332
      @pinkmoon5332 2 years ago +1

      I agree, there's not much depth in that particular area of analysis.
      VRAM plays such an important part in rendering with GPUs. Large studios still to this day refrain from relying on a GPU farm simply because 24GB of VRAM is not enough to "cut the mustard", as they say.
      However, having a 4090 with 48GB of VRAM would significantly change the playing field for smaller studios, if they're willing to support the extra heat and wattage of 4090s versus 3090s.

  • @MattHalpain
    @MattHalpain 2 years ago +1

    Great video. Super awesome to see the 4090 from an artist point of view.

  • @shermanwellons
    @shermanwellons 1 year ago +4

    Part 2 for Cinema 4D and Redshift would be awesome for the 4090. I am thinking about replacing my 3090.

  • @mechaboy95
    @mechaboy95 2 years ago +1

    You can use some of your own PC RAM as dedicated video RAM if your GPU doesn't have enough to render a scene.
    It's in some Windows setting, but I'm not sure how much it helps

  • @rahulroaringrc
    @rahulroaringrc 1 year ago +1

    I want to render a 3-hour 3D animation video at 60fps in 4K quality. Which GPU will be best for me, any suggestions? Or do I have to install many GPUs in one computer to render it fast?

    • @KirillPodcast
      @KirillPodcast 1 month ago

      Several. But you won't fit that in one computer 😉 You'd need a special server motherboard for Xeon W Sapphire or ThreadRipper 😁

  • @procrastinator24
    @procrastinator24 2 years ago +2

    Please, part 2! I'm looking at this card specifically for Blender and Unreal Engine :D Thanks so much for the content!

  • @blazbohinc4964
    @blazbohinc4964 2 years ago

    Your correction is only half correct.
    If Blender was swapping memory to RAM, there's no way it would take that long. However, if it was swapping to an SSD, then yes. Idk what your setting was. Usually, VRAM issues are easily avoidable if you render with tiling. It takes longer depending on the speed of your SSD, but since the GPU doesn't have to store everything in VRAM, you can render almost anything.

  • @mahadevovnl
    @mahadevovnl 2 years ago +1

    I dunno about RTX and such but my dude, your beard is magnificent. I want to know your tricks. How do you keep it so nicely trimmed? Barber or DIY?

  • @MonstroInLA
    @MonstroInLA 7 months ago +1

    The A6000 was surprisingly slower than I thought

  • @Speed_Monger
    @Speed_Monger 2 years ago +2

    Finally someone showcasing what this card is actually meant for! Thanks

  • @slimerone
    @slimerone 2 years ago

    @Sir Wade Possible to get a walkthrough of your PC setup? Not only a visual walkthrough, but a build part list etc? I think you said Puget Systems makes it for you or something?

  • @kellyshipman1341
    @kellyshipman1341 2 years ago +1

    Fantastic video! Would definitely like to see some more.

  • @theshawnmccown
    @theshawnmccown 2 years ago +1

    These kinds of results are the selling point for me. It's really great value when it performs this well in work and play.

  • @primetrader5062
    @primetrader5062 2 ปีที่แล้ว +1

    What you say makes little sense, as if before the 4090 existed artists could not render scenes with GPUs that had less than 12 GB, which is not true at all. The workaround is to dedicate a portion of RAM to supplement VRAM, and then you can render any scene even with a 1050; render time is another issue, of course.

  • @bac483
    @bac483 2 ปีที่แล้ว

    Just came from a 1080 Ti and got a 3090 Ti, both of which the company paid for, so I'm happy for now. And the 4090 Ti is right around the corner, not to mention the 5000 series. Good video!

  • @iceman10129
    @iceman10129 2 ปีที่แล้ว +1

    I wish I could get my hands on one of these to test RenderMan's XPU

  • @furyRender
    @furyRender 2 ปีที่แล้ว +1

    Are you going to test the Ada RTX 6000? The specs say it requires 300 watts. The 4090 has been tested to lose only ~5% performance when limited to 300 watts. Thanks

  • @khoifoto
    @khoifoto 2 ปีที่แล้ว +1

    The fun part was when my friends and I went into Micro Center and each of us walked out with a 4090; people cursed at us for being scalpers. Little did they know we were just building our own rendering stations :( . This card is a blessing.

  • @AndyMcMac
    @AndyMcMac 2 ปีที่แล้ว

    This is really helpful, thank you! Everyone else just concentrates on games, and that's not what we need.

  • @jorgiewtf
    @jorgiewtf ปีที่แล้ว

    I’m a 2nd-year 3D animation student. As much as I’d love to upgrade my 3080 Ti to a 4090, my classes aren’t anywhere near advanced enough in complexity or size yet to warrant the cost. At this point, I figure by the time I’m actually doing grad classes, the 5090 will be out, and from what I’ve seen, it’s supposed to be even twice as fast as the 4090, so I figured I’d wait till then. I have it paired with a Ryzen 5900X, and as far as I’ve seen so far, we mostly use Maya. Great video, thank you. A part 2 to this video would be awesome!

    • @KirillPodcast
      @KirillPodcast หลายเดือนก่อน

      I also work on a pair of 3080 Tis, but with an R9 3900x CPU 👌🏻😁

  • @otegadamagic
    @otegadamagic 2 ปีที่แล้ว +3

    Man, thanks for being one of the very few to test for creators. Maybe do another that also shows benchmarks for editing software like DaVinci Resolve, Premiere, and FCP.
    Cheers from Nigeria

    • @CreatorChaz
      @CreatorChaz 2 ปีที่แล้ว +1

      A youtuber named Eposvox has a video that might be what you're looking for. I hope that helps.

    • @otegadamagic
      @otegadamagic 2 ปีที่แล้ว +1

      @@CreatorChaz Yeah, thanks, I saw his before Sir Wade posted his. It would be good to get more people doing these benchmarks so we can compare, I guess.

    • @CreatorChaz
      @CreatorChaz 2 ปีที่แล้ว +1

      @@otegadamagic Yeah, It's kinda rough finding non-gaming benchmarks sometimes. I hope more people pop up in the space.

    • @otegadamagic
      @otegadamagic 2 ปีที่แล้ว

      @@CreatorChaz Yeah, apparently Nvidia cares more about gamers than content creators. No wonder they mainly sent test units to gamers for review.

  • @TheNerd
    @TheNerd 2 ปีที่แล้ว +5

    1 year ago I switched from a 1080 to a 3080. It blew my mind when I realized I was able to move a fully rigged (human) character of 300,000 faces in the Cycles viewport (with denoising) in the middle of a "Kids Bedroom Scene" with A LOT of stuff in it.
    Sure, at a couple of FPS, but a couple of years ago this was simply unthinkable.

  • @andrewferguson2221
    @andrewferguson2221 2 ปีที่แล้ว +10

    Ugh. I was really hoping that the card wasn't that good. Nvidia's been getting a little too bold lately, and I was more than ready to write this card off. But I was wondering what made you test at 16k instead of 4k? Either way, I love the video and hope you make a part 2! Thanks Sir Wade!

    • @SirWade
      @SirWade  2 ปีที่แล้ว +4

      I mostly just wanted to push the GPUs further to see if the performance scaled - I did a 4K / 16K test 2 years ago and figured I'd just stick with it :P Glad you liked it!

    • @amanda.collaud
      @amanda.collaud 2 ปีที่แล้ว

      @@SirWade that favours the 4090 ...
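For context on how much heavier that test is, a quick sanity check on the pixel counts (using the common UHD definitions of "4K" and "16K"):

```python
# "16K" is 4x the linear resolution of "4K" on each axis,
# so it carries 16x the pixels and roughly 16x the path-tracing work.
four_k = 3840 * 2160        # 8,294,400 pixels
sixteen_k = 15360 * 8640    # 132,710,400 pixels

print(sixteen_k // four_k)  # -> 16
```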

  • @GabrielGabeRodriguez
    @GabrielGabeRodriguez 2 ปีที่แล้ว +4

    Great video. The algorithm populated this for me! Just wanted to speculate: when you mentioned the 3090 taking 45 mins per frame, the 3000-series FE cards are notorious for really bad thermal pads and poor cooler alignment in the early batches (2020). I recently bought a 3090 FE and noticed the memory hitting thermal-throttle temperatures (110°C+). GDDR6 and 6X have error-correcting components, so if the memory heats up too much it can start to trip over itself and create errors that slow down its performance. Using quality thermal pads and improving the cooler's seating increased my thermal headroom on the memory, and it's now running at a nice cool 92°C max (my alignment might not have been perfect, as some other people reported an 88°C max temp in the most memory-intensive applications... mining).

  • @pgplaysvidya
    @pgplaysvidya 2 ปีที่แล้ว +1

    Because of the large number of gamers (probably), I've always had issues with the famous top tech YouTubers not telling me the performance uplift in the software I use. Finally!
    P.S. Thanks to Paul from Newegg for linking this video :P

  • @JUYAN16
    @JUYAN16 2 ปีที่แล้ว +1

    Will you do benchmark of the AMD Radeon RX 7900 XT as well?

  • @ivanoleaanimator
    @ivanoleaanimator 2 ปีที่แล้ว

    I think there also needs to be a discussion of the power consumption of a machine that uses the 4090. Off the top of my head, I believe it ranges from around 350 watts up to 600 W, which is like running a handheld vacuum cleaner while you render, and half that at idle.

  • @theredredvideo4189
    @theredredvideo4189 ปีที่แล้ว

    Hands down one of the most comprehensive and useful reviews / deep dives on the 4090. Subbed, Liked and please do a part 2 on C4D!

  • @shyamlok
    @shyamlok 2 ปีที่แล้ว

    I edit with Edius 9.55, and I use ISP Robuskey for chroma key. The 4090 plays back Robuskey footage from the timeline with 0 dropped frames, and it doesn't care what you throw at it. I was shocked initially; I thought real-time playback without rendering wasn't possible, and even Robuskey's website says so.

  • @nubletten
    @nubletten 2 ปีที่แล้ว +1

    Besides the fact that this is not about professional graphics cards:
    This video is good, but it would have been great if you had included AMD GPUs as well.

  • @jasonhoi85
    @jasonhoi85 2 ปีที่แล้ว

    Thanks! This is the best benchmark for 3d artists.

  • @nerukas86
    @nerukas86 2 ปีที่แล้ว +1

    Great video, that's what I call quality content! Thank you.

  • @caninac
    @caninac 2 ปีที่แล้ว

    It would also be interesting to do a calculation based on the power consumption for an animated render, which would give an indication of how much air conditioning would be needed if there were a render farm of these.
    Also, with these sorts of tests, the OptiX denoiser would be a better choice than Open Image Denoise. Yes, Open Image Denoise is a better denoiser, but it's CPU-bound, not GPU-bound, which skews the results.
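A quick sketch of that air-conditioning math (the node count and wattage below are illustrative assumptions, not measured values): essentially every watt a render node draws ends up as heat in the room, and 1 W of continuous draw is about 3.412 BTU/hr.

```python
# Estimate heat output and energy use for a hypothetical small render farm.
WATTS_PER_NODE = 600       # assumed full-load draw per 4090 node
NODES = 10                 # assumed farm size
BTU_PER_WATT = 3.412       # 1 W of continuous draw = ~3.412 BTU/hr of heat

total_watts = WATTS_PER_NODE * NODES
heat_btu_per_hr = total_watts * BTU_PER_WATT  # load the AC must remove
kwh_per_day = total_watts * 24 / 1000         # energy billed by the utility

print(f"Heat load: {heat_btu_per_hr:.0f} BTU/hr")
print(f"Energy:    {kwh_per_day:.0f} kWh/day")
```

On those assumed numbers the farm sheds about 20,000 BTU/hr, i.e. it needs roughly the cooling capacity of a mid-size room air conditioner running continuously.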

  • @MikaiGamer1286
    @MikaiGamer1286 2 ปีที่แล้ว

    Glad I found you with this. I hardly ever see 3D artists benchmark these cards, and I haven't done it in years now, even after graduating with a degree and everything, as there's no work in Florida for it.

  • @kimmysander2447
    @kimmysander2447 2 ปีที่แล้ว +2

    I'd love to see a part 2!

  • @gamin546
    @gamin546 2 ปีที่แล้ว +1

    Finally, a video benchmarking the RTX 4090 in animation and rendering, hopefully this dude gets more views and subscribers because he honestly deserves it.

  • @AndrewTanielian
    @AndrewTanielian 2 ปีที่แล้ว +1

    This is a great video! You explained all this very well.

  • @lhbbq
    @lhbbq ปีที่แล้ว

    @Sir Wade Neistadt - I never see a test with a dense scene when comparing the 3090/4090 to the A6000. Most GPU tests are small in file size and complexity. If the scenes are below 24 GB, the 3090 and 4090 tend to come out on top. But what about putting the A6000, 4090, etc. up against scenes that are 40 GB in size? The A6000 should rightly clean house. The A6000 is meant for larger file structures, right?

  • @JoelinoPT
    @JoelinoPT 2 ปีที่แล้ว

    As a professional using Blender, it's a no-brainer: the RTX 4090 is worth buying, even if you already have a 3090 or 3090 Ti. For work, you will see a return on your investment quickly. Of course, it depends on where you live and how much you can make with your work. Do your math.
    I remember when the Titan X Pascal was the best in class and I was very happy back then with 2 or 3 cards... Today a 4090 does the job 10 to 14 times faster than a Titan X Pascal: 450 W vs 3.5 kW (14x Titan X). It's incredible progress.
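The arithmetic behind that comparison checks out as a sketch (board powers below are the approximate figures from the comment, not measurements): matching one 4090's throughput would take about fourteen Titan X Pascals.

```python
# Equal-throughput power comparison from the comment above.
TITAN_X_WATTS = 250    # approximate Titan X Pascal board power
RTX_4090_WATTS = 450   # approximate RTX 4090 board power
SPEEDUP = 14           # 4090 renders ~14x faster, per the comment

equivalent_titan_watts = TITAN_X_WATTS * SPEEDUP  # 3500 W for the same output
ratio = equivalent_titan_watts / RTX_4090_WATTS

print(f"{equivalent_titan_watts} W of Titan X vs {RTX_4090_WATTS} W of 4090 "
      f"(~{ratio:.1f}x less energy per rendered frame)")
```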

  • @TheStrandedAlliance
    @TheStrandedAlliance 2 ปีที่แล้ว +1

    I wonder if at some point Cycles becomes faster than Eevee for VR rendering. Unfortunately, Eevee is still poorly optimized and hasn't received many updates since its debut.

  • @tonymoore3122
    @tonymoore3122 ปีที่แล้ว

    Super helpful as I'm considering purchasing a 4090 or 4080. Thanks so much!

  • @DrivenKeys
    @DrivenKeys 2 ปีที่แล้ว +6

    Thank you for this, Sir. I'd like to see the Part 2. Eventually, if you can get them, I'm interested in how the 4080 cards will do with pro apps. I animate, so fast rendering isn't that important to me, but the Omniverse benefits with Unreal integration are very tempting. I already know I'll have to upgrade my 3060ti soon, it's extremely refreshing to hear an artist's view on what to choose. I like daydreaming of AMD competing here, but it really doesn't seem realistic. What do you think?

    • @CaptainScorpio24
      @CaptainScorpio24 2 ปีที่แล้ว

      How's your 3060 Ti working?

    • @DrivenKeys
      @DrivenKeys 2 ปีที่แล้ว +1

      @@CaptainScorpio24 So far, I enjoy it. If you're only animating with well-designed rigs in Maya's Viewport 2.0, it's good enough, but anything beyond that might want more power. Currently, I'm attending Animation Mentor, which uses very efficient rigs. The 3060 Ti paired with a Ryzen 5600X lets me view my animation in Maya's Viewport 2.0 with lights and textures. Of course, a playblast is more accurate, but the viewport plays well enough as you tackle notes. Once I graduate AM, I'll want to explore more projects similar to what Sir is tackling on his channel. As he pointed out in the video, a 3070 couldn't handle rendering some files. That said, you only need GPU-only rendering for faster render times. If you're limited on budget, a 3060 Ti will do for a while, and it games pretty well with ray-traced games at 1080p. GPU prices are going to go down soon, as the 4080 cards release next month and AMD introduces its new generation. So, if you can wait, I recommend holding out for a 3070 or 3080, or a less expensive 3060 Ti. Also, if I didn't care about ray tracing and Nvidia's Omniverse, I would have tried AMD. For the same price, they have more power and memory. That said, I love ray tracing, so I don't regret the purchase.

    • @rVox-Dei
      @rVox-Dei 2 ปีที่แล้ว

      Late to the party, but AMD has been making a lot of updates and it's been getting better; they're also finally adding hardware ray tracing in Blender 3.5.

    • @DrivenKeys
      @DrivenKeys 2 ปีที่แล้ว

      @@rVox-Dei Yes, I'm very interested in RDNA 3's improvements. They're getting Ray Tracing sorted, but people say CUDA still has a vast pro advantage. So, the general thought is that, for a 24gb card, a 3090 will still be a better choice than a 7900xtx, but we have yet to see if that's the case. I have a close eye on the 4080 (costs about the same as 3090ti), but I'm concerned about the 16gb vs 24 for rendering large scenes.

  • @ChosenMan37
    @ChosenMan37 2 ปีที่แล้ว +1

    I'm doing a workstation build and was wondering whether to get this or the A6000 instead. I would like to game occasionally, but not all the time. The main focus is workstation use. I'm getting the Ryzen Threadripper and was wondering which GPU is best. I'd appreciate it if anyone can help.

  • @christopherjunkins
    @christopherjunkins 2 ปีที่แล้ว

    PART 2! This is awesome work!

  • @ahmedouardani2370
    @ahmedouardani2370 2 ปีที่แล้ว

    It might sound outlandish, but why won't Nvidia let the VRAM be modular so it can be upgraded, or maybe offer an add-in card that provides more VRAM? The only thing that stops people from combining a multitude of video cards and testing some crazy configs is VRAM.

  • @lucasdigital
    @lucasdigital 2 ปีที่แล้ว

    Yesterday I ordered a new PC, equipped with a 4090. Eye-watering cost, here in the UK. This video helped calm my jitters. I buy a new machine every 3-4 years. This is the first time I've bought chiefly for Blender performance, rather than gaming.

  • @amineamamra2603
    @amineamamra2603 2 ปีที่แล้ว +3

    The results are mind-blowing!!
    Would love a part two testing EmberGen, and maybe some Houdini and Karma GPU on it!!

  • @TempyEdits
    @TempyEdits 2 ปีที่แล้ว +2

    This video was off the charts

  • @dantheman1998
    @dantheman1998 2 ปีที่แล้ว +1

    I figured that the 4090 was going to sell out at the price it did for this exact reason: professionals.

  • @scientist1182
    @scientist1182 2 ปีที่แล้ว

    IIRC, sometimes Cycles will just refuse to use your GPU and immediately fall back to CPU rendering for no reason whatsoever.
    I learned that the hard way, wondering why a short animation was taking so long to render with my GPU, until I checked Task Manager and saw my CPU maxing out with my GPU at 0% utilization.
    I had to render in Eevee because of that, which sucks.

  • @LordLab
    @LordLab 2 ปีที่แล้ว +3

    It would be nice to see tests like this on new CPUs, like the AMD AM5 lineup and Intel 13th Gen.

  • @amanda.collaud
    @amanda.collaud 2 ปีที่แล้ว +2

    He rendered at 16K resolution, which favours the 4090. No one noticed?