What's interesting is that across several channels, testing multiple card manufacturers, none of them are any better than the FE. So as of now it appears NVIDIA has saddled its partners with producing higher-cost, larger (non-liquid) cards that offer...nothing in terms of performance. EVGA's withdrawal becomes clearer.
Really good video. Everyone is looking at games and FPS, and it's hard to justify an upgrade from a really solid card like the 3090 or 3080, which will keep you playing games for some years still. In 3D production the 3090 was a godsend. The 4090 will enable smaller studios to achieve results faster and increase the scope of their projects. I agree most board partner cards are the worst examples of industrial design. Spending so much money on something that looks like a bad Christmas tree baffles my mind.
It's only hard to justify if you don't have a high refresh rate 4k monitor. I got a 4k 240hz monitor that's $1500 by itself. I think I can justify a $1600 gpu to push that amount of 4k frames to that beast of a monitor!
@@herculesmare4209 true, I got mine for gaming and I'm not regretting it. Currently have a 3080 and it's okay, but it's definitely not able to push the amount of frames at 4K that I want.
@@jefferybucci30 except they screwed us with DisplayPort 1.4a and HDMI 2.1, so we won't be gaming at 240Hz 4K without chroma subsampling. That's the one thing that annoyed me about this launch.
I'm going for a 4070 Ti instead, since I can't afford a 4090 & also I'd rather build a new system if I wanted a 4090. I'm going to pair that 4070 Ti to a Ryzen 9 5950X after I upgrade it from a 3950X.
Excellent analysis! Just what I wanted to hear for use with DaVinci Resolve. 👍 Do you plan to conduct any local AI rendering (i.e., Stable Diffusion, Flux1 Dev, Mangio, etc.) comparisons in future videos? 🤔
I bought precisely this one (ASUS TUF 4090 OC) about 2 months ago, and I'm extremely happy I didn't go with STRIX (I had money, they just were out of stock). I like the design, I like the minimal RGB, which I set to reflect GPU temp. Never runs higher than 65 C, even on the most demanding games with 4K and RT on.
Nice comparison of this crazy GPU. Do you think the RTX 4090 can use its full potential in DaVinci Resolve in a Ryzen 5900X 12-core system? Or should I wait until 3090 prices drop some more and stick with that card?
Keep in mind that DLSS3 is just DLSS2 with reflex and frame generation. So, it's using AI to create frames that DO NOT exist, which I could see as being problematic for a content creator as well. And I take it that it has dual ENCODERS, but not dual DECODERS? I'm not as interested in encoding support as I am about decode support. I was super excited about this, but less so if it's just the encoders. Since I work in x264, I was hoping for big gains in timeline perf. Regardless... these were the benchmarks I was interested in. It'll be interesting to see where prices fall once NVIDIA burns through 30 series cards and they run out of less price sensitive people.
Hi, at 3:55 you show the encoder results for both the RTX 3090 and 4090, showing a 26.5% increase on the RTX 4090. Are you sure this is correct? Isn't that a metric where lower is better, since it seems to measure the time it took for the encoder to finish the task? So less time should be better, no?
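For what it's worth, a time reduction and a speedup percentage are not the same number, and mixing them up is a common source of confusion with time-based benchmarks. A quick sketch (with made-up example times, not the video's actual results) of how you'd convert one into the other:

```python
def speedup_from_times(old_seconds: float, new_seconds: float) -> float:
    """Return the speedup as a fraction, e.g. 0.265 == 26.5% faster.

    For time-based benchmarks lower is better, so the speedup is
    old/new - 1, not (old - new)/old (which is the time *reduction*).
    """
    return old_seconds / new_seconds - 1.0


def time_reduction(old_seconds: float, new_seconds: float) -> float:
    """Return the fraction of time saved, e.g. 0.25 == 25% less time."""
    return (old_seconds - new_seconds) / old_seconds


# Hypothetical example: an export that drops from 100 s to 79 s.
old, new = 100.0, 79.0
print(f"speedup:        {speedup_from_times(old, new):.1%}")
print(f"time reduction: {time_reduction(old, new):.1%}")
```

So a "26.5% increase" can be a perfectly sensible way to report a shorter encode time, as long as it was computed as old/new rather than new/old.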
No H.265 4:2:2 10-bit decoding…😂…in 2022. Thank you for pointing that out. This cements the Mac as THE platform for video editing for me. Sure, Intel's iGPU does it kinda decently, but not nearly as smooth and fast as a MacBook Pro Max does. So thanks, you saved me a TON of money!
We had the same effect with the jump from the 980 Ti to the 1080 Ti. The change was enormous, and at the time people were shocked; nearly double the performance on the 1080 Ti was amazing.
Well, the decoders are the same, so the only boost you get is in effects, colour grading, Fusion, etc., plus debayering on R3D and similar formats; a purely H.264/H.265 timeline performs very similarly.
Just ordered an Asus TUF 4090 (non-OC). Is it the same as the OC one? Can I just do the overclock myself? I read it was possible on the old TUF to flash the BIOS of the Strix; is it safe and doable on this card as well? I saw a test of the TUF, performance and temps look amazing, 62°C vs 74°C on the Founders at load 😊 Will get it by Saturday, can't wait!
I'm not sure if the non-OC card allows overclocking. I understood that OC stands for "it allows overclocking" and not so much "it comes overclocked". That said, you won't really need to overclock this card 😅
If I buy a 3080ti or 3090 for 1440p gaming, how many years will it be good for before I need to upgrade to another gpu to be able to play new AAA games in the future?
@@angelaguilera600 it's cool xD I have a ROG Strix 3090 OC for that very reason; plan to stick with 3440x1440 at highest settings. Hope you find a good deal; my 3090 was a limited edition, 2200 pounds.
Very impressive for 3D workloads. I'm hoping AMD brings something impressive to the table with RDNA3, especially for creators. I'm building a new workstation for my daughter soon and it would be great to have something that can compete, ideally at a slightly lower price point. Additionally, I really don't want to give Nvidia my money; they have a great product but very questionable business practices, and I just don't like the way they treat the consumers of their product (and AIB partners for that matter *cough*EVGA*)
Yeah, when I buy the 4090 I will set a max FPS cap so I get some energy savings in return. The downside of the 4090 seems to be quite high idle wattage; it tops the list there.
I'd like to actually see the test results at the low end and at idle for 2D productivity work. I'm not sure how low it can actually be throttled. Maybe... just maybe, the card actually gives people options to either run at bleeding edge speeds or scale down for work and save on electricity. I haven't seen any PC press look at it this way, yet. Everyone's been "OMG so much POWER!" - but it's actually completely optional that the card runs at these speeds.
Very helpful. Concise and well-organized review. You're getting high performance results on a gen4 mobo and a gen4 CPU. Although I wonder if you'd get the same results with a Ryzen 5950x.
when doing email/browsing, what are the power uses of a 3090 and 4090? Are they the same when just cruising along? Or are we going to see 225 watts even when idling away on the 4090?
@@theTechNotice I wonder why no other reviewers ever say this? So thankful you took the time to share it! Surprised how low the wattage is when it's only idling!
PLEASE HELP ME CHOOSE A BUDGET OPTION: 3060 (12GB), 3060 Ti (8GB), or 3070 (8GB)? I'm doing only DaVinci Resolve 4K with the H.264 codec from my Pixel 3 XL phone. My setup: 1060 (6GB), 64GB RAM, 6-core/12-thread CPU. Thanks for any comment here!
One of my favorite designs is probably the Colorful Vulcan, that one is nice looking and comes with a mini OLED screen that can show temps and other information.
Excellent video! I have a question: could you make a video showing the differences between a system using an RTX 4090 and the same system using two RTX 4080s? Would there be much of a difference in performance within the programs? Thanks for the content! Keep up the great work you've been doing =D
The 4090 is a sick piece of tech. And people were already ungrateful for the 20 series and 30 series, bitching despite the ridiculous performance increases over the years. Hell, people still try to get their hands on a PS5, which has the GPU power of a 3060/2080. And in the end everyone tries to get their hands on these new cards. Hypocrites. Much love to NVIDIA for this next huge step.
I just got the MSI Suprim Liquid X RTX 4090 in my 13700KF, 570, DDR5 6400 system and hit the top 120 systems in the world in benchmarking (Superposition), and can get 140 FPS in Cyberpunk 2077 at 4K with everything maxed out. Loving it.
Also, please vertical-mount your 4090; it looks badass and it protects it from damage. The HYTE Y60 is a small, beautiful case. I got the white one with red RGB and black accents and it looks out of this world. Got my 4090 with push-pull and a 360mm rad in the top cooling the CPU. Cost me £3859 for the build and I've already had a mate offer me £5000 to buy it off me; I refused of course.
I just love how people will say it is not worth it... I mean, if you are on a budget, don't look at the 4090... And if you use it for work, then the money you save on time is always worth it. And if you are just rich, why even bother to think about it... Buy the best :D
Thanks for all that you do for your community :) I'd like to know if the GPU has an impact on playback in general? In Premiere Pro? For now I'm using a GTX 1660, but sometimes the effects on the timeline are a bit buggy. I don't really care about exporting faster, but by changing the GPU, will playback improve? Even a bit? Technical question, I know. Thanks in advance!
Performance-wise and thermally, is there a gap between the FE and the partner designs? I absolutely hate this generation of designs and the massive prices on the AIB cards, but because it's for work, performance is king for me.
It's interesting to worry about the appearance of the card. I guess if I spent my time looking at the GPU instead of using it for its intended purpose I might care about aesthetics. My concern for the GPU's appearance lasts as long as it takes to get it installed. Once I'm using it I couldn't care less.
It launched a little over an hour ago. Availability obviously depends on where you're from. Assuming you're from America they went live on newegg and Best Buy this morning and are now sold out.
@@blueskys6265 Damn, I'm sure you'll be able to get one in the next round if not this round. They sure seem to have a lot more stock and without crypto/scalpers it should be easier.
I hoped it was a review of the TUF card. I think you missed out a lot of details, like idle wattage, where the 4090 will use a bit more than the 3090; also that the ASUS is the only card with 2 HDMI ports; and, for the last part, whether it uses 3 or 4 power connectors, as this depends on whether it goes beyond 450W when overclocking. What is the difference between the TUF and TUF OC models? All the cards look very much the same, and looking at the INNO3D OC model, it was not even able to overclock, so what does this OC mean in ASUS terms?
And then we look at "performance-per-£". Here in the UK I can pick up an RTX 3090 on eBay for ~£725 (though some, on rare occasions, go for £650). Compare that to the prices from any vendor in the UK: the LOWEST price I can get an RTX 4090 for is £1800... but they go up to £2300. (Note: when you see the MSRP in the US, that does not include sales tax, so add another 5-10% to the price tag. So it's not $1599, it's closer to $1700 USD, which, btw, is more than the cost of every other bleeding-edge component that I can put into a new build.)

So that means (here in the UK) the RTX 4090 is 2.5x the price of the RTX 3090 (using the numbers most favourable to the 4090; I could have made it look worse). So even taking the best score comparisons that you showed (which fluctuated between 1.9-2.3x the performance), it comes down to:
1. The RTX 4090 consumes more power (other channels have measured from the wall, not from inside Windows, as you have)
2. The 4090 gets better performance-per-watt
3. The 3090 gets better performance-per-£

So the real answer is: what can you afford? I've seen sites that say "most will just spend a tiny bit more to get the 4090", but at nearly £2000, most of us can't drop that kind of cash every couple of years on a GPU. It's utterly ridiculous. Basically, you would only buy the 4090 if you are going to make money from it (or if you just have the spare cash lying around). In my eyes, it's a golden opportunity to get into the 3090 if you are still on a 20-series card. Because buying that 4090 is going to cost you £1000+ to rent it for a year (if you flip it just before the 50-series comes out), or if you like to keep your cards forever, you're going to absorb nearly the full cost of that card. Ouch.

I honestly think that the best solution for all of "us normal folk" is to NOT buy any 40-series stock. Let Nvidia control the stock levels... I would love to see them sitting on surplus inventory, drowning in pools of their own sweat.
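The price-vs-performance argument above can be sketched in a few lines, using the comment's own rough UK numbers (which are of course assumptions that will drift with the market):

```python
# Rough UK street prices and relative-performance figures quoted above.
price_3090 = 725.0   # £, used on eBay
price_4090 = 1800.0  # £, lowest new price quoted
perf_ratio = 2.1     # midpoint of the 1.9-2.3x performance range

price_ratio = price_4090 / price_3090
# Performance-per-pound, normalised so the used 3090 == 1.0.
value_4090 = perf_ratio / price_ratio

print(f"price ratio:     {price_ratio:.2f}x")
print(f"relative perf/£: {value_4090:.2f}")  # < 1.0 means worse value than the 3090
```

With these numbers the 4090 delivers roughly 0.85x the performance-per-pound of a used 3090, which is exactly the comment's point 3, even while point 2 (better performance-per-watt) still holds.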
Sure, let the content creators and professionals buy it; they have a reason to, it's their livelihood. But the rest of us? I think we should vote for lower-priced GPUs by starving team green (and red). I have 4 TVs in my house and they collectively cost me £3000, my family gets way more use out of them than most of us would ever get out of a GPU, and they'll last 5+ years. The GPU market just doesn't add up. People are wasting cash on these GPUs because videos like this make it seem like a reasonable purchase. Please stop supporting this lunacy with positive messaging like this.
@David Fidler this is the most sane answer on the RTX 4090 I've read all week. Everyone on Reddit is just pushing this 4090 for gaming and not workload/cash flow IRL, and they don't take into account the actual cost of electricity. I live in Romania; it's not a third-world country anymore, but the salaries are still some of the worst compared to the rest of Europe. We pay 26-27 cents per kilowatt-hour in a household and our salaries are stuck at 700-800 euros max. People have become so ignorant today, it's crazy.
I totally agree, the Founders Edition has the best design, but this Asus TUF looks nice too. Most other designs are for gamers as that's the majority of people who will buy these cards, much more than professionals. I'm a gamer too but I really prefer a more professional looking product over a flashy RGB one.
13:13 I am not sure what testing platform you had for PugetBench for DaVinci Resolve, but looking at user results on the PugetBench page, most results of the standard test for the 3090 were about 2000, and most results for the 4090 are about 3000.
The main problem with the 4090 is the 3090. Unless you only game at 4K, get a 3090 or 6900 XT. Also, many 3090s are going on sale between 799-980 new, and lower used. You can build an entire 3090 gaming computer for the price of a 4090, especially if you can't find a 4090 FE.
I still prefer seeing 60-70% better Raytracing performance in games, as going with anything but max graphic settings is the sole reason to get a card like this for gaming anyways. Also why I am not fond of the AMD 7900XTX, as it doesn't care about anything but raster, which is quite short sighted imo. The 3090 would have to be at a really good deal for me to consider it, it's over 2 years old now and has its share of problems, some cooling hotspot issues etc, and overall just not as high quality as the 4090 - without even considering the performance upgrade. If the 3090 performance point is what's relevant, then I'd rather wait for 4080 cards to get in stock and drop down in price a bit.
My 3090 used to hit the side of the case, so I ended up getting a PCIe 4 riser cable and all is good with the world. Even if you manage to get this card in, you'll never be able to fit the power cables.
The Asus TUF is $1799 plus taxes for a 40% increase in performance for video editing. You can get a 3090 (I just got one yesterday) for $830, taxes included. Less than half the price. It's a great card, don't get me wrong, but since for Nvidia Moore's Law is dead, they will charge you for it. Meaning in a couple of months we will get a 4080 12GB with less performance than a last-gen 3090 for the same money.
Waited outside Best Buy this AM at 4:30 only to be told they had nothing in stock. Rushed to a Micro Center and got the Gigabyte OC, as the TUF was already taken. Now I see a bunch of scalped cards on eBay with 1k+ markups, insane. Best Buy and Nvidia seem to have no interest in selling directly to consumers. Grateful I got one, but very frustrated with Best Buy and the FE edition.
What do you think about the 4090 board partner designs? What's your favourite? 👇👇
None 😂🥱
do you know where I can download the driver for 4090?
Gigabyte
Yeah, on Nvidia Site :)
@@theTechNotice but I can't find it 😭
I upgraded to a 4090 from a 3070ti for deep learning applications and also gaming. Gaming wise, I have set the card at 60% power target at which it only loses about 6-8% of performance. To give a perspective, I play Apex Legends at medium settings on a 4K 144Hz monitor, the game is locked at 144 fps constantly, with the card working at around 40% and consuming on average 120-150W. The 3070ti almost maintained 144 fps on 4K low but overclocked and it consumed on average around 300W. Insane efficiency.
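The efficiency claim above is easy to quantify; a quick sketch using the rough figures from the comment (144 fps at ~135 W on the power-limited 4090 vs 144 fps at ~300 W on the overclocked 3070 Ti; the wattages are the comment's estimates, not measurements):

```python
def fps_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt drawn."""
    return fps / watts

# Numbers quoted in the comment above (rough averages).
eff_4090   = fps_per_watt(144, 135)  # 4090 at a 60% power target
eff_3070ti = fps_per_watt(144, 300)  # overclocked 3070 Ti

print(f"4090:    {eff_4090:.2f} fps/W")
print(f"3070 Ti: {eff_3070ti:.2f} fps/W")
print(f"ratio:   {eff_4090 / eff_3070ti:.1f}x more efficient")
```

At the same frame cap, the efficiency gain is simply the inverse of the power draw ratio, about 2.2x here.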
That 60% power limit thing was done with the Founders Edition, which has a better PCB than most of the AIBs'. Basically it has 70A power phases, and a larger count of them than reference boards, and most AIBs with custom PCBs aren't matching it. There are maybe 5 total 4090s, not including liquid editions, that exceed the FE. This matters because it allows both greater overclocks and the ability to drop the power limits lower without performance throttling.
When I did crypto mining with the 10 series, back when those were new, I was able to power-limit a good EVGA card to 60%, while some PNY card using a reference PCB would throttle hard at 70%. The reference PCB would perform fewer hashes at 80% than the custom PCB on the EVGA did at 60%.
Hello, how are you? If I may ask, is your 4090 connector still fine or has it melted?
@@Bubble1989 it has been running 24/7 at full load (training deep learning networks) for almost 4 months now and everything is perfectly fine. I am using the official Corsair cable though, not the adapter.
@@georgioszampoukis1966 Thank you for your reply. The reason I'm asking is because I have ordered a 4090 as well, and I was wondering if I made the right choice.
@@georgioszampoukis1966 Hi Georgios, I've been using a 2080 but have a 4090 on its way. How does it compare with the 3070 Ti for training models? Does it reduce training time substantially?
Love the fact you include creator export and rendering times here. Becoming increasingly relevant.
Although I just realised I mixed up the 3090 and 4090 screen grabs, the stats are still the same :)
@@theTechNotice The issue is that the Adobe suite is useless as a benchmark, especially After Effects. It was made before multi-core computers were really a thing, let alone discrete GPUs. This is coming from a professional VFX editor's standpoint. Only 4 or 5 of the effects in AE are actually GPU-ready, many aren't even 16-bit, and none are multithreaded; that was removed because they broke it during an update to the decrepit 30-year-old program. As you can see here, in most Adobe software, better hardware gets you diminishing returns at best, and many generations of hardware actually made it worse. *It's just not made to run on today's hardware, and it is poorly optimized. There's only so much that throwing more watts and cores at it can do.*
This is exactly the kind of stuff that needs to be talked about. Too many people are looking at these cards from the perspective of casual gaming; of course it's not going to be worth it for that unless you have deep pockets. The real value for a card like this is the time savings that can be had for creators and professionals, where time is literally money. In such scenarios, the card literally pays for itself. I don't think anyone using these cards professionally cares at all about how much power it draws, but rather how quickly it can get the job done. And that's another thing: if it can complete heavy workloads quickly, then how much total power is it actually drawing at the end of the day?
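That last question is really about energy (power × time) rather than power alone; a toy sketch, with made-up render times and wattages purely for illustration, of how a higher-wattage card can still use less energy per job:

```python
def job_energy_wh(avg_watts: float, hours: float) -> float:
    """Total energy for one job in watt-hours: power x time."""
    return avg_watts * hours

# Hypothetical render job: the faster card draws more power but
# finishes in roughly half the time (numbers are illustrative only).
energy_3090 = job_energy_wh(avg_watts=350, hours=2.0)  # 700 Wh
energy_4090 = job_energy_wh(avg_watts=430, hours=1.0)  # 430 Wh

print(f"3090: {energy_3090:.0f} Wh per job")
print(f"4090: {energy_4090:.0f} Wh per job")
```

So as long as the speedup is bigger than the increase in draw, the total energy per job goes down, which is the commenter's point.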
Seems you're not living in Europe. Energy prices are something to consider here. Also the global warming thing: energy efficiency is the easiest way to help out on that, each of us bit by bit.
@@brolly8667 you completely lost me at global warming.
@@chebron4984 Or climate change, whatever you call it. It's a thing, probably the biggest problem yet to be solved by humanity, whether you accept it or not. I know it is highly politicized, in the US especially, which it shouldn't be. Science should be free of that.
@@brolly8667 this video is literally all about how you'll probably use LESS energy creating & rendering on a 4090 than a 3090.
@@Fin365 Sure, which is a good thing. But I was reacting to the comment from Chebron which said that nobody cares about energy consumption in the professional field, and I said why they should. It's a good development that efficiency now also seems to be in focus, and not just pure speed and workload. But I guess it's mostly out of necessity, for cost or climate reasons, because you also have to get rid of all the energy you put in by dissipating the heat again.
For me Davinci Resolve performance is more important than gaming, that's why I appreciate your work.
Getting these cards just for gaming is ridiculous; a 3080 is more than enough, I think. Unless you're loaded and just want the best available.
For me gaming is more important than DaVinci Resolve, that's why I appreciate your work also.
You don't play, you put the work in. Subscribed.
Thanks dude, and you're right, don't play at all! :)
Ok, I admit I wasn't expecting such good results in some benchmarks, and apparently I was wrong about how powerful the new 4090 could be. Thanks for the comprehensive video, it was quite helpful.
haha, yeah, did you see your comment in the video ;)
Honestly, I was amongst the ones who were wrong!
@@theTechNotice 😄😄😄 yes
It's due to an older chip, OS, RAM clock speed, among other things. Benchmarking a 4090 requires a 7950x/13900k, 6000MT/s DDR5, Win11, Z790 Mobo, etc. He used (a) garbage CPU, RAM, and Mobo. Watch the Linus Tech Tips video.
We knew how powerful it would be months ago; explore other channels.
This channel is underrated, one of the best channels for a creator's pc build.
Rendering time is more important than the fps to me.
When I find some creator pc build or comparison on the search engine, I just come back to this channel for the answers every single time.
Love your video so much, good job.
Btw I think you can also add win 10 for next comparison.
Great stuff! Glad you focused on a creator oriented review! Love your channel!
Glad you enjoy it!
The Tuf 4090 can draw up to 600W compared to other cards at 450W. It's a pretty good AIB card.
Thank you for putting this in a different perspective. The truth is these cards aren't targeted at casual gamers. Most of the people complaining about the power consumption and the price of these cards were never going to buy one to begin with. The graphics card landscape has shifted some. It's become obvious that the 4000 series of cards aren't meant for casuals. They're meant for hardcore gamers chasing the most performance possible and professionals for work. If you're a casual gamer and want an entry level or midrange card you need to look at last generation.
I definitely think the TUF feels more premium than a STRIX. Yeah, they've got those race car decals and RGB, but the TUF just looks much better. That's why I got the TUF 3080; the design and the temps were great. I really hoped ASUS was going to continue this line, and they made exactly the 4090 card I wanted; ordered the moment it was listed :)
Not to mention this non-OC TUF is MSRP ($1600).
As a Blender artist, this video was so reassuring for me. Thank you for putting your time and effort into making a thorough creator review.
It's wrong though; since when can you use DLSS with Blender? It's not applicable.
Thanks for always being focused on creator stuff, among all the mass of reviewers who think PCs are just for gaming. Are you planning to also review the Intel Arc A770? Is Daz Studio performance comparable to Blender regarding benchmarks? Nobody tests DAZ Studio, so it could be a nice thing to see on your channel.
I also really like the technical quality of your videos regarding the colour palette, the accurate use of lights, and the savvy camera use. Clearly your videos are very well planned and all have a consistent style.
Thanks dude, yeah I'm in the works with Intel about their GPUs :)
That intro deserves an award!! Great video as always bro!
Thanks dude!
How good on Unreal Engine and Aximmetry software?
With a little undervolting, this card should still perform well above the 3090 Ti, but at the TDP of my current 2070S. Now that is quite an achievement! I saw some tests (der8auer, I believe) showing that the 4090's efficiency curve lets it retain most of its performance at far lower wattage than its stock TDP rating. And you'd have an ultra-cool GPU on top, thanks to the overbuilt coolers on this generation of cards. I truly like everything about it; only the price is quite spicy, but I'd much rather get this than any of the AMD cards or the 4080.
My only personal drawback is that my render software of choice, Corona Renderer, relies on CPU rendering and only uses the GPU for denoising 😭 At least my 13900K should take care of the rendering, I hope.
LinusTechTips did an undervolting experiment where he tried to undervolt the GPU to make it work with a 550W PSU. Even at around 60% of its power usage, the GPU only lost roughly 10 frames at 4K Ultra in Cyberpunk.
You're the only reviewer among the videos I watched about the 4090s that noticed the typo on the card 😅, and I totally agree with you about the overall design of the cards, they all look so damn cheap other than a few exceptions.
Is this a 3-slot card? I want to put it in the NR200; always liked the TUF cards.
Thank you for doing a creator's-perspective review of this card. I built a new PC last year with a 12900K and 128GB of DDR5 RAM, but with my old graphics card (GTX 1080 Ti). An RTX 3090 cost about €3000 at the time. My most important program is DaVinci Resolve, mostly Fusion, so this card will completely change everything for me.
I ordered the MSI 4090 Suprim Liquid X directly. I like the design, unlike the other 4090s, as you said in the video. And with a liquid cooler, the card is way smaller than the others.
$1749? RIP.
@@mct8888 I just ordered the Zotac 4090 AMP Extreme AIRO... RIP my $1800.
@@Xyz_Litty I'm living in Germany... it's so much more expensive here... I paid €2329.
@@rf-cinematic1576 Same price in Portugal, and the STRIX is €2549. We're getting fucked here.
What's interesting is that across several channels, testing multiple card manufacturers, none of them are any better than the FE. So as of now it appears NVIDIA has saddled its partners with producing higher-cost, larger (non-liquid) cards that offer... nothing in terms of performance. EVGA's withdrawal becomes clearer.
interesting....very interesting...
Can you recommend a version of this card which is the most quiet? Is the ASUS TUF version quiet with 3 fans? Thanks.
Is GPU a necessity for streaming?
WHAT AN INTRO! Fastest like I've ever given. Thank you for this video!
Really good video. Everyone is looking at games and FPS, and it's hard to justify an upgrade over a really solid card like the 3090 or 3080, which will keep you playing games for some years still. In 3D production the 3090 was a godsend. The 4090 will enable smaller studios to achieve results faster and increase the scope of their projects. I agree most board-partner cards are the worst examples of industrial design. Spending so much money on something that looks like a bad Christmas tree baffles my mind.
It's only hard to justify if you don't have a high refresh rate 4k monitor. I got a 4k 240hz monitor that's $1500 by itself. I think I can justify a $1600 gpu to push that amount of 4k frames to that beast of a monitor!
@@jefferybucci30 Sadly the pro or colour accurate monitors for production that is above 60 hz is so crazy expensive
@@herculesmare4209 true, I got mine for gaming and I’m not regretting it. Currently have a 3080 and it’s okay but it’s definitely not able to push the amount of frames at 4k that I want to.
@@jefferybucci30 Except they screwed us with DisplayPort 1.4a and HDMI 2.1.
So we won't be gaming at 240Hz 4K without chroma subsampling.
That's the one thing that annoyed me about this launch.
I'm going for a 4070 Ti instead, since I can't afford a 4090 & also I'd rather build a new system if I wanted a 4090.
I'm going to pair that 4070 Ti to a Ryzen 9 5950X after I upgrade it from a 3950X.
Excellent analysis! Just what I wanted to hear for use with DaVinci Resolve. 👍 Do you plan to conduct any local AI rendering (i.e., Stable Diffusion, Flux1 Dev, Mangio, etc.) comparisons in future videos? 🤔
I bought precisely this one (ASUS TUF 4090 OC) about 2 months ago, and I'm extremely happy I didn't go with the STRIX (I had the money, they were just out of stock). I like the design and the minimal RGB, which I set to reflect GPU temp. It never runs higher than 65°C, even in the most demanding games with 4K and RT on.
Do you get any coil whine? If not, I'm curious what PSU you are using?
@@damienfiche2592 My TUF 4080 whines a lot.
Very nice review. Is the 4090 supported by the ASUS Z690 Creator Pro MB? Will it play nice with other components like RAM, i9 12900KS, etc?
Yes, it's supported on all motherboards that support Gen 4 PCIe x16 slots :)
Incredible intro !
Nice comparison of this crazy gpu.
Do you think the RTX 4090 can use its full potential in DaVinci Resolve in a Ryzen 5900X 12-core system?
Or should I wait until the 3090 prices drop some more and stick with that card?
You should be good mate
What case would you recommend for the 4090 then? I've seen a mix of recommendations like the P600S, O11 Dynamic XL, etc.
Yeah, both of those work. I guess all full towers work; you just have to be careful with the mid-towers :)
Did you experience any coil whine on your TUF, and if so, has it gone away over time?
Never trust mid sized channels like this channel as they want to satisfy the big three to get free parts. Only watch niche or big TechTubers
Can't wait for a 13900k+4090 vs Mac studio Ultra for video editing extensive comparison!!
Keep in mind that DLSS 3 is just DLSS 2 with Reflex and frame generation. So it's using AI to create frames that DO NOT exist, which I could see as being problematic for a content creator as well. And I take it that it has dual ENCODERS, but not dual DECODERS? I'm not as interested in encoding support as I am in decode support. I was super excited about this, but less so if it's just the encoders. Since I work in x264, I was hoping for big gains in timeline performance.
Regardless... these were the benchmarks I was interested in. It'll be interesting to see where prices fall once NVIDIA burns through 30-series cards and runs out of less price-sensitive people.
Yeah just 1 decoder but 2 encoders...
Do all 5 video outputs work at once? Or is it still limited to 4 monitors even with the extra output?
Nice to see that AV1 was used to encode this video. Probably why 8K was possible on my internet for this video!
Hi, at 3:55 you show the encoder results for both the RTX 3090 and 4090, showing a 26.5% increase for the RTX 4090. Are you sure this is correct? Isn't that a lower-is-better metric, since it seems to measure the time it took the encoder to finish the task, so less time should be better, no?
Is this card good for Davinci Resolve? Better or worse than Radeon 7000?
No H.265 4:2:2 10-bit decoding... 😂... in 2022... thank you for pointing that out. This cements the Mac as THE platform for video editing for me. Sure, Intel's iGPU does it kinda decently, but not nearly as smooth and fast as a MacBook Pro Max does. So thanks, you saved me a TON of money!
We had the same effect with the jump from the 980 Ti to the 1080 Ti. The change was enormous, and at the time people were shocked; double the performance on the 1080 Ti was amazing.
Outstanding video! Just received my 4090 TUF OC and waiting on Cablemod order before I install it.
Fair winds and following seas to all.
Does it feel smoother on the Resolve timeline? I'm tempted by this, but also by a new Intel chip with the iGPU.
Well, the decoders are the same, so the only boost you get is in effects, colour grading, Fusion, etc., plus debayering on R3D and other formats; a purely h.264/h.265 timeline is very similar.
@@theTechNotice thanks man!
Just ordered an ASUS TUF 4090 (non-OC).
Is it the same as the OC one? Can I just do it myself? I read it was possible on the old TUF to flash the STRIX BIOS; is that safe and doable on this card as well?
I saw a test of the TUF; performance and temps look amazing, 62°C vs 74°C on the Founders at load 😊 I'll get it by Saturday, can't wait!
I'm not sure the non-OC card allows overclocking.
I understood that OC stands for "it allows overclocking" rather than "it comes overclocked".
That said, you won't really need to overclock this card 😅
It's not even 50MHz of OC, so no difference :)
Nice, but needing a more powerful power supply and a bigger case is extra money.
If I buy a 3080 Ti or 3090 for 1440p gaming, how many years will it be good for before I need to upgrade to another GPU to be able to play new AAA games?
5 years or more, maybe 10 max.
@@SC.KINGDOM thanks
@@angelaguilera600 It's cool xD I have a ROG Strix 3090 OC for that very reason; I plan to stick with 3440x1440p at highest settings. Hope you find a good deal; my 3090 was a limited edition, £2200.
Best review of 4090. For Resolve and Blender, it's a no brainer, if you need that kind of performance.
How to download RTX 4090?
Finally a PC channel for stuff besides games :) Thanks, you helped me choose my new rig.
Very impressive for 3D workloads. I'm hoping AMD brings something impressive to the table with RDNA3, especially for creators. I'm building a new workstation for my daughter soon, and it would be great to have something that can compete, ideally at a slightly lower price point. Additionally, I really don't want to give Nvidia my money; they have a great product but very questionable business practices, and I just don't like the way they treat the consumers of their product (and AIB partners, for that matter *cough* EVGA).
I don't want to pay tax; I don't like my government and the way they treat people either. *cough* 8.1% inflation.
I'd like to see the 4090 down-clocked to a 350W max and see how it compares to a 3090. Nvidia's command-line driver utility (nvidia-smi) should let you do it.
Yeah, when I buy the 4090 I'll lock it to a max FPS so I get some energy savings in return. The downside of the 4090 seems to be quite high idle wattage; it tops the list.
I'd actually like to see test results at the low end and at idle for 2D productivity work. I'm not sure how low it can actually be throttled. Maybe, just maybe, the card gives people options to either run at bleeding-edge speeds or scale down for work and save on electricity. I haven't seen any PC press look at it this way yet. Everyone's been "OMG so much POWER!", but it's actually completely optional that the card runs at these speeds.
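For anyone wanting to try the down-clocking experiment above, a minimal sketch using the `nvidia-smi` utility that ships with the NVIDIA driver (needs admin rights; the 350W value must sit within the limits the card reports, and the block is guarded so it exits cleanly on machines without an NVIDIA GPU):

```shell
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi -q -d POWER        # show the card's current, default, min and max power limits
  sudo nvidia-smi -pl 350       # cap board power at 350W for the comparison run
  sudo nvidia-smi -pl 450       # restore the 4090's stock 450W limit afterwards
else
  echo "nvidia-smi not found; NVIDIA driver not installed"
fi
```

The cap is not persistent across reboots by default, so re-apply it (or script it at login) if you want the lower limit to stick.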
Definitely putting this video on a loop at the store today. Or I will be answering all these questions today.
With you 100% on the partners' designs of the cards. The only two I would consider are the ASUS TUF Gaming and the Founders Edition.
Alright, I take my words back; the days when every generation shaved 250W off the power limit are gone :(
Very helpful. Concise and well-organized review. You're getting high performance results on a gen4 mobo and a gen4 CPU. Although I wonder if you'd get the same results with a Ryzen 5950x.
Nice, can you please tell me what music played at the beginning of the video? I loved it and want to hear the full version.
It was from Artlist, not sure what the title was called again :)
When doing email/browsing, what is the power usage of a 3090 and a 4090? Are they the same when just cruising along? Or are we going to see 225W even when idling away on the 4090?
No no, it'll idle at a few watts, I saw it at like 8w lowest if I remember correctly.
@@theTechNotice I wonder why no other reviewers ever mention this? So thankful you took the time to share it! Surprised how low the wattage is when it's only idling!
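If you want to check the idle number on your own card rather than take anyone's word for it, a sketch using the driver's bundled `nvidia-smi` (readings vary with monitor count and refresh rate, so treat any single sample as indicative only):

```shell
# Sample board power draw and GPU utilization every 2 seconds; Ctrl+C to stop.
nvidia-smi --query-gpu=power.draw,utilization.gpu --format=csv -l 2
```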
PLEASE HELP ME CHOOSE A BUDGET OPTION:
3060 (12GB), 3060 Ti (8GB), or 3070 (8GB)?
I'm only doing DaVinci Resolve 4K with the H.264 codec from my Pixel 3 XL phone.
My setup: 1060 (6GB), 64GB RAM, 6-core/12-thread CPU.
Thanks for any comment here!
Awesome video. Thanks again for the info and the work you put in.
One of my favorite designs is probably the Colorful Vulcan, that one is nice looking and comes with a mini OLED screen that can show temps and other information.
Excellent video! I have a question: could you make a video showing the differences between a system using an RTX 4090 and the same system using two RTX 4080s? Would there be much of a difference in performance within the programs?
Thanks for the content! Keep up the great work you've been doing =D
Still, a 3090 Ti vs 4090 comparison would be great, to know which one is the better deal now.
The 4090 is a sick piece of tech. People were already ungrateful for the 20 and 30 series, bitching despite the ridiculous performance increases over the years. Hell, people still try to get their hands on a PS5, which has the GPU power of a 3060/2080. And in the end everyone tries to get their hands on these new cards. Hypocrites.
Much love to NVIDIA for this next huge step.
I just got the MSI Suprim Liquid X RTX 4090 in my 13700KF, 570, DDR5-6400 system and hit the top 120 systems in the world in benchmarking (Superposition), and I can get 140 FPS in Cyberpunk 2077 at 4K with everything maxed out. Loving it.
Also, please vertical-mount your 4090; it looks badass and it protects the card from damage. The HYTE Y60 is a small, beautiful case; I got the white one with red RGB and black accents and it looks out of this world. Got my 4090 in push-pull with a 360mm rad in the top cooling the CPU. The build cost me £3859, and I've already had a mate offer me £5000 to buy it; I refused, of course.
How is coil whine on the Asus card?
Thank you man, love your channel, keep it up!
Thank you very much for the live playback performance comparison at 11:47.
Finally found my video. Thanks 🙏🏽
Thanks! Good review! For content creators it's a no-brainer. Btw, it's OctaneBench 😉.
Hi, could I check whether the CPU you used for the benchmark testing was a 12th-gen Intel i7?
Yes
Hi, thank you. Could I ask whether there would be a significant improvement if the CPU were an Intel i9? Is it worth the extra cost for 3D modelling and rendering?
I just love how people say it's not worth it... I mean, if you are on a budget, don't look at the 4090. And if you use it for work, then the money you save in time is always worth it. If you are just rich, why even bother thinking about it? Buy the best :D
Thanks for all what you do for your community :)
I'd like to know if the GPU has an impact on playback in general, in Premiere Pro?
For now, I'm using a GTX 1660, but sometimes the effects on the timeline are a bit buggy. I don't really care about exporting faster, but by changing the GPU, will playback improve, even a bit?
Technical question, I know.
Thanks in advance!
I'm not really into creator stuff, you know, but you can get an RTX 3060 to reduce and minimize stutters in Premiere Pro.
It does help. You still need to use proxies if editing 4k or 8k content for smooth playback.
Performance wise and thermally, is there a gap between the FE and the partner designs? I absolutely hate this generation of designs and the massive prices on the aib cards but because it's for work performance is king for me.
No, not a huge difference, if any at all :)
It's interesting that people worry about the appearance of the card.
I guess if I spent my time looking at the GPU instead of using it for its intended purpose, I might care about aesthetics. My concern for the GPU's appearance lasts as long as it takes to get it installed. Once I'm using it, I couldn't care less.
People were wrong because some people like to complain and the rest are sheep. I am annoyed by some big tech YouTubers' constant whining about new tech.
Great video, too many gamers think these cards are made exclusively to satisfy their needs when gaming. 😅
This thing launches today, right? How and where do I buy it? Nothing appears available.
It launched a little over an hour ago. Availability obviously depends on where you're from. Assuming you're from America they went live on newegg and Best Buy this morning and are now sold out.
Oh thanks, bud. I've been looking all morning on those two sites and even set auto-notify for availability. Such a pain in the butt.
@@blueskys6265 Damn, I'm sure you'll be able to get one in the next round if not this round. They sure seem to have a lot more stock and without crypto/scalpers it should be easier.
@@azyndragon89 Thanks a bunch. I’ll just keep looking. One will show up on the apps at some point.
An i9 9900K with a 4090: will there be a big loss of performance at 1440p? Bottleneck?
13900k 4090 monster build
Wow, 268W in Octane is crazy. This new generation of CUDA cores is unbelievable.
Ridiculous counter? :) Great video!
Bro, I can guarantee that you will reach 200k in a very short while.
Thanks for the review for creators. It's a useful overview. :)
I hoped it would be a review of the TUF card. I think you missed a lot of details, like idle wattage (where the 4090 uses a bit more than the 3090), the fact that the ASUS is the only card with 2 HDMI ports, and, for the last part, whether it uses 3 or 4 power connectors, as that depends on whether it goes beyond 450W when overclocking.
What's the difference between the TUF and the TUF OC model? All the cards look very much the same, and looking at the INNO3D OC model, it wasn't even able to overclock. So what does this OC mean in ASUS terms?
And then we look at performance-per-£. Here in the UK I can pick up an RTX 3090 on eBay for ~£725 (some, on rare occasions, go for £650); compare that to prices from any vendor in the UK, where the LOWEST I can get an RTX 4090 for is £1800... and they go up to £2300. (Note: the MSRP you see quoted in the US does not include sales tax, so add another 5-10% to the price tag; it's not $1599, it's closer to $1700 USD, which, btw, is more than the cost of every other bleeding-edge component I could put into a new build.)
So here in the UK the RTX 4090 is 2.5x the price of the RTX 3090 (using the numbers most favourable to the 4090; I could have made it look worse). Even taking the best score comparisons you showed (which fluctuated between 1.9x and 2.3x the performance), it comes down to...
1. The RTX 4090 consumes more power (other channels have measured from the wall, not from inside Windows as you have)
2. The 4090 gets better performance-per-watt
3. The 3090 gets better performance-per-£
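The arithmetic behind points 2 and 3 can be sketched with the figures quoted above; these are this comment's numbers (£725 vs £1800, and an assumed midpoint of ~2.1x relative performance), not independent measurements:

```shell
# Performance units per £100 spent, taking the 3090 as 100 performance units
# and the 4090 as 210 (the assumed 2.1x midpoint of the 1.9-2.3x range).
awk 'BEGIN {
  printf "3090: %.1f perf units per £100\n", 100/725*100
  printf "4090: %.1f perf units per £100\n", 210/1800*100
}'
# 3090: 13.8 perf units per £100
# 4090: 11.7 perf units per £100
```

So even with a generous 2.1x speedup, the used 3090 still delivers more performance per pound at these prices.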
So the real answer is: what can you afford? I've seen sites say "most will just spend a tiny bit more to get the 4090", but at nearly £2000, most of us can't drop that kind of cash on a GPU every couple of years. It's utterly ridiculous.
Basically, you would only buy the 4090 if you are going to make money from it (or if you just have the spare cash lying around).
In my eyes, it's a golden opportunity to get into the 3090 if you are still on a 20-series card. Buying that 4090 is going to cost you £1000+ to rent it for a year (if you flip it just before the 50-series comes out), or, if you like to keep your cards forever, you're going to absorb nearly the full cost of that card. Ouch.
I honestly think the best move for all of "us normal folk" is to NOT buy any 40-series stock. Let Nvidia control the stock levels... I would love to see them sitting on surplus inventory, drowning in pools of their own sweat. Sure, let the content creators and professionals buy it; they have a reason to, it's their livelihood.
But the rest of us? I think we should vote for lower-priced GPUs by starving team green (and red). I have 4 TVs in my house that collectively cost me £3000; my family gets way more use out of them than most of us would ever get out of a GPU, and they'll last 5+ years.
The GPU market just doesn't add up. People are wasting cash on these GPUs because videos like this make them seem like a reasonable purchase. Please stop supporting this lunacy with positive messaging like this.
@David Fidler This is the most sane take on the RTX 4090 I've read all week. Everyone on Reddit is just pushing the 4090 for gaming, not workload/cash flow IRL, and they don't take the actual cost of electricity into account.
I live in Romania; it's not a third-world country anymore, but salaries are still among the worst compared to the rest of Europe. We pay 26-27 cents per kilowatt-hour in a household, and our salaries are stuck at 700-800 euros max.
People have become so ignorant today, it's crazy.
I totally agree, the Founders Edition has the best design, but this Asus TUF looks nice too. Most other designs are for gamers as that's the majority of people who will buy these cards, much more than professionals. I'm a gamer too but I really prefer a more professional looking product over a flashy RGB one.
Same. I really want the FE or the MSI Suprim X. I have a Gigabyte OC coming in, but I'm thinking of cancelling.
As a reviewer, I like the fact you noticed the error in Asus coordinates listed on their GPU 🤦♀🤦♂😅😂🤣🤯
13:13 I'm not sure what testing platform you used for PugetBench for DaVinci Resolve, but looking at the user results on the PugetBench page, most standard-test results for the 3090 are about 2000, and most results for the 4090 are about 3000.
The main problem with the 4090 is the 3090. Unless you only game at 4K, get a 3090 or 6900 XT. Also, many 3090s are going on sale new between $799 and $980, and lower used. You can build an entire 3090 gaming computer for the price of a 4090, especially if you can't find a 4090 FE.
You know this channel is about content creator right?
I still prefer seeing 60-70% better ray-tracing performance in games, as running anything but max graphics settings defeats the sole reason to get a card like this for gaming anyway. It's also why I'm not fond of the AMD 7900 XTX, as it doesn't care about anything but raster, which is quite short-sighted imo. The 3090 would have to be a really good deal for me to consider it; it's over 2 years old now and has its share of problems, some cooling hotspot issues etc., and overall it's just not as high quality as the 4090, even without considering the performance upgrade.
If the 3090 performance point is what's relevant, then I'd rather wait for 4080 cards to get in stock and drop in price a bit.
As much as you stream, you should check out a TriCaster Mini from NewTek. It has a feature where the cameras follow the mic you're wearing.
I think creators don't care what the card looks like; cost, stability, and compatibility are more important.
Will the ASUS 4090 TUF fit in the Lian Li O11 EVO case? Some people say it might hit the side panel.
My 3090 used to hit the side panel; I ended up getting a PCIe 4.0 riser cable and all is good with the world. Even if you manage to get this one in, you'll never be able to fit the power cables.
20:26 Even though my future 4090 will never see daylight and I couldn't care less about the designs, I still kind of agree...
My TUF Gaming RTX 4090 arrived today 😁 I just have to wait for the new power supply, which arrives on Saturday, and then I can finally run the 4090.
I'm a Blender user, but 2x performance is irrelevant if the price is too high.
And for all that money I still have to deal with goddamn LEDs.
The 4090 costs $2800 on eBay, the only place it's sold right now. There is no bang for your buck.
Well, let's hope for the best; it's only been launched for a few hours...
Did Nvidia state that the 5000 series will be double the performance of the 4000 series, or was that speculation? That definitely affects my decision to jump.
The ASUS TUF is $1799 plus taxes for a 40% increase in video-editing performance. You can get a 3090 (I just got one yesterday) for $830, taxes included.
Less than half the price.
It's a great card, don't get me wrong, but since for Nvidia "Moore's Law is dead", they will charge you for it. Meaning in a couple of months we'll get a 4080 12GB with less performance than a last-gen 3090 for the same money.
Your comment was spot on! Though it's the same performance for MORE money! haha
Waited outside Best Buy at 4:30 this AM only to be told they had nothing in stock. Rushed to a Micro Center and got the Gigabyte OC, as the TUF was already taken. Now I see a bunch of scalped cards on eBay with $1k+ markups, insane. Best Buy and Nvidia seem to have no interest in selling directly to consumers. Grateful I got one, but very frustrating with Best Buy and the FE edition.
Nvidia also stated it's 2-3 hours of measurement for the heat percentage, not initial heat measurements.