It's competition. The 20 series might not have been as much of an improvement because Nvidia ballooned the size of their chips in order to fit in all the extra RT/Tensor stuff, but the 30 series returned more to normalcy. However, the actual threat of competition from AMD led them to be as aggressive as possible with performance, with less regard for how much power they target per tier. Power/performance has never really gotten worse since the 10 series, you just have to be willing to apply your own limits to the cards. Take a 2080, tune it to draw no more power than a 1080, and it will still be plenty faster. The 3080, even more so. What will be really interesting to see is the 12GB "4080", which has a die size smaller than the 1080 (and is barely larger than a 3050/60), but has more transistors than a 3090 Ti. It's just a major shame they're trying to fleece us for it with that absurd $900 MSRP.
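To make the "apply your own limits" idea concrete, here is a minimal sketch of capping a card's board power in software. It assumes the nvidia-ml-py (pynvml) bindings and admin/root rights, and the 270 W figure is purely illustrative, not a recommendation for any specific card:

```python
# Sketch: cap an Nvidia GPU's power limit via NVML (requires admin/root).
# Assumes the nvidia-ml-py package ("pip install nvidia-ml-py"), imported as pynvml.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)  # milliwatts
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"current limit {current_mw/1000:.0f} W, allowed {min_mw/1000:.0f}-{max_mw/1000:.0f} W")

target_mw = 270_000  # illustrative 270 W cap, clamped to the board's allowed range
pynvml.nvmlDeviceSetPowerManagementLimit(handle, max(min_mw, min(target_mw, max_mw)))

pynvml.nvmlShutdown()
```

The same cap can be set from the command line with `nvidia-smi -pl <watts>`, which is usually the simpler route for a one-off tune.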
All of this hardware (both CPUs and GPUs) is already extremely efficient; manufacturers just crank the hardware to its limit, and the power increases exponentially when it's close to that limit.
The RTX 4090's raster performance, and with DLSS, is pretty impressive, but for a whopping $1,599 I could buy a whole PC for that price, and with its 450W+ TDP, even that $1,600 PC could consume the same power as the RTX 4090 or less.
@paradox I bought it. I will legit have a heart attack if AMD has a better card for a lower cost, because the 4090 was hella expensive AND I had to stretch my budget A LOT.
I swear, Nvidia just radiates so much utter contempt for their customers, partners, and worldwide power grids that it's actually giving Apple a run for their money.
@@manjindersinghsaini911 6900xt is a joke bro imagine buying that 💀💀 the 6950xt had half the performance of the 3090 in most games. Even worse in productivity. What a joke
Edit: Anthony has posted in the comments about this issue and confirmed it wasn't DLSS but a different setting that wasn't applying correctly; new numbers, which are closer to what we see in other games, have been shared. Still a big uplift from the prior generation though :) I'm curious now how the other 40xx cards which are closer to the current 30xx pricepoints will compare to their past generation's numbers. The rest of my comment can be disregarded in light of the corrections coming from Anthony/LTT. There was another reviewer who mentioned that Cyberpunk seemed to have issues with changing settings, and it took several attempts of changing the setting and restarting the game to get it to run with DLSS off on the new Nvidia cards. DLSS being the thing that "creates fake frames", it severely affects framerate measurements if it's not actually turning off during the measurement. I'm not saying that has to have happened here, but it wouldn't surprise me. I would love if you could double check it for us :c
They said at the start of the video that they were using older drivers because the new drivers were faulty. That could be the difference; they also didn't use TAA in this video like DF did. They showed DLSS 2.x on and off in this video. With ray tracing and DLSS, performance was higher than with DLSS off and ray tracing off.
@@mrlacksoriginality4877 The charts said DLSS off on both versions (and he clearly says "without DLSS" @5:35). Only RTX was on in the second graph, not DLSS. But if DLSS had secretly been on in both (not even like they meant to leave it on, just because the drivers/cyberpunk have a known issue with leaving it on even though the settings say it's off) it would inflate the framerates drastically like this and invalidate the test results. If they had a separate measure where they claimed DLSS was on it would be helpful, but there were none in this video so I'm not sure what you're referring to when you say you saw a comparison with DLSS on in Cyberpunk from this channel. TL;DR: DLSS 3.0 (not 2.0) uses an AI to "guess" what frames should look like, and inserts a "fake" frame between each pair of "real" frames. It does this at the cost of adding measurable latency and a risk for visual artifacting / loss of clarity (as the reviewers noted in this video). I suspect the average player may prefer to play games with DLSS 2.0 instead of 3.0, or with DLSS off all together. Regardless though, if a game (like cyberpunk) were to accidentally run with DLSS 3.0 enabled, even though the reviewers checked and thought it was off, then that game would be able to roughly double it's FPS. And in comparison to the performance increase of other games which weren't lying about whether DLSS was on or not, it would show a much more massive performance jump from generation to generation. Which... is exactly what we see here, if you halve the 40-series performance in cyberpunk it's still a big increase, but it's much closer to the performance increase we see in the other games being benchmarked in this video.
@@Night_Hawk_475 They just posted about the issue. It didn't have DLSS on, but FidelityFX upscaling was on because of some technical issues. At the 6 minute mark you can see DLSS on btw.
the fact is that you never really NEED the fastest gpu to play new games at the highest settings. you can go a few rungs down (even a few generations) and still do just fine. after a certain point the added performance isn’t enough to justify the price
I am sure you are talking about 1080p here, which can run high/very high settings at 60 fps on bloody any GPU. The moment you go 1440p the requirements jump quite a bit, and when you add a 144Hz monitor they jump even more, WAY more than the resolution increase. That is where I am at: my 1070 Ti cannot run anything at 144 FPS, it could barely run Fortnite at very high when I bought it, but sure, that's just one game; in PUBG I cannot reach good quality and over 100 FPS, it's a trade-off, same with Warzone, etc. Then you have people, and I will be one of those one day, who are at 4K, and some aren't even targeting 4K 60 FPS but more, and NOTHING but this 4090 can run 4K at above 100 FPS. I am not advocating for such absurd prices or the 4090, but your statement is wildly incorrect. When someone plays at 1440p for a while, they don't want to go back to 1080p, same with 144Hz vs 60 FPS, and just reaching 1440p 144 FPS is bloody absurd, and that isn't even that new or premium or whatever; 4K and refresh rates in the hundreds are the niche. Yes, a lot of people play at 1080p 60 FPS, but a lot of them have never even tried anything higher.
I think it only matters to eSports gamers who need max frame rates and enthusiasts who like cranking all the settings to the max. I honestly turn off a lot of settings in every game which I believe add bloat and a tacky look 🤷. But it definitely would be nice if you had a lot of money 👍
Correct, on most games the jump from 1440p to 4K is not that noticeable, same between the high and ultra presets of most games. We literally dump electricity for very diminishing returns with most people unable to even tell the difference. (But swear up and down that they can which drives a lot of toxic elitism.) The chase for the ultimate hardware has gotten real mindless at this point with bragging rights the only metric that matters. All to play the same single player games that thrive at 60fps and the sweatfest online games like Apex legends and COD Warzone that look the same plus are so overwhelmingly infested with cheaters that an investment in the fastest GPU money can buy at the grossly inflated margins Nvidia asks for is completely unjustifiable.
Can you believe we would rather Intel enter this market to bring competition and theoretically help "fix" this problem than see Nvidia continue down this path? It's actually amazing… and I'm here for it. I hope AMD's and Intel's drivers come to play ball.
That right there is why Nvidia wasn't permitted by regulating agencies to acquire ARM. Jenny demands total ownership and control, anything less is inadequate for him. He is just like Steve Jobs.
I'm a simple man, I see Anthony, I click video. I know it gets shot down all the time, but would love to see a Linux Tech Tips with Anthony if he's willing to do it.
I went with a sim racing 3080 for $400. Unless I absolutely need the fps -- I have a hard time justifying >$500. Things like this definitely keep the console market alive.
@@idwithheld5213 the 5800X3D is a drop-in replacement for his 3600, no new motherboard or anything else needed, and it is one of the absolute best gaming CPUs. Maybe the 13900K can beat it on paper, but there is no smarter or better upgrade path for him right now, unless money means nothing, and for you giving "advice", of course someone else's money means nothing to you!
With prices ever rising, I think it's time that tech YouTube starts to refer to GPUs not by their names, but instead by the most useful thing you can buy for the same price. This Honda Prelude review is pretty good, for example
Thank you Anthony for finally being the voice of reason on the lack of DP 2.0. Maybe if this was discussed earlier and there was more of a consensus among reviewers, we could have pushed back on this anti-consumer tactic. I'm sure Nvidia will be more than willing to sell me a 4090 Ti for $2099 in 6 months that actually supports DP 2.0
@@platinumjsi You can just use the HDMI 2.1, which is probably what most people getting this card will be using, though I question why there is just 1 HDMI 2.1 port but 3 DP 1.4 ports.
They did the same thing when they didn't support HDMI 2.1 on the 20 series cards, which I missed when I bought the LG C9, to the point of thinking I should upgrade to the 30 series. So yeah, I'm not gonna fall for the same mistake again; future proofing is a must, so I will wait and see what AMD is gonna offer. It feels like they did it so they can add it on the next generation to justify upgrading, not to save a buck
What I took out of this video are the impressive stats for the 6950 XT: not a perfect card, it's got its pros and cons, but the margins are smaller than most think, and knowing it was 50% cheaper than the 3090 and is trading blows in many titles with the 3090 Ti, which costs almost double, is a big W for gamers and got me really excited for RDNA3 coming up very very soon! Been riding with Nvidia so far but I've been considering jumping ship these last few years
AMD GPUs really do age well because of the driver updates. You can see that as a good point, or just see AMD cards as unfinished when they come out. Vega GPUs became a lot better with new drivers, as an example
No DP 2.0 support basically makes this gen an extension of the 30 series. The 4090 should be flagship performance material, yet they won't offer the latest DisplayPort tech. I'm also considering jumping ship after this gen for sure.
In a few years when the 6090 launches, hopefully competition will have forced prices back down closer to $1000, and I might be able to justify buying one so I can start gaming in 4K.
To be fair, I see the prices coming down. They can only keep charging so much when AI upscaling starts to do most of the work. The actual hardware is likely to plateau because there's just no need for monster cards with 600w power draw when the majority of the frames are software/AI based.
@@ignacio6454 yeah, or $500, since everyone's not buying it because electricity is getting more expensive, and more GPU power wasted for nothing won't attract anyone
I really appreciate showing off the benchmarks with DLSS OFF. I'm not against using DLSS personally, but my "goal" is always to play without it if possible.
Game graphics have been at a stalemate for so long that these gimmicks seem more worrying than exciting. Anything above 1440p is just unnecessary unless your display is over 70 inches
I remember when the flagship GPU 5-6 years ago (980 Ti ASUS ROG) was $650… no one can say that this performance isn't amazing, but the price is just ridiculous. While other hardware manufacturers have maintained prices or even lowered them for better performance, Nvidia has tripled the price
I paid $500 for a Geforce 2 Ultra in 2000, which is close to $800 today, it was outdated in less than a year. Kids today don't realize they had a good run of cheap components for the past decade, those days are over. Chinese workers don't want to be slaves anymore.
Yes, the price has increased a lot, but price to performance also increases every gen, and in most cases performance per watt too. I too hate that the prices are so high, BUT you've got to consider that back then they didn't have a 90-series card, nor was COVID a thing, which drastically affected inflation as well as the cost of goods. Here in the UK, for example, I could easily get 2 litres of milk for £1.50 back in 2016/17; now it's more like £2 minimum for the same amount. Another thing I remember the price of back then was a 2-litre bottle of branded soda such as Coca-Cola/Pepsi/Fanta, which was always around £1 minimum (except Coca-Cola, which would only be that cheap when it happened to be on offer). I know there is a tax on sugary drinks now, but sweeteners aren't taxed like sugar, and back then both sugary and sugar-free were around the same price. Even sugar-free, which I think is supposed to be around 20% cheaper (don't know the exact number), is now around £1.50 for a 2-litre bottle, and Coca-Cola, usually the most expensive common name brand, can only be had that cheap as part of a multi-buy offer (such as 2 for £3). Inflation has ruined the market, but even then Nvidia is pricing these a little too high I think. Accounting for inflation, an XX80 Ti nowadays should be around $900 MSRP maximum, and the regular XX80 should be like $750 max.
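For a rough sense of how far inflation alone goes toward explaining the new prices, here is a back-of-the-envelope adjustment using the cards mentioned in this thread. The CPI multipliers are approximate, rounded figures assumed for illustration, not official numbers:

```python
# Rough inflation adjustment of past GPU prices (from this thread) into ~2022 dollars.
# The cumulative US CPI factors below are approximations for illustration only.
approx_cpi_to_2022 = {2000: 1.70, 2011: 1.32, 2015: 1.24}

past_cards = [
    ("GeForce 2 Ultra", 2000, 500),
    ("GTX 560",         2011, 200),
    ("GTX 980 Ti",      2015, 650),
]

for name, year, price in past_cards:
    adjusted = price * approx_cpi_to_2022[year]
    print(f"{name}: ${price} in {year} is roughly ${adjusted:,.0f} today")

# Even adjusted this way, none of these land anywhere near the 4090's $1,599,
# so inflation explains only part of the jump.
```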
@@neoasura They never wanted to be underpaid and overworked. Also, prices are not going up because they're getting treated (much) better. Prices are going up because their labor force is retiring and dying off thanks to the One Child policy. Also they've been pushing _hard_ to up the standard of living, so their prices are also going up a bit naturally, as well. Other less developed countries have cheaper labor, now.
120 FPS is enough for 4K. 120 fps and 144 is not a big difference. It makes no difference as long as you are not blind. By the way, the RTX 4090 has HDMI 2.1, which supports 4K 144Hz, so I don't know what you're talking about?
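A rough bandwidth estimate shows why 4K 144 Hz is comfortable over HDMI 2.1 but needs DSC or chroma subsampling over DP 1.4. The blanking overhead and effective link rates below are approximations; exact numbers depend on timings, bit depth, and chroma format:

```python
# Back-of-the-envelope: uncompressed 4K 144 Hz, 8-bit RGB vs. link capacity.
width, height, refresh, bits_per_pixel = 3840, 2160, 144, 24
blanking_overhead = 1.10  # ~10% for reduced-blanking timings (approximation)

needed_gbps = width * height * refresh * bits_per_pixel * blanking_overhead / 1e9
dp14_effective_gbps = 25.9    # DP 1.4 HBR3 after 8b/10b encoding
hdmi21_effective_gbps = 42.6  # HDMI 2.1 48G FRL after 16b/18b encoding

print(f"4K144 8-bit RGB needs ~{needed_gbps:.1f} Gbit/s")       # ~31.5 Gbit/s
print(f"DP 1.4:   {dp14_effective_gbps} Gbit/s -> needs DSC or chroma subsampling")
print(f"HDMI 2.1: {hdmi21_effective_gbps} Gbit/s -> fits uncompressed")
```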
OK, these test results are VASTLY different (referring to 4K, ultra, no DLSS, RT Ultra) to what some other reviewers are getting. Not slamming this video at all, just a reminder to watch other reviews. Test benches definitely impact the results, it seems. Wild card, wild money. Will be waiting lolol Edit: good grief, this is the most interaction I've ever had with my comment lol but yeah, almost a 4x increase on that Cyberpunk performance when others are seeing around a 40% increase (which is still bonkers) definitely raises an eyebrow. Gonna be an interesting WAN Show I guess lolol
Check out Igor's Lab's test report... he tested the card with an AMD 7950X CPU (the fastest current CPU you can get) and the only CPU it should be tested with as of today (Intel Raptor Lake's 13900K has not released yet). The videos from Igor's Lab may be in German, but his tests also come with an English-translated 12-page article with all graphs, benchmarks etc. on his website. He is probably the most respected tester/reviewer around in the industry; even Linus Tech Tips, JayzTwoCents, Gamers Nexus, der8auer and other reviewers constantly refer to his testing and give him credit for his work.
Video-Encoding: for streaming h.26x and AV1 are important/the future... but for video-editing you often have to cope with ProRes RAW. (Apple) ProRes is a codec supported by Apple's SOCs of course - but I have never seen it being implemented on GPUs outside Apple - though you find it on a lot of cameras (Sony, Fujifilm,...). Is Apple prohibiting alternative hardware-encoders for "their" codec, or am I missing some other reason?
Add the 40hz mode to that. It's no 60 but it feels really good compared to 30hz. I've played Valheim, Satisfactory and Cyberpunk, all felt really good for such a lightweight device.
If the RDNA3 x800XT can stay at or below $800 and delivers comparable results it will be the new high-end king. The 40 series lacks competitive pricing. $700 was right at the edge for most gamers. I had friends that bought the 3080 with money saved up over the summer. I can't see that happening with a $1200 card in the same class just one generation later.
Not really, it will fly off the shelves, don't underestimate how much money people are willing to spend on PC hardware worldwide; if this was priced at $2999 it would still sell out, not in all countries, but US and EU for sure. For business use it's a no-brainer, time is money and this cuts a lot of time, so the price is not a deciding factor; for streamers and content creators who always want/need the latest and greatest it will also be a no-brainer, basically anyone but people on a budget, and those people should be buying the 30 series according to Nvidia. If you divide FPS by dollars the 4090 is the cheapest GPU you can get, so a $1199 3090 Ti or a $1599 4090 that has 2.2x the performance? If we start seeing 3090 Tis priced at around $700-800 USD, then yeah, no reason to get a 4090, but that's not the scenario today.
@@shippy1001 it'll do as well as a niche product will, like Threadripper. That was popular and commercially viable too, but definitely not moving the same volumes as the other Ryzen models. Dedicated servers and workstations likely won't switch to this from their workstation cards like the Quadro or MI series. Supercomputers that already use the MI series are not switching to this. Many of those partners are contractually bound as well. The RTX 4000 series is for enthusiasts, not for businesses. Streamers, like you pointed out, probably have the most draw, but honestly they can just get a cheaper GPU as a hardware accelerator for their streaming purposes, which is a much smarter decision. So sheep will probably buy these, but they probably won't move the same volume as previous gens for sure.
@@shippy1001 It will fly off the shelves because, as Nvidia already said, they are keeping inventory low to create false scarcity. For business... it depends. If you're large enough that you need it, you're also likely large enough to be looking at Quadros. Streamers can justify it somewhat, but most professional streamers already run double setups, to the point this doesn't mean much. Plus the extra image quality won't be translated to the streams anyway. On top of that, at some point you just have to factor in rising energy costs and cooling solutions. It is a good product in the sense that if you have the money and a very specific use case that will benefit from the extra performance, and don't care about energy or heat, it can be beneficial, as long as you're not large enough for business solutions. But at the same time, the actual market that checks all these boxes is very, very small. A good product under certain circumstances? Yes. A no-brainer? No - especially because the lack of DisplayPort 2.0 is one massive downgrade, especially for streamers, who are the people who would benefit most from this card. Add to it that it doesn't have support for PCIe 5, which, y'know, is quite important if you want high bandwidth to, I don't know, move around large pictures in high resolutions with minimal compression at very high speeds, and you might have your buyers wondering if they really should invest or wait for next year. The 4090 is an enthusiast/halo product. It's something to sell the brand, and Nvidia is making sure to price people out of it in order to become the de facto "premium" brand, like Apple does.
@@kenhew4641 Are you unaware that there are separate enterprise product lines from Intel, AMD, and NVIDIA? The US Government has already invested in AMD MIs, EPYCs, and Nvidia's A100 GPUs. THOSE are enterprise market products, not the RTX 4000 series LMAO. That's exactly what makes these RTX 4000 cards such a niche product, like Threadripper.
@@whdgk95 That's a different scenario, and in those Ryzen situations yes, you are absolutely correct, but for rendering, VFX, and even streaming, the 4090 is still the much better value proposition, and don't get too caught up on the DP 1.4 thing, it has HDMI 2.1; even Nvidia knows that using PCIe 5.0 and DP 2.0 is a niche. Most professional office environments like indie game devs or VFX artists run dedicated PC desks around a warehouse; they simply buy a new system for the best developers and move the 1-2-year-old system to the newer guys, and the 4-year-old systems are sold/traded. A dedicated HW accelerator is too much of a hassle to deal with; a single powerful GPU is much easier to work with and you will get more value out of it. Just to be clear, I'm not defending Nvidia's practices, just explaining that this product, even priced as it is right now, is still a good deal for people/businesses who have the money.
The DisplayPort 2.0 thing is insane. With a card like that you want it to be relevant for as long as possible at that price point, and it's starting off a gen behind. What a joke.
It's all on purpose. People who are gonna purchase this card will buy the next 4090ti or 5090 with dp2.0 without hesitation. Giving Nvidia their money hand over fist.
@@primegamer321 When? AFAIK there literally isn't a single model available for sale currently, most of the really good displays just have HDMI 2.1 and DP 1.4a
I don't have an explanation for the frame rate differences. If I had to guess though, I would say it might be related to the drivers. Nvidia drivers often contain specific optimizations and profiles for each game. It could explain why some games perform really well, while others perform poorly. Having the best hardware can't fix bad software and bad coding. Often bottlenecks and performance problems are purely software issues (I'm a software developer).
@@AtticusHimself he is right though, you people seem to focus more on hardware and less on software which also causes bad frames. Hardware is not to blame.
7:47 It's interesting to see how the RX 6950 XT has a massive advantage over even the RTX 4090 in some categories. Much of it is presumably a matter of optimization, but still, it goes to show that performance doesn't exist on a simple continuum. And it underscores the importance of choosing hardware optimized for your own use cases.
I did an AV1 encode today and the 14GB video dropped to 2GB and looked absolutely gorgeous, though it still lost some very minor detail. Still couldn't believe it when I saw the file size. Though I didn't have a hardware encoder, so it took like 8-9 hours lol.
Pretty sure they made a mistake at 5:28 when they showed the Cyberpunk performance (which imo is the most significant jump too). I also watched a few other videos, namely the ones by JayzTwoCents and by DigitalFoundry, where they measured the CP2077 RT Ultra performance at around 45-50 fps at 4K while LTT is measuring it at 97... which is higher than what the other videos are showing as the non-RT performance. edit: hardware unboxed... not digital foundry
@@Flukiest still... an almost 100% difference can't be because of a CPU bottleneck... especially when the CPUs being used are among the best of their respective brands
@@ahmedrauf8777 It can if you combine different CPU architecture _and_ different Windows versions _and_ different DDR RAM versions. You'd be surprised how much of a difference that can make. Hell just having different Windows versions can make a huge difference.
I hope AMD is able to compete with this. That's a huge improvement. On a different note, happy to see more Anthony and look forward to seeing more of him this launch season
@@stevieknight9225 I also got a 1060 and am looking for an upgrade, but there is no way I am gonna buy an amd GPU. I hope the 4060 is gonna be priced not too stupidly
as long as nvidia still has a massive stock of 3000 series cards, there will most likely not be a 4000 series besides the 4090, 4080 and 4070. err, i mean 4080 12GB.
@@endfm it's not about the frames, it's about all the other stuff. I would miss things like rtx voice, GeForce experience, having drivers that work well all the time and support everything etc. The fps are just the tip of the iceberg
For the Blender benchmarks it would be nice if you guys specified whether you used OptiX or not, as it can give a massive improvement over CUDA; it would be good to see both sets of results.
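For anyone re-running the Blender numbers themselves, here is a minimal sketch of toggling Cycles between CUDA and OptiX so both sets of results can be compared. It assumes Blender's bundled Python (the bpy API) and leaves the scene setup and timing harness out:

```python
# Sketch: switch Cycles between CUDA and OptiX before rendering (run inside Blender).
import bpy

def set_cycles_backend(device_type: str) -> None:
    """device_type: 'CUDA' or 'OPTIX' (other platforms use 'HIP', 'METAL', etc.)."""
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = device_type
    prefs.get_devices()  # refresh the detected device list
    for dev in prefs.devices:
        dev.use = (dev.type == device_type)  # enable only GPUs of the chosen backend
    bpy.context.scene.cycles.device = "GPU"

for backend in ("CUDA", "OPTIX"):
    set_cycles_backend(backend)
    bpy.ops.render.render(write_still=True)  # time this call per backend
```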
With how expensive Nvidia cards are getting, I am seriously considering buying an AMD card. I've been looking to replace my 1060 for almost 3 years now, and thanks to crypto miners, COVID (supply chain issues), silicon chip shortages, rising interest rates, inflation and the scarcity of 30-series cards plus the black market it created, the price of cards has been driven up almost 200%. Nvidia promised that a GPU should always cost about the price of a games console ($400-600), but they are still marked up way too high.
They're exactly the same. They're just matching Nvidia's prices; that's what they do, because there is no real competition with only two sellers. Buy the best AMD card and it's not even as good as an RTX 4080, at around the same price.
3:27 the numbers shown for the 4090 in the 4K ultra preset (136 fps avg) are a lot higher than what Hardware Unboxed showed for the high preset (83 fps avg). The CPU used here is a 7950X, and Hardware Unboxed used a 5800X3D, but the difference is wild. The RAM is also different... DDR5-6000 CL36 here compared to DDR4-3200 CL14.
I don't understand what you're complaining about when they use two different RAM configs and different CPU. Obviously the difference is high when you're using super high-end components
Same for the Cyberpunk 2077 4K RT Ultra DLSS-off numbers as well. I watched a couple of other tech channels and their results show a 30-40 fps average, which is much lower than this video claims. I think they left DLSS on somehow.
I got a 6900 XT for that very reason. Wasn't fussed about RTX and got one new for £700. I undervolted so the power doesn't go over 250 watts and the core clock stays at 2550 MHz. It has done me proud 👍👍
im still sticking with my 1070ti as it does great gaming in VR and 1440p, might be like 8000 series by the time i switch. always surprises me these people buy the 3090 and 4090 but still game on 1080p, like what you need that performance for? rendering hentai?
I have the 3090. Trust me, the 4090 is like the 2000 series: new technology. The 3090 is the refined version of the 2000 series, which means the 5000 series is going to be a refined version of the 4000 series. I would wait for the 6000 series; that would be the biggest jump in performance
@@MrPaxio The 3090 is the best 1080p card on the market, it's more than twice the fps compared to your 1070 Ti in most titles; even the 1% lows are higher than a 1070 Ti's max fps in modern games = stable performance. Plus buyers don't pair a high-end GPU with a garbage CPU, and that has a lot to do with the stable performance at 1080p, a little less so at 1440p. It's a night and day difference, but you need the monitor and CPU to support a higher-end GPU too.
@@SweatyFeetGirl Might be so, but how many RX 69xx/RTX 3090s are paired with an i7 7700 or a Ryzen 2600X? For lower-tier GPUs like the RX 6600 it is actually possible to end up in an older system.
I'm right there with you with my 1080TIs. I would probably shit myself if I got on to a new machine but for now, I still have a beast... to me. Also, I would have to build an entirely new machine. Shit. I'm still running an i7 8700k. Have yet to try and over-clock though. Might be time to try!
Great video. Just one thing: when you show DLSS, maybe you should consider showing FSR active on the AMD side and XeSS on the Intel side (like GN did with the 4090 review), because these technologies are so common now, and showing DLSS on with no AMD result could lead to consumer confusion. And yes, DLSS technically has a little more quality, but with FSR 2.0 AMD is really close, and even 1.0 is not that far off now with the fixes they put into it... Just saying, I hope whoever reads this has a good day :D.
@@robertkeaney9905 even then, the only people who should be complaining are the .01% of people that have the Samsung Odyssey G8 - the monitor that's 4K 240Hz and can hit 2000 nits peak. Now that I think about it... why would anyone buy a VA panel right when Samsung has their QD-OLED panels?
@@aHungiePanda Honestly, its probably because the other folks bought those other OLED's when they were on sale. There is a small but passionate market of well educated buyers who have the money to shell out for a Odyssey G8 year round. And then there's a bigger chunk of people who want to buy something high end because they just got their tax rebate. And will instead jump on any Oled that's on sale. Because On Sale, means a good deal. And Americans love good deals.
The price is already crazy, but the fact that they cut *any* corners on the functionality is basically unconscionable in my book. Someone buying this card would certainly be in the market for ultra high-refresh 4K, and probably even 120hz 8K when available (and this card can in fact handle high refresh 8K for games with lower demands and/or DLSS). And as said PCIe5 certainly has use cases and someone buying this card would be in the market for supported mobos. It's a testament to their attitude that they'd go this route on their "peak enthusiast" halo card.
It's great to see these insane levels of performance. But I thought spending $230 on my 1660S was a lot. The current prices are untenable, even at the equivalent tier.
In the Cyberpunk benchmark JayzTwoCents got 74 fps on the same settings vs your 136 fps; that can't be right, even if you triple checked? Could there be some massive difference in CPU performance or Windows stuff? I have had some cases myself in Cyberpunk where AMD's 'DLSS' (FSR) did not disable, since you have to hit apply before changing other settings for it to take effect
Yea, this seems crazy. I think I am just going to assume JayzTwoCents is more realistic and still get the card, if it performs better than that, then that's great too. But weird that the performance is so different.
I would love to see some triple-screen games when reviewing cards of this caliber. Something like Assetto Corsa Competizione on three 4K displays would probably even make the 4090 sweat. Some sort of VR benchmark would be really interesting too. Those two are use cases where you could excuse buying such an expensive GPU
@Bubo Bubobubo most games aren't too hardware heavy for it, but DCS is an exception to the rule. A lot of recommended setups are stuff like a Ryzen 7 5800X3D and an RTX 3080 and above, the more VRAM the better. Hopefully when vulkan gets added and multicore support it'll help a lot
@Bubo Bubobubo HL:Alyx is also surprisingly easy to run, it's not that rough on the hardware. DCS or MSFS can be brutal on systems in VR, and many larger-scale games can push the PC much harder
I wish you reviewed its performance with programs like Unreal Engine, particularly for Virtual Production applications. That’d be great info to have to compare against the 3090.
I once bought the GTX 970 for about $400 when it came out. I thought that was expensive. That wasn't long ago. Now I'm spending almost $1000 for a 3090. Used. This is ridiculous.
Purchased an EVGA 980 Ti on its release date for £570. I'm priced out of top-of-the-range GPUs today. I now wait some time before I buy them second hand.
I just got an RTX 3060 for about $350 with Cyber Monday deals. The GTX 970 now costs $340. The 3060 is 95% better on UserBenchmark - when did you get yours? But my motherboard could only support at most a 7th-gen Intel CPU, so I did have to overpay to get the i7 7700K while there were far cheaper newer-gen CPUs that matched or outperformed mine.
@@wizardemrys when did I get mine? My 970? As I stated, I got it when it came out, google search says late 2014 was launch. Currently using an ASUS Strix 3090, the best version of a 3090. I haven't overclocked it yet. I have an i9 9900K and game at 3440 x 1440p. CPU is maximum 73% under load even in the newest titles like Darktide, where even the GTX 700 series GPUs don't even work anymore. RIP I'll upgrade my CPU to a 13900K when the next generation launches and the prices drop. I'll have this i9 9900K for 5 or 6 years by that time.
We can see how much Samsung's process node has been holding Nvidia's cards back. Similar to what happened with the Qualcomm 8 Gen 1 vs 8+ Gen 1, where the performance difference was almost a generation ahead when comparing the two after Qualcomm switched back to TSMC.
I'm crossing my fingers for AMD to come out strong with their GPUs. The last thing I want on a top of the line card is compromises. If I'm paying 1600 I want DP 2.0 and everything else. Maybe a good and affordable monitor will finally release within 2 years so I certainly would like to have a card capable of running it.
@@jarodAl-rw1ts The DP 2.0 standard is for a very particular type of demanding top-tier panel: 4K 120Hz and up (8K, HDR, etc.). Every hardware choice should always be made for the panel you want to drive.
@@keonxd8918 unfortunately, not this gen. The 4070 is targeted to be around the 3080 performance level. The fake 4080 (12GB), which is really the true 4070, has 95% of the performance of the 3090 Ti. The 4060's performance is going to be on target at a similar level to a 3070, might be slightly faster, but slower than a 3080. The problem is going to be the pricing. Nvidia is pushing the price tag up on all next-gen cards. So the 4070 will more than likely sit at $699 and the 4060 at $499. If you want a sub-$300 4000 series GPU, you will probably have to wait for the 4030-4050 models.
It's weird how different your ray tracing results are from those at Gamers Nexus. There, the 4090 managed to show around a 70% improvement over the 3090 Ti - about as much as in the traditional rasterization scenarios. I guess that they tested a different section of the game, but in Cyberpunk, the 4090 gave them a mere 79 fps with DLSS! (4K upscaled from 1440p)
You know, every year since I bought the 960 I've been contemplating buying a higher-spec GPU, but I always choose to wait, because I'm a hard believer in Moore's law… 😅 I honestly think maybe next year it's time for me to stop this and buy another GPU, not because I think Moore's law is finally breaking, but because it has gotten to the point where GPUs can run any application I would want to run at the highest settings 😂.
Man, at this point, you should just build a whole new rig! Replacing just one component will invariably result in a bottleneck if the rest of your PC is from the GTX 960 era.
Holding tight onto my GTX 970, which has still performed like a dream come true since 2014. Whenever they bring some sensible prices to their products I'll jump in and do a full upgrade.
I wonder how many units this could push in Ultimate Epic Battle Simulator 2. I can do around 4 million units ( medium settings ) with my 1070ti and I know a 3080 can do around 30 mil ( on low settings )
I can see Nvidia pushing PCIe 5 & DP 2.0 in their refreshes. For me this means I won't upgrade to this limited generation. I'm still waiting on mesh shaders and DirectStorage to make their way into more titles, or even to a driver level like RSR did on AMD
Sure, the performance is going to be way better than my 3090 Ti, but that's the last time I will pay top dollar ($2300 AUD) for a card. It's not right, prices need to change... will be interesting to see prices once they are actually on shelves
You can still factor in the secondhand market by selling your 3090, if that's something you plan to do, so you're not "really" paying that price. I know here, $1599 for the Founders card (the one I want, due to waterblocks), I will be getting it for around ~$1000 after my friend buys my 3090. So for me that's a worthy upgrade.
Buying the latest and greatest PC hardware is always the worst because it's inferior in a matter of months. I don't see any point in shelling out top dollar for stuff that will age like milk.
@@PerplexedPhoton showing off :D also, if you are lucky enough to have a following on YouTube/Twitch you can justify it, because it will pay for itself over time when people throw money at you
@@PerplexedPhoton neither am I. I paid 2200 pounds for a ROG Strix 3090 OC White Edition; I don't need a 4090. Too late edit: you said you are a cheap ass, that's your choice; I was the same as you for a decade, not spending over 1k on a PC max
It's insanely expensive. But if you compare it to the MSRP's and more importantly real pricing of the 3090ti and 2080ti the price/performance is far better. The intergenerational performance jump here is insane. I actually think some of these numbers are ridiculously overkill for almost anyone that doesn't have a 4k, 1440p UW or VR setup.
There will definitely be diminishing returns at that high a framerate. No way the difference between 390 and 500 will be that noticeable. That's a shitload of frames in one second. I'm not in the "the eye can only see 60Hz" boat, but the difference with that many frames is minuscule.
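A quick illustration of why 390 to 500 fps feels like a much smaller jump than 60 to 120 fps: what matters is the change in frame time, not the change in the fps number. The fps pairs below are just example values for the comparison:

```python
# Frame-time deltas show the diminishing returns of ever-higher frame rates.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for lo, hi in [(60, 120), (120, 240), (390, 500)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:>3} -> {hi:>3} fps: frame time drops by {saved:.2f} ms")

# 60 -> 120 fps saves ~8.33 ms per frame, 120 -> 240 saves ~4.17 ms,
# while 390 -> 500 saves only ~0.56 ms.
```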
Thank you for actually covering professional workloads, unlike most of the other channels. It's baffling that for a card whose main target audience is people who game AND do professional rendering, 90% of the reviews question the value while only reviewing half of the value proposition.
Because typically the two workloads are different customers? If you're a consumer, you're going to be gaming at home with it. However, if it's being bought for professional workloads, chances are it's a business buying the cards, and they likely don't want their staff using their work PC for gaming
@@jasonlib1996 I don't understand your point. There are a LOT of self-employed or contract employees, especially in the VFX or film editing industry, who are interested in these cards, and for them $1600 for this level of performance in Blender is a no-brainer, day 1 purchase. As a systems engineer overseeing a 5000+ employee organization, I can confidently tell you that buying individual graphics cards is extremely rare in enterprise. You cannot possibly try to justify the lack of productivity benchmarks these day 1 reviews have shown. The xx90 series is very often used for these workloads.
The reason the 4090 suffers with games from the CS:GO era is virtualization of deprecated DirectX dependencies. The 3090 Ti is more streamlined just due to having more stable software support and Windows integration through drivers. A few months after release this will probably be fixed.
Exactly. I do 3D design, so it's nice, but I can't imagine it being needed as creative programs can't utilize the power. Only my GPU renderers are designed for that kind of power.
CORRECTION: We're working on updating the video, but in the meantime, our numbers for Cyberpunk 2077 were with FidelityFX Upscaling enabled. We specifically *didn't* have this enabled, but stability issues with the bench seem to have messed with the settings. We've re-run all of the numbers for each card:
*No RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 54, 69, 81
- RTX 3090 Ti: 43, 46, 56
- RTX 3090: 35, 43, 50
- RX 6950 XT: 30, 39, 46
*RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 36, 39, 44
- RTX 3090 Ti: 17, 22, 26
- RTX 3090: 16, 19, 23
- RX 6950 XT: 10, 11, 13
*RT + DLSS (1%low, 5%low, avg):
- RTX 4090: 94, 97, 108
- RTX 3090 Ti: 58, 60, 67
- RTX 3090: 52, 53, 61
- RX 6950 XT: N/A
-AY
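Turning those corrected averages into generational uplift, just for readability (the numbers are taken straight from the list above):

```python
# Generational uplift of the RTX 4090 over the RTX 3090 Ti, from the corrected
# Cyberpunk 2077 average fps posted above.
corrected_avg = {
    "No RT, no DLSS": {"RTX 4090": 81, "RTX 3090 Ti": 56},
    "RT, no DLSS":    {"RTX 4090": 44, "RTX 3090 Ti": 26},
    "RT + DLSS":      {"RTX 4090": 108, "RTX 3090 Ti": 67},
}

for scenario, fps in corrected_avg.items():
    uplift = fps["RTX 4090"] / fps["RTX 3090 Ti"] - 1
    print(f"{scenario}: {uplift:.0%} faster")  # ~45%, ~69%, ~61%
```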
Heyyy daddy Anthony
Hey 5 minutes early to the comment:)
And hey Anthony👋
I knew it, cyberpunk was way too high. Thank you for the correction.
guess im staying with my RTX 3080 then id wait for 50 series
Need to be pinned
It's impressive how a 600-buck GPU is a bargain now... we need serious competition in the GPU market
INTEL
I got a 3080 ti FE for that price and it truly was a bargain
This comment was 2 minutes after video was released
People who work on a budget, or anyone who isn't in an industry or job that might require the latest and greatest, should not be foaming at the mouth to buy a new GPU. 2060s are available for sub-$300 on Amazon and 3060s regularly at $400. I sit here doing game development, Blender work, playing my favorite games, streaming my favorite content, all on a $270 2060.
I'm generally baffled at the people who look at GPUs that SHOULD cost $1k+ and go "it's not fair I can't buy that" when 2060s and 3060s exist. It's like looking at your 2018 Ford Focus and going "I can't believe they are selling Lambos at that price, now I'm never gonna be able to get one." AS IF YOU NEED A LAMBO TO GET TO YOUR JOB AT BURGER KING. Your Ford Focus is reliable, and when it breaks down, unlike a Lambo, it isn't the end of the world.
$600 is what a responsible adult should be making in 3 days. If you can't afford a GPU you shouldn't even be on YouTube right now.
As powerful as this is, I honestly can’t justify the price. Not to mention the cooling & space requirements for the 4090. And the fact that the 4000 series doesn’t have DisplayPort 2.0 is legitimately baffling.
wait are you serious, it DOESNT have DP 2.0
Huh?? It's way easier to justify this price vs the 3090 or 3090 Ti? I'm sorry, but if you are someone who always buys the bleeding-edge GPUs, this is the first time it doesn't feel terrible to upgrade. Yeah, the DP 2.0 thing is shitty, but the uplifts at 4K are pretty nice here.
Isn't this video posted like 8 minutes ago? How are you able to finish the whole 16-minute video in 4 minutes :o
@@Zapdos0145 Nope, it’s still 1.4.
@@fortwentiblazeit4177 by watching at 2x speed or higher
I have never wanted Intel to succeed so bad.
Or amd
Lmao same. It doesn't feel right 😄
Out of context: I tried displaying on 2-3 screens using an Nvidia GPU, and it brought stability issues that could result in a BSOD, but then I bought USB display adapters (Intel-enabled) and they were stable and no issues arose.
@@dennismwangangi doesn't sound like you ultimately fixed or figured out what was wrong. Just used an alternative method to use the GPU.. but okay 👍🏽
@@thatdogeguy9108 AMD is on the same pricey path as Nvidia sadly. They wouldn't be if their old policy from the time of the 4870 was still on course; I remember at the time they had a policy of making PC video cards cheaper
I remember buying a 1060 for around 300€ and my parents mocking me for spending so much on it. The look on my mother's face when I told her about these gpus prices was priceless
i bet she took back that statement now LOL
almost the same story here
The first GPU I bought was 11 years ago, a GTX 560 for 200 bucks. BF3 and every other game ran at ultra with over 60 fps (120Hz had just started to be a thing back then)... Now you have to spend double for an xx60 card so you can play current demanding games at medium/high settings. What a time to be alive.
@@vuri3798 I bought my 1660 Super for $220
The days of the 60-series cards being the best 'budget' card are gone
boomers think 18 year olds can go buy a house and live on their own out of high school still ... lol good luck
They didn't put DisplayPort 2.0 on the 4090 because they don't want these cards being used 5 years from now... so you'll have to buy the next-gen GPU with the 2.0 port. It would be like making a car that lasts for 30 years; they want you to come back and buy another
Apple set that standard, now Nvidia wants a piece of the pie.
@@Sad_King_Billy not apple. iSheep did.
AND pcie 5.0
Holy shit
Used in five years with what though? The card has HDMI 2.1a, which can do 4K / 120 with no display stream compression
Surprising differences in some results between reviewers. For example, JayzTwoCents had Cyberpunk 4K (ultra preset, RT/DLSS off) averaging only 76 fps compared to LTT's 136 fps. I wonder what could make such a huge difference?
I wonder the same thing. Vote this up lads so LTT sees this
They fucked it up DLSS was on.
@@soaringspoon LTT Labs getting off to a good start I see.
Jayz' is the correct one. HardwareUnboxed got 83fps in 4k high dlss off.
@@rustler08 Don't think so, since they got chart for both DLSS when off and on.
For years we've been so focused on the pinnacle of gaming PCs that Nvidia's forcing us to look back at practicality with their insane pricing. Maybe Intel has a point.
More competition came out, competition showed Nvidia that you can charge similar prices for a crappier product, so their prices adjusted to make sense in the market space. Not very surprising, it's what the people wanted, apparently
lol true 😅
the people that buy these kinds of cards are such a tiny market.
The real money is in selling budget cards. Which is why Intel hopped into the pool with their arc cards. Nvidia is moving further into an enthusiast/professional PC direction. It's like a Toyota vs a Lamborghini, one works well for most, but the other is most desirable.
@@nathanjokeley4102 RTX 4090 cards are aimed at hardcore, high-end gaming enthusiasts who demand the best there is. They don't care about prices.
I did belong to that group for a long time, but times change, priorities in life change, so I dropped out of the race.
Though pricing for these cards is beyond crazy.
This is the first card where turning on ray tracing finally makes sense (when I buy one at a discount some years later).
around 7 years maybe.
At least after 5 years of my younger brother mumbling about DLSS and ray tracing superiority, he might even use these features for the first time 😆
(nvm, he said he's going AMD after release 🙃)
In another 3 generations it will probably perform decently on a 50-series card, which means a couple of generations after that you could expect pretty wide adoption, when those cards are hitting the used market.
@@esatd34 Seven years? ..You're probably just being facetious, but it definitely won't take that long. In just three years, the 4090 will practically be chopped liver. The technology moves exponentially fast. Moore's law isn't dead, Nvidia is just full of shit.
@@lejoshmont2093 I'm waiting to upgrade till the 50 series... the next 'big thing' is going to be path tracing in games and maybe by then the cards will be powerful enough to support it better for non-tech demo use.
Enlarged GPU images in YouTube thumbnails are usually clickbait, but the 4090 may actually need to be downsized for thumbnails.
It feels like the 3080 and 3090 came out yesterday
I just got a 3060 Ti in May, cuz 3080 prices were still madness at the time
I'm just hoping that AMD won't disappoint us
Bro, AMD is just a synonym for disappointment
the only way they could is if they increase prices by a stupid amount. or capacitor problems… the expectations are that low.
@@aaditya4619 not really. Their CPUs are good, gpus are fine, at least in the high end.
They wont reach the 4090
How so? They had a pretty competitive lineup of gpus depending on your specific needs/desires.
I bet they are saving DisplayPort 2.0 for the Ti series just so they can bring them out half a year later at around 50% more cost and justify the price hike by just putting on a DisplayPort 2.0 port and adding PCIe Gen 5.
Would more likely be the “Super” refresh next year or the year after. Rehash of the Turing product strategy.
That's most likely what they doing
DP 2.0 in Ti cards for 25% more, Super for another 25% more plus PCIe Gen 5. Don't @ me.
At this rate, I legitimately think my next GPU is going to be made by Intel. What a fascinating turnaround for them.
Intel gpus are a synonym for hot garbage
@@velqt Right now they are, maybe in a couple of generations they can actually compete.
@@nadie9058 Intel will cry and give up before that happens. They’re already in damage control
Yes. Everyone needs to go Intel to send a message about pricing. Let Intel undercut the market.
@@oisiaa Ah you want to pay for hot garbage so intel prices even higher when they realize what people will pay for trash?
I think a very valid test would be an "office" or "bedroom" sized room with something like this. We measure ambient and output, but more airflow = moving more hot air = more hot air into the small room. If that room is not near a thermostat you're gonna have a small sauna in your home office, or a freezer in your living room while trying to keep the office cool.
Way too expensive, yet still extremely impressive and powerful
Yeah it really is
@@PumpyGT test is not true...
@@Iskandr314 Are you able to describe *why* the test isn't "true"? Because that seems like a strong claim.
@@Iskandr314 so a test for ONE GAME isnt 100% correct. So what?
It's not impressive when the price is almost 2k. It would be impressive at a reasonable price. I mean, before, they too could make more powerful GPUs, but they were limited by price, where people wouldn't even consider buying a 1k GPU, let alone 2k. Remember, in the 1080 era $800 was too much. Anyone can make the fastest GPU if they get an unlimited price and can sell a GPU for 5k or 3k.
Honestly I find the lack of Display Port 2.0 more concerning than the price since you atleast get some insane performance for it, but not being able to take full advantage of it for high refresh rate 4k gaming is pretty stupid lol
I wonder if the 4090TI will have Display Port 2.0 when it goes on sale...🤔
Seriously, hearing that makes me want to not buy one.
The 3000 series was meant to have it, Nvidia trying to cheap out as much as possible
It's got an HDMI 2.1a port exactly for that reason my guy.
@@hotlocalbabes HDMI 2.1a doesn't support more than 4K 120Hz either, so what are you trying to say, "my guy"?
Anthony is such a great host -- clear, concise, covered all the bases, mentioned the case average temperatures, testing conditions, ambient air temperature, etc
Almost like they have writers.. or whatever they are called...
Yeah im always intrigued when a Cane toad hosts a show.
@@bigfoot3322 😭💀💀💀💀
@@xenox8553 Anthony is listed as both the episode's host and writer on the end slate, though that doesn't rule out others helping him.
At any rate, while these videos are collaborative efforts, it's perfectly normal to have a favorite host(s), as they each have their own delivery.
and delivering wrong info
The lack of Display Port 2.0 really is baffling
They're probably reserving that for their Quadro cards, so that companies who need higher refresh rates or resolutions are forced to pay 5x more.
It's the Nvidia way!™
@@Clawthorne Nope, RTX 6000 (Ada) also only has display port 1.4 which is a $5000+ GPU btw
I feel like if RDNA3 is able to drive 144fps at 4k Nvidia are really going to be kicking themselves, to the point where we might even see a revision 2 or something like that.
They've made it so that AMD doesn't even have to *match* their theoretical performance in order to outperform them in the real world. If your card can't drive more than 120hz due to bandwidth limitations, does it really matter how many "more" frames you get?
@@spaceduck413 Honestly I couldn't care less about frame counts; I really just like being able to play games with consistent frames, which is why I like 30 and 60 fps, no higher, no less. But honestly GPUs are absurdly priced, which is why I don't even touch the RTX series and stick with my GTX 1060
@@Nightengale537 respect imma buy a 1050 to LP
I remember all the hype that came with the 30 series cards, especially the 3090, getting released. There was none of that with the 40 series. Back when the 20 and 30 series came out, me and my friends would talk about them all the time. No one said anything when the 4090 came out. It was so far out of reach that we just didn’t care.
Ikr. It's now some kind of fantasy that we nod to and say "cool" and go on with our day knowing we can never get our hands near it
Just wait 5 years and get one
@@SC.KINGDOM in 5 years something newer will be out
You also don't casually carry 1800 EUR in your pocket?
What a coincidence))).
I think not....
$2000 USD in Australia. I could spend the money on it but I just can't justify it. No way in hell. It would cost more than my entire system, with 9 fans, a $200 case, water cooling etc. They're just ripping off loyal customers now and I hope they lose sales
I am so disappointed we haven't seen a set of cards with a good balance of power use to performance since the 10 series
Check DerBauer's review, he plays around with the power target and that shows some impressive efficiency results!
Do what the other commenter said: 70% power target => 300 W and only -5% in fps.
For a 300W card with this performance, it is bonkers.
@@josdebosduif5 this is exactly what I was gonna say. The power draw on this card is actually insanely good compared to everything else out there including nvidia’s 3090 and 3090ti.
It's competition. The 20 series might not have been as much of an improvement because Nvidia ballooned the size of their chips in order to fit in all the extra RT/Tensor stuff, but the 30 series returned more to normalcy; the actual threat of competition from AMD led them to be as aggressive as possible with performance, with less regard for how much power they are targeting per tier.
Power/performance has never really gotten worse since the 10 series, you just have to be willing to apply your own limits to the cards. Take a 2080, tune it to draw no more power than a 1080, it will still be plenty faster. The 3080, even more so.
What will be really interesting to see is the 12GB "4080", which has a die size smaller than the 1080 (and is barely larger than a 3050/60), but has more transistors than a 3090 Ti. It's just a major shame they're trying to fleece us for it with that absurd $900 MSRP.
All of this hardware (both CPUs and GPUs) is already extremely efficient; manufacturers just crank the hardware to its limit, and the power draw increases dramatically when it's close to that limit.
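For anyone wanting to try the same thing themselves, here is a minimal sketch of querying and capping the board power limit with nvidia-smi. This is an assumption-laden example: it presumes nvidia-smi is installed and on PATH, you have admin rights, and the card accepts the target; the 300 W figure is just the number mentioned in the comments above.

```python
import subprocess

TARGET_WATTS = 300  # example target from the discussion above, not a recommendation

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    # Query the current limit first so it can be restored later.
    current = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.limit", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"Current limit: {current}")

    # Apply the new board power limit (needs admin rights, sticks until reboot).
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )
    print(f"New limit requested: {watts} W")

if __name__ == "__main__":
    set_power_limit(TARGET_WATTS)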
The RTX 4090's raster performance, and especially with DLSS, is pretty impressive, but at a whopping $1599 I could buy a whole PC for that price, and with its 450W+ TDP, even that $1600 PC could consume less than or the same power as the RTX 4090 alone would
Well that’s just, like, your opinion man…
Yeah but will that whole pc have the same performance?
@@vongdong10 yes :)
@paradox I bought it. I will legit have a heart attack if AMD has a better card for a lower cost, because the 4090 was hella fucking expensive AND I had to stretch my budget A LOT.
but what's gonna drive it kid? a 1060?
I swear, Nvidia just radiates so much utter contempt for their customers, partners, and worldwide power grids that it's actually giving Apple a run for their money.
Facts, I really hope AMD can pull through this year, because this is not okay
How about dealing with crypto before we start complaining about the power grids
Nintendo: "Hold my lawsuit."
They are the Apple of pc parts. Nvidia is a lifestyle at this point, they succeeded.
It's hilarious since Apple hates Nvidia too. They're a match made in hell.
It would have been interesting to see power draw at the same FPS, to see how efficiently the card generates the same output
I have to be honest, the 40 series really isn't worth it at this moment, especially with the recent reduction in the price of the 30 series gpus.
Not just the 30 series, but the RX 6900 XT is going around for 699!!! For its power consumption, this is a no-brainer
@@manjindersinghsaini911 The 6900 XT is a joke bro, imagine buying that 💀💀 the 6950 XT had half the performance of the 3090 in most games. Even worse in productivity. What a joke
It's literally 2 times faster for productivity use cases. 2 times! Yes that is worth it. Time = money
@@propersod2390 LOL pulling numbers out of your ass.
@@propersod2390 not sure what games you've been playing... (*Psst* how much is Nvidia paying you for this, I want in)
Edit: Anthony has posted in the comments about this issue and confirmed it wasn't DLSS but a different setting that wasn't applying correctly; new numbers, which are closer to what we see in other games, were shared. Still a big uplift from the prior generation though :)
I'm curious now how the other 40xx cards which are closer to the current 30xx pricepoints will compare to their past generation's numbers.
The rest of my comment can be disregarded in light of the corrections coming from Anthony/LTT.
There was another reviewer who mentioned that cyberpunk seemed to have issues with changing settings and it took several attempts of changing the setting and restarting the game to get it to run with DLSS off on the new Nvidia cards. DLSS being the thing that "creates fake frames", it severely affects framerate measurements if it's not actually turning off in this measurement.
I'm not saying that has to have happened here, but it wouldn't surprise me. I would love if you could double check it for us :c
Clearly it happened and they probably know, but instead of taking the video down they continue to spread misinformation
That has to be what happened. The performance in Cyberpunk shown here without DLSS is not consistent with other reviewers.
They said at the start that they were using older drivers because the new drivers were faulty. That could be the difference; they also didn't use TAA in this video like DF did. They showed DLSS 2.x on and off in this video. With ray tracing and DLSS Performance it was higher than with DLSS off and ray tracing off.
@@mrlacksoriginality4877 The charts said DLSS off on both versions (and he clearly says "without DLSS" @5:35). Only RTX was on in the second graph, not DLSS.
But if DLSS had secretly been on in both (not even like they meant to leave it on, just because the drivers/cyberpunk have a known issue with leaving it on even though the settings say it's off) it would inflate the framerates drastically like this and invalidate the test results. If they had a separate measure where they claimed DLSS was on it would be helpful, but there were none in this video so I'm not sure what you're referring to when you say you saw a comparison with DLSS on in Cyberpunk from this channel.
TL;DR: DLSS 3.0 (not 2.0) uses an AI to "guess" what frames should look like, and inserts a "fake" frame between each pair of "real" frames. It does this at the cost of adding measurable latency and a risk of visual artifacting / loss of clarity (as the reviewers noted in this video). I suspect the average player may prefer to play games with DLSS 2.0 instead of 3.0, or with DLSS off altogether. Regardless, if a game (like Cyberpunk) were to accidentally run with DLSS 3.0 enabled, even though the reviewers checked and thought it was off, then that game would be able to roughly double its FPS. And compared to the performance increase of other games which weren't lying about whether DLSS was on or not, it would show a much more massive performance jump from generation to generation. Which... is exactly what we see here: if you halve the 40-series performance in Cyberpunk it's still a big increase, but it's much closer to the performance increase we see in the other games being benchmarked in this video.
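To make the "fake frames" point concrete, here is a toy back-of-envelope model. This is my own illustration, not how DLSS 3 is actually implemented; it only shows why interpolating one frame between each pair of rendered frames roughly doubles the presented FPS while adding about one native frame time of latency.

```python
# Toy model of frame interpolation: a generated frame between each pair of
# real frames doubles presented FPS, but the interpolated frame can only be
# shown once the *next* real frame exists, so roughly one native frame time
# of extra latency is added. Purely illustrative numbers.
def frame_generation_model(native_fps: float):
    native_frame_time_ms = 1000.0 / native_fps
    presented_fps = native_fps * 2            # one "fake" frame per real frame
    added_latency_ms = native_frame_time_ms   # must buffer the next real frame
    return presented_fps, added_latency_ms

for fps in (48, 97):
    presented, latency = frame_generation_model(fps)
    print(f"{fps} fps rendered -> ~{presented:.0f} fps presented, "
          f"~{latency:.1f} ms extra latency")
```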
@@Night_Hawk_475 They just posted about the issue. It didn't have DLSS on, but FidelityFX upscaling was on because of some technical issues. At the 6 minute mark you can see DLSS on btw.
Seriously, this video is still up saying that the 4090 is 60% faster than the 3090 Ti? Come on LMG. This is a very bad look.
"We will remove the videos"
Keeps up all the worse offenders
the fact is that you never really NEED the fastest gpu to play new games at the highest settings. you can go a few rungs down (even a few generations) and still do just fine. after a certain point the added performance isn’t enough to justify the price
I am sure you are talking about 1080p here, which can run high/very high settings at 60 fps on bloody any GPU. The moment you go 1440p the requirements jump quite a bit, and when you add a 144Hz monitor they jump even more, WAY more than the resolution increase. That is where I am at: my 1070 Ti cannot run anything at 144 FPS; it could barely run Fortnite at very high when I bought it, but sure, that's just one game. In PUBG I cannot reach good quality and over 100 FPS, it's a trade-off, same with Warzone, etc.
Then you have people, and I will be one of those one day, who are at 4K, and some aren't even targeting 4K 60 FPS but more, and NOTHING but this 4090 can run 4K at above 100 FPS. I am not advocating for such absurd prices or the 4090, but your statement is wildly incorrect. When someone plays 1440p for a while, they don't want to go back to 1080p, same with 144Hz vs 60 FPS, and just reaching 1440p 144 FPS is bloody demanding, and that isn't even that new or premium or whatever; 4K and refresh rates in the hundreds are the niche. Yes, a lot of people play at 1080p 60 FPS, but a lot of them have never even tried anything higher.
That's how I have been feeling too, I am just not that interested or excited for the next generation the same way I was with the 6 series or 7 series.
I think it only matters to eSports gamers who need max frame rates, and enthusiasts who like cranking all the settings to the max. I honestly turn off a lot of settings in every game that I believe add bloat and a tacky look 🤷. But it definitely would be nice if you had a lot of money 👍
You don't need a Ferrari to get from point A to point B either. But it is nice to drive a Ferrari 😄
Correct, on most games the jump from 1440p to 4K is not that noticeable, same between the high and ultra presets of most games. We literally dump electricity for very diminishing returns with most people unable to even tell the difference. (But swear up and down that they can which drives a lot of toxic elitism.) The chase for the ultimate hardware has gotten real mindless at this point with bragging rights the only metric that matters.
All to play the same single player games that thrive at 60fps, and the sweatfest online games like Apex Legends and COD Warzone that look the same and are so overwhelmingly infested with cheaters that an investment in the fastest GPU money can buy, at the grossly inflated margins Nvidia asks for, is completely unjustifiable.
Can you believe we would rather have Intel enter this market to bring competition and theoretically help "fix" this problem than see Nvidia continue down this path? It's actually amazing… and I'm here for it. I hope AMD and Intel's drivers come to play ball.
A race with no end in sight
@@RyTrapp0 i meant AMD (themselves) and intels drivers as two separate things, i should’ve clarified.
That right there is why Nvidia wasn't permitted by regulatory agencies to acquire ARM. Jensen demands total ownership and control, anything less is inadequate for him. He is just like Steve Jobs.
Could you guys test whether or not you can heat your room with one of those? Maybe that is the solution to the rising energy prices.
If you can afford one of those you can afford the high energy prices
I mean that thing looks like it could shit out a way bigger power bill than your actual HVAC system honestly.
Around here we're heating a whole city with a supercomputer...
@@pvpdm pretty sure it was a joke
bruah i can heat my room with a rtx 2070 super
I'm a simple man, I see Anthony, I click video.
I know it gets shot down all the time, but would love to see a Linux Tech Tips with Anthony if he's willing to do it.
It was good having the second fastest gpu for 3 days.
But I did get the 3090 for 800 bucks.
I’m also upgrading to a 5800x3d today up from the 3600.
Hello brother
I went with a sim racing 3080 for $400. Unless I absolutely need the fps -- I have a hard time justifying >$500. Things like this definitely keep the console market alive.
I would 100% wait for the 13900k before changing platforms/cpu's.
@@idwithheld5213 The 5800X3D is a drop-in replacement for his 3600, no new motherboard or anything else needed, and it is one of the absolute best gaming CPUs. Maybe the 13900K can beat it on paper, but there is no smarter or better upgrade path for him right now, unless money means nothing; and for you giving "advice", of course someone else's money means nothing to you!
@William B time to buy online from u.s.
With prices ever rising, I think it's time that tech youtube starts to refer to GPUs not by their names, but instead by the most useful thing you can buy for the same price. This Honda Prelude review is pretty good, for example
😆
"This card is X rent/mortgage payments worth."
I do my personal economics in terms of sandwiches. So to buy a 4090 I just have to starve for a decade.
Thank you Anthony for finally being the voice of reason on the lack of DP 2.0
Maybe if this was discussed earlier and there was more of a consensus among reviewers we could have pushed back on this anti-consumer tactic. I'm sure nVidia will be more than willing to sell me a 4090ti for $2099 in 6 months that actually supports DP 2.0
Keep crying lmaoo
Surprised he didn't mention DSC, which enables higher refresh rates / resolutions on DP 1.4 without the image quality degradation of chroma subsampling.
Yeah, I'm skipping this generation. If I buy a bleeding edge graphics card I want a bleeding edge screen
@Alex Unltd. 160hz 10 bit HDR works fine with DSC
@@platinumjsi Can just use the HDMI 2.1 which is probably what most people getting this card will be using, though I question why there is just 1 HDMI 2.1 port but 3 DP 1.4 ports.
14:03 the ARC cards were waiting in a warehouse for nearly a year and still have DP 2.0
They did the same thing when they didn't support HDMI 2.1 on the 20 series cards, which I missed when I bought the LG C9, to the point of thinking I should upgrade to the 30 series. So yeah, I'm not gonna fall for the same mistake again; future proofing is a must, so I will wait and see what AMD is gonna offer
It feels like they did it so they can add it on the next generation to justify upgrading, not to save a buck
What I took out of this video are the impressive stats for the 6950 XT: not a perfect card, it's got its pros and cons, but the margins are smaller than most think. Knowing it was 50% cheaper than the 3090 and is trading blows in many titles with the 3090 Ti, which costs almost double, is a big W for gamers, and it got me really excited for RDNA3 coming up very very soon! I've been riding with Nvidia so far, but I've been considering jumping ship these last few years
And in some applications it slays. 7:54.
Well - higher rasterisation performance, lower Raytracing performance, better general purpose productivity, worse when compared against Cuda.
AMD GPUs really age well because of the driver updates. You can see it as a good point, or just see AMD cards as unfinished when they come out.
Vega GPUs became a lot better with new drivers, as an example
@@davisbradford7438 this is almost certainly a mistake, there are many errors throughout the video
No DP 2.0 support basically makes this gen an extension of the 30 series. The 4090 should be flagship performance material, yet they won't offer the latest DisplayPort tech. I'm also considering jumping ship after this gen for sure.
In a few years when the 6090 launches, hopefully competition will have forced prices back down closer to $1000, and I might be able to justify buying one so I can start gaming in 4K.
The 6060 should be able to easily handle 4K. I'll be incredibly surprised if the 6060 doesn't have better performance than the 3090ti
Because of inflation, PC parts are not going to get cheaper.
If moore's law is dead, be ready to pay 5000$ for the 6090 :)
To be fair, I see the prices coming down. They can only keep charging so much when AI upscaling starts to do most of the work. The actual hardware is likely to plateau because there's just no need for monster cards with 600w power draw when the majority of the frames are software/AI based.
@@ignacio6454 Yeah, or $500, since nobody is buying it because electricity is getting more expensive, and more GPU power wasted for nothing won't attract anyone
I really appreciate showing off the benchmarks with DLSS OFF. I'm not against using DLSS personally, but my "goal" is always to play without it if possible.
Game graphics have been in a stalemate for so long that these gimmicks seem more worrying than exciting.
Anything above 1440p is just unnecessary unless your display is over 70"
why tho?
I remember when the flagship GPU 5-6 years ago (ASUS ROG 980 Ti) was $650… no one can say this performance isn't amazing, but the price is just ridiculous. While other hardware manufacturers have maintained prices or even lowered them for better performance, Nvidia has tripled the price
I paid $500 for a Geforce 2 Ultra in 2000, which is close to $800 today, it was outdated in less than a year. Kids today don't realize they had a good run of cheap components for the past decade, those days are over. Chinese workers don't want to be slaves anymore.
Try getting 16000 cuda cores for 1600 5-6 years ago.
They claim it's "inflation", but Nvidia are making even more obscene profits than ever so it's clearly a con.
Yes, prices have increased a lot, but price to performance also improves every gen, and in most cases performance per watt does too. I hate that the prices are so high, BUT you have to consider that back then they didn't have a 90 series card, nor was COVID a thing, which drastically affected inflation as well as the cost of goods. Here in the UK, for example, I could easily get 2 litres of milk for £1.50 back in 2016/17; now it's more like £2 minimum for the same amount. Another thing I remember the price of back then was a 2 litre bottle of branded fizzy drink such as Coca Cola/Pepsi/Fanta, which was always around £1 minimum (except Coca Cola, which was only that cheap when it happened to be on offer). I know there is a tax on sugary drinks now, but sweeteners aren't taxed like sugar, and back then both sugary and sugar free were around the same price. Even sugar free, which I think is supposed to be around 20% cheaper (don't know the exact number), is now around £1.50 for a 2 litre bottle, and Coca Cola, usually the most expensive common name brand, can only be found that cheap if it's on offer as part of a multi-buy deal (such as 2 for £3).
Inflation has ruined the market, but even accounting for it, Nvidia is pricing these a little too high I think. Including inflation, an XX80 Ti nowadays should be around $900 MSRP maximum, and the regular XX80 should be like $750 max.
@@neoasura They never wanted to be underpaid and overworked. Also, prices are not going up because they're getting treated (much) better. Prices are going up because their labor force is retiring and dying off thanks to the One Child policy. Also they've been pushing _hard_ to up the standard of living, so their prices are also going up a bit naturally, as well. Other less developed countries have cheaper labor, now.
Wow! Good on you guys for pointing out the lack of DisplayPort 2.0 support and what that means for this GPU
120 FPS is enough for 4K. 120 fps vs 144 fps is not a big difference. It makes no difference as long as you are not blind. By the way, the RTX 4090 has HDMI 2.1, which supports 4K 144Hz, so I don't know what you're talking about?
@@frantisekheldak2228 the 4090 performs beyond that. You've missed the point. Good day, sir.
@@julio_adame HDMI 2.1 supports 4K 144Hz!
@@frantisekheldak2228 and the 4090 can hit 4K 240Hz performance, which is what DP2.0 is necessary for. Do some reading and educate yourself, pls, ty
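For anyone wondering why DP 2.0 keeps coming up in this thread, here is a rough back-of-envelope bandwidth check. It is my own approximation: the 10-bit colour and ~20% blanking overhead are assumptions, and DSC compression changes the maths entirely.

```python
# Rough uncompressed bandwidth needed for a 4K mode vs. what each link can
# carry after encoding overhead (approximate figures; 10-bit colour and ~20%
# blanking are assumptions, and DSC changes the picture completely).
def required_gbps(width, height, refresh_hz, bits_per_pixel=30, blanking=1.2):
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

links_gbps = {
    "DP 1.4 (HBR3)":   25.92,   # usable rate after 8b/10b encoding
    "HDMI 2.1 (FRL)":  42.0,    # roughly, after 16b/18b encoding
    "DP 2.0 (UHBR20)": 77.37,   # usable rate after 128b/132b encoding
}

for hz in (120, 144, 240):
    need = required_gbps(3840, 2160, hz)
    fits = [name for name, cap in links_gbps.items() if cap >= need]
    print(f"4K {hz} Hz needs ~{need:.0f} Gbit/s -> uncompressed fits on: {fits or 'none'}")
```

Under these assumptions, 4K 120Hz already overflows DP 1.4 without DSC, and 4K 240Hz overflows HDMI 2.1 as well, which is the gist of the argument above.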
Ok, these test results are VASTLY different (referring to 4K, ultra, no DLSS, RT Ultra) from what some other reviewers are getting. Not slamming this video at all, just a reminder to watch other reviews. Test benches definitely impact the results, it seems. Wild card, wild money. Will be waiting lolol
Edit: good grief, this is the most interaction I've ever had with my comment lol. But yeah, almost a 4x increase on that Cyberpunk performance when others have around a 40% increase (which is still bonkers) definitely raises an eyebrow. Gonna be an interesting WAN show I guess lolol
Don't forget random silicon quality differences.
Wait for the results from Jufes at Frame Chasers. He shows the in-game tests and not just bs graphs anyone could have thrown together
Der8auer saw some really impressive numbers even when limiting the GPU to about 60% power draw.
Yeah, they are claiming about double in Cyberpunk 2077 what other reviewers got. Something is up.
Check out Igor's Lab test report... he tested the card with an AMD 7950X CPU (the fastest current CPU you can get) and the only CPU it should be tested with as of today (Intel Raptor Lake 13900K has not released yet). The videos from Igor's Lab may be in German, but his tests also come with an English-translated 12 page article with all graphs, benchmarks etc. on his website. He is probably the most respected tester/reviewer around in the industry; even Linus Tech Tips, JayzTwoCents, Gamers Nexus, der8auer and other reviewers constantly refer to his testing and give him credit for his work.
Video-Encoding: for streaming h.26x and AV1 are important/the future... but for video-editing you often have to cope with ProRes RAW.
(Apple) ProRes is a codec supported by Apple's SOCs of course - but I have never seen it being implemented on GPUs outside Apple - though you find it on a lot of cameras (Sony, Fujifilm,...). Is Apple prohibiting alternative hardware-encoders for "their" codec, or am I missing some other reason?
The legal battles between Apple and RED surely scared off would-be implementors.
The most impressive GPU I've seen in the last 12 months is the one Valve used in their Steam Deck, the performance that thing achieves is impressive!
which one
@@CaptainScorpio24 It's an integrated RDNA 2 based GPU with 8 CUs.
@@austinnafziger4159 ohhh ok
Add the 40hz mode to that. It's no 60 but it feels really good compared to 30hz. I've played Valheim, Satisfactory and Cyberpunk, all felt really good for such a lightweight device.
4090 probs has about 10x more performance. So impressive 🤨🥱
If the RDNA3 x800 XT can stay at or below $800 and delivers comparable results, it will be the new high-end king. The 40 series lacks competitive pricing. $700 was right at the edge for most gamers. I had friends that bought the 3080 with money saved up over the summer. I can't see that happening with a $1200 card in the same class just one generation later.
If this card wasn't a 4090 but a Titan or for workstations, almost no one would complain about the price.
Now they can save up 900$ and get a 70 class card.
How is AMD supposed to match this performance for little more than half the price?
@@LetrixAR But.. it's not tho. So why bring it up lol
@@CptVein Not match, but if they can give us even 90% of a 4090 for 800$, they pretty much win this gen.
Power and size limits make this monster of a card with a monster price tag a "NO-GO". It will remain an enthusiast card with few in use worldwide.
Not really, it will fly off the shelves. Don't underestimate how much money people are willing to spend on PC hardware worldwide; if this was priced at $2999 it would still sell out, not in all countries, but in the US and EU for sure.
For business use it's a no-brainer: time is money, and this cuts a lot of time, so the price is not a deciding factor. For streamers and content creators who always want/need the latest and greatest it will also be a no-brainer; basically anyone but people on a budget, and those people should be buying the 30 series according to Nvidia. If you divide FPS by $$$, the 4090 is the cheapest GPU you can get, so a $1199 3090 Ti or a $1599 4090 that has 2.2x the performance? If we start seeing 3090 Tis priced at around $700-800 USD, then yeah, there's no reason to get a 4090, but that's not the scenario today.
@@shippy1001 It'll do as well as a niche product does, like Threadripper. That was popular and commercially viable too, but definitely not moving the same volumes as the other Ryzen models. Dedicated servers and workstations likely won't switch to this from their workstation cards like the Quadro or MI series. Supercomputers that use the MI series aren't switching to this either, and many of those partners are contractually bound as well. The RTX 4000 series is for enthusiasts, not for businesses. Streamers, like you pointed out, probably have the most draw, but honestly they can just get a cheaper GPU as a hardware accelerator for their streaming purposes, which is a much smarter decision. So sheep will probably buy these, but they probably won't move the same volume as previous gens for sure.
@@shippy1001 It will fly off the shelves because, as Nvidia already said, they are keeping inventory low to create a false scarcity.
For business... it depends. If you're large enough that you need it, you're also likely large enough to be looking at Quadros. Streamers can justify it somewhat, but most professional streamers already run double setups, to the point this doesn't mean much. Plus the extra image quality won't be translated to the streams anyway. On top of that, at some point you just have to factor in the rising energy costs and cooling solutions. It is a good product in the sense that if you have the money and a very specific use case that will benefit from the extra performance, and you don't care about energy or heat, it can be beneficial, as long as you're not large enough for business solutions. But at the same time, the actual market that checks all these boxes is very, very small. A good product under certain circumstances? Yes. A no-brainer? No, especially because the lack of DisplayPort 2.0 is one massive downgrade, especially for streamers, who are the people who would most benefit from this card. Add that it doesn't have support for PCIe 5, which, y'know, is quite important if you want high bandwidth to, I don't know, move around large pictures at high resolutions with minimal compression and very high speeds, and you might have your buyers wondering if they really should invest or wait for next year.
The 4090 is an enthusiast/halo product. It's something to sell the brand, and nVidia is making sure to price people out of it in order to become the de-facto "premium" brand, like Apple does.
@@kenhew4641 Are you unaware that there are separate enterprise line of products in Intel, AMD, and NVIDIA? US Government has already invested into AMD MIs, EPYCs, and NVidia's A100 GPUs. THOSE are enterprise market products, not the RTX4000 LMAO. Thats exactly what makes these RTX4000 cards such a niche product like the threadripper.
@@whdgk95 That's a different scenario, and in those Ryzen situations yes, you are absolutely correct, but for rendering, VFX, and even streaming, the 4090 is still the much better value proposition. And don't get too caught up on the DP 1.4 thing, it has HDMI 2.1; even Nvidia knows that PCIe 5.0 and DP 2.0 are niche.
Most professional office environments like indie game devs or VFX artists run dedicated PC desks around a warehouse; they simply buy a new system for the best developers and move the 1-2 year old system to the newer guys, and the 4-year-old systems are sold/traded.
A dedicated HW accelerator is too much of a hassle to deal with; a single powerful GPU is much easier to work with and you will get more value out of it.
Just to be clear, I'm not defending Nvidia's practices, just explaining that this product, even priced as it is right now, is still a good deal for people/businesses who have the money.
Bruh always lookin like he just ate a 1000 mg edible 😂😂😂😂
The DisplayPort 2.0 thing is insane. With a card at that price point you want it to be relevant for as long as possible, and it's starting off a gen behind. What a joke.
It's all on purpose. People who are gonna purchase this card will buy the next 4090 Ti or 5090 with DP 2.0 without hesitation, giving Nvidia their money hand over fist.
No company has even announced a DP 2.0 display though, AFAIK
@@primegamer321 When? AFAIK there literally isn't a single model available for sale currently, most of the really good displays just have HDMI 2.1 and DP 1.4a
I don't have an explanation for the frame rate differences. If I had to guess though, I would say it might be related to the drivers. Nvidia drivers often contain specific optimizations and profiles for each game. It could explain why some games perform really well, while others perform poorly. Having the best hardware can't fix bad software and bad coding. Often bottlenecks and performance problems are purely software issues (I'm a software developer).
Oh ok "software developer"
Was this close to calling bs until I saw that impressive title
@@AtticusHimself He's right and it is basic knowledge.
@@user-ko4zp1wm2i "and it is basic knowledge" that's the entire point of my comment
@@AtticusHimself he is right though, you people seem to focus more on hardware and less on software which also causes bad frames. Hardware is not to blame.
@@AtticusHimself then you worded it poorly
Did I actually make it here on time for a LTT video?
yes we did, almost I guess
yes
Yes we did
7:47 It's interesting to see how the RX 6950 XT has a massive advantage over even the RTX 4090 in some categories. Much of it is presumably a matter of optimization, but still, it goes to show that performance doesn't exist on a simple continuum. And it underscores the importance of choosing hardware optimized for your own use cases.
Yeah, really. How is nobody talking about those results?
I feel this GPU would work well in animation, film, medical and many other industries rather than for consumers
Isn't that more what the Quadro line is for, anyhow? Or do folks even use those?
@@SwordfighterRed as far as I can tell, the only PCs with Quadro GPUs are ones that are prebuilt specifically for those industries.
a100 or a6000, this is pathetic compared.
@@SwordfighterRed But many people nowadays game and work on the same PC so a card which is great in both is the way to go.
@@SwordfighterRed they are for servers
I did an AVI encode today and the 14GB video dropped to 2GB and looked absolutely gorgeous, though it still lost some very minor detail. Still couldn't believe it when I saw the file size. Though I didn't have a hardware encoder, so it took like 8-9 hours lol.
what are the exact steps to do so, link a guide or vid or something?
@@retrosapien1 look for ffmpeg encoding tutorials
@@retrosapien1 Handbrake's snapshot build has AV1 SVT support which is great if you want a simple to use GUI for encoding.
@@Ubya_ thank you
AV1, not AVI
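If anyone wants a concrete starting point for that kind of AV1 re-encode, here is a minimal sketch using ffmpeg's SVT-AV1 encoder driven from Python. The filenames, CRF and preset are placeholders to tune for your own footage, and it assumes a reasonably recent ffmpeg build compiled with libsvtav1.

```python
import subprocess

# Minimal software AV1 encode via ffmpeg + SVT-AV1 (assumes ffmpeg is on PATH
# and was built with libsvtav1; names and quality settings are placeholders).
cmd = [
    "ffmpeg",
    "-i", "input.mp4",        # source file
    "-c:v", "libsvtav1",      # SVT-AV1 video encoder
    "-crf", "32",             # quality target: lower = better quality / bigger file
    "-preset", "6",           # speed/efficiency trade-off: lower = slower, smaller
    "-c:a", "copy",           # keep the original audio untouched
    "output_av1.mkv",
]
subprocess.run(cmd, check=True)
```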
Pretty sure they made a mistake at 5:28 when they showed the Cyberpunk performance (which imo is the most significant jump too). I also watched a few other videos, namely the ones by JayzTwoCents and DigitalFoundry, where they measured the CP2077 RT Ultra performance at around 45-50 fps at 4K, while LTT is measuring it at 97... which is higher than what the other videos are showing as the non-RT performance
edit: hardware unboxed...not digital foundry
I'm wondering if the different version of Windows plays into that.
The main difference is that LTT is using Ryzen 7950X whilst Jayztwocents is using 12900K. LTT is also using 16GB DDR5 memory.
@@Flukiest Still... an almost 100% difference can't be because of a CPU bottleneck... especially when the CPUs being used are among the best of their respective brands
@@ahmedrauf8777 It can if you combine different CPU architecture _and_ different Windows versions _and_ different DDR RAM versions. You'd be surprised how much of a difference that can make. Hell just having different Windows versions can make a huge difference.
@@DavidStruveDesigns I'll go with Ahmed; it doesn't matter too much whether it's the 7950X or the 12900K
I love how Anthony is always focused on the holistic experience and doesn't get blown away by a few extraordinary results!
My favourite host ever!
I hope AMD is able to compete with this. That's a huge improvement. On a different note, happy to see more Anthony and look forward to seeing more of him this launch season
I believe AMD is gonna surprise consumers with something unique
Just slap in DisplayPort 2.0 and then they're good
@@kaslanaworld4746 And what will dp 2.0 change on a card that can't even get near the performance of the new rtx cards?
@@propersod2390 you dont even know the performance of rdna3
@@propersod2390 dp 2.0 is a lot more useful on a slower amd card than not even having it. Gaming isnt the only thing a pc is used for.
Honestly, I am definitely more interested to the 4060. Upgrading from my 1060 will be a big improvement
Always where the bang for buck is. Think AMD could focus this area this gen though
@@stevieknight9225 I also got a 1060 and am looking for an upgrade, but there is no way I am gonna buy an amd GPU. I hope the 4060 is gonna be priced not too stupidly
as long as nvidia still has a massive stock of 3000 series cards, there will most likely not be a 4000 series besides the 4090, 4080 and 4070. err, i mean 4080 12GB.
@@tyrannus00 lol AMD GPU's are fine, you will not miss frames.
@@endfm it's not about the frames, it's about all the other stuff. I would miss things like rtx voice, GeForce experience, having drivers that work well all the time and support everything etc. The fps are just the tip of the iceberg
For the Blender benchmarks it would be nice if you guys specified whether you used OptiX or not, as it can give a massive improvement over CUDA; it would be good to see both sets of results.
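For reference, switching Cycles between the two backends for a before/after comparison can be scripted inside Blender itself. A small sketch, assuming an OptiX-capable Nvidia card and that you run it from Blender's Python console, might look like this; "OPTIX" vs "CUDA" is the only value you would flip between the two benchmark passes.

```python
import bpy

def use_gpu_backend(backend: str = "OPTIX") -> None:
    # Select the Cycles compute backend ("CUDA" or "OPTIX") in the add-on prefs.
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = backend
    prefs.get_devices()  # refresh the detected device list
    for device in prefs.devices:
        # Enable GPUs only, so timings reflect the card rather than the CPU.
        device.use = (device.type != "CPU")
    bpy.context.scene.cycles.device = "GPU"

use_gpu_backend("OPTIX")
```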
With how expensive Nvidia cards are getting, I am seriously considering buying an AMD card.
I've been looking to replace my 1060 for almost 3 years now, and thanks to crypto miners, COVID (supply chain issues), silicon chip shortages, rising interest rates, inflation and the scarcity of 30 series cards, plus the black market they've created, the price of cards has been driven up almost 200%.
Nvidia promised that a GPU should always cost about the price of a games console ($400-600), but they are still marked up way too high.
They're exactly the same. They're just matching Nvidia prices, that's what they do, because there is no real competition with only 2 sellers. Buy the best AMD card and it's not even as good as an RTX 4080, at around the same price.
3:27 the numbers shown for 4090 in 4k ultra preset (136fps avg) is a lot higher than what Hardware Unboxed showed for high preset (83 fps avg). The CPU used here is 7950X, and hardware unboxed used 5800X3D, but the difference is wild. The RAM is also different... DDR5-6000 CL36 here compared to DDR4-3200 CL14.
I don't understand what you're complaining about when they use two different RAM configs and different CPU. Obviously the difference is high when you're using super high-end components
Same for the Cyberpunk 2077 4K RT Ultra DLSS-off result as well. I watched a couple of other tech channels and their results show a 30-40 fps average, which is much lower than this video claims. I think they left DLSS on somehow.
@@aeswere He's not complaining. That's not what complaining is. Observations ≠ Complaining.
@@fabuli9666 It was tested 3 times so I really doubt it was.
@@fabuli9666 All the reviews I saw had the 4090 maintaining 40+ fps at 4K ultra RT without DLSS. Where did you get "30-40"?
I had no idea the 6950 XT was actually that competitive with the 3090s, might have to give it some more consideration
It always was. Everyone expects AMD to push prices down but the same people still choose to buy Nvidia
If you don't plan on using much RT then it could be worth it, depending on what pricing does after RDNA3 launch
LTT has always had a weird bias against AMD.... especially when RDNA2 proved to be a great generation of GPUs
I got a 6900xt for that very reason. Wasn't fussed about RTX and got one new for £700. I undervolted so the power doesn't go over 250 watts and core clock stays at 2550Mhz. It has done me proud 👍👍
@@GWT1m0 there is no contradiction
Gonna stick with 3000 series and wait for 5000 series. The nefarious pricing and irresponsible power consumption is ridiculous.
I'm still sticking with my 1070 Ti as it does great gaming in VR and 1440p; it might be like the 8000 series by the time I switch. It always surprises me when people buy the 3090 and 4090 but still game at 1080p, like what do you need that performance for? Rendering hentai?
I have the 3090. Trust me, the 4090 is like the 2000 series: new technology. The 3090 is the refined version of the 2000 series, which means the 5000 series is going to be a refined version of the 4000 series. I would wait for the 6000 series, that will be the biggest jump in performance
@@MrPaxio The 3090 is the best 1080p card on the market; it's more than twice the fps compared to your 1070 Ti in most titles, and even the 1% lows are higher than a 1070 Ti's max fps in modern games = stable performance. Plus buyers don't pair a high end GPU with a garbage CPU, and that has a lot to do with the stable performance at 1080p, and a little less if you play 1440p.
It's a night and day difference, but you need the monitor and CPU to support a higher end GPU too.
@@SweatyFeetGirl Wouldn't say a lot better; the RTX 3090 and 6950 XT trade blows at 1080p, and it all depends on the title.
And 1080p is very CPU dependent too.
@@SweatyFeetGirl Might be so, but how many RX 69xx/RTX 3090s are paired with an i7 7700 or a Ryzen 2600X? For lower tier GPUs like the RX 6600 it's actually possible to end up in an older system.
Notice the Minecraft "uoh!" sound when he says cooling at 10:48 😂😂😂😂
I’ve got a 3090. It’s great. My next upgrade will be real-time rendering as I’ll have to upgrade the whole box due to Nvidia.
dang i remember contemplating whether or not i wanted to spend $400 on my 1070 lol. Feels good now.
I'm right there with you with my 1080TIs. I would probably shit myself if I got on to a new machine but for now, I still have a beast... to me.
Also, I would have to build an entirely new machine. Shit. I'm still running an i7 8700k. Have yet to try and over-clock though. Might be time to try!
You're making us feel old. Right around that time I dropped almost 800 on a 1080Ti.
i7 950 reporting in with a GTX 770 that I bought for 350€
1070s go for 100 bucks used. Congrats on your loss.
@@brad8122 I bought it new when it was released 6 years ago and the comparable GPU in the 40 series costs almost $1000.
Great video. Just one thing: when you show DLSS, maybe you should consider showing FSR active on the AMD side and XeSS on the Intel side (like GN did with the 4090 review), because these technologies are so common now, and showing DLSS on with no AMD result could lead to consumer confusion. And yes, DLSS technically has a little more quality, but with FSR 2.0 AMD is really close, and even 1.0 isn't that far off now with the fixes they put into it...
Just saying, I hope whoever reads this has a good day :D.
I expect a sub 10 litre small form factor case with amazing cooling for DUAL 4090's by end of year.
The lack of display port 2.0 is a deal breaker for me cause I have a 240hz display that I couldn’t fully use.
DP 2.0 monitors havent been released yet... Why does it even matter to you?
@@aHungiePanda The man probably has a high refresh rate OLED HDMI 2.1 monitor and got confused, thinking that HDMI 2.1 is DisplayPort 2.0.
@@robertkeaney9905 Even then, the only people who should be complaining are the 0.01% of people that have the Samsung Odyssey G8, the monitor that's 4K 240Hz and can hit 2000 nits peak. Now that I think about it... why would anyone buy a VA panel right when Samsung has their QD-OLED panels.
@@aHungiePanda Honestly, its probably because the other folks bought those other OLED's when they were on sale.
There is a small but passionate market of well educated buyers who have the money to shell out for a Odyssey G8 year round.
And then there's a bigger chunk of people who want to buy something high end because they just got their tax rebate. And will instead jump on any Oled that's on sale.
Because On Sale, means a good deal. And Americans love good deals.
@@aHungiePanda I have a Samsung odyssey g9 neo and yes it DOES support dp 2.0 and 8k 240 hz so shut up.
The price is already crazy, but the fact that they cut *any* corners on the functionality is basically unconscionable in my book. Someone buying this card would certainly be in the market for ultra high-refresh 4K, and probably even 120hz 8K when available (and this card can in fact handle high refresh 8K for games with lower demands and/or DLSS).
And as said PCIe5 certainly has use cases and someone buying this card would be in the market for supported mobos. It's a testament to their attitude that they'd go this route on their "peak enthusiast" halo card.
It's great to see these insane levels of performance.
But I thought spending $230 on my 1660S was a lot. The current prices are untenable, even at the equivalent tier.
This RTX 4090 review was WRONG WRONG WRONG
In the Cyberpunk benchmark JayzTwoCents got 74 fps on the same settings vs your 136 fps; that can't be right, even if you triple checked. Could there be some massive diff in CPU performance or Windows stuff? I have had some cases myself in Cyberpunk where AMD's upscaling did not disable, since you have to hit apply before changing other settings for it to change
Yea, this seems crazy. I think I am just going to assume JayzTwoCents is more realistic and still get the card, if it performs better than that, then that's great too. But weird that the performance is so different.
I would love to see some triple screen games when reviewing cards of this caliber. Something like assetto corsa competizione on 3 4k displays would probably even make the 4090 sweat.
Some sort of VR benchmark would be really interessting too.
Those two are use cases where you could excuse buying such an expensive GPU
I'd agree there. Would definitely like to see how DCS runs in flatscreen and VR with how performance heavy it is
@Bubo Bubobubo most games aren't too hardware heavy for it, but DCS is an exception to the rule. A lot of recommended setups are stuff like a Ryzen 7 5800X3D and an RTX 3080 and above, the more VRAM the better. Hopefully when vulkan gets added and multicore support it'll help a lot
@Bubo Bubobubo HL:Alyx is also surprisingly easy to run, it's not that rough on the hardware. DCS or MSFS can be brutal on systems in VR, and many larger-scale games can push the PC much harder
I wish you reviewed its performance with programs like Unreal Engine, particularly for Virtual Production applications. That’d be great info to have to compare against the 3090.
I once bought the GTX 970 for about $400 when it came out. I thought that was expensive. That wasn't long ago.
Now I'm spending almost $1000 for a 3090. Used. This is ridiculous.
I bought a 3090ti for $840 new Amazon flash sale
Purchased an EVGA 980 Ti on its release date for £570. I'm priced out of top-of-the-range GPUs today, so now I wait some time and buy them second hand.
I just got an RTX 3060 for about $350 with Cyber Monday deals; the GTX 970 now costs $340, and the 3060 is 95% better on UserBenchmark. When did you get yours? But my motherboard could only support at most a 7th gen Intel CPU, so I did have to overpay for the i7 7700K while far cheaper newer-gen CPUs that matched or outperformed mine were available.
@@wizardemrys when did I get mine? My 970? As I stated, I got it when it came out, google search says late 2014 was launch.
Currently using an ASUS Strix 3090, the best version of a 3090. I haven't overclocked it yet.
I have an i9 9900K and game at 3440 x 1440p. CPU is maximum 73% under load even in the newest titles like Darktide, where even the GTX 700 series GPUs don't even work anymore. RIP
I'll upgrade my CPU to a 13900K when the next generation launches and the prices drop. I'll have this i9 9900K for 5 or 6 years by that time.
A video with Anthony is something I would always watch
What happened to the RTX 4090 at 4:44 in CS:GO? The minimums are beyond terrible at 10 FPS for the 1% lows. Is there maybe something wrong with the chart?
sounds like a really *really* bad stutter
Idk it happens, maybe a hacker or smthn
We can see how much Samsung's process node has been holding Nvidia's cards back. Similar to what happened with the Qualcomm 8 Gen 1 vs 8+ Gen 1, where the performance difference was almost a full generation after Qualcomm switched back to TSMC.
0:27 " there are some other problems"...card disintegrates
It's been sooo long since the last time I saw Anthony host a video I almost forgot he exists. I'm glad he's back.
I'm crossing my fingers for AMD to come out strong with their GPUs. The last thing I want on a top of the line card is compromises. If I'm paying 1600 I want DP 2.0 and everything else. Maybe a good and affordable monitor will finally release within 2 years so I certainly would like to have a card capable of running it.
That's why they're not including it, so you buy another card in 2 years.
@@Moonmonkian i buy a new card every 7 years what
@@jarodAl-rw1ts The DP 2.0 standard is for a very particular type of demanding top tier panel: 4K 120Hz and up (8K, HDR etc). Every hardware choice should always be for what panel you want to drive.
@@Moonmonkian bros what
@@jarodAl-rw1ts Read the OP and my response again. I'm a stoner and can follow just fine.
Regardless of how well this performs, they'll have to make an insane 4060 for most people to go Nvidia due to the principle for new graphics cards
4060 will probably perform on par with a 3080
@@keonxd8918 big if true
@@keonxd8918 Yeah right lmao
@@keonxd8918 for the same price of the 3080
@@keonxd8918 unfortunately, not this gen. The 4070 is targeted to be around the 3080 performance level. The fake 4080 (12GB), which is really the true 4070, has 95% the performance of the 3090 Ti. The 4060 performance is going to be on target at a similar level to a 3070, might be slightly faster, but slower than a 3080. The problem is going to the be pricing. Nvidia is pushing the price tag on all next-gen cards up. So the 4070 will more than likely sit at $699 and the 4060 at $499. If you want sub $300 4000 series GPU, you will probably have to wait for the 4030-4050 models.
How did you guys get those results in Cyberpunk? I have seen other reviews and no one is getting those kinds of results
Unless I missed it, I think it would be a great addition if you included Deep Learning performances on your GPU reviews
Does "Deep Learning" mean the sucking sound coming from your arse when your credit card goes through the scanner?
It's weird how different your ray tracing results are from those at Gamers Nexus. There, the 4090 managed to show around a 70% improvement over the 3090 Ti, about as much as in the traditional rasterization scenarios. I guess they tested a different section of the game, but in Cyberpunk the 4090 gave them a mere 79 fps with DLSS! (4K upscaled from 1440p)
They fucked up, look at their comment in the comments section. Idk why they haven’t pinned it yet
I imagine GTA 6 for PC is eventually going to overtake Cyberpunk 2077's throne as the most demanding game.
@@Daduu Thanks!
@LegendZzFTW It's optimized now; it's the system that can't run it well. Also GTA 5 is eons old
Their testing was flawed. Noticing and throwing out bad numbers should be a staple.
Sometimes. Often, at least for me, the point where the price/performance curve bends is what I shoot for.
0:26 - I thought that was a piece that fell off the GPU at the perfect timing hahahaha
You know, every year since I bought the 960 I've been contemplating buying a higher-spec GPU, but I always choose to wait, because I'm a hard believer in Moore's law… 😅 I honestly think maybe next year is the time for me to stop this and buy another GPU, not because I think Moore's Law is finally breaking, but because it has gotten to the point where GPUs can run any application I would want at the highest settings 😂.
Man, at this point, you should just build a whole new rig! Replacing just one component will invariably result in a bottleneck if the rest of your PC is from the GTX 960 era.
@@PhantasmXD Imagine he puts in the new card and it just destroys the motherboard lol
Holding tight to my GTX 970, which has performed like a dream come true since 2014. Whenever they bring some sensible prices to their products I'll jump in and do a full upgrade.
just get a 6600, they are dirt cheap now
I wonder how many units this could push in Ultimate Epic Battle Simulator 2. I can do around 4 million units ( medium settings ) with my 1070ti and I know a 3080 can do around 30 mil ( on low settings )
I can see Nvidia pushing pcie 5 & DP 2 in their refreshes. For me this means I wont upgrade to this limited generation. I'm still waiting on mesh shaders and the direct storage to make its way to more titles or even to a driver level like RSR did on AMD
Bro the 4090 just feels like a FLEX
A Prelude is a good platform for mods too. Always wanted a BB chassis model.
I have one, bought a 2.2 VTI for £1100. Stupidly fun, although crap on track. A very overlooked platform though.
Sure, the performance is going to be way better than my 3090 Ti, but that's the last time I will pay top dollar ($2300 AUD) for a card. It's not right, prices need to change... it will be interesting to see prices once they are actually on shelves
You still can factor in the secondhand market of selling your 3090, if thats something you plan to do so you're not "really" paying that price. I know here, $1599 for the founders card (the one I want, due to waterblocks)
I will be getting it for around $1000~ after my friend buys my 3090. So for me thats a worthy upgrade.
Buying the latest and greatest PC hardware is always the worst because it's inferior in a matter of months. I don't see any point in shelling out top dollar for stuff that will age like milk.
@@PerplexedPhoton Showing off :D Also, if you are lucky enough to have a following on youtube/twitch you can justify it, because it will pay for itself over time when people throw money at you
@@SC.KINGDOM I'm just a cheap-ass, I have nothing against people who are willing to pay for it, but it ain't me chief.
@@PerplexedPhoton Neither am I. I paid 2200 pounds for a ROG Strix 3090 OC White Edition; I don't need a 4090. Too late edit: you said you are a cheap-ass, that's your choice. I was the same as you for a decade, not spending over 1k on a PC max
It's insanely expensive.
But if you compare it to the MSRP's and more importantly real pricing of the 3090ti and 2080ti the price/performance is far better.
The intergenerational performance jump here is insane. I actually think some of these numbers are ridiculously overkill for almost anyone that doesn't have a 4k, 1440p UW or VR setup.
Good thing I need an upgrade for my 2070S. It has issues running RT even with DLSS on my 1440p UW, and the upgrade is gonna help in VR.
@4:36 The 1% low has to be some sort of stuttering error which reduced the average. Right?
Nvidia: 8K is the future
Everyone else: 1080 is still good
2k is the golden spot for me.
1440p
I will buy a 4090 and use it with my 1080p 390 Hz display and then probably upgrade to the new ASUS 1080p 500 Hz display.
@@lizardpeter What a waste of energy, resources and in particular air.
There will definitely be diminishing returns at that high of a framerate. No way the difference between 390 and 500 will be that noticeable; that's roughly 2.6 ms vs 2.0 ms per frame. That's a shitload of frames in 1 second. I'm not in the "the eye can only see 60Hz" boat, but the difference with that many frames is so minuscule.
thank you for actually covering professional workloads, unlike most of the other channels.
It's baffling that a card whose main target audience is people who game AND do professional rendering gets reviews where 90% of them question the value while only reviewing half of the value proposition.
Because typically the two workloads are different customers? If you're a consumer you're going to be gaming at home with it. However, if it's being bought for professional workloads, chances are it's a business buying the cards, and they likely don't want their staff using their work PC for gaming
totally agree. I was hoping someone would do a pcie 3 vs 4 with blender with the 4090
@@jasonlib1996 I don't understand your point. There are a LOT of self employed or contract employees, especially in the VFX or film editing industry, who are interested in these cards, and for them $1600 for this level of performance in Blender is a no-brainer, day 1 purchase.
As a systems engineer over seeing a 5000+ employee organization, I can confidently tell you that buying individual graphics cards is extremely rare in enterprise.
You cannot possibly try to justify the lack of productivity benchmarks these day 1 reviews have shown. xx90 series is very often used for these workloads.
the main target audience are people who will buy this to play Zelda on Project 64....
The reason the 4090 suffers in games from the CS:GO era is virtualization of deprecated DirectX dependencies. The 3090 Ti is more streamlined just due to having more stable software support and Windows integration through drivers. A few months after release this will probably be fixed.
The fun part is that less than 0.1% of the world would have any use for a graphics card like that
Exactly. I do 3D design, so it's nice, but I can't imagine it being needed as creative programs can't utilize the power. Only my GPU renderers are designed for that kind of power.
And yet the largest group of people i know who buy them are gamer kids with money from their parents...
So explain why it's still constantly sold out??
The 5090 is probably gonna come out before the Ti version of the 4090
Why is that?