Yeah, good point, and a good perspective. Power consumption will surely peak in the next gen? The question is, will it plateau again at 450, 550, 650W, whatever it is, or will it steadily come down again. You could argue we've just come from a 250W era up to, say, the 20 series, and we're in a 350W era with the 30 series.
Kudos to EVGA for engineering such a cooler, but this is getting absurd. Could you look at the temperatures of the other components of the system in a regular case? Like, how much would this affect SSD, chipset or CPU cooling?
Why? There are once again more review samples than actual units for sale. Real-world tests like "how would this work in a normal person's computer" are irrelevant. A normal person will never have a 3090 Ti.
Yup, I really hope people won't buy the 4000 series high end cards. There's actually not as much of a need for them performance wise and certainly not for performance per watt. You want 1440p high refresh rate with most modern games? Or 4K60ing ridiculously modded games? You already got that with 3000's high end. 1440p still remains a sweetspot for high end high refresh rate gaming, so even when budget is not an issue you SHOULDN'T buy the 4000 series purely from a power consumption and cooling perspective - you'll be heating your room like a motherfucker. I REALLY hope 4000 series flops
@@GewelReal There is just a tiny little bit of a difference between an atomic clock and a nuclear reactor, which makes inclusion of a reactor onto a PCB a little bit more difficult, I'd say.
Yeah, depends a lot on what you pay for electricity. I pay about 0.14€/kWh including tax, so running a 3090 Ti for 5 hours per day for a year would cost me: 5 h/day x 365 days/year x 0.5 kW x 0.14€/kWh = 128€/year. Comparing that to my 3080 with a 320W power draw: 5 h/day x 365 days/year x 0.32 kW x 0.14€/kWh = 82€/year. The price difference is kinda negligible at my energy price (luckily I got a flat-rate two-year contract last summer, and the contract came into effect in October). I have no idea how much electricity costs in the US or other countries, but you can just plug your own cost of electricity in your currency into that equation and get your own price.
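The arithmetic above generalizes easily; a quick sketch in Python, so you can plug in your own rate (the helper name is mine, not from the video):

```python
def yearly_cost(hours_per_day, draw_kw, price_per_kwh):
    """Yearly electricity cost: hours/day * 365 days * kW drawn * price per kWh."""
    return hours_per_day * 365 * draw_kw * price_per_kwh

# 3090 Ti at ~0.5 kW, 5 h/day, 0.14 EUR/kWh:
print(round(yearly_cost(5, 0.5, 0.14)))   # 128 (EUR/year)
# 3080 at ~0.32 kW:
print(round(yearly_cost(5, 0.32, 0.14)))  # 82 (EUR/year)
```

Swap in your local price per kWh and daily hours to get your own number.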
Seriously the UK is having major electric supply issues. Our regulators have increased the cap that energy suppliers can charge consumers. By over 50%. It works out to a £693 per year increase for the average household. So yes, let’s absolutely add even more electricity usage with this monster.
@@GewelReal you obviously don't know what you're talking about. Low in relation to other countries, yes. Low in relation to past energy cost within the country, aaah no, absolutely not. Energy costs are on the rise in North America and the fake tree hugging hippies who want electric cars are making it worse.
The Titan was no different from the original Threadripper: they made a semi-pro halo product for a niche market, then watered it down for mainstream and beefed it up for pro, killing the niche hybrid card.
This winter the price per kWh in my country rose by a couple hundred percent. Meanwhile Nvidia releases GPUs that need 2x the power of the previous gen flagship... If RTX 4000 is as power hungry, it won't be a compelling product, regardless of performance, imo.
Same here in the UK. I didn't even think about getting a good 30 series card before our energy prices rose by 60% this April, and they're due to rise the same in October. Petrol is also £2 a litre now.
It's not that big of a deal. People make more out of it than it really is. Unless you keep your PC on and loaded 24/7, the difference in the electric bill each month is negligible. There have been hundreds of studies and write-ups on this.
@@junkiexl86 You are correct, but prices for food and everything are going up, so this stuff matters when wages are stagnant and the world is going to shit. But yes, people should read articles like that. If you could link one, that would be great. That said, I need to mine with my PC as much as weather permits. I am disabled and need the money. I have a 5800X with 32 MB of L3 cache, which Monero mining likes, and a 6700 XT for whatever NiceHash says is best each day. I live in Ontario, Canada, where electricity prices are sort of not bad. Alberta has terrible prices, other countries are worse still, and food prices are at record highs; each month they shatter the old high. The Ukraine war has been to blame for much of this, because 30% of the world's wheat comes from Russia and Ukraine. We don't have strong left-wing politics around the world, so wages are stagnant and corporations use times of crisis to push prices well beyond what inflation demands, and people keep voting in right-wing ghouls who lie and pretend to fight the good fight. I'm afraid I will have no way of being able to live soon.
Ah yes, the obligatory super expensive release right before the next gen is unleashed that will stomp the crap out of this for much less money. This would be like if Apple released an iPhone 13 Pro Ti for $2000 a month before releasing the iPhone 14 Pro for $1000 and the 14 Pro stomps the 13 Pro Ti.
Sadly that is not what has happened. The 2080 Ti did not stomp the $600 1080 Ti for less money. It didn't even really stomp it: like plus 22% for 2x the price. Then the 3080 came out, but they were $1500-$1800 for a very long time. Lovelace will probably claim +60% faster but actually be +35%, and consume prohibitive amounts of power at the high end. Like, circuit breakers tripping is going to be a problem at those levels. Hopefully AMD and Intel will produce something that's meaningful.
Just got a ROG Strix LC 3090 Ti OC for $1200. I'll be having a glorious time gaming for the next year instead of waiting in a backorder queue for the mythical 40 series, which likely won't show up until somewhere in 2023 and won't actually be available at a reasonable price for consumers until maybe Q3 of 2023. Maybe right now is the best time to splurge on a monster card.
The community for sure will find value in power supply testing. These cards are bananas and I'm dreading the heat but also needing a 1000w PSU. Hopefully AMD has a more efficient next gen design....
Within just the past few days, 3080s and 3080ts are in stock and available on amazon, sold by amazon and shipped by amazon, at much lower prices. It has been a sudden turn of events!
This card is most likely a concept card, a prototype cooler/PCB for the upcoming monstrosity 4000 series, or Nvidia's last attempt at a cash grab before prices take a plunge.
@Gareth Tucker At 600W power draws? They are likely going to have 600W power draws to work with the new PSU standard. The old coolers would not work with that.
They don't need to prototype coolers with thousands of products, lmao. This is just a cash grab to get the last money out of the miners before the next gen AMD refresh.
Would you ever consider doing power-normalized testing? Like, how does a 3080 limited to 250W compare to a 3090 Ti also limited to 250W? With power consumption going up and up, I don't want what is practically a space-heater in the room while I'm gaming in warmer months, so I use Afterburner or the like to set the PL to something more reasonable. I've been surprised at how small the impact is on my 3080 FTW3.
Get yourself water-cooling lol. Other than that, don't buy it, simple as that. It's funny, even next gen cards are gonna be worse than this, I'm not surprised LMAO. Don't gimp a card that uses more watts, you lose fps.
@@redclaw72666 That's really not the point. It's to try and see how efficient the cards are. If a 3090 performs the same as a 3080 at the same power limit, then it means a crazy OCer can do a power mod on a 3080 and get it to perform close to a 3090 - which is an interesting prospect. Also, water cooling or air cooling isn't going to make a difference: the same amount of heat still enters the room. In fact, water cooling is more efficient at removing heat from your system, so the heat enters your room quicker. Changing the type of cooling in your PC literally doesn't solve anything.
Something I'd love to see are videos on power scaling on this or GPUs in general. What kind of performance do you get if you power limit it to 300W, 200W, 150W, 75W? At what point does it stop turning on? Is it possible to make a card like this run only on the PCI slot power? What's the best way to lower power consumption, while losing the least performance? With GPUs consuming more and more power, I really want to learn about making them more efficient.
While I don't have the data for all of those scenarios, as someone with a hybrid cooled 3090 FTW3, I can mine (yes, I know) and run three monitors (1080/144, 4k/60, 1080/75) and play any game (usually on my 4k) at the same time. And this is on a mining overclock (+100 core, +1175 mem, 65% power ~290W).
I have built myself several PCs since 2010, and it's quite concerning how my priorities have changed. All I used to care about was price/performance for the CPU and graphics card; wattage wasn't even a factor until I bought a 2080 Ti in 2018, and 250W seemed a tad too high, but I bit anyway. Then I bought a G9 monitor and needed more horsepower, so I went for a 3080 Ti, as I wanted to keep it under 350W no matter what, and ditched the 3090 (350W was still very high for me). Now we are in 500W territory! Nowadays, when looking at graphics cards, the first thing I look at is the wattage, then performance, and lastly price! Also, my room already warms up quite a bit after playing for a few hours, and I'm genuinely terrified of next gen flagship cards.
@@camerontaylor3209 I already invested in a 1000W PSU a while ago, but it boggles the mind to think that, coupled with my 5900X, if I ever want to go next gen flagship, 1000W may be cutting it too close for comfort, and I may need a beefier PSU. Plus, how big are heatsinks gonna get? I'm not a watercooling fan and never have been; it's been air coolers for me all the way. Are we looking at triple-slot, 38-40 cm heatsinks to cool down 500-600W graphics cards? That makes almost all mid towers and quite a number of full tower cases obsolete, and so on. Where this is going, it's looking worrying!
The 3090 Ti was $120 more than the 3080 Ti I'd bought just 2 weeks earlier. Easy trade-in/upgrade at $1200 for the amazing cooling and memory upgrade; the price went back up to $1400 in August 2022, so I got lucky. I couldn't wait for the 40 series due to work. I hope y'all are able to get what you need/want after these few rough years.
If 500W GPUs find their way down to the 3070/3080 level, this will be cause for concern, given rising energy costs and the carbon effect. As a consumer, I couldn't justify 1. the entry cost of buying the GPUs given the current pricing trends and 2. the energy cost and the effect it will have on the planet.
If 5 people power these things on, energy will be depleted on a global scale. If you want to give mother earth the real middle finger you combine it with an i9 12900K.
Even if you were one of those climate change denial people, these cards are shit, because they cost a lot just to use and they nearly melt your computer.
Won't happen. I highly doubt the 4000 series will be even as thirsty as the 3000 series. 8nm Samsung is not the king of the ring; it's more comparable to 12nm TSMC when it comes to efficiency.
@@Squilliam-Fancyson Current rumors put the RTX 4090 in the 450 to 600 watt range. The SXM version of the new H100 draws 700 watts. Big power draw is here; the 7900 XT will also apparently be around 450 watts.
I had a couple of OCP shutdowns with my 3080 FTW3 on my 6yr old EVGA 650w Gold PSU. I upgraded to a 850w Seasonic last year after having the issues, and it's been fine even when running the 450w VBIOS with the card water cooled and OC'd.
Do you think you’d ever include VR figures? For DCS I need raw power from a GPU & seeing these cards running that or something similar would be very helpful to the VR world which really pushes cards.
The only thing I love about this card is the power plug at the back of the board, meaning maybe in the future there will be blower-style server cards, or at least compatible ones.
If you are going to use up four slots, why not just put an 80mm fan blowing the heat out of the PCI slots and fill all four slots with fins? I mean, you already took up the space; might as well expel that heat instead of using a slot height for fans that dump the heat into the case. It might also be quieter overall at that point.
The 3090 Ti: See enclosed instructions for proper and full rectal insertion of card. Liberal use of included sand lubricant is recommended. Nvidia thanks you for your purchase.
Let's hope. I'm worried that Nvidia will MSRP the 4070 at 700 or something crazy. If that's about the going price from AIBs, I could see them pulling that.
I just bought a 3070 Ti and I'm sticking with it. If the price to performance is good for the 4000 series, then I'll consider upgrading, but I'm not getting my hopes up.
Very skeptical that the 4070 will have a much lower price than 1500 dollars. Not to mention we don't know if the 4070 will be worse than the 3090 Ti; it all depends on how well the 40 series does overall. One thing you can know for sure is that 700 dollar graphics cards are a thing of the past...
@@redclaw72666 Water cooling doesn't fix an inefficient space heater part. It just transfers the heat to a radiator, that still ends up in your room. Quit with your bullshit.
The reality is that gamers are very tolerant of extreme power consumption and heat output. Accepting, even. It's psychological reinforcement that they've got an extreme gaming computer.
@@jimtekkit Tbh, haven't high-end gamers mostly always been that way? Before stupid GPUs like this came out it was multi-GPU setups, and before that it was 3dfx making you need a separate PSU for a GPU they never got out, so man, idk.
Well overclocking a pentium 4 is more fun and less risky. It offers the same space heating qualities for a tiny fraction of the cost and more significant clock increases until the system is unstable 😂🤣
I remember complaining about how much room two big cards in SLI/Crossfire take up in the case... now GPUs need the same amount of room by themselves :(
Great video as always! This is another prime example of gross price gouging. I won't buy one out of principle; all these cards are 25-30% overpriced at the MINIMUM. Anyone who needs this... have at it, more for you.
that power consumption is insane, that single card is using as much as my whole current rig. With power pricing atm that would burn your wallet even after buying
Hi. Thanks for the awesome content. I would have to disagree with the comments about the human eye not seeing more than 100 fps. It's still quite noticeable, especially in competitive shooters. Lower frame times make the game feel smoother and more responsive. Even though most players wouldn't play competitively at 4K or even 8K, a lot of professional players would fork out any kind of money to get that small edge.
this is madness, the world is literally falling apart and energy costs at a premium - NVIDIA lets make a GPU that uses more power than a boiling kettle!
Got mine Friday, put it in yesterday and BOOM, Defective. Nice wiggly horizontal lines with black screen and a nice burning smell in my office. RMA bound today!
It's the same generation/architecture; it's not like they're making a 3090++ revised edition. The only thing they can do is add more cores/frequency/memory etc. to make it faster, which will make it draw more power, albeit with the law of diminishing returns hitting a lot harder.
@@acatch22 But it also follows the 2000 to 3000 progression as well. 2080 to 3080 basically went from 200W to 300W with 50% performance increase. So it's been going on longer than just this Ti
@@Schytheron They did the same going from 2080->3080 and those were different architectures. Do you have an excuse for Nvidia there? It's been about 3.5 years without any meaningful change in efficiency
Would love to see how undervolting the 3090 Ti compares to other RTX-30 series card (which don't seem to have a lot of OC headroom without something exotic like LN2 cooling).
Yup, both Nvidia and AMD saw what people were willing to pay scalpers. Their investors and executives are going to start the next gen GPUs at a high price. I think a 4060 will be $429-$499 MSRP but, as you said, will sell for $700 in store from markups. Crazy how the xx60 cards used to be budget; now they are mid-range approaching high-end card prices. I don't even know what the 4080 will cost. That is the card I'm going to buy. If I had to guess, $1000-$1299 MSRP, but selling for $1400-$1700 in store. And these are my in-store markup prices; we all know scalpers will sell even higher, and they will for sure buy all the stock. Scalpers are to blame for these price increases.
And I thought the 280W my old 1070 pulled was a lot. Also, I love the design of this card! Tiny dots in the plastic, curvy "grille", and the nice RGB area with the chrome EVGA badge! Wish more cards looked like this instead of the common flat yet spiky designs.
I'm happy with my 3090. It will live a very long and fruitful life in my machine like my old 1080 did before I transferred it into a server/entertainment center build with the old FX-8350. Also 500 watts? That is insane 😳
Might as well wait for the 4080, which will beat this turd in everything for a third of the price. FE edition dropped 10 minutes ago and they are still in stock. Says everything you need to know about the demand of these.
A third of the price, like $700? You're pretty optimistic; GPU prices go up generation by generation. The 4080 should be $800 at least, if not $850. And the power consumption of course will be in line with 500W, or even bigger.
@Transistor Jump I literally bought 2 1080tis on the month of release. Cheapest AIB cards were $950, all the way up to around $1200 for expensive ones. Apparently you don't live in reality.
Picked up the EVGA 3090 Ti card that was $1,999.99 and I also had $115 in EVGA Bucks to apply towards the purchase. So I jumped on it. Already have a tracking number from EVGA, very excited to play with this bad boy!
Oh boy, a GPU literally only designed for people who panic bought an RTX 3090 and a giant power supply from EVGA within the last 90 days to step up to for like 200 bucks because it's actually cheaper with the memory on the front than buying a water-cooled backplate for the rear memory. Edit: or $437 to step up to the same high tier sku instead of having the option to go for the model that's only $80 more including $132 of sales tax, nevermind I'll just watercool the 3090.
The "Tie" is a nice touch. By the way, I paid $2500 for my 3080 Ti retail from Memory Express, but that's Canadian. I made a mistake, don't judge me. My 1080 Ti was showing its age.
I have to wonder why you just didn't buy a 3080? 🤔 OK, so I can see someone paying for a 3090 Ti because of the 20% uplift in performance, but the 3080 Ti is within margin of error. Big rip. I sold my 3070 for £600 and got a 3080 for £730 :)
I mean, with the 4000 series release creeping up later this year... I would honestly just wait and see how they perform; they will most likely compete with or destroy this card at a lower price point. The 3090 Ti in Denmark is 2750 dollars xD haha
Regarding the whole 8k thing - there may be more use for this than you think. In the sim racing community it's very common to run triple 1440p monitors (6k) and a few try to run triple 4k (12k!). Getting these games/sims to run well at these resolutions is an ongoing challenge. Don't forget 8k VR for similar use cases. Obviously this is a bit of an edge case so your point still mostly stands.
Gamers Nexus used a 1600W power supply to run their tests of the Nvidia 3090 Ti! This is not a criticism of GN. It's just laughing at the absurdity of GPUs that consume 500W of power and have power spikes beyond that! lmao
I would love to see this kind of cooler on a mid-range to enthusiast card... would produce awesome temps while being extremely quiet. without any coil rattle or coil whine this would be heaven for every silent-pc-fan. 500W on its own is just beyond stupid...
TEAR-DOWN is now up! th-cam.com/video/eTFHsRQ-DKM/w-d-xo.html
What are your thoughts on 500W GPUs? Are they worth it if the performance is high enough?
The GN 3D component coaster pack is IN STOCK & SHIPPING as of today. Back orders and new orders are all going out this week! Support us and get a unique, high-quality PC part drink coaster pack here: store.gamersnexus.net/products/3d-coaster-pack-4-component-coasters
Needing air conditioning on to play games doesn't really appeal to me.
Its already bad enough with a 4790k and an OCed gtx 970 pulling 220watts in Australia.
Don't like it and never will... gone are the days of low-TDP devices.
£1879 for the FE card in the UK vs £645 for the 3080... hmmm, let me think... think I'll pass... oh, and 530W... cost of electricity... ouch
@@HitSousouK Remember the 9700Pro needing a berg connection and OMG pulling 40 watts.....
You were supposed to wipe ur bum with those _hundees..._
Waiting for a 5.5 slot, $3000 MSRP 4090 Ti, with direct wall power!
Just look at the "captain workspace" April fools video about that xD
@@StYfReX His videos are amazing, they actually inspired mine
@Leanja 🤣
You probably need a fusion reactor for that.
6090 requires a power plant
Will stores be adding on options for partial payments using vehicles or possibly first-born children? It is already annoying to use prepaid cards.
gpus will soon be a taxable asset.
@@Gadtkaz stonks
Eventually they will be so expensive your bank will need 24 hour confirmation before you can purchase the damn thing.
Not wrong though. Wife and I living the DINK life is basically this.
Can I forego raising the first born child and donate them a testicle? I've got a spare.
You've hit the nail on the head with that comment about knowing the market will just buy the more expensive option if stock is available, that technique clearly sold a boat load of 3090's to people that really only wanted a 3080 but got sick of waiting and watching prices rise
Yup, and honestly, that's why you have to be a little skeptical about all the supposed shortages. Somehow, we can pump out infinite CPUs but GPUs are so limited. Whatever. It's a scam.
@@bobsaget9182 mining cryptocurrency isn't really profitable on CPUs, but it is profitable on these GPUs, so it would make sense why they would be constantly sold out
The Founders Edition dropped 5 hours ago now in the UK and it's still available to buy.
A 3080 FE drop lasts a minute tops. The market here clearly has a problem with the pricing; otherwise they would be long gone.
Exactly! I wouldn't have bought 3080ti cards if I thought I had a chance of getting 3080s
Guilty, needed a new card last winter and the only cards in stock were the GTX 1650 or RTX 3090. As the GTX 1650 is nowhere near the performance I need, the only option was to buy an RTX 3090.
It's a good card, no question about that, but it's absolutely not worth the price for a gaming computer; I would much rather have been able to buy an RTX 3080.
The year is 2027. The new Nvidia 8090 Tie needs 1.21 GigaWatts, and can only be powered by direct lightning strike.
jiggawatts
But when that baby gets up to 88k...you're gonna see some serious shit!
The environmentalists will be OK with that. They don't know about coal powering some of our electricity. 🤣
No no, it will be bundled with a mini fusion reactor for just 200,000 dollars. Stay away from Gigabyte, they may explode.
@@SFTaYZa "Gettin' Jiggy Watts"...they're unstable and prone to slap other components if they notice they're being observed!.
It's been said before but...
Ti now stands for "tiny increase"
Regarding the performance of course, not the price.
or power usage
@@Floturcocantsee and that
You couldn't have summed it up better..... That saying needs to become a thing!!!
Yep... And Tremendous increase (regarding price)
@@bcr003 nice lol, i like that
This is actually brilliant, Nvidia is making such inefficient cards that cryptominers won't buy them anymore. Nvidia thinks about gamers too!
(Hire me Jensen, I can spin bullshit too!)
This rhymes
Don't worry, the 4000 series is coming and Intel is coming; miners will have their low-watt GPUs.
Can't you like undervolt it quite a lot?
You would have to mine with it so you can afford it 😂
@@beboH4o uhh actually 4000 series will be extremely power hungry as amd will be using new chiplet architecture on GPUs. 7700 XT will be 6900 XT level of perf for $400 msrp. Nvidia is worried so they’re going to push a ton of power through their cards to compete. Rumors have top 4090 Ti card at 600W+
I love how, in every single in-depth video, tech Jesus does a full cliff note summary in the first 3 minutes and basically says "you don't really have to watch the rest of the video". The guy just gets that there's 2 types of viewers here and doesn't force the non-tech detail focused viewer to sit through all 30 min to get the info they need. Content creation done right.
Speaks to how well the GN Store and sponsors do.
And regardless, we also get chapters for at-will skipping at any point.
So THIS is why the lights dimmed across NC the past few days!
I’m sure the switchboard in the building is just jammed on.
Saw that too, last night there was a little flicker while I was eating dinner
Naaah ....
You need to call the Ghostbusters HQ to do some investigation. If they found no supernatural phenomenon, then yes. It might be caused by these s'iT 0903 XTR testings.
I got this card for $1000 yesterday.
Beefy boi... have it undervolted to 2100MHz @ 950mV. Auto fan speed keeps it under 60C, with a max fan speed of 72% seen, which was surprisingly quiet. Wattage pulled shows around 385W when undervolted. Overall quite happy.
Oh, and memory temps haven't been above 72C, VRMs around 58C, and a Hot Spot delta of only 8.5C (meaning an almost perfect thermal paste job and mount).
where did you get this for $1000 ?
The industry is going in the wrong direction when it comes to price, power consumption, marketing and a whole other bunch of stuff.
cause intelligent people buy them
Completely true. We should be focused on efficiency rather than irresponsibly high power draw. It would be like if, instead of the electric car trend, manufacturers just started making muscle cars and dragsters: options that cost more and are worse for the environment, yet still serve the same purpose. An efficient, moderately powerful GPU is worlds better than any power-hungry monster.
At some point something will have to give. It won't go well for Nvidia going in this direction.
I really like my 75W 1650, just wish that there was a bigger market for them. Would also love a a 75W card with something like 2060 performance
@@zsookah3 "An efficient, moderately powerful GPU is worlds better" worlds better, and better for the world.
Almost 3x the power draw of a 1080 in a single card. That's something of an achievement!
It is a lot of power haha. It must heat up a room pretty quickly. In comparison my 3600x / 5700xt gaming rig pulls 320w total out of the wall while gaming.
I don't think this is an achievement, it's like putting more fire on your fire. An achievement is when there's a card with the same performance at half the power draw.
@@Golecom2 (that's the joke :) )
1080s were a pretty mundane 180W, with up to 230W power limits on some cards.
@Green Mamba Games because frequency and power consumption don't scale linearly. When you increase frequency, you have to increase voltage as well, and power consumption is calculated like this: power = capacitance * voltage^2 * frequency. In other words, if voltage has to rise roughly in proportion to frequency, power scales roughly with frequency cubed.
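As a quick sketch of that relation (the numbers here are illustrative, not measurements from any real card):

```python
# Dynamic switching power relation quoted above: P = C * V^2 * f.
# If voltage must rise roughly in proportion to frequency, then
# P scales with f * f^2 = f^3 -- the "power ~ frequency cubed" rule of thumb.
def dynamic_power(capacitance, voltage, frequency):
    """Dynamic power: P = C * V^2 * f."""
    return capacitance * voltage**2 * frequency

# Illustrative numbers: a 10% clock bump with a matching 10% voltage bump
base = dynamic_power(1.0, 1.00, 1.00)
overclocked = dynamic_power(1.0, 1.10, 1.10)
print(round(overclocked / base, 3))  # -> 1.331, i.e. roughly +33% power for +10% clocks
```

Real silicon doesn't follow the cube exactly (leakage, voltage floors, binning), but it's a decent intuition for why the last few hundred MHz cost so many watts.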
The neck tie got me good.😂
3090 Tie.
You mean Ti.
@@DeMichel93 woosh
It is somewhat cool that we can see how cooler design can save a card like this from overheating. That GTX 480 comparison makes me want to see one of those older cards with a modern cooler lol.
There was a version of the gtx 480 with the Arctic Accelero Extreme cooler and it worked just fine (cool and quiet). There is a video from someone not too long ago doing a retrospective look at it.
EDIT: Took me a while to find it, since it's from a really small channel, but here it is: th-cam.com/video/tVztmLG2qP4/w-d-xo.html
@@mihakolbanar5494 sweet, ty
@27:45 : that's exactly my situation. No 3080 Tis in stock anywhere, so I upgraded to the 3090 Ti for a couple hundred more. You totally nailed it there. 👍🤓
I remember when a top of the line computer was $2200... Prices are nuts these days
You mean 3 Years ago.
Basically pre-COVID. I thought $1200 for my 2080ti was insane, but these prices are ludicrous.
Lets blame russia 😂
@@wesleybrehm9386 I spent 1200 on a 6700xt in the height of all of it since my gtx 770 died
@@tessierrr Works for me. Blaming COVID is getting tedious.
500 watts is insane! I'm surprised how good the temperatures are for such high power draw. You will really need a super good PSU if you are gonna use this GPU
850W PSU or bigger. I'm glad I bought a 1000W PSU 7 years ago, since all these newer cards are power hungry
Got a 1600w power supply handy? lol
@@randybobandy9828 imagine a 7980XE paired with this gpu
Yeah good point, and a good perspective. Power consumption will surely peak in the next gen? Question is, will it plateau again at 450,550,650W whatever it is, or will it steadily come down again. You could argue we've just come from a 250W era up to say, the 20 series, and we're at a 350W era in the 30 series
@@conza1989 considering the new connector, I hope it doesn't go higher
Kudos to EVGA for engineering such a cooler, but this is getting absurd.
Could you look at the temperatures of the other components of the system in a regular case?
Like, how much would this affect SSD, chipset or CPU cooling?
I'm starting to get very glad I opted for one of GN's airflow/recommended cases despite being hesitant about the dust now. Just in case...
Why? There are once again more review samples than actual units for sale. Real world tests like "how would this work in a normal persons computer" are irrelevant.
A normal person will never have a 3090 Ti.
@@Grimmwoldds Yea, when a full custom loop is cheaper than the gfx card, I don't think anyone that buys this will actually care.
@@bananya6020 concern with dust can easily be fixed with a brush and a vacuum/ blower
At this point swap the cooler for a server rack one and use a fat blower fan on it
This is exactly the kind of content I keep on coming back for. Thanks Steve and all the GN staff for putting out unbiased no bs content.
isn't it nuts that a year+ later the 4080 has 10-20% more performance and runs at less than 310W?
A prelude of what is to come power and price wise with Nvidia GPUs.
Yup, I really hope people won't buy the 4000 series high end cards. There's actually not as much of a need for them performance wise and certainly not for performance per watt.
You want 1440p high refresh rate with most modern games? Or 4K60ing ridiculously modded games? You already got that with 3000's high end. 1440p still remains a sweetspot for high end high refresh rate gaming, so even when budget is not an issue you SHOULDN'T buy the 4000 series purely from a power consumption and cooling perspective - you'll be heating your room like a motherfucker.
I REALLY hope 4000 series flops
Think I'll hold out for the first GPU that's fitted with it's own on-board fusion reactor 🔥
We already have atomic clock PCIe cards. Waiting for nuclear reactor cards
Confusion reactor?
@@GewelReal there is just a tiny little bit of a difference between an atomic clock and a nuclear reactor, which makes inclusion of a reactor onto a PCB a little bit more difficult, I'd say
Wouldn’t that make it a power generator instead of a power consumer?
skip the wall outlet and straight to breaker box.
With energy prices going through the roof, Nvidia are making the most power hungry cards ever.
US electricity prices are stupid low anyway
@@GewelReal BULLSHIT!
Yeah depends a lot on what you pay for electricity. I pay about 0,14€/kWh including tax, so running a 3090 Ti for 5 hours per day for a year would cost me
5h/day x 365days/year x 0,5kW x 0,14€/kWh = 128€/year
Comparing that to my 3080 with a 320W power draw:
5h/day x 365days/year x 0,32kW x 0,14€/kWh = 82€/year
The price difference is kinda negligible at my energy price (luckily I got a flat rate two year contract last summer, and the contract came into effect in October).
I have no idea how much electricity costs in the US or other countries, but you can just plug your own cost of electricity in your currency into that equation and get your own price.
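For anyone who wants to run the same math, here's the calculation from the comment above as a quick sketch (the 0.14 EUR/kWh and 5 h/day figures are that commenter's; plug in your own):

```python
def yearly_cost(watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost for a steady gaming load."""
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

# Figures from the comment above: 0.14 EUR/kWh, 5 hours/day
print(round(yearly_cost(500, 5, 0.14)))  # 3090 Ti at ~500 W -> ~128 EUR/year
print(round(yearly_cost(320, 5, 0.14)))  # 3080 at ~320 W   -> ~82 EUR/year
```

Swap in your local price per kWh and daily hours to get your own number.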
Seriously the UK is having major electric supply issues. Our regulators have increased the cap that energy suppliers can charge consumers. By over 50%. It works out to a £693 per year increase for the average household. So yes, let’s absolutely add even more electricity usage with this monster.
@@GewelReal you obviously don't know what you're talking about. Low in relation to other countries, yes. Low in relation to past energy cost within the country, aaah no, absolutely not. Energy costs are on the rise in North America and the fake tree hugging hippies who want electric cars are making it worse.
I bet you the #1 reason they don't call this card a Titan is that they don't want to bother having to provide a professional driver
Or so they could still release a Titan later. This is Nvidia after all, why cap the SKUs at $2,200 if there's someone who'd pay $2,400 :/
You're telling me Nvidia has insane margins of profit and they don't even bother hiring The Stig!?!? Damn you Jensen!!!!
Just so they can charge 5-10k for the Titan
The titan was no different than original threadripper. Made a semi pro halo product for a niche market and then watered it down for mainstream and beefed it up for pro, killing the niche hybrid card
They dropped the Titan moniker because calling it a xx90 and saying it's the most powerful gaming GPU makes sure it sells to both gamers and creators.
What would we do without you Steve? Bless your heart. We love the "subtle" shade!
This winter the price for a kWh in my country rose a couple 100%. Meanwhile Nvidia releases gpus that need 2x the power of the previous gen flagship...
If rtx 4000 is as power hungry it won't be a compelling product, regardless of performance imo.
Same here in the UK i didnt even think about getting a good 30 series card before our energy prices rose by 60% this April and due to rise the same in October. Petrol is also £2 a Litre now.
@@daveward4358 that's horrifying. I'm not sure if being disabled inn canada is worse or better than having a low wage job in the u.k.
It's not that big of a deal. People make more out of it than it really is. Unless you keep your PC on and loaded 24/7, the difference in the electric bill each month is negligible. There have been hundreds of studies and writeups on this.
@@junkiexl86 You are correct, but prices for food and everything are going up, so this stuff matters when wages are stagnant and the world is going to shit. But yes, people should read articles like that. If you could link one, that would be great.
That said, I need to mine with my PC as much as the weather permits. I am disabled and need the money.
I have a 5800X with 32 MB of L3 cache, which Monero mining likes, and a 6700 XT, for whatever the best thing NiceHash says each day.
I live in Ontario, Canada, where electricity prices are sort of not bad. Alberta has terrible prices, other countries are worse still, and food prices are at record highs; each month they shatter the old high. The Ukraine war has been to blame for much of this, because 30% of the world's wheat comes from Russia and Ukraine.
We don't have strong left-wing politics around the world, so wages are stagnant and corporations use times of crisis to push prices well beyond what inflation demands, and people keep voting in right-wing ghouls who lie and pretend to fight the good fight.
I'm afraid I will have no way of being able to live soon.
I really like the humour in these videos. You get better every week! Great content. You put a tie on it 🤣🤣🤣
EW EMOJI
And I thought the "tie" pronunciation exists because of it being almost tied with the 3090 in some tests :D
I find an absurd lack of LN2 with this GPU. Hopefully Steve can fix that.
One could say you found the lack of LN2 disturbing.
Bearded hardware to the fusion
An overclocking livestream with this card would be cool.
_lights start blinking_
He's putting together splitters to hook up dual 1200W PSUs to power it
Ah yes, the obligatory super expensive release right before the next gen is unleashed that will stomp the crap out of this for much less money.
This would be like if Apple released an iPhone 13 Pro Ti for $2000 a month before releasing the iPhone 14 Pro for $1000 and the 14 Pro stomps the 13 Pro Ti.
Sadly that is not what has happened. The 2080Ti did not stomp the $600 1080Ti for less money. It didn’t even stomp it really. Like plus 22% for 2x the price. Then the 3080 came out but they were $1500-$1800 for a very long time. Lovelace will probably claim +60% faster but actually be +35% and consume prohibitive amounts of power at the high end. Like circuit breakers tripping is going to be a problem at those levels.
Hopefully AMD and intel will produce something that’s meaningful.
@@Mattribute 1080 was a fluke and anything mid pandemic is out the window (not the norm).
Just got a ROG Strix LC 3090 Ti OC for 1200$, i'll be having a glorious time gaming the next year instead of waiting in backorders queue for the mystical 40's that likely won't show up until somewhere in 2023 and actually available at a reasonable price for consumer until maybe Q3 of 2023. Maybe right now is the best time to splurge on a monster card.
The community for sure will find value in power supply testing. These cards are bananas and I'm dreading the heat but also needing a 1000w PSU. Hopefully AMD has a more efficient next gen design....
@Green Mamba Games if you had any money you wouldnt btch about the price
3080TIs were $2000 two weeks ago at microcenter, the market is crazy right now lol
Actually Microcenter still has 3080TIs that are more expensive than the 3090TIs right now haha
these are only 2000
Now easy to buy in the Netherlands for 1250-1400 euro.
@@OfficialUknow the cheapest ones are only 2, some are 2200+
Within just the past few days, 3080s and 3080ts are in stock and available on amazon, sold by amazon and shipped by amazon, at much lower prices. It has been a sudden turn of events!
This card is most likely a concept card, a prototype cooler/PCB for upcoming monstrosity 4000 series or Nvidia's last attempt at cash grab before prices take a plunge.
@Gareth Tucker At 600W power draws? They are likely going to have 600W power draws to work with the new PSU standard. The old coolers would not work with that.
200 iq comment
Yeah I think this is spot on
they don't need to prototype coolers with thousands of products lmao. This is just a cash grab to get the last money out of the miners before the next-gen AMD refresh.
Would you ever consider doing power-normalized testing? Like, how does a 3080 limited to 250W compare to a 3090 Ti also limited to 250W? With power consumption going up and up, I don't want what is practically a space-heater in the room while I'm gaming in warmer months, so I use Afterburner or the like to set the PL to something more reasonable. I've been surprised at how small the impact is on my 3080 FTW3.
I would really like to see this as well
Bump
Bump
Get yourself water-cooling lol, other than that don't buy it, simple as that. It's funny, even next-gen cards are gonna be worse than this, I'm not surprised LMAO. Don't gimp a card that uses more watts, you lose fps
@@redclaw72666 that's really not the point. It's to try and see how efficient the cards are. If a 3090 performs the same as a 3080 at the same power limit, then it means a crazy OCer can do a power mod on a 3080 and get it to perform close to a 3090 - which is an interesting prospect. Also, water cooling or air cooling isn't going to make a difference. The same amount of heat still enters the room. In fact water cooling is more efficient at removing heat from your system, thus more heat will enter your room quicker. Literally isn't solving anything by changing the type of cooling in your PC.
Something I'd love to see are videos on power scaling on this or GPUs in general. What kind of performance do you get if you power limit it to 300W, 200W, 150W, 75W? At what point does it stop turning on? Is it possible to make a card like this run only on the PCI slot power? What's the best way to lower power consumption, while losing the least performance?
With GPUs consuming more and more power, I really want to learn about making them more efficient.
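Until someone actually benchmarks that, here's a toy model of what power limiting might buy you. It's purely an assumption-driven sketch, not measured data: assume performance scales roughly linearly with frequency and power roughly with frequency cubed, as noted elsewhere in the thread.

```python
def perf_at_power_cap(stock_power_w, capped_power_w):
    """Relative performance (1.0 = stock) under a power cap,
    assuming perf ~ f and P ~ f^3 -- a rough rule of thumb, not a benchmark."""
    return (capped_power_w / stock_power_w) ** (1 / 3)

# Toy example: a 450 W card capped to 300 W keeps ~87% of its performance
print(round(perf_at_power_cap(450, 300), 2))
```

Real cards hit voltage floors and fixed overheads (memory, fans, VRM losses), so actual scaling flattens out at the extremes; measured power-scaling data would be the real answer here.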
bump!
While I don't have the data for all of those scenarios, as someone with a hybrid cooled 3090 FTW3, I can mine (yes, I know) and run three monitors (1080/144, 4k/60, 1080/75) and play any game (usually on my 4k) at the same time. And this is on a mining overclock (+100 core, +1175 mem, 65% power ~290W).
Really
You can undervolt it and immediately reduce power consumption by 20% and keep the same performance.
Seriously. Really. How much more can they do and still make sense. What is more amazing is that people will buy ANYTHING!
Here in California we will be getting a monthly stimulus of $1000 for 3 years so yes I dont mind spending 2k on a gpu.
@@La-eh4jk can you pass some of that stimulation over to me?
This gpu alone almost costs as much as my whole system, including a 6900XT.
and dont' forget, 6900xt is projected for peasants.
I got a pre-built with a 3070, and the 3090 Ti costs more than my tower, monitor, peripherals, and the desk lmfao
Same. Costs more than my entire PC and I finally got my hands on a RTX 3080 (last September) after a year on EVGA's waiting list.
My PC cost a few hundred maybe.
Same here. My PC cost that much before covid and it had a 2060.
I have built myself several PCs since 2010 and it's quite concerning how my priorities have changed. All I used to care about was price/performance for the CPU and GFX card, and wattage wasn't even a factor until I bought a 2080 Ti in 2018; 250W seemed a tad too high, but I bit anyway. Then I bought a G9 monitor and needed more horsepower, so I went for a 3080 Ti, as I wanted to keep it under 350W no matter what, and ditched the 3090 (350W was still very high for me). Now we are in 500W territory! Nowadays, when looking at GFX cards, the first thing I look at is the wattage, then performance, and lastly price!
Also, my room already warms up quite a bit after playing for a few hours, and I'm genuinely terrified of next-gen flagship cards.
I was thinking of getting a 1200W psu thinking it would be complete overkill. Now it seems like it'll be the bare minimum.
@@camerontaylor3209 I already invested in a 1000W PSU a while ago, but it boggles the mind to think that coupling it with my 5900X, if I ever want to go next-gen flagship, 1000W may be cutting it too close for comfort and I may need a beefier PSU.
Plus, how big are heatsinks gonna get? I'm not a watercooling fan and never have been; it's been air coolers for me all the way! Are we looking at triple-slot, 38-40 cm heatsinks to cool down 500-600W GFX cards? That makes almost all mid-towers and quite a number of full-tower cases obsolete, and so on. Where this is going, it's looking worrying!
as a businessman I still have some questions: is it possible to run the 3090 Ti in quad SLI? and will this enhance my 1080p Solitaire gaming experience?
One mean game of FreeCell.
Fyi , an average electric heater is 1.5k watts. So no reason to buy 4 , just 3 should be enough.
Dual sli should be enough for an acceptable standard solitaire experience, but you will likely need quad sli if you want to play draw 3.
That should be enough to get 60 fps but 1080p is too high for these cards. You should try 720p
rofl of the day :D
Being patient was key. Just got this brand new for $900 thanks to the 4090 release, crypto crash, obviously, the tail end of 2022. Glad I waited.
The 3090 Ti was $120 up from the 3080 Ti I had bought just 2 weeks earlier. Easy trade-in/upgrade at $1200 for the amazing cooling and memory upgrade; the price went back up to 1400 in August 2022, so I got lucky. I couldn't wait for the 40 series due to work. I hope y'all are able to get what you need/want after these few rough years.
If 500W GPUs find their way down to the 3070/3080 level, this will be cause for concern, given rising energy costs/the carbon effect. As a consumer, I couldn't justify 1. the entry cost of buying the GPUs given the current pricing trends and 2. the energy cost and the effect it will have on the planet.
If 5 people power these things on, energy will be depleted on a global scale.
If you want to give mother earth the real middle finger you combine it with an i9 12900K.
Even if you were one of those climate change denial people, these cards are shit, because they cost a lot just for using them and they nearly melt your computer
@@MerlinErdogmus yeah, but first drive to microcenter in a Tesla
Won't happen. Highly doubt the 4000 series will be even as thirsty as the 3000 series. 8nm Samsung is not the king of the ring, more comparable to 12nm TSMC when it comes to efficiency
@@Squilliam-Fancyson Current rumors put the RTX 4090 in the 450 to 600 watt range. The SXM version of the new H100 draws 700 watts. Big power draw is here; the 7900 XT will also apparently be around 450 watts.
I remember when it was claimed that ATI made space-heaters, oh how the tables have turned...
guess everyone forgot the 500 series
Still not fast enough, think I’m gonna wait for the 3090 Bow or Bolo Ti.
Just wait for the 3090 Tux.
I had a couple of OCP shutdowns with my 3080 FTW3 on my 6yr old EVGA 650w Gold PSU. I upgraded to a 850w Seasonic last year after having the issues, and it's been fine even when running the 450w VBIOS with the card water cooled and OC'd.
Do you think you’d ever include VR figures? For DCS I need raw power from a GPU & seeing these cards running that or something similar would be very helpful to the VR world which really pushes cards.
i'm going to use mine in SLI
Also remember to have a SLI of power supply.
I know this is a joke but SLI is dead
The only thing I love to see about this card is the Power Plug at the back of the Board meaning maybe in the Future there will be Blower style server Cards or at least compatible ones
Pretty soon you’ll need a dedicated 240/220 V outlet to just run your Nvidia gpu’s 3kW power supply.
*Laughs in European*
@@KriLL325783 Indeed, all our outlets are 230 V!
@@KriLL325783 Also laughs in Australian
Remember when you used to be able to buy a super high end video card for under $500? I miss those days.
I love how excited you are in each vid man keep it up
if you are going to use up four slots, why not just an 80mm fan blowing the heat out of the PCI slots and filling all four slots with fins? I mean you already took up the space, might as well expel that heat instead of using a slot height for fans that dump the heat into the case. it might also be quieter overall at that point.
Because desktop owners dont like the whine of a server fan.
Why not have a 6 slot heatsink that extends to the front of the case so that the intake directly cools the GPU? Kinda like the Mac Pro
@@technetium_tech Because the heat pipes transfer almost no heat that far. The heat will still be concentrated around the die.
Release the blowiematrons.
th-cam.com/video/la0_2Kmrr1E/w-d-xo.html
Looking forward to the transient load spike testing with this gpu.
The 3090 Ti: See enclosed instructions for proper and full rectal insertion of card. Liberal use of included sand lubricant is recommended.
Nvidia thanks you for your purchase.
Recieved my Gamers Nexus pint glasses yesterday. Love them!!! Great job on the packaging guys. Thanks for all the info you guys provide for us.
Awesome review for such a time crunch, I’ll patiently wait on the rest of the through GN coverage in the coming week or so
Fantastic work as well. Insane card, insane power draw. Kinda hope we drop back down next generation but I don't think we will
Last money grab from Nvidia, when you can wait for a 4070 at a quarter of the price and match the performance
Let's hope. I'm worried that Nvidia will MSRP the 4070 at 700 or something crazy. If that's about the going price from AIBs, I could see them pulling that.
To put it mildly this is very naive thinking.
That's what we thought with the 3070 and 2080ti... until chaos happened.
I just bought a 3070ti and I’m sticking with it. If the price to performance is good for the 4K series then I’ll consider upgrading but I’m not giving my hopes up
Very skeptical that the 4070 will have a much lower price than $1500 or so. Not to mention we don't know if the 4070 will be worse than the 3090 Ti; it all depends on how well the 40 series does overall. One thing you can know for sure is that $700 graphics cards are a thing of the past....
“We can’t let them beat us! Release the 3090 TI!” - Nvidia
*6950 XT enters chat*
The necktie was amazing. Thank you.
This bad boy is going to render sooo many excel sheets!
You'd need like 10 exhaust fans to actually get that card's heat out of the case...
New GPU! Space heater edition
Or do water-cooling lol, it's pretty simple, you know
@@redclaw72666 Stop spamming your idiotic comments.
@@redclaw72666 Water cooling doesn't fix an inefficient space heater part. It just transfers the heat to a radiator, that still ends up in your room. Quit with your bullshit.
@@ramen645 You joke, but I mine on my 3090 overnight to use it as a cost neutral heater in winter 😅
Nvidia's power consumption has become really stupid. They have become the new Pentium 4.
More like they're going to circle back around to the 400 series, 'cept with electricity instead of heat lmao.
You nailed it.
The reality is that gamers are very tolerant of extreme power consumption and heat output. Accepting, even. It's psychological reinforcement that they've got an extreme gaming computer.
@@jimtekkit Tbh, haven't high-end gamers mostly always been that way? Before stupid GPUs like this came out it was multi-GPU setups, and before that it was 3dfx making you need a separate PSU for a GPU they never got out, so man, idk.
Well overclocking a pentium 4 is more fun and less risky. It offers the same space heating qualities for a tiny fraction of the cost and more significant clock increases until the system is unstable 😂🤣
I remember complaining about how much room two big cards in SLI/Crossfire take up in the case... now GPUs need the same amount of room by themselves :(
Gpu’s that you can’t buy don’t take up that much space tbh
I bought this card last month when stock on [a popular online hardware site] was dropped to $1099. Worth the wait. They are usually sold out.
I’ve officially given up building a gaming PC until I win the lottery
i wonder if this will start making Dual PSU cases/builds more of a thing
Why just 2 PSUs? Why not 3? If you rock an Intel 12900K overclocked and this Nvidia volcano, 2 PSUs might be tight.
For sure
most residential homes have rooms wired to just 1 circuit breaker for all outlets in the room. you're maxing out the breaker at ~1600w
@@BenK12345 if you house catches fire because of the Nvidia GPU you have 2 less things to stress over 🔥
@@BenK12345 Maybe in America; in Europe you can draw 3500 watts out of a normal outlet.
Protip: Lower end business class internet is intended for the customer to get frustrated with the bad upload speeds and upgrade to fiber.
Great video as always!
This is another prime example of gross price gouging.
I won't buy one out of principle, all these cards are 25-30% overpriced at the MINIMUM.
Anyone who needs this...have at it, more for you.
Looks like Steve’s on a bulking cycle! 👊🏼😤👊🏼
That tie at the very start of the video got me good, that was well-done dry humour
that power consumption is insane, that single card is using as much as my whole current rig. With power pricing atm that would burn your wallet even after buying
Hi. Thanks for the awesome content. I would have to disagree with the comments about the human eye not seeing more than 100 fps. It's still quite noticeable, especially in competitive shooters. Lower frame times make the game feel smoother and more responsive. Even though most players wouldn't play competitively at 4K or even 8K, a lot of professional players would fork out any kind of money to get that small edge.
This exactly. In certain games a 25% lift in FPS is an edge some people care about.
this is madness, the world is literally falling apart and energy costs are at a premium, and NVIDIA goes: let's make a GPU that uses more power than a boiling kettle!
SIGN ME UP!
how is that nvidias problem?
Pretty small kettle.
my kettle is almost 2kW but only needs like a minute to boil water
@hey it's pete FDT and your ignorance lmfao.
Got mine Friday, put it in yesterday and BOOM, Defective. Nice wiggly horizontal lines with black screen and a nice burning smell in my office. RMA bound today!
Thank you for reviewing this! My slovenly 3090 filthy casual appreciates recognition 😂. Love these kinds of reviews! ❤ That red tie though...! Nice!
4090 Ti: “Try finger, but 600w”
When you realize you have to buy a new PSU due to the new ATX 3.0 standard: "Try fork in wall socket".
600W for FE cards, try 650W for 3rd party 😁
Some didn’t get the ER reference here
500 W?!? So, they're basically just throwing more power at it rather than making any efficiency improvements
it's the same generation/architecture; it's not like they're making a 3090++ revised edition. The only thing they can do is add more cores/frequency/memory etc. to make it faster, which will make it draw more power, albeit with the law of diminishing returns hitting a lot harder
@@acatch22 But it also follows the 2000 to 3000 progression as well. 2080 to 3080 basically went from 200W to 300W with 50% performance increase.
So it's been going on longer than just this Ti
Of course. It's the same architecture. What did you expect?
Where are these efficiency improvements going to come from? it's not like we can just pull new nodes out of our kiesters anymore. Those days are over.
@@Schytheron They did the same going from 2080->3080 and those were different architectures. Do you have an excuse for Nvidia there? It's been about 3.5 years without any meaningful change in efficiency
Would love to see how undervolting the 3090 Ti compares to other RTX-30 series card (which don't seem to have a lot of OC headroom without something exotic like LN2 cooling).
It's crazy, you can probably lose ~5% of performance for 15-20% lower power consumption...
Sorry if this seems like a stupid question but how do you undervolt?
Lol idk why but I absolutely lost it at the B-roll shot of the "RTX 3090 Tie" 20 seconds in
the cut to you with money stuck on you was hilarious XD
Gpu prices will never be the same again, even if the 4000 series come out expect a 4060 to start at $700 in retail stores
Honestly it wouldn't surprise me at the point.
Yup, both Nvidia and AMD saw what people were willing to pay scalpers. Their investors and executives are going to start the next-gen GPUs at a high price. I think a 4060 will be $429-$499 MSRP but will, as you said, sell for $700 in store from markups. Crazy how the xx60 cards used to be budget; now they are mid-range approaching high-end card prices. I don't even know what the 4080 will cost. That is the card I'm going to buy. If I had to guess, $1000-$1299 MSRP but selling for $1400-$1700 in store. And these are my in-store markup prices; we all know scalpers will sell even higher, and they will for sure buy all the stock. Scalpers are to blame for these price increases.
Only if there is mining demand. Otherwise they won't be able to shift enough of them. There just isn't enough consumer demand at those prices.
@@theholt2ic219 the `xx60` cards were always the entry point to midrange.
Don't buy the top cards then
And I thought the 280W my old 1070 pulled was a lot. Also, I love the design of this card! Tiny dots in the plastic, curvy "grille" and the nice RGB area with the chrome EVGA badge! Wish more cards looked like this instead of the common flat yet spikey designs
How?
My 1080 ti draws only 250w.
@@jesuschristislord77733 i shouldve said 220w lol. it was a zotac amp! card
My gtx 1070 pulls 130w at full load. Mind you I did undervolt it a little bit mainly to reduce temperatures but even at stock it only peaked at 155w.
I'm happy with my 3070. I'll only upgrade if I can get higher performance for a similar wattage at a fair price.
Same here
I went from a 3070 to a 3080 Ti to now a 3090 Ti, very happy with my upgrade 😊
The "tie" on the GPU! Haha, love it!
Totally bought one of these today. It's a long story.
Please consider adding MS Flight Sim 2020 in your game benchmarks?
It's a pretty good increase from what I've seen in MS Flight Sim, upwards of 20fps. For VR that's a big deal.
I'm happy with my 3090. It will live a very long and fruitful life in my machine like my old 1080 did before I transferred it into a server/entertainment center build with the old FX-8350. Also 500 watts? That is insane 😳
Might as well wait for the 4080, which will beat this turd in everything for a third of the price.
FE edition dropped 10 minutes ago and they are still in stock. Says everything you need to know about the demand of these.
A third of the price, like 700$? You're pretty optimistic, GPU prices go up generation by generation, 4080 should be 800$ at least, if not 850$. And the power consumption of course will be in line to 500W or even bigger
@@luckyowl10 An AIB 1080 Ti started at $950 on release; there will never be a top-tier card at $800 ever again
@Transistor Jump I literally bought 2 1080tis on the month of release. Cheapest AIB cards were $950, all the way up to around $1200 for expensive ones. Apparently you don't live in reality.
Bruh you really think nvdia is not gonna hike up prices after the shitshow that was this generation?
@@drunkhusband6257 I got 2 3080s for 740 dollars. That was top tier a year ago
You guys are legends. Keep up the great work.
Picked up the EVGA 3090 Ti card that was $1,999.99 and I also had $115 in EVGA Bucks to apply towards the purchase. So I jumped on it. Already have a tracking number from EVGA, very excited to play with this bad boy!
Oh boy, a GPU literally only designed for people who panic bought an RTX 3090 and a giant power supply from EVGA within the last 90 days to step up to for like 200 bucks because it's actually cheaper with the memory on the front than buying a water-cooled backplate for the rear memory.
Edit: or $437 to step up to the same high tier sku instead of having the option to go for the model that's only $80 more including $132 of sales tax, nevermind I'll just watercool the 3090.
The "Tie" is a nice touch.
By the way, I paid $2500 for my 3080Ti retail from memory express, but thats Canadian.
I made a mistake, don't judge me. My 1080Ti was showing its age.
Judging harshly for swapping out one of the best cards at triple the price.
Have to wonder why you didn't just buy a 3080? 🤔
OK, so I can see someone paying for a 3090 Ti because of the 20% uplift in performance,
but the 3080 Ti is within margin of error.
Big rip.
I sold my 3070 for £600 and got a 3080 for £730 :)
I paid $2200 CAD for my 3090 Strix on Newegg lol. I said go big or go home.
I would have bought it too but I ran out of kidneys to sell.
@@rochester3 I mean, at least you got the top-end model of the time.
This poor chap didn't even get the satisfaction of that :/
I mean, with the 4000 series release creeping up later this year, I would honestly just wait and see how they perform. They will most likely compete with or destroy this card at a lower price point.
The 3090 Ti in Denmark is $2750 xD haha
Yep. I don't see the point in getting this unless you really need a 3090 (i.e. work) and couldn't score an OG one.
Just wait till these things start pulling 1.21 gigawatts ;)
Great review as always!
Regarding the whole 8k thing - there may be more use for this than you think. In the sim racing community it's very common to run triple 1440p monitors (6k) and a few try to run triple 4k (12k!). Getting these games/sims to run well at these resolutions is an ongoing challenge. Don't forget 8k VR for similar use cases. Obviously this is a bit of an edge case so your point still mostly stands.
That's actually a good reason to have it. The 8K monitor market isn't there yet.
Gamers Nexus used a 1600W power supply to run their tests of the Nvidia 3090 Ti! This is not a criticism of GN. It's a reaction to the absurdity of GPUs that consume 500W of power and have power spikes beyond that! lmao
I guess I'll just use the budget I wanted to allocate for my GPU to get DDR5 instead. A 2070 Super still holds up anyway.
I would love to see this kind of cooler on a mid-range to enthusiast card... it would produce awesome temps while being extremely quiet. Without any coil rattle or coil whine, this would be heaven for every silent-PC fan. 500W on its own is just beyond stupid...
I kinda love this idea as an option for those who want it
"The 480s would be proud" haha, as an old PC gamer I found that awesome, Steve!
I can't wait to receive my Gamers Nexus glass and coaster set. Thanks, Steve!