I've speculated since the 2xxx (RTX) launch that Nvidia switched their design philosophy from gaming/rendering to AI (since each gen has a datacenter/AI sister) and then found ways to make the gaming segment benefit from that (raytracing, DLSS, DLSS2+, FrameGen). The massive memory bandwidth heavily benefits the AI segment, as LLMs are often memory constrained, and the AI/datacenter sister to this chip will probably have ~48 to 96 GB of VRAM (wonder if anyone will try a resolder upgrade on their 5090 to see if it auto-recognizes the extra capacity).
I'm glad someone else noticed this too. I've been talking about it for ages. Everyone seems to miss that the pivot happened so long ago. Everything we've gotten since then is Nvidia's attempt to sweeten the deal for gamers. I wouldn't be surprised if Nvidia stops making gaming GPUs at some point. Every 5090 they sell for $2,000 is an insane opportunity cost for them. That silicon could be going into $20,000 GPUs.
The RT and Tensor cores only take around 10% of the die space in total. A small price to pay for all of the benefits. I'd take that over 10% extra CUDA cores any day. There are so many uses for both inside and outside of gaming.
The 5090 is less than 1% of the market for regular users (more for professionals). About 95% of users will look at the 5060-5070, or the 9070 XT depending on price, and 2-3% will look at the 5080.
Dude that thing looks AMAZING. I'm a total founders edition man, they're built like tanks, they look better than every other card on the market (except for some select MSI ones that have a really nice build) and they hold their value really well. Not to mention the cooling will probably be amazing for that wattage.
This design is really good: it pushes all the hot air up towards the case fans, which then extract it out the top of the case. It's a good design for the GPU.
I have spent a few days trying different settings. I have to say: if you have an RTX 4090, especially one that overclocks well, keep it. For example, the Asus gaming OC RTX 4090 lists 2565 MHz stock on the box but boosts to 2790 MHz stock, and with a small OC it hits 3000 MHz+ core / +1200 mem very easily while staying below 65 C. That corresponds to a 25-50% higher frequency than the RTX 5090's 2010 MHz base / 2410 MHz boost. 4K performance was no better than a 2910 MHz core RTX 4090, and be aware that the extra heat from the ~200 W increase meant the test system's CPU overclock and XMP settings had to be reduced. That is all. Keep your 4090s with the 3.65-slot cooler and ~200 W lower draw. If you have a 4070 Ti or below, it would be a nice upgrade.
It is a ridiculous price, but then again if it's anything like the 4090 it's pretty much a risk free purchase, since you could just sell it in 2-3 years for literally what you paid for it apparently, basically a free rental.
@@frallorfrallor3410 Why do I buy high end? To retain the value of my dollar. You can get at least 50% back, inflation-free, while other cards just piss money away; you lose that $300-600 with almost nothing to recoup the cost.
Don't forget improvements compound when measured this way. Gains between series (3 to 4, or 4 to 5) get distorted by percentages because the baseline keeps increasing. In nominal terms, a 30% improvement from series 4 to 5 can be far larger than a 50% gain from 3 to 4. Say series 3 to 4 is a 100% increase in FLOPs, from 10 up to 20. If series 4 to 5 is only a 50% increase, that takes you from 20 to 30: a difference of 10, the exact same uplift as before, but one that appears much worse because it's now "only" 50%. It's actually an identical nominal uplift. Expecting to maintain that sort of relative growth every generation is a bit much. Not intending to defend GPU makers as such, just trying to put things in perspective, which many people don't.
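The arithmetic in the comment above can be sketched in a few lines; the FLOPs figures (10, 20, 30) are the commenter's illustrative numbers, not real GPU specs:

```python
# Identical nominal uplift looks worse when expressed as a percentage
# of a growing baseline.
gen3, gen4, gen5 = 10, 20, 30  # illustrative FLOPs from the comment

pct_3_to_4 = (gen4 - gen3) / gen3 * 100  # 100.0 (% gain, gen 3 -> 4)
pct_4_to_5 = (gen5 - gen4) / gen4 * 100  # 50.0  (% gain, gen 4 -> 5)

abs_3_to_4 = gen4 - gen3  # 10 units
abs_4_to_5 = gen5 - gen4  # 10 units: the exact same nominal uplift

print(pct_3_to_4, pct_4_to_5, abs_3_to_4, abs_4_to_5)  # 100.0 50.0 10 10
```

Same absolute gain, half the headline percentage, which is the whole point being made.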
You do have to wonder though, is the 5090 being under baked firstly for AIBs and secondly for a 5090ti? The 4090 didn't really have anywhere to go with the Ti so they ditched it.
I am just wondering: if AI-generated frames depend on rasterized performance, how much room for development will there be if rasterized performance gains keep slowing down? You can only apply this AI gimmick so much until you hit its limits; it can't create frames out of thin air, unless hallucinations count.
It's been said many times: they're nearing the limits of what you can physically push through the silicon. Game developers are going to have to get their asses in gear and start optimizing their games. If every developer tested their games on 1060s, games would run a lot better. Rasterized performance is also CPU limited. Again, physical limitations: the CPUs can't crank out enough frames, and neither can the GPUs, so the only way forward is to optimize, optimize, optimize (which is what DLSS and frame generation are doing).
@ I mean if you're already doing frame generation, and then adding stuff on top of it, yeah I don't know but we'll see what Nvidia comes up with for the 60 series as we run out of ways to make these cards faster.
I really hope we get some benchmarks for 3D work and rendering ;). Nobody talks about us artists trying to drop render times instead of getting those frames up lol.
Everyone is talking about increased latency due to frame generation but I rarely hear anyone talking about the advancements with Reflex to counteract that. BTW I hope your grandma is feeling better.
I want to see the maximum power draw and temperature readings of the connector on that card. Melting connectors are definitely on people's minds with this card.
Prices will most definitely not be MSRP; they never were and never will be. The 4090 on launch was going for €2,200 here and now sits on shelves for €2,950... 5090s will cost €3,000 on release for partner models, and the Founders Edition is super hard to get; they're always out of stock and only possible to buy in like 2 or 3 countries in the whole EU.
At the same time, you probably won't benefit from these features in games for another 2-3 years, so is it really a big deal for maybe a 5-10% raw fps increase?
Test it with VR, dude, if you have one. That's where my thinking lies: trying to push frames to a VR headset is not easy, not for the full sweet-spot effect anyway.
Why is no one talking about the bus? They jumped from a 384-bit to a 512-bit bus. That is massive; the 128-bit increase alone is about as wide as many whole cards' buses, but no one is talking about it.
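For scale, here is the rough bandwidth math behind that bus-width jump. The 512-bit bus and 28 Gbps GDDR7 figures for the 5090 were the rumored specs at the time, so treat them as assumptions; the 4090 numbers are its published specs:

```python
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_bits / 8 * data_rate_gbps

bw_4090 = bandwidth_gb_s(384, 21.0)  # 4090: 384-bit GDDR6X @ 21 Gbps -> 1008 GB/s
bw_5090 = bandwidth_gb_s(512, 28.0)  # rumored 5090: 512-bit GDDR7 @ 28 Gbps -> 1792 GB/s
print(bw_4090, bw_5090)  # 1008.0 1792.0
```

Under those assumptions, the wider bus plus GDDR7 gives roughly a 78% bandwidth jump, far bigger than the raw core-count increase.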
@ We've yet to see MFG's full breakdown; dunno, seems too early to call LSFG bad. Plus, maybe it's a me thing, but I actually prefer using LSFG in games that don't have FG; it makes full use of my monitor's refresh rate in AAAs.
Don't be surprised if the 5090 is not as fast as you thought it would be compared to the 4090. Nvidia has absolutely no competition at the top so why should it be. Expect the 60XX series to follow the same path unless something drastic happens within the next 2 years.
That's one thing most forget to talk about: top-end cards dump a huge amount of heat into the room. There's an obvious desire for great gaming performance, but you also need to be comfortable in the room while gaming for hours at a time. A card dumping out 575+ watts, along with the rest of the PC parts, makes for an uncomfortable session unless you have an air conditioner right in the room with you, with a temperature sensor in that same room as well.
I disagree with the sentiment that you need a 200hz monitor to "feel it." I bet there are a lot of great videos out there that show that fps "feel" and refresh rate aren't so apples to apples. A lot of people still notice 60hz with 200 fps+ on a game like Counterstrike
Here is the thing with the newer GPUs: a lot more focus is being put on AI. And I don't just mean AI for playing games, even though DLSS 4 is essentially AI for games; I also mean AI as in running LLMs on your own computer. Newer Nvidia GPUs are no longer just for gaming, they're for productivity as well, which is why, if you are a gamer, the gains between generations will probably become smaller and smaller. The 3090 to 4090 performance difference was crazy though, so hopefully we get more of that in the future.
4090: you get more VRAM, it's slightly faster than the 5080, and you get the bigger GPU die. You get all the DLSS 4 features besides MFG, which might still come to the 40 series; if not, you can just buy Lossless Scaling and use 4x mode if you want a boost in performance.
Usually I don't care about boxes... but man that box looks like a bummer. Part of the joy was the smell and the opening of it. Imagine spending $2k USD (probably more tbh) and getting... that.
It will make a difference but not as much as people think, Most coolers are oriented with their fins perpendicular to the flow from the gpu and drawing in air from the sides. If you're really particular about it you could 3d print an air duct to make sure they never intersect. But it would have been cool if there were 2 versions of this card, one with an inverted cooler. If you use an inverted cooler with an inverted layout you could have it blow all the hot air out the top, but you would have to make sure to have a good intake fan setup to feed the case with fresh air.
@@samuelnavarro8044 That's like buying a lamborghini and having it delivered to you in a truck covered in bird poo everywhere. It gives a bad impression.
@ again, we don’t pay for the 📦 we literally throw that away right after. Just get me my 5090, and call it a day. Look at other reviews, it’s durable packaging, and gets the job done. Then thrown straight to the trash.
i upgraded to pg32ucdm in anticipation for these cards but honestly, the 4090 runs 4k just fine, which i honestly did not expect at all. i will probably just sit this out and get the next gen, just gotta pray this 4090 survives another 2-3 years
@@josenogueira410 Like it or not, the game has shifted this way. DLSS, FSR, XeSS: it's a tech race now. AMD's FSR 4 vs Nvidia's DLSS 4, multi frame gen, Reflex 2, the transformer model, PLUS GDDR7. Don't get me wrong, I love AMD's last gens (6750 XT, 7900 GRE), but now? The technology behind it matters so much. Idk how the 50 series will perform in real benchmarks, but tbh it looks promising. AMD needs to give us GDDR7 at the very least to stay in the game.
Not a fan of frame gen, never used it on my 4090. It is not the way it is meant to be played anymore by nvidia imo. DLSS is great though and sometimes it is necessary. Anyway skipping this generation.
That's so weird that you hate frame gen using AI but not DLSS, which also uses AI. What about raw GPU performance at 4K? Why need the upscale all of a sudden? I'm not surprised you're one of those people that joined the mob against Nvidia when they dropped DLSS 1, chastising them for it being unnecessary garbage, but here we are now ~
Who gave you a card lol? How does Nvidia just give cards to everyone? Like, sign an NDA and here's a 5090, keep it, it's yours; test a few games, make it look good or whatever.
I would bet when the waterblock kits for this come out, they'll need gigantic gauge tubing, or ludicrous pump speeds. AND possibly a hybrid block system.
Since it's so new it's hard to find... Was wondering how this holds up against the 7900 xtx 24gb. That's what I currently have and want to upgrade to 5090. Just got a new job and wanted to treat myself, just wanted to know if it's worth it. Thanks! Also I hope your grandma gets better soon!
This 5090 FE is an engineering marvel! That center PCB is just so hot! All this makes me think that maybe we are at a crossroads now where we should really get the GPU (which is in fact a math co-processor) out of the PC and give it its own enclosure/case. I don't know: maybe that PCIe slot should be on the back of the motherboard now, with the GPU alone in its own area? The motherboard could be more centralized in the case and spread things on both sides. Just spouting this out loud without much thinking, but I would prefer that to what we have been doing for the last 40 years. Or maybe the SoC way (like Apple, AMD and now NVIDIA with their Digits mini-computer) is how things should be? Or maybe, just MAYBE, it is time for case manufacturers to add baffles to redirect heat outside the case. OEMs like Dell/Lenovo/HP have been doing this for years in their workstation/high-end cases; why can't we have it elsewhere? This is not rocket science and would prevent uselessly cooking the inside of our "$10k rigs".
The whole "PCB in the center" thing, with the PCIe port and the video outputs on extension cables, makes me wonder how the hell custom waterblocks would work 🤔 Potentially the card could be much smaller than with the air cooler if the waterblock relocates the main PCB closer to the slot bracket (since the PCIe slot is no longer a positioning limitation).
Especially with how critical and negative he has been of Nvidia, yet they still gave him a test unit, as if to magically make him change his mind 180 degrees.
these comments bouta be crazy
When embargo on test end?
If it runs hot it makes me way more curious about the big old classic AIB models. I still have faith in the big man
Bro, Nvidia doesn't let anyone benchmark them yet, and that says a lot. People are unboxing them like it's gold; truth is it's only 25-30 percent faster than a 4090. The RTX 5070 as powerful as a 4090? More like a 4070 Ti. Multi frame gen is just another 3 fake frames with no new game data. Meaning no one with a 4090 should upgrade. I'm sticking with my 4070 Ti.
@@zhimanooka I believe it was 24th Jan for 5090 and 30th for 5080.
You know what else is crazy?
I'm wishing your grandma all the best. I hope she gets well
I'm not entirely sure, but that may have been a joke ... a play on the "my grandma died" homework excuse. Just from the way he delivered the line.
Best wishes if it wasn't a joke.
Maybe try plugging the 5090 into her to boost her sickness resistance.
(this is a joke, my gma is also sick as well)
she's fine
No one cares
Bro just discovered grams. Wait until he finds out about kilograms
when he discovers the metric system
sshh. that's a secret. don't let our Eastern Philosophies seep into the west
I'm already pleased that he's using grams and not pounds ^^
what the hell is that
4090 and 5090 have one thing in common. Both are out of my wallet's reach
And he's showing you why... sell one, give one to an influencer. Oh, who's paying?
The more you buy the more you save
I could buy it but it'd be too much of a hit for my bank account than I am willing to pay. A purchase like that wouldn't feel worth it for me.
Let's face it, that's the real complaint, isn't it? It's a shame so many Tubers use the absolute donkey's dangler of a card, because it can only generate a sense of need in the audience. It distorts folks' expectations and leaves them feeling short-changed, inadequate, whatever: they are daily confronted with knowing there is a better experience, if only...
I will buy it since it won't make a dent in my bank account
I love that PC people are always spreading leaks and rumors about the GPU until they have it in their hands, then the embargo mutes them lol. It's kind of funny. Makes me believe manufacturers are in control of the leaks and that there's no such thing as real leaks.
imagine entertaining “leakers” in the first place
In other news: water is wet and the sky is blue.
I'm glad your naivety cherry got popped a bit. Wait till you find out about all the other consumerism BS you're constantly being lied to about so you buy, buy, buy.
That's a half truth lmao ~
Most of the stuff about this was correct though.
@@eduardomartin8510 lmao where??
33% raw performance increase, with 33% more cores, 28% more power draw, a 25% official price increase, plus 2.5 years of waiting. But is there any alternative out there? For a long time, no. Can you combine the power of two high-end cards while gaming, like in the past, instead of this nonsense marginal increase? Again, no. End result: desperation and stagnation at its best. Definitely not worth it.
What do you recommend for a first-timer? Sure, I wouldn't upgrade if I had a PC already, but I don't! Buy old gen? Wait for this gen?
@@piotr78 Buy used 40 series: either a 4070S for 1440p or a 4080 for 4K and you're good (maybe wait for the new gen to do that though; AMD might be cooking too, but they're mysterious about it).
@@piotr78 wait and see the benchmarks of the 5080. beyond that it depends on if you are wanting to game in 4k and with however many settings turned on/up. gotta buy in at some point though.
@wymanbartlett4648 Would rather play at 1440p with higher settings as opposed to 4k lower. Gonna start watching the second hand market around here
@@piotr78 A 3080 or 3080 Ti is still very powerful.
Vex got given a 5090??? He's moving up in the world now!
The world is doomed
Yeah nvidia are trying to buy them all out lol.
@@armesisp3201 I could be wrong, but I heard that every reviewer has to send them back to Nvidia after making their reviews.
@@armesisp3201 *Bought in case you never watched the video.*
@@adarion2994 He is saying Nvidia is buying out all the YouTubers to control what is being said about the GPU.
Guess who was right? VEX was right; the prices are exactly what your source in Australia said. We already have leaked prices from a store in Spain: the 5080 starts at €1,500+, up to €2,300, and the 5090 starts at €2,500.
the real world pricing was right, the MSRP calculation was off, but nobody's getting them at MSRP anyway
@@iamspencerx VEX's prices were never for the FE but for AIB cards. To give you an idea, the Asus TUF 5080 costs €2,200+.
Fuck that shit, I'm not spending more than 1k for a GPU for gaming.
@@iamspencerx Speak for yourself, because I'm getting it at MSRP like I always do, and people say this every gen 😎
€2,500 in my country is gonna feel like 5,000 XD; over 10k in local currency, where the monthly salary is 3,500 net. I'm waiting to spend my 6k on a new RX instead, f this shit.
Hoping this power connector doesn't have any problems like last year - also hoping this new cooler redesign brings the temps way down!
Get the 12V- 2X6 cable. Not the shit from nvidia!
There never WERE any problems with the connector; every single recorded case was user error. Yes, newer 12VHPWR cables added features to make mistakes less likely, like the clicking pin that gives you auditory feedback when it's fully seated and keeps it from moving. Either way, the main issue was people bending the adapter around.
Melting problem was user error. Users didn't put the connector deep enough on the GPU (CLICK!)
Funny part is in well over 20 years of dealing with PCs (and 7 of them working as a system integrator, aka pc-slapper-together with all the support nightmares that come with that field), I've never seen _ANY_ regular PCIe power plug melt or burn up even if not 100 percent plugged in. Never. Even on cards requiring 3 of the damn things (3090). This only started with this garbo connector
@@ppeez That's BS, the connector was flawed from the get-go, and it's been proven: too much power through wires/connectors that are too small, plus a much smaller safety margin on the cable.
I know everyone likes to shill Nvidia as the best thing ever, but they really f*ck3d up on that connector; even cables that had been plugged in for a year-plus were catching fire.
Hopefully they actually did fix the flaws so that we don't have more fire hazards on our hands.
If the selling point of a $2,000+ card is DLSS 4, which isn't even adopted in all games, wouldn't it almost make more sense to spend way less on a lower card and spend €7 on something like Lossless Scaling, which works with every game, or even movies you watch?
Dlss is definitely better than lossless scaling, for sure. Dlss integrates directly with the game, while lossless is an external program.
Yes, plus with dlss 3 you are already getting up to 100% more frames, which is very substantial.
While lossless scaling is a good program for those with old gpus it is nowhere near dlss xd
dlss 4 will be supported in all games that have dlss 3. You will be able to configure it through the Nvidia App even if the game didn't have an update with dlss 4
@@fabriperoconalgomasytodojunto Nah you actually need a good GPU for LSFG too.
"I can't show you any perfomance".
Which is a huge red flag. If there is nothing to hide, why hide it? We're just 9 days from release, and they should be encouraging customers with proven FPS increases for that price.
With the number of 5090 FE unbox videos I saw, I’m convinced that 900 out of the 1000 initial batch of FE cards are given to these reviewer/youtubers
Gotta hype up the BS
Then the official launch will be scalped
Most of them have to be sent back and then they get rotated to smaller and smaller youtubers for free nvidia clout.
Everyone who has one keeps talking endlessly about the size and the card's cooling, but why is it that NO-ONE says a word about the acoustics of this "small form factor" two-slot solution? Did Nvidia prohibit ANY comments about noise levels before the embargo? I could understand not giving dB numbers, but it would be useful to have an assessment of how annoying the sound is compared to other high-power solutions. It's not great that it fits in small cases if it sounds like it's about to take flight.
Because without enough cooling the GPU won't work well but it can be as loud as possible and still function. Do you have your priorities straight? The majority aren't as sensitive to volume. Put on some headphones and adjust your gain.
Because until the embargo lifts, nothing can be said about the actual card's operation or third-party testing results.
"Feels like a hairdryer". Bet it sounds like one too
I think I'm most interested in the 5070ti, lol. The 4070 ti super has perfectly fine raster for 4k, so with a decent pump to performance, especially raytracing and AI upscaling, I think it'll be the best all-around card
5070 Ti is not that much better than a 4070 Ti Super, there's a chance it has even worse rasterization
@cvd1 But it will be no contest with MFG enabled.
@cvd1 Benchmarks out already? Where?
@@MatthewSimm Nvidia benchmarks on their own website, there are breakdowns too
@@cvd1 those are against the 4070 ti. But honestly, I'll be waiting for the youtube reviews. I'm also not upgrading from the 4070ti super, I'm doing a fresh build 😀
Probably won't upgrade til the 7000 series or higher. Having a 4090 in this AI WAR mess is nice; just knowing I'll be safe for a couple more years until I have to turn settings down.
Same boat. I'd say a good time to upgrade is when the PlayStation 6 is released, so 2028 or 2029. By then it will be the RTX 7090, which will be more powerful than the PlayStation 6, because we know consoles use older PC tech; most likely the PlayStation 6 will have around RTX 6070 performance.
@@msg360 Consoles using old PC tech has not been true for a while. Consoles always incorporate tech that is in line with what modern PCs are using at the time.
People 2 years ago: I would wait for 5000 series or higher rather than upgrading to 4000 series.
@@actualyoungsoo You could say that for any generation. The people who skipped the 40 series are the ones with 20 and 30 series cards who are upgrading now. 40 series owners are waiting 1-2 more generations before upgrading, so 6090s or 7090s.
I just realised... 575 W of power from a 12 V rail is approx. 48 A of current. Time to develop different power inputs for GPUs; 48 A is crazy and inefficient in terms of how much conducting material (copper wires, connectors, PCB traces) you need. A 48 V rail with a 12 A input would be more ideal and safer.
Because it's a proper room heater now. 575W is no joke; all that heat will end up in the user's room. A lot of people only talk about the reference design cooler and how great the fins look, but in the end all that heat ends up outside your case. I don't expect much from undervolting either, as the base wattage is already too high
it's a fire hazard like the 4090 is.
@@supremeboy Not the point. The point is resistance in the cables and power efficiency of a 48A connection. I hadn't even thought about this yet, actually an interesting point
@@ppeez That's the standard that determines the limit. Currently it's 600W, but heat is already a problem that I'm sure a lot of reviewers won't mention. They mostly look at the temps, how the cooler handles it, and how silent it can stay. There's no way to deny the TDP. The 4090 was 450W default; 125W more is an insane increase for one electronic item.
@@supremeboy Ok, and? It's not like most people are going to get too hot in their room while gaming xD
Wow. That power usage is equivalent to an air conditioner of around 0.75 hp
"can't talk about the performance" after trying out the card is nuts
They have an NDA. The NDA lifts this coming Thursday. It gives them a few days to test without immediately jumping the gun on a review. It's a good thing
Performance difference is unsurprising - 3090 to 4090 was a new node, 4090 to 5090 is the same node, just more cores
Node increases don’t do quite as much anymore. A 4090 had nearly 60% more cuda cores over a 3090 and a 25% increase in clock speeds.
@@devinmurray5280 Very true, still a percentage attributable to the node but those other factors you mentioned are far more important.
@@devinmurray5280 Clock increases aren't even half the story when discussing uplift, mate. The 4090 was on average, what, like 60% faster than the 3090? That wasn't from clock speeds.
The 3090 and 4090 dies were basically the same size, meaning every square mm on the 4090 was packed with more cores.
I've speculated since the 2xxx (RTX) launch that Nvidia switched their design philosophy from gaming/rendering to AI (since each gen has a datacenter/AI sister) and then found ways to make the gaming segment benefit from that (Raytracing, DLSS, DLSS2+, FrameGen).
The massive memory bandwidth heavily benefits the AI segment, as LLMs are often memory constrained, and the AI/data center sister to this chip will probably have ~48 to 96GB of VRAM (wonder if anyone will try to resolder-upgrade their 5090s to see if it auto-recognizes the extra capacity).
Only high end benefits, fck the rest
I'm glad someone else noticed this too. I've been talking about it for ages. Everyone seems to miss that the pivot happened so long ago. Everything we've gotten since then is Nvidia's attempt to sweeten the deal for gamers.
I wouldn't be surprised if Nvidia stops making gaming GPUs at some point. Every 5090 they sell for $2000 is an insane opportunity cost for them. That silicon could be going into $20,000 GPUs.
The RT and Tensor cores only take around 10% of the die space in total. A small price to pay for all of the benefits. I'd take that over 10% extra CUDA cores any day. There are so many uses for both inside and outside of gaming.
The 5090 is less than 1% of the market for regular users (more for professionals). 95% of users will look at the 5060-5070, or the 9070 XT depending on price, and 2-3% will look at the 5080.
If the 5080 had 20GB of VRAM, I would buy it immediately.
@@Rakanay_Official Wait a few months until 3GB modules hit mass production. There's already a 5080 with 24GB of VRAM coming.
that sound at 1:20 scared me😂
lol i dropped something when i was talking
1:20
Glass PCB ?? 🤣
@@Rocker696 🙏
@@vextakes man the timing tho! great video vex!
Dude that thing looks AMAZING. I'm a total Founders Edition man; they're built like tanks, they look better than every other card on the market (except for some select MSI ones that have a really nice build), and they hold their value really well. Not to mention the cooling will probably be amazing for that wattage.
With great power comes great responsibility.
This design is really good. It pushes all the hot air up towards the fans, which then extract it out the top of the case. It's a good design for the GPU.
Uh, that's how they've been for three gens now lmao. When has it ever pushed air down or the opposite way??
All 600w straight to CPU cooler.
Get a riser cable and mount the gpu to the water heater at home.
I’m guessing a cpu air cooler as intake blowing from left to right would be ideal
@@NahBNah The flow-through is only on one side; the air on the side near the HDMI and DisplayPorts has to make a 90 degree bend to get out
performance per watt is complete stagnation
I have spent a few days trying different settings. I have to say, if you have an RTX 4090, especially one that is able to overclock: for example, the Asus Gaming OC RTX 4090 defaults to 2565MHz on the box but boosts stock to 2790MHz, and with a small OC it goes to 3000MHz+ core / +1200 mem very easily and stays below 65C. That's a 25-50% higher frequency than the RTX 5090's 2010MHz base / 2410MHz boost.
4K performance was no better than a 2910MHz-core RTX 4090, and be aware that the extra heat from the 200W increase made the test system's CPU and RAM need their overclock and XMP settings reduced.
That is all. Keep your 4090s with the 3.65-slot cooler and 200W lower power draw.
If you have a 4070 ti and below it would be a nice upgrade
If you have a 4070Ti almost 0% chance you’re in the market for a 5090.
I’m staying with my 4080 Super but that VRAM amount is rough tho.
How did you get a Founders Edition?
It's only a 125 watt increase over a 4090.
@@msg360 Probably a couple years ago on eBay.
It is a ridiculous price, but then again if it's anything like the 4090 it's pretty much a risk free purchase, since you could just sell it in 2-3 years for literally what you paid for it apparently, basically a free rental.
A high-end GPU is a smart investment if you plan to fund your next-gen high end
@@frallorfrallor3410 Why do I buy high end? To save my dollars and retain their value. You can at least get 50% back, inflation free, while other cards are just pissing money away; you lose that $300-600 with almost nothing to recoup the cost.
I love your style. No silly jokes or unnecessary video editing with funny pictures inserted.
Box and cable, looks smart!
4 in the morning and that sound of glass dropping just made me fking jump.
Still no free support bracket?
Time to upgrade from my GT 1030 🔥
to a 1050, its all we can afford XD
Don't forget improvements compound when measured as you do.
Improvements in series (from 3-4 or 4-5) are distorted by using percentages because the baseline is increasing. In nominal terms a 30% improvement from 4-5 can be far larger than a 50% gain from 3-4.
You can have a 100% increase in FLOPs (say) from series 3 to 4 being 10 up to 20 (for instance). And if series 4-5 is only a 50% increase, that takes you from 20 to 30 - a difference of 10, the exact same uplift as previously but one that appears much worse than the previous series upgrade (because it's now "only" 50%). But it's actually an identical uplift.
Expecting to maintain that sort of growth each generation is a bit much.
Not intending to defend GPU makers as such, rather to just help put things in perspective, which people don't.
You do have to wonder though: is the 5090 being underbaked, firstly for AIBs and secondly for a 5090 Ti? The 4090 didn't really have anywhere to go with the Ti, so they ditched it.
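The nominal-vs-percentage point above is easy to demonstrate in a few lines (the FLOPs figures are the hypothetical 10/20/30 numbers from the comment, not real specs):

```python
# Hypothetical FLOPs per generation, taken from the comment above: 10 -> 20 -> 30.
gens = {"series 3": 10, "series 4": 20, "series 5": 30}

def uplift(old, new):
    """Return (absolute gain, percentage gain) between two generations."""
    return new - old, 100.0 * (new - old) / old

abs_34, pct_34 = uplift(gens["series 3"], gens["series 4"])
abs_45, pct_45 = uplift(gens["series 4"], gens["series 5"])

# Identical absolute uplift, but the percentage halves as the baseline grows.
print(abs_34, pct_34)  # 10 100.0
print(abs_45, pct_45)  # 10 50.0
```

Same +10 each generation, yet the headline number drops from "+100%" to "+50%" purely because the denominator grew, which is the distortion the comment is pointing at.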
I am just wondering: if AI-generated frames depend on rasterized performance, how much development will there be in the future if the development of rasterized performance is slowing down? You can only apply this AI gimmick so much until you hit its limits; it can't create frames out of thin air, unless hallucinations count
It's been said many times: they're nearing the limits of what you can physically push through the silicon. Game developers are going to have to get their asses into gear and get going on optimizing their games.
If every game developer produced their games using 1060s as their testing hardware games would be running a lot better.
Rasterized performance is also CPU limited. Again, physical limitations: the CPUs can't crank out enough frames, and neither can the GPUs, so the only way forward in the end is to optimize, optimize, optimize (which is what DLSS and frame generation are doing)
Maybe AI can be implemented during rasterisation (or even pre-) and ease that burden?
@@CurtOntheRadio Probably not, because literally ANYTHING that runs machine learning code runs on the GPU, which would choke the GPU further.
@ But it might choke it less than otherwise. Meh, we'll have to see.
@ I mean if you're already doing frame generation, and then adding stuff on top of it, yeah I don't know but we'll see what Nvidia comes up with for the 60 series as we run out of ways to make these cards faster.
Everyone hated on these cards and now oh we love it lol happens every gen.
nah speak for yourself lol, I hate this thing already. Overpriced garbage relying on ass tech to inflate performance numbers
hivemind
there is a difference between hating the GPU and hating its pricing. Every GPU can be good, at the right price.
Manchildren crying and being drama queens per usual
Like clockwork ~
Core info: 4:42 so you have not tried the card, as you cannot provide numbers. Skip the vid
Dude, watch the first minute, or any other 5090 video. They aren't allowed to show benchmarks because of an embargo
@@natiboy7705 The title is the major problem; this is equivalent to false advertising.
0:35 you know what else is massive?
LO…….
@@maincharacter2677LO?
I hope your grandma gets well soon, thanks for the video
wait you use metric system? u arent in US?
tru, more cos grams are more precise for electronics (ounces kinda suck, but pounds are great)
@@vextakes based american
@vextakes The rest of the world say "thank you". Please don't change this.
@Finnishmanni i don't think any of us use ounces lol
he is evolving
I really hope we get some benchmarks for 3D work and rendering ;). Nobody talk about us artist trying to drop render times instead of getting those frames up lol.
hopefully they keep this FE design for 6090 and tone it down to 450w or lower.
Everyone is talking about increased latency due to frame generation but I rarely hear anyone talking about the advancements with Reflex to counteract that. BTW I hope your grandma is feeling better.
I want to see the maximum power draw and temperature readings of the connector on that card. Melting connectors are definitely on people's minds with this card.
Yeah, if there is a 0.1 mm gap it will burn, and it's automatically considered user fault 😂
@@havocking9224 If there is, the card will not work. 12V-2x6 has shorter sense pins; if they are not fully seated, your card will not function.
Prices will most definitely not be MSRP; they never were and never will be. The 4090 on launch was going for 2200eur here and now sits on shelves for 2950eur... 5090s will cost 3000eur on release for partner models, and the Founders Edition is super hard to get; they are always out of stock and only possible to buy in like 2 or 3 countries in the whole EU.
Nvidia tends not to make enough of these high end cards for the demand so prices always shoot up.
Didn't know you are now getting samples. Grats dude, you've moved up.
The 5090 is going to be hard asf to get. Probably get one in two years lol
At the same time, you probably will not benefit from these features for 2-3 years in video games, so is it really a big deal for maybe a 5-10% raw fps increase?
Lol you believe the leaks. SMH
@ I agree. Im leaning towards waiting it out longer. My 3080 is still doing great so no rush here!
Two years we will get the 6090 😂😂😂
@@justhomas83 It seems Nvidia has moved to a 28-month cadence, so over 2 years, more like 2.5 years now smh
So proud of you bro, you've come a long way on YouTube.
patch 5000series: 20-30% buff to power speed price, enjoy
Test it with VR, dude, if you have one; that's where my thinking lies. Trying to push frames on a VR headset is not easy, not for the full sweet-spot effect anyway.
4090 owner here and a 30% uplift isn't worth 2K USD unless I can find a buyer for my MSI Suprim Liquid X. 🙄
Ill buy it
How much?
I am willing to buy it
Why is no one talking about the bus? They jumped from 384-bit to 512-bit. That is massive; most cards have a total bus width around the amount this INCREASED by, but no one is talking about it.
Multi frame generation vs Lossless Scaling frame generation at x3????? That'd be a cool comparison
Lossless Scaling is dogshit in comparison, but it would be nice to see the proof.
@ We've yet to see MFG's full breakdown; dunno, seems too early to call LSFG bad. Plus, maybe it's a me thing, but I actually prefer using LSFG in games that don't have FG; it makes full use of my monitor's refresh rate in AAAs
3:50 wait that's Risk of Rain 2 music!
Ending the segment on rasterization and ray tracing with "dunno lol" didn't really make you seem credible. 4:56 in this clip to be exact.
Do you get to keep it?
Im gonna try to get one at launch. I just built a blackout 9800x3d this would look perfect inside the case 👌
Don't be surprised if the 5090 is not as fast as you thought it would be compared to the 4090.
Nvidia has absolutely no competition at the top so why should it be. Expect the 60XX series to follow the same path unless something drastic happens within the next 2 years.
That's one thing most forget to talk about: the fact that top-end cards dump so much heat into the room. There is a desire for great gaming performance, obviously, but you also need to be comfortable in the room while gaming for hours at a time. Cards dumping out 575+ watts, along with the rest of the PC's parts, make for an uncomfortable gaming session unless you have an air conditioner right in the room with you, with its temperature sensor in the same room as well
Hmmm. Try the UK? :D
Perfect in the winter
I live in Texas with 110 degree summers using a 350watt card, I don't notice it at all
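The heat numbers in this thread are just a unit conversion away; for scale (the "rest of system" wattage below is an assumed figure, not a measurement):

```python
# Unit conversion: how much heat a gaming PC dumps into the room.
# Electrical power in = heat out, so watts convert directly.
W_PER_HP = 745.7          # watts per mechanical horsepower
BTU_HR_PER_W = 3.412142   # BTU/hr per watt

def to_btu_hr(watts):
    """Convert a heat output in watts to BTU per hour."""
    return watts * BTU_HR_PER_W

gpu_w = 575.0               # 5090 rated board power
system_w = gpu_w + 225.0    # assumed CPU + rest of system

print(round(gpu_w / W_PER_HP, 2))   # 0.77  (hp, matching the AC comparison above)
print(round(to_btu_hr(gpu_w)))      # 1962  (BTU/hr from the GPU alone)
print(round(to_btu_hr(system_w)))   # 2730  (BTU/hr for the whole box)
```

Roughly 2000 BTU/hr from the card alone is about half the rating of a small window air conditioner, running continuously while you game.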
I hope your grandma gets better soon
Got my ASUS ROG OC 4090 for £1500 :) Very happy, dream GPU :)
AMD fanboys not gonna like this VEX
I don’t think anyone can be a nvidia fanboy anymore lol. And I’ve owned like 11 Nvidia cards in the last decade. They’re turning into old day Intel.
I disagree with the sentiment that you need a 200hz monitor to "feel it." I bet there are a lot of great videos out there that show that fps "feel" and refresh rate aren't so apples to apples. A lot of people still notice 60hz with 200 fps+ on a game like Counterstrike
Thank you so much for making these videos that show the real performance of the new GPUs
Here is the thing with the newer GPUs: a lot more focus is being put on AI. And I don't mean just using AI for playing games, even though DLSS 4 is essentially AI for games; I also mean AI as in running LLMs on your own computer. Newer Nvidia GPUs are no longer just for gaming. They are for productivity as well, which is why, if you are a gamer, the gains between each generation will probably keep getting smaller. The 3090 to 4090 performance difference was crazy though, so hopefully we get more of that in the future.
4090 or 5080, which should I buy if they're the same price?
4090, but don't worry, the 5080 won't be available either, and the 4090 will still destroy it in raster
5080 is slower than 4090, and there are rumors that MFG will be available for the 40 series too.
4090
4090. You get more VRAM, it's slightly faster than the 5080, and you get the full GPU die. You get all the features of DLSS 4 besides MFG, which might still come to the 40 series; if not, you can just buy Lossless Scaling and use 4x if you want a boost in performance.
depends what for?
Yes, yes. Very nice. Now let's see the real performance. Without fake frames.
Usually I don't care about boxes... but man that box looks like a bummer. Part of the joy was the smell and the opening of it. Imagine spending $2k USD (probably more tbh) and getting... that.
imagine being you
Indeed, much cheaper products have superior packaging compared to this mess.
how bros gonna feel after he needs to give nvidia their card back
i know whos NOT paying 2k for a gpu .......................me, ill take a look at that 9070 / 5070 though
i hope your grandma gets better soon
Dodging embargo breaking allegations :)
Also, that heatsink glazing is nice, but what about ALL THAT HOT AIR GOING INTO THE INTAKE OF THE CPU COOLER?
It will make a difference, but not as much as people think. Most coolers are oriented with their fins perpendicular to the flow from the GPU, drawing in air from the sides. If you're really particular about it, you could 3D print an air duct to make sure they never intersect. But it would have been cool if there were 2 versions of this card, one with an inverted cooler. With an inverted cooler on an inverted layout you could have it blow all the hot air out the top, but you would have to make sure to have a good intake fan setup to feed the case with fresh air.
He's deffo broken embargo by saying he's tried it and it gets really hot... as hot as a hairdryer, etc.
Love your vids, thank you. I have to stay informed because I'll be helping a friend build a whole new system.
That box is so unappealing 😂
You’re talking about a product you can’t afford. We don’t buy these GPUs for the box btw. Focus on AMD
@@samuelnavarro8044 That's like buying a Lamborghini and having it delivered to you in a truck covered in bird poo. It gives a bad impression.
@ Bro thinks I couldn't afford it lmao, when I have a setup worth twice the card's worth. Sit down, boy
@ Again, we don't pay for the 📦; we literally throw that away right after. Just get me my 5090 and call it a day. Look at other reviews: it's durable packaging and gets the job done, then goes straight in the trash.
@ Wouldn't you want it to be appealing at first reveal?? The Goodwill boxing ain't it
i upgraded to the PG32UCDM in anticipation of these cards, but honestly the 4090 runs 4K just fine, which i honestly did not expect at all. I will probably just sit this out and get the next gen; just gotta pray this 4090 survives another 2-3 years
The AMD to Nvidia conversion has begun
You mean upscaled with AI.
@@tmsphere Yup, works for me.
Why downgrade?
@@josenogueira410 Like it or not, the game has shifted this way. DLSS, FSR, XeSS: it's just about tech races now.
AMD with FSR 4 vs Nvidia DLSS 4, multi gen, Reflex 2, the transformer model, PLUS GDDR7.
Don't get me wrong, I loved AMD last gen (6750 XT, 7900 GRE), but now? The technology behind it matters so much.
idk how the 50 series is going to perform in real benchmarks, but tbh it looks promising. AMD needs to give us GDDR7 at the very least to stay in the game.
May I know why you didn't show the gameplay?
Not a fan of frame gen; never used it on my 4090. It's no longer "the way it's meant to be played" by Nvidia, imo. DLSS is great though, and sometimes it is necessary. Anyway, skipping this generation.
That's so weird, you hate frame gen using AI but not DLSS, which also uses AI?
What about raw GPU performance at 4K? Why the need for upscaling all of a sudden?
I'm not surprised you're one of those people that joined the mob against Nvidia when they dropped DLSS 1, chastising them for it being unnecessary garbage, but here we are now ~
@bankaimaster999 It's trash; it makes all the new games look worse in motion than ancient games.
Who gave you a card lol? How does Nvidia give cards out to just anyone? Like, sign an NDA and here's a 5090, keep it, it's yours, test a few games, make it look better or whatever.
0:41 You know what else is massive?
😬
Your ankles. 😉
Yo mama's ....
My moooom!
Oh I can't wait for this one
Congrats on making it, mate!
I would bet when the waterblock kits for this come out, they'll need gigantic gauge tubing, or ludicrous pump speeds.
AND possibly a hybrid block system.
oh yeah, water blocks will be wild for this one cos of the connectors. Might as well just get another model
@vextakes Let's see der8auer, LTT, GN & Kingpin all compete to overclock this on water, LN2 or something WILD.
Snooze
Man im definitely camping for the 5090 getting my hand warmers together now
@NahBNah I'm pretty sure if you snooze, you lose.
But that's just the proverb.
Since it's so new, it's hard to find... I was wondering how this holds up against the 7900 XTX 24GB. That's what I currently have, and I want to upgrade to the 5090. Just got a new job and wanted to treat myself; just wanted to know if it's worth it. Thanks! Also I hope your grandma gets better soon!
All these youtubers making videos on this crap. Who cares about the 5090, bro? 99.9% of people can't afford that garbage
It's an intentional marketing strategy. The 5090 is what's going to sell Nvidia's cheaper models. It's the halo-product effect.
Sorry to hear about your grandmother. I will pray for her health tonight.
I will shit myself if you glaze this product after doom posting about for weeks straight.
This 5090 FE is an engineering marvel! That center PCB is just so hot!
All this makes me think that maybe we are at a crossroads now, where we should really get the GPU (which is in fact a math co-processor) out of the PC and give it its own enclosure/case.
I don't know: maybe that PCIe slot should be on the back of the motherboard now, with the GPU alone in its own area? The motherboard could be more centralized in the case and spread things on both sides. Just spouting this out loud here without much thinking, but I would prefer that to what we have been doing for the last 40 years.
Or, maybe the SoC way (like Apple, AMD and now NVIDIA with their Digit mini-computer) is how things should be?
Or maybe, just MAYBE, it is time for case manufacturers to add baffles to redirect heat outside the case? OEMs like Dell/Lenovo/HP/etc. have been doing this for years in their workstation/high-end cases, so why can't we have this elsewhere? This is not rocket science and would prevent uselessly cooking the inside of our "$10k rigs".
1 view in 17 secs, bro fell off hard
Sounds like you fell earlier
Nah uh
shut
@@71janas he must have hit his head when he fell if he thinks that joke is still funny.
@pieflies either that, or he was denied air in the first minute after he was born.
crazy how these gpu's use nearly 5x the power of my entire pc
The 20 series was peak design from Nvidia and no one can say otherwise 🗿
thanks for weighing all of those cards! Sorry to hear about your grandma. Your cpu testing idea is great!
The whole "PCB in the center" thing with the PCIe port and the video outputs on extension cables makes me wonder how the hell custom waterblocks would work 🤔
Potentially the card could be much smaller than with the air cooler, if the waterblock relocates the main PCB closer to the slot bracket (since the PCIe slot is no longer a positioning limitation)
A block that the whole PCB screws onto, or a sandwich design
Amazing video, thank you Vex.
Why not try out a few LLMs?
0:40 I giggled
Loowwww tapperrr fadeeee
Damnnn you seriously got to try the 5090? Damnnnn..
Especially with how critical and negative he has been about Nvidia, yet they still gave him a test unit, to magically try to make him change his mind 180 degrees.
Congrats on your level up 🔝
$2,800 is more than likely the price of this card.
And you know what else is massive...