PLEASE NOTE: there have been many mentions of the power consumption and how efficient the card can be when undervolted. I totally understand this, but for this video I was testing at STOCK settings. I know it's hard to believe because most people on this channel are enthusiasts, but people run stock way more often than OC/undervolted. Also, the all-over-the-place mixed comments on this tell me it requires a totally separate video. My personal card starts to really tank in performance BAD around a 75-80% power limit; there are people in the comments saying drop it to 60%, and I think that's bad advice, since everyone has a different use case. Thanks everyone. Also, this video is about this card and this card only, not comparing or contrasting the 7900XTX/3090/3080 Ti etc.
Turns out the secret to me enjoying PC gaming as much as console gaming was to have a card where I could just turn everything to ultra and start playing without ever bothering to tune my settings.
Honestly that's the single reason PC gaming isn't for everyone. You often have to tweak settings instead of just running it without thinking. That's why consoles will always have a place for gaming.
C'mon my guy, kids reading this could start doing the same and launch their games with the dumbest tech in video games after loot boxes: motion blur! Everything on ultra, motion sickness included, from a setting designed to mask 20 fps dips.
PC gaming was always better; it was just that most people couldn't afford it, mostly younger adults and teens. That's why I play PC and not console. Even a $1k gaming PC would beat any console by miles; the current PS5 and Xbox tech is on par with an AMD 5700 GPU that came out back in 2019. With a PC you can always upgrade your GPU and get a massive FPS and visual boost, easily 30-80 percent, and I'm talking about low-end to mid-range cards. Not only that, the texture detail and lighting effects can improve by something like 120-290 percent depending on the card. Just compare something like a 6700 XT versus the 5700: it's a 40 percent increase in FPS, texture detail is a 70 percent increase, and it renders images 112 percent faster, and this is a $310 GPU that sits between low end and mid-range.
The 13700K is not enough CPU for a 4090. But then, neither is the 7800X3D or the 14900K. The 4090 is awesome, but if the sword is too heavy to properly swing, what good is it?
I bought a 1080 Ti with an i7-7700K back 7 years ago or something, and that's how I plan to purchase when it comes to the 5090. Even if it's $2,500 and lasts 5 years, that's $500/year spent to damn near guarantee I won't have to address upgrading anytime soon. I have a motto: "buy once, cry once." And it rarely lets me down.
I have a 1080 Ti, too, and I think I'll hold on for the RTX 6080 or RDNA 5 cards... the 5090 will only be 3x the 1080 Ti for 3x the TDP, and I don't want a 1,000-watt computer.
And the maintenance needed on these cards, lol. It definitely feels like you can't just plug and play with a 4090; there's seemingly so much that goes into even running it. Of course it runs great, but man, the stuff you have to monitor on a 4090 is wild (in comparison).
In Europe, PC gaming is in massive decline because of overpriced GPUs. Since 2019, PC gaming has lost over 20 million gamers to consoles. We upgraded every generation to the 70-class cards, which were around 400-450 euro (with good cooling). Now a good 4070 with good cooling costs around 700 euro, which is nearly double the price. These price increases are hitting every sector, especially the low-end entry-level segment, which accounts for the largest share of the gaming market. They literally killed the casual mass market on PC. In the long run, this can't be good.
I simply refuse to pay more than 500-600 EUR for a good GPU. Still using my GTX 1070 from years ago (got it in the first 3 months after launch). I've upgraded every other component in my PC over the last 2-3 years. Now I see the RX 7800 XT from AMD and I will probably make the switch; I just wanted more raw performance and VRAM from both companies the last 2 generations.
Well, as Europeans we just don't make that amount of money. Here in the U.S. my wife and I make close to $200k annually; in Germany we would make less than half that, if that. Here I can afford a 4090. In Europe I couldn't.
@@pandaonvodka6676 I played on my GTX 1060 for many years up until a few weeks ago. The card still holds up pretty well at 1080p as long as you don't play the latest games on the highest settings (and if you forget that RT exists, of course). I feel like playing on ultra settings brings a minuscule visual improvement anyway, though. I upgraded mostly for work reasons.
A friend of mine said he was considering a sex change. Uhhh, okay! Who knew; turns out he needed the money for a 4090. I casually told him he has two kidneys, just sell one of those. I was on the phone with him, and the look on my wife's face as she tried to figure out whether it was serious or not was gold.
It's 2 years old; better to wait for the 5000 series. No point buying it now and regretting it when you see the 5000 series. A flagship GPU should be bought when it's released; that's the only way to justify the price!
Worst time to buy one: 2 years old AND the price increased... it used to cost $2,300 and now it's $3,500 in my country after all the sanctions and shit. Plus the 5090 is close, so might as well get that on release. That's what I did when I got my 4090: just bought it on release and built a whole PC around it.
@skilletbakes420 yea I did get my 4090 for $1,100 on Facebook Marketplace. The 5000 series will probably be a worthy upgrade. Still, the 4090 has at least 4 more years of decent performance.
I went from a GTX 680, which I used for 7 years, to a GTX 1660 Super that I used for 3 years, and currently I have an RTX 4080 because I saw a deal that was just a no-brainer. Combined with the R7 7700X and 32GB of DDR5-6000 RAM, this 4080 is just... wow. I can play every game I ever wanted to play on max settings at high FPS. I couldn't even imagine the power of a 4090. 😅 So yeah, nobody really needs a 4090.
I have a 7900XTX and a 5800X3D and can play 95 percent of games at max settings at 144+ FPS at 4K. No need to upgrade anytime soon, and if I wanna play in 4K I just hook it up to the TV.
@@grandcanyon2 Having the XTX is like having the "better" 3090 Ti: it will beat the 4080 in rasterization and at least compete with Ampere in terms of RT. Get the FSR3 mod and it becomes actually playable in 4K. Having 120 FPS in 4K + RT maxed would be nice, but even 75 is progress given the incredible beauty; in games like Callisto Protocol you don't necessarily need 120.
I pushed the 12V connector on my 4090 in like it owed me money. The 12V 2x 8-pin cable that came with my Corsair is a very tight fit with my ROG Strix 4090. There is zero lateral movement in the connection with that cable.
Sadly that's not true. With the 12VHPWR-to-PCIe adapters in particular, I don't feel a click. In fact, I think on my FE card you don't feel a click with any cable. When I got an ASUS 4090 TUF you could feel it, though still barely. It's the nature of the clip being so tiny; there isn't enough force behind it or enough mass to make a proper click. When I moved to official Corsair cables it was slightly better and definitely feels more secure, without the weight of the adapter pulling the connector at an unhealthy angle. I feel making the cables on the adapters longer would have been a good idea to reduce the strain. Yes, I am sure mine were fully seated every time.
I think we're getting to the point of diminishing returns as these cards become gargantuan. A new series of cards every two years doesn't allow enough time to shrink the die sizes. Pretty soon we're all going to be buying cases according to how large the GPU is.
100% agree on the timeline; it would be nice to see a 3-year cycle, give us a little more time. Seems like AMD might be doing just that this year, ironically. Nvidia is making too much money to want to stop. I really hope the cards don't get much bigger too 😬
@@yjzep9922 They were certainly less eventful back then too. They'd just put out releases randomly. Since the 700 series it's been a lot more "formal" or predictable.
That's already the case. Besides, what does die size have to do with card size? I can see die sizes shrinking considerably compared to card size. There are no diminishing returns; you get what you pay for, and in the case of the 4090, a lot more.
Yeah, certainly with VR headsets like the Pimax Crystal Super coming. I'm personally using an RTX 4090 + Pico 5 right now, but I will upgrade to a higher-end VR headset when the RTX 6090 is on the market. Hopefully that will be powerful enough.
In MSFS I can get 90fps in VR on ultra but that's with the reverb G2. Not sure how my performance would be affected if I went to the Crystal Light or something with 2880x2880 pixels per eye. But I bet I could still max it out on any game aside from MSFS which is the most demanding VR game that I know of.
First video I've seen of yours. You have a very natural, relaxed air to your way. Very pleasant to listen to and to watch. Also, the subject matter and pacing is great. Thank you!
Needed it to run a 7680x2160 monitor for sim racing. I don't feel lucky or blessed lol, I work my ass off, and spent 2 decades being working poor before my endeavors paid off enough that I can easily buy a $2400 gpu (white rog strix...) in my mid 40s... I also drive a mid tier pickup with 200k miles. Buddy of mine spent more on Taylor Swift tix for his wife and daughter lol.... and that's 1 night.
It's funny you mention the pickup; I almost left a bit in describing the 2013 Honda Fit I drive. The video was just getting too long. All I will say is, hey man... whatever makes you happy. PCs are, if anything, kind of an affordable hobby when you start to look at car parts, camera gear, etc. Appreciate the comment.
@@techwandoUS I have a relatively pricey sim racing setup... I have about 15k in it including the PC. That might buy me tires and brakes for one real race season lol... and I mentioned the truck because I have friends who buy a new premium truck every 3 years, have a Corvette, and a Harley lol. So when they give me crap for how much I spend to "play games", I kinda laugh.
Anyone thinking about the future of video cards might be interested in the fact that motherboards are coming out with the power delivery in the PCIe slot.
Owning a high-end card (7900XTX) for the first time, I definitely developed thoughts about GPU maintenance and longevity ("I invested so much, maybe I should change the thermal pads at some point" etc.). The 4090 is in its own league, but making such an investment comes with the expectation of an issue-free, high-quality experience in every way and at least 5 years of use. The reality is obviously different, and I feel like the past 2 gens of high-end cards are more experiments than established, well-tested products. That's why I went all in and got the Nitro+ top-of-the-line model, just to minimize any quality issues. So far it's been amazing, but not everyone's GPU works well in their system. That said, this is a very nice review! I had to do my own (fairly brief) research and adjustments to make my 7900XTX work without issues (so far), and I hope you can enjoy your 4090 for as long as possible.
Appreciate the well-written comment. Happy to hear you are having good experiences with the 7900XTX; lately I hear nothing but good from the owners of that card. As for your concern with thermal pads, honestly, if the cooler design is good and they are using decent pads (which you can use a GPU teardown site/video to check), then I'd say leave the pads alone for a long time and just replace the paste the second your warranty is up. The only time I was personally changing pads often was on the FE 30-series cards, because for some weird reason they decided to use really bad, almost paper-like thermal pads.
@@techwandoUS Of course. I've seen some "low quality" pads on other top models and decided to go for Sapphire just to make sure. But since top GPUs seem to stay very cool (UV+OC'd mine: 50°C, 65°C hotspot), maybe maintenance is less of a problem, unless temps start rising unreasonably high. Thanks for your insight.
@@Greenalex89 I have the Sapphire MBA; it easily goes to 115°C, 90°C on the memory, under load. It is likely the thermal issue. The performance is still incredible, and so far I have undervolted it. I plan to RMA it in October, or whenever I find a cheap 6700 XT, in hopes of a refund; worst case would be a custom XTX. Best case, I hold out and buy a used 4090 by the new year. You can be happy with those customs, they are much better.
@@voyagerdeepspaceexploratio5023 Don't you mean the AMD reference (MBA) 7900XTX? Because that one is known to have thermal issues with high hotspots, since a few thousand of those didn't undergo quality control before hitting the market. AMD refunded those without issues. When it comes to thermals I would pick the Nitro+ or the XFX version of the 7900XTX.
The evolution of a “4K” card inevitably becoming antiquated due to increased requirements at that resolution is a great point I haven’t heard anywhere else. Easy sub w bell noti’s.
Funny, considering the 3000 series was made on a shit process that runs hot, uses tons of power, and degrades fast. Samsung's trash node; it was cheap though. For Nvidia, that is.
Been so busy I haven't been able to watch any content. I'm so happy that you are still going strong. Your channel is such a good watch, very informative and entertaining.
Just built a new PC a month and a half ago: 14900K, 32GB DDR5, 2TB Samsung 990; a beast system. But I "cheaped out" on a 4070 Ti Super instead of a 4090. The price difference would have been $1,300 for hardly any significant performance gain. I haven't shelled out the cash for a 4K monitor yet, soooo... at 1440p I can max settings in any game and get buttery smooth frame rates. The 4090 was in my budget, but $1,300 extra simply isn't justifiable. I'm very happy with my decision. I can use that money to upgrade to a 5090 less than a year from now instead.
@@Daud438 It's okay for a top end card to be expensive. $2K may be too much for gaming, but there are still professional users doing stuff like video editing. And I don't think NVidia could really avoid the insane power draw and consequent physical size of the card. But the power connector should not be a fire hazard.
Out of all the cards I own and have owned, I love the 4090. It's so powerful but also quite efficient; however, that connector is going to be something that will more than likely devalue these cards or make them a headache for every owner. I got my TUF 4090 day one, and last week, after 18 months of use, the connector melted on it. I had to document and show it because it melted on both sides, GPU and PSU. The PSU side got fused and couldn't be unplugged. I work with high- and low-voltage electricity and always plugged mine in fully. I think what happens over time, and this is probably unavoidable, is that due to the high current going through the wires the plug gets warm and cold, warm and cold, and the thin pins expand and contract, increasing resistance in the connection, and heat, until it escalates one day. I've heard every theory from user error to plugs being rated for only 20 or 30 insertions per cable. It's just a bad design, period. I like the idea of one plug to rule them all, but any replacement of an already established, proven method should be an improvement, and this connector isn't one over the standard 8-pin. I've abused those connectors, plugging and unplugging them so much without any issues ever, whereas with this one it's almost a given it will one day fail, and now I can't leave my computer to render and walk away. That's a huge side effect imo.
@@Zwank36 Were you able to get a replacement relatively hassle-free? I had an extended warranty with my local MC, so they just gave me a new one. Got another TUF because I like the build quality and especially the cooler.
100% spot on about the long term value. It's a shame you had to go through that, and I have also been saying the same thing. It feels inevitable. It will probably happen to a lot of them as time goes on. Probably right out of their warranty window 😅
What we really need to do is tell Nvidia to stop being so greedy and sabotaging their own products (already overpriced) so you have to buy another. Why hasn't anyone put the blame on Nvidia yet? This is clearly a problem that they created to make more sales.
@@madboi1591 Not to mention people like this will pay $2,500 for a 5090... making the whole market, for all of us... and themselves... even more insane.
"almost like we're test subjects for what's coming" This is how literally all of us felt when doom 3 came out, and again when crysis came out. It was remarkably more common back in the day to have to update your entire system if you wanted the good frame rate for video games
I've owned a 4090 TUF OC for a year and your points are spot on. It's a powerful card that the average consumer just doesn't really need; I've had to convince friends not to buy one simply because they didn't need it. I had to get a replacement as I was having issues with the driver crashing, and thankfully my RMA experience was fast and great: they sent me a factory-new replacement within just under a week of diagnosing the issues with my card. During the downtime of not having the card I had to use my 3080 Ti, and it just felt so different; once you get your hands on something like it, it's just hard to get off it.
It is just too expensive for a video game card. But otherwise, you quickly get used to worse if necessary. I survived the two "GPU-less" months in 2022 with an R9 Fury, the cheapest option for some performance at the time. It could still play old games flawlessly in 4K; in BF4 the HDMI port was actually more of a bottleneck than the performance. You only buy a current-gen high-end card to play AAA titles in 4K, RT, 120 Hz and up.
I still remember when I upgraded from a GTX 1060 6GB to an RTX 4090. It used to take almost a minute to generate AI images, but after the upgrade, it took only a second. I was in shock, and I'm still amazed by such speeds.
Undervolting is the thing to do, and I'm not being conservative. Running on a 750W PSU too. UV at 0.95V is the key to stock or better performance at 75-100 fewer watts, so 300-350W max. OC the memory by 800-1000 and do the curve OC; there are a few videos on it. Very simple, very effective. You can go as low as 0.9V without much loss, so probably 250-275W, and it's still insane, maybe a 5-8% loss. My card never breaks 60°C at 50% fan speed. TUF non-OC.
@@tyrellwreleck4226 I'm well aware of the power draw, and I always have software running to check it. Corsair HX750i; the PSU fan doesn't even come on most sessions, no joke. It's totally fine... total system power draw is below 600W even at stock with no UV, running Cyberpunk or a similar game... It could probably run this way non-stop for a decade. Sure, I wouldn't buy a new 750W PSU when I replace this one (it's probably 8 years old or older)... I'll go with 850W minimum, but might just go to 1,000W to have plenty of headroom; I like the idea of the fan never coming on. But I think in general most people are overcautious and/or worry about stuff that will never be an issue. I also used a cheap 3-to-1 8-pin to 16-pin connector from Amazon, though I did upgrade to a Corsair 2-to-1 cable, so it only plugs into two 8-pins on the PSU... it can still pull around 500-550W in testing, and that is close to the max PSU rating, but I wouldn't ever run it this hard. Again, this GPU is so damn efficient and loses so little performance capping voltage at 0.95V. Max is 350-400W tops this way, and it's very close to or better than stock, of course with the memory OC and curve OC. It runs about 30-45 MHz lower than at stock voltage.
@@LaneBatman-c2v That's not the point he was trying to make. There is a reason why PSUs are rated around 80%; that's the peak efficiency. Any PSU running above 80% utilization can have issues. Even without heat, capacitors can still blow when used at or close to 100% for long periods of time. Every PSU should have around 20% overhead capacity for your system and the way you use it; that's just how electronics work. This is especially true when you consider that if most components in your system fail, the system just emergency-shuts-down without damaging anything, but a failed PSU usually takes components down with it. And you will really kick yourself when your pricey GPU dies because of a blown PSU that has been run close to 100% for too long.
@@ronniebots9225 Nah, it's fine. It might run in the 500-550W range even at stock, closer to 400W with the UV; that's total power, not just GPU. This is under 80% in both cases... hmm... 600W is 80%.
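For readers following this power-draw back-and-forth, here is a minimal monitoring sketch in Python rather than a recommendation. It assumes the nvidia-ml-py package (pip install nvidia-ml-py) and an NVIDIA driver; the PSU wattage, the rest-of-system estimate, and the 80% rule of thumb are lifted straight from the thread above, so adjust them to your own build.

```python
# Minimal power/temperature monitor; the constants below are assumptions, not advice.
import time
import pynvml

PSU_WATTS = 750              # the HX750i mentioned above
REST_OF_SYSTEM_WATTS = 150   # rough guess for CPU/board/fans; change for your build
HEADROOM_FRACTION = 0.80     # the "keep a PSU at or under ~80%" rule of thumb

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    for _ in range(10):  # sample once a second for ~10 seconds
        gpu_watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        est_system = gpu_watts + REST_OF_SYSTEM_WATTS
        status = "over 80% of PSU" if est_system > PSU_WATTS * HEADROOM_FRACTION else "ok"
        print(f"GPU {gpu_watts:6.1f} W  {temp_c}C  est. system {est_system:6.1f} W  [{status}]")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```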
Owning mine has been such a blast. Yeah, it's expensive as hell, but I have no regrets. The last 2 years of gaming have been incredible. I highly recommend pairing this with an ultrawide or 4K OLED monitor. Match made in heaven.
I had never bought a GPU until I got the 4090. To me it looked like a normal card, but my friend saw it and said that GPU was ridiculously big. Then I opened up my old pre-built PC I bought a decade ago and saw the Nvidia card it came with... that card was tiny, with a single fan. The 4090 is gigantic; TUF even gave me a little stand to help hold the card in the new case so it doesn't start to lean from how heavy it is. Owned it for 6 months and happy with it so far.
I use a Corsair Premium 600W PCIe 5.0 / Gen 5 12VHPWR PSU cable for my Gigabyte RTX 4090 Gaming OC and haven't had any issues whatsoever. The plug has a snug fit and connects with a very audible click. It also uses solid wire instead of stranded. My motto is click it and forget it.
Is a 600W PSU not pushing it a bit for a 4090? They advise 850W minimum. I have one of the MSI MPG PCIe 5 PSUs with the right cable and that's been problem-free too.
I bought a 4090 well over a year ago now, and I have no intention of upgrading for another generation or two. It's so far ahead of the rest of the competition that it'll end up being the 1080Ti of the 2020s and still be hitting good frames in 4-5 years time.
I came across your channel and your style works for me, so I subscribed with all that good good. I have an MSI 16GB Expert 4080 Super with that really enclosed shroud. Your size observations are right; it's going in the Thermaltake Tower 500 with a Z570 Godlike. FYI, the Tower 500 packaging foam works as the CATBED 500.
Hey! Thanks for the sub, and yeah, the sizes are crazy. Rumour is they went smaller on the new designs (50 series); I guess we will see. Also, nice pro tip on the cat bed. It's hard to keep them out of customer boxes haha 😅
@@techwandoUS Yeah, you get two sides of foam for cats. You get a rectangular perimeter with a "skin" that allows the cat's belly to droop into the cavity. Thanks for the reply.
This is the first GPU where I needed something to keep it from sagging and possibly breaking over time; my Asus TUF 4090 came with a little adjustable magnetic metal stand. I made sure all power cables were completely plugged in, as I feared the melting adapter issue. I've had this card since last October and it's been nothing but amazing. I have it in an i7-13700 system. I was using the i7-8700K and didn't want to bottleneck this new GPU; the i7-8700K was still running games great when paired with a powerful GPU, but I also just felt the need to move on from six-year-old hardware. I fear hardware failure when things get so old. I'm using the Lian Li LANCOOL 216; this case has excellent cooling. The 4090 is so big that it gets really close to the two giant front case fans keeping things cool and quiet.
Great video and breakdown. I absolutely love my 4090; it's just a beast with everything I throw at it. I don't think I'm even going to look at the 5080/5090 and will skip to the 6000 series at this point, in another 3 years, which is kind of insane when we look back at the 2-year upgrade cycle a lot of us are on with GPUs. Star Citizen is the only demanding game in the future I'm really looking forward to, and it runs it fantastically. I just can't see needing an upgrade for many years to come. Yes, the price is high, but I'm going to get more years out of this card vs. all the cards I've owned over the past couple of decades.
Thank you, appreciate it. I think with the possibility of Nvidia having to limit the 5080 to the 4090 spec due to China (rumours) it's a good idea if you own a 4090 to definitely skip the 5000 series. I wish you many years of luck with your card my friend!
Good video man! I really enjoyed it! I have a 4090 and you're correct about everything. No one needs it. But once you have it, it is so hard to go back down. I will say this: there is something nice about knowing there is nowhere else to go when you're gaming. What I mean is this: I used to always wonder, "What am I missing? How much more FPS could I get if I had a 3080 instead of a 3070? A 3090 instead of a 3080?" The list goes on and on. Eventually, it just becomes a chase. You then reach the top and, like Thanos, you rest. Now when I am gaming or video editing, I know that if I can't do something, then no one else can either. I hope that makes sense lol. But with that being said, it is a stupid amount of money. If I was not a creator, there is no way I would ever spend my money on a GPU over 1K. Last night I fell asleep with my PC turned on. It stayed on all night; I never do this. This morning my first thought was "Oh no!" and "What if the cable is smoking on the 4090?" Thank God everything is okay. But man, I should not have to worry about that crap. Anyway, great video!
Thanks man, appreciate it. So the question is Erock.... Will you be scooping a 5090 😎😉. I will try and borrow one from someone. Also that falling asleep thing... Did that once too, woke up frantically like I had committed a crime the night before. I think we definitely hear about more of the failures due to viral popularity, but you can never be too safe. The slight undervolt seems to be the best thing you can do with the 4090. Thanks again man.
@@techwandoUS As a tech creator who focuses mostly on GPUs, I feel forced to buy a 5090, so I plan to get it. I may stand in line the night before it launches and make it an event-type thing with a friend or two. I don't know yet.
I had the 4090 from March '23 to October '24. The good things everyone knows: incredibly fast. I ran everything at double 4K (Samsung Neo G9 57") with DLSS Quality and the experience was amazing. But I always had that thing in the back of my mind: "dude, you put $1,900 into this card and you play a lot of Factorio and BeamNG, not just Cyberpunk." So at the end of this generation I sold it for $1,400 and got the 7900 XT (not XTX) for $600. Even the room is a little cooler since I'm putting out only 300W of heat, not 450W. My temperatures are back in the 50s, but my framerate went from 120 to 90 with FSR3 Balanced and FG in demanding games; again, 4K x2. It was amazing, but I won't go back to any 90-class card again. TOO MUCH MONEY. At most I will get an RTX 5080, the cheapest available, and that's it. This is not like a truck, where once you drive a truck you cannot go back to cars; the BFGPU is not that. That framerate is achievable if you change the settings here and there; nobody needs "ultra" or "psycho" on every setting. As for ray tracing, that was beautiful, but not worth the extra 800 dollars.
I can confirm that the worry about the cable is a thing. I turn off my PSU and disconnect my PC every day because I don’t trust it when I go off to work or something.
Why do you think anything would happen as long as the computer is turned off? Even a completely "almost broken" cable will have absolutely zero effect on your computer when turned off or in standby.
You can tweak power consumption quite a bit with the OC voltage curve in MSI Afterburner and then using nvidia-smi to reduce the maximum clock rate. It's kind of odd you can't do both in Afterburner, as clock rate offsets seem to disable the custom voltage curve. That might seem nuts, but it's like on CPUs: having more cores clocked slower can perform just as well as fewer cores clocked higher, under the right workload. GPU tasks are inherently multi-threaded by nature, so they usually scale to more cores better than CPU workloads do. So overall, you still end up much faster than a 4080. In compute workloads like Folding@Home I was able to get about 80% more performance on a 4090 vs a 4070 Ti at the same power consumption. It shaved about 100W off the 4090's power consumption with just a slight reduction in performance. I still haven't gotten around to checking how it impacts gaming, but at the 4090 launch a lot of people found that just limiting the TDP to 60-80% barely reduced performance in games. NVIDIA just pushed the clock rate on the 40-series way past the efficiency range; they actually seem most efficient at around 2.4 GHz, which makes sense as this is around where the gaming GPUs are clocked.
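For anyone curious what the nvidia-smi half of that workflow looks like (the Afterburner voltage curve itself is GUI-only), here is a rough sketch, not a tuning guide. It assumes nvidia-smi is on the PATH and an elevated/admin shell, since changing power limits and locked clocks requires privileges; the 350 W cap and the 2,400 MHz ceiling are only placeholders echoing the figures in the comment above.

```python
# Sketch: query the card, apply a power cap and a clock ceiling, then restore defaults.
import subprocess

def smi(*args):
    """Run one nvidia-smi command and echo whatever it prints."""
    result = subprocess.run(["nvidia-smi", *args], capture_output=True, text=True)
    print(result.stdout or result.stderr)

smi("--query-gpu=name,power.draw,clocks.gr", "--format=csv")  # current draw and graphics clock
smi("-pl", "350")        # cap board power at 350 W (assumed value)
smi("-lgc", "210,2400")  # lock graphics clocks into a 210-2400 MHz window (assumed values)
# ... run your game or benchmark here, then put things back:
smi("-rgc")              # release the locked clocks
smi("-pl", "450")        # return to the stock 450 W limit
```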
Cablemod is seriously sketchy. They refused to accept an unopened return because “It’s not worth the shipping cost”. They will NOT stand behind their product.
It would help to also focus on what monitor one is using. I am gaming on a 57" ultrawide with over 16 million pixels, where I need an RTX 4090 (which I have and love)...
Wow, 57-inch is insane lol. I agree though; I moved over to a 48-inch OLED for my monitor, and you can't really play at anything less than 4K at that point. It just doesn't look right.
For the cost of the 4090 I'd rather upgrade my entire platform. I play 1440p 165Hz with a 4070 Ti Super with no issues. Even that card is overpriced; a 70-class card used to be $600.
Techwando: the 12-pin connector has smaller aluminium pins, and thus a smaller combined surface to transfer current, than the 8-pin standard has had since 2006. As well, the RTX 4090 has a TGP of 450 watts when in reality it continuously eats anywhere from 540-580 watts (45-50 amps of current), with max peaks up to 641 watts. To give you perspective... my Manli RTX 3080 Ti 12GB continuously ate 390 watts (32.5 amps) with peaks up to 441 watts, and my Corsair RM850x wasn't able to feed it despite connecting 2 separate PCIe cables to the 2x 8-pin on the GPU. I've measured, and my PSU can continuously feed a max of 354 watts through 2 PCIe cables. Whereas my AMD RX 6900 XT, which I upgraded to from the 3080 Ti, continuously eats around 250-275 watts with peaks up to 294 watts... while being faster than my RTX 3080 Ti 12GB in every rasterized game today. I've measured them both extensively with current clamps and an oscilloscope and then properly calculated the actual power consumption. Nvidia's 3000- and 4000-series GPUs are less efficient and eat more power for less/equal performance compared to AMD's 6000 and 7000 series. The 12-pin connector is a FLAWED DESIGN, and it's Nvidia's design and thus Nvidia's fault, because they should have tested it properly before releasing it into the wild.
I got an Asus Strix 4090 and my card never, and I say NEVER, goes above 450W. Most of the time it pulls around 300W. This is not a lie, it's true, and even without an undervolt it won't go above 450W. Also, I like the new adapter. Why? Because I don't like 3-4 massive cables in my card.
@@venusprinzj8094 Either you're lying or you're reading/using software that isn't reporting the power consumption properly. The RTX 4090 has a TDP of 450. That is Thermal Design Power, which is a completely different measurement from the actual power consumption. In reality the RTX 4090 consumes anywhere from 450 to 641 watts, with continuous draw (in modern-ish games) around 550-590 watts and spikes up to 641 watts.
@@lflyr6287 No, I'm not lying. HWiNFO and the Thermal Grizzly WireView say the same thing. My 4090 does not pull above 450W because it can't; I don't OC my GPUs and I always undervolt. The 40 series is efficient as hell. Optimum did a video and a 4080 pulls less than 150W in CS:GO compared to a 7900XTX. A 7900XTX pulls more than or the same as my 4090 with an undervolt. What you are telling people is simply not true.
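A quick sidebar on the amp figures in this thread: they follow directly from I = P / V on a 12 V rail. The sketch below just reproduces the commenters' own wattage claims; none of these are measurements of mine, and real 12VHPWR delivery splits the current across six 12 V pins.

```python
# Convert the claimed wattages to current on an ideal 12 V rail.
def amps(watts, volts=12.0):
    return watts / volts

claims = [
    ("4090 sustained (claimed)", 540),
    ("4090 sustained (claimed)", 580),
    ("4090 peak (claimed)", 641),
    ("3080 Ti sustained (claimed)", 390),
]
for label, watts in claims:
    print(f"{label}: {watts} W is about {amps(watts):.1f} A at 12 V")
# 540-580 W works out to 45-48 A, matching the "45-50 amps" range quoted above.
```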
As a fellow 4090 owner I would agree with most of the gripes you have about the card; the only thing I disagree with is the power draw. I have a Founders 4090 which is technically capable of pulling 600W, but I have never actually seen it pull anywhere near that except in benchmarks like FurMark and 3DMark. Even the most demanding modern games like Alan Wake 2 or Cyberpunk only pull about 380-400W maximum with path tracing and everything maxed out, and with my undervolt it pulls about 250-280W. I upgraded from a 3080 which often pulled more power than my 4090 does. Sorry, this looks like a rant, but I didn't intend for it to be. Overall, nice video!
I'm basically always ranting haha. What are your power settings/CPU etc.? The 4090 is definitely a very efficient card if used properly and tuned, but most people don't do that, especially if they have something like a Strix model or another high-end card that just instantly sucks as much power from the wall as it can. Also, some people probably don't see the full power draw because the games they play might not even fully use the card. Appreciate the comment!
Lenovo has had (they go in and out of stock on it; just checked and it was currently in stock when I posted, it was out of stock this morning) the MSI Liquid Suprim X 4090 for $1,749 and discount codes can be applied. For me this brought the price down to $1,525. The liquid cooled card is much smaller (in width it is only up to two slots versus something more like 4 in width). So, those factors together ended up moderating (somewhat) a couple of the drawbacks mentioned in the video (though the price is still very steep).
@@techwandoUS Thanks! Yeah, I have a couple of additional use cases as well. 1) I plan to use it for GPU multi-core programming / processing and 2) I use a freely available older, beta version of madvr coupled with videoprocessor, a capture card and the 4090 for real-time video processing (mostly for dynamic tone mapping to a projector in a theater setup). The equivalent hardware device (the madvr envy extreme - which is basically a PC with a high end NVIDIA card and the software) goes for around $9,495 to $14,995.
Dukele432 - I just tried again (5/3/2024) and was able to get it back down to $1,529.37 from the $1,749.99 list price (it goes in and out of stock). I got $50 off by using the $100-off code I got when signing up (it only let me take $50 off), $87.50 off by using EXTRAFIVE, and $83.12 off by using my wife's nurse discount. They also have student, teacher, and first responder discounts (verified through id.me).
The 4090 is the king of 4K gaming. I have zero complaints about my 4090s. Now we are getting close to the 5090, and I'm looking forward to seeing how it games at 4K.
I almost bought one of these yesterday. My local Best Buy had one 4090 Founders Edition card in stock, and I had it in my cart: $1,711.99 after taxes. I am currently using an MSI Gaming X Trio RTX 3090 with a 12th-gen Intel CPU. I really want to upgrade, but I decided to remove the card from my cart and just wait for the 5090 reviews and go from there. I don't need it right now. But I plan on buying that Pimax Crystal Light headset and want more power to run it smoothly. I appreciate the video. I just can't justify the cost. I got my RTX 3090 for $700 used from someone who upgraded to the 4090.
Hey buddy, I have almost exactly the same story. I got my RTX 3090 in December 2022, right before the 40-series release, for about $900. I was recently SUPER tempted to upgrade to a 4090 when I saw a deal for $1,400. I did an analysis and found that there would be a relatively small improvement in my experience (I mostly just want to collect it, not so much for gaming or machine learning stuff), but the downsides are: 1) $1,400 is still something I would struggle to spend; 2) I am concerned about the quality issues mentioned in this vid; 3) the 5090 may be released in 6 months and the 4090 will depreciate much more; 4) I don't know what to do with the replaced 3090 and PSU (I'd need a bigger one), and even selling them I assume I'd only collect about $600. In short, I wouldn't gain that much, I would spend about $800 net, and I'd worry about quality and depreciation. Not a good deal. Maybe I will buy a 5090.
@@leoxu6743 They say good things come to those who wait! I am going to save up for a 5090 card, but I may end up buying a used 4090 if I can get one cheap. We will see. I am curious to see how much it will cost and how much better it is.
@@RichLifeStories Depending on how much the 5090 costs and whether any are available, I may end up just buying a cheap 4090 if I can. But no matter what, I will be upgrading. 😉👌
I built my current PC from scratch with a 1080 Ti GPU and an Intel i7-7700K CPU back in 2015. It has served me well and I played all my games on the highest settings until a few months ago. I will start saving for another build from scratch; let's see how long that will take. Great video.
I upgraded my 6700K + GTX 1080 to an RTX 4090 + 7950X3D. It will last me A LONG TIME, as I use it with a 1440p 240 Hz OLED. I never pay more per year for my PC hobby; I just put $1K aside each year and buy when I see a great upgrade.
Cool man! I did the same thing. I also went from a 6700K + 1080 Ti Strix (yes, the Ti version 🙂) to a whopping 13900K with a 4090 Strix. It's hard to explain the huge improvement. 😂 I also went to 4K, whereas I gamed at 1440p with my trusted 1080 Ti. It's crazy how that card held up for so long. Let's hope the 4090 does the same!
This is a great vid even 6 months later. It's nice that you talked about GPU longevity and how the 4090 might age. It's not like, if you have a 3080 like I do, you need to blow $2,000 on the 4090 just because it's twice as fast.
I honestly went with the 7900 XTX over an RTX 4090, mainly due to the 12-volt connector. My co-worker's MSI RTX 4090 connector melted on him, and that pretty much made my decision to go with the 7900 XTX. This is coming from a guy who has purchased mainly Nvidia cards: 9800 GT Ultimate, GTX 660, GTX 770, GTX 970, 1080, and now my first AMD card, the 7900XTX.
My Asus Strix 4090 OC has been working for over 13 months with no problems at all, using a 4K 120 Hz 55-inch LG TV as a monitor... Edit: I got the cable from Corsair, same brand as my PSU.
The 4090 will last you a few years! My budget allowed me an RX 6800 with 16GB. It is a good companion to the 5600X: a VR beast, 1080p 165 FPS effortlessly, runs cool. I hope to get a few years out of it. 20 years ago I would upgrade every year; I understand chasing that waterfall. Shelf life is more important to me now.
AMD isn't likely to release a GPU to go against the 5090, so the 5090 could end up being what Nvidia was going to use for the 5080. They could realistically make the 5090 only 25% or 30% faster than a 4090.
@@Killswitch1411 I personally think they are going to make the 5080 a bit faster than the 4090 and make the 5090 much faster, making the gap in performance between 80- and 90-class cards even bigger than before. That way they can make the case for a substantially more expensive 90-class card. At least that would be the strategy if I were Nvidia.
@@nossy232323 I currently have a 4090 and struggle to find any games worth the extra horsepower. I may stick with the 4090 this next generation. My 4080 build plays the games I have time for well enough.
I only game on my 4090. I use a OLED ultrawide with 175hz. I definitely need this card to run steady high frames in modern games. The last 7 or so years of owning ultrawide monitors has forced me to buy top of the line GPUs.
I bought an MSI Suprim X 4090 in January, probably the worst time to buy one, but I was too tempted... I agree with the general message for gamers who are planning to use it only for gaming: it's not worth it, especially with the 50 series right around the corner... On the other hand, I won't upgrade my AM4 system anytime soon, so I'm limited to PCIe 4.0, which means I won't benefit from the 50-series cards anyway since those cards will most likely adopt the PCIe 5.0 standard, so yeah... Great video btw, really liked it.
Appreciate your comment, and yeah I'd hold on to your platform as long as you can. AM4 is still so solid. I assume you are running a 5800X3D with that 4090?
@@techwandoUS Yes, an undervolted 5800X3D and 4090. I really like the performance and honestly see no need to upgrade for the next few years; 4K@120fps should be no problem for most of the games I play (non-competitive) 👍🏻
@Btburkhardt It's fantastic. I bought a used Samsung Odyssey G8 and decided to set it up for 4K, and it doesn't slow anything down at all. I thought I was having an issue because the GPU usage kept going down to 0%; turns out I wasn't taxing the system enough, so it would revert to the CPU's integrated graphics instead. A quick system configuration change and it's been a dream since, fast and smooth. I haven't had any issues with overheating or crashing.
@@carterskindle7086 I definitely recommend the 4090. This thing is a beast. I've tinkered with the settings to really get the juice flowing and it is smooth as ice.
I personally have an RTX 4000 SFF. No power connector, 75 watts (65 under load is sometimes more realistic), enough VRAM that I'm not usually worried about it, and enough performance as long as I'm not trying to run at ultra settings. I'm absolutely delighted with my purchase, and do not at all regret avoiding anything more cost effective in the low end, or not getting a 4090 in the high end.
What is your opinion on GPUs and UE5? I'm in the market to build another computer after running a 1080 Ti and 6700K for 7-8 years, but I'm a little hesitant between getting a 4090 now or waiting for the 5000 series to handle the upcoming UE5 games. I build to last 6-8 years.
UE5 is only going to get more complex as time goes on; maybe some better optimization will come out of more dev time. That being said, if you have managed this long with your current setup (bravo btw, that's awesome), keep it going for a little bit longer. It's looking like the 5080 is probably going to be somewhere around the performance of the 4090 because they want to sell it in China, and the 5090 is also going to be quite insane according to some leaks (taken with salt). So you could wait for all the 5090 buyers to fast-flip their 4090s and buy one of those; the only issue with that is all the 4090 problems, some of which are mentioned in the video. I'd say just aim for a next-gen card personally. That GDDR7 will be a game changer for hard-to-run titles, and you'll also have peace of mind that you'll get an extra 2 years of support.
Actually, thank you. You put yourself in my shoes and gave a grounded and realistic assessment without the overbearing technical mumbo jumbo and that "but it's up to you" BS at the end. So thank you.
This is a really nice and informative video, thank you! It has made my decision to get a different card much easier, along with the point about 2K gaming likely being better long term. Good luck with your channel and I look forward to your future videos!
Happy to have helped! Also appreciate the kind words. Definitely this late into the 40 series lifecycle, I would probably wait to see what the 50 series offers, unless of course you need something ASAP.
It wouldn't add a dime over any other card if you did the same workload. This card doesn't just magically peg itself at 600W 24/7. Most of these are locked at 450W, which could possibly use about 12 kWh a day at 100% use. Most people are going to game, which won't use 100%, and even then maybe only 8 hours a day (which puts you in the heavy-user category). All that being said, it's probably going to use 50-360 kWh a month. My power here is $0.16/kWh, so $8 to approximately $55 a month. What this math doesn't show is that your current system probably used 90% of that already, so the difference for me was almost $0.
If your workload is big it does add a bit... mine added about 20-30 kWh a month on average, but I bought a huge 4K 80W display too, so that adds a little as well.
@@delfean2666 Were you able to get more work done, faster? The cost of electricity might be less than a cheaper card if calculated as productivity divided by energy consumption. I'm not sure what everyone's hourly rate is, but if you could save an hour a month, I'd assume that would offset the extra energy cost. Then again, I tend to justify these kinds of things to feel good about buying things I don't need - I have a 4090 being shipped to me as I write this. I almost immediately regretted my impulsive purchase, so I came here for reassurance that it will be worth the investment. It will be used for editing and game streaming on my son's rig.
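For anyone who wants to sanity-check the electricity math a few comments up, here is a tiny sketch. The 450 W cap, the 50-360 kWh/month range, and the $0.16/kWh rate are all that commenter's numbers, not general figures.

```python
# Reproduce the running-cost estimate from the comment above.
POWER_LIMIT_W = 450
RATE_PER_KWH = 0.16                      # the commenter's local electricity rate

worst_case_kwh_per_day = POWER_LIMIT_W / 1000 * 24   # pegged at the limit 24/7
monthly_kwh_low, monthly_kwh_high = 50, 360          # light vs. heavy use, per the comment

print(f"24/7 at {POWER_LIMIT_W} W: {worst_case_kwh_per_day:.1f} kWh per day")
print(f"Monthly cost: ${monthly_kwh_low * RATE_PER_KWH:.2f} "
      f"to ${monthly_kwh_high * RATE_PER_KWH:.2f}")
# Prints roughly $8 to $58 a month, in line with the "$8 to approximately $55" estimate.
```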
My 4090 has actually been the only part of my system I have not had issues with (watch me jinx it now). My ASUS X670E Extreme motherboard and 7950X3D have been nothing but trouble, and don't get me started on Windows 11 as a whole.
That's good to hear, do you just run it stock? Also totally agree with windows 11. It's been a mess. 12 will be here soon, hopefully they don't f it up.
@@techwandoUS I ran a slight overclock on select games, but the majority of the time it is just a stock Gigabyte Gaming OC 4090. And I wouldn't hold my breath about Windows 12 haha; the whole AI thing worries me. I don't want AI to have access to ALL MY FILES.
Great video! Mind if I ask what the go-to monitor you mention is? I'm a 2K person myself and have a 6-year-old Dell 2K monitor, and I've been dying to find something better. My current monitor is 32" and non-OLED... so I'm almost expecting you to say it's a 27" OLED. LoL
I have a 3080 12GB, and other than Alan Wake 2 there have been no games I've played over the last year that have made me want to upgrade. Everything runs smoothly at 1440p.
No wonder. The 3080 12GB is just a tad slower than the 4070 Super, and that one is a beast of a card for 1440p. Thanks to the weak performance improvement of the 4000 generation, the 3000 lineup is still relevant.
@@stangamer1151 I know, but you can't deny the perf-per-watt improvement. I built a PC for a client with a 4070 and it was within margin of error of my 3080, but it was only using like 170W out of the box, and I was able to get it down to 160W with higher-than-stock clocks using Afterburner. My 3080, while undervolted, draws 300-320W, and when OC'd it's over 400W.
Went from an EVGA RTX 3090 to a Zotac 4090 just under a year ago... been in love with this card ever since. If you are a VR sim (flight/racing) gamer it's literally a godsend. My biggest gripe is the INSANE power consumption. Gotta pay to play, so they say...
Getting mine next week... can't wait! I will play at 4K as long as I can lol. Besides, 1440p has been my standard for many years anyway, so no loss either way.
Love your perspective at the end there! I went with a 3060ti for 1440p gaming and VR purposes in 2020 and I'm still very happy with it. That being said, the one vertical where you probably benefit from the power of the 4090 IS indeed VR. I think it's one of the few gaming justifications for it if you want to experience high fidelity high-end PCVR at max settings for full immersion due to the hardware demands of running VR ports of games that weren't optimized or designed for the medium (i.e. playing Red Dead Redemption 2 in VR is possible via mod but is very heavy handed). Outside of that, I honestly don't think it's necessary.
Appreciate the comment! I really need to look more into VR, as this is probably the most popular rebuttal comment to my opinion on the card. Thanks again 👍
My 4090 was bought on day one, so it's time for an upgrade. It has been through 2 CPU and motherboard upgrades (12900K/Z690 and 7950X3D/X670) and 2 monitor upgrades as well (Samsung Neo G9 47" and then the Neo G9 57"; I have the 47" as a secondary monitor now). Otherwise my 4090 has been very reliable with no issues. Power draw is negligible because I have 2 electric cars and run my AC 24/7 at 20C/68F. I never check the cable because it's fine. I'm using the stock cable from my 1000W Corsair PSU.
So this is how I mitigated all those issues:
1: I bought a Corsair 12VHPWR aftermarket cable that goes straight from my HX100w PSU into the GPU, and I've not heard of anyone having an issue with that setup. If you can afford a 4090 you can afford a £20 cable.
2: I bought my 4090 direct from Nvidia and got it at MSRP, £1,569.99.
3: I bought a Velocity2 water block for £200 for my 4090 and cut the size down drastically, which also helps with the weight on the PCIe slot.
4: I paired my 4090 with a 7800X3D, and the max system power draw I've seen with a full custom loop is 600W.
Yes, it's expensive, but it really is a Titan-class card, and that's what you should compare it to in terms of price and performance, not an XX80 Ti. Look, nobody needs a Ferrari, but it's cool as hell if you can afford it.
As a mechanical engineer and industrial designer, I find the 4090 a fantastic multipurpose card. It is great for 3D model rendering, FEA (finite element analysis), video rendering, and stabilization of drone footage. The other benefit of the 4090 is that it is less expensive than Nvidia's Quadro professional cards. I think the way to view the 4090 and Nvidia's future top-specification cards is that their performance allows you to bypass at least one generation of card; the money that is spent on the card is amortized over a longer period of time, thus creating greater value. I agree that if the sole purpose of owning a 4090 is gaming, then the card is overkill.
The thing about 4090 pricing often makes me smile; it's like people forgot that Nvidia made the Titan RTX for $2,500. They will go to whatever price they want, and $1,600 isn't that high in the grand scheme of things; the 10-series Titan was $1,300.
@@TurboMintyFresh For you, my friend, I went to an inflation calculator: the $2,500 of the Titan card from December 2018, brought up to October 2022 when the 4090 came out, becomes $2,965.49. So if we just accept that the 90-class card is the Titan card of old, then the 4090 is nearly half off compared to its "2090" equivalent... Is it a lot of money? Yes it is, but in comparison it's not that much, and Nvidia is capable of asking a way higher price than $1,600 for a 90-class card.
I have been using flagship GPUs since the 2000s for one simple reason: longevity. Also, "pay more or buy twice." It is a very hefty investment, no denying it, but what you get in return is "load it, max it and play it!" I was gaming at 1080p for many years despite having a flagship GPU, because my logic is: if it can run, e.g., COD 5 at 250 fps, then it should be able to play COD 7 at 150 fps, and that certainly turned out true, at least for the COD/MOH/BF series. So you basically have peace of mind for multiple gaming cycles where you can confidently run all your games at maxed detail (until ray tracing came along, at least). Even though I have a 4090, I will sometimes still disable anti-aliasing, use DLSS in Balanced mode, and select the Very High (instead of Ultra) preset when I game on my 3440x1440 ultrawide screen, the primary reason being I'd rather play AAA titles at 120 fps on Very High than 90 fps on Ultra, although 90 fps is already very smooth. For run-of-the-mill FPS/action games the framerates are usually in the 150-200+ fps range, and that is an indication of how much headroom is available, at least for the next 2 years or more. And it is for this reason that one can justify paying >$1,500 for a flagship GPU.
Awesome video! Waiting to upgrade my 2080 Ti. I've been thinking of getting a used 4090, but I might wait for the 5090 instead. If you game and edit video on the same machine, what type of driver are you running (Studio/gaming)? Also, it looks like you are running DaVinci? Is it utilizing the GPU to boost the editing/rendering performance?
Appreciate it! I swap between the gaming driver and the Studio driver, actually; I was planning a video around this very topic, I just need to get to it. DaVinci Resolve also runs really well on both drivers, but I did happen to notice less freezing of the application with a mixture of video file types while running the Studio driver. I need to do more tests to really confirm anything, but I currently work with a lot of 4K Sony footage, AV1 OBS captures, and 1080p upscaled YouTube MP4s, and when you overlap them you do occasionally run into issues every now and then.
Excellent video! Very easy to follow along with, and you're well spoken; I appreciate you making this! I have a 3090 and thought about upgrading to a 4090 just for DaVinci Resolve. Can you tell me more about your edits? Do you edit 4K videos and render as such? And what encoder do you use? Any upscaling? Just curious if the 4090 is worth the upgrade from my 3090.
Appreciate the kind words! Honestly, I'll save you some time and just say the differences between the 3090 and 4090 in DR and AP really come down to final render times. The 3090 still has all that fast memory and holds up super well; I would skip this generation. As for edits, it's pretty much whatever the client wants. My personal videos are simple, usually a blend of 4K and upscaled 1080p; for others I'll use S-Log and/or slightly downscaled 6K from my camera. As for the encoder, I primarily use AV1 for all of the screen grabs and H.264/5 for everything else; 265 takes a bit longer to render though.
I have a 4090 FE with an EKWB block. Zero connector issues, and it has run perfectly since day 1. I toyed around with pushing the OC on it for fun and benchmarking, then set it back to factory and applied a little undervolting. No issues so far and it never gets hot. The 12900K paired with it, on the other hand, takes some work to keep the all-core at 5 GHz while keeping temps down, due to my EVGA Z690 board's love of adding excessive voltage even at factory settings.
Waiting for the 5090 to come out, since that's the best opportunity to snag a deal on a 4090. If I buy it new, I will do it via Micro Center with the extended protection, so in case it catches on fire they will just replace it 🙃
It says on the contract I signed with my landlord that I will only be using 1000W per socket in the kitchen and the washer / dryer cabinet. If, as many people say, your cable gets a little warm and they worry about it, think about what is going on INSIDE OF YOUR WALL. Especially if you live in a 100+ year old building like me. I think I might actually blow a fuse if I have a 1200W PC running at full speed.
Also a 4090 owner here; these are my takes:
1) 12VHPWR isn't a well-thought-out connector, but definitely most failures are due to people not plugging it in right. Also, do not plug and unplug it repeatedly! This kind of connector is rated for 30-40 mating cycles. So connect it once, make sure it's fully inserted after you cable manage, and leave it as is.
2) Power consumption is high, for me around 450-500 watts, but this is the situation today with advanced nodes.
3) Price: I paid $1,750 last year, so $150 more than MSRP, but this is for the Suprim Liquid X version. Getting the card at MSRP isn't realistic, but that ain't new; this has been happening for years now.
4) Size depends on the version; mine is 2-slot but very wide because of the PCB.
5) If you want 4K at a high refresh rate, this is the best right now.
6) Good temps, but I will add the card to a loop; I just want to build a whisper-quiet system.
Very well said; appreciate the well-written comment. Also a great model of card, and $150 over MSRP really isn't that bad, especially if you consider the price of a 360mm AIO on top of the card.
There were changes to the ATX specification to prevent it from operating without a solid connection. The size is great; you don't have to deal with noise. Most of this stuff doesn't have anything to do with YOUR experience and actually using the card. I still believe that if you get it at MSRP (I got 5% under MSRP with my MC card) it's a WAY better value than the 4080. I know a lot of benchmarks show a 20-30% uplift, but that was with what was available at release; there is 60% more hardware available.
I've had a 4090 since January 2023... a total game changer for me... I now do my arch viz renders 100% on GPU, no more CPU renders. It also opened up the world of doing video walkthroughs in a matter of hours or a couple of days, instead of insane times or just realistically refusing to do videos.
4090 FE owner here. The only bad thing I can say (price subjectively aside) was Nvidia's choice to go without their own store and rely on Best Buy, who had terrible early management of purchasing. While I game a lot, I do a huge amount of 3D rendering and other work, so while yes, this card was expensive, it's been worth it. But if you're just wanting the best gaming performance, yeah, it's definitely a hard pill to swallow, and I'm still bewildered that GPUs are now the price of what a whole machine used to be. Wish Cablemod's 12VHPWR adapters worked; I ran mine without too much issue but then, with the snags, of course stopped using it (RIP to those whose cards died). Hopefully Nvidia and other OEMs have learned a lot from this launch. On the size, yeah, it's like Nvidia went "How much space in this hypothetical rectangle can we use for a PCIe card?" "Yes. Use it!" Thankfully I can fit it in my case, but yeah, it's a monster. The card does stay suuuuper cool though!
Wattage is not a measure of power efficiency. Joules per unit of work is efficiency, and the 4090 is pretty good on that. It is high power but it finishes the job much sooner. (I don't know what the idle power is compared to other cards, some idle performance should be included in any efficiency comparison.)
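To make that concrete, here is a minimal Python sketch of the joules-per-unit-of-work idea; the wattages and job times below are invented for illustration, not measurements of any real card.

```python
# Minimal sketch of "energy per unit of work": power x time for one job.
# The wattages and job durations are invented illustrative numbers,
# not measurements of any real card.

def energy_per_job_wh(avg_watts: float, job_seconds: float) -> float:
    """Watt-hours consumed to finish one job."""
    return avg_watts * job_seconds / 3600.0

fast_card = energy_per_job_wh(avg_watts=420, job_seconds=60)   # high power, finishes sooner
slow_card = energy_per_job_wh(avg_watts=250, job_seconds=140)  # low power, runs longer

print(f"fast card: {fast_card:.1f} Wh/job")   # ~7.0 Wh
print(f"slow card: {slow_card:.1f} Wh/job")   # ~9.7 Wh
```

In this made-up example the higher-wattage card still uses less total energy per job because it finishes sooner, which is the point the comment is making.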
Even though I could easily afford multiple 4090s, I just ordered a 4070 Ti Super 16GB. I think I'll be happy with the performance of that GPU. I'm coming from an RTX 2080.
I have enjoyed everything with my 4090. I would spend any amount of money to not have to water down my experience! Ultra everything or just nah! For years every high end card still had to compromise something. This is the only time in history I got what I paid for. Years ago you bought an $800 gpu and still had to dial back settings sometimes. Not with the 4090!
I've had a 4090 for 9 months now, with no problems with the cable, but then again I've never stressed the card to its limit.
I got my founders card for $1599. Never had any issues with the connector. It runs very cool. The power draw for me has been great, but then I have solar on my house so that helps a lot. I drive a Samsung Neo Odyssey G9. I have no complaints at all.
My 1-year review of owning the 4090: I turn on a game, get impressed at the huge FPS and eye candy, say "cool", turn off my PC, then proceed to play my game lying on my couch on my Steam Deck.
No reminders are necessary. The like button should be hit right after anyone clicks on your videos
@@anthonywaldrep5982 Haha you're the best Anthony. 🙏🙏
What CPU do you have?
@@jose131991 12900KF
Hell yeah.
@@ArtisChronicles Or you know, just use the predefined settings that 99% of people use, no need to be a smartarse.
I got a 4090 at launch, and have paired it up with a 13700k as well as an ultrawide oled monitor. It's actually insane.
Yeah it's more than enough in almost every situation.
I have a 4090 but an i7-1100k, and it's bottlenecking the card even at 4K. Doesn't help that newer AAA games are all CPU intensive.
@@jose131991 Wasn't the 11th Generation Intel CPU really just a myth? You actually own an i7 1100k??!! 😀
@@jose131991 It was a bad time to buy Intel.
Even an older 5800x was faster in gaming at the time.
While costing much less.
Great motto, also sick to see how your 1080ti system served you. Appreciate your comment.
Same.
That's a great motto holy shit
Yup, I'm getting the 5090 and the Zen 5 when they launch. Can't wait🤩
A 4090 is literally a mortgage payment for some people!🤣🤣
It really is! It's more than mine actually haha
Not if you're buying in 2024... then it's like half of one 😬
lol I could buy 2 4090s every month for the price of my mortgage.
@@joelv4495 👀
100% agree, it's terrible for the mid range. Hopefully the new APUs can really change that in the future.
I simply refuse to pay more than 500-600 EUR for a good GPU. Still using my GTX 1070 from years ago.
(Got it within the first 3 months of its launch.)
I've upgraded every other component in my PC over the last 2-3 years.
Now I see the RX 7800 XT from AMD and I will probably make the switch; I just wanted more raw performance and VRAM from both companies the last 2 generations.
Well, Europeans just don't make that amount of money. Here in the U.S. my wife and I make close to $200k annually. In Germany we would make less than half that, if that. Here I can afford a 4090. In Europe I couldn't.
Since 2019, GeForce Now has taken on about 25 million+ customers, so many have moved to cloud gaming, not just consoles.
@@pandaonvodka6676 I played on my GTX 1060 for many years up until a few weeks ago. The card still holds up pretty well at 1080p as long as you don't play the latest games on the highest settings (and if you forget that RT exists, of course). I feel like playing on ultra settings brings a miniscule visual improvement anyway, though. I upgraded mostly for work reasons.
I have an ASUS TUF 4090. I love it, but in all reality I'm still healing from my kidney donation.
😅😂 I don't want to laugh, because I can't be too sure...
@@techwandoUS the price I paid, my friend: $1,899.00 USD
@@techwandoUSDon't worry though, it was a "donation" 😈
I got mine on buy now pay later, so I could delay losing my kidney.
Friend of mine said he was considering a sex change.
Uhhh, okay! Who knew, turns out he needed the money for a 4090. I casually told him over the phone that he has 2 kidneys, just sell one of those. The look on my wife's face, since she couldn't tell whether it was serious or not, was gold.
After owning a 4090 for 6 months, all I can say is that if you can afford a 4090, you should get one.
It's 2 years old; better to wait for the 5000 series. No point buying it now and regretting it when you see the 5000 series. A flagship GPU like this should be bought when it's released, that's the only way to justify the price!
Worst time to buy one, 2 years old AND the price increased..... it used to cost $2,300, now it's $3,500 in my country after all the sanctions and shit. Plus the 5090 is close, so might as well get that on release. That's what I did when I got my 4090, just bought it on release and built a whole PC around it.
@skilletbakes420 yea I did get my 4090 for $1100 on Facebook Marketplace. The 5000 series will probably be a worthy upgrade. Still, the 4090 has at least 4 more years of decent performance.
@@abeidsalim8008 yuh, just watch the 5060 Ti do what the 4090 can and only cost 500 bones and only use 200 watts or less 😊
I will, in about 6-8 months for about 500-600 dollars 😊
I went from a GTX 680 which I used for 7 years to a GTX 1660 Super that I used for 3 years and currently I have an RTX 4080 because I saw a deal which was just a no brainer.
Combined with the R7 7700X and 32GB DDR5 6000MHz RAM. This 4080 already is just... wow. I can play every game I ever wanted to play at max settings with high FPS.
I couldn't even imagine the power of a 4090. 😅 so yeah, nobody really needs a 4090.
Solid progression over the years!
I have a 7900 XTX and a 5800X3D, can play 95 percent of games at max settings at 144+ FPS at 4K. No need to upgrade anytime soon, and if I wanna play in 4K I just hook it up to the TV.
Always depends on your expectations. But once you go high end, you can always hold on to its value by reselling it at a good time.
@@grandcanyon2 Having the XTX is like having the "better" 3090 Ti - it will beat the 4080 in rasterization and at least compete with Ampere in terms of RT. Get the FSR3 mod and it becomes actually playable in 4K. Having 120 FPS in 4K + RT maxed would be nice, but even 75 is progress given the incredible beauty; in games like Callisto Protocol you don't necessarily need 120.
@@voyagerdeepspaceexploratio5023 I wish I had an HDMI cable that could do 4K at 144.
Re: 12V connector: If you didn't feel or hear a click, it's not seated.
5090: Now with fire detector and daisy-chainable fire extinguishers!
Correct. Still doesn't stop them melting tho! The whole seating thing was their first shot at blaming consumers.
I pushed the 12V connector on my 4090 in like it owed me money. The 12V 2x 8-pin connector that came with my Corsair is a very tight fit with my ROG Strix 4090. There is zero lateral movement in the connection with that cable.
Yup. I made sure it was seated properly and so far no issues. Been using mine for like a yr now.
Sadly that's not true. With the 12VHPWR to PCIe adapters in particular I don't feel a click. In fact, I think on my FE card you don't feel a click with any cable. But when I got an ASUS 4090 TUF you could feel it, though still barely. It's the nature of the clip being so tiny; there isn't enough force behind it or enough mass to make a proper click.
When I moved to official Corsair cables it was slightly better and definitely feels more secure, without the weight of the adapter pulling the connector at an unhealthy angle. I feel making the cables on the adapters longer would have been a good idea to reduce the strain.
Yes I am sure every time mine were fully seated.
I think we're getting to the point of diminishing returns as these cards become gargantuan. A new series of cards every two years doesn't allow enough time to shrink the die sizes. Pretty soon, we're all going to be buying cases according to how large the GPU is going to be.
100% agree on the timeline, it would be nice to see a 3-year cycle, give us a little more time. Seems like AMD might be doing just that this year, ironically. Nvidia is making too much money to want to stop.
I really hope the cards don't get much bigger too 😬
And need their own nuclear plant to power it.
I've been buying Nvidia for twenty years. The releases used to be yearly, sometimes two different series in the same year.
@@yjzep9922 They were certainly less eventful back then too. They'd just slip releases out randomly. Since the 700 series it's been a lot more "formal" or predictable.
That's already the case.
Besides, what does die size have to do with card size?
I can see the die sizes shrinking considerably compared to card size.
There are no diminishing returns; you get what you pay for, and in the case of the 4090 a lot more.
No such thing as too much power with VR. We are still probably 3 graphics card generations away from unlocking just the current VR headsets' potential.
Yeah, certainly with VR sets like Pimax Crystal Super coming. I'm personally using a RTX 4090 + Pico 5 right now. But I will upgrade to a higher end VR set when the RTX 6090 is on the market. Hopefully that will be powerful enough.
In MSFS I can get 90fps in VR on ultra but that's with the reverb G2. Not sure how my performance would be affected if I went to the Crystal Light or something with 2880x2880 pixels per eye. But I bet I could still max it out on any game aside from MSFS which is the most demanding VR game that I know of.
@@arcrides6841 There is no such thing as maxing it out. There is always more supersampling.
No doubt; my 4090 isn't even close to being able to handle the Vive Pro 2 at max settings - before supersampling.
Very true
First video I've seen of yours. You have a very natural, relaxed air to your way. Very pleasant to listen to and to watch. Also, the subject matter and pacing is great. Thank you!
Thank you, I appreciate you. One of the best comments I have gotten in a while.
Needed it to run a 7680x2160 monitor for sim racing. I don't feel lucky or blessed lol, I work my ass off, and spent 2 decades being working poor before my endeavors paid off enough that I can easily buy a $2400 gpu (white rog strix...) in my mid 40s... I also drive a mid tier pickup with 200k miles. Buddy of mine spent more on Taylor Swift tix for his wife and daughter lol.... and that's 1 night.
It's funny you mention the pickup, I almost left in a bit describing the 2013 Honda Fit I drive. The video was just getting too long.
All I will say is, hey man... whatever makes you happy. PCs are, if anything, kind of an affordable hobby when you start to look at car parts, camera gear, etc.
Appreciate the comment.
@@techwandoUS I have a relatively pricey sim racing setup.. I have about 15k in it including the PC. That might buy me tires and brakes for 1 real race season lol... and I mentioned the truck because I have friends who buy a new premium truck every 3 years, have a Corvette, and a Harley lol. So when they give me crap for how much I spend to "play games", I kinda laugh..
@@yjzep9922 yeah, this is cheap happiness and well worth it. Also it's safe, and we don't need to show wealth to others in public.
Anyone thinking about the future of video cards might be interested in the fact that motherboards are coming out with the power delivery in the PCI slot.
Despite what you feel you're still lucky and blessed, no matter how hard you worked.
Owning a high-end card (7900 XTX) for the first time, I definitely developed thoughts about GPU maintenance and its longevity ("I invested so much, maybe I should change thermal pads at some point" etc.). The 4090 is in its own league, but making such an investment comes with the expectation of a trouble-free, high-quality experience in every way and a usage of at least 5 years. The reality is obviously different, and I feel like the past 2 gens of high-end cards are more experiments than established, well-tested products. That's why I went all in and got the Nitro+ top-of-the-line model, just to minimize any quality issues. So far it's been amazing, but not everyone's GPU works well in their system.
That said, this is a very nice review! I had to do my own (fairly short) research and adjustments to make my 7900 XTX work without issues (so far), and I hope you can enjoy your 4090 for as long as possible.
Appreciate the well written comment. Happy to hear you are having good experiences with the 7900XTX. Lately I hear nothing but good from the owners of that card.
As for your concern with thermal pads, honestly, if the cooler design is good and they are using decent pads (which you can check with a GPU teardown site/video), then I'd say leave the pads alone for a long time and just replace the paste the second your warranty is up. The only time I was personally changing pads often was on the FE 30 series cards, because for some weird reason they decided to use really bad, almost paper-like thermal pads.
@@techwandoUS Of course. I've seen some "low quality" pads on other top models and decided to go for Sapphire just to make sure. But since top GPUs seem to stay very cool (UV+OC'd mine, 50°C, 65°C hotspot), maybe maintenance is less of a problem, unless temps start rising unreasonably high. Thanks for your insight.
I got a 7900 XTX and I love it. It's paired with a 5800X3D CPU and runs all games at max settings, 144+ FPS, easy. Best part: no need to upgrade for another 3 years.
@@Greenalex89 I have the Sapphire MBA, it easily goes to 115°C, 90°C on the memory under load. It likely has the thermal issue. The performance is still incredible and so far I've undervolted it. I plan to RMA it in October, or whenever I find a cheap 6700 XT, in hope of a refund; worst case would be a custom XTX. Best case I hold out and buy a used 4090 by the new year. You can be happy with those customs, they are much better.
@@voyagerdeepspaceexploratio5023 Don't you mean the AMD reference (MBA) 7900 XTX? Cause that one is known to have thermal issues with high hotspots, because a few thousand of those didn't undergo quality control before hitting the market. AMD refunded those without issues. When it comes to thermals I would pick the Nitro+ or the XFX version of the 7900 XTX.
The evolution of a “4K” card inevitably becoming antiquated due to increased requirements at that resolution is a great point I haven’t heard anywhere else. Easy sub w bell noti’s.
Appreciate the sub! Also yes, got to be thinking of the future of the products we purchase.
You don't need a 4090 for good 4K.
As much I want a 4090, I would rather keep my EVGA 3080ti FTW3 Ultra because I feel safer knowing it will not burn my house down.
Funny, considering the 3000 series was made on a shit process that runs hot, uses tons of power and degrades fast. Samsung trash node. It was cheap though. For Nvidia, that is.
Only people who weren't smart enough to plug the connector all the way in had issues.
Not once did my house burn down in almost two years.
Go figure.
@@arturiaarthus8367must be quite the smart cookie!
Been so busy I haven't been able to watch any content. I'm so happy that you are still going strong. Your channel is such a good watch. Very informative and entertaining.
Thanks Jesse. Always appreciate you bud.
Just built a new PC a month and a half ago. 14900K, 32GB DDR5, 2TB Samsung 990, beast system. But I "cheaped out" on a 4070 Ti Super instead of a 4090. The price difference would have been $1,300 for hardly any significant performance gain. I haven't shelled out the cash for a 4K monitor yet soooo… at 1440p I can max settings in any game and get buttery smooth frame rates. The 4090 was in my budget, but $1,300 extra simply isn't justifiable. I'm very happy with my decision. I can use that money to upgrade to a 5090 less than a year from now instead.
Nice, just make sure to get that Intel baseline bios for your CPU so it will last you longer. Appreciate the comment.
I built a similar machine to you recently with a 4070 Ti Super... It's incredible. Hard to see how i would need anything more really.
A 4090 at $1,800 and with more problems than any other top-tier card. Nvidia treats their customers like suckers.
What is the morally okay price ? What do you think?
Huh? What problems? They made a beast card. You buy a 200 lbs dog you’re gonna have 200 lbs dog problems
@@Daud438 It's okay for a top end card to be expensive. $2K may be too much for gaming, but there are still professional users doing stuff like video editing. And I don't think NVidia could really avoid the insane power draw and consequent physical size of the card. But the power connector should not be a fire hazard.
Try to tell us 3 problems and don't cry about your wallet. The PC hobby is not free.🤣😂
Out of all the cards I own and have owned, I love the 4090; it's so powerful but also quite efficient. However, that connector is going to be something that will more than likely devalue these cards or make them a headache for every owner. I got my TUF 4090 day one, and last week, after 18 months of use, the connector melted on it. I had to document and show it because it melted on both sides, GPU and PSU side. The PSU side got fused and couldn't be unplugged.
I work with high and low voltage electricity and always plugged mine in fully. I think what happens over time, and this is probably unavoidable, is that due to the high current going through the wires the plug gets warm and cold, warm and cold, and the thin pins expand and contract, increasing the resistance in the connection and the heat, until it escalates one day.
I've heard every theory from user error to the plugs being rated for only 20 or 30 insertions per cable. It's just a bad design, period. I like the idea of one plug to rule them all, but any replacement of an already established, proven method should be an improvement, and this connector isn't one over the standard 8-pin connector. I've abused those connectors, plugging and unplugging them so much without any issues ever, whereas with this one it's almost a given it will one day fail, and now I can't leave my computer to render and walk away. That's a huge side effect imo.
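For a rough sense of why even slightly degraded contacts matter at these currents (the 0.01 Ω below is an assumed illustrative figure, not a measurement): at the connector's 600 W rating on a 12 V rail,

$$I = \frac{P}{V} = \frac{600\ \text{W}}{12\ \text{V}} = 50\ \text{A}, \qquad P_{\text{loss}} = I^{2}R = (50\ \text{A})^{2} \times 0.01\ \Omega = 25\ \text{W}$$

That hypothetical 25 W would be dissipated inside a small plastic housing, which lines up with the heat-cycling failure mode described above.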
I'm on my second TUF 4090, melted connector as well.
@@Zwank36 Were you able to get a replacement relatively hassle-free? I had an extended warranty with my local MC so they just gave me a new one. Got another TUF because I like the build quality and cooler especially.
100% spot on about the long term value.
It's a shame you had to go through that, and I have also been saying the same thing. It feels inevitable. It will probably happen to a lot of them as time goes on. Probably right out of their warranty window 😅
@@TerraWare Bought from a brick and mortar store, had a replacement next day. Consumer protections very good in Australia.
What we really need to do is tell Nvidia to stop being so greedy and sabotaging their own products (already overpriced) so you have to buy another. Why hasn't anyone put the blame on Nvidia yet? This is clearly a problem that they created to make more sales.
I'm buying a 5090 because it's my childhood dream to build the best PC out there. Thank god money isn't an issue now.
Eh, then the 6000 series will come out and I'ma have the best PC out there lol
@@madboi1591 Not to mention people like this will pay $2,500 for a 5090... making the whole market for all of us… and themselves... even more insane.
"almost like we're test subjects for what's coming"
This is how literally all of us felt when Doom 3 came out, and again when Crysis came out.
It was remarkably more common back in the day to have to upgrade your entire system if you wanted a good frame rate in video games.
That was around the time I was moving from an eMachines to a "Dell with extra RAM and a low-end card".
I hated eMachines...
I've owned a 4090 TUF OC for a year and your points are spot on. It's a powerful card that the average consumer just doesn't really need. I've had to convince friends not to buy one simply because they just didn't need it. I had to get a replacement as I was having issues with the driver crashing, and thankfully my RMA experience was fast and great. They sent me a factory-new replacement within just under a week of diagnosing the issues with my card. During the downtime of not having the card I had to use my 3080 Ti, and it just felt so different; once you get your hands on something like it, it's just hard to get off it.
Summed up the video right there...
It is just too expensive for just a video game card. But otherwise you quickly get used to worse if necessary. I survived the two "GPU-less" months in 2022 with an R9 Fury, the cheapest option for some performance at the time. It could still play old games flawlessly in 4K; in BF4 the HDMI port is actually more of a bottleneck than the performance. You only buy a current-gen high-end card to play AAA titles in 4K, RT, 120 Hz and more.
But how will the 4090 hold up to skyrim delta plus loyalty addition in 2030?
🤣🤣🤣🤣🤣
I have a 4090 Strix, and I am honestly impressed. It's the best card overall that I have used, possibly ever.
It's definitely a badass card, there's no doubt about that. Happy it's serving you well
I still remember when I upgraded from a GTX 1060 6GB to an RTX 4090.
It used to take almost a minute to generate AI images, but after the upgrade, it took only a second.
I was in shock, and I'm still amazed by such speeds.
@@pastuh that's an insane upgrade
@@pastuh well it's 5-6x faster in normal workloads, and then you get the tensor cores for AI, which can make it 10-20 times faster in some workloads.
It should be for how expensive it is
Undervolting is the thing to do. I'm not conservative, and I'm running on a 750W PSU too.. UV at 0.95V is the key to stock or better performance at 75-100 fewer watts, so 300-350W max. OC the memory by 800-1000 and do the curve OC. There are a few videos on it. Very simple, very effective. You can go as low as 0.9V without much loss, so probably 250-275W, and it's still insane. Maybe a 5-8% loss. My card never breaks 60°C at 50% fan speed. TUF non-OC.
You should check the temperature on your PSU. It might blow up if you reach 100% utilization on both CPU and GPU. You can use a cheap scanner.
@@tyrellwreleck4226 I'm well aware of the power draw, and I always have software running to check it. Corsair HX750i. The PSU fan doesn't even come on most sessions, no joke. It's totally fine.. total system power draw is below 600W even at stock with no UV, running Cyberpunk or a similar game… It could probably run this way non-stop for a decade. Sure, I wouldn't buy a new 750W PSU when I replace this one, which is probably 8 years old or older…. I'll go with 850W minimum but might just go to 1000W to have plenty of headroom; I like the idea of the fan never coming on. But I think in general most people are overcautious and/or worry about stuff that will never be an issue. Also used a cheap 3-to-1 8-pin to 16-pin connector from Amazon. Did upgrade it to a Corsair 2-to-1 cable though, so it only plugs into two 8-pins on the PSU… it can still pull around 500-550W in testing, and that is close to the max PSU rating. But I wouldn't ever run it this hard. Again, this GPU is so damn efficient and loses so little performance capping the voltage at 0.95V. Max is 350-400W tops this way and it's very close to or better than stock, of course with the memory OC and curve OC. Runs about 30-45 MHz lower than at stock voltage.
@@LaneBatman-c2v That's not the point he was trying to make. There is a reason why PSUs are rated around 80%; that's the peak efficiency. Any PSU running above 80% utilization can have issues. Even without heat, capacitors can still blow when used at or close to 100% for long periods of time. Every PSU should have around 20% overhead capacity for your system and the way you use it. That's just how electronics work.
This is especially true when you consider that any other component in your system can fail and it will just emergency-kill your system without damaging anything, but a failed PSU usually takes down components with its death. And you will really kick yourself when your pricey GPU dies because of a blown PSU that has been run close to 100% for too long.
@@ronniebots9225 na.. it's fine. And it might run in the 500-550W range even at stock, closer to 400W with UV. Total power, not just the GPU. This is under 80% in both cases… hmmm… 600W is 80% of 750W.
_ImWateringPSUs_ has the definitive 4090 undervolting video on YouTube. Man is a genius. Makes it really easy.
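For anyone who would rather just cap power than tune a voltage curve, here is a minimal sketch that shells out to nvidia-smi from Python; the 350 W target is an arbitrary example, changing the limit needs admin/root rights, and this is not a substitute for a proper Afterburner curve undervolt.

```python
# Sketch: cap the board power limit and read back the reported draw via nvidia-smi.
# 350 W is an example target, not a recommendation; changing the limit requires
# admin/root privileges and resets on reboot/driver reload.
import subprocess

TARGET_WATTS = 350  # example value

# Apply the power limit.
subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)

# Confirm: query the current draw and the active limit.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.draw,power.limit", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "123.45 W, 350.00 W"
```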
Owning mine has been such a blast. Yeah its expensive as hell but I have no regrets. The last 2 years of gaming has been incredible. I highly recommend pairing this with an ultrawide or 4k OLED monitor. Match made in heaven.
I have a massive 3080 and the temps are so low and everything runs perfectly. I actually underclock the thing because I found that I never max it out.
I had never bought a GPU until I got the 4090. To me it looked like a normal card, but my friend saw it and said that GPU was ridiculously big. Then I opened up my old pre-made PC I bought a decade ago and saw the Nvidia card it came with... the card was tiny, with a single fan. The 4090 is gigantic. TUF even gave me a little stand to help hold the card in the new case so it doesn't start to lean from how heavy it is. Owned it for 6 months and happy with it so far.
I use a CORSAIR Premium 600W PCIe 5.0 / Gen 5 12VHPWR PSU cable for my Gigabyte RTX 4090 Gaming OC and haven't had any issues whatsoever. The plug has a snug fit and connects with a very audible click. It also uses solid wire instead of stranded. My motto is click it and forget it.
I have heard good things about the Corsair 12VHPWR cable. Haven't even seen one burn up yet.
Is a 600w PSU not pushing it a bit for a 4090? They advise on 850w minimum.
I have one of the MSI MPG pcie 5 psu's with the right cable and thats been problem free too
What brand PSU?
@@chrissyboy7047 he means the cable is rated to provide up to 600W; obviously the PSU wattage is higher than that.
@@davidb.5486 ahh. That makes sense. I misread the original comment. Doh
I bought a 4090 well over a year ago now, and I have no intention of upgrading for another generation or two. It's so far ahead of the rest of the competition that it'll end up being the 1080Ti of the 2020s and still be hitting good frames in 4-5 years time.
I came across your channel and your style works for me, so I subscribed with all that good good. I have an MSI 16GB Expert 4080 Super with that really enclosed shroud. Your size ideas are right. It's going in the Thermaltake Tower 500 with a Z570 Godlike. FYI the Tower 500 packaging foam works as the CATBED 500.
Hey! Thanks for the sub, and yeah the sizes are crazy. Rumour is they went smaller on the new designs (50 series), I guess we will see.
Also nice pro tip on the cat bed. It's hard to keep them out of customer boxes haha 😅
@@techwandoUS Yeah you get two sides of foam for cats. You get a rectangle perimeter with a "skin" that allows cat belly to droop into the cavity. Thanks for the(edit) reply.
This is the first GPU where I needed something to keep it from sagging and possibly breaking over time. My ASUS TUF 4090 came with a little metal magnetic stand thing that is adjustable. I made sure all power cables are completely plugged in; I feared the melting adapter issue. I've had this card since last October and it's been nothing but amazing. I have it in an i7-13700 system. I was using the i7-8700K and didn't want to bottleneck this new GPU. The i7-8700K was still running games great when paired with a powerful GPU, I also just felt the need to move on from six-year-old hardware. I fear hardware failure when things get so old. I'm using the Lian Li LANCOOL 216. This case has excellent cooling. The 4090 is so big that it gets real close to the two giant front case fans keeping things cool and quiet.
4090 Completely sucks. MY ATI Rage Pro consumes only 10 Watts of power. Get on my level.
Great video and breakdown. I absolutely love my 4090, it's just a beast with everything I throw at it. I don't think I'm even gonna look at the 5080/5090 and will skip to the 6000 series at this point, in another 3 years, which is kinda insane when we look back at the 2-year upgrade cycle a lot of us are on with GPUs. Star Citizen is the only demanding game in the future I'm really looking forward to, and it runs fantastic. I just can't see needing an upgrade for many years to come. Yes, the price is high, but I'm gonna get more years out of this card vs. all the cards I've owned over the past couple of decades.
Thank you, appreciate it. I think with the possibility of Nvidia having to limit the 5080 to the 4090 spec due to China (rumours) it's a good idea if you own a 4090 to definitely skip the 5000 series. I wish you many years of luck with your card my friend!
Good video man! I really enjoyed it! I have a 4090 and you're correct about everything. No one needs it. But once you have it, it is so hard to go back down. I will say this: there is something nice about knowing there is nowhere else to go when you're gaming. What I mean is this: I used to always wonder "What am I missing? How much more FPS could I get if I had a 3080 instead of a 3070? A 3090 instead of a 3080?" The list goes on and on. Eventually, it just becomes a chase. You then reach the top and, like Thanos, you rest. Now when I am gaming or video editing, I know that if I can't do something then no one else can either. I hope that makes sense lol. But with that being said, it is a stupid amount of money. If I was not a creator, there is no way I would ever spend my money on a GPU over 1K. Last night I fell asleep with my PC turned on. It stayed on all night; I never do this. This morning my first thought was "Oh no!" and "What if the cable is smoking on the 4090?" Thank God everything is okay. But man, I should not have to worry about that crap. Anyway, great video!
Thanks man, appreciate it. So the question is Erock....
Will you be scooping a 5090 😎😉.
I will try and borrow one from someone.
Also that falling asleep thing...
Did that once too, woke up frantically like I had committed a crime the night before.
I think we definitely hear about more of the failures due to viral popularity, but you can never be too safe. The slight undervolt seems to be the best thing you can do with the 4090.
Thanks again man.
@@techwandoUS As a tech creator who focuses mostly on GPUs, I feel forced to buy a 5090. So I will plan to get it. I may stand in line the night before it launches and make it an event type thing with a friend or two. I don't know yet.
@ErockOnTech That would make great content. Microcenter is pretty good with creators. Hit them up ahead of time and let them know, you never know.
I had the 4090 from March '23 to October '24. The good things everyone knows: incredibly fast, I ran everything at double 4K (Samsung Neo G9 57") with DLSS Quality and the experience was amazing. But I always had that thing in the back of my mind: "dude, you put $1,900 into this card and you play a lot of Factorio and BeamNG, not just Cyberpunk". So at the end of this generation I sold it for $1,400. I got the 7900 XT (not XTX) for $600. Even the room is a little cooler since I'm putting out only 300W as heat, not 450W. My temperatures are back in the 50s, but my framerate went from 120 to 90 with FSR3 Balanced and FG in demanding games. Again, 4K x2.
It was amazing, but I won't go back to any 90-series again. TOO MUCH MONEY. At most I will get an RTX 5080, the cheapest available, and that's it. This is not like a truck, where once you drive a truck you cannot go back to cars. The BFGPU is not that. That framerate is achievable if you change the settings here and there. Nobody needs "ultra" or "psycho" at every setting.
About ray tracing: that was beautiful. But not worth the extra 800 dollars.
Once you touch a 4090, you can never go back.
I can confirm that the worry about the cable is a thing. I turn off my PSU and disconnect my PC every day because I don’t trust it when I go off to work or something.
Why do you think anything would happen as long as the computer is turned off? Even a completely "almost broken" cable will have absolutely zero effect on your computer when turned off or in standby.
You can tweak power consumption quite a bit with the OC voltage curve in MSI Afterburner and then using nvidia-smi to reduce the maximum clock rate. It's kinda odd you can't do both of these in Afterburner, as clock rate offsets seem to disable the custom voltage curve.
That might seem nuts, but it's like on CPUs: having more cores clocked slower can actually perform just as well as fewer cores clocked higher, under the right workload. GPU tasks are inherently multi-threaded loads by nature, so they usually scale better to more cores than CPUs do. So overall, you still end up much faster than a 4080.
In compute workloads like Folding@Home I was able to get about 80% more performance on a 4090 vs a 4070 Ti at the same power consumption. It shaved about 100W off the 4090's power consumption with just a slight reduction in performance.
I still haven't gotten around to checking how it impacts gaming, but a lot of people at the 4090 launch found just limiting the TDP to 60-80% barely reduced performance in games. NVIDIA just pushed the clock rate on the 40x0 series way past the efficiency range. They actually seem most efficient at around 2.4 GHz, which makes sense as this is around where the gaming GPUs are clocked.
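Since the comment mentions capping clocks with nvidia-smi, this is roughly what that looks like as a sketch; the 210-2550 MHz range is a placeholder rather than a tuned value, and the commands need admin/root rights.

```python
# Sketch: lock the GPU core clock range with nvidia-smi, then reset it.
# The 210,2550 MHz range is a placeholder; -lgc needs admin/root rights
# and -rgc restores default clock behaviour.
import subprocess

subprocess.run(["nvidia-smi", "-lgc", "210,2550"], check=True)  # lock min,max core clock (MHz)
# ... run the workload and compare power/performance here ...
subprocess.run(["nvidia-smi", "-rgc"], check=True)              # reset to default clocks
```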
Cablemod is seriously sketchy. They refused to accept an unopened return because “It’s not worth the shipping cost”. They will NOT stand behind their product.
It would help to also focus on what monitor one is using. I am gaming on a 57" ultrawide with over 16 million pixels, where I need an RTX 4090 (which I have and love)....
Wow 57-in is insane lol. I agree though I moved over to a 48-in OLED for my monitor and you can't really play in anything less than 4K at that point. It just doesn't look right
For the cost of the 4090 i'd rather upgrade my entire platform. I play 1440P 165Hz with a 4070 Ti Super with no issues. Even that card is overpriced. A 70 series card used to be $600
Techwando: the 12-pin connector has smaller aluminium pins and thus a smaller combined surface to transfer current than the 8-pin standard has had since 2006. As well, the RTX 4090 has a TGP of 450 watts when in reality it continuously eats anywhere from 540-580 watts (= 45-50 amps of current) with peaks up to 641 watts. To give you perspective... my Manli RTX 3080 Ti 12GB continuously ate 390 watts (= 32.5 amps) with peaks up to 441 watts. And my Corsair RM850x wasn't able to feed it despite connecting 2 separate PCIe cables to the 2 x 8-pin on the GPU. I've measured, and my PSU can continuously feed a max of 354 watts through 2 PCIe cables.
Whereas my AMD RX 6900 XT, which I upgraded to from my 3080 Ti, continuously eats around 250-275 watts with peaks up to 294 watts... while being faster than my RTX 3080 Ti 12GB in every rasterized game today.
I've measured them both extensively with current clamps and an oscilloscope and then properly calculated the actual power consumption. Nvidia's 3000 and 4000 series GPUs are less efficient and eat more power for less/equal performance compared to AMD's 6000 and 7000 series.
The 12-pin connector is a FLAWED DESIGN, and it's Nvidia's design and thus Nvidia's fault, because they should have tested it properly before releasing it into the wild.
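For reference, the amp figures in that comment are just the wattage divided by the 12 V rail voltage (assuming all of the board power comes in over that rail):

$$I = \frac{P}{V}: \quad \frac{450\ \text{W}}{12\ \text{V}} = 37.5\ \text{A}, \qquad \frac{540\ \text{W}}{12\ \text{V}} = 45\ \text{A}, \qquad \frac{580\ \text{W}}{12\ \text{V}} \approx 48\ \text{A}$$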
I got an ASUS Strix 4090 and my card never, and I say NEVER, goes above 450W. Most of the time it pulls around 300W. This is not a lie, it's true, and even without an undervolt it won't go above 450W.
Also, I like the new adapter. Why? Because I don't like 3-4 massive cables going into my card.
@@venusprinzj8094 either you're lying or you're reading/using software that isn't reporting the power consumption properly. The RTX 4090 has a TDP of 450. This is Thermal Design Power, which is a completely different measurement from the actual power consumption. In reality the RTX 4090 consumes variably from 450 to 641 watts... continuously (in modern-ish games) around 550-590 watts, with spikes up to 641 watts.
@@lflyr6287 No, I don't lie. HWiNFO and Thermal Grizzly WireView tell the same story.
My 4090 does not pull above 450W because it can't; I don't OC my GPUs and I always undervolt.
The 40 series is efficient af. Optimum did a video, and a 4080 pulls less than 150W compared to a 7900 XTX in CS:GO.
A 7900 XTX pulls more than or the same as my 4090 with an undervolt.
It's simply not true what you are telling.
@@lflyr6287 I don't wanna blame you, but is there a reason for you hating Nvidia and for your Ryzen profile picture?
@@lflyr6287 I watched videos from Gamers Nexus and der8auer; they tested it and you lie.
Don't be a mad AMD fanboy. Why do you lie?
As a fellow 4090 owner I would agree with most of the gripes that you have about the card; the only thing I disagree with is the power draw. I have a Founders 4090 which is technically capable of pulling 600W, but I have never actually seen it pull anywhere near that except in benchmarks like FurMark and 3DMark. Even the most demanding modern games like Alan Wake 2 or Cyberpunk only pull about 380-400W maximum with path tracing and everything maxed out; with my undervolt it pulls about 250-280W. I upgraded from a 3080 which often pulled more power than my 4090 does.
Sorry, this looks like a rant but I didn't intend for it to be. Overall nice video!
I'm basically always ranting haha, what are your power settings/CPU etc?
The 4090 is definitely a very efficient card if used properly and tuned, but most people don't do that. Especially if they have, like, a Strix model or something high-end that just instantly sucks as much power from the wall as it can.
Also, some people probably don't see the full power draw because the games they play might not even fully use the card.
Appreciate the comment!
Lenovo has had (they go in and out of stock on it; just checked and it was currently in stock when I posted, it was out of stock this morning) the MSI Liquid Suprim X 4090 for $1,749 and discount codes can be applied. For me this brought the price down to $1,525. The liquid cooled card is much smaller (in width it is only up to two slots versus something more like 4 in width). So, those factors together ended up moderating (somewhat) a couple of the drawbacks mentioned in the video (though the price is still very steep).
$1,525 is still a lot better than something like the $1500 4080 Strix. Still a solid deal, nice purchase!
@@techwandoUS Thanks! Yeah, I have a couple of additional use cases as well. 1) I plan to use it for GPU multi-core programming / processing and 2) I use a freely available older, beta version of madvr coupled with videoprocessor, a capture card and the 4090 for real-time video processing (mostly for dynamic tone mapping to a projector in a theater setup). The equivalent hardware device (the madvr envy extreme - which is basically a PC with a high end NVIDIA card and the software) goes for around $9,495 to $14,995.
What codes did you use? You got it down a bit lower than me:)
Dukele432 - I just tried again (5/3/2024) and was able to get it back down to $1,529.37 from $1,749.99 (it goes in and out of stock). I got $50 off by using the $100-off code I got when signing up (this only lets me take off $50). I got $87.50 off by using EXTRAFIVE. I got $83.12 off by using my wife's nurse discount. They also have student, teacher, and first responder discounts (verified through id.me).
4090 is the king of 4K gaming. I have zero complaints about my 4090's. Now we are getting close to the 5090 and looking forward to seeing how it games at 4K.
I almost bought one of these yesterday. My local Best Buy has one 4090 Founder's Edition card in stock, and I had it in my cart. $1,711.99 after taxes. I am currently using an MSI Gaming X Trio RTX 3090 card with a 12th gen Intel CPU. I really want to upgrade, but I decided to remove the card from my cart and just wait for the 5090 card reviews and go from there. I don't need it right now. But I plan on buying that Pimax Crystal Light headset and want more power to run it smoothly. I appreciate the video. I just can't justify the cost. I got my RTX 3090 for $700 used from someone who upgraded to the 4090.
Definitely hold on to that 3090 a bit longer like you said. Also nice score, $700 ain't half bad.
hey buddy, I have almost exactly the same story. I got my RTX 3090 in Dec 2022, right before the 40 series release, for about $900. I was recently SUPER tempted to upgrade to a 4090 when I saw a deal of $1,400 on it. I did an analysis and found that there is a relatively small improvement to my experience (I mostly just want to collect it, not so much for gaming or machine learning stuff), but the downsides are 1) $1,400 is still something I would struggle to spend, 2) I am concerned about the quality issues mentioned in this vid, 3) the 5090 will maybe be released in 6 months and the 4090 will depreciate much more, 4) I don't know how to deal with the replaced 3090 and PSU (I'd need a bigger one); even selling them I assume only $600 can be collected. In short, I gain not so much, but would spend $800 net, and worry about quality and depreciation. Not a good deal. Maybe I will buy a 5090.
@@leoxu6743 They say good things come to those who wait! I am going to save up for a 5090 card, but I may end up buying a used 4090 if I can get one cheap. We will see. I am curious to see how much it will cost and how much better it is.
I bought my RTX 3090 for $600 last year from a guy who also upgraded to the RTX 4090. When prices for the 4090 drop, then I will upgrade my system.
@@RichLifeStories depending on how much the 5090 card is and whether there are any available, I may end up just buying a cheap 4090 card if I can. But no matter what, I will be upgrading. 😉👌
I built my current PC from scratch with a 1080 Ti 11GB GPU and an Intel i7-7700K CPU back in 2015; it has served me well and I played all my games on the highest settings until a few months ago.
I will start saving for another build from scratch, and let's see how long that will take.
Great video.
I upgraded my 6700K + GTX 1080 to an RTX 4090 + 7950X3D. It will last me A LONG TIME as I use it with a 1440p 240 Hz OLED. I never pay more per year for my PC hobby; I just put 1K aside each year and I buy when I see a great upgrade.
That is a ridiculous upgrade. You must have been mind blown for the first few months.
Totally worthy IMO. 🤙
@@techwandoUS I am still blown away now, 1 year later! 😁
Cool man! I did the same thing. I also went from a 6700K + 1080 Ti Strix (yes, the Ti version 🙂) to a whopping 13900K with a 4090 Strix. It's hard to explain the huge improvement. 😂 I also went to 4K, while I gamed at 1440p with my trusted 1080 Ti. It's crazy how that card held up for so long. Let's hope the 4090 does the same!
@@amdtje8282 Yeah that's a beast of a setup! I went from 1440P for now, but I will upgrade to 4K when the RTX 6090 is on the market.
This is a great vid even 6 months later. It's nice that you talked about GPU longevity and how the 4090 might age. It's not like, if you have a 3080 like myself, you need to blow $2,000 on the 4090 just because it's twice as fast.
Thank you, the 3080 is still a really strong card.
MSI 4090 Suprim here... Use for gaming only. Zero complaints, absolutely amazing performance. Can't wait for the 5090.
I honestly went with the 7900 XTX rather than an RTX 4090, mainly due to the 12-volt connector. My co-worker's MSI RTX 4090 connector melted on him. That pretty much made my decision to go with the 7900 XTX.
This is coming from a guy that has purchased mainly Nvidia cards - 9800 GT Ultimate, GTX 660, GTX 770, GTX 970, 1080, and now on my first AMD card - the 7900 XTX.
My ASUS Strix 4090 OC has been working for 13 months with no problems at all.
Using a 4K 120 hertz 55-inch LG TV as a monitor...
Edit: I got the cable from Corsair, just like my PSU.
That shouldn't be something to be proud of, it should be a given.
A 4090 will last you a few years! My budget allowed me an RX 6800 with 16GB. It is a good companion to the 5600X. VR beast. 1080p 165fps effortlessly. Runs cool. I hope to get a few years out of it. 20 years ago I would upgrade every year. I understand chasing that waterfall.
Shelf life is more important to me now.
I didn’t get a 4090 at launch and I regretted it. Won’t make that mistake again with the 5090.
Be careful. They might stagnate it since they have no competition and no more COVID-crazed, home-bound hardware buyers.
AMD isn't likely going to release a GPU to go against the 5090, so the 5090 could end up being what they were going to use for the 5080. They could realistically only make the 5090 25% or 30% faster than a 4090.
@@Killswitch1411 I personally think they are going the make the 5080 a bit faster than the 4090, and make the 5090 much faster, making the gap in performance between 80 and 90 class cards even bigger than before. That way they can make the case for a substantially more expense 90 class card. At least that would be the strategy if I were Nvidia.
@@nossy232323 I currently have a 4090 and struggle finding any games worth having the extras horsepower. I may stick with 4090 this next generation. My 4080 build plays the games I have time for well enough.
I only game on my 4090. I use a OLED ultrawide with 175hz. I definitely need this card to run steady high frames in modern games.
The last 7 or so years of owning ultrawide monitors has forced me to buy top of the line GPUs.
Just out of curiosity, what game is that footage from? Where people are melting after being killed.
That's actually MW3. It's a reskin of the airport
@@techwandoUS I thought it looked like CoD. I've only played the first two MWs though. Thanks!
@@SoundsFromBeyond also the melting effect is from a weapon bundle reskin, it doesn't happen by default.
I bought an MSI Suprim X 4090 in January, probably the worst time to buy one, but I was too tempted... I agree with the general message for gamers who are planning to use it only for gaming: it's not worth it, especially with the 50 series right around the corner... on the other side... I won't upgrade my AM4 system anytime soon and so I'm limited to PCIe 4.0, which means I won't benefit from the 50 series cards anyway since those cards will most likely adopt the PCIe 5.0 standard, so yeah... great video btw, really liked it.
Appreciate your comment, and yeah I'd hold on to your platform as long as you can. AM4 is still so solid. I assume you are running a 5800X3D with that 4090?
@@techwandoUS Yes, undervolted 5800x3D and 4090, I really like the performance and honestly, I see no need to upgrade for the next years, 4k@120fps should be no problem for most of the games I play (non-competitive) 👍🏻
I just bought a 4090 paired with the i9-14900k. I can't wait until it gets to me!
How is it? I went with the i9 and a 4080 super
@Btburkhardt It's fantastic. I bought a used Samsung Odyssey G8 and decided to set up for 4K, and it doesn't slow anything down at all. I thought I was having an issue because the GPU usage kept going down to 0%. Turns out I wasn't taxing the system enough, so it would revert to the CPU's integrated graphics instead. Quick system configuration change and it's been a dream since. Fast and smooth. I haven't had any issue with overheating or crashing.
Would you recommend the 4090 now or should I wait for the 5000 series? This is for video editing and some gaming.
@@carterskindle7086 I’m considering the same. 4080 super I’m not sure is as future proof
@@carterskindle7086 I definitely recommend the 4090. This thing is a beast. I've tinkered with the settings to really get the juice flowing and it is smooth as ice.
I personally have an RTX 4000 SFF. No power connector, 75 watts (65 under load is sometimes more realistic), enough VRAM that I'm not usually worried about it, and enough performance as long as I'm not trying to run at ultra settings.
I'm absolutely delighted with my purchase, and do not at all regret avoiding anything more cost effective in the low end, or not getting a 4090 in the high end.
What is your opinion on GPUs and UE5? I'm in the market to build another computer after running a 1080 Ti and 6700K for 7-8 years, but I'm a little hesitant between possibly getting a 4090 or waiting for the 5000 series to handle the upcoming UE5 games. I build to last 6-8 years.
UE5 is only going to get more complex as time goes on. Maybe some better optimization will come out of more dev time. That being said, if you have managed this long with your current setup (bravo btw, that's awesome), keep it going for a little bit longer.
It's looking like the 5080 is probably going to be somewhere around the performance of the 4090 because they want to sell it in China. The 5090 is also going to be quite insane according to some leaks (taken with salt). So you could either wait for all the 5090 buyers to fast-flip their 4090s and buy one of those; the only issue with that is all the 4090 issues, some of which are mentioned in the video. I'd say just aim for a next-gen card personally. That GDDR7 will be a game changer for hard-to-run titles. You'll also have peace of mind that you'll get an extra 2 years of support.
Actually, thank you. You put yourself in my shoes and gave a grounded and realistic assessment without the overbearing technical mumbo jumbo and that "but it's up to you" BS at the end.
So thank you
Appreciate the comment, happy to hear you got something out of it.
Imagine the electricity bill on the 5090😢😢😢
This is a really nice and informative video, thank you! It has made my decision to get a different card much easier. Also with the recommendation on the likelihood of 2k gaming being better long term. Good luck with your channel and I look forward to your future videos!
Happy to have helped! Also appreciate the kind words. Definitely this late into the 40 series lifecycle, I would probably wait to see what the 50 series offers, unless of course you need something ASAP.
How much does it add to your bill? I heard it only adds 10 or 11 dollars.
It wouldn't add a dime over any other card if you did the same workload. This card doesn't just magically peg itself at 600W 24/7.
Most of these are locked at 450W, which works out to a bit under 11 kWh a day at 100% use, around the clock.
Most people are going to game, which won't use 100%, and even then maybe only 8 hours a day (which puts you into a heavy user category).
All that being said, it's probably going to use 50-360 kWh a month. My power here is $0.16/kWh, so roughly $8 to $58 a month.
What this math doesn't show is that your current system probably used 90% of that already, so the difference for me was almost $0.
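A quick sanity check of that math in Python; the wattage, hours, and electricity rate are the example figures from the comment above, not universal values.

```python
# Sanity check of the electricity-cost estimate above.
# 450 W draw, $0.16/kWh, and the hours per day are example inputs.

def monthly_cost_usd(avg_watts: float, hours_per_day: float,
                     usd_per_kwh: float, days: int = 30) -> float:
    kwh = avg_watts * hours_per_day * days / 1000.0
    return kwh * usd_per_kwh

print(round(monthly_cost_usd(450, 8, 0.16), 2))   # heavy gaming: ~17.28 USD/month
print(round(monthly_cost_usd(450, 24, 0.16), 2))  # pegged 24/7: ~51.84 USD/month
```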
If your workload is big it does add a bit... mine added about 20-30 kWh a month on average, but I bought a huge 4K 80W display too, so that adds a little as well.
Just lock at 60% power and you will lose only 4-6 fps..
@@delfean2666 Were you able to get more work done, faster? The cost of electricity might be less than a cheaper card if calculated as productivity divided by energy consumption. I'm not sure what everyone's hourly rate is, but if you could save an hour a month, I'd assume that would offset the extra energy cost. Then again, I tend to justify these kinds of things to feel good about buying things I don't need - I have a 4090 being shipped to me as I write this. I almost immediately regretted my impulsive purchase, so I came here for reassurance that it will be worth the investment. It will be used for editing and game streaming on my son's rig.
Really good opinion piece, ty
Thanks for checking it out!
My 4090 has actually been the only part of my system I have not had issues with (watch me jinx it now). My ASUS X670E Extreme motherboard and 7950X3D have been nothing but trouble, and don't get me started on Windows 11 as a whole.
That's good to hear, do you just run it stock? Also totally agree with windows 11. It's been a mess. 12 will be here soon, hopefully they don't f it up.
@@techwandoUS I ran a slight overclock in select games, but the majority of the time it is just a stock Gigabyte Gaming OC 4090. And I wouldn't hold my breath about Windows 12 haha; the whole AI thing worries me. I don't want AI to have access to ALL MY FILES.
@@Evenaardez it won't; Microsoft's AI is just a cloud-based LLM and I doubt MS cares about your files.
Great video! Mind if I ask what the go-to monitor you mention is? I'm a 2K person myself and have a 6-year-old Dell 2K and have been dying to find something that might be better. Current monitor is 32" and non-OLED... so I'm almost expecting you to say it's a 27" OLED. LoL
I have a 3080 12gb and other than alan wake 2. there have been no games ive played over the last year that have made me want to upgrade. everything runs at 1440p smoothly
No wonder. The 3080 12GB is just a tad slower than the 4070 Super, and that one is a beast of a card for 1440p. Thanks to the weak generational improvement of the 4000 series, the 3000 lineup is still relevant.
@@stangamer1151 I know, but you can't deny the perf-per-watt improvement. I built a PC for a client with a 4070 and it was within margin of error of my 3080, but it was only using around 170W out of the box, and I got it down to 160W with higher-than-stock clocks using Afterburner. My 3080 draws 300-320W while undervolted, and over 400W when overclocked.
I swapped my 3080 12GB for a 4090 for Alan Wake 2 last year. You can play every game with path tracing in 4K.
Same, still using a 3080 for 1440p and it's doing just fine.
Went from an EVGA RTX 3090 to a Zotac 4090 just under a year ago... been in love with this card ever since. If you are a VR sim (flight/racing) gamer, it's literally a godsend. My biggest gripe is the INSANE power consumption. Gotta pay to play, as they say...
Getting mine next week... Can't wait! Will play at 4K as long as I can lol. Besides, 1440p has been my standard for many years anyway, so no loss either way.
What model did you pick up?
@@techwandoUS PALIT GeForce RTX 4090 GameRock 24GB
How is it? Would you recommend I aim for a 4090 now or wait for the 5000 series?
I don't even care about 4K, and I won't be buying a 4K monitor anytime soon.
@@carterskindle7086 Probably wait and see how the 5000 series turns out, but if you can't wait, then I recommend it highly.
Love your perspective at the end there!
I went with a 3060ti for 1440p gaming and VR purposes in 2020 and I'm still very happy with it.
That being said, the one vertical where you probably do benefit from the power of the 4090 is indeed VR. I think it's one of the few gaming justifications for the card: experiencing high-fidelity, high-end PCVR at max settings for full immersion, given the hardware demands of running VR ports of games that weren't optimized or designed for the medium (e.g., playing Red Dead Redemption 2 in VR is possible via a mod, but it's very heavy to run).
Outside of that, I honestly don't think it's necessary.
Appreciate the comment! I really need to look more into VR, as this is probably the most popular rebuttal comment to my opinion on the card. Thanks again 👍
My 4090 was bought on day one, so it's time for an upgrade. It has been through 2 CPU and motherboard upgrades (12900K/Z690 and 7950X3D/X670) and 2 monitor upgrades as well: a Samsung Neo G9 47" and then a Neo G9 57". I have the 47" as a secondary monitor now.
Otherwise my 4090 has been very reliable with no issues. Power draw is negligible because I have 2 electric cars and run my AC 24/7 at 20C/68F.
I never check the cable because it's fine. I'm using the stock cable from my 1000W Corsair PSU.
Upgrade the 4090??? 🤡
bro what?
So this is how I mitigated all those issues:
1: I bought a Corsair 12VHPWR aftermarket cable that goes straight from my HX1000 PSU into the GPU, and I've not heard of anyone having an issue with that setup. If you can afford a 4090, you can afford a £20 cable.
2: I bought my 4090 direct from Nvidia and got it at MSRP, £1,569.99.
3: I bought a Velocity2 water block for £200 for my 4090, which cut the size down drastically and also helps with the weight on the PCIe slot.
4: I paired my 4090 with a 7800X3D, and the max system power draw I've seen with a full custom loop is 600W.
Yes it's expensive, but it really is a Titan-class card, and that's what you should compare it to in terms of price and performance, not an xx80 Ti.
Look, nobody needs a Ferrari, but it's cool as hell if you can afford it.
Well said ☝️ and nice workarounds.
Wow, expected to see at least 200k subs. Have mine. Good video!
Appreciate you, maybe someday. Just gotta keep the passion for the hobby up.
As a mechanical engineer and industrial designer, I find the 4090 to be a fantastic multipurpose card. It is great for 3D model rendering, FEA (Finite Element Analysis), video rendering, and stabilization of drone footage. The other benefit of the 4090 is that it is less expensive than Nvidia's Quadro professional cards. I think the way to view the 4090 and future top-specification Nvidia cards is that their performance allows you to skip at least one generation, so the money spent on the card is amortized over a longer period and creates greater value. I agree that if the sole purpose of owning a 4090 is gaming, then the card is overkill.
Those final thoughts really made sense of this whole situation; it does feel like we lost that great-value segment and the excitement that came with it.
Appreciate you watching till the end. Yeah hopefully they make a shift with the upcoming generation. People need a win.
The thing about 4090 pricing often makes me smile. People forget that Nvidia sold the Titan RTX for $2,500; they will go to whatever price they want, and $1,600 isn't that high in the grand scheme of things. Even the 10-series Titan was $1,300.
not that high. yeah sure
@@TurboMintyFresh For you, my friend: I went to an inflation calculator. The $2,500 Titan card from December 2018 works out to $2,965.49 in October 2022, when the 4090 came out. So if we accept that the 90-class card is the Titan card of old, the 4090 is nearly half off compared to its 20-series equivalent... Is it a lot of money? Yes. But in comparison it's not that much, and Nvidia is capable of asking a way higher price than $1,600 for a 90-class card.
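Just to lay out the arithmetic behind that comparison, here is a minimal sketch; the $2,965.49 adjusted figure is the one from the inflation calculator cited in the comment above, not an official CPI number verified here:

```python
# Comparing the 4090's launch MSRP with the Titan RTX launch price expressed
# in October 2022 dollars. The adjusted figure comes from the comment above.

titan_rtx_dec_2018 = 2500.00
titan_rtx_in_oct_2022_dollars = 2965.49   # per the online inflation calculator cited above
rtx_4090_msrp = 1599.00

implied_inflation = titan_rtx_in_oct_2022_dollars / titan_rtx_dec_2018 - 1
print(f"Implied cumulative inflation, Dec 2018 -> Oct 2022: {implied_inflation:.1%}")  # ~18.6%
print(f"4090 MSRP vs. adjusted Titan RTX price: {rtx_4090_msrp / titan_rtx_in_oct_2022_dollars:.0%}")  # ~54%
```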
I have been using flagship GPUs since the 2000s for one simple reason: longevity. Also, "pay more or buy twice." It is a very hefty investment, no denying it, but what you get in return is "load it, max it, and play it." I was gaming at 1080p for many years despite having a flagship GPU because my logic was: if it can run, say, COD 5 at 250 fps, then it should be able to play COD 7 at 150 fps, and that certainly turned out true, at least for the COD/MOH/BF series. So you basically have peace of mind for multiple gaming cycles where you can confidently run all your games at maxed detail (until ray tracing came along, at least). Even though I have a 4090, I will sometimes still disable anti-aliasing, use DLSS in Balanced mode, and select the Very High (instead of Ultra) preset when I game on my 3440x1440 ultrawide screen, the primary reason being I'd rather play AAA titles at 120 fps on Very High than 90 fps on Ultra, although 90 fps is already very smooth. For run-of-the-mill FPS/action games the framerates are usually in the 150-200+ fps range, which is an indication of how much headroom is available, at least for the next two years or more. It is for this reason that one can justify paying >$1,500 for a flagship GPU.
Awesome video! Waiting to upgrade my 2080 Ti. I have been thinking of getting a used 4090, but I might wait for the 5090 instead. If you game and edit video on the same machine, what type of driver are you running (Studio or Game Ready)? Also, it looks like you are running DaVinci? Is it utilizing the GPU to boost editing/rendering performance?
Appreciate it! I swap between the gaming driver and studio driver actually. Was planning a video around this very topic just need to get to it.
DaVinci Resolve also runs really well on both drivers, but I did notice less freezing of the application with a mixture of video file types while running the Studio driver. I need to do more tests to really confirm anything, but I'm currently working with a lot of 4K Sony footage, AV1 OBS captures, and upscaled 1080p YouTube MP4s, and when you overlap them you occasionally run into hiccups.
Excellent video! Very easy to follow along with and you're well spoken; I appreciate you making this! I have a 3090 and thought about upgrading to a 4090 just for DaVinci Resolve. Can you tell me more about your edits? Do you edit 4K videos and render as such? And what encoder do you use? Any upscaling? Just curious if the 4090 is worth the upgrade from my 3090.
Appreciate the kind words! Honestly, I'll save you some time and just say that the difference between the 3090 and 4090 in DR and AP really comes down to final render times. The 3090 still has all that fast memory and holds up super well. I would skip this generation. As for edits, it's pretty much whatever the client wants. My personal videos are simple, usually a blend of 4K and upscaled 1080p. For others I'll use S-Log and/or slightly downscaled 6K from my camera.
As for the encoder, I primarily use AV1 for all of the screen grabs and H.264/265 for everything else. H.265 takes a bit longer to render, though.
I have a 4090 FE with an EKWB block. Zero connector issues, and it has run perfectly since day 1. I toyed around with pushing the OC on it for fun and benchmarking, then set it back to factory and applied a little undervolting. No issues so far and it never gets hot. The 12900K paired with it, on the other hand, takes some work to keep the all-core at 5 GHz while keeping temps down, because my EVGA Z690 board loves to add excessive voltage even at factory settings.
Waiting for the 5090 to come out, since that's the best opportunity to snag a deal on a 4090. If I buy it new, I will do it via Micro Center with the extended protection, so if it catches on fire they will just replace it 🙃
It says on the contract I signed with my landlord that I will only be using 1000W per socket in the kitchen and the washer / dryer cabinet. If, as many people say, your cable gets a little warm and they worry about it, think about what is going on INSIDE OF YOUR WALL. Especially if you live in a 100+ year old building like me. I think I might actually blow a fuse if I have a 1200W PC running at full speed.
Also a 4090 owner here, these are my takes:
1) The 12VHPWR isn't a well-thought-out connector, but most failures are definitely due to people not plugging it in right. Also, do not plug and unplug it repeatedly! This type of connector is rated for 30-40 mating cycles.
So connect it once, make sure it's fully inserted after you cable manage, and leave it as is.
2) Power consumption is high, around 450-500 watts for me, but that's the situation today even on advanced nodes.
3) Price: I paid $1,750 last year, so $150 over MSRP, but that's for the Suprim Liquid X version. Getting the card at MSRP isn't realistic, but that's nothing new; it has been happening for years now.
4) Size depends on the version; mine is 2-slot but very wide because of the PCB.
5) If you want 4K at high refresh rates, this is the best there is right now.
6) Good temps, but I will add the card to a loop; I just want to build a whisper-quiet system.
Very well said, appreciate the well written comment. Also great model card, and $150 over MSRP really isn't that bad, especially if you consider the price of a 360 AIO on top of the card.
There were changes to the ATX specification to prevent it from operating without a solid connection. The size is great because you don't have to deal with noise. Most of this stuff doesn't have anything to do with YOUR experience actually using the card. I still believe that if you get it at MSRP (I got 5% under MSRP with the MC card), it's WAY better value than the 4080. I know a lot of benchmarks show a 20-30% uplift, but that was with what was available at release; there is 60% more hardware available.
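For a rough sense of that hardware gap, here's a quick sketch using the published CUDA core counts; shader count is only one axis, so treat it as illustrative rather than a performance prediction:

```python
# Back-of-the-envelope comparison of 4090 vs. 4080 shader counts, per
# Nvidia's published specs. Real-world uplift is smaller because games
# rarely scale linearly with core count (CPU limits, bandwidth, clocks).

rtx_4090_cuda_cores = 16_384
rtx_4080_cuda_cores = 9_728

extra_hardware = rtx_4090_cuda_cores / rtx_4080_cuda_cores - 1
print(f"4090 has {extra_hardware:.0%} more CUDA cores than the 4080")  # ~68%
```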
The problem with the 4090 is that it’s not fast enough
I love every minute of owning it. Playing in 4K on a 4K screen is a real privilege. Still 78°C while gaming at the moment.
I've had a 4090 since January 2023... a total game changer for me... I now do my arch viz renders 100% on GPU, no more CPU renders. It also opened up video walkthroughs for me: a matter of hours or a couple of days instead of insane render times, or just realistically refusing to do videos at all.
It's really good for that type of stuff. Happy to hear it's serving you well.
4090 FE owner here. Only bad thing I can say (I mean price subjectively aside), was Nvidia's choice to go without their store and rely on Best Buy who had terrible early management of purchasing. While I game a lot, I do a huge amount of 3D rendering and other work so while yes, this card was expensive, it's been worth it. But if you're just wanting the best gaming performance, yeah, it's definitely a hard pill to swallow and I'm still bewildered that GPUs are now the price of what a machine used to be.
Wish CableMod's 12VHPWR adapters had worked out. I ran mine without much issue, but with all the snags I of course stopped using it (RIP to those whose cards died). Hopefully Nvidia and other OEMs have learned a lot from this launch.
On the size, yeah, it's as if Nvidia asked, "How much space in this hypothetical rectangle can we use for a PCIe card?" "Yes. Use it!" Thankfully the card fits in my case, but yeah, it's a monster. It does stay suuuper cool, though!
Wattage is not a measure of power efficiency. Joules per unit of work is efficiency, and the 4090 is pretty good on that. It is high power but it finishes the job much sooner. (I don't know what the idle power is compared to other cards, some idle performance should be included in any efficiency comparison.)
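That point is easier to see as energy per finished job rather than instantaneous draw. A minimal sketch with purely hypothetical numbers (the wattages and job times below are made up for illustration):

```python
# Efficiency as total energy per completed workload (joules = watts x seconds),
# not as instantaneous power draw. All numbers here are hypothetical.

def energy_for_job(power_watts: float, seconds_to_finish: float) -> float:
    """Total energy in joules consumed to finish one fixed workload."""
    return power_watts * seconds_to_finish

# Hypothetical high-power card that finishes a render in 100 s
# vs. a lower-power card that needs 250 s for the same render.
fast_card_joules = energy_for_job(450, 100)   # 45,000 J
slow_card_joules = energy_for_job(250, 250)   # 62,500 J

print(fast_card_joules < slow_card_joules)  # True: more watts, but less energy for the job
```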
Even though I could easily afford multiple 4090s, i just ordered a 4070 Ti Super 16GB. I think I'll be happy with the performance of that GPU. I'm coming from an RTX 2080.
I have enjoyed everything with my 4090. I would spend any amount of money to not have to water down my experience! Ultra everything or just nah! For years every high end card still had to compromise something. This is the only time in history I got what I paid for. Years ago you bought an $800 gpu and still had to dial back settings sometimes. Not with the 4090!
I've had a 4090 for 9 months now with no problems with the cable, but then again, I've never stressed the card to its limit.
At 4:01, where was that cable from that you had issues with?
I got a 90 degree cable from moddiy and so far, have had no issues with it.
Cablemod unfortunately. I still use them but that was a very early rev of the 16pin 12vhpwr cable.
I got my founders card for $1599. Never had any issues with the connector. It runs very cool. The power draw for me has been great, but then I have solar on my house so that helps a lot. I drive a Samsung Neo Odyssey G9. I have no complaints at all.
I got a white Strix 4090 OC. Got it for retail at Micro Center on release day. Definitely a highlight of 2023 for me.
My 1-year review of owning the 4090: I turn on a game, get impressed by the huge FPS and eye candy, say "cool," turn off my PC, then proceed to play the same game lying on my couch on my Steam Deck.
Just stream from your monster PC to your Steam Deck? That's what I do.