We need to have a chat about these 4090 Benchmarks...

  • Published Oct 10, 2022
  • The RTX 4090 is finally here... but there are some things you should know...
    Get an iFixit kit for yourself or someone you love at - amzn.to/3IDDkj9
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: / jayztwocents
    Twitter: / jayztwocents
    Instagram: / jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science & Technology

Comments • 5K

  • @Cryptic0013
    @Cryptic0013 ปีที่แล้ว +2584

    Remember folks: Time is on your side. Nvidia is, as Jay put it, sitting on a mountain of overbought silicon and developers aren't going to rush to use this new horsepower (they've only just begun optimising for the 2060 and next-gen gaming consoles.) If you don't hit that purchase button now, they *will* have to drop the price to meet you. The market pressure is on them to sell it, not you to buy it.

    • @earthtaurus5515
      @earthtaurus5515 ปีที่แล้ว +66

      Indeed, and that's generally the case: tech does go down in price, barring any unforeseen madness like a global pandemic, GPU-based crypto mining, or both...

    • @DarkSwordsman
      @DarkSwordsman ปีที่แล้ว +67

      I really do think games are at an all-time high of being CPU-bound due to how they work internally. For example, Unity is limited by draw calls: every mesh with every material incurs a draw call, and that's why VRChat runs so badly while barely utilizing a single CPU core or the GPU. It's also part of the reason why Tarkov runs so poorly, though that's mostly down to the insane number of objects they have in the game and the, in my opinion, less than optimal LOD.
      Engines like UE5 with Nanite and Lumen, and games like Doom are prime examples of the direction that we need to go for future games if we want them to actually take advantage of modern hardware. The hardware we have now is so powerful, I don't think people realize what absolutely crazy things we can do with some optimization.
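
A tiny, purely illustrative sketch of the draw-call arithmetic described in the comment above, written in Python rather than engine code; the scene contents are made up and real engines batch far more cleverly, so treat it as a back-of-the-envelope model only:

```python
# Toy model: each (mesh, material) pair costs one draw call unless meshes that
# share a material can be batched into a single call. Purely illustrative numbers.
scene = [
    ("avatar_a", ["skin", "cloth", "hair"]),
    ("avatar_b", ["skin", "cloth"]),
    ("room",     ["wood", "glass", "trim"]),
    ("props",    ["wood", "metal"]),
]

# Naive renderer: one draw call per mesh per material slot.
naive_calls = sum(len(materials) for _, materials in scene)

# Ideal batching: one draw call per unique material across the whole scene.
batched_calls = len({m for _, materials in scene for m in materials})

print(f"naive draw calls:   {naive_calls}")    # 10
print(f"batched draw calls: {batched_calls}")  # 7
```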

    • @corey2232
      @corey2232 ปีที่แล้ว +27

      Exactly. And at those prices, there's no way in hell I'm touching these cards. I already thought jacking up the prices last gen was too much, so I'm going to happily wait this out.

    • @JSmith73
      @JSmith73 ปีที่แล้ว +12

      Especially after RDNA3 hopefully brings a good sanity check to the market.

    • @theneonbop
      @theneonbop ปีที่แล้ว +8

      @@DarkSwordsman Draw calls in unity are often easily fixable with optimization from the developers (early on in the project), I guess the problem with VRChat is that the developers aren't really involved in making the maps.

  • @SkorpyoTFC
    @SkorpyoTFC ปีที่แล้ว +618

    "Nvidia doesn't see this as a problem"
    Summed up the whole launch right there.

    • @TheCameltotem
      @TheCameltotem ปีที่แล้ว +9

      Supply and demand 101. People who can afford this will buy it and enjoy it.
      If you don't have the money, then buy an Intel Arc or something

    • @lgeiger
      @lgeiger ปีที่แล้ว +30

      @@TheCameltotem Ha! Let's see how many 4090s will break due to the sharp bend of the cable. Not sure if everyone is going to "enjoy it" when Nvidia starts blaming them for breaking their 4090 because they bent the cables too much. This adapter is a problem, and I am sure it's gonna create problems in the future.

    • @raymondmckay6990
      @raymondmckay6990 ปีที่แล้ว +23

      @@lgeiger Nvidia could have solved the connector problem by having the connector be an L shape where it plugs into the card.

    • @lgeiger
      @lgeiger ปีที่แล้ว +10

      @@raymondmckay6990 Exactly my thought! But I guess that's not a valid solution for a billion dollar company.

    • @PresidentScrooge
      @PresidentScrooge ปีที่แล้ว +3

      @@raymondmckay6990
      What Nvidia should've done is plan ahead. If they plan to push a new standard, they should have worked with PSU makers 2-4 years ago so that at least the high-end PSUs would have that standard available. This is just pure arrogance by Nvidia

  • @bander3n
    @bander3n ปีที่แล้ว +315

    I love how I can watch multiple tech YouTubers and they give you a general idea about a product, while each one gives their input on a specific part that is important to them. Gives you an overall good idea about it. Very informative video Jay. Love it

    • @pat7808
      @pat7808 ปีที่แล้ว +15

      Jay, GN, and LTT. The holy trinity.

    • @punjabhero9706
      @punjabhero9706 ปีที่แล้ว +4

      @@pat7808 I watch all of them. And also Hardware unboxed, Pauls hardware and sometimes hardware canucks(I love their style of videos)

    • @pat7808
      @pat7808 ปีที่แล้ว

      @@punjabhero9706 Yes love the canucks!

    • @marvinlauerwald
      @marvinlauerwald ปีที่แล้ว

      @@pat7808 aaaand red gaming tech/hardware meld

    • @HillyPlays
      @HillyPlays ปีที่แล้ว

      @@punjabhero9706 Because this community subscribes to so many others, I was recommended those channels and it MASSIVELY improved my understanding

  • @YAAMW
    @YAAMW ปีที่แล้ว +167

    The MOST valuable info in this review was the bit about bending the connector. MASSIVE thanks for pointing that out. Somebody IS going to have a REALLY bad day because of this. The second most valuable info was the bit about the over-engineered coolers. This is the first time I felt restricted by the GPU clearance in my Armor Revo Snow Edition because of these unnecessarily huge GPUs

    • @DGTubbs
      @DGTubbs ปีที่แล้ว +5

      I agree. Shame on NVIDIA for downplaying this. If you screwed up somewhere in design, own it. Don't hide it.

    • @Chris-ey8zf
      @Chris-ey8zf ปีที่แล้ว +1

      Honestly though, if someone is that careless as to break off their connector by bending/pulling on the cables, they probably aren't responsible enough to be building/upgrading PCs. PCs aren't adult lego sets. You have to actually be careful with what you're doing. People that break things due to being that careless deserve to lose the money they spend. Hopefully it teaches them a valuable lesson for the future. Better they break a 4090 and learn than to mishandle a car or misfire a gun that can actually harm/kill others.

    • @EkatariZeen
      @EkatariZeen ปีที่แล้ว +8

      @@Chris-ey8zf Nope, that's just a sh¡t design. Even if the user is careful, they would be stuck with an unusable brick or an unsightly open case until they get one of the 4 cases on the entire market where that crap fits.
      Jay just showed that he broke the power cable by bending it, so it's not just about being careful not to break the cheap connector on the PCB.

    • @javascriptkiddie2718
      @javascriptkiddie2718 ปีที่แล้ว

      Vertical mount?

    • @MoneyMager
      @MoneyMager ปีที่แล้ว

      This video and comment aged "well" :)

  • @niftychap
    @niftychap ปีที่แล้ว +1696

    Going to wait for AMD or buy last gen. In my eyes performance of the 4090 looks amazing but so easy to get carried away and end up blowing way more on a GPU than I'm comfortable with.

    • @originalscreenname44
      @originalscreenname44 ปีที่แล้ว +72

      I would say that for most people it's unnecessary. I'm still running a 2080 FE and it does enough for me to play anything on my PC. Unless you're working in CAD or streaming/creating video, you don't really need anything this powerful.

    • @CornFed_3
      @CornFed_3 ปีที่แล้ว +113

      @@originalscreenname44, those of us that only game in 4K would disagree.

    • @victorxify1
      @victorxify1 ปีที่แล้ว +124

      @@hoppingrabbit9849 yea, 1440p 160 fps > 4k 90 in my opinion

    • @idkwhattohaveasausername5828
      @idkwhattohaveasausername5828 ปีที่แล้ว +70

      @@CornFed_3 if you're only gaming in 4k then you need it but most people are playing in 1080p

    • @2buckgeo843
      @2buckgeo843 ปีที่แล้ว +18

      Ya snooze ya lose bro. Get the 4090 and call it a day.

  • @TheMatrixxProjekt
    @TheMatrixxProjekt ปีที่แล้ว +1365

    Completely agree with you on the bending cable to fit in case issue. What they probably should have done is made the cable adaptor have a 90 degree bend pre-done or made the plug an L-shape that can lead the cable directly downwards (or going an extra mile and having a rotational system that can allow it to go in any direction). There are so many design choices that could have remedied this I feel, but instead they’re giving out a whole Nine Tailed Fox for builders to deal with. Really odd oversight for such an expensive product.

    • @MichaelBrodie68
      @MichaelBrodie68 ปีที่แล้ว +62

      Exactly - like the very familiar SATA L connectors

    • @kingkush3911
      @kingkush3911 ปีที่แล้ว +24

      Definitely would be smart to have a 90 degree cable to avoid having to bend the cable and risk damaging the gpu .

    • @Pileot
      @Pileot ปีที่แล้ว +21

      What's wrong with having the power connector on the motherboard side of the card, facing down? This is probably going to be the longest expansion card in your case, and it's not likely you'll be running two of them.

    • @Xan_Ning
      @Xan_Ning ปีที่แล้ว +2

      EZDIV-FAB makes a 8-pin 180 degree power connector that wraps over onto the back plate. I expect them or someone else to make the same thing for the 12-pin. (EDIT: just saw that they have a long 12-pin (not 12+4 pin) to 2x8-pin, so I think they will have one for 12+4)

    • @gordon861
      @gordon861 ปีที่แล้ว +7

      Was just going to say the same thing. I wouldn't be surprised if someone produces a 90 degree plug/extension to solve this problem.

  • @TheZigK
    @TheZigK ปีที่แล้ว +93

    couldn't wait any more. Had money to upgrade my 1050 ti when the market was inflated. Finally snatched a 6800 XT for $550 and have 0 regrets. Will still be watching to see how things evolve

    • @dennisnicholson2466
      @dennisnicholson2466 ปีที่แล้ว +4

      Last week I nabbed the 6800 XT after seeing a mod video that pushes this card to comfortably run like a 3090 Ti. Been having some power draw issues though; I thought I would have been safe using the same modular PSU that ran my dual 1080s.

    • @TheZigK
      @TheZigK ปีที่แล้ว +3

      @@dennisnicholson2466 I saw an article about the same thing! Seems like it requires a custom cooling setup to see real improvements though, and probably doesn't work with every game. If I find myself running up against the card's limits I might consider it

    • @ElectricityTaster
      @ElectricityTaster ปีที่แล้ว +3

      good old 1050ti

    • @HansBelphegor
      @HansBelphegor ปีที่แล้ว +1

      Same but msi 6950xt

    • @user-ck8ec7pj1l
      @user-ck8ec7pj1l ปีที่แล้ว +1

      I got that STRIX 3090 White edition he is showing for $999 2 weeks ago. Been eyeing it for months.

  • @bernds6587
    @bernds6587 ปีที่แล้ว +119

    Roman (der8auer) made an interesting point about the power target:
    setting it to 70% lets the card run cooler, drawing roughly 300W, while still performing at about 95% of its graphical power. The power-to-fps curve definitely looks like the card runs overclocked by default
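
A minimal sketch of how a power cap in the spirit of der8auer's 70% test could be applied in software, assuming the `pynvml` NVML bindings are installed and the driver permits changing the limit (usually needs admin/root); the 70% figure and the GPU index are example values, not a recommendation:

```python
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Default board power limit in milliwatts (roughly 450,000 mW on a stock 4090).
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)
target_mw = int(default_mw * 0.70)          # 70% power target, roughly 315 W

# Clamp to the range the driver actually allows before applying it.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
target_mw = max(min_mw, min(target_mw, max_mw))
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

print(f"power limit set to {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```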

    • @R1SKbreaker
      @R1SKbreaker ปีที่แล้ว +3

      Oh this is good to know! I have a 750 watt power supply, and I think I am going to splurge for a 4090 eventually, coming from a 2070 Super. I'd really rather not upgrade my power supply, and if I can hit 95% graphical power with my current power, then I am more than happy. 4090 is OP as is; I can deal with a 5% graphical power reduction.

    • @bobbythomas6520
      @bobbythomas6520 ปีที่แล้ว

      @@R1SKbreaker (coming from a person who owned a 750 watt power supply)

    • @ryze9153
      @ryze9153 ปีที่แล้ว

      ​@@R1SKbreaker I would get a 3070 now and then wait for 50 series. That's my suggestion.

    • @R1SKbreaker
      @R1SKbreaker ปีที่แล้ว

      @@ryze9153I'm actually just going to stay with 2070 Super until the 5000 series. After I upgraded my CPU, I'm a lot more content with my current setup.

    • @ryze9153
      @ryze9153 ปีที่แล้ว

      @@R1SKbreaker I'm hoping to have a 3060 ti or somethin like that pretty soon.

  • @Appl_Jax
    @Appl_Jax ปีที่แล้ว +459

    Was EVGA the only board partner that put the connectors on the end? At least they had the foresight to recognize that it was a problem. They will be missed in this space.

    • @onik7000
      @onik7000 ปีที่แล้ว +7

      The PCB is not long enough on most GPUs to put it there. Actually, the connectors on the 4090 FE are at the END of the PCB; only the fan and radiator are beyond that point.

    • @TheFoyer13
      @TheFoyer13 ปีที่แล้ว

      I have an MSI 3090ti with the 12 pin connector facing the side panel. These cards are so big, and so long, that if the power connector was at the rear, it would bend even worse. I guess the only benefit to that would be seeing the RGB "carbon" motherboard leds that are hidden by the PCIE wires but then it'd be uglier and the wires in the way of my fans. Now that they don't sell a lot of cases with optical drives, they aren't as long as they used to be. I guess it really comes down to what kind of case you buy. (My 3090ti is in a corsair 4000d, and it fits great, and the adapter doesn't touch the glass)

    • @EisenWald_CH
      @EisenWald_CH ปีที่แล้ว +1

      @@onik7000 That's why they put in a little daughter PCB (when it's more than just power) or just a cable that connects the PCB power input to wherever you'd like the connector to be (and then fix that connector to the metal or plastic structure of the heatsink). It's not like they can't do what EVGA does; they just don't care that much, or they feel it will look "out of place" or "bad". Cost is also a thing, but I feel it's negligible in this case, as it's just a little extra cable and some fixing points (a redesign would be a bummer though, with machining costs and all).

    • @BM-wf8jj
      @BM-wf8jj ปีที่แล้ว +1

      It wasn't even foresight. Last year I ended up having to send back an FTW3 3080 because I wasn't able to get the glass side panel back onto my Corsair 280X with the PCIe cables attached to it, smh. They could've at least sent out some type of communication to inform buyers that they'd fixed that issue.

    • @Old-Boy_BEbop
      @Old-Boy_BEbop ปีที่แล้ว +2

      @@onik7000 if only they could invent small pcb boards and capacitors with wiring to add the Power Connectors at the end... if only, what a world we would be living in.

  • @gremfive4246
    @gremfive4246 ปีที่แล้ว +360

    Igor's Lab explained why the AIBs' coolers are so massive: the AIBs were told by Nvidia to build coolers for 600-watt cards (hence all those rumors back in July of 600-watt 4090s). In the end the 4090 would have been just fine with 3090-class coolers, since Nvidia went with TSMC over Samsung.
    That's going to add a lot of cost to AIB cards over the Founders Edition, and it's maybe one of the things that made EVGA say enough is enough.

    • @ChenChennesen
      @ChenChennesen ปีที่แล้ว

      Bit of a tangent but what are the recommended psu sizes now?

    • @kimiko41
      @kimiko41 ปีที่แล้ว +17

      Gamers Nexus did some power draw testing: stock was 500W, and overclocked with the power limit / power target bumped it was pulling over 650W. Seems to me the large coolers are necessary unless AIBs want to set a lower limit than the FE.
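
A back-of-the-envelope PSU-sizing calculation using the stock and overclocked draws quoted in the reply above; the CPU and "rest of system" figures and the 30% headroom are assumptions, and real guidance (Nvidia's own recommendation for the 4090 is an 850 W PSU) also has to account for transient spikes:

```python
# Rough steady-state system load and a suggested PSU size with margin.
gpu_w    = 500    # stock 4090 draw cited above (~650 W measured overclocked)
cpu_w    = 250    # assumed high-end CPU under gaming/all-core load
rest_w   = 100    # assumed fans, drives, RAM, motherboard, peripherals
headroom = 1.30   # assumed ~30% margin for transients and efficiency sweet spot

load = gpu_w + cpu_w + rest_w
print(f"steady-state load ~{load} W, suggested PSU ~{load * headroom:.0f} W")
# -> steady-state load ~850 W, suggested PSU ~1105 W
```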

    • @fahimp3
      @fahimp3 ปีที่แล้ว

      @@ChenChennesen 10000W if you want to be safe with the transient spikes and future proof it.

    • @randomnobody660
      @randomnobody660 ปีที่แล้ว +9

      @@kimiko41 Wasn't it mentioned in this video that the card got to 3Ghz while maintaining 60-ish degrees in a closed case with factory fan curves? That sounds like even the fe's cooler is arguably overbuilt already.

    • @TotallySlapdash
      @TotallySlapdash ปีที่แล้ว +12

      @@randomnobody660 or NV is binning chips for FE such that they get an advantage.

  • @Porkchop899aLb
    @Porkchop899aLb ปีที่แล้ว +30

    New builder here and went for a 3080 Ti for 165Hz 1440p. The 4090 will be a lovely upgrade for me in 3 or so years lol

    • @liandriunreal
      @liandriunreal ปีที่แล้ว +6

      getting rid of my 3080ti for the 4090 lol

    • @kertosyt
      @kertosyt ปีที่แล้ว +1

      @@liandriunreal i just got a 1080 after my 970 died ..lol

    • @garretts.2003
      @garretts.2003 ปีที่แล้ว +5

      3080ti is a great card. Certainly an excellent first build. The cards are getting so good that last gen is worth the savings for me personally. I'm still on a 2080ti running 1440 ultrawide decent. Most single player games I'm fine with at 60fps and online FPS can always drop the resolution if required. I'll probably upgrade to the 3080ti while prices are good.

    • @mattgibbia2692
      @mattgibbia2692 ปีที่แล้ว +1

      @@liandriunreal you must hate money 3080ti is more than enough for anything out right now

    • @Amizzly
      @Amizzly ปีที่แล้ว +1

      @@mattgibbia2692 like what? Most games with high detail textures and RT on my 35” UW 1440p monitor are dragging ass with my 3080. Cyberpunk it’s like 45FPS even with DLSS on performance mode. With no DLSS it’s like 15FPS.

  • @luckyspec2274
    @luckyspec2274 ปีที่แล้ว +4

    17:00 Hi, I am from the future, JayzTwoCents was right about the cable bend issues

  • @theldraspneumonoultramicro405
    @theldraspneumonoultramicro405 ปีที่แล้ว +345

    Man, I just can't get over how comically massive the 4090 is.

    • @woswasdenni1914
      @woswasdenni1914 ปีที่แล้ว +7

      There go my plans for making an SFF build. The cooler itself is bigger than the entire build

    • @itllBuffGaming
      @itllBuffGaming ปีที่แล้ว +1

      @@woswasdenni1914 if the waterblocks for them are the same as the 3090. It’ll be a quarter the size, if you want a small build custom liquid is going to be the way now

    • @MichaeltheORIGINAL1
      @MichaeltheORIGINAL1 ปีที่แล้ว +4

      I thought the 3090ti was huge but this thing is a whole nother story, haha.

    • @Cuplex1
      @Cuplex1 ปีที่แล้ว +2

      Agreed, I think my 3080 is massive. But it's nothing compared to that beast of a card. 🙂

    • @watchm4ker
      @watchm4ker ปีที่แล้ว

      @@outlet6989 It'll fit in a full tower case, as long as you don't have drive cages to worry about. And you're thinking EATX, which is for dual-socket MBs

  • @tt33333wp
    @tt33333wp ปีที่แล้ว +60

    They could introduce “L” shape adapters. This would be a good solution to this bending issue.

    • @Digitalplays
      @Digitalplays ปีที่แล้ว +32

      Then you can take two Ls when buying one of these

    • @johnderat2652
      @johnderat2652 ปีที่แล้ว +4

      @@Digitalplays4090 would be a great card for Blender artists, AI training, and simulations.
      Not so sure about gaming though.

    • @mukkah
      @mukkah ปีที่แล้ว

      Interesting thought and I wonder if it has been explored by RnD at nvidia (seems like an obvious thought now that you mention it but it escaped me entirely up until then, so who knows)

    • @Chipsaru
      @Chipsaru ปีที่แล้ว

      @Anonymous did you see 43 FPS in Cyberpunk RT vs 20ish with 3090? Nice uplift for RT titles.

  • @markz4467
    @markz4467 ปีที่แล้ว +115

    You should start adding power consumption to the fps graphs. Something like - 158/220, where 158 stands for the fps count and 220 represents power consumption in watts.
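
A quick sketch of the fps-per-watt labelling idea from the comment above; the card names and numbers are placeholders, not measured results:

```python
# Format benchmark labels as "fps/watts" and add a frames-per-watt efficiency column.
results = {
    "Card A": {"fps": 158, "watts": 220},
    "Card B": {"fps": 190, "watts": 450},
}

for card, r in results.items():
    label = f"{r['fps']}/{r['watts']}"   # the "158/220" style suggested above
    efficiency = r["fps"] / r["watts"]   # frames rendered per watt drawn
    print(f"{card}: {label}  ({efficiency:.2f} fps per watt)")
```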

    • @nervonabliss2071
      @nervonabliss2071 ปีที่แล้ว +3

      Or just next to the cards name

    • @ramongossler1726
      @ramongossler1726 ปีที่แล้ว +1

      No, power consumption is just an AMD fanboy argument, just like "no one needs RT anyway". If you are concerned about power consumption, buy a laptop

    • @GM-xk1nw
      @GM-xk1nw ปีที่แล้ว +30

      @@ramongossler1726 Power consumption is a thing people who pay bills care about, you know people with responsibilities.

    • @GamerErman2001
      @GamerErman2001 ปีที่แล้ว +16

      @@ramongossler1726 Power consumption raises your power bill, affects what power supply you need, and heats up your PC as well as your room. Also, although this is a small matter for a single user, lots of people using large amounts of power to run their computers creates pollution and can also cause blackouts/brownouts.
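
The rough electricity-cost arithmetic behind the power-bill point above, as a small sketch; the wattage difference, daily hours, and price per kWh are all assumptions to swap for your own numbers:

```python
# Yearly running-cost difference between two cards under load.
extra_watts   = 200     # assumed extra draw of the hungrier card
hours_per_day = 4       # assumed gaming time per day
rate_per_kwh  = 0.30    # assumed electricity price in $/kWh (varies widely by region)

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * rate_per_kwh:.0f} extra per year")
# -> 292 kWh/year, about $88 extra per year
```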

    • @excellentswordfight8215
      @excellentswordfight8215 ปีที่แล้ว +8

      Bills aside, when using something like PCAT so that you actually get a good measure of GPU power draw, it would actually be a good way of seeing how system-bottlenecked the card is.

  • @IxXDOMINATORXxI
    @IxXDOMINATORXxI ปีที่แล้ว +27

    I'm sticking with my EVGA 3080 Ti; that baby will get me through until the 50 series easy. Honestly, by that point I'm hoping Intel cards are good, I'd love to try a good top-notch Intel card

    • @SuperSavageSpirit
      @SuperSavageSpirit ปีที่แล้ว +3

      Intel's cards won't be good, if they're even still making them by that point and haven't given up

    • @christopherlawrence4191
      @christopherlawrence4191 ปีที่แล้ว +1

      If I don't have a card at all (my 1050 Ti died on me a year ago), should I wait or keep the 4090 I ordered?

    • @brandonstein87
      @brandonstein87 ปีที่แล้ว +1

      ​@christopherlawrence4191 how is the card? I just got one

  • @shermanbuster1749
    @shermanbuster1749 ปีที่แล้ว +67

    I can see a lot of 90 degrees adaptors being sold for these cards. If I were in the market for this card, that is the way I would probably go. Go with a 90 degree adaptor so you are putting less stress on the cables and saving some space.

    • @ArtisChronicles
      @ArtisChronicles ปีที่แล้ว +7

      That's the one thing that would make the most sense to me. Problem for me is the damn things are so big. I do not want to run a card that big in my case.
      Those 90 degree adapters should exist regardless though. It's a pretty important piece overall.

  • @connor040606
    @connor040606 ปีที่แล้ว +52

    Thanks for the tips Jay with the adapter cable. Pretty sure you just saved a ton of people RMA headaches!

  • @Harrisboyuno
    @Harrisboyuno ปีที่แล้ว +2

    Love the iFixit promos as well. All around info and entertainment. Thanks so much Jay.

  • @rallias1
    @rallias1 ปีที่แล้ว +52

    So, someone else showed how they were able to cut power down to like 50% and still get like 90% of the performance. I kinda want to see your take on that, and maybe in comparison to a 30-series or team red card with the same power limits.

    • @paritee6488
      @paritee6488 ปีที่แล้ว

      tech ciiy!

    • @emich34
      @emich34 ปีที่แล้ว +2

      @@paritee6488 der8auer too - running at a 60% power target was like a 2% fps drop in most titles

    • @Trisiton
      @Trisiton ปีที่แล้ว +1

      Of course, you get diminishing returns after a certain point. This is why there are laptop 3080tis that run at 180W and still get like 70% of the performance of a desktop 3080ti.

    • @sonicfire9000
      @sonicfire9000 ปีที่แล้ว

      @@Trisiton those sound very interesting but I just have one question in my mind tbh: how are the batteries? Just asking so I don't accidentally run into a razer or alienware situation

    • @prabhsaini1
      @prabhsaini1 ปีที่แล้ว

      @@Trisiton 180 watts? on a laptop? those things must only run for a solid 28 seconds

  • @latioseon7794
    @latioseon7794 ปีที่แล้ว +328

    After the whole ordeal with 30 series and the "4070" thing, i hope nvidia gets a reality check

    • @eclipsez0r
      @eclipsez0r ปีที่แล้ว +45

      That performance sells lol this card gonna be sold out

    • @TheSpoobz
      @TheSpoobz ปีที่แล้ว +22

      Honesty just gonna stay with my 3080ti cuz of that

    • @kevinerbs2778
      @kevinerbs2778 ปีที่แล้ว +26

      @@eclipsez0r that's the most disappointing part about this.

    • @samson_the_great
      @samson_the_great ปีที่แล้ว

      @@eclipsez0r yup, I already hit the plug up.

    • @omniyambot9876
      @omniyambot9876 ปีที่แล้ว +1

      People who would spend money on a 3090 Ti before the crypto crash have every reason to buy 4090 cards, especially with those insane performance jumps. Yeah, we hate NVIDIA; they're dicks and overpriced, but let's stop being stupid here: their product is still absolutely competitive, and that's why people still buy them. They are not pointing a gun at you.

  • @dale117
    @dale117 ปีที่แล้ว +295

    What impressed me the most was the improvement in 4K resolution. Can't wait to see what AMD brings to the table.

    • @surft
      @surft ปีที่แล้ว +10

      Excited too, but I'm going to be shocked if their top of the line can get close (single digits) in fps to this in most games. The uplift in rasterization alone is insane.

    • @roccociccone597
      @roccociccone597 ปีที่แล้ว +8

      @@surft Well, leaks suggest they do match it, sometimes even beat it. I do expect RDNA3 to be very, very good.

    • @TwoSevenX
      @TwoSevenX ปีที่แล้ว +4

      @@surft AMD and Nvidia both expect AMD to *win* in pure raster performance by 5-20% with the 7950XT

    • @jakestocker4854
      @jakestocker4854 ปีที่แล้ว +27

      @@roccociccone597 The leaks always say that, though. Literally for the last 3 generations there have always been leaks that AMD has something huge coming, and then they release some solid cards but nothing like the leaks hype up.

    • @roccociccone597
      @roccociccone597 ปีที่แล้ว +7

      @@jakestocker4854 well rdna 2 was pretty accurate. And they mostly match nvidia. So I’m optimistic amd will manage to match or even beat nvidia in raster and get very close in ray tracing. And I hope they won’t be this expensive

  • @jonathanjanzen5231
    @jonathanjanzen5231 ปีที่แล้ว

    Love your combined 4K/1440/1080 graphs. Music is cool too!

  • @commitselfdeletus9070
    @commitselfdeletus9070 ปีที่แล้ว +1

    Jay makes the only ads that I don’t skip

  • @vorpled
    @vorpled ปีที่แล้ว +118

    der8auer had an amazing point which shows that they could have reduced the size of the card and cooler by 1/3 for about a 5% performance hit.

    • @sircommissar
      @sircommissar ปีที่แล้ว +20

      Id unironically rather not, rather have a bigger card and better perf. Let some AIB have their trash perf for smaller size

    • @Beamer_i4_M50
      @Beamer_i4_M50 ปีที่แล้ว +45

      @@leeroyjenkins0 A cooler 1/3 smaller means roughly 1/3 less power to dissipate. Means about 150 watts less you have to pay for. Means cooler temps all around. For a 5% penalty in performance.

    • @jtnachos16
      @jtnachos16 ปีที่แล้ว

      @@leeroyjenkins0 You are both utterly missing the point, and a perfect example of the type of idiot who lets scalpers (be they the official producer or a third-party) keep stake in the market. Which is what NVIDIA is doing right now by marketing their 4070 as a 4080. Additionally, your comments further paint the picture as someone who doesn't have the money for a 4090 in the first place.
      You've just demonstrated that you have no understanding whatsoever of how GPUs work. Guess what? With less power draw and 5% less performance, you are still in a range where overclocking can bring that performance back to a degree that is utterly unnoticeable in actual use (this ignoring that we are talking loss of 1-3 frames or so in most real-world loads). Furthermore, that reduced power consumption? Less wear on parts, means more reliability, not just on your gpu, but also psu.
      This is also ignoring that the transient spikes that were being caused by GPUs are still unsolved as of yet, and are capable of killing other parts in the system (yes, they are claiming the new power supply standard solves it, but those aren't commercially available yet and will likely have most consumers priced out for the first year or two. It's also a really bad precedent to be having a totally new power standard for PSUs popping up solely because one manufacturer refuses to work toward efficient use of power). Further yet, momentary power spikes were what was consistently killing 3080ti and 3090 cards, yet nvidia's response was, to my knowledge 'we added more capacitors' which isn't a solution, as those capacitors will still end up getting blown.
      Put bluntly, Nvidia tim tayloring it in search of tiny advancements in performance is absolutely a bad idea, from engineering, consumer, and environmental positions. Literally from every reasonable and informed position, it's a bad idea. Furthermore, that power draw actually is reaching the point where a high end pc will risk overtaxing standard american in-home circuits.
      The TLDR here, is that NVIDIA absolutely did not have to draw that much power for a substantial performance gap over last gen. They are being exceedingly lazy/arrogant and trying to brute force the situation in a way that is almost certainly going to result in failing cards and potentially damage to other parts in the system.

    • @rodh1404
      @rodh1404 ปีที่แล้ว +11

      Personally, I think NVidia took a look at what AMD has in the pipeline and decided to go all out. Because they'd probably have been beaten in the benchmarks if they didn't. Although given how huge the coolers they've strapped on to their cards are, I don't understand why they haven't just abandoned air cooling for these cards and just gone with water cooling only.

    • @HappyBeezerStudios
      @HappyBeezerStudios ปีที่แล้ว +1

      I heard that was more an issue with the manufacturing process. The one they originally planned warranted the 600W and massive coolers, but the one they ended up using sits at 450W and doesn't need those monsters.

  • @Zeniph00
    @Zeniph00 ปีที่แล้ว +123

    Impressive uplift, but happy to wait and see what AMD has. MCM tech has me very interested in what will come.

    • @Malc180s
      @Malc180s ปีที่แล้ว +4

      AMD has fuck all. Buy what you want now, or spend your life waiting

    • @georgejones5019
      @georgejones5019 ปีที่แล้ว +21

      @@Malc180s lmao. Probably a UserBenchmark fanboy.
      AMD has 3D V-Cache. The 5800X3D's tech will only improve with age, and they've stated it's applicable not just to CPUs but to GPUs as well.

    • @HeloisGevit
      @HeloisGevit ปีที่แล้ว +4

      @@georgejones5019 Is it going to improve their shocking ray tracing performance?

    • @AGuy-vq9qp
      @AGuy-vq9qp ปีที่แล้ว +3

      @@georgejones5019 That sounds false. GPUs are a lot less latency-sensitive than CPUs are.

    • @mutley69
      @mutley69 ปีที่แล้ว +7

      @@HeloisGevit Ray tracing is just another tactic to make you buy their next latest and greatest cards; the last 2 gens of cards have been bad for ray tracing. This 4090 is the first card that can actually manage it properly

  • @wile-e-coyote7257
    @wile-e-coyote7257 ปีที่แล้ว

    Thanks for sharing your benchmark results, Jayz!

  • @dschlessman
    @dschlessman ปีที่แล้ว +3

    Damn you called all these power connector issues. Good job!

  • @Daniel218lb
    @Daniel218lb ปีที่แล้ว +5

    You know, i'm still very happy with my rtx 2080ti.

  • @Annodite
    @Annodite ปีที่แล้ว +5

    Love your videos Jay! Always excited to watch it as soon as you release videos :D

  • @michaelkuhlmeier8472
    @michaelkuhlmeier8472 ปีที่แล้ว

    Just noticed the smoke effect on the background of the graph slides. Always enjoy the little touches Phil does to make the videos look great.

  • @kinaceman
    @kinaceman ปีที่แล้ว

    love the ifixit ad!! "..the new minnow" makes me laugh every time

  • @adamsmith4953
    @adamsmith4953 ปีที่แล้ว +77

    Looks pretty impressive, I can't wait to see what AMD comes out with

  • @Liberteen666
    @Liberteen666 ปีที่แล้ว +16

    Jay I really appreciate your content. Thank you for being quick when it comes to informing all of us. I'm looking forward to building my 4K system in the upcoming few weeks after I see the aftermarket versions of 4090 and their benchmarks. Keep us updated!

  • @dereksinkro1961
    @dereksinkro1961 ปีที่แล้ว +10

    5900x and 3080ti is a nice spot to be for 1440 gaming, will enjoy my rig and watch this all play out.

    • @duohere3981
      @duohere3981 10 หลายเดือนก่อน

      @reality8793 he thinks he's smart 😂

  • @werderdelange2985
    @werderdelange2985 ปีที่แล้ว

    Great work and honest review as always Jay! On a side-note, what song was the background instrumentals from?

  • @dunastrig1889
    @dunastrig1889 ปีที่แล้ว +21

    Thanks! Raw performance #'s are always what I look for first. DLSS and FSR are nice options to have if you can't push the fps but I want to see bare performance with all the bells and whistles. Now I'll look for a 12vhpwr 90 adapter...

    • @andytroschke2036
      @andytroschke2036 ปีที่แล้ว

      Better to wait for ATX3.0 PSU's to release instead of an Adapter.

    • @antonchigurh8343
      @antonchigurh8343 ปีที่แล้ว

      @@andytroschke2036 They are already available

    • @andytroschke2036
      @andytroschke2036 ปีที่แล้ว

      @@antonchigurh8343 Where? All I can find is ATX 2.4 with a 12VHPWR. ATX 3.0 has several other additions

  • @PixelShade
    @PixelShade ปีที่แล้ว +10

    I'm at a point where I am totally happy with the 6600XT performance... At least for gaming at 1440p. I kind of feel like games need to become more demanding to justify an upgrade. The 4090 is impressive and I would totally buy it if I worked with 3D modelling/rendering professionally... But let's not kid ourselves. This is not really a "consumer" product, but rather nvidias professional grade hardware made available to normal consumers.

  • @gettingwrekt
    @gettingwrekt ปีที่แล้ว +11

    Jay...I'd love to see you build the next personal rig in the EVGA E1. The analogue gauges are cool as hell.

    • @jsmooth76
      @jsmooth76 ปีที่แล้ว

      yes evga all the way

  • @JohnAmanar
    @JohnAmanar ปีที่แล้ว

    I still really love the iFixit intro! xD Great video!

  • @hdz77
    @hdz77 ปีที่แล้ว +9

    The only thing I got out of this benchmark is that the 4090's price is not justifiable. The 6950 XT and the 3090 are more than enough for price to performance; if anything, wait and see what the RX 7000 series GPUs bring.

    • @jondonnelly4831
      @jondonnelly4831 ปีที่แล้ว

      It is justifiable. Take the performance increase per dollar into account. The 4090 costs more, but it gets a lot more fps. If you have a 1440p 240Hz panel and a high-end CPU you will feel that increase. The 3090 will feel S L O W in comparison.

    • @LordLentils
      @LordLentils ปีที่แล้ว +2

      @@jondonnelly4831 Old flagship GPUs were the beasts of their time yet the price increase wasn't as tremendous over a single generational leap.

  • @NiveusLuxLucis
    @NiveusLuxLucis ปีที่แล้ว +81

    Thanks for talking about the connector, had the same concerns and nvidia's response is kind of unbelievable.

    • @earthtaurus5515
      @earthtaurus5515 ปีที่แล้ว +10

      Nvidia is too damn egotistic; they are fully aware of the 12VHPWR connector frying, as well as its very low durability. They just don't give a damn about anyone except their bottom line. So, if you fry your PC or break their connector off the PCB, they think everyone will go out and buy another 4090. The only way they will do anything about it is if there is massive public backlash, especially if people start frying their PCs en masse due to the low durability of the 12VHPWR 4-way adapter.

    • @Cruor34
      @Cruor34 ปีที่แล้ว +2

      This one isn't a big concern to me... I buy a top-end GPU every 4 years or so unless something breaks (for example, my 980 Ti broke, MSI didn't have any left so they sent me a check, and I got a 1080 Ti). I build the PC and I don't ever really touch it again, then I build a new PC years later. Who the hell is hooking and unhooking the cables constantly? On my list of concerns this is really low; unless I am misunderstanding, it's only an issue if you hook and unhook it 30 times.

    • @hashishiriya
      @hashishiriya ปีที่แล้ว +6

      @@Cruor34 cool

    • @flopsone
      @flopsone ปีที่แล้ว +3

      A 90-degree connector would make things a lot better/easier. Surely Nvidia or an AIB can find a little space and have the connector come directly out of the bottom of the PCB; that would direct the cable straight down to where most cases have the power supply.

    • @jordanmills7327
      @jordanmills7327 ปีที่แล้ว +3

      @@Cruor34 That 30 times was under "ideal conditions", which means the cable being inserted straight in with no wiggle and without bending it. It's almost certainly a lot fewer than 30 times if you bend or insert the cable in non-ideal conditions. Also, keeping it at a tight angle might wear the cable/port down over time, and considering this connector could be a serious fire hazard, this is completely unacceptable.

  • @__last
    @__last ปีที่แล้ว +2

    hopefully i can get a 3090 for much cheaper now since no game is gonna need a 4090 for at least 3-4 years.

  • @joshuaableiter4545
    @joshuaableiter4545 ปีที่แล้ว

    I agree with JayzTwoCents' observations on the power connector.

  • @vetsus3518
    @vetsus3518 ปีที่แล้ว +145

    I’m with you… a little confused why they didn’t create a 90 degree adapter if that was the temporary solution until the new PSU’s are released… that would have at least allowed you to fit it within a ‘normal’ case. Also, I love the GPU mounted on the wall in the back. At least that’s how it appeared to be. Looks like one of your custom printed GPU stands mounted to the wall with the card on it. It’s a cool look. Try putting up some motherboards too…. I mean since you’re just a little tech channel. Lol

    • @Ben-Rogue
      @Ben-Rogue ปีที่แล้ว +9

      That cable solution is just lazy. A 90 degree adapter with about 20CM of cable before they split it, would be a lot cleaner and easier for customers to fit into cases

    • @ChaseIrons
      @ChaseIrons ปีที่แล้ว +3

      Someone will make an adapter eventually. For my 3090’s I got u-turn adapters that have been excellent since launch. No bends needed at all

    • @joee7452
      @joee7452 ปีที่แล้ว +1

      I am not a betting man, but what are the chances they didn't create one so that the aftermarket would create them and could then charge 79 or 99 dollars for them as an extra part? Remember, they put caps on prices, so that would be a way for them to give, say, Asus an easy way to make 50 or 70 dollars extra on the 40 series that technically doesn't count toward the price of the GPU itself.

    • @MrMoon-hy6pn
      @MrMoon-hy6pn ปีที่แล้ว

      @@joee7452 Since nvidia seems to treat their AIBs with utter contempt as shown with evga leaving the GPU market entirely because of nvidia, I somehow doubt that's the reason.

    • @lUnderdogl
      @lUnderdogl ปีที่แล้ว

      I bet they planned to do but sometimes it is too late to implement.

  • @HRC4793
    @HRC4793 ปีที่แล้ว +45

    The iFixit ad is the only one you don't want to skip

  • @urazsoktay5275
    @urazsoktay5275 ปีที่แล้ว

    Very thorough and good video. Thank you.

  • @marco_scuba
    @marco_scuba ปีที่แล้ว

    Awesome video as always!

  • @CR500R
    @CR500R ปีที่แล้ว +78

    Thank you for testing the PowerColor Red Devil 6950XT! It makes me feel better about my purchase. It actually held its own against the 3090 & 3090Ti on a lot of benchmarks. Not many people test the Red Devil 6950XT. It's not the most popular of the 6950XT cards.

    • @KingZeusCLE
      @KingZeusCLE ปีที่แล้ว +2

      Maybe not, but any of the other 6950 XT numbers still apply. They likely all perform within 1-2% of each other.
      Bitchin' card though. Even with the 4090 released.

    • @vespermoirai975
      @vespermoirai975 ปีที่แล้ว +1

      Red Devil and XFX/Sapphire have always been my favorites when I've had AMD cards. Red Devil seems to be to AMD what EVGA was to Nvidia, with XFX/Sapphire coming close.

    • @ArtisChronicles
      @ArtisChronicles ปีที่แล้ว +1

      @@vespermoirai975 idk if I'd call the Red Devils that as the Red Devil RX 480 actually cut a lot of corners. Mostly applied to overclockers, but I'd still refrain from running Furmark on them. Unless you want to risk damaging them. Old card now, but still a relevant issue.

    • @vespermoirai975
      @vespermoirai975 ปีที่แล้ว +1

      @@ArtisChronicles If I remember right there was a thermal pad fix for that. Could be thinking the R9290

    • @farmeunit
      @farmeunit ปีที่แล้ว +2

      @@ArtisChronicles I had a Red Devil 580 and I loved it. That was their top tier card. Red Dragon below it, and then any other models. I wanted a Red Devil 6800XT but prices were ridiculous. Finally got cheaper but got a 6900XT Gaming Z for less.

  • @greenawayr08
    @greenawayr08 ปีที่แล้ว +124

    Jay, have you considered including VR in your benchmarks? it's an ever growing segment and really pushes performance with variance in performance from card to card. just a suggestion.
    thanks for the great content.

    • @pumpkineater23
      @pumpkineater23 ปีที่แล้ว +14

      Agreed. Even mid-range GPUs are way more than good enough for gaming on a monitor. VR is what really pushes a card now. Flat-screen gaming is yesterday's tech. How does the 4090 improve MSFS in VR.. that's a more 'worthy opponent'.

    • @blinkingred
      @blinkingred ปีที่แล้ว +7

      VR is growing? Last I checked it was stagnant with declining interest and sales.

    • @3lbios
      @3lbios ปีที่แล้ว +3

      I'd love to see a 3090 vs 4090 comparison on a Reverb G2 headset in the most popular VR games.

    • @lePoMo
      @lePoMo ปีที่แล้ว

      has the VR landscape changed?
      VR requires a fluid high framerate so much that no (sane) game developer takes any risks. or has this changed?
      When I bought into VR (CV1), every game targeted RX470-RX480 (GTX980/GTX1060). I somewhat got stuck on BeatSaber so didn't follow evolution since, but to my memory, the only games back then to break with this were flatscreen ports to VR.

    • @privateportall
      @privateportall ปีที่แล้ว +3

      Vr remains gimmicky

  • @patratcan
    @patratcan ปีที่แล้ว

    You made me like watching an ad. Hats off to you!

  • @JimmyFigueroa
    @JimmyFigueroa ปีที่แล้ว

    That iFixit ad was awesome lmao

  • @sidra_games4551
    @sidra_games4551 ปีที่แล้ว +23

    There is so much happening in such a short timeframe that it just makes sense to wait a few months before deciding on a new build. How are the Intel chips going to perform versus the new AMD ones? How will the new AMD cards perform? How will the lesser (70/80) Nvidia cards compare with this one? And keep in mind we are still waiting on PCIe 5.0 M.2 SSDs. It's new-build time for me as my last one is 5 years old, but I am going to let the dust settle and figure out what's best once Jan-Feb rolls around.

    • @Silver1080P
      @Silver1080P ปีที่แล้ว

      I've been waiting for what's next across the board for 3 years now, whether it's due to cost or lack of power, most of the things I've been interested in has pushed to the side. I have a 3080 12gb and i7 8700k so I'm happy enough for now. Will be looking at Intel's next cpu though

    • @jamesc3953
      @jamesc3953 ปีที่แล้ว

      @@Silver1080P Do you find your 8700k bottlenecks your 3080? what kind of resolution do you play at?

  • @carlwillows
    @carlwillows ปีที่แล้ว +16

    It will be interesting to see the performance of the 4080s with only 47% and 60% of the 4090's cores respectively.
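
Where the 47% and 60% figures in the comment above come from, using the commonly reported CUDA core counts for these SKUs (treat the counts as assumptions if the shipping specs differ):

```python
# Core-count ratios of the announced Ada SKUs relative to the RTX 4090.
cores = {
    "RTX 4090":       16384,
    "RTX 4080 16GB":   9728,
    "RTX 4080 12GB":   7680,   # this SKU was later "unlaunched"
}

full = cores["RTX 4090"]
for name, count in cores.items():
    print(f"{name}: {count} cores ({count / full:.0%} of the 4090)")
# -> 4080 16GB at 59%, 4080 12GB at 47%
```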

    • @carlwillows
      @carlwillows ปีที่แล้ว +3

      @@taekwoncrawfish9418 I don't think it will be quite so linear, but we shall see.

    • @ChiquitaSpeaks
      @ChiquitaSpeaks ปีที่แล้ว

      @@carlwillows Benchmarks dropped; it's pretty linear

    • @ChiquitaSpeaks
      @ChiquitaSpeaks ปีที่แล้ว

      @@taekwoncrawfish9418 same architecture lol right everything

  • @ChikoDc_Tv
    @ChikoDc_Tv ปีที่แล้ว

    Thanks for the wire bend tips !

  • @vedomedo
    @vedomedo ปีที่แล้ว +2

    I'm not gonna get the 4090, and I would probably have gotten the 4080 16gb if the pricing here in Norway was more in line with the $ pricing. However, here you have to pay the 4090 price for the 4080 16gb, and the 4090 costs like $2000-> $2500, which is simply silly. Even if I sold my 3080 the difference is still what I would expect a 4080 16gb to cost in total. Who knows, maybe the 50xx will be more in line with "normal" pricing, or maybe even the 40xx cards go down in price after a while.

    • @corniel657
      @corniel657 ปีที่แล้ว

      daym bro ur rich

  • @eddiec1961
    @eddiec1961 ปีที่แล้ว +6

    I think it's worth waiting a little bit longer. I think you're right to be concerned about the socket; a 90° plug would be a better solution.

  • @drizzlydr4ke
    @drizzlydr4ke ปีที่แล้ว +7

    EVGA really were smart having the cables on the side. On my Lian Li Dynamic the cables were touching the glass (with a front-facing cable plug), but having it on the side gives better room for the cables, even if you have a fan. Nvidia really needs to put it on the side so it fits most cases with no issue

  • @zachr369
    @zachr369 ปีที่แล้ว +1

    I got an EVGA 3090 FTW3 earlier this year once the price finally dropped back down, mostly to run a stable 120 frames in VR at all times for Assetto Corsa at ultra settings plus other filters. I play at 1080p, so seeing what the new 40 series can do is good to see. With all the news around Nvidia, I may or may not continue with their products in the future.

  • @dtrjones
    @dtrjones ปีที่แล้ว +1

    Thanks, Jayz, really liked this video! If you had any reservations about calling out Nvidia on the 12v power connector on the FE card then you shouldn't, I'm glad you called it out, however with all these new systems I'm going to go through a PC builder anyway so let's hope they've also seen your advice!

  • @Angsaar011
    @Angsaar011 ปีที่แล้ว +66

    I got myself a 3070 Ti. Simply because that was what was available at the time of the shortage. I'm very curious to see what AMD brings to the party. Here's hoping it will be something exceptional.

    • @Its_Me_Wheelz
      @Its_Me_Wheelz ปีที่แล้ว +4

      I nailed a great deal on a 3080 last month, and I'm all sorts of happy. It will last me a long time. Most likely somewhere around the 5000 to 6000 cards.

    • @Angsaar011
      @Angsaar011 ปีที่แล้ว +4

      @@Its_Me_Wheelz For sure. I had my 980 ti up until recently. The 30-series are going to last a long time.

    • @Its_Me_Wheelz
      @Its_Me_Wheelz ปีที่แล้ว +3

      @@Angsaar011 In all honesty, I was running a 2060 Super, and it ran everything I play with no problems: mainly ESO, HLL, COD, and a few other games of that kind. I had no intention of upgrading, but like I said, I got a great deal on the 3080 and so here I am.

    • @yyorophff786
      @yyorophff786 ปีที่แล้ว

      You should have waited for this card.

    • @josephj6521
      @josephj6521 ปีที่แล้ว

      @@yyorophff786 not at these prices.

  • @mcflu
    @mcflu ปีที่แล้ว +40

    On one hand, seeing the current performance of the RX 6950 XT compared to the 3000 series, I'm super impressed and looking forward to what they bring with RDNA3 next month. On the other hand, all of these cards are too much for me lol, and I'm happy with my 3060 Ti 😁

    • @chexlemeneux8790
      @chexlemeneux8790 ปีที่แล้ว +4

      Personally Im totally fine playing in 1080p and the 3060ti is more than capable of playing every game I own on ultra settings with at least 100fps . I got it in a $1800 CAD pre-built PC during the chip shortage , while people were paying that much for a 3080 by itself. I felt like I made out like a bandit and still feel super good about that purchase.

    • @craiglortie8483
      @craiglortie8483 ปีที่แล้ว +2

      That's why I went with a 6700 XT for my upgrade, running 60fps on a 4K monitor. I watch YouTube and streaming services more than I play now, so it was the best choice for me.

    • @cerebelul
      @cerebelul ปีที่แล้ว +1

      More impressive is the fact that it comes very close to the 4090 in some games at 2K/4K.

    • @FlotaMau
      @FlotaMau ปีที่แล้ว

      @@craiglortie8483 Same. I mean, 4K at 60 fps for single player is really fine; I only envy more fps for competitive, but playing comp at 4K is not even a thing.

    • @craiglortie8483
      @craiglortie8483 ปีที่แล้ว

      @@FlotaMau I play war thunder fine with my Philips monitor. The settings I turn down are the same as I would to improve game play. I stay locked between 55-60 fps all game. Lose a few ms from monitor but nothing I don't lose from age.

  • @blze0018
    @blze0018 ปีที่แล้ว +1

    I have a 3090, no rush on my end. If a 4090 or RDNA3 comes down in price a bit I might get one sometime next year, since I do play at 4k resolutions, but no rush. Ideally I hold off until the 5000 series and get an even bigger boost, although I am concerned about how power hungry those cards will be.

  • @Rasgarroth
    @Rasgarroth ปีที่แล้ว

    The best ad on a video I've seen in a WHILE

  • @ferdivedaasje
    @ferdivedaasje ปีที่แล้ว +3

    Thanks for the video and your thoughts. I use my PC to game and to do some 3D design work. Right now I'm super happy with my 3080ti, it does everything that I want and more. I game at 1440p, so I don't think the performance difference in current games would be very noticeable to me. I don't design massive projects, so also there I don't think I will notice much difference. My strategy will be to just wait for now. I do usually like to have new toys, but not to be the earliest adopter.

  • @damienlahoz
    @damienlahoz ปีที่แล้ว +9

    Why does it feel like the review community is trying to make the 90-class a mainstream product? I don't recall so much attention being given to the Titans, which this essentially is. It's just weird. Every channel has dedicated significant time to covering this showpiece, and what, 1% of PC enthusiasts will even bother trying to buy one regardless of how it performs. It's obvious that Nvidia is trying to normalize certain price points, but it doesn't mean people have to play along.

    • @DanielFrost79
      @DanielFrost79 ปีที่แล้ว +1

      I personally hope and wish people did NOT play along. These prices are fuck**g insane and ridiculous.

    • @TheKain202
      @TheKain202 ปีที่แล้ว +1

      Because it's the only one they're allowed to make content about until NV starts shipping out the lower models? And besides, as ridiculous as it sounds - the 90s have the best value.
      The 4080 and the 40"80" had such a ridiculous price hike from last gen, or any gen before for that matter, that it's really hard to justify buying them.

    • @damienlahoz
      @damienlahoz ปีที่แล้ว +1

      @@TheKain202 value? You could say a Ferrari is great value compared to a McLaren. And youd be right but its also astronomically expensive and outside the price point of 99.9% of consumers.

    • @Ferretsnarf
      @Ferretsnarf ปีที่แล้ว +3

      The power draw is just absolutely insane as well, and is an enormous increase over the previous gen. We're essentially seeing the performance scale with the power draw... which isn't really that impressive. 600 watts is off the charts. Honestly, we're not far off from having to dedicate an entire circuit in your house to these PCs that would be running hardware like this.
      Between the price, the power, and the ludicrous size of these things, when is enough enough? Nvidia, come see me when you make a better card by making it better, not by getting the performance out of a proportional gain in both size and power draw.

    • @damienlahoz
      @damienlahoz ปีที่แล้ว +1

      @@DanielFrost79 When the rubber meets the road, no, 99% of gamers aren't playing along. $2k is still $2k to damn near everyone. A lot of these people saying they are buying one have 3060s and ain't buying sht

  • @pkkillernate
    @pkkillernate ปีที่แล้ว

    Where is the cheaper ad blocker? Because I've watched 6 ads just for this video, and 4 without skips! Love the channel.

  • @Yoyo34
    @Yoyo34 ปีที่แล้ว

    Normally I am very bored while reading benchmarks or listening to people reading from them. But this song, made it a lot better. 😀

  • @BrettWidner
    @BrettWidner ปีที่แล้ว +194

    This is actually very interesting. A lot of your numbers for the 4090 are INCREDIBLY different from LTT's. I'm actually quite perplexed by it, their 4090 was getting 2X the fps of your 4090 test on what looks like the exact same settings in Cyberpunk. 4K, RT On, Ultra Preset, DLSS Off.
    EDIT: Just want to clarify, I'm not accusing either reviewer of anything. Merely pointing out the vast differences, could be related to their test bench, could not be. Something one might have to think about if they're looking to buy this card.
    UPDATE: LTT ran the card with FidelityFX on by mistake. JayzTwoCent's numbers are accurate.

    • @Pleasant_exe
      @Pleasant_exe ปีที่แล้ว +17

      And gamers nexus

    • @AsaMitakasHusband
      @AsaMitakasHusband ปีที่แล้ว +7

      Yea i noticed that too lol

    • @marcelosoares7148
      @marcelosoares7148 ปีที่แล้ว +30

      Hardware Unboxed too got different numbers but the strangest one was the 6950XT getting only 28fps in CP2077 on the LTT Benchmark

    • @B8con8tor
      @B8con8tor ปีที่แล้ว +8

      Everyone's numbers will not match. It will depend on room temperature, open/closed case, CPU, memory, and so on.

    • @BrettWidner
      @BrettWidner ปีที่แล้ว +46

      @@B8con8tor I get they're on different benches but LTT's 4090 getting 2X the performance of Jayz 4090 on from what I can see are the same settings in Cyberpunk?

  • @shadowjulien5
    @shadowjulien5 ปีที่แล้ว +44

    Definitely waiting on rdna3 to see what they’ve got to offer. Then I finally want to get around to an itx build. Im also waiting to see what raptor lake has to offer because with the price of am5 boards I might actually end up going Intel after 4 zen systems lol

    • @ZackSNetwork
      @ZackSNetwork ปีที่แล้ว

      Why on GPU’s that powerful and huge they need space and proper cooling? As hell as a high wattage PSU. SFF builds are good for low to mid range PC’s.

    • @Adonis513
      @Adonis513 ปีที่แล้ว +3

      Never understood the point of putting high-wattage cards in ITX builds; very stupid.

    • @shadowjulien5
      @shadowjulien5 ปีที่แล้ว +4

      Tbh the engineering challenge of getting that much power in a small space seems fun and I’ve had a mid tower for like idk 15 years lol I just wanna try something different

    • @Adonis513
      @Adonis513 ปีที่แล้ว

      @@shadowjulien5 You're buying cutting-edge hardware just so it can be throttled in an ITX setup; unless you are doing a custom loop I see no point.

    • @ghomerhust
      @ghomerhust ปีที่แล้ว +1

      @@Adonis513 thats the point of the challenge they talked about. if they can get it to run properly on air, thats a win. if they can fit cooling in that tiny box, its a win. for some people, just chucking big ass parts with big ass numbers in a big ass case with big ass airflow, well, it's boring as hell, regardless of performance.

  • @milesprower06
    @milesprower06 ปีที่แล้ว +1

    My GPU buying plans happened a month and a half ago; used 3080 Ti on Facebook for a very decent price.
    Likely going to be my workhorse for the next generation or two for 4k gaming, which I have just now gotten into the past month.

  • @NightWolfx03
    @NightWolfx03 ปีที่แล้ว

    some of those heatshrunk connector ends can be bent carefully with adding a bit of heat. But of course you have to be careful not to melt the connector or sleeve, and also careful not to pull the insulation back on the wires or pulling pins out. But a heatgun on a low setting can sometimes be helpful with some, but not all, harnesses. Just depending how thick the heatshrink is, and if there is anything under it ( like Corsair has capacitors in some of their cables ).

  • @OzzyInSpace
    @OzzyInSpace ปีที่แล้ว +10

    With the placement of that weak as heck plug, I'll be holding off of the 40 series. I'll be perfectly happy with my 3080 TI for a while.

  • @sleepii15
    @sleepii15 ปีที่แล้ว +2

    Great review and details as always. I'll stay with my 3090; I wanted to upgrade, but it's not worth it without a 280Hz 1440p monitor. Can't wait for monitor technology to catch up. That bending-cable situation looks like it will show its true colors soon enough; I think they could've done a better job with it.

  • @TonnyCassidy
    @TonnyCassidy ปีที่แล้ว

    15:40....... i expected the signature laugh of JTC channel (phil's laugh)

  • @RIGHTxTRIGGERx
    @RIGHTxTRIGGERx ปีที่แล้ว +8

    I'm always a few gens behind because I can't really afford the best of the best, but rn I'm pretty focused on upgrading my CPU. I've currently got a 1660 Super and I think I want to upgrade my graphics card sometime next year. I want to see how much the 30 series drops in price once all the 40s have been announced, released and sold. New tech and competition is a good thing for everyone.

    • @Bdot888
      @Bdot888 ปีที่แล้ว +2

      Good idea! I waited for a while and recently went the used GPU route and got a 3080 for $550. But I'm sure prices will drop a little more, just keep an eye out!

    • @Erikcleric
      @Erikcleric ปีที่แล้ว +2

      3090 prices dropped like crazy in Sweden over the past few weeks, from 2600-ish USD to 1300 USD now. So I'm upgrading my brand-new rig, which has a 3060 Ti, to a 3090.
      The 4000 series is overkill for any game right now and for the coming years, unless you NEED 4K ultra at max fps...
      My 3060 Ti will go into a future desktop I'll get for my old room at my mom's place. Hate for it to just be abandoned.

    • @RIGHTxTRIGGERx
      @RIGHTxTRIGGERx ปีที่แล้ว +1

      @@Bdot888 I might look into something used actually. Not something I've ever thought about doing, but it could save a lot of cash!

    • @RIGHTxTRIGGERx
      @RIGHTxTRIGGERx ปีที่แล้ว +1

      @@Erikcleric Definitely not in the market for anything over 1k, but the way prices have been trending, I don't think that's something I'll have to worry about soon. I get not wanting to toss parts, it feels like such a waste. I've got a bunch of old parts taking up space in my closet that I'm never going to touch again, but I can't get myself to get rid of them lol.

  • @09juilliardbayan
    @09juilliardbayan ปีที่แล้ว +39

    Considering my budget, as much as I dream of having a 40 series, I see this as the perfect opportunity to buy a 30 series, which I have been waiting for for a looong time. It's all so exciting

    • @exq884
      @exq884 ปีที่แล้ว +1

      same - looking at a 3090

    • @bloodstalkerkarth348
      @bloodstalkerkarth348 ปีที่แล้ว

      @@exq884 Wait to see whether the 4080 or the new AMD card is better

    • @zerogiantsquid
      @zerogiantsquid ปีที่แล้ว

      I'm in the same boat. I saw a 3090 for $950 on newegg two weeks ago and sniped it. I'm kinda sad that the 4090 is so much better, but at the same time I was super excited to finally get my hands on a 30 series. Still a massive leap from my previous card.

    • @chillchinna4164
      @chillchinna4164 ปีที่แล้ว +1

      @@zerogiantsquid Life is about being happy with what you are able to get, rather than being upset about not obtaining perfection.

    • @beH3uH
      @beH3uH ปีที่แล้ว +1

      Just bought an RX 6900 XT for 800 euros lol, prices are good.

  • @stueyxd
    @stueyxd ปีที่แล้ว +12

    Just looking at the sheer size of the card, I think a lot of us would need a bigger case, and with the concern you are sharing about even the larger cases requiring such a stressed bend, I think I will give it a miss until some aftermarket/third party manufacturers correct this in our favour.

    • @Toutvids
      @Toutvids ปีที่แล้ว +1

      Exactly, I couldn't even shut the glass side panel of my full tower Thermaltake Core X71 with one of these mounted. If I went vertical mount, the card would be starved for air, shoved flush against the glass. No thought was given to current cases on the market when making these GPUs.

    • @yurimodin7333
      @yurimodin7333 ปีที่แล้ว

      just cut a hole in the side like a supercharger sticking out of a muscle car hood

    • @PDXCustomPCS
      @PDXCustomPCS ปีที่แล้ว

      Imagine this in an O11D Mini.. 😅

    • @cole7914
      @cole7914 ปีที่แล้ว

      Cost of the card, cost of a new PSU, and cost increase of electricity to run this monster. Nah… I’m good.

  • @TinariKao
    @TinariKao ปีที่แล้ว

    I love looking at high end desktop hardware as a precursor of what can trickle down in the mobile, low power space which is where I start to care about things. :3

  • @JosueRodriguez1225
    @JosueRodriguez1225 ปีที่แล้ว

    I never skip your ads, they’re awesome

  • @kieranpalmer2085
    @kieranpalmer2085 ปีที่แล้ว +3

    Here before the title change, love you Jay haha

  • @AdrianRusu95
    @AdrianRusu95 ปีที่แล้ว +3

    That iFixit ad spot 🔥🔥🔥

  • @dunningkrueger
    @dunningkrueger ปีที่แล้ว

    Love the soundtrack!

  • @roylee3558
    @roylee3558 ปีที่แล้ว +15

    They need to put the cable plug on the motherboard side, then make the cable end a premade 90° (like how you have 90° SATA cable ends). This would keep the cable out of sight, and also relieve the bend pressure on the card's connection port.

    • @tr5848
      @tr5848 ปีที่แล้ว

      Do that kind of thing with high-end military hardware and it works really well and is robust.

  • @midnightlexicon
    @midnightlexicon ปีที่แล้ว +14

    Wanna see what the 4080 FE has in store for us. Good to see the FE construction allows for good boosting without fan speed adjustment. Might stick with FE from now on.

  • @lostgenius
    @lostgenius ปีที่แล้ว +4

    I was finally able to get a 3080 a few months ago, so I will be passing on the 40 series

  • @bfizzle81
    @bfizzle81 ปีที่แล้ว

    Glad you mentioned the cable. I'm running an EVGA 3080 Ti FTW3 and a Lian Li O11 Dynamic case, and my cables are pressed up against the glass. There's no way this would fit in my current setup with the current connector config on the 4090... but honestly I'm good with my current setup @ 1440p and it should last me a while at least :) I'll wait.

  • @SpaceshipAwesome
    @SpaceshipAwesome ปีที่แล้ว

    that ifixit ad was honestly insane hahaha

  • @TehEv0
    @TehEv0 ปีที่แล้ว +3

    I remember seeing an article that touched on why board partners have gone with such chonky coolers that are really overkill compared to what is needed. It mentioned how the original spec earlier in the production process may have been based on the old Samsung 8nm node, before TSMC came in with their custom 4N process that made the chips much more efficient, lowering the thermal output too. All the AIBs had designed their coolers around the original Samsung-node spec and just stuck with those instead of potentially delaying deployment of their cards, all the while losing money to NVIDIA while it sold FE cards left, right and centre.

  • @jessmac1893
    @jessmac1893 ปีที่แล้ว +6

    One number I’d love is the power (amps) coming out of the DisplayPort. For those with longer extension cables for VR, they often lose connection. I end up using my 1070 Ti for VR instead of my 3080 Ti because the 1070 Ti consistently connects and powers it with an extension cord. Weird stat, but no one measures/reports it.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 ปีที่แล้ว

      Part of that is because only 1-2% of people are VR players, and many have standalone VR headsets that don't interface with a PC

    • @GravitySandwich1
      @GravitySandwich1 ปีที่แล้ว

      I have the Reverb G2; they brought out a new powered cable due to connection issues (USB wasn't providing enough power). Side note: I have a 1080 Ti, looking towards the AMD equivalent of the 4080. (I'm boycotting Nvidia)

  • @mastervorn6380
    @mastervorn6380 ปีที่แล้ว

    What CPU, RAM and mobo are you using Jay? Would like to see your test bench as well. Been watching you since 2017 keep up the good work Jay, Phil and Nick!

  • @TekniQx
    @TekniQx ปีที่แล้ว +1

    @23:20 - Frazier never knocked out Ali in the first fight. He *DID* knock him down in the 15th round (only knock down of the fight) and did end up winning.
    Love ya, Jay! 🤓

  • @MrGryphonv
    @MrGryphonv ปีที่แล้ว +7

    I may have missed it in the video, but I'm really curious about the undervolting performance. I was able to cut down about 100w draw on my 3090 with an undervolt at about a 3% performance hit. If the 4090 can be undervolted with similar or better values it will also be a good selling point.

    • @Daswarich1
      @Daswarich1 ปีที่แล้ว +2

      Der8auer tested tuning the power target and the 4090 got about 90% performance at 60% power target.

    • @MrGryphonv
      @MrGryphonv ปีที่แล้ว +3

      @@Daswarich1 Those are amazing numbers. Well worth the compromise IMO
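      For anyone who wants to try that sort of power-target cap themselves, a minimal sketch is below. It is not the tooling Jay or Der8auer used; it assumes an NVIDIA driver with nvidia-smi on the PATH, the helper names and the 0.6 fraction are purely illustrative, setting the limit usually needs admin/root, and it only caps board power (like dragging the power slider down) rather than doing a true undervolt.

          # Rough sketch (assumption: nvidia-smi available on PATH), not the reviewers' tooling.
          # Caps the board power target from a script, similar to lowering the power slider.
          import subprocess

          def read_power_limits(gpu=0):
              # Current, default and max board power limits, in watts.
              out = subprocess.check_output(
                  ["nvidia-smi", "-i", str(gpu),
                   "--query-gpu=power.limit,power.default_limit,power.max_limit",
                   "--format=csv,noheader,nounits"],
                  text=True)
              return tuple(float(x) for x in out.strip().split(", "))

          def set_power_target(fraction, gpu=0):
              # fraction=0.6 mimics the "60% power target" style of test mentioned above.
              _, default, _ = read_power_limits(gpu)
              watts = int(default * fraction)
              subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)
              return watts

          if __name__ == "__main__":
              print("power limits (current/default/max W):", read_power_limits())
              # print("new limit W:", set_power_target(0.6))  # uncomment to actually apply (needs admin/root)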

  • @NostalgicVibes
    @NostalgicVibes ปีที่แล้ว +9

    The 4090 really makes me want to do a full custom loop because of where the connector is located and because of how beefy they are. Would be nice to see a full custom loop once people start doing some builds.

    • @MrInstinctGamer
      @MrInstinctGamer ปีที่แล้ว

      I can't wait to see one of these things draw like 400W of power while the engineering of the cooler keeps it at 80, so I wonder if a custom loop would really help. Def gonna need its own 360 rad probably 🤣

    • @linsetv
      @linsetv ปีที่แล้ว +3

      @@MrInstinctGamer Well, according to Der8auer, temps aren't even that bad.
      And he also tested other power limits, like 70%, where the card draws 30% less power for only 5% less performance...
      With temps in the 60s while gaming... (Founders Edition)
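      Rough math, taking those reported figures at face value: about 95% of the performance for about 70% of the power works out to roughly 0.95 / 0.70 ≈ 1.36, i.e. on the order of a third more performance per watt than stock, which is why the power slider looks like the first thing worth touching on this card.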

    • @NostalgicVibes
      @NostalgicVibes ปีที่แล้ว +1

      @@linsetv I agree. The cooling I don't think will be so much of a problem, from the testing that I've seen and the GamersNexus video breaking down the vapor chamber. The power draw is relatively close to a 3090 Ti, and the cooler is SO DAMN BEEFY that it should be able to handle it. My only real concerns are...
      1. The connector slot on the PCB becoming loose over time and eventually breaking off. (What Jay mentioned)
      2. You're forced to use a larger case unless you do a custom loop
      3. THE PRICE (I have a 3080 Ti so is it REALLY worth the upgrade?)

    • @linsetv
      @linsetv ปีที่แล้ว

      @@NostalgicVibes Case size will probably be the most problematic :D
      And of course the price, but I guess this time it will come down a bit faster

  • @Stevarneo
    @Stevarneo ปีที่แล้ว

    Nice to see the nods to EVGA in the background

  • @faucheur06400
    @faucheur06400 ปีที่แล้ว

    There are household power cables, SATA cables, USB cables, HDMI cables, jack cables, IDE cables and (if my memory is good) even Molex cables that have a 90° angle made of hard plastic. I can't believe that no one in the graphics card industry thought that would be a better solution than "yeah, bend those cables".
    The cables for my GTX 1080 have been touching the glass panel for six years already.

  • @nicoarcenas
    @nicoarcenas ปีที่แล้ว +4

    Couldn't help cheering for the 6950 XT while watching the benchmarks

  • @srodigital
    @srodigital ปีที่แล้ว +3

    Would be nice to see some sort of performance comparison based on real-world productivity workloads (video editing etc.) instead of, or as well as, gaming. Maybe stitching images in PTGUI, for example.

    • @HappyBeezerStudios
      @HappyBeezerStudios ปีที่แล้ว

      What I've seen from the test is that there is no need for me to get a 4090. I play none of the games tested, so there is no interest.

  • @MisterEightyFour
    @MisterEightyFour ปีที่แล้ว

    Came for the benchmarks, bought into the shredding over the graphics 🤘🏻

  • @Filiral
    @Filiral ปีที่แล้ว +4

    Card seems great, but I managed to get a 3080 Ti on launch day and it's kept me plenty happy. Nothing I currently play or plan to play should strain that card too much, so I'll probably upgrade everything but the graphics card next year and then wait for the 5XXX series for a GPU upgrade.

    • @duohere3981
      @duohere3981 10 หลายเดือนก่อน

      That was cheap I bet 😂