Nvidia RTX 3080 Ti + A6000 Review: Did GDDR6X ruin GA102? Who should buy Big Ampere?

  • Published on Dec 21, 2024

Comments • 859

  • @e2rqey
    @e2rqey 3 ปีที่แล้ว +125

    My 3080 paid for my 3080Ti lol. Also, I can confirm that the 3080Ti idles at ~110Watts, although I do have mine OC'd.

    • @theigpugamer
      @theigpugamer 3 ปีที่แล้ว +14

      You basically got a 3080, sold it on eBay, and got a Tie, right?

    • @Ascendor81
      @Ascendor81 3 ปีที่แล้ว +10

      This is the way. Used my 3080 for 7 months, sold it, got a 3080Ti+$200cash :)

    • @Miskatonik
      @Miskatonik 3 ปีที่แล้ว +21

      I personally don't think it's worth it. I recently bought the 3090 FE (idles at 20W, lol) and gladly paid the $300 premium for an unlocked hash rate and 24GB of VRAM. I think 12GB of VRAM is ridiculous for a $1,200 card with capped mining performance on top. Not to mention the much smaller cooler; they should have used the 3-slot one from the 3090. I really feel the 3080 Ti is an Nvidia money grab, like Gamers Nexus stated. If I wanted to save some money I would rather buy the 3080, much better value overall. Also sold my 2080 Ti and 3080 at a profit, btw; got lucky it was just before the Bitcoin crash. Prices are much lower now.

    • @LiveForDaKill
      @LiveForDaKill 3 ปีที่แล้ว +2

      What energy management settings are you using in Nvidia Control Panel?
      I have read that when you switch it from "max performance" to "normal", it will drop to 10-20 watts.

    • @lupintheiii3055
      @lupintheiii3055 3 ปีที่แล้ว +20

      @@LiveForDaKill Funny how people find it okay to use all kinds of workarounds with Ampere, while the response to any workaround with RX 5000 cards was: "I'm paying $400, it needs to just work, it's not my job to fix AMD's mess!!!1!!!1!"

  • @joshuac837
    @joshuac837 3 ปีที่แล้ว +93

    Not sure if red eyes in the thumbnail were accidental or intentional, but I’m here for it 😂

  • @mtunayucer
    @mtunayucer 3 ปีที่แล้ว +125

    The difference between GDDR6 and GDDR6X is smaller than I expected; the power usage of G6X is insane!

    • @VoldoronGaming
      @VoldoronGaming 3 ปีที่แล้ว +9

      Marketing. G6X is better than G6 right...or it sounds like it should be...

    • @tetrasnak7556
      @tetrasnak7556 3 ปีที่แล้ว +14

      infinity cache FTW !

    • @WayStedYou
      @WayStedYou 3 ปีที่แล้ว +19

      @@VoldoronGaming I think people would've preferred 16GB of GDDR6 vs the 3080's 10GB of GDDR6X

    • @lupintheiii3055
      @lupintheiii3055 3 ปีที่แล้ว +4

      @@WayStedYou Of course they would, but then how could Nvidia upsell you a 3090?

    • @MrWarface1
      @MrWarface1 3 ปีที่แล้ว +2

      Insane.. till you compare hashrates, and realize gddr6x makes a huge difference

  • @excitedbox5705
    @excitedbox5705 3 ปีที่แล้ว +136

    That is because Nvidia manipulated the launch reviews by sending people that stupid PCAP thing. It uses a painfully slow ADC, so any power measurement will miss the power spikes. Igor's Lab used a real oscilloscope and measured spikes of almost 500W on the 3080. But most reviewers won't spend the $400-600 on an entry-level oscilloscope, so by controlling the gear reviewers used to test the cards they could claim lower power usage. Believe me, Nvidia would NEVER use a $40 meter to test their GPUs when even a base-model multimeter costs double that. PCAP would be useless for any real measurements, and any maker should have known that immediately. (See the sketch after this thread.)

    • @TheBackyardChemist
      @TheBackyardChemist 3 ปีที่แล้ว

      Yeah, but any properly designed circuit with a slow ADC should have a low-pass filter on its input, so unless they omitted that, the spikes should not matter much for power consumption

    • @powerdude_dk
      @powerdude_dk 3 ปีที่แล้ว +8

      @@TheBackyardChemist But your PSU still has to handle power spikes like that, so it could potentially be a problem if your PSU's wattage is a bit too low.

    • @lupintheiii3055
      @lupintheiii3055 3 ปีที่แล้ว +5

      @@TheBackyardChemist That doesn't solve the issue; the spike will cause heat one way or the other somewhere inside your system, which will dissipate into your room (or somewhere else if by chance you have some kind of external radiator :-)

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +2

      Yep. The RTX 30 series is a big failure in terms of power consumption. That's why I have my 3090s undervolted, and I suggest everybody else do that too if you know how to do it (it's not that hard really if you are tech-savvy). Now they run at ~280-290W under full load with minimal to no performance loss compared to the stock 350-380W.

    • @lupintheiii3055
      @lupintheiii3055 3 ปีที่แล้ว +4

      @@bgtubber The fact is you can do that on RX 6000 cards too
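
A minimal sketch of the sampling argument above, using a purely synthetic power trace (the 350 W baseline, 500 W spikes, and window lengths are made-up illustration values, not measurements): averaging over long windows hides transients that a fast-sampled scope would catch.

```python
import numpy as np

# Synthetic 1-second power trace at 1 MHz sampling: ~350 W baseline with a few
# 1 ms excursions to ~500 W (all numbers here are invented for illustration).
fs = 1_000_000
power = np.full(fs, 350.0)
for start in (100_000, 400_000, 700_000):
    power[start:start + 1_000] = 500.0  # 1 ms spike

# A fast oscilloscope sees the true peak; a meter that averages over 100 ms
# windows reports something barely above the baseline.
true_peak = power.max()
slow_readings = power.reshape(-1, 100_000).mean(axis=1)  # ten 100 ms averages

print(f"true peak:           {true_peak:.0f} W")
print(f"slow meter max read: {slow_readings.max():.1f} W")
print(f"average draw:        {power.mean():.2f} W")
```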

  • @Bang4BuckPCGamer
    @Bang4BuckPCGamer 3 ปีที่แล้ว +118

    I laughed so hard when you pointed out the 3060 has the same amount of V Ram as a 3080 Ti.

    • @mtunayucer
      @mtunayucer 3 ปีที่แล้ว +3

      Maybe the 3060 should have launched with just 6GB of VRAM, or the rest of the lineup (3070-3080) should have had 16-20GB.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +36

      ​@@mtunayucer No, 3060 is fine with 12GB. Please don't give Nvidia any ideas.

    • @mtunayucer
      @mtunayucer 3 ปีที่แล้ว +8

      @@bgtubber lmao

    • @gsrcrxsi
      @gsrcrxsi 3 ปีที่แล้ว +5

      3060 = 12GB of slower ram, with less bandwidth. They are not the same.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +6

      ​@@gsrcrxsi GDDR6 is still plenty fast. It's not like it is 5 years old technology or something. The first GDDR6 cards to come out with GDDR6 were the previous gen RTX 20 series. Besides, you can always OC it the VRAM if you think that you need more raw power. I would personally take 12 GB of GDDR6 over 6-8 GB of GDDR6X if I had to choose.

  • @davidgunther8428
    @davidgunther8428 3 ปีที่แล้ว +41

    So the race between the RX 6800XT and the RTX 3080Ti is a TIE! 😆

    • @davidgunther8428
      @davidgunther8428 3 ปีที่แล้ว

      @Gareth Tucker it was a pun. But the performance was about equal between the two

  • @gpunk2338
    @gpunk2338 3 ปีที่แล้ว +60

    LOOOLLL I was just listening to your podcast with Colin Moriarty where you said I'm not gonna give this a good review loll..

  • @theigpugamer
    @theigpugamer 3 ปีที่แล้ว +81

    This proves that AMD was actually right to skip G6X and go with Infinity Cache instead

    • @MrWarface1
      @MrWarface1 3 ปีที่แล้ว +1

      Tom's opinion proves that? How? I have 2 3090s running 100% mining load 24/7 in my bedroom. Do u guys not have air conditioning? Amd peasants

    • @wile123456
      @wile123456 3 ปีที่แล้ว +2

      GDDR6X is currently an exclusive deal with Nvidia right now from, was it Micron?

    • @wile123456
      @wile123456 3 ปีที่แล้ว +34

      @@MrWarface1 lmao, using an 800-watt PC for mining, then having a 600-watt air conditioner on inside the room to counterbalance it. You're part of the problem with the current climate crisis we are in. Disgusting

    • @waleedkhan1122
      @waleedkhan1122 3 ปีที่แล้ว +14

      @@wile123456 Well said.

    • @MrWarface1
      @MrWarface1 3 ปีที่แล้ว

      @@wile123456 First off.. I have central air. Second, my power cost went up a little more than a dollar a day for the $15 a day they are making mining. Imagine being such an idiot, you try to make a point about stuff you know nothing about cause you are such a fanboi. Lmao

  • @cottonginvideo
    @cottonginvideo 3 ปีที่แล้ว +55

    Went from a 400W 3080 to a 6900 XT. I much prefer the raw rasterization perf, lower power draw, reduced CPU driver overhead, and 6GB of extra VRAM.
    RDNA2 definitely wins this generation. Nvidia mindshare is still insane though.
    I'd like Ampere cards more if they were priced competitively or at least had more VRAM for longevity.

    • @gamtax
      @gamtax 3 ปีที่แล้ว +9

      When I saw the RTX 3080 only had 10GB, I laughed; it immediately reminded me of Kepler cards. Some games already consume almost 10GB when playing at 4K. I don't think this card will last 3 years for 4K gaming.

    • @buckaroobonzai9847
      @buckaroobonzai9847 3 ปีที่แล้ว +2

      Did you notice less heat being generated in the room with the 6900xt?

    • @TheWiiplay
      @TheWiiplay 3 ปีที่แล้ว +2

      I’ve been saying this since I picked up my 6900xt at launch and I sold my 3080 a week later. Finally folks are staring to notice.

    • @cottonginvideo
      @cottonginvideo 3 ปีที่แล้ว

      @@buckaroobonzai9847 yes I have a small room that my pc is in

  • @eddyr4984
    @eddyr4984 3 ปีที่แล้ว +27

    great work tom ! looking forward to seeing more content

  • @endlesslogins156
    @endlesslogins156 3 ปีที่แล้ว +55

    I definitely enjoy the qualitative analysis in these reviews. This is something more reviews need: was it actually noticeable in use, or is it just better based on number fetish?

  • @wile123456
    @wile123456 3 ปีที่แล้ว +47

    I feel like the purpose of the 3080ti is so Nvidia can discontinue the normal 3080 and sell the same die for almost twice as much

    • @elheber
      @elheber 3 ปีที่แล้ว +1

      No doubt. Yields improved as manufacturing matured and they saw all the money they were leaving on the table.

    • @darcrequiem
      @darcrequiem 3 ปีที่แล้ว

      I think you are dead on Wei.

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว +2

      Don't believe everything you hear. Were living in a world of propaganda of huge proportion. Nvidia is evil, sure that's a fact, so is amd. And the FBI and the Obama's and bidens and Clinton's etc.

    • @wile123456
      @wile123456 3 ปีที่แล้ว +5

      @@tacticalcenter8658 but comviently not the corrupt trump family does anything wrong? Lmao

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว

      @@wile123456 show me evidence. So far its just been lies from a criminal organization. Again... Show me truth and facts that aren't made up by the communist. You can't do it. Cause you've been so brainwashed you believe things people tell you without proof or manufactured proof.

  • @gsrcrxsi
    @gsrcrxsi 3 ปีที่แล้ว +24

    The 3080 Ti only idles at 100W if you have the power management setting in NVCP set to max performance. This keeps the card at boost clocks, so of course it uses more power. Change it to default/balanced and the card drops to the P8 power state and like 10W. That's a bingo. (There's a quick way to check this with nvidia-smi after this thread.)

    • @sonicwingnut
      @sonicwingnut 3 ปีที่แล้ว +5

      Yeah I did wonder this, considering my 3080ti FE doesn't even spin the fans up when doing most basic non-gaming tasks.
      It's almost like he cherry picked a series of situations which would make the 6800XT look better.

    • @Gindi4711
      @Gindi4711 3 ปีที่แล้ว +3

      This is like all the guys setting the power limit to 300W in the BIOS and then complaining that their 11900K is actually drawing 300W :D

    • @gsrcrxsi
      @gsrcrxsi 3 ปีที่แล้ว +4

      And in true narcissistic tech-tuber fashion, he won’t reply when he’s shown to be wrong or coming to the wrong conclusion based on wrong/incomplete/biased information without verifying anything. Haters gonna hate. Shills gonna shill.

    • @pilotstiles
      @pilotstiles 3 ปีที่แล้ว

      I have a 6900 XT and paid retail for it. I can mine ETH at full hash rate and didn't get butt-hurt by Nvidia's inflated prices. I upgraded from a 1080 Ti, so I am not a fanboy. You guys enjoy your overpriced product, limited in what you're allowed to do with it. I'm not going back to a company that only cares how much money it can make and limits you in its uses.

    • @gsrcrxsi
      @gsrcrxsi 3 ปีที่แล้ว

      @@pilotstiles The 6900XT is already hash-limited by AMD, they just don't advertise it like Nvidia does. You think an unrestrained 6900XT only does 65MH/s, just barely more than a 5700XT? lol. It should be a LOT faster.
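
A quick, hedged way to check what your own card is doing: nvidia-smi can report the performance state and power draw (pstate, power.draw and clocks.gr are standard --query-gpu fields; the wattage figures mentioned in the code comments are just the numbers claimed in this thread).

```python
import subprocess

# Query performance state, power draw and graphics clock via nvidia-smi.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=pstate,power.draw,clocks.gr",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

print(out)
# At desktop idle with the default power management mode, the card should
# report P8 and a low wattage. With NVCP set to "Prefer maximum performance"
# it tends to sit in a higher P-state at boost clocks, which would match the
# ~100 W idle figures reported in this thread.
```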

  • @Zorro33313
    @Zorro33313 3 ปีที่แล้ว +93

    Don't forget about Nvidia's software scheduler, which creates a pretty noticeable additional load on the CPU, making it eat more power and produce more heat. So the 3080 TIE! is 400W itself plus some amount of parasitic power load; based on HUB's overhead investigation, around 10-30 watts.

    • @tetrasnak7556
      @tetrasnak7556 3 ปีที่แล้ว +18

      Exactly! Not only do Nvidia's GDDR6X cards consume an insane amount of power for the performance, they also offload part of their business to the CPU!

    • @powerdude_dk
      @powerdude_dk 3 ปีที่แล้ว

      @@tetrasnak7556 What??? This is insane! How can Nvidia be so careless...

    • @hasnihossainsami8375
      @hasnihossainsami8375 3 ปีที่แล้ว +2

      @@powerdude_dk they weren't careless, they had to do it to stay ahead of AMD.

    • @noergelstein
      @noergelstein 3 ปีที่แล้ว +11

      @@powerdude_dk In the pre-DX12/Vulkan days the graphics pipeline with DirectX and OpenGL was much more CPU-limited and also very difficult to parallelize across multiple threads. Nvidia restructured their driver to offload some work to an extra thread to increase performance if you had a multicore CPU, and at the time that made sense, as many games didn't use the extra core(s) much. This gave Nvidia a performance advantage at the time.
      But come DX12/Vulkan, AMD has a better starting position, because all the optimizations Nvidia did in the past don't do anything and their software + hardware architecture causes increased CPU load in their driver.

    • @chapstickbomber
      @chapstickbomber 3 ปีที่แล้ว +3

      If you think about it, the 3090 is sort of a dual rank memory GA102 while 3080ti is single rank. If they interleave between the chips, yeah you have more chips but each chip will be running fewer cycles.
      Techpowerup found the 3090 used less power in multimonitor vs 3080ti.

  • @EpicFacePalmz0r
    @EpicFacePalmz0r 3 ปีที่แล้ว +5

    Glad to see your community providing you with hardware so you can keep up with the bigger channels.

  • @DaveGamesVT
    @DaveGamesVT 3 ปีที่แล้ว +51

    It seems like AMD really nailed it by increasing effective bandwidth by using cache.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +5

      Optimizations vs brute force. The smarter way to do things.

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว +1

      But AMD charges the same for less. AMD doesn't do CUDA, doesn't have Tensor cores for other apps that utilize them, they don't have mining capabilities (yes, on newer Nvidia cards it's been removed too), etc. AMD is charging way too much for their gaming-only product. That's a fact. Either bake in more function or reduce prices. The two brands are completely different products this generation. But AMD decided to offer less and keep the same price, and fanboys don't see this because of the faults of the Nvidia cards and the evil brand practices.

    • @3polygons
      @3polygons 3 ปีที่แล้ว +1

      Not having CUDA (quite a few apps either don't support OpenCL or don't get great performance with it, probably because Nvidia has a longer history in those apps) nor the NVENC encoder (streaming and certain video editing) is a problem for a certain number of graphics apps (work). But I'm all for cards at a good price for graphics work, whatever the brand. As in... if you get me a card not as efficient in A and B apps due to lacking features, but at a price so good that it's like a lower Nvidia tier in cost, the AMD one might win by brute force (so it happened with the Radeon VII)... But what I am seeing is that both brands are still way too overpriced.

    • @dawienel1142
      @dawienel1142 3 ปีที่แล้ว +2

      ​@@3polygons That's because both brands cannot saturate the market, and using this to get more revenue, nothing wrong with that it's just business.
      Nvidia sets the price and AMD follows (for the most part)
      Knowing this, We the consumers need to stop buying new GPU's if we want prices to come down.
      GPU's have received some sort of a luxury status faaaar beyond their value and consumerism has fucked us all in the end.
      I'm to blame as well I guess since I did buy a Asus G17 laptop with a 5900HX+RTX3070 combo, but that was after my GTX1070 died on me and RTX3070's were going for half the price of my G17. (made quite a bit back by mining on the side though)
      But yeah this consumerism and hype to literally sleep at retailers has gotten out of hand, don't get me wrong it's a fun experience with friends in a way I guess, but we are sending the likes of Nvidia and AMD the wrong signals here.
      Just my 2 cents

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว +1

      @@dawienel1142 agreed. I bought a 1080ti for $300... What it was truly worth in 2020

  • @concinnus
    @concinnus 3 ปีที่แล้ว +33

    Remember when we hoped the 3080 Tie was gonna be on TSMC 7nm? Pepperidge Farm remembers.

    • @MrWarface1
      @MrWarface1 3 ปีที่แล้ว +1

      I remember when Nvidia went with samsung 8nm and beat tsmcs 7nm. I also remember the many generations Intel was destroying tsmc 7nm with 14nm.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +9

      ​@@MrWarface1 Yea, they beat them at how power hungry, hot and massive in size (due to needing giant coolers) a product can be. 😂 I would take the product that's a few % slower, but much more power efficient, cooler and compact any day. BTW, what do you mean by "many generations Intel was destroying tsmc 7nm with 14nm"? As far as I know, there are just 2 generations of AMD CPUs on 7nm - Ryzen 3000 and Ryzen 5000. Intel didn't "destroy" any of those. Like WUT?? In the case with Ryzen 3000, Intel was 10% faster on average in games and much slower in productivity software. So I wouldn't use the term "destroy" in this case. Not to mention, the HEDT Ryzen 3000 chips (Threadripper) obliterated any Intel HEDT CPU at that time (and still does). And in the case of Ryzen 5000, AMD is a tad faster in games and, again, much faster in productivity workloads. So that makes only 1 generation of 7nm AMD CPUs (Ryzen 3000) in which Intel had any dominance and it was only in gaming. So overexaggerating much?

    • @MrWarface1
      @MrWarface1 3 ปีที่แล้ว

      @@bgtubber that's your argument? They are bigger and hotter.. lmao who cares what you would rather have. You aren't an enthusiast. Also, keep lying to yourself about ryzen. Userbenchmark says it all.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +4

      ​@@MrWarface1 I'm not an enthusiast? Even though I have two RTX 3090s and a 32-core Threadripper 3970x with 64 GB of B-die RAM OC'ed to 3800 Mhz CL14 with tightened secondary and tertiary timings. Yep, I guess I'm not an enthusiast. 😂 And Userbenchmark? Really? Thanks for the laugh. Imagine using Userbenchmark to prove your point. UseLESSbenchmark are the laughingstock of the PC community. All reputable hardware reviewers have accused them of intentionally tailoring their benchmarking software in such a way that it will artificially boost Intel's numbers. Literally ANY other benchmarking software paints vastly different picture in terms of gaming and productivity performance for Intel vs AMD.

    • @MrWarface1
      @MrWarface1 3 ปีที่แล้ว

      @@bgtubber lmao you flexing to the wrong guy scrub. I will assume you lying about your imaginary setup. If not, I'll post another vid of the real goods. th-cam.com/video/YBdnB3BARTI/w-d-xo.html imagine claiming a benchmarking site is a shill because the product you fanboi for gets exposed. They are real world benchmarks scrub. Boilerplate benchmarking metrics. Unfortunately "influencers" don't control the environment so people can actually record their systems performance with proper cooling and overclocks.

  • @aadavanelangovan1630
    @aadavanelangovan1630 3 ปีที่แล้ว +8

    That F bomb was ripe lmaooooo. This was goood after a long day, Mr. Tom!

  • @tetrasnak7556
    @tetrasnak7556 3 ปีที่แล้ว +35

    Finally someone in my feed who is telling the truth about GDDR6X!

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว

      What do you mean? Pretty much everybody knows that GDDR6X is blazing hot and cr@p at power consumption.

    • @nathangamble125
      @nathangamble125 3 ปีที่แล้ว +5

      @@bgtubber Pretty much everyone who's really into PC hardware knows it, but that's definitely not the majority of people.

    • @Krogoth512
      @Krogoth512 3 ปีที่แล้ว +2

      GDDR6X is the swan song of the GDDRx standard. The 32-bit bus is simply too limiting; you have to clock to the moon to match the bandwidth that HBMx can easily obtain. HBM is the future for ultra-high bandwidth and memory capacities; however, neither is really needed for gaming GPUs, which is why HBM is only found in professional-tier SKUs that demand both. Fiji was a proof of concept, while Vega 56/64 was stuck in the unenviable position of trying to serve both the professional and gaming markets while AMD didn't have enough capital to make separate designs at the time.

    • @olifyne6761
      @olifyne6761 3 ปีที่แล้ว +1

      More like gddrFailure

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว

      @@Krogoth512 I bet engineers could make GDDR6X work efficiently, just saying. Otherwise, indeed, the way it's implemented currently is a failure. For Nvidia it's marketing: X is always better and more expensive.

  • @ElusivityEtAl
    @ElusivityEtAl 3 ปีที่แล้ว +15

    IMO the big selling point for GDDR6X was that it was supposed to give the 3090 "1TB/s bandwidth",
    i.e. 936 GB/s ÷ 19.5 Gbps × 21 Gbps = 1008 GB/s.
    Obviously, due to the horrible power draw and heat, they underclocked it to 19.5 Gbps, so it ended up at 936 GB/s, still at the cost of all that heat and power. (Quick math sketched after this thread.)

    • @gamtax
      @gamtax 3 ปีที่แล้ว +2

      Should be fitted with HBM2E if they want all the bandwidth though.

    • @nathangamble125
      @nathangamble125 3 ปีที่แล้ว

      @@gamtax Yeah, but Nvidia aren't going to use VRAM as expensive as HBM2E

    • @gamingmarcus
      @gamingmarcus 3 ปีที่แล้ว

      @@nathangamble125 Not that it would make any difference for a halo product like the 3090. You don't buy a Titan class card for price/performance but for performance.
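
The arithmetic behind those figures, sketched with the 3090's published 384-bit bus width (peak bandwidth is simply pins × per-pin data rate ÷ 8 bits per byte):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: pins * Gbps per pin / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 3090: 384-bit bus, GDDR6X shipped at 19.5 Gbps instead of a 21 Gbps target.
print(bandwidth_gbs(384, 19.5))  # 936.0 GB/s (what it shipped with)
print(bandwidth_gbs(384, 21.0))  # 1008.0 GB/s (the "1 TB/s" marketing figure)
```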

  • @MacA60230
    @MacA60230 3 ปีที่แล้ว +17

    Well Hardware Unboxed that you referenced says the 6800XT *does not* perform like a 3080Ti in 4K. On average ~12% worse. That's around 10FPS in a lot of games at that resolution and it's definitely a noticeable difference. You can take any sample of 4/5 games to paint the narrative that you want. It's when you look at larger sample sizes like HU does that the truth comes out.
    Add to that:
    - better ray tracing performance (I know RDNA2 is not as bad as it looked at release in that department but it's still worse, sometimes *a lot* worse)
    - DLSS (even if FSR is great, right now DLSS is simply the better option because of adoption, and while Nvidia cards will get access to both, AMD cards will be FSR only)
    - better for productivity
    And suddenly the case for the 3080 Ti doesn't look so bad. You said undervolting meant lowering its performance, but that's not even true; you can absolutely undervolt while keeping the same (or sometimes a bit better!) performance than stock.
    I don't think it's a good buy don't get me wrong, the 3080 is such a better price/perf, but imo you're not being fair. You're ignoring or downplaying what makes the Nvidia cards strong and laser focusing on their weak points (like power consumption).

    • @darkwolf1739
      @darkwolf1739 3 ปีที่แล้ว

      There's also an argument to be made that HW Unboxed swapped some games that ran better on Nvidia hardware for games like Godfall and Dirt, which run like 30% slower on Nvidia, like wtf?

    • @darkwolf1739
      @darkwolf1739 3 ปีที่แล้ว

      I've noticed this channel really hates chewing on nvidia.
      Even hw is a bit nvidia hating since when you return the benchmarks nvidia cards even ran better at 1440p and 1080p!
      When looking at multiple sites that are genuinely considered good (so sites like that one which loves intels stick aren't included) and piling them into an average (like a meta-analysis would do) came to the conclusion that ampere runs better even at 1440p and 1080p (6800xt versus 3080 and 6700xt versus 3070). There is ofc the argument of overhead but I've seen plenty of people with say 5600x or 10th gen and above intel cpus which isnt all that uncommon to pair with ampere cards and should be what you'd pair with them.
      Nvidia have done some terrible things and will bend over games where they can (tbh any company in this position would imo, even the "best" giant companies turn around if they can) but they do have the superior cards if you have a decent cpu and power supply.

  • @LaDiables
    @LaDiables 3 ปีที่แล้ว +7

    8-pin CPU port is likely for rack server chassis (as is the blower style cooler).
    Much easier to manage one cable instead of 2 in a server.

  • @InternetSupervillain
    @InternetSupervillain 3 ปีที่แล้ว +1

    I like your review data delivery. I like how you conclude with the general user experience between the card and its contemporaries. Values on a bar chart only go so far after a while.

    • @MooresLawIsDead
      @MooresLawIsDead  3 ปีที่แล้ว +1

      Agreed. I think real world differences need to be explored more unless you are a mega channel that can afford to run through tons of data.

  • @georgiospappas
    @georgiospappas 3 ปีที่แล้ว +5

    Your camera is incredible. The video quality is so great

  • @patsk8872
    @patsk8872 3 ปีที่แล้ว +17

    "This ten thousand, seven hundred and fifty two..." DOLLAR? "...CUDA core..." Oh. Well, wait until it hits ebay.

    • @ian_dot_com
      @ian_dot_com 3 ปีที่แล้ว

      this is truly the worst timeline

  • @theemulationportal
    @theemulationportal 3 ปีที่แล้ว +6

    Really enjoyed this review, was a unique angle and it's near impossible to disagree with.

    • @ian_dot_com
      @ian_dot_com 3 ปีที่แล้ว +2

      my exact thoughts as well!

  • @twinsim
    @twinsim 3 ปีที่แล้ว +4

    As an A6000 owner, we also split the GPU into VMs for work. It actually was easier to get, and because we can split it between VMs it was a no-brainer: 16GB for each of 3 users.

    • @MooresLawIsDead
      @MooresLawIsDead  3 ปีที่แล้ว +1

      I liked the A6000 a lot. But I think an A4000 is the perfect one for me...

    • @RamkrishanYT
      @RamkrishanYT 3 ปีที่แล้ว

      What software is required for it and which hypervisor do you use?
      Does this require a license for Nvidia GRID or something?

    • @twinsim
      @twinsim 3 ปีที่แล้ว +1

      @@RamkrishanYT VMware and a license from Nvidia. All of which I wouldn't normally do, but prices were so high and the VRAM on the cards is so low. We use Omniverse for work and 12GB is standard for us.

    • @RamkrishanYT
      @RamkrishanYT 3 ปีที่แล้ว

      @@twinsim do you have any link to a youtube tutorial for it?

    • @twinsim
      @twinsim 3 ปีที่แล้ว

      @@RamkrishanYT Umm, no, there are what, 100 A6000s in the world? I would get like five views.

  • @StyleAndSpeedx
    @StyleAndSpeedx 3 ปีที่แล้ว +9

    I play on an LG OLED 48 inch. That's where my Sapphire Nitro really shines in 4k.

    • @hammerheadcorvette4
      @hammerheadcorvette4 3 ปีที่แล้ว +6

      Sapphire is a damn good company.

    • @crimsonhoplite3445
      @crimsonhoplite3445 3 ปีที่แล้ว +5

      Same here, brother. Nitro+ 6900XT OC. I'm usually pegging that 120Hz at 4K in just about any game.

  • @marka5968
    @marka5968 3 ปีที่แล้ว +17

    The 3090's GDDR6X also has lots of cooling problems. Any workload that heavily tasks the RAM will push it to the throttling temp of 110°C. So the 3090 with all that RAM is pointless when it is constantly throttling whenever you try to use it. Almost every 3090 owner I've seen has had to resort to hacky cooling and underclocking. Otherwise, the performance is on par with a 3080, and the memory chips will probably fail in a year or so running at 110°C. AMD has none of those problems!

    • @TheAmmoniacal
      @TheAmmoniacal 3 ปีที่แล้ว +4

      This is causing problems also in some professional workloads, where I often get computation errors and crashes that I believe is due to memory errors (it happens a lot less with a powerful fan pointed at the back of the card). I assumed the AIB model I had was just horribly built so I got the SUPRIM one, but no - same temps, same issue. And the A-series cards is not an option at all, the A6000 starts at $7000 USD here.

    • @Hennerbo
      @Hennerbo 3 ปีที่แล้ว +2

      @@TheAmmoniacal MSI cards, including Suprim, all have terrible thermal pads. I replaced mine on a 3080 ventus. VRAM went down by 20°C

    • @Miskatonik
      @Miskatonik 3 ปีที่แล้ว +5

      Memory temp problems are a disgrace; it's pretty annoying to pay top-dollar prices and have to fix such extreme temperatures. My 3090 FE reached 110°C memory junction while the core stayed in the 60s under demanding games (4K 120Hz). It's a stupid and severe design flaw (it happens with almost all versions out there). I was going to watercool it anyway, so I bought a block and problem solved; memory stays in the 70s now. But many people will not be able to do this, or even be aware of the memory temp problem. It's absolutely unacceptable that GPUs are being sold with a ticking time bomb in their memory. They will degrade and there will be problems. There should be a collective claim against Nvidia for selling cards like this.

    • @simondale3980
      @simondale3980 3 ปีที่แล้ว +2

      Water cooling is the answer, it allows me to max out the memory overclock (+1500). Memory intensive applications have it running in the mid 90's still but only because the backside is still undercooled. Corsair make a block that cools the rear as well if that's still to hot for you. It's just a shame you need to go to this expense.

    • @Tom--Ace
      @Tom--Ace 3 ปีที่แล้ว

      Nonsense, my 3090's VRAM never goes above 80°C under load. Even mining, it stays at 102°C or below. It's all about how it's cooled

  • @KillerXtreme
    @KillerXtreme 3 ปีที่แล้ว +1

    Got my hands on the EVGA 3080ti FTW3 Ultra, and yea this thing is smoking HOT, if I let it run at 100% power it easily hits 80+c even on the lightest of tasks. Without overclocking.

  • @tinopimentel5734
    @tinopimentel5734 3 ปีที่แล้ว

    6:00 was the most legendary part of the review xD always great work keep it up!

  • @Dreadknight_Gaming
    @Dreadknight_Gaming 3 ปีที่แล้ว +10

    Norwegian ❄️ Winters
    Dad - Chop some wood, son. For the Hearth.
    Son - Nah Da, I put a 3080 Tie in there. We're set.
    Dad - It's good to see my son spend my money for the good of the family.

    • @haukionkannel
      @haukionkannel 3 ปีที่แล้ว +2

      What they use in summer? 1030?
      ;)

  • @fVNzO
    @fVNzO 3 ปีที่แล้ว +11

    There it is, the actual Ampere Titan + pro drivers. Ohh, and a 2-4x price increase of course.

  • @3polygons
    @3polygons 3 ปีที่แล้ว +1

    That A6000 is a wet dream for someone working with Blender... (I do so, and render by GPU (which is a ton faster than in CPU) with that crazy amount of VRAM....yay). Not that I'd hit the limit of memory with such (would be an interesting experiment... how to fill up 48GB for a render :D :D). And I guess, for how Davinci Resolve Studio (not the free, the 300 bucks one) uses heavily GPU... that'd be AMAZING for even 8k projects.
    24GB of the 3090 was already crazy...

  • @RyugaHidekiOrRyuzaki
    @RyugaHidekiOrRyuzaki 3 ปีที่แล้ว +2

    That reflective shroud is gorgeous.

  • @brandonbajc2084
    @brandonbajc2084 3 ปีที่แล้ว

    I've been waiting all week for this!!!!

  • @owenyang9216
    @owenyang9216 3 ปีที่แล้ว +12

    There might be an issue with that 3080 Ti... My 3090 consumes ~35W at idle according to HWiNFO64. My entire system idles at around 90W, so a 3080 Ti single-handedly using 110W is very odd.

    • @emanuelmartins5268
      @emanuelmartins5268 3 ปีที่แล้ว +2

      I don't believe it either; he didn't show any proof of it

  • @javiej
    @javiej 3 ปีที่แล้ว +10

    And who is the A6000 for? ... Me.
    And no, I'm not a gamer, and yes, we also exist.
    The reasons are probably irrelevant to you, but for me they are everything: being able to get support from Nvidia engineers directly, which I need often; the stability and peace of mind when working with clients looking over my shoulder; being able to sleep well while my A6000 is rendering all through the night without a glitch; and the fact that even though she lives in my super overpacked workstation, she is still friendly with my other high-TDP cards (video, 2nd GPU, fibre channel, quad NVMes, etc.).

    • @miyagiryota9238
      @miyagiryota9238 3 ปีที่แล้ว +1

      Not for everyone

    • @SuperLotus
      @SuperLotus 3 ปีที่แล้ว

      It's for Big Chungus to brag on Reddit

    • @Outrider42
      @Outrider42 3 ปีที่แล้ว

      I know, right? I can't believe he just assumed all Quadro owners play games on them. There are probably more Quadro owners who never game than ones who do. And the ones that do... likely have a dedicated gaming PC for that.

  • @bretth6393
    @bretth6393 3 ปีที่แล้ว +9

    The thumbnail makes me feel like I've just walked down a dark alley and met a black market GPU dealer.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +2

      What do you mean? Isn't this how people get their GPUs in the past few months?

  • @garytrawinski1843
    @garytrawinski1843 3 ปีที่แล้ว

    Tom, I liked your take on reviews. Great job!

  • @blueeyednick
    @blueeyednick 3 ปีที่แล้ว +4

    Damn son, you did it this time.

  • @luclucchesi7326
    @luclucchesi7326 3 ปีที่แล้ว +1

    In South Africa, I got a 3080 Ti for less than what I could get a 3080 for. Well, I couldn't get a 3080, so I kind of took what I could get; I went with whatever GPU would perform best in the games I play, namely racing sims with triple 1440p monitors, and in that use case the Nvidia cards destroy the AMD cards. In mining I have managed to reduce power draw without lowering the hash rate: I'm getting 63 MH/s with a 55% power limit, so it ends up pulling 215W to achieve that hash rate, and the card runs super cool, under 50°C. Fully overclocked, the highest it gets is 68°C. The card is a Palit GameRock 3080 Ti OC; very happy with its performance, first time using the brand.
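
Quick efficiency math on those numbers, using only the figures quoted above plus an assumed ~350W stock board power for comparison:

```python
hashrate_mhs = 63       # MH/s reported above at the reduced power limit
power_limited_w = 215   # watts at the 55% power limit
stock_limit_w = 350     # assumed stock board power for a 3080 Ti class card

print(f"{hashrate_mhs / power_limited_w:.3f} MH/s per watt at 215 W")  # ~0.293
# If the hashrate is memory-bound and stays ~63 MH/s at stock power,
# running at the stock limit would only worsen efficiency:
print(f"{hashrate_mhs / stock_limit_w:.3f} MH/s per watt at 350 W")    # ~0.180
```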

  • @TadgNokomis
    @TadgNokomis 3 ปีที่แล้ว +12

    "Wrong Polygons" spoken like a true tech enthusiast who is very knowledgable about "architectures" and not someone from the car industry feigning a higher level of tech literacy to grow a following.

  • @WeencieRants
    @WeencieRants 3 ปีที่แล้ว

    Another great review!
    You and Dan are doing great work! Keep it up

  • @digitaltactics9234
    @digitaltactics9234 3 ปีที่แล้ว +2

    I love the rasterization perf AMD has & the lower wattage usage. But for the many people who will be streaming in OBS with NVENC encoding, the Nvidia cards will also be the win.

  • @zuhairshahid2225
    @zuhairshahid2225 3 ปีที่แล้ว +1

    Thank you, brother,
    for the honest review.
    I have an Nvidia GPU, so much power draw...
    Now that AMD has ray tracing, I would love to buy AMD and sell my Nvidia

  • @silvertree88
    @silvertree88 3 ปีที่แล้ว +26

    When a 580 uses less power while mining than an idle 3080 Ti, and only has a 25% lower hash rate.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว +7

      That's not true. I have two 3090s which are even more power hungry than a 3080 Ti and they both idle at 20-30W. I think that Tom used an OC profile with fixed frequency and voltage. My 3090s idle at ~100W only if I overclock them with fixed frequency which doesn't allow them to downclock and go into lower power states.

    • @suppar8066
      @suppar8066 3 ปีที่แล้ว +1

      @@bgtubber That actually sounds more realistic. Still, it is weird that a 6800 XT is basically the same as a 3080 Ti :/ Guess it's the RTX 3060 or AMD if you want more performance now. But I would be totally fine with a 3060 if I could get one for MSRP :D

    • @mattiasnilsson3980
      @mattiasnilsson3980 3 ปีที่แล้ว +1

      Something is wrong with your 3080 Ti; it should not draw more than 20 watts.

  • @ItsJustVV
    @ItsJustVV 3 ปีที่แล้ว +4

    GJ Tom, this is how a small(-ish) independent review should look. You point out the good and the bad, and you show real-world scenarios and a lot of useful info on the side too, not just some graphs, while also testing against the competition. A lot of YT channels, big and small, should do this.
    Even though HU and GN are the benchmarks for reviews, you school a lot of others in how it should be done (looking at Linus, Jayz2c, Joker, even RedGamingTech and others).
    I'm also glad you don't shill for any company.

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว

      And yet Linus schools even them in other aspects. His insight into the market and company is something a lot of other channels don't have. (I'm far from a Linus fanboy, I watch more technical channels).

  • @bigcazza5260
    @bigcazza5260 3 ปีที่แล้ว +25

    Underclock the G6X to 16 Gbps and re-run. Would be a cool idea

    • @yusuffirdaus3900
      @yusuffirdaus3900 3 ปีที่แล้ว +1

      The way the memory signalling works is already different from standard GDDR6. Anyway, nice to see

    • @lupintheiii3055
      @lupintheiii3055 3 ปีที่แล้ว +2

      To demonstrate how GDDR6X is useless even at same transfer speed?

    • @bigcazza5260
      @bigcazza5260 3 ปีที่แล้ว

      @@lupintheiii3055 yeah i wanted to see a timing difference basically

    • @lupintheiii3055
      @lupintheiii3055 3 ปีที่แล้ว +1

      @@bigcazza5260 Considering how bitchy GDDR6X is it will probably crash or trigger ecc and make the experiment null

    • @bigcazza5260
      @bigcazza5260 3 ปีที่แล้ว

      @@lupintheiii3055 stupid invention when samsung ie the dudes who make GA102 have 18gbps gddr6. micron must give good head either that or huang picks favorites

  • @mrinalbharadwaj6013
    @mrinalbharadwaj6013 3 ปีที่แล้ว

    I just loved your review, it is so simple and easier to understand than others, please keep doing it!!!

  • @bills6093
    @bills6093 3 ปีที่แล้ว +1

    Power draw figures from TechPowerUp are interesting. Idle: 3080 Ti 16W, 6800 XT 29W. Idle with V-sync 60Hz: 3080 Ti 110W, 6800 XT 146W.

  • @LurkingLarper
    @LurkingLarper 3 ปีที่แล้ว +1

    Seeing as I had my friends PC crash just yesterday due to overheating during this horrible summer, I find it funny how some people still excuse high power usage components. As far as I'm concerned, power usage is the second most important metric after the performance by a country mile. It affects heat output and noise levels directly and those are quite crucial unless you have your PC hidden in the other room, lol.

    • @juliusfucik4011
      @juliusfucik4011 3 ปีที่แล้ว

      I have a 3090 and 5950x in a case. Running full load 24/7 and it is not loud or hot. The secret is to leave one panel open on your case 👍

    • @LurkingLarper
      @LurkingLarper 3 ปีที่แล้ว

      @@juliusfucik4011 Yes indeed, and if you have the panels open, what's the point of the case? Also, you will heat up your room that way, which is fun during winter and hell during summer. I had to remove the panels from my Fractal Define R5 case since it was suffocating an older mid-range Vega 56 and i7-4790K, lol. You couldn't get good airflow cases five years back, but luckily today you can. I would strongly recommend getting something with proper airflow.

    • @TerraWare
      @TerraWare 3 ปีที่แล้ว

      @@juliusfucik4011 It doesn't matter if you leave one panel off or water cool the entire build with top of the line radiators and fans. The components will still output the same amount of heat and the same amount of heat will be exhausted into your surroundings. Some people care about the heat/power consumption and some don't that's fair. My game room is rather small so I can feel the temperature change when I play something demanding on my 5900x/6800xt which are both water cooled in an open air case.
      If I was running them air cooled in an enclosed case I'd still be dumping the same amount of heat in the room though because the components will generate the same level of heat regardless of how efficient my cooling solutions are.

    • @MLWJ1993
      @MLWJ1993 3 ปีที่แล้ว

      In that case, don't buy any current gen desktop part since power consumption shot up to >200w on both vendors. Maybe Intel will provide you with low power consumption GPUs?

    • @LurkingLarper
      @LurkingLarper 3 ปีที่แล้ว

      @@MLWJ1993 both Ampere and RDNA 2 are quite efficient architectures and even if NVIDIA is busy raping their customers wallets, their achievements on Samsung's inferior node compared to what AMD is using with TSMC is nothing short of impressive. The current gen just cannot be overclocked a lot, if at all, and if you do, you lose any semblance of the inherent efficiency the architecture had to start with. I would gladly get 3060Ti, 3070 or 6700XT on MSRP, but that's just not going to happen anytime soon and in few months time I don't want to buy old hardware and will be looking forward to next gen. The current market conditions made this gen dead to me.

  • @kyraiki80
    @kyraiki80 3 ปีที่แล้ว

    Very interesting. Missed the info about idle Temps before. That's insane

  • @bryantallen703
    @bryantallen703 3 ปีที่แล้ว

    Awesome vid. I love those cuts... The "TY" edit was epic...Shit was cool😎.

  • @WesternHypernormalization
    @WesternHypernormalization 3 ปีที่แล้ว +1

    Bannerlord benchmarks! Good man! =]
    10:19 Sweet Warband mod pogchamp

  • @rcavicchijr
    @rcavicchijr 3 ปีที่แล้ว

    As the owner of a 6800xt I can say I am extremely satisfied with this card. I have the aorus card running at a 2650Mhz O.C. with max memory speed and timings, and it barely hits 60° at load.

  • @flogjam
    @flogjam 3 ปีที่แล้ว +1

    RTX 3090 Suprim X - idles at less than 27 watts while watching this video. The cooler is very good too. I have undervolted a little and it boosts higher - 2000MHz at times.

  • @omarlinp
    @omarlinp 3 ปีที่แล้ว

    I was really surprised when I saw the thumbnail; looking forward to this video.

  • @HuntaKiller91
    @HuntaKiller91 3 ปีที่แล้ว +4

    I'm buying the 6600 XT sometime later, but I really wish the 7700 XT performs as well as the current 6800 XT/6900 XT.
    Upgrading again next year then

    • @benjaminoechsli1941
      @benjaminoechsli1941 3 ปีที่แล้ว

      Tom is expecting anywhere from a 20-60% improvement over RDNA 2, so I think it's absolutely possible.

  • @andersjjensen
    @andersjjensen 3 ปีที่แล้ว +6

    I think Ruthless Looking Corporate Henchman bellowing "THE THIRTY EIGHTY TIE!" will haunt The Nvious One for a long time! :P

  • @davidgunther8428
    @davidgunther8428 3 ปีที่แล้ว +4

    Yeah, but could nVidia control the distribution on regular GDDR6? 😛

  • @bigdeagle1331
    @bigdeagle1331 3 ปีที่แล้ว

    Great job! You always benchmark right!

  • @recordatron
    @recordatron 3 ปีที่แล้ว

    I managed to get a 3070 back in November when prices were significantly better. I was still on the lookout for a higher end card but as you've pointed out in this video to be quite honest I haven't felt like I've been getting a 'compromised' gaming experience so what's the point? I like the fact it's using less energy than the top cards and with all the upscaling technologies emerging it feels a bit pointless having a screaming top end card. I do wish it had more Vram but I tend to upgrade my hardware semi regularly so it's less of a concern in that regard. Great video as always!

  • @Ang3lUki
    @Ang3lUki 3 ปีที่แล้ว +4

    They should come out with a GA102 with 16 gigs of GDDR6, maybe slightly cut down cores from the 3080, and far lower power limit. It might be efficient, question mark?

    • @pretentiousarrogance3614
      @pretentiousarrogance3614 3 ปีที่แล้ว

      You need a 256-bit bus or a 512-bit one for 16GB, something which GA102 doesn't have

    • @andersjjensen
      @andersjjensen 3 ปีที่แล้ว +1

      Let me explain his explanation: one memory module takes 32 bits of bus width. GA102 has a 384-bit bus (if not cut down), and 384/32 = 12, so 12 memory modules. GDDR6 and GDDR6X are available in 1GB, 2GB and 4GB modules, but all modules have to be the same size. So on a 384-bit bus you can have 12GB, 24GB or 48GB total. On a 320-bit bus you can have 10GB, 20GB or 40GB total. You can do the math for the rest (quick sketch below) :P
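
A small sketch of that math (the module sizes are the ones listed in the comment above; whether every density actually ships for GDDR6X is a separate question):

```python
def vram_options(bus_width_bits: int, module_sizes_gb=(1, 2, 4)) -> dict:
    """Each 32-bit channel takes one module, so capacity = channels * module size."""
    channels = bus_width_bits // 32
    return {f"{size} GB modules": channels * size for size in module_sizes_gb}

print(vram_options(384))  # {'1 GB modules': 12, '2 GB modules': 24, '4 GB modules': 48}
print(vram_options(320))  # {'1 GB modules': 10, '2 GB modules': 20, '4 GB modules': 40}
print(vram_options(256))  # {'1 GB modules': 8, '2 GB modules': 16, '4 GB modules': 32}
```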

  • @pauldubreuil7857
    @pauldubreuil7857 3 ปีที่แล้ว +2

    Respect everything this man says. However; I do have to say, I have a Zotac Trinity 3080ti, at load 330 W idle 21.0 W ... Performance is Superb! (despite zotac being perceived as *trash*)

  • @Kakaroti
    @Kakaroti 3 ปีที่แล้ว +1

    Even the baseline 6800 with 16GB is a steal for MSRP and will age nicely.

  • @commandertoothpick8284
    @commandertoothpick8284 3 ปีที่แล้ว +4

    The real question is when will sub 200-300 $ cards come out?

    • @haukionkannel
      @haukionkannel 3 ปีที่แล้ว +1

      It already did… the GT 1030!
      :)

  • @zaidlacksalastname4905
    @zaidlacksalastname4905 3 ปีที่แล้ว

    New intro babyyyy

  • @thomasb1521
    @thomasb1521 3 ปีที่แล้ว +2

    That A6000 is the most beautiful card I have ever seen.

  • @Ang3lUki
    @Ang3lUki 3 ปีที่แล้ว +8

    After this generation, I wonder if the big halo products will use the current flavor of HBM

    • @tacticalcenter8658
      @tacticalcenter8658 3 ปีที่แล้ว +1

      Pro cards only and only if large customers demand it.

  • @exe16
    @exe16 3 ปีที่แล้ว +2

    1:50 the most premium cooler I've ever felt in my life is the reference Radeon 6800XT. Beats the crap outta the Red Devil 6900XT cooler that I have in terms of quality (though not in performance, obviously). Seriously: it's all metal, heavy AF, hides the PCB completely, looks absolutely gorgeous and DOESN'T SAG. It's almost witchcraft. I actually felt bad about sending that card to the mines after getting the 6900XT, it's just too beautiful.

  • @Pawel_Mrozek
    @Pawel_Mrozek 3 ปีที่แล้ว +4

    There are a lot of countries in the world where creators are not as wealthy as in the West, and they need as many CUDA cores as they can get for reasonable money. It is highly improbable that I could earn enough extra money with an A6000 to make it an obvious choice compared to a 3080 Ti. Despite that, the power consumption is bothering me, because pushing such a card to its limits for long periods of time was not what I expected for my current setup with a 750W power supply.

    • @xlinnaeus
      @xlinnaeus 3 ปีที่แล้ว

      Perhaps the 6800XT is the way to go then?

    • @Pawel_Mrozek
      @Pawel_Mrozek 3 ปีที่แล้ว

      @CaptainMcShotgun For me the price of 3090 is unacceptable

    • @Pawel_Mrozek
      @Pawel_Mrozek 3 ปีที่แล้ว

      @@xlinnaeus Fine, but no CUDA cores. I was a fan of Radeon once, but let's face the truth: these cards are for gaming or mining.

  • @gsrcrxsi
    @gsrcrxsi 3 ปีที่แล้ว

    The 8-pin CPU is a “standard” connector for GPUs in the enterprise space, which is exactly what the A6000 is aimed at and why it has that connector instead of the PCIe connectors. No adapter needed when used in a server as intended.

  • @EldaLuna
    @EldaLuna 3 ปีที่แล้ว +1

    Reminds me of a vid, kinda, where I made a comment about just the VRAM size in general, can't remember where. But I once said how they have no problem doubling sizes on the datacenter side but struggle to even give us anything worth the price, and honestly, if they keep this up I'm retiring from gaming altogether. One frosty day I'll get locked into the console world once more.

  • @novajett
    @novajett 3 ปีที่แล้ว

    I have an evga 3070 ti ftw3 ultra and I'm in love with it. Super quiet and very cold. Dope shit

  • @DouglettDD
    @DouglettDD 3 ปีที่แล้ว +7

    Ever since that one Broken Silicon episode I keep noticing him saying GDR6(X) instead of GDDR6(X)

  • @Dennzer1
    @Dennzer1 3 ปีที่แล้ว +1

    Fantastic video in every way! Informative, incisive, and funny.

  • @overcoercion590
    @overcoercion590 3 ปีที่แล้ว

    Got a 3090 hybrid. Nice for keeping some heat out of my case but, 4k gaming is now a winter sport

  • @Wheel333
    @Wheel333 3 ปีที่แล้ว

    I like the new offset camera angle 🙏❤️🙏

  • @iComment87
    @iComment87 3 ปีที่แล้ว

    Great vid, bro.

  • @tjr3308
    @tjr3308 3 ปีที่แล้ว

    My 3080 Ti used to idle at full boost until I uninstalled MSI Afterburner. I overclocked it using the Nvidia overlay's auto scan; now it idles around 250MHz @ 25 watts and boosts to 2040MHz @ 425 watts in 3D mode.

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว

      I have MSI Afterburner and my 3090s still idle at 20W (meaning they downclock properly). You must have set something in Afterburner wrong. Maybe you've had a fixed frequency overclock applied?

    • @tjr3308
      @tjr3308 3 ปีที่แล้ว

      @@bgtubber I tried it with a fresh install of Afterburner (settings not saved from the previous install) and, before I'd touched any settings, this weird boost behaviour at idle started again. Hold on.. you've got two 3090s? What a legend!

    • @bgtubber
      @bgtubber 3 ปีที่แล้ว

      ​@@tjr3308 Latest version of Afterburner? Did you try clicking the Reset to defaults button? Pretty weird indeed. 🤔
      Yea, I got my 3090s at near MSRP just a few weeks before the scalpocalypse kicked in. I really dodged a bullet there. 😅

  • @TheDaNuker
    @TheDaNuker 3 ปีที่แล้ว +6

    I'm kinda annoyed about how bizarre the memory configurations are for Nvidia GPUs this generation. I'm looking to move up from 8GB of VRAM for Blender workloads (it's optimized for OptiX/CUDA, so an AMD GPU loses too much performance) and it's like just bad choices all around. A 12/16GB GDDR6 3070 would be really ideal over a 2080 Ti, but instead we got an 8GB 3070 Ti. Sigh.

    • @Mopantsu
      @Mopantsu 3 ปีที่แล้ว +3

      The 3070 should have been at least 10gb out of the gate. Nvidia penny pinching again.

    • @TheDaNuker
      @TheDaNuker 3 ปีที่แล้ว

      @@Mopantsu That would make it the ideal budget 2080Ti alternative and would be a great deal but Nvidia just had to gank everyone's wallets...

    • @irridiastarfire
      @irridiastarfire 3 ปีที่แล้ว +1

      Yeah, I'm in the same boat. No good options for (high-VRAM) Blender this generation except the 3090. I decided to get the 3090 (MSI Suprim) -- at a 60% power cap it uses 15% less power (accounting for the longer total render time + including CPU usage) for a 4% longer render time. Haven't played with voltages yet. Idles at 40W, which is fairly high IMO. (Net energy math after this thread.)

    • @VoldoronGaming
      @VoldoronGaming 3 ปีที่แล้ว

      @@Mopantsu Not really. Just clever marketing in the market segmentation.
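
A sketch of the net energy arithmetic behind that trade-off, using the 15% power and 4% time figures quoted above; the nvidia-smi power-limit command at the end is the usual way to apply such a cap (it needs admin rights, and the valid wattage range depends on the card):

```python
# Energy = average power x render time, so combine the two quoted deltas.
power_ratio = 0.85   # "15% less power" at the 60% power cap
time_ratio = 1.04    # "4% longer render time"

energy_ratio = power_ratio * time_ratio
print(f"net energy vs stock: {energy_ratio:.3f} (~{(1 - energy_ratio) * 100:.1f}% saved)")
# ~0.884, i.e. roughly 11-12% less energy per render despite the longer runtime.

# Setting the cap itself (illustrative wattage, not a recommendation):
#   nvidia-smi -pl 230
```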

  • @KingGiac
    @KingGiac 3 ปีที่แล้ว +8

    Awesome video. Just FYI your 3070 was artefacting at 13:17….think you may have pushed the oc too high. EDIT: no it wasnt!

    • @WhimsicalPacifist
      @WhimsicalPacifist 3 ปีที่แล้ว +1

      That made me panic. It had me thinking "It's not my 970's time! It's too early to die!"

    • @jmporkbob
      @jmporkbob 3 ปีที่แล้ว +6

      As someone who has actually played Metro, that is the effect of being in a radioactive area. Unless I'm just missing something.

    • @KingGiac
      @KingGiac 3 ปีที่แล้ว

      @@jmporkbob ah then disregard....It really looked like it!

  • @H1mS0L0st
    @H1mS0L0st 3 ปีที่แล้ว +1

    I actually want one of those 12gb 3060s. I'm hoping that someday I can find one close to MSRP.

    • @MLWJ1993
      @MLWJ1993 3 ปีที่แล้ว

      Just get a 3060 Ti; the 3060 can't even use that VRAM without crippling under the load required to even use 10GB (4K maximum settings with all textures loaded into VRAM in Doom Eternal, which nets you a visual difference of 0%)...

  • @mattecrystal6403
    @mattecrystal6403 3 ปีที่แล้ว

    5:40 this is so good. IDK why reviewers don't do an all-in-one average of FPS for all games tested. It helps a lot with the bigger picture.

    • @MLWJ1993
      @MLWJ1993 3 ปีที่แล้ว +1

      Ever watched hardware unboxed? Cause they do, without RT games the 6900XT is on top (the unlocked water-cooled behemoth). With RT (& DLSS) Nvidia is at the top of the charts until you get to like the RTX 3080. Of course that was before FSR was out, but only Godfall was in the games they tested so I doubt it would make a difference yet.

  • @mako0815
    @mako0815 3 ปีที่แล้ว +3

    part of the higher efficiency of A6000 compared to the TIE might also be binning 🤔

    • @MooresLawIsDead
      @MooresLawIsDead  3 ปีที่แล้ว +8

      Not this much of a difference - and remember the A6000 is the full die.

    • @timbermg
      @timbermg 3 ปีที่แล้ว

      @@MooresLawIsDead Undervolting a 3090 can save as much as 30% on compute applications. Pity that's windows only and with the chapstick thermal pads many AIOs seem to use, the operating spec is exceeded at 60-70% of TDP. Some shocking engineering shortcuts.

  • @LJ-uy9ru
    @LJ-uy9ru 3 ปีที่แล้ว +2

    I got the 3080 fe. I'm happy with it. No need for a 3080ti.

  • @altitudegamer2249
    @altitudegamer2249 3 ปีที่แล้ว +1

    Isn't 3080 ti my best choice for my LG C9 tv?

  • @Syntheticks
    @Syntheticks 3 ปีที่แล้ว

    Good vid, keep up the good work!

  • @s77lom49
    @s77lom49 3 ปีที่แล้ว +11

    The 3080 Ti is meant to be used in Siberia.

  • @brucethen
    @brucethen 3 ปีที่แล้ว +10

    The 3080Ti is for Eskimos, Alaskans and Siberians

  • @pauljones9150
    @pauljones9150 3 ปีที่แล้ว

    Good stuff. Take my like and comment

  • @RealDaveTheFreak
    @RealDaveTheFreak 3 ปีที่แล้ว

    Even though the 3080 mobile 16GB is but a 3070 Ti with GDDR6, it has enough VRAM and, compared to the current scalper prices, is actually quite affordable, and you get a whole PC as well. May be an okay option nowadays (until desktop prices come back down to normal; then it'll be a slightly cut-down 3070 Ti for $1k).

  • @sfesfawfgasfhga
    @sfesfawfgasfhga 3 ปีที่แล้ว

    Back when I had a GV100, I could play Flight Simulator on a Win10 box over RDP, from a Windows 8.1 client, and the performance was absolutely flawless. Blew me away.
    And on idling, an idling GV100 didn't really add anything extra to power consumption from the wall socket. It wasn't really any different from any other newish consumer card -

  • @Flybyhacker
    @Flybyhacker 3 ปีที่แล้ว

    This review is definitely the best hot take on the 3080 Ti!

  • @bohomazdesign725
    @bohomazdesign725 3 ปีที่แล้ว

    Well, I got a 2x RTX A4000 setup, and in my use case - Blender + DaVinci Resolve + UE4/5 - it's plenty of power and VRAM (16GB, btw). Heck, tbf a 2x A4000 setup actually is somewhat on par with, or in some cases even outperforms, a 3090 in pure rendering performance in Blender, and the total cost of the two A4000 cards was smaller than one 3090 AIB model.
    Having said that (if I weren't forced to go Nvidia for OptiX / work), if I needed a GPU just for gaming I would go for an RX 6800 XT, no fcks given what anyone says: cheaper, cooler, less noisy than a 3080 Ti.

  • @Vladx35
    @Vladx35 3 ปีที่แล้ว +1

    Try copying the new DLSS 2.2 DLL module into DLSS-supported games. It seems to reduce the ghosting and blur to virtually non-existent. (Rough sketch of the swap below.)
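
A rough sketch of that swap, assuming the game keeps its DLSS runtime as nvngx_dlss.dll in its install folder; the paths are placeholders, and backing up the original first is strongly advised:

```python
import shutil
from pathlib import Path

# Hypothetical paths - point these at your own download and game install.
new_dll = Path(r"C:\Downloads\dlss_2.2\nvngx_dlss.dll")
game_dir = Path(r"C:\Games\SomeDlssGame")

old_dll = game_dir / "nvngx_dlss.dll"
if old_dll.exists():
    shutil.copy2(old_dll, game_dir / "nvngx_dlss.dll.bak")  # keep a backup
    shutil.copy2(new_dll, old_dll)                          # drop in the newer DLL
    print("Swapped DLSS DLL; restore the .bak file if anything misbehaves.")
else:
    print("No nvngx_dlss.dll found here - this game may keep it elsewhere.")
```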

  • @d00dEEE
    @d00dEEE 3 ปีที่แล้ว +1

    Radial fans are not inherently louder or quieter than axial fans. The noise level is just a matter of effort, designing proper blades for a given flow and pressure ratio. AMD's cheap radial fans ("blowers") on their reference cards typically have square leading and trailing edges, which are about as bad as you can get for noise.

    • @yusuffirdaus3900
      @yusuffirdaus3900 3 ปีที่แล้ว

      I would rather say impeller than radial fan

    • @d00dEEE
      @d00dEEE 3 ปีที่แล้ว

      @@yusuffirdaus3900 The impeller is the part of a fan that rotates and moves the air; it can be either radial, axial, or a hybrid (for examples of hybrids, see turbocharger or centrifugal water pump impellers).

  • @seasesh4073
    @seasesh4073 3 ปีที่แล้ว +1

    I love comedic Tom. More comedic Tom please

  • @Whizzer
    @Whizzer 3 ปีที่แล้ว +9

    The 3080 Ti is for nVidia of course. They're making bank selling it.

    • @SoftnappGAMEPLAY
      @SoftnappGAMEPLAY 3 ปีที่แล้ว

      Not really, it was a bad, hollow product which isn't useful for creators, gamers, or miners. They just planned the cash grab out since they didn't want to waste dies on cheaper models; Nvidia makes more money on mobile GPUs than on desktop GPUs