Radeon HD 7000 Series with Hindsight: Great, or Disappointing?

  • Published on 27 Jun 2024
  • The Radeon HD 7000 is a well-respected series of graphics cards, one that lasted for generations and is remembered fondly. But in this video, I explain that they actually sucked, and led to terrible things. Enjoy!
    I am sponsored by Gamer Supps - so if you want energy drinks (or your own WAIFU) then you can help support me AND get 10% off your own supps here: gamersupps.gg?afmc=klik4waifu
    0:00 - Intro
    0:59 - Before the 7000 series
    4:16 - Radeon 7000 series
    7:27 - Geforce 600 Series
    9:47 - Birth of the Titan
    10:24 - Clash of the Titans
    11:45 - Overly Overclocking
    13:30 - Why efficiency?
    14:27 - Please don't hate me, here are the reasons why it was still good
    15:46 - Conclusion
  • Science & Technology

Comments • 396

  • @juanignacioaschura9437
    @juanignacioaschura9437 2 years ago +289

    Actually, the 7000 moniker has come full circle twice:
    - ATI RADEON 7000 (R100): 2000
    - AMD RADEON HD7000 (GCN1.0): 2011/2012
    - AMD RADEON RX7000 (RDNA3): 2022

    • @ahreuwu
      @ahreuwu 2 years ago +21

      can't wait for the amd radeon xl7000 (rgp3) in 2033

    • @DigitalJedi
      @DigitalJedi 2 years ago +6

      I have one of each of the first 2. Hoping to switch away from Nvidia for the RX7000 series to go full circle.

    • @pauls4522
      @pauls4522 2 years ago +4

      Yup, I found that pattern too. Almost every 10 years AMD loops back to a 7000 series.
      The original Radeon 7000 series, the HD 7000 series, and now the upcoming 7000 XT series.
      I have personally owned the awful 32MB ATI Radeon 7000, a card which I actually fondly remember because it was my first GPU in my first ever PC that was all mine and not shared with family. I did not learn it was a gimped card until years later. Gimped as in a 64-bit memory bus, so even 3dfx cards from late 1998/early 1999 could beat it, despite it having DDR memory instead of SDRAM. For example, my sister had a 16MB 3dfx Voodoo3 3500TV, a 1GHz Athlon CPU and 384MB of RAM. I had a 500MHz Intel Pentium 3, 768MB of RAM, and the 32MB ATI Radeon 7000.
      Back then I thought her PC was faster because she had a drastically better processor and a much nicer monitor providing cleaner visuals, but now I know her Voodoo card was at least 30% faster than mine.
      I also owned an HD 7970 and still fondly remember it as my workhorse GPU for 5.5 years; I would have kept using it if the card hadn't started dying around the 5-year mark, leading me to replace it in February 2019.
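
      The bus-width point above is easy to quantify with the standard theoretical-bandwidth formula (bytes per transfer × transfers per second). A minimal Python sketch; the clock figures are illustrative assumptions for cards of that era, not verified specs:

      ```python
      def mem_bandwidth_gbs(bus_bits: int, mem_clock_mhz: float, pumps: int) -> float:
          """Theoretical bandwidth in GB/s: bus width in bytes x clock x transfers per clock."""
          return (bus_bits / 8) * mem_clock_mhz * 1e6 * pumps / 1e9

      # Assumed ~183 MHz memory clocks, for illustration only:
      print(mem_bandwidth_gbs(64, 183, 2))   # Radeon 7000, 64-bit DDR  -> ~2.9 GB/s
      print(mem_bandwidth_gbs(128, 183, 1))  # Voodoo3, 128-bit SDR     -> ~2.9 GB/s
      ```

      At equal clocks, DDR on a 64-bit bus only ties a 128-bit SDR bus, so any clock or efficiency deficit leaves the narrower card behind.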

    • @albal156
      @albal156 2 years ago +3

      OMG 10 years apart too lol.

  • @GamingRevenant
    @GamingRevenant 2 years ago +139

    The AMD HD 5870 is literally the GPU which powered the start of my channel, and for more than 10 years it allowed me to record videos and play video games with little to no problems (at 1080p). I consider it the best price-to-performance GPU I have ever had.
    When I went to buy an HD 6870, the salesperson literally told me it was not worth upgrading, because all AMD did was shift the previous generation's model names down without really improving anything. I'm happy he was honest about it, and it seems he was right.

    • @e-cap1239
      @e-cap1239 2 years ago +10

      Actually, the 6870 is worse, as it has fewer shaders. But it was sold for nearly half the price. I personally also had an HD 5870 2GB and its performance was quite good, even if nowadays it is comparable to a GTX 1050.

    • @raresmacovei8382
      @raresmacovei8382 2 years ago +6

      But that's not actually correct. Performance was mostly the same, but the card was smaller, more efficient, cheaper, and great at CrossFire.

    • @stevy2
      @stevy2 2 years ago +2

      I had the HD 6950, unlocked to a 6970. That thing was a budget beast back in the day.

    • @Cheesemonk3h
      @Cheesemonk3h 2 years ago +1

      @@stevy2 I used a 6950 up until like 2018. It was severely outdated by that point, but I still used it to play Overwatch at launch. Shame how quickly that game got ruined by identity politics.

    • @Annyumi_
      @Annyumi_ 2 years ago

      @GamingRevenant you could have got a GTX 260 to start your 10-year channel, since all the games you're playing are DX9, so you could have used that instead of a DX11 GPU.

  • @enricofermi3471
    @enricofermi3471 2 years ago +315

    That performance-efficiency thing explained around seven and a half to eight minutes into the video seems to be in reverse right now, lol. nVidia is rolling out 350-450 watt furnaces (3090, 3090 Ti) with more to come in the 4000 series, while AMD is keeping it within 300 watts, with a bit over that in their recent 6950 XT.

    • @miknew20
      @miknew20 2 years ago +43

      if current rumours are right, Ada Lovelace is gonna have 450W+ TDPs to compete. maybe even one 600W SKU. if true, then history repeats itself in funny ways.

    • @12Music72
      @12Music72 2 years ago +68

      @@2kliksphilip Because it came from a source who's known to be spot on with the rumours?

    • @PMARC14
      @PMARC14 2 years ago +9

      @@2kliksphilip With the release of the next-gen cards, which I hope to see you review, we will be able to tell better. AMD's RDNA2 seems to be a great design, being usable as an iGPU in devices such as the Steam Deck. They also seem to have advanced new designs with multi-chip GPUs, like their ground-breaking Ryzen CPUs. In comparison, NVIDIA seems to be doing the exact thing AMD has been doing: being behind on these new technologies, hoping minor architecture revisions, better nodes and overclocks will pull them forward for the moment. Nvidia is using a custom 4N node from TSMC for its next-gen card; AMD seems likely to use a mix of 5nm and 6nm. Die size may also matter less for AMD due to the multi-chip design, compared to the previous cards discussed. If AMD can even pull parity with Nvidia with this, then they will have accomplished something similar to what Nvidia managed with the 600 series GPUs.

    • @Just4Games2011
      @Just4Games2011 2 years ago +21

      @@2kliksphilip Because the people who leaked them have very good track records. RDNA 3 is gonna be an MCM GPU with 3 dies, versus Nvidia's monolithic 4090. Moore's Law Is Dead has a good track record of these leaks becoming reality, and he confirmed both Lovelace at 600W and RDNA 3 at around 400-450W.

    • @pyrophobia133
      @pyrophobia133 2 years ago +14

      both sides take turns making power hungry cards

  • @rzarectz
    @rzarectz 2 years ago +54

    Kliks your tech content is top notch. Keep it up!

  • @BudgetBuildsOfficial
    @BudgetBuildsOfficial 2 years ago +226

    The HD 7770 was my opening into the world of real PC gaming, coming out when the PS4 and Xbox One were both incredibly overpriced while offering comparable quality. I still have plenty of fond memories (and my original, if slightly awful, VTX variant of the card).
    Really liked this little recap of the real-world situation of this generation 👌

    • @HappySlappyFace
      @HappySlappyFace 2 years ago +2

      Burger

    • @jaect
      @jaect 2 years ago +4

      @@HappySlappyFace Pretty much same man, had a Gigabyte HD7970 as my starting card and it paved the road forward; fantastic stuff, lasted a damn while too.

    • @Forendel
      @Forendel 2 years ago +1

      @@jaect I'm still using a 7970 lol

    • @tuff_lover
      @tuff_lover 2 years ago

      Eww. Did you overclock it, at least?

    • @HappySlappyFace
      @HappySlappyFace 2 years ago +1

      @@tuff_lover why eww? I'm a 7970 user too and it's really holding up strong

  • @01chohan
    @01chohan 2 years ago +59

    AMD putting 3GB of VRAM in the 7900 series made such a massive difference as the GPUs aged. Reminds me of the i5 2500k vs i7 2700k comparison

    • @Orcawhale1
      @Orcawhale1 2 years ago +1

      Both of which are completely wrong,
      as the 7970 ran out of performance long before it ran into a VRAM problem.
      The same thing happened to the 2500k.

    • @AFourEyedGeek
      @AFourEyedGeek 2 years ago +15

      Haha, no it didn't; the 3GB was an awesome decision.
      Check out the '7970 gaming in 2020' YouTube video. The card was doing really well for such an old card, helped by the higher amount of RAM on it. My Vega 64 is still going strong thanks to its 8GB of RAM.

    • @Orcawhale1
      @Orcawhale1 2 years ago +1

      @@AFourEyedGeek No, your Vega 64 is still going strong on account of the fact that we've hit a point of diminishing returns in graphics,
      which means that there's no longer the same push for increases in graphics.
      And I'm afraid you're wrong; the 7970 does in fact run out of performance way before the VRAM buffer hits max.

    • @AFourEyedGeek
      @AFourEyedGeek 2 years ago +8

      @@Orcawhale1 The video I suggested was evidence to the contrary regarding the 7970: games using just under 3GB of VRAM and performing relatively well. Increasing texture size has a minimal hit on performance but eats into VRAM. Higher textures can make games look good, though you'll have to lower other settings to maintain decent fps. Different settings stress different parts of a system, so tweaking settings for your specific system can be beneficial; even a mismatch of more RAM than the GPU's relative performance warrants can offer a reasonable trade-off.

    • @pauls4522
      @pauls4522 2 years ago +1

      @@Orcawhale1 Incorrect, dude.
      By around 2016, the 3GB of VRAM was more of a limiting factor for the card than its raw performance.
      I have personally noticed this in plenty of games from that era, such as Mirror's Edge Catalyst. With the less commonly used 6GB variant, games from that era can have maxed-out textures and still hit 1080p/60 just fine. With the 3GB variant, however, you would see fps in the 20s if you attempted to max out the textures.

  • @mariuspuiu9555
    @mariuspuiu9555 2 years ago +119

    I believe the 7000 series is considered legendary because it improved a lot with driver updates (aka AMD Finewine)

    • @RealMephres
      @RealMephres 2 years ago +3

      AMD's drivers are somehow very good cross-platform in almost every regard. Them unnecessarily increasing prices is annoying, but you've got to give respect to their software department.

    • @Orcawhale1
      @Orcawhale1 2 years ago

      AMD Finewine has never been a thing.
      It was simply down to the architecture being reused over and over again.

    • @mariuspuiu9555
      @mariuspuiu9555 2 years ago +8

      @@Orcawhale1 It was a major thing early on with GCN. The driver updates gave that architecture major improvements over time (especially for those who bought the HD 7000 and 200 series). For example, in Doom Eternal the GCN cards are at times twice as fast as Nvidia's 600/700 series (direct competitors at launch).
      The 7970's competitor should have been the 580, but it can beat the 780 Ti in some titles now.

    • @AFourEyedGeek
      @AFourEyedGeek 2 years ago +7

      AMD's drivers kept improving performance for GCN, while NVIDIA's drivers hurt performance over time and then dropped support early.
      It probably has a lot to do with AMD using GCN for so long; however, the outcome for 7000 series owners is that the GPU kept improving over time.

    • @Orcawhale1
      @Orcawhale1 2 years ago +1

      @@mariuspuiu9555 Doom Eternal uses Vulkan, which itself is based on AMD Mantle, from 2013,
      so obviously GCN is going to perform better than Nvidia.
      What's more, the increased performance was simply down to the fact that developers became more familiar with AMD's GCN architecture,
      as the PS4 and Xbox One both used AMD hardware.

  • @simpson6700
    @simpson6700 2 years ago +25

    I feel like we need history videos like this whenever a new generation comes out, reminding us of the performance, price, die size, and power draw. I feel like Nvidia and AMD went off the rails this gen with the shortage, and I don't think we will ever go back to normality, because the shortage was profitable.

  • @mancavegamingandgardening9901
    @mancavegamingandgardening9901 2 years ago +56

    The real winner? The 7950 owners who bought an 'underwhelming' card that was later BIOS-patched to a better clock speed, and THEN was one of the best crypto-mining GPUs of the time, so you could sell the card second-hand and pocket more than you paid for it. I think this series is the first time I saw a GPU appreciate in value.

    • @dylon4906
      @dylon4906 2 years ago +3

      same kinda thing with my RX 580, which I sold for more than twice what I bought it for, 2 years after I bought it

    • @Shiinamusiclyricssubs
      @Shiinamusiclyricssubs 2 years ago

      @@dylon4906 same with my vega

    • @ProperlyPsychotic
      @ProperlyPsychotic 2 years ago

      @@Shiinamusiclyricssubs well that's been the case for nearly every GPU that's remotely competent at crypto mining

  • @Peterscraps
    @Peterscraps 2 years ago +20

    I got a 3090 because it was literally the only thing I could get my hands on; the same thing happened back in early 2016 when I got my R9 390X. It seems I got stuck in the upgrade cycle that has the worst power efficiency and market conditions. Don't get me fucking started on the abomination that is the 3090 Ti: double the price, needing a new power supply, with energy costs going through the roof.
    Nvidia is playing the market because they didn't last time. It's like seeing your drunk uncle piss away his inheritance at Betfred.

    • @veryfunnyguy4991
      @veryfunnyguy4991 2 years ago +1

      “Another successful office blunt rotation. pass that shit homie. Actually Norman you're being kicked out of the circle. Oh you can't do this to me I started this sesh! You chief that shit for like 6 puffs before you pass you're out Norman. DO YOU KNOW MUCH MUCH I SPENT ON THIS WEED!? Couldn't be much dog this is mid af. Fuck these opps I'll buy my own weed. I would like to purchase your best loud. Yeah fella try this out it's called goblin gas. WHAT THE FUCK DID YOU PUT IN THIS SHIT!? AH HAHAHAHA FELLA I'M TWEAKIN!!!”

  • @thepcenthusiastchannel2300
    @thepcenthusiastchannel2300 2 years ago +22

    It's actually easy to tell how much of the performance uplift going from a 6970 to a 7970 was due to the new architecture and how much was due to the die shrink. A die shrink allows you to pack in more transistors and often allows you to achieve higher clocks. In this case, 28nm didn't allow for much of a clock speed improvement, going from 880 MHz to 925 MHz. The bump in ALUs ("shader units") also wasn't that large, going from 1536 to 2048, pushing FP32 performance up from 2.7 TFLOPS to 3.8 TFLOPS.
    The number of ROPs (still the main performance-determining factor) was the same at 32 for both. This meant that the pixel fill rate only went from 28.2 GPixel/s to 29.6 GPixel/s, yet the performance went up significantly between the two cards.
    This is because fill rate efficiency went up, and pixel shader efficiency went up as well. The move AMD made was from VLIW to SIMD. So what we're seeing is a boost predominantly led by architectural efficiency improvements rather than node shrink improvements.
    However, GCN was meant to service two markets at once, and this was its downfall in the consumer market. It was meant for the professional workstation market as well as the consumer gaming market. This meant that GCN tended to be extremely powerful compute-wise but lacked pixel fill rate performance. nVIDIA, on the other hand, segmented their architectures: nVIDIA's gaming line had more ROPs than AMD's and less compute performance. nVIDIA basically went the way of the ATI/AMD 4000/5000 series with Kepler and Maxwell: less power usage, more gaming-oriented.
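
    The numbers in this comment follow from two standard formulas: FP32 throughput = 2 FLOPs (one fused multiply-add) × ALUs × clock, and pixel fill rate = ROPs × clock. A quick Python sketch reproducing them:

    ```python
    def fp32_tflops(alus: int, clock_mhz: float) -> float:
        """FP32 throughput: 2 FLOPs (one FMA) per ALU per clock."""
        return 2 * alus * clock_mhz * 1e6 / 1e12

    def fill_rate_gpixels(rops: int, clock_mhz: float) -> float:
        """Pixel fill rate: one pixel per ROP per clock."""
        return rops * clock_mhz * 1e6 / 1e9

    for name, alus, rops, mhz in [("HD 6970", 1536, 32, 880), ("HD 7970", 2048, 32, 925)]:
        print(f"{name}: {fp32_tflops(alus, mhz):.1f} TFLOPS, "
              f"{fill_rate_gpixels(rops, mhz):.1f} GPixel/s")
    # HD 6970: 2.7 TFLOPS, 28.2 GPixel/s
    # HD 7970: 3.8 TFLOPS, 29.6 GPixel/s
    ```

    The near-identical fill rates alongside a ~40% compute jump are exactly why the comment attributes the real-world gains to architectural efficiency rather than raw specs.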

  • @timothypattonjr.4270
    @timothypattonjr.4270 2 years ago +7

    The 7970 aged better than expected because AMD continued to use GCN for far longer than Nvidia used Kepler.

    • @Loundsify
      @Loundsify 2 years ago +1

      It also helped that a lot of game engines on console were built around GCN.

  • @FatSacks
    @FatSacks 2 years ago +66

    I had a 7970 back in the day that I bought for only $330 a few months after it released! It was legitimately the best GPU I ever owned. I played Crysis 3 on it at pretty high settings, plus Bioshock 3, GTA 5, Wolfenstein and Far Cry 4, and it lasted me until Witcher 3. My friend let me have his 780, and being blinded by the shiny new Nvidia card, I sold the 7970 on eBay for a cheeky $150. Two months later the first mining craze struck and the card was suddenly worth $500+. AND the 780 turned out to be a piece of trash with how hot it ran and how loud the cooler was compared to the Sapphire cooler on the 7970, and to top it all off it wasn't even faster (I was also running an 8150 at the time, so I think Nvidia drivers sucked on Bulldozer compared to the GCN stuff).
    After the 780 I went to the 980 (moving to Intel with the 7700k here), then the 1070, 1080 and 2080 Ti (+9900k), and these days I'm running a 3080, and it's OK I guess. RT is nice and the Founders cooler looks sick and runs pretty cool, but I don't feel the same way I felt about the 7970. Maybe it's just me getting older and more jaded and expecting more out of my hardware.

    • @FatheredPuma81
      @FatheredPuma81 2 years ago +3

      I'd guess the reason for that is that it was the tail end of the Extreme Settingz era, where you needed a top-of-the-line card to max out a game at 1080p and get a solid 60 FPS, and the start of a tech stagnation that was only really broken when the 10 series came out.
      I'm sure many people feel the same way about the 8800 Gxx cards, actually.
      Having just upgraded from a GTX 1080 to a 6800 XT and then "upgraded" (in my eyes) to an RTX 2080 Ti, I can say I only really noticed a difference in games I don't really play. DLSS will likely make this much, much worse when I upgrade from the 2080 Ti to a 4080/5080 or something.

    • @FatSacks
      @FatSacks 2 years ago +4

      Man, the 2080 Ti was honestly the worst card I've had in a while; it ran so much hotter and louder than my 1080. I don't think I would have gone to it from a 6800 XT.

    • @0Synergy
      @0Synergy 2 years ago +3

      @@FatSacks I actually went from a GTX 1080 to a 6800XT. What a fucking difference lmao. Crazy card.

    • @FatheredPuma81
      @FatheredPuma81 2 years ago +1

      @@FatSacks Tbh I just wanted to go back to playing Minecraft at a decent framerate and watching videos. Was actually aiming for a 3070 or 3080 but ended up finding it for $500.
      It does run hot though. With an OC and raised temps it gets to 84C :\.

    • @FatheredPuma81
      @FatheredPuma81 2 years ago

      @@0Synergy The gaming performance jump is massive. If you only game and don't watch any videos (especially while gaming), don't care about RT, and don't care about OpenGL games (Minecraft) then it's a really good card. Especially if you can get one at MSRP.

  • @JakoZestoko
    @JakoZestoko 2 years ago +14

    I am one of those people still running a 7770 GHz edition to this day. It only cost $100 CAD, which with today's prices is mind-bogglingly cheap for a mid-range card. I can still run CSGO at my desired graphics settings and consistently get at least 144 fps, so I'm happy with it. Definitely some of the best price/performance we'll ever see in this space I think.

    • @LucasCunhaRocha
      @LucasCunhaRocha 2 years ago

      I had a 7770 GHz and later a 7950 from Sapphire, and they both died quite fast; dunno if it was just me being unlucky or a design problem.
      I am using a 980 now, which was a massive upgrade.

    • @jasonjenkinson2049
      @jasonjenkinson2049 2 years ago

      My first real card was a Sapphire HD 7790 2GB, and boy did that change my world. Unfortunately I traded it for a GTX 760 three years ago, which used twice the power for a marginal improvement in performance. Luckily I know better now.

    • @lordwafflesthegreat
      @lordwafflesthegreat 2 years ago

      I loved my 7770 GHz. Had it 'til 2017/2018. It had only 1GB of VRAM, but could still run Watch_Dogs 1 and 2 with no issues.
      It was replaced very recently, when the 3060 Ti first dropped.

    • @goncalomoura9816
      @goncalomoura9816 2 years ago

      @@jasonjenkinson2049 That was a terrible decision; nowadays people are finding out that an HD 7790 / R7 260X clearly outperforms a GTX 680 or GTX 770 in newer DX12 and Vulkan titles, while costing 4 times less and using half the power. But to be fair, nobody knew that back in the day, and this video gets it all wrong by comparing early benchmarks; "tech experts" were advising people on a budget to buy the GTX 650 Ti or 750 Ti paired with i3 dual-core CPUs over an FX 6300 + 260X. Nowadays those cards and CPUs cannot even launch the newest games, while the HD 7790 can still play Elden Ring, FH5 or God of War smoothly at 1080p, and FX CPUs are outperforming older 2nd and 3rd gen Intel ones.

    • @jasonjenkinson2049
      @jasonjenkinson2049 2 years ago

      @@goncalomoura9816 😢

  • @TrueThanny
    @TrueThanny 2 years ago +9

    Crossfire actually worked very well at the time. The number of games that didn't work well with it was very small.
    I had a pair of 7970s clocked at 1200MHz each, and that was faster than any nVidia card for years afterwards.
    As for efficiency, AMD is well ahead with RDNA 2, and is likely to get further ahead with RDNA 3.
    It's an error to conclude that the difference in power consumption is down to the node. The power efficiency difference between TSMC's 7nm node and Samsung's 8nm node is far lower than the efficiency difference between RDNA 2 and Ampere. Whether or not RDNA 3 takes the absolute performance crown, it's looking like it will utterly destroy nVidia in performance per watt. And that's with nVidia seemingly having a slight node advantage.
    But we'll see once we have actual data rather than just rumors.
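
    The decomposition argued here is simple to express: overall perf-per-watt ratio = node factor × architecture factor. A minimal sketch with made-up illustrative numbers (none of these are measurements):

    ```python
    def perf_per_watt(fps: float, watts: float) -> float:
        return fps / watts

    # Hypothetical cards at equal fps, differing only in board power:
    rdna2_eff  = perf_per_watt(100, 300)
    ampere_eff = perf_per_watt(100, 350)

    overall_gap = rdna2_eff / ampere_eff      # ~1.17x overall efficiency lead
    node_factor = 1.05                        # assumed small TSMC N7 vs Samsung 8N advantage
    arch_factor = overall_gap / node_factor   # whatever remains is architectural
    print(f"overall {overall_gap:.2f}x = node {node_factor:.2f}x * arch {arch_factor:.2f}x")
    ```

    If the node factor is small, most of the measured gap has to be attributed to architecture, which is the comment's point.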

  • @Lyajka
    @Lyajka 2 years ago +2

    nice and informative video! looking forward to seeing FSR 2.0 at extra low resolutions!

  • @dklingen
    @dklingen 2 years ago +4

    Great video and a nice perspective - sadly, the days of new cards matching the prior generation's performance at the prior generation's pricing seem to be lost, as we hit $2K for top tier (insanity).

  • @onurekinaydin
    @onurekinaydin 2 years ago

    This was an amazing video; it put everything GPU-related that we have been going through into perspective. I wasn't old enough to follow the GPU market back in 2011, so thank you so much for this video!

  • @bmcreider
    @bmcreider 1 year ago

    Awesome video on my current rabbit hole of nostalgia I’ve ventured down.

  • @kiri101
    @kiri101 2 years ago +11

    The HD 7850 2GB has aged like a fine wine; mine's still powering forward with modded drivers on Windows and excellent out-of-the-box support on Linux.

    • @Orcawhale1
      @Orcawhale1 2 years ago

      Then obviously it hasn't aged like fine wine.

    • @kiri101
      @kiri101 2 years ago

      @@Orcawhale1 It received considerable performance improvements in its lifetime, especially in open source drivers, and still continues to benefit from increased functionality and performance with modded drivers. So yes, like fine wine.

  • @Trovosity-Entertainment
    @Trovosity-Entertainment 2 years ago +1

    Love the information, love the music

  • @MarcinSzklany
    @MarcinSzklany 2 years ago

    I love the hardware videos you do. Very insightful. Thanks!

  • @danielmadstv
    @danielmadstv 2 years ago +1

    This was awesome and I really enjoyed the history. I got into gaming with the HD 6950 and honestly didn't know much about the exact history afterwards. This was super interesting and I hope you make more like this. I'm also eagerly awaiting your FSR 2.0 video! Can't wait to hear your take, that's the one I respect the most as you are my designated upscaling and AI expert. Thank you for your excellent work as always.

  • @akn8690
    @akn8690 2 years ago +9

    I'm subscribed to both this and the kliksphilip channel, but videos from this channel won't show up on my subscriptions page. I saw this on my home page. Weird.
    Also, I would like to see a video about why AMD drivers were bad, whether they are still bad, and whether we should pay more for Nvidia cards for their better and more stable software support (both DLSS and Nvidia Broadcast kinds of things, and drivers).

    • @unison_moody
      @unison_moody 2 years ago +1

      Same

    • @gennoveus
      @gennoveus 2 years ago

      The video was hidden for me, too...

  • @hellwire4582
    @hellwire4582 2 years ago +11

    I got a 7950 as a replacement for a 6670 in 2020. I was blown away when I could run Red Dead Redemption 2 at 1600x900 low settings, with an OC'd i7 950 and the 7950, at 40 fps ^^ The game still looked totally amazing and felt smooth.

    • @FatSacks
      @FatSacks 2 years ago +1

      that's badass

    • @Orcawhale1
      @Orcawhale1 2 years ago

      Then I'm afraid you don't know what smooth is.

    • @hellwire4582
      @hellwire4582 2 years ago +7

      @@Orcawhale1 I know your brain is smooth

  • @ODST1I7
    @ODST1I7 2 years ago +1

    I'll always remember the 7850 fondly. I used it for almost 6 years, even though I had to run most games on low by the end of its life. The price point was something else back then; I got an ASUS card for about $250, bundled with $100 in new games. Great video, Philip.

  • @kobathor
    @kobathor 2 years ago +5

    I loved my Sapphire HD 7970 Dual-X until 2016, when I got a Sapphire R9 FURY for $250 and sold the 7970. I still sometimes regret selling the 7970. I had treated it so well, having repasted it, replaced a dead fan, things like that. It even played a lot of DirectX "12" games ;)

  • @simonrazer8303
    @simonrazer8303 2 years ago +4

    It felt like you said "in conclusion" 50 times in this video.

    • @simonrazer8303
      @simonrazer8303 2 years ago

      @@2kliksphilip Oh okay. Well, it was a good video no matter what 👍

  • @isaacweisberg3571
    @isaacweisberg3571 2 years ago

    Philip, this was a great video which made me very nostalgic

  • @syncmonism
    @syncmonism 2 years ago +8

    A lot of people don't understand just how much of Nvidia's performance advantage has come from games being better optimized for Nvidia hardware. People like to attribute better performance per watt at the same node size entirely to which GPU architecture is better, but that's not a reasonable assumption to make.
    In other words, Nvidia's GPU engineers had an unfair advantage over AMD's.

    • @cyjanek7818
      @cyjanek7818 2 years ago +7

      @@huleyn135 I don't know if you don't believe him or if you're calling these practices shit, but it is true that the biggest player on the market manipulates it in every possible way.
      For example, it was a known problem that when the laptop Ryzen 4000 series came out and beat everything Intel had, people had trouble finding laptops with those CPUs, because many companies had deals with Intel to take only their CPUs. This is possibly the reason why AMD has the Advantage program - many laptops with AMD were low-end things that made a bad impression.
      Same with optimizing games - it isn't some hot take that games work better with cards from companies who helped; I think this problem got a lot of attention during the Nvidia HairWorks and Witcher 3 era.
      Whether it's unfair or "it is what it is" depends on your view, but I just wanted to clarify that it really happens.

    • @simpson6700
      @simpson6700 2 years ago

      @@huleyn135 It's true that AMD cards were better at pure computational tasks that weren't games, and some games (Codemasters' GRID, DiRT and later F1) reflected that performance too. But in the end you should buy your GPU based on what you get, not what could maybe be in the future if every game were optimized for AMD, based on a couple of outliers. Just like you shouldn't buy an Nvidia GPU on the premise that every game could be made with GameWorks technologies.

    • @user-fs9mv8px1y
      @user-fs9mv8px1y 2 years ago

      Honestly the only real advantage nvidia has is better raytracing performance as far as I know

  • @xTheMoegamer
    @xTheMoegamer 2 years ago

    My PC gaming journey started with the Radeon HD 7950... how time flies! It's also still my backup if anything happens to my 1080; it would still more or less run my most-played games, so I am happy.

  • @0Blueaura
    @0Blueaura 2 years ago

    the switch to 6000 series at 2:00 got me really good xD

  • @Squeaky_Ben
    @Squeaky_Ben 2 years ago +2

    I remember building my very first gaming PC in 2011.
    8 GB of RAM, a 6 Core Phenom II and a HD 6970.
    All to play Crysis 2 and Battlefield 3 at maximum resolution.
    Those really were the days.

  • @beachbum111111
    @beachbum111111 2 years ago

    The Radeon 7950 was the first PC gaming card I had. Previously I had a laptop with a Radeon 5650 that only lasted me 2 years, but that 7950 survived until this last year, when it seems to have crashed for good, and man did it get me through a lot.

  • @WangleLine
    @WangleLine 2 years ago

    This was a really nice watch~

  • @CompatibilityMadness
    @CompatibilityMadness 2 years ago +4

    Great video on an interesting topic, but I have to ask:
    Is the performance used in the table (@6:00) based on your own tests, time-of-release data/reviews, or even later tests (done near the end of driver support, circa 2020)?
    Because this will impact GCN quite a lot (AMD drivers needed to mature before full performance could be seen). The same happened with Navi and RDNA cards (Polaris/Fury to a lesser extent, since they aren't as radical a change vs. the 7970). In GCN 1.0's case, for example, we only got good frame pacing (1% and 0.1% lows) after NV showed its impact and tools that actually measure it were released. Anyone remember that whole thing?
    My $0.50 on this topic: multi-generation performance comparisons should always be made on a PC with the highest-IPC CPUs available (today that would be the Ryzen 5000 series or Alder Lake for GCN/Maxwell 2.0 GPUs).
    There is simply no way to effectively/accurately measure the best cards with era-specific hardware (OC'd or not).
    Furthermore, GCN was made as a more general-compute-focused architecture vs. TeraScale, as it was meant to rival, or at least contest, Nvidia's CUDA platform in supercomputers and other large-scale compute projects (which can bring in A LOT of money for more efficient compute designs). I think this is why its efficiency in "pure graphics" can be viewed as not as good as previous generations': the transistor budget was used to make it more versatile instead of faster in a specific area. Simply put: it's not the GPU's fault it's not as efficient in pure graphics as previous generations; it's how the GPU market changed in the few years since TeraScale cards were released that made it more profitable to leave some "game performance" behind for a better overall product.

  • @KaaptnIglo
    @KaaptnIglo 2 years ago

    There is definitely value in doing this :)
    was interesting and insightful indeed

  • @additivent
    @additivent 2 years ago +91

    They seem to have aged well, only failing now because of AMD's shoddy driver support. The 7770 would have been a legitimate budget option for me 2 years ago, if it weren't for a friend offering me a 1060 3GB for cheap.

    • @user-ol3tf1qi6c
      @user-ol3tf1qi6c 2 years ago +20

      The Linux open source driver still supports your card with updates, and always will. (:

    • @Keullo-eFIN
      @Keullo-eFIN 2 years ago +7

      Modified NimeZ drivers give the older cards some extra life support.

    • @cyjanek7818
      @cyjanek7818 2 years ago +8

      Wasn't it shown on the Linus Tech Tips channel that older AMD cards actually work better with newer games (better driver support)?

    • @MandoMTL
      @MandoMTL 2 years ago +5

      @@cyjanek7818 Fine Wine. Yep.

    • @conenubi701
      @conenubi701 2 years ago +16

      "Shoddy driver support"? This is an ancient card; you can't reasonably expect it to still be supported with drivers in this day and age.

  • @Maupa.
    @Maupa. 2 years ago +1

    Hi 2kliksphilip, do you decide what videos will be visible in the Subscription feed? Because I don't see this video on mine and that's really annoying.

  • @GD-mt4pe
    @GD-mt4pe 2 years ago +1

    I didn't get this video in my subscription feed, I saw it in my recommended 4 days later.

  • @demonabis
    @demonabis 2 years ago

    My first card was a Radeon HD 7850, paired with an FX-8320 and 8GB of RAM in 2012. Many memories with that card; it accompanied me for many years until it died and I got a 270. I clearly remember debating whether I should get the 7850 or the 660 Ti...

  • @tambarskelfir
    @tambarskelfir 2 years ago +4

    It's a nice recap of the situation, but there's some context missing. At the time of the HD 7000 series, Rory Read decided that there was no future in discrete GPUs and that AMD wasn't going to compete in that market going forward. AMD under Read genuinely believed that the future was APUs, and the GCN architecture was perfect for APUs: good enough for graphics, also excellent for GPGPU. This perennial lack of a competing product from AMD in discrete GPUs was in no small way due to corporate policy, which didn't change until after Ryzen.

    • @beetheimmortal
      @beetheimmortal 2 years ago +1

      That's absolutely insane. You just gotta love stupid decisions made by the leaders.

  • @Lenk9
    @Lenk9 2 years ago

    It was only one year ago that I upgraded from my Radeon 7800 to my current GTX 1080. You served me well, Radeon.

  • @StingrayForLife
    @StingrayForLife 2 years ago

    My HIS 7970 ghz edition just died a couple of weeks ago after almost a decade of service. So many memories!

  • @SciFiFactory
    @SciFiFactory 2 years ago

    It would be great to have you on the Moore's Law Is Dead podcast!
    Do you know Tom? He can talk about GPUs for ages. That would be a stellar episode! :D

  • @TRHardware
    @TRHardware 2 years ago

    Very entertaining video!

  • @aqueousdog
    @aqueousdog 2 years ago

    Do you think you could do a video dedicated to the price creep of gpus? It's definitely something to see the price of high-end cards then vs now.

  • @DDRWakaLaka
    @DDRWakaLaka 1 year ago +2

    what music do you use in this? the songs are all great

  • @haven216
    @haven216 2 years ago +1

    My first GPU that I ever owned was the HD 7730. For a little cheap GPU, it did quite well with the games I played at the time like Minecraft and Terraria.

  • @vAlcatr4z
    @vAlcatr4z 2 years ago

    The HD 7870 was the first GPU I ever owned; I can say it's still top of my GPU list. It ran many games at near-max settings back then with no fps issues, until I started playing Rust, which months later killed the GPU (the image would freeze a few minutes after launching any game). What a great legacy.

  • @henrik1743
    @henrik1743 2 years ago

    I remember the max fan test of this card, it was nuts

  • @leonader9465
    @leonader9465 2 years ago +2

    For some reason I was craving a 2kliksphilip video about hardware. lol Thanks.

  • @bakuhost
    @bakuhost 2 years ago +2

    What, the HD 7000 was 10 years ago? Can't believe it's that old.

  • @SaltyMaud
    @SaltyMaud 2 years ago

    I've had good timing with my past GPU upgrades. 4890 in 2009, 7950 in 2013 and GTX1080 in 2016, they've all been great cards at a great time. Not so sure about picking up a RTX3070 in 2022, but since I managed to yoink one at MSRP, I just went with it, might not be the best GPU purchase I've made lately if RTX4000 is coming out soon and it's as crazy as it's said to be.

  • @JwFu
    @JwFu 2 years ago +1

    Since I moved out (more than a decade ago) and had to pay energy bills myself, I started to care more about efficiency.
    Good video; somehow it didn't pop up in my sub tab.

  • @DCNion
    @DCNion 2 years ago

    It's quite annoying that these videos don't show up in the subscriptions feed. Luckily I caught it on the Home page.

  • @J0elPeters
    @J0elPeters 2 years ago

    I still have my 7950 which I bought used in 2014. Such a fantastic card

  • @freedomofmotion
    @freedomofmotion 2 years ago +5

    The 7850 was an overclocking monster. I had one that ran at 1.6GHz or something mad like that. Silicon lottery for sure, but damn that was a good card. MSI twin model.

    • @Loundsify
      @Loundsify 2 years ago

      So it was literally running twice the compute of the PS4.

  • @ianw9708
    @ianw9708 2 years ago

    I bought my HD 7970 GE second-hand in 2014 (coming from a 7770), and I used it until it completely died in 2021. It even went through the oven a few times. Always a real trooper. I miss RAMDACs :(

  • @Venoxium
    @Venoxium 2 years ago +2

    My first GPU was the Radeon 7790! Had an ASUS one, and I remember arguing with myself over whether to go with the cheaper 7770 GHz Edition and buy an FX-8320, or stick with the FX-6300 and go with the 7790.

  • @ffwast
    @ffwast 2 years ago +1

    [gently pats old 7950] "Don't worry buddy, I still love you. A home theater PC is an important job, you know."

  • @TiborSzarvas
    @TiborSzarvas 2 years ago

    I remember my first ATI card after a decade of nVidia reign: a Club3D X800 RX in 2005. Then in 2008, my last AGP card was a Sapphire HD 3850 with 256MB; those were the times!
    I was one of the last to change to PCI-E; around 2013-14 I got a Sapphire R7 250 1GB card, which got me through till the end of 2018, when I purchased an 8GB RX 580...
    I still have the RX 580; it has no problem running even the newest games at 1080p (on medium). I was today years old when I learnt it consumes 330 watts, which is way too much for my 550W power supply.
    Thanks for another entertaining and informational video, Philip.

    • @Loundsify
      @Loundsify 2 years ago

      330W is total system power. The RX 580 uses 185W.

    • @TiborSzarvas
      @TiborSzarvas 2 years ago

      @@Loundsify Well, thanks. So one still needs that much power in that "department" of the power supply. My previous supply was 550W also, but it distributed less power to the card, and the PC kept freezing/restarting while gaming. Changing to a better brand solved this, but I'm still thinking maybe I haven't got enough wattage for the card or some other component. Idk, anyway, thanks for the info.
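
      The distinction in this exchange (card power vs. total system power) is simple arithmetic. A rough headroom check; the GPU figure comes from the reply above, while the other component draws are assumptions for illustration:

      ```python
      # Rough sustained-load estimate vs. PSU rating (all in watts).
      gpu_w  = 185   # RX 580 board power, per the reply above
      cpu_w  = 95    # assumed CPU TDP
      rest_w = 60    # assumed motherboard, RAM, drives, fans
      psu_w  = 550

      system_w = gpu_w + cpu_w + rest_w   # ~340 W sustained
      headroom = psu_w - system_w         # ~210 W
      print(f"load ~{system_w} W, headroom {headroom} W ({headroom / psu_w:.0%} of rating)")
      ```

      Even with headroom on paper, transient spikes and a low-quality unit can still trip a PSU, which matches the brand-swap fix described above.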

  • @TylerL220
    @TylerL220 2 years ago

    My first card was a 7850, paired with an i7 920. That system impressed the hell out of me for what it was.

  • @CometAurora
    @CometAurora 2 years ago +1

    I used a 7950 for a long time. I got it in late 2015, so it was already pretty outdated with the GTX 1000 series just around the corner, but it served me damn well until about 2021.

  • @kasuraga
    @kasuraga 2 years ago

    I rocked a CrossFire HD 4830 setup for years, upgraded after like 5 or 6 years to an HD 7850, then to the GTX 970 a few years after that. Great cards for the price, imo.

  • @SyphistPrime
    @SyphistPrime 2 years ago

    My fondest memory of this era of GPUs was one not POSTing in my PC, so I had to go with a 6850. It makes me kind of sad, because a 7000 series card would have stayed usable for light Linux gaming for much longer than the 6850 lasted. It was night and day, though, going from a 6450 to a 6850. I still have both of those GPUs to this day. The 6850 is unused, but the 6450 is a great display adapter for my server PC, in case I need to hook up a display for console output.

  • @parkerlreed
    @parkerlreed 2 years ago +1

    I still have my RX 480 8 GB from 2016 and it's going great. For me that was the last of the great performers at a great price.

  • @unyu-cyberstorm64
    @unyu-cyberstorm64 1 year ago +1

    My ATI Radeon HD 5870 runs Dreadnought at 1080P 60 at medium settings. It’s very nice, and is DX11 compatible

  • @NekBlyde
    @NekBlyde 2 years ago

    This vid didn't appear anywhere in my subscription box, I only noticed it in my recommendations because I was watching your Iceland Day 2 video which DID appear in my sub box. I hope 2kliksphilip gets fixed. :'(

  • @benhinton5475
    @benhinton5475 2 years ago +2

    The 7000 series was pretty fantastic for compute when it came out

  • @SlapsYourShit
    @SlapsYourShit 2 years ago +6

    I think it's important to acknowledge that the 7000 series is literally the reason frametimes became a standard metric in framerate analysis, as it had horrendous stuttering in DX9 games unless they were specifically patched (like Skyrim).
    It's the reason I went NVIDIA with the 970 and never went back.

  • @lmchucho
    @lmchucho 2 years ago

    Philip, this coming generation is the third 7000 generation from ATI. The very first Radeon card was the Radeon 7200, from the year 2000.

  • @PackardKotch
    @PackardKotch 2 years ago

    I know this is a Radeon 7000 series video, but damn does the 6000 series hold a special place in my heart. My first dedicated graphics (in an AIO PC) and my first dedicated graphics in a desktop PC (a 6770M and a 6950 1GB, respectively).

  • @pauls4522
    @pauls4522 2 years ago

    I have seen multiple youtubers assemble the same data and results about the 7k series, but I have to say your presentation was absolutely phenomenal.
    Fun fact, though: the HD 7970 was actually released in late December 2011, in relatively limited supply. Not that the extra week really matters much in retrospect.
    The OG HD 7970 was originally beaten by the GTX 680, but with driver improvements over many years it ended up beating the Nvidia card.
    There was also a less commonly talked about, super-overkill-for-the-time 6GB variant of the HD 7970. In the rare case it is mentioned, many of the large youtubers instantly dismiss it as not mattering because "DERP- the HD7970 is not powerful enough to fully utilize 6gb vram -DERP", without going into proper analysis, which I find a shame. I personally know quite a few games, such as Mirror's Edge Catalyst among others released around 2016/2017, which were bottlenecked by the 3GB VRAM buffer and could have seen maxed textures and still 60fps on the 6GB variant.
    I bought my HD 7970 when its price dropped to $300 and came with 3 free games, in September of 2013. I used the card until February 2019, and the only reason I retired it then was that it was starting to show signs of dying, such as no longer being able to overclock, 1-2 games having GPU driver crashes, and some games showing "occasional" artifacts that would go away with a game restart. Most of the time the card still ran great, but I knew it was not sustainable, so I took advantage of the early 2019 mining crash to pick up a dirt cheap

  • @morwar_
    @morwar_ 2 years ago

    Me and my brother had one of those.
    Good old times.

  • @prgnify
    @prgnify 1 year ago +1

    The thing about the HD 7000, IMO, was that it was a good buy for so much longer, with ATI/AMD 'rebadging' it with different tech. To me the 280X is the greatest card ever in terms of longevity and being 'fine wine', and if I'm not mistaken it was 'just' an HD 7000 in disguise.
    Honest to god, I bought a Vega 64 (the Red Devil one, because of the cooling) at launch. For years I regretted it, since it was priced between the 1080 and 1080 Ti, and if I had puckered up and paid the extra, the 1080 Ti would have been better - especially since AMD walked back some features (I can't remember them by name, but there was some pipeline/scheduler thing that was supposed to make Vega the best thing ever [tm]). But nowadays, especially after seeing Hardware Unboxed's videos, I'm fine with my decision - I'm still using it daily, never had any issues, and HBM is still nice for OpenCL.

  • @Catzzye
    @Catzzye 2 years ago +9

    Video idea: upscaling old futuristic graphics card box art

  • @fups1
    @fups1 2 years ago

    Fantastic video! I have a lot of great memories with my old 7950! If only I'd held on to the millions of dogecoin I mined with it a little while longer...

  • @xGatoDelFuegox
    @xGatoDelFuegox 1 year ago

    Are Nvidia's fab terminologies the same as AMD's? "28nm" for Intel and AMD are different things.

  • @TrevorLentz
    @TrevorLentz 2 years ago

    I remember grabbing a used MSI Twin Frozr 7950, upgrading my rig from a brand new 6850. My FX-8320 processor may have been a bottleneck, but it kept my main rig up to date enough to enjoy games at 1080p. I wish I had kept the eBay email receipts to help remember how much I spent back in those days. GCN 1.0 was actually really outstanding for used-parts folks of the time. I never saw the 7950 as a disappointment. It was... good enough for me (a dude who only wanted to play games with my budget FX-8320 build at the time). Sure, AMD could have overclocked it and made it less power efficient, but the market was a whole other world back then. I'll admit, I was foolishly a bit of an AMD fanboy back then, but I could still enjoy games without breaking the bank.
    Having a better job and more money now, I can afford to upgrade my PC way more frequently (I should say, a few of my PCs). When I was younger and working a minimum wage job, the inferior products at a reasonable price were a perfect step into a better generation of PC hardware.

  • @luckythewolf2856
    @luckythewolf2856 2 years ago +2

    I’m pretty sure the 6970 was the real replacement for the 5870 because the 5970 was a dual gpu card and the 6990 was the successor to that. This means the naming was shifted up one tier of card.

    • @luckythewolf2856
      @luckythewolf2856 2 years ago

      @@2kliksphilip Ah I see! Yes I agree hopefully the next generation can help us although I’m not super hopeful for the power consumption being great

  • @ocudagledam
    @ocudagledam 2 years ago

    My experience with the 7000 series was that I bought a 7950 for a song exactly three years after it launched, kept it for 4 years, during which it fairly happily chewed through what I cared to throw at it at high details at 1080p and even after that I was still able to resell it (for half of what I originally paid for it) as it still noticeably outperformed the RX560, which was AMD's entry level solution at the time.
    Regarding the 3GB, I can tell you that it made a heck of a difference in AC Unity, which, even at 1080p, struggled on the 2GB cards unless the textures were set to low, and, considering how long I'd kept it, I'd wager that it wasn't the only game where I benefitted from the extra gig.
    BTW, I previously owned a Radeon HD 6950 2GB for another four years (that is, more or less, since that card had launched) and while one generation up doesn't seem like much, and in theory there wasn't that much of a difference in raw power, in practice it was enough to give me another 4 years of moderate gaming, so I would call the 7000 series quite successful.

  • @exmerion
    @exmerion 2 years ago

    My 7850 1GB had one fan and ran incredibly hot. When it hit 80°C during the summer, it would crash the display drivers.

  • @mulymule12
    @mulymule12 2 years ago

    Ooo, I had 2 7850s in CrossFire; some games struggled, but otherwise they lasted well up to 2019 without issues.

  • @DogeCharger
    @DogeCharger 2 years ago

    Ahhhh, I remember my 7850. I had to turn on the second power supply with a paper clip, because I was using a Lenovo prebuilt.

  • @bjornroesbeke
    @bjornroesbeke 2 years ago

    I used to have two HD 7970s in CrossFire. It feels like yesterday that I installed them in my PC.
    I've since had an R9 290 and now a GTX 1080.

  • @SweepiNetworks
    @SweepiNetworks 2 years ago +2

    Sadly, the HD 7970 was only on par with Nvidia's second-largest chip of the Kepler architecture (GK104), allowing Nvidia to sell GK104 as a high-end tier card labeled GTX 680, and to use their most powerful Kepler chip (GK110) to establish a new "enthusiast" tier with the GTX Titan.

  • @cavegamer5989
    @cavegamer5989 1 year ago

    I had both a 6750 and a 6970 in 2015-17; they were very capable even then. They ran the new Battlefront from 2015 at over 60fps, so I was happy. I had a 7970 for a few days after that, then just traded it for a GTX 1050, and the experience was just so much better.
    Now I've come full circle and have an RX 6600 XT.
    Great card.

  • @Alexsandrosla
    @Alexsandrosla 2 years ago

    I'm still bummed out that I had to buy an RX 560 as a stopgap because my 7870 died 9 months ago. Completely agree with you about prices getting worse: I paid the equivalent of 2 of my old GPUs for a 2-generation-old GPU.

  • @ytszazu2gmail381
    @ytszazu2gmail381 2 years ago

    I have an R5 240, HD 7750, HD 7770 and R9 270 as GCN 1.0 cards. Weirdly, my R9 270 failed first. The others are still working, though. Still using the R5 240 in a family computer.

  • @jelipebands1700
    @jelipebands1700 2 years ago

    Got the MSI Twin Frozr HD 7950 at Fry's, and man, that card blew me away; Skyrim and Battlefield 4 never looked and ran so good. It lasted me for a lot of years. I did upgrade to an RX 480 because they were so cheap at release. The second mining boom hit and I sold my RX 480 for more than double what I paid, and put my HD 7950 back into the system. I will never forget that card…

  • @mohammedduty8905
    @mohammedduty8905 1 year ago +1

    What is the highest monitor hertz?

  • @mordecaiepsilon
    @mordecaiepsilon 2 years ago

    You should probably consider the significance of asynchronous compute in the GCN architecture and GCN's superior API support. You need to consider these limitations of the 600 and 700 series NVIDIA cards and how AMD's 7000 series and GCN pushed NVIDIA to improve their asynchronous compute for the 900 series. As you said, AMD's shiny new architecture powered a new generation of consoles that are still with us today. Otherwise, great video

  • @benedictjajo
    @benedictjajo 2 years ago

    ah... I will never forget my 2GB 7870, which I used for 7 years. It still works, but it's getting its well-deserved rest in the storeroom.

  • @YoRHaUnit2Babe
    @YoRHaUnit2Babe 2 years ago

    I was still rocking one a year ago, but man, it died to overheating.

  • @subterficial
    @subterficial 2 years ago

    I have really fond memories of the 7970. It would be great if RDNA 3 could put them ahead again just for a while.

  • @livingthedream915
    @livingthedream915 2 years ago +1

    I find your comparisons to the Titan GPUs very difficult to stomach - almost no one had those GPUs to begin with, and the PC gaming community at large ignored those products entirely. Halo products at the far high end of product segments are usually overpriced to the point of absurdity by both GPU manufacturers (why buy a $500 7970 GHz when you can buy a $350 HD 7950 and OC it to match the $500 product?), and it was often the value proposition that caught gamers' interest.

  • @sarim5030
    @sarim5030 2 years ago

    Well well well, do I see some AI upscaling here? Really good video btw.

  • @jonathunky
    @jonathunky 2 years ago

    The 7950 is a great option if you're into Hackintosh, even to this day.
    I really like that it has two Mini DisplayPort 1.2 connectors, both supporting 4K/60 at 10-bit; it's the cheapest option on the market to do so.
    And it still supports Metal 2, OpenCL and DirectX 11.1 - basically a modern GPU, but without raytracing.
    Plus, thanks to the 3GB framebuffer, the card had plenty more life in it than the 7870 and 670 and others.
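
    The 4K/60/10-bit claim can be sanity-checked against DisplayPort 1.2's effective bandwidth (4 lanes × 5.4 Gbit/s, minus 8b/10b coding overhead). A small sketch; the pixel clock is an assumed reduced-blanking timing, not a figure from the comment:

    ```python
    # Does 3840x2160 @ 60 Hz, 10 bpc RGB fit in DisplayPort 1.2?
    lanes, gbit_per_lane = 4, 5.4
    payload_gbit = lanes * gbit_per_lane * 8 / 10    # 8b/10b coding -> 17.28 Gbit/s

    pixel_clock_mhz = 533.25   # assumed CVT-R2 reduced-blanking timing for 4K60
    bits_per_pixel  = 30       # 10 bits per channel, RGB
    needed_gbit = pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9   # ~16.0 Gbit/s

    print(f"need ~{needed_gbit:.1f} Gbit/s, have {payload_gbit:.2f} Gbit/s "
          f"-> fits: {needed_gbit < payload_gbit}")
    ```

    It fits, but only with reduced blanking; a traditional full-blanking timing would exceed the link's payload rate.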

  • @Keullo-eFIN
    @Keullo-eFIN 2 years ago

    I have an Asus HD 7970 Matrix in my 2nd PC (Xeon X5650 @ 4.1GHz, 18GB RAM etc.) and, with modified drivers, it's still way better than anyone would expect of a 10-year-old card.