Why AMD Graphics Cards are Great, but I STILL got Nvidia.

  • Published Sep 12, 2024

Comments • 1K

  • @Symba_Lysm 1 year ago +90

    The reason you haven't seen much of an increase in DaVinci Resolve support is because *proper* GPU acceleration requires you to buy DaVinci for the full asking price. I experienced this when I first used DaVinci Resolve!

    • @khaledm.1476 1 year ago +22

      damn, I just use premiere pro that I got 100% legally

    • @caen-un9cw 1 year ago

      @@khaledm.1476 I should try looking into getting davinci resolve paid totally legally

    • @JunkBondTrader 1 year ago +5

      or it requires you to pirate it. lol. 1337 :D arghh matey.

    • @diddymies 1 year ago

      @@JunkBondTrader too old version

    • @diddymies 1 year ago

      @@khaledm.1476 that sounds like you got it illegally

  • @Rudenbehr 1 year ago +8

    Gamers just want cheaper AMD GPUs so that they can get cheaper Nvidia cards. That's really all it is.

  • @AnkitKumar-lq1oh 1 year ago +263

    The thing is, I like the gaming performance of AMD GPUs, but I also do animation, where Nvidia GPUs come in handy. AMD should also focus on creative work, not only gaming. Nice video, and good to hear your opinion on the topic 👍

    • @jay_arson5075 1 year ago +2

      3d animation or 2d?

    • @AnkitKumar-lq1oh 1 year ago +18

      @@jay_arson5075 I do 3D in Blender, and RT cores really help a lot in rendering (see the sketch at the end of this thread)

    • @deuswulf6193 1 year ago +41

      @@AnkitKumar-lq1oh Yup, even the base-model 3060 will run laps around the 7900 XTX when it comes to Blender renders. In a sane world, that should never happen.

    • @dontsupportrats4089 1 year ago +5

      You know more about my wife's job than I do. She prefers nvidia for her work.

    • @nathanlarson6535 1 year ago +22

      if the majority of people who use blender own an NVIDIA GPU, what's the financial incentive for specifically optimizing for AMD and intel arc users? this is the main obstacle that AMD has to overcome - the developers of these productivity applications simply don't care, and that's why NVIDIA basically has a monopoly in that sense.
      if fewer customers own an AMD GPU, there are fewer reasons to optimize for AMD, which leads to even fewer people buying AMD, which means there are even fewer reasons to optimize for AMD, etc. it's the feedback loop that is killing the GPU market.
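
A minimal sketch of the device switch this thread is about, using Blender's Python API (run it in Blender's scripting console; "OPTIX" and "CUDA" are the Nvidia backends, "HIP" the AMD one):

```python
# Point Cycles at a GPU backend. OPTIX uses Nvidia's RT cores,
# CUDA is the older Nvidia path, HIP is the AMD path.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"   # try "HIP" on a Radeon card
prefs.get_devices()                   # refresh the detected device list
for device in prefs.devices:
    device.use = True                 # enable every detected device

bpy.context.scene.cycles.device = "GPU"
```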

  • @testtube173 1 year ago +204

    Ray tracing has been the biggest marketing ploy of the last 10 years. It is barely functional in a majority of its applications, and the games it does work well in are so few and far between that it's just not worth the insane margins they are squeezing out of gamers with these 800-2000 dollar cards. All the development time that has gone into this technology could have been used to make rasterization even better, but here we are, struggling to get 60 fps in most games with RT.

    • @GregoryShtevensh 1 year ago +9

      Yeah, yeah... whatever helps you sleep at night, mate

    • @arenzricodexd4409 1 year ago +4

      No matter how good rasterization is, some lighting effects are difficult or even outright impossible to mimic.

    • @mitchelldittes2555 1 year ago +11

      I mean, nvidia doesn't ONLY have RTX to sweeten the deal. Include Reflex, DLSS, better drivers and software support (imo), and now a new addition, VSR, along with whatever they may have in the future, since new techs are constantly coming out now, and it becomes far more subjective. If you want those features, an extra 150 bucks on top of an already $1050 minimum card is honestly pretty justifiable. This isn't to say it's all good value; this is to say that AMD has the same value issues with this gen of cards.

    • @GregoryShtevensh 1 year ago

      @@mitchelldittes2555 especially when you consider the bad power usage, the vapour chamber issues, drivers bricking Windows, people asking to swap their 7900 cards on Marketplace, and the sales that bring the 4080 down to the same price as the 7900 XTX

    • @03chrisv 1 year ago +8

      We're still in the growing pains stage. RT is needed if we're going to push graphics further and further. RT is also so much easier for devs to use than rasterization: no more needing to pre-bake lights after every build (which can take hours). This will give artists and designers more time to develop more interesting things rather than sitting idle waiting for the lights to bake (a toy sketch of the difference follows this thread). Once RT is the standard and hardware has caught up to where even a cheap laptop APU can run RT without issues, it'll be fine.
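
To make the baking-versus-tracing point concrete, here is a toy Python sketch (not any engine's API; the scene is a made-up sphere blocker) of the query both approaches answer, "is this point shadowed?". Baking evaluates it once per texel offline; a ray tracer answers it per pixel, per frame, in hardware:

```python
import math

LIGHT = (0.0, 5.0, 0.0)
BLOCKERS = [((0.0, 2.5, 0.0), 1.0)]   # (center, radius) occluder spheres

def shadowed(p):
    """Cast a ray from p toward the light; True if a sphere blocks it."""
    to_light = tuple(l - q for l, q in zip(LIGHT, p))
    dist = math.sqrt(sum(c * c for c in to_light))
    d = tuple(c / dist for c in to_light)           # unit ray direction
    for center, radius in BLOCKERS:
        oc = tuple(q - c for q, c in zip(p, center))
        b = sum(o * u for o, u in zip(oc, d))
        c0 = sum(o * o for o in oc) - radius * radius
        disc = b * b - c0                           # ray/sphere quadratic
        if disc >= 0 and 0.0 < (-b - math.sqrt(disc)) < dist:
            return True
    return False

# "Baking": evaluate once per texel offline (hours for a real scene).
lightmap = [shadowed((x * 0.5 - 2.0, 0.0, 0.0)) for x in range(9)]
print(lightmap)   # a ray tracer runs the same query live, every frame
```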

  • @hendrixc6988 1 year ago +28

    Went from gtx970 to a750. Threw the 970 in an hp prebuilt and it’s an emulation and multimedia pc for the living room. I knew intel would be kinda iffy at first so I got the a750 to hold me over until battlemage.

    • @vextakes 1 year ago +8

      Gl my dude, I’d love to try arc too

    • @ovemalmstrom7428 1 year ago +1

      This me like! 👍

    • @detroid89 1 year ago +3

      Respect for taking the risk on a shaky product like that. To be fair to Intel, they have come a long way with the development and improvement of their Arc series.

    • @Max-fw3qy 1 year ago

      Lol, what a DUMB decision.....

    • @hendrixc6988 1 year ago +1

      @@Max-fw3qy Found the nvidia fanboy.

  • @witekborowski1410 1 year ago +68

    I bought a new 6700 XT last summer, just as the prices settled down after the crypto crash. Back then it cost the same as 3060 Ti (at least here, in Poland), and I went with the red team just because of the VRAM capacity.

    • @Strongholdex 1 year ago +9

      Good choice.

    • @Axeiaa 1 year ago +22

      Even better choice nowadays, as the 6700 XT price undercuts the 3060 Ti (at least in the UK and NL).
      + Better raster performance
      + More VRAM
      + Cheaper
      - No CUDA acceleration (if you don't use programs that support it not really a con)
      - Worse raytracing support (although in this class of GPU raytracing gimps performance so hard you don't want it enabled anyhow)
      I had to pick between those two as well, as that's what was within my budget, and went for the 6700 XT - no regrets. It still entertains me that I can now play StarCraft 2 without the graphics card fans even spinning haha.

    • @evalangley3985 1 year ago

      Who is laughing now... you are my friend...

    • @idkwhatimdoin2298 1 year ago

      Same. I built my first PC in January of this year, and I was torn between the 3060 Ti and the 6700 XT; both were around the same price at the time I bought the components, $350 IIRC. But since I really don't care about ray tracing and whatever Nvidia decides to focus on more with their GPUs, I ended up going with the 6700 XT, and boy do I love it. I can record and play games at the same time with no issues, the Adrenalin software helps a lot when I want to check my performance, and it gives me the option to record, which comes in handy whenever I forget to open Outplayed (yes, I use this to record my gameplay/highlights). I'm satisfied with the GPU and my PC overall.

    • @thatguy3000 1 year ago +2

      Just built my PC in January of this year and really wanted to go with the 6700 XT, but it was like 200 USD more than the 3060 Ti. Ended up going with the 3060 Ti. Still would, because the performance increase isn't really worth it in my opinion. Most games still use 8GB of VRAM or less, and those that do go over it still give a smooth experience. Just shitty games like TLOU Part 1 are unstable, but thank God I didn't buy that and actually checked the reviews. Also gonna go with AMD if possible next time; Nvidia really did us dirty by giving the 3060 Ti and 3070 only 8GB of VRAM. At this point, everyone who sold their 2080 Ti for the 3070 is just gonna feel scammed.

  • @Revoku 1 year ago +51

    I bought AMD for the first time in years, because the potential for overclocking the 6800 XT well above its pay grade looked great. I had heard about all the AMD driver issues and problems, and noticed most had workarounds (being a tinkerer, none of this bothered me, so I figured why not).
    Got it, overclocked it, and was not disappointed in the overclock at all (it performs better than anything I could have afforded on the Nvidia side, by a HUGE margin), but the first two weeks were a nightmare of working through crashes, bugs and issues that everyone put down to drivers, each with various workarounds such as turning off hybrid hibernate, etc.
    Turns out none of the issues were driver issues at all. They could be reduced or removed by driver tweaks and software workarounds, but that only got rid of the symptoms, not the actual problem.
    The problem all along was the DisplayPort cable quality/spec. The cables that come with most monitors are version 1.2 and don't actually meet the 1.2 spec; on top of this, most cables you can buy don't actually meet the spec they were rated at (unless you find a VESA-certified cable).
    Why is this a problem? Refresh rate lock. If you have an out-of-spec cable, the card can have problems locking to the refresh rate of your monitor correctly: it will lock to a random refresh rate such as 15Hz even though Windows is showing 144Hz, or it will fluctuate between certain points, 100Hz-148Hz for example.
    It also caused problems such as a black screen after a shutdown (Windows 11 by default does a hybrid sleep instead of an actual shutdown): if the cable instability caused an issue on wake-up, you got a black screen, or stuttering on the desktop after a "cold" boot when it had been off for a while, and the only fix was a reboot. If you forced a game to run in this state before restarting, it would usually crash.
    How did I test and find this out? Accidentally, while checking my monitor for ghosting: the UFO ghosting test site has a refresh rate indicator at the bottom (see the sketch at the end of this thread for another way to check).
    What does not having a stable refresh rate do? It causes random black screens, random game crashes, stutters and other issues.
    A new VESA-certified 1.4a cable fixed all of these issues. The only two it didn't fix were video stuttering on a second monitor while playing a game on the primary high-refresh-rate monitor, and AMD live recording with HEVC (both were known issues listed in the drivers, and got fixed in later driver versions)

    • @GhOsThPk 1 year ago +4

      I can confirm this. When I bought my 6900 XT I also got a good-quality HDMI cable, but some time ago I had to use a random poor-quality cable for a week (because reasons), and that was when I faced my first problems with the card, which I'd had none of up to that point.

    • @ricaeldocarmo5005 1 year ago +7

      I always bought Nvidia, but I bought an RX 6600 a while ago and had a BUNCH of bugs, crashes and stuff. I know there are workarounds and all, but fuck it: I bought a card and have to fix it before using it? Hell no. Sold it and bought another Nvidia one. AMD never again.

    • @RafitoOoO 1 year ago +3

      @@ricaeldocarmo5005 Typical potato who should have bought a console instead of a PC.

    • @nathanlarson6535 1 year ago +14

      i feel like many people that complain about AMD “driver problems” tend to have a DP/HDMI cable that isn’t good, which leads to screen flickering and sometimes the screen turning black completely due to insufficient bandwidth through the cable in question. maybe not the majority of users who experience “driver problems” but i feel like some people just automatically blame AMD and blame the GPU itself because they don’t know what else could be wrong. it sucks that buying the right cable can be confusing but that’s just the way it is i guess. nothing is perfect.

    • @doraafelfedezoofisol 1 year ago +3

      @@RafitoOoO Nah, people just wanna play a game at high fps and not fuck around with badly optimised hardware
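
For anyone who wants to reproduce the refresh-rate check above without the UFO site, here is a small sketch that asks Windows which display mode it thinks is active (assumes the pywin32 package; compare the printed rate against what your monitor's OSD reports):

```python
# Read the mode Windows believes is active; a flaky DisplayPort cable
# can make the real panel rate diverge from this or fluctuate.
import win32api
import win32con

mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"{mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency} Hz")
```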

  • @tech6294 1 year ago +110

    I bought a 3060 Ti and I'm regretting the 8gb of vram for the price I paid. When AMD's 8000 series comes out I'm gonna switch over. Or if ARC Battlemage is really good I might try that as well. I've owned ATI/AMD cards before (9600, x700, 5870, 6990) and with Nvidia skimping VRAM I think it's high time I switched. We had 8gb back in 2015. Nvidia is just holding the market back just like Intel did with quad cores for a decade. I bought an AMD 2600, then the 5900x I have now because I was done with Intel. I'm also now DONE with Nvidia for a while.

    • @simon6658 1 year ago +8

      The RX 7600M XT only has 8GB of VRAM, which means the Navi 33 GPU only has a 128-bit memory controller (see the back-of-envelope sketch at the end of this thread). The 7600 XT, which should have similar performance to the 3060 Ti, will also have 8GB of VRAM, so there is no need to regret it. Do you remember how AMD deleted their comments that VRAM is important in the 5500 XT advertisement and then released the 64-bit 4GB 6500 XT? AMD doesn't care about VRAM. The 7900 XTX and 7900 XT were meant to compete with the 4090 and 4080, so they have 24 and 20GB. But RDNA 3 is too slow to compete with Ada, so AMD can only lower the price to 4070 level to sell them. That's where AMD's VRAM advantage comes from.

    • @X862go 1 year ago

      What, those also have 8GB 😅

    • @Varue 1 year ago

      I would be tempted by battlemage as well, but drivers may steer me away.

    • @enzomercier2789 1 year ago +4

      Why would you be unhappy with 8GB of VRAM on a 60-series card? It's not like you're going to have the performance to play 4K AAA with it anyway, even if you had the VRAM.
      I do agree that Nvidia is a bitch and should really put more VRAM in their GPUs, especially the 70 and 80 series, but let's be real, it doesn't make any difference in your specific case

    • @khaledm.1476 1 year ago +1

      Yeah, I am not going Intel till at least Celestial, because their drivers won't be fully stable yet. If I rate AMD/Nvidia, they're like a 9/10 on stability, while Intel seems like a 3/10; even if it jumps to a 6/10, that's still a lot lower.
      The problem is AMD isn't super competitive; like Vex said, they're just matching Nvidia. If you're matching Nvidia and I am used to Nvidia, why would I switch to you? Give me a valid reason, AMD: be super aggressive on pricing, create brand-new technologies instead of following Nvidia's lead, do something cool for once
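
The 128-bit-bus-means-8GB link mentioned above is simple arithmetic: each GDDR6 chip occupies a 32-bit channel, and 2GB per chip is the common density. A back-of-envelope sketch (the chip figures are assumptions about typical parts):

```python
# Why a 128-bit bus usually lands at 8 GB of VRAM.
bus_width_bits = 128
channel_bits_per_chip = 32   # GDDR6 chips attach on 32-bit channels
gb_per_chip = 2              # 16 Gbit parts, the common density

chips = bus_width_bits // channel_bits_per_chip
print(chips * gb_per_chip, "GB")   # 4 chips * 2 GB -> 8 GB
# Doubling chips per channel ("clamshell" mode) would give 16 GB instead.
```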

  • @kushy_TV 1 year ago +99

    I recently bought a used RX 5700 XT for £150/$185 after being on Nvidia cards exclusively for around 9 years, upgrading from my aging 980 Ti, and I'm glad I did. This thing is a beast and can actually play Halo Infinite at good fps, unlike the 980 Ti.

    • @idhamproaqw98 1 year ago +1

      Neat, one of my buddies got one for $125, no box though

    • @Strongholdex 1 year ago

      That is a screaming deal.

    • @simon6658 1 year ago +4

      The 5700 XT is $100 in Hong Kong now. It is a good deal. But I think the 1080 Ti is better: it has 3GB more VRAM than the 5700 XT, and VRAM has become a big problem in games now.

    • @3s0t3r1c 1 year ago +2

      @@simon6658 no it does not. LOL

    • @simon6658 1 year ago +6

      @@3s0t3r1c Then buy the 4GB 6500 XT and 8GB 3070😅.

  • @Retro_Mage 1 year ago +12

    Don't forget to use something like DDU (Display Driver Uninstaller) to completely uninstall your old drivers before installing your new ones, even if you're staying on the same Nvidia or AMD side, as leftovers can still cause problems.

    • @MrKsan05 1 year ago +2

      Be careful if you're like me and don't know anything about DDU: I deleted everything, even my motherboard drivers, and it took a while to get everything right again.

    • @Kevlord22 9 months ago +1

      I usually format the drive when I get a new GPU or even a CPU. I don't really keep anything on it that I can't re-download anyway.

  • @heksogen4788 3 months ago +3

    It's cute how AMD-sponsored creators are pissing in their audience's face, and the audience is like "oh, what a refreshing rain".

  • @curtin_ 1 year ago +61

    I've got a 6800 XT and have 0 issues. I upgraded from a 2080 Super and I couldn't be happier. Honestly my biggest fear was the drivers, but I was so wrong. Drivers have always been stable for me, and with it having 16GB of VRAM I don't have to worry about it not being able to run things at good fps while looking amazing

    • @Strongholdex 1 year ago

      Do you have stutters when you play a game in its first minutes?

    • @nightdark2616 1 year ago +12

      @@Strongholdex AMD has this feature where, if you play a new game, it will read and adjust its drivers to the game for exactly 10 minutes until it stabilizes and then runs fine. But only the first time you play a new game, and only for those ten minutes. It is a feature, not a bug.

    • @bmwofboganville456 1 year ago +3

      The 2080 Super is just as good as the 6800xt if you use DLSS Quality...

    • @GrandoSilver 1 year ago +5

      @@bmwofboganville456 so the 6800 XT can use FSR? Huh?

    • @s17benjamin 1 year ago +8

      @@D3nsity_ I would take this optimisation over the Nvidia driver-update BS they have been pulling the past few months; it's been really shit compared to AMD's side

  • @GDsergio 1 year ago +5

    I was in the same boat back in December: I was trying to choose between a 3080 or a 6900 XT, both new, but then a sudden price drop on the 6950 XT left it at $670. I ended up taking the risk with AMD and couldn't be happier. Seeing all the sudden VRAM issues popping up with new releases, I'm so happy with my 16GB of VRAM; plus, since I still don't care for ray tracing, I got myself an absolute beast of rasterization performance.

    • @ForEverTon78 1 year ago

      Nvidia's 8gb cards won't last as long as AMD's 12gb ones. Not everyone is able to upgrade on a frequent basis. I've made the same choice as you did.

    • @ForEverTon78 1 year ago

      @Thomas B I'm using my 10-year-old 750W Cooler Master PSU with the RX. Except for ray tracing, there's no comparison with AMD's cost-benefit. I am really happy with my choice.

  • @BarisYener 1 year ago +2

    DaVinci Resolve's free version doesn't come with full CPU and GPU encoder/decoder support; you need the Studio version for full hardware acceleration. It's not a problem with the Nvidia drivers.

  • @Akizurius 1 year ago +5

    I think buying an Nvidia card right now is like paying extra just for ray tracing and CUDA. I have been using Nvidia cards for years and I have been happy with them, but at current prices AMD just gives a much better deal. I live in Poland, and the RTX 4080 is about 20% more expensive than the RX 7900 XTX here at the moment, and prices haven't changed much in the last few weeks. Raster performance is roughly the same on those two cards, so paying 20% extra for ray tracing and CUDA seems like a bad deal. It's not like the RX 7900 XTX can't do ray tracing either; I've seen benchmarks showing its RT performance is still above the RTX 3090 in most games. Performance in productivity apps varies; in some AMD wins and in others Nvidia is better, so it's a very case-by-case thing.

    • @user-dh7sm9zh9e 10 months ago

      New technologies are bound to improve and become easier to run.
      Hating on RT and DLSS is just following a hivemind.
      I think Nvidia is just taking a different path than AMD, and it's not yet worth getting for the performance.
      Video game devs are a lost cause though, can't defend them not finishing their games lol.

  • @michaelmaness5493 1 year ago +1

    Ball caps are not to hide your hair, or lack of it. For those of us who are follicly challenged, they keep you warmer in the winter and keep you from getting sunburned in the summer.

  • @go_gonu 1 year ago +12

    Actually, the driver issue was huge for me. I used a 6800 XT for 3 months with all other parts brand new, incl. a 13600K CPU. All Steam games were doing fine; it just had issues with online games, specifically with Riot games, which I play the most: random freezes, crashes and frame drops. After looking it up, it turned out to be a common issue in the Korean computing community, as well as in a lot of forum posts on Reddit. Lost Ark is another game that had the same issue, from what I found.
    Fed up with the crashes and lags, I jumped ship to a 4070 Ti, selling the 6800 XT at 40% lost value, but it was the best decision I made imo...
    On paper, YES, AMD has much better cost value, but in practice it is what it is, and there is a reason why they have such a reputation..

    • @Clashy69 1 year ago

      I wouldn't say they have a reputation for bad drivers, or at least not anymore. It's not how it was 5-10 years ago; now most of the time the drivers are good and age pretty well, and if there is a bad driver they at least acknowledge it. Nvidia also has bad drivers: the most recent issues were high GPU/CPU usage when idle, and then the Diablo 4 beta blowing up Nvidia GPUs

    • @arenzricodexd4409 1 year ago +3

      Yes, the MMO genre is one that AMD has ignored for a very long time.

    • @go_gonu 1 year ago +7

      @@Clashy69 I was referring to my personal experience, and there were a lot of people sharing the issue. This was this year, and the people who shared it were all within a 2-year period. My problem was immediately fixed when I switched to Nvidia, so I can only judge from what I've seen.
      Radeon did work like a charm in Hogwarts Legacy and RDR2, but it just sucks with games that constantly update, like League. It is what it is, mate.. I used both. Don't care what fanboys say

    • @Clashy69 1 year ago

      @꺼뉴TV Not a fanboy, just stating the facts: both have their pros and cons, and it's just a preference now

    • @parkersgarage4216 1 year ago +1

      I tried a 6650 XT myself and had driver issues with it as well. The moment I switched back, no more issues. Like you, I tried and won't try again. I returned it and bought a used 4080 Strix for 1100. I'm looking forward to trying an Intel GPU; I wanna see how Battlemage does.

  • @slavicodyssey 1 year ago +5

    I'm building a PC from scratch, and the reason I'm planning on getting a 4070 Ti is that I also plan on doing productivity work on my PC. Nvidia still does a better job than AMD with that.
    The 4070 Ti is also the most I want to spend on a GPU, so stretching to a 4080 is out of the question. Also, a lot of 30-series cards didn't budge in price, so they are priced similarly to their 40-series counterparts here in Germany, except the 3070/3070 Ti, but I want more than 8GB of VRAM.

    • @mishea13 1 year ago

      I would research that card before you buy... many are saying Nvidia has screwed that card up and it performs poorly. Or maybe it was the 4060? Can't remember right now... but I would still research this. Cheers.

    • @slavicodyssey 1 year ago +1

      @@mishea13 The 4060s were the mistake; the 4070 is probably the only one that is "worth it", since it's a 3080 with 12GB of VRAM, DLSS 3 and better power efficiency. Plus it's not that big as far as new GPUs go.
      The 4070 Ti is fantastic in performance, but spending almost 1k on a GPU with 12GB of VRAM didn't sit well with me. In the end I just bit the bullet and got the 4080.

  • @Accuaro 1 year ago +6

    I also use DaVinci and Premiere with a 6900 XT, and it works very well. Steve from HUB also uses a 6900 XT in his main machine with Premiere. As for what games I play, it's very similar to you: Overwatch and some indie games. It's been great at throwing out frames for my 360Hz monitor.

  • @pondracek 1 year ago +3

    The only reason I upgraded past a 1080 Ti is VR.
    AMD didn't show up to play for like the first 4-5 years of VR existing (the CV1 and Index released during the GTX 980 era), so a lot of the older classics just work better on Team Green. Which, granted, is less of an issue with today's GPU hardware on yesteryear's VR titles. But for anything roomscale, I don't want a wire, and NVENC + Reflex (+ Virtual Desktop) is basically uncontested for that use case.
    I'm surprised nobody else has mentioned it, but AMD's OpenGL performance on Windows is also atrocious. It's not a big deal for me, only really making a difference when emulating PS2 titles, but if I was into PS3 or Switch emulation (specifically the community that has sprung up around Breath of the Wild), I would be hard on Team Green's side for that reason.

  • @BryanGarcia-qy4qg 1 year ago +12

    For me, software-wise, I was having issues with my RX 6900 XT. In my situation I have three monitors, one of which is an ultrawide and another 240Hz (the third monitor was for Spotify/Discord). For some reason my graphics card acted weird and glitchy from time to time. I tried different sets of cables, reinstalled Windows and upgraded to a top-of-the-line SSD. Nothing solved it, so I ended up getting rid of it. Ended up buying an RTX 4070 Ti and haven't had issues. Maybe I got unlucky with my AMD card. These days I just tell people to buy whatever works for them.

  • @pritamraya5295 1 year ago +2

    As a Blender user, I can't recommend an Nvidia GPU enough. Even if I could get AMD at half the price of a similarly performing Nvidia card, I'd still get Nvidia. AMD is way out of its league in 3D production.

  • @Astravall 1 year ago +3

    Well, CUDA and other proprietary stuff like G-Sync and DLSS are the reasons I refuse to buy Nvidia again (and I bought Nvidia for a long time, ever since the Riva TNT). It's not like AMD cards are bad at productivity apps, but Nvidia managed to stuff CUDA into that market, and for whatever reason a lot of productivity apps refuse to provide OpenCL support (see the sketch at the end of this thread).
    I want manufacturers to compete on good products, not on locking out the competition with proprietary features.

    • @arenzricodexd4409 1 year ago

      Open source is a double-edged sword. Just look at what happened with OpenGL: it ended up behind DirectX in development because OpenGL had to satisfy everyone on the consortium. In fact, Khronos ended up separating Vulkan from OpenGL altogether to avoid the API being held back by professionals who wanted their open-source API to stay conformant with their legacy stuff. OpenCL as a compute API is more important in the professional space, and in that space time is money, so some IHVs cannot let themselves be held back by API development that is slowed down because Khronos is looking for a solution that satisfies everyone. So Intel ended up going with their oneAPI, Apple with Metal, and AMD with their ROCm.
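
What "OpenCL support" buys in practice is vendor-neutral device discovery and compute: the same host code can target Nvidia, AMD or Intel. A minimal enumeration sketch, assuming the pyopencl package is installed:

```python
# List every OpenCL platform/device the installed drivers expose,
# regardless of vendor.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        vram_mib = device.global_mem_size // (1024 * 1024)
        print(f"{platform.name}: {device.name} ({vram_mib} MiB)")
```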

  • @markusbronst7942 1 year ago +2

    For professional video editing at 4K and higher there is currently only the 4090 option, due to the 4080 having 16 GB of VRAM. The YT channel Tech Notice covers topics for content creators. He reviewed the 7900 XTX and found that there have been significant improvements, but with certain effects it's not there yet. He reviewed the 4070 Ti as well... VRAM limitation.
    When it comes to Blender rendering, Nvidia cards are better. But big studios more likely use the Nvidia and AMD pro lines for that type of work, and big studios have render farms that can be utilized. Nvidia has the RTX 6000 Ada, which has 48 GB of VRAM and sells for about $6,800.
    What AMD does better than Nvidia is one centralized application for everything, whereas Nvidia's has an outdated look, requires two applications, and forces you to sign up online. I've used the screen recording in AMD's software myself and it works very well. I don't use OBS anymore, whose initial setup isn't the most user-friendly for first-timers. A YT channel that historically used Nvidia tried AMD and was positively surprised concerning streaming and desktop recordings.

  • @Flamingo4280 1 year ago +15

    I just recently upgraded from a GTX 1080 to an RX 6800 XT, and I found that it was an incredibly smooth transition! I had considered buying an Nvidia card, but in my market even a 3070 Ti was selling for more than a 6800 XT, so I would have had to be crazy to buy Nvidia. This thing is a beast, and it's particularly strong against the older 1080 in the games I like to play the most right now (Forza Horizon 5, RDR2, Monster Hunter World). I'm not sure what the situation was like at launch, but I'm confident that anyone buying AMD right now would not have any issues. Honestly, I had way more driver issues with my old RTX 3060 laptop, so Nvidia definitely isn't perfect either

    • @adammartin4906 1 year ago

      Don't listen to these fanboys. Who cares about a creator card? AMD is for gamers; Nvidia screws gamers over, FACT. And ray tracing is in the gimmicky stage and will be for a good 10 years, so once again AMD is really the only choice right now, unless you're so rich you can buy anything. But back in the real world, where people have budgets, it becomes so clear AMD GPUs are the clear choice here, period

  • @9myke 1 year ago +1

    Can you do a mix of b-roll (when you have some), gameplay and face cam? Absolutely loving the content, by the way

  • @Gatitasecsii 1 year ago +3

    What are you even talking about? The Witcher 3 footage you showed was day and night (1 and 2 respectively) in regards to ray tracing. What is wrong with you.

  • @BertieJasokie 1 year ago +1

    You need to keep a good balance of video and narrative input, not too long on the face and no excessive lingering on shots that are sometimes irrelevant.

  • @Ivan-pr7ku 1 year ago +5

    My last ATi/AMD GPU was a Radeon HD 5870 that I bought back in 2009, just a month after its release -- it only reminds me how much more affordable high-end hardware was back then. Anyway, two years later I got a really good deal on a used GTX 570 by Palit, only to end up with a burned VRM due to a faulty vapor chamber a couple of years later (no more Palit for me), but I've stuck with NV hardware for various reasons to this very day with my RTX 4080 (which was my first brand-new GPU since that Radeon 5870, btw), although for my needs I could have switched to AMD. We are all creatures of habit, I guess.

    • @generalawareness101 1 year ago

      My last AMD card was the 7870, and it had one bad fan that let it cook the GPU (before throttling was a thing); it got funky, so I had to upgrade, as it was locking me up and doing weird stuff. I purchased a 1060 6GB and am still using it, but now is the time in my upgrade cycle, and these prices are insane. I use CUDA, and I mean I use CUDA heavily, for AI, so I feel like I am Jensen's beoytch. I am leaning towards a 7900 XTX, but ROCm hasn't been released in a version that lets this generation work: we need 5.5.0, while 5.4.3 is the last version released (a ton of pissed-off people on their GitHub). I wish I could find the numbers for training and for generating on a 6700 XT (for instance) so I could roughly place where a 7900 XTX would fall, but no dice (see the sketch after this comment for checking which backend you actually get). I think with the AMD stigma of no CUDA, everyone is on Nvidia, since everything just works hassle-free with Nvidia for AI/ML like Stable Diffusion and the Kohya trainer.
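
For what it's worth, PyTorch's ROCm builds expose Radeon cards through the same torch.cuda API as Nvidia's, so verifying which backend you actually got is the same few lines on either vendor (a sketch assuming a CUDA or ROCm build of PyTorch is installed):

```python
# torch.version.hip is set on ROCm builds, torch.version.cuda on CUDA ones.
import torch

if torch.cuda.is_available():
    backend = ("ROCm " + torch.version.hip) if torch.version.hip \
              else ("CUDA " + torch.version.cuda)
    print(backend, "-", torch.cuda.get_device_name(0))
else:
    print("No supported GPU backend found")
```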

  • @chamorro81 1 year ago +2

    Good for you, bro. I say get what works for you. I picked up the XFX 6800 XT Merc and I'm super happy with it. It's crushing all of my games at ultrawide 1440p with no issues at all. Games and hardware nowadays are amazing.

  • @bartdierickx4630 1 year ago +3

    During the 770 era and the 1070 era AMD was competitive in the mid-range. If I remember correctly, both the 280X (which competed with the 770) and the Vega 56 (which competed with the 1070) were more performant for less money.

  • @TheLordNugget 1 year ago +5

    I currently have a GeForce GTX 1080 TI. It's a solid performer still, but I am looking at what options there will be for upgrading. I'm likely going to be looking at the next generations of cards. Hopefully the cost comes back down to earth a bit. NVidia has been anti-consumer for a while. Their actions have me wanting to move away from their gpus. Intel was a little disappointing. Looks like the hype for their next generation is starting to bubble up. Most of my decision on what card to go with will come down to what everything costs and their respective performance. I don't give a rat's azz about ray tracing so NVidia has no advantage with me.

  • @derschnuff8819 1 year ago +6

    I would love to have an AMD card, but I have to admit Nvidia GPUs simply have better, more reliable support for applications. That's also the reason why I stay with the green ones, although the AMD cards have more bang for the buck. They need to catch up. The 6800 XT would be a great pick, but besides being out of my possibilities to buy atm, AMD is just not very good at Blender rendering, and the company does not have a history of continuous support for such applications.

    • @evalangley3985 1 year ago

      That's such a load of crap... reliable my ass... My EVGA 1080 FTW was RMA'd twice in 10 months. I sold the third sample to a sucker like you that drank the Kool-Aid...
      th-cam.com/video/kxoXbfzP5BU/w-d-xo.html
      th-cam.com/video/or7njUlYTLI/w-d-xo.html

    • @Fedreal_Bureau_Of_Investigaton 1 year ago

      Blender rendering? Just by switching to Linux you halve your rendering time

  • @dazeen9591 1 year ago +2

    I hope people keep buying Nvidia instead so AMD prices have to keep dropping.

  • @erazzed 1 year ago +3

    This is a pretty interesting topic, and I feel like a lot of it has to do with feature economy. Nvidia products are highly driven by features (e.g. G-Sync, RT, DLSS, Reflex, NVENC, etc.) and a wide range of industry support for productive use. While Nvidia is innovating and engineering new features, AMD is trying to catch up most of the time, and even when AMD follows up with their own implementation, it's almost always slightly inferior to Nvidia's (image quality in DLSS vs. FSR, performance in RT, ...).
    For me, personally, this creates some kind of fear of missing out (FOMO): what if AMD's implementation is worse than Nvidia's in a new game, or what if AMD's implementation is just not supported by this game? What if Nvidia invents a new feature that they make available for older cards as well (Nvidia hasn't in the past, but... what if they do the next time?)? And then there's also the fact that Nvidia is just better at working with/paying game devs to make them implement their "awesome new feature" (e.g. Reflex for competitive games). Maybe it sounds crazy, but these are the considerations that almost always drive me back into Nvidia's arms. I hate it and I would love to switch to AMD, but... there's always this "but..." that makes me hesitate and question whether AMD would be a good choice or not.

    • @Adri9570 10 months ago

      Seriously, it is about time for some tech youtuber to do a clear, complete list/comparison of the set of features you miss when you choose AMD or Nvidia. It would make it waaaay easier to switch safely from one brand to the other, and it would promote the better brand with something more than "mere intuition" or just "fear of missing out".

  • @primary7208 1 year ago +1

    I swapped from a 3080 Ti to a 6950 XT and it's insane how much better it is for gaming; I'm getting 200+ fps at 1440p in MW2 and 300+ in Apex

  • @musek5048 1 year ago +5

    I took a gamble on a couple of used "not working" 6800 XTs recently, got them for $200 each, and after DDU-wiping all my drivers and installing the latest ones they are both working nicely, with a slight undervolt to keep temps down a tad. The seller said that they would crash under load or stress testing, which made me wonder if he was using older drivers or something. At $500 brand new, though, they still seem a bit pricey, but that's the market lately.

    • @adammartin4906 1 year ago

      they need the thermal paste replaced...common issue on Radeons

    • @musek5048 1 year ago +1

      @@adammartin4906 I plan to do that soon, but I've just been running these cards to see if they run into crashing issues at all, and so far they're chugging along nicely. I do plan to also replace the thermal pads once I crack them both open.

  • @patiszejuicebox 1 year ago

    I had the opportunity to try newer AMD a short while ago, and my experience was all over the place. For better context: I game, stream, photo edit, and video edit.
    I upgraded from an i7 8700K to an i9 12900K and wanted a more powerful GPU, as I wanted a card that could do both 1080p and 1440p, which my 2060 at the time could not. Bought an RX 6800 non-XT for $500.
    Newer games: it was fine. Anything that's DX12 or Vulkan ran pretty smooth. Older games: inconsistent. Anything I ran that was DX11 or older varied, with issues from weird artifacting to being completely unplayable due to constant stutters, frame drops, and freezes.
    With streaming, the AMF encoders have gotten a lot better with the latest OBS updates, but I found they dropped frames a bit more frequently compared to NVENC and Quick Sync. As a result, I primarily stuck with Quick Sync (see the encoder sketch after this comment).
    Unfortunately I wasn't able to test it with DaVinci Resolve while I had it.
    After 3 weeks of owning it, I returned the 6800 and bought an RTX 3070. I'm glad I got to try AMD, and I really wished it had worked out for me, but the small headaches piling into one bigger headache were the nail in the coffin. It's personally very hard for me to recommend AMD given my experience.
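
The encoders being compared there are also exposed through ffmpeg, which makes for an easy A/B test outside OBS. A sketch assuming ffmpeg is on PATH and was built with these encoders ("sample.mp4" stands in for any test clip):

```python
# Transcode the same clip with each vendor's H.264 hardware encoder:
# NVENC (Nvidia), AMF (AMD), Quick Sync (Intel).
import subprocess

for encoder in ("h264_nvenc", "h264_amf", "h264_qsv"):
    result = subprocess.run(
        ["ffmpeg", "-y", "-i", "sample.mp4",
         "-c:v", encoder, "-b:v", "6M", f"out_{encoder}.mp4"],
        capture_output=True,
    )
    print(encoder, "ok" if result.returncode == 0 else "unavailable")
```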

  • @LiquidSnake690 1 year ago +5

    I decided to get an RX 7900 XT over the RTX 4070 Ti despite the RTX 4070 Ti having better ray tracing performance and DLSS because I figured I would benefit more from the RX 7900 XT's 20 gigs of vram in future games.

    • @weirddylan7721 1 year ago +1

      And FSR 3.0 will come out this year

  • @neverenoughguitars8276 1 year ago +1

    I chose Nvidia because of Iray for Daz Studio. AMD doesn't have any 3D acceleration in that app, sadly, so Nvidia is the only option.

  • @NiceSk1ll3r 1 year ago +9

    Used a 1060 6GB since 2017 and this year upgraded to a new 6800 XT (wanted to be safe and have a warranty). Been using it for more than a month now and haven't encountered any major problems. Little nitpicks here and there, but nothing so serious that I'd consider returning it.

    • @thatguy3000 1 year ago

      I mean, they pretty much fixed all the issues. AMD takes like 2 or so years to fix their drivers on these cards. Look at the 6700 XT for example: it used to go toe to toe with the 3060 Ti but can now surpass the 3070 in raw performance (when not VRAM-limited) in every game. You just bought the GPU at a nice time is all. The 7000 series is going through issues of its own now though.

  • @wackynwild980 1 year ago

    Like the content, Vex; a video randomly popped up in my recommendations when I was watching PC videos. Definitely surprised your channel isn't bigger. You got a sub, brother. Yeah, I want an AMD card, but I want ray tracing too, and I chase the most powerful stuff on the market... so Nvidia will likely always be my go-to. Ray tracing cores, man; I've always wanted a PC that could run that shit, and now it's just becoming a thing where we can. Edit: I will say I'm more of a Team Red fan than I ever was though, cuz I got a laptop that uses an AMD GPU and I absolutely fucking love it. Like, if you're going the laptop route, imo Nvidia can't touch AMD. Nvidia 4000-series laptops are like $2600 and up for a base fucking card.

  • @Saturn2888 1 year ago +4

    It's strange to hear you talk about "maybe AMD will gain market share", because they were one of the big 3 when I was a kid, and NVIDIA couldn't hold a candle to AMD's (ATi's) video quality. Windows was visibly blurry with NVIDIA. Games too, because their AF was bilinear or something. In my opinion, AMD is still a powerhouse, but NVIDIA suddenly got the leg up when I wasn't looking.

    • @roshawn1111 1 year ago

      More like they bought a bunch of devs

    • @-Ice_Cold- 1 year ago +1

      AMD can't gain market share when everybody is buying NVIDIA cards

    • @Saturn2888 1 year ago

      @@-Ice_Cold- and now they have competition from Intel. It's bad today, but next gen is where they can really shine. NVIDIA is running into die size issues.

  • @MichaelHolmgaard 1 year ago +2

    The driver issues with AMD these days just turn me off Team Red completely. I tried a 7900 XT because of the VRAM and it was just a complete mess. I would rather pay more for an Nvidia GPU that I know will do the job without technical issues. It's plug-and-play, and that is important at such prices

    • @mhahn 1 year ago

      Can you provide more details about the driver issues you had? Specific examples?

  • @gradient602 1 year ago +13

    I think I want to make the switch to AMD as well. I've had nothing but Nvidia on 4 different rigs and have been using them since my 650 Ti. It's just that Nvidia isn't enticing anymore: too much money for what you get, especially with VRAM needs increasing.

    • @ThisIsLau 1 year ago

      Yeah, the vram thing is one of the biggest things to consider. Looking at the leaks, nvidia is not going to give good amounts of vram

    • @Wobbothe3rd 1 year ago

      Lol, yeah right. Suuuure.

  • @dampintellect 1 year ago +2

    If you do get an AMD card for your system, remember to remove the Nvidia drivers, preferably with DDU, and put the correct AMD drivers on. Or you might run into weirdness.

  • @RashakantBhattachana 1 year ago +3

    Not good! No more! I want a beautiful woman!

    • @IAMNOTRANA 1 year ago

      my man why you want bobs and vagana in a fuckin' tech channel comment section lmao

  • @exsinner 1 year ago +1

    I went from a 1060 6GB to a 3080 Ti to now a 4090; I just have no faith in AMD. I've worked with a numerous amount of laptops, and when one has a graphics problem, 80% of the time it's AMD's GPU: from bluescreening with all OEM/AMD drivers, to the ULPS bug where the laptop took ages to boot even though it was using an SSD, to the absence of an AMD GPU driver when paired with an Intel iGPU, because in the past AMD refused to provide support for that configuration. Recently, 5000-series Ryzen CPUs have had a BIOS issue where the battery can only charge up to 99%; some OEMs have already provided a BIOS update while some still haven't. There is more, but these are the only things that come to mind.

  • @Nighthearted 1 year ago +4

    My reason to go with Nvidia is emulation. AMD made an effort for OpenGL, but still has a long way to go. I think it's a bit of a gray area for them since they make the consoles' hardware, so they don't mess with emulation support much. It's a shame AMD won't even try on that front.

    • @raresmacovei8382 1 year ago

      What are you even using OpenGL for in emulation? XEMU? Citra? Meh.
      And AMD's OpenGL performance is now great.

  • @Tiber234 3 months ago

    This was a very interesting little discussion - thanks for this vid. I'm late to this party, but I'm at THAT point: I literally have an Nvidia 4070S and an AMD 7900 GRE next to me (both still boxed). As I'm upgrading from my 2060S I "naturally" flowed towards Nvidia, but having had a very positive experience transitioning from an Intel i7 processor to an AMD in 2020 (still using my very capable Ryzen 7 3700X), and also seeing the VRAM and price disparity between said GPUs, I did my due diligence and trawled YouTube's review/comparison vids (including a number of yours). I'm now 80/20 for keeping the AMD, although still a bit concerned about having issues. I agree with you - it's definitely good to be open to new alternatives (budget allowing). And if I do keep it, I hope that in 2028 I look back with satisfaction at my decision to go with AMD

  • @justaguy6216 1 year ago +6

    It was US$50 for you; it was US$130 for me, so I went with the 6800 XT. Later on, the price fell by a further $170, but the Nvidia card only fell by another $70.
    So I do have a tiny bit of buyer's remorse, but not as much as if I had gone with Nvidia.
    I'm more than happy with it.

  • @hendriksaldua8208 1 year ago +1

    We both think the same, bro. I like AMD: the hardware is good, but the software side is so laid back. Their FSR 3.0 is still in development, and their ray tracing is not on par with Nvidia's.

  • @NipahAllDay 1 year ago +4

    The first GPU I bought was a 1060 3GB. I didn't know it was a bad GPU at the time, but it was dead on arrival anyway. I just decided to get a 580 and began becoming an AMD fan. It was probably the best outcome, tbh. I did try a 3070 as my temporary GPU until I got a 7900 XTX, but it wasn't that impressive. I think Nvidia is overrated for gaming, and even for some productivity uses.

    • @-Ice_Cold- 1 year ago

      I agree; my RX 580 8GB is rocking, when at the same price I could only afford a 1650

  • @wanotai91 1 year ago +1

    AMD is actually more expensive in my country, because Nvidia cards are everywhere while AMD cards are very hard to find... no point going AMD if they are more expensive

  • @deuswulf6193 1 year ago +3

    Developers want to use RT (aka path tracing) because it makes their job a billion times easier. The only issue we have right now is hardware related. RT is very important though, just as important as the shift to PBR textures which happened a while back.

    • @TheUAProdigy 1 year ago +1

      It does not make our job easier at all. Have you ever coded anything in C?

    • @deuswulf6193 1 year ago +1

      @@TheUAProdigy Are you under the illusion that every game has its engine written from scratch with every new project? That 3rd-party engines don't exist? That level designers and the art team are all writing code in C++?
      C'mon now.
      On that note, go watch the interview Gordon gave over on PCWorld titled "First Details About Path Tracing In Cyberpunk 2077".
      Then follow the trail...

  • @kuteo9314 1 year ago

    I have seen some guys complaining that their PC won't boot with an RX 6600/6600 XT/6700 on Reddit and Facebook. Their solution, after blindly tweaking BIOS settings, was switching to Nvidia cards, and it bloody works. I always want to recommend good-value stuff to my friends, but that problem was the deal breaker. It's not 100%, but it might happen, and my friends don't want to spend too much time troubleshooting. I still don't know what was wrong

  • @Shubhamrawat364 1 year ago +3

    AMD fanboys are absolutely unbearable. Apple fanboys are unpopular for blindly overpaying, but at least Apple makes really good products; here in the GPU space, all AMD does is shit out products that are clearly inferior to Nvidia's, and still these corporate suckers vouch for inferior products, which is kind of weird to see. Nvidia GPUs are overpriced for sure, but nobody can deny their technological lead, from good AI features to industry-wide optimization for any software or workload, not to forget the ML market, where AMD is completely non-existent. AMD cards are just good-enough raster bricks, but with Nvidia my GPU can be so much more, and because of that I don't mind paying a premium for Nvidia cards, as the product is actually really good overall compared with what AMD is offering.

    • @GhOsThPk 1 year ago +4

      And NVIDIA fanboys are in reality just salivating fangirls of Jensen, ones that like to be shafted in the face every generation. Remember the 3.5GB disaster on the GTX 970? You guys like buying gimped cards that become obsolete two years later; good luck using RT with high textures in newer games on the 8GB-12GB cards that are the majority of what Jensen shits out for you.
      An RTX 4050 with 6GB when the RTX 3050, which was already a TURD, had 8GB; an RTX 4080 with only 16GB when AMD gave 16GB on the RX 6800, at the level of the 3070 Ti (which has 8GB of garbage); both the 7900 XT and the XTX have more VRAM than the RTX 4080. Better yet, the RTX 3080's $700 MSRP became $1200 on the RTX 4080, while AMD launched the 7900 XTX at the SAME $1000 MSRP as the 6900 XT while having 8GB more VRAM. You guys like to be shafted; it's so good.

    • @baloo5621 1 year ago +12

      The irony of this comment is unbearable 😂

    • @leucome 1 year ago +2

      Yeah, I looked at your channel... I can definitely see all the fantastic things you produce with your magical device from Nvidia. I definitely can not do anything that impressive with my raster brick.

    • @Shubhamrawat364 1 year ago

      @@leucome Ha ha ha, I am not a youtuber, but I work in the 3D animation and VFX industry. Here, if anybody even talks about AMD GPUs, they either lose projects or their job. You can stay with your below-par raster brick and keep drinking the Kool-Aid.

    • @leucome 1 year ago +2

      @@Shubhamrawat364 A studio that fires their artists if they talk about AMD... Hmm... sounds legit.

  • @asdfjklo234 1 year ago +2

    I like seeing gameplay footage a lot, but I also like the other homey bits sprinkled in. It's the mix that works best imo. As I see it, you're on a good track already with your content, keep it up! :-)

  • @TerraWare 1 year ago +1

    Back in 2020, when these cards released, the 16GB of VRAM was a big reason why I went with a 6800 XT over a 3080. To me, releasing what Jensen referred to as their "flagship" with only 10GB of VRAM felt like planned obsolescence, even though back then games weren't very VRAM-hungry. Now they are, though, and even the 4070 Ti is being held back by its VRAM buffer if you try to play at 4K in some of the latest games.
    I also see comments from people thinking DLSS will make up for a lack of VRAM, but it won't. VRAM allocation and utilization don't change much between native and DLSS (see the sampler sketch after this thread).

    • @furynotes 1 year ago

      DLSS or FSR won't, because people jack up the texture detail on 8GB cards like they are 1440p cards. I can see DLSS working for 8GB cards at this point. I found 8GB to not be enough for 1440p a few years back. Funny how most gamers are mad about The Last of Us Part 1 not running well on 8GB cards at 1440p. They scream "bad port" partly because of that.
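
An easy way to test the allocation claim yourself is to sample VRAM use while toggling native vs. DLSS in-game. A sketch using nvidia-smi's standard query flags (assumes the tool is on PATH):

```python
# Print used/total VRAM every 5 seconds; alt-tab and flip DLSS on or off
# between samples to compare.
import subprocess
import time

for _ in range(12):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        text=True,
    )
    print(out.strip())
    time.sleep(5)
```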

  • @saricubra2867 1 year ago +2

    If you had gone for a used 3080 Ti, you would get the no-compromise NVIDIA experience for the RTX 3000 series.

  • @Skungalunga 1 year ago +1

    It's whatever your particular use case is.
    AMD has a history of lagging anywhere from 6 months to 2 years behind NVidia in terms of features, and even then it's a lesser version that they have to open-source to get people to adopt it. People seem to think AMD is their friend for charging less; it's more like they have to. Want cutting-edge features, better day-one drivers, and willing to pay the premium for it? NVidia has you covered. Want to max out your performance-to-budget ratio? AMD is currently the way to go. It's that simple. Personally, I like RT and find real G-Sync noticeably better than FreeSync. Even so, the last GPU I bought was for my nephew, and it was an Intel Arc 750: decent performance, steadily improving drivers, and 500 USD worth of productivity software and games. Intel even has feature parity with NVidia and then some. Why are they being left out of the discussion?

    • @-Ice_Cold- 1 year ago

      Yeah, why? Maybe for crappy drivers and hardware? And I doubt G-Sync is "noticeably better" than FreeSync. G-Sync is the more expensive technology, and FreeSync is included in monitors more often. NVIDIA is for the rich, who love flaunting their expensive things

  • @qubes8728 1 year ago

    There are two types of buyers: those who prefer performance over price and those who prefer price over performance. The problem is the former don't care how much things cost, and for the latter it's a race to the bottom.

  • @chrismachica5219 1 year ago +2

    Needed to replace my old gaming PC (an i5 and two 980s in SLI). Went with the red team because of a Micro Center bundle deal with a Ryzen 9, and they had the 7900 XTX in stock ($999 retail). I have no complaints. I think the biggest thing is, I don't feel I overpaid. Playing Overwatch at 1440p ultra at 500+ FPS.

  • @989howard 4 months ago +1

    If you like old games from before 2010, or DirectX 9 games, don't take the risk of buying a Radeon GPU

  • @atishalii 8 months ago

    A little late comment, but Happy 2024 btw. If you own an AMD CPU, you should be OK with the price drop AMD is supposed to announce when the 4080 Super is in stores; I mean, AMD should decrease the 7900 XTX price by at least 100 dollars or even more in order to sell their top card. I've been Green/Blue (GPU/CPU) since the beginning of my PC gaming days, about 20 years ago, plus or minus. Now I play just one very CPU-bound game, an old one, World of Warcraft Dragonflight, and I found out quite recently that the 7800X3D is complete overkill for that game, especially with the 3D V-Cache in it. So I'm waiting for the prices to go down a little, then I'll switch full Red and go from an 11700K / 3070 Ti Strix to a 7800X3D / 7900 XTX. It's gonna cost me an arm and a leg here in Europe (if the prices go down...), and you're one of the ones that made me change my mind about Nvidia, Intel and AMD! SO THANKS for your honesty, clearness, and the way you went straight to the point!!! (Forgive my English, I'm French, I might write things badly!) So, with time and patience, you'll come back to Red, and me too ;) ;) ;)

  • @SuubUWU 1 year ago

    The thing about audio versus gameplay: the video is in the background for the most part, until I need to glance over for statistics. Digging the content, mate.

  • @crzyces1693
    @crzyces1693 ปีที่แล้ว +1

    I bought a used 3080 10GB FE for $525 in September '22 as well and oh my gosh am I kicking myself for multiple reasons.
    1. I had a 5700 in my main gaming PC previously and it occasionally ran into problems at 1440P/High setting though SAM helped immensely with that. I *KNEW* ReBar with Nvidia simply doesn't work well because Nvidia doesn't really care about it, but thought the extra 2GB of VRAM would make up for it. Well it does...except when it doesn't and though my average FPS in Cyberpunk with a high/ultra mix at 1440P is almost double that of the 5700, my minimums are trash, with numerous stutters dropping me to "0FPS" without DLSS.
    2. I also thought I'd use Raytracing on occasion, especially since I almost exclusively play single player games. Except the fricken VRAM *won't let me.* Outside of Minecraft, which I don't play unless it's with my son, there is not a game outside of...idk, Resident Evil 7 and Metro that allow me to use it (and in Metro it's on no matter what you use. Kudos to those devs, my word did they do a hell of a job as it worked great on a 5700 as well.
    3. I could have gotten a 6800XT for $50 less as well; or a 6800 for $100 less, both of which are great at 1440P.
    4. OK, this is the worst because it is 100% my fault. A nearby miner offered me a 3080Ti for $525 as well, but he wasn't breaking down his rigs until the end of the month so I would have had to wait 6 days to pick it up. I looked at the benchmarks, the difference was minimal and considering it was _only_ for 1440P and I typically don't keep cards longer than 2 years, whelp, I just wanted to upgrade after being stuck on the 5700 for well over 2 years at that point.
    I *knew* 10GB was not going to be enough VRAM for 1080P/Ultra or 1440P/High for much more than 2 years, but I did not expect VRAM usage to go through the roof so quickly. Granted, this may be 100% on game devs for being lazy as shit...OK, that's not fair: for not caring about PC gamers at all compared to console players. It's much less work for them to use one memory pool, staying under 14GB or so, which is fantastic if you are on a console. Not so much if you are on PC. I mean c'mon, _"The Last of Us"_ doesn't exactly look better at 1440P high than Cyberpunk 2077 does, and Cyberpunk has far more to render, a much further draw distance and simply a hell of a lot more going on, yet its VRAM usage is about 35% lower. Why? Because it uses system memory far more effectively. Spider-Man: Remastered? Same thing. Red Dead 2? Again, same.
    It's not just _"The Last of Us"_ either; it seems like a lot of new games are running into this issue, and it's obviously only going to get worse. Sure, if I stick with first party _"GamePass"_ or _"Steam Deck Certified"_ titles I'm sure I'll be fine, but general multi-platform titles, or even worse, Sony exclusives, are probably going to be trash on any GPU with less than 16GB of VRAM regardless of the card's power, at least at 1440P without upscaling. At 4K I assume you'll need to render the game at 1080P (or balanced upscaling in most cases) if the card has under 16GB of VRAM, because the games are being coded to pull from one big memory pool, and adjusting that isn't as simple as having a guy work in the back of the room for a few days plugging away at it. More like 10 guys working on it for a month, *IF* they start at the beginning of the port process. Who knows if a lot of these ports will even be fixable with a patch; _"The Witcher 3"_ certainly wasn't.
    Now for multiplayer games I doubt VRAM will be an issue for quite a while...like a decade, *_unless_* you are talking about slower paced sim-like games that are more about pushing boundaries and staying afloat until they are forced to close (think Star Citizen). In that case, 4GB will probably still be enough to get you by if the main draw is competing and visuals are legitimately unimportant. Maybe CS2 will ask for 8GB, but I doubt it, since Rainbow Six Siege is fine on a 4GB 580. I know when I do play Battlefield 2042, SW: Battlefront or Fortnite I couldn't care less about playing at 1080P/low or 1440P/high. I just find out where I need to be so I can lock 144Hz and call it a day. I'm not exactly stopping to slowly gaze around the landscape at the top of a mountain or fort like I would in The Witcher 3, Horizon Zero Dawn or a heavily modded Skyrim. I'm not trying to get to the top of a crazy high building like I would in Cyberpunk, or marveling at a visually stunning skybox like I would in Mass Effect Legendary Edition. In those cases, 10GB is more than I need, because the experience is either strictly the skill required to do well or the fun of the people you're with.
    When it's a single player game where it's the game's job to create the experience by telling a story both visually and verbally/audibly though, well, don't screw yourself. If you don't mind 1080P + upscaling then you can probably still use raytracing, and it absolutely does make a lot of single player games look immensely better. Even The Witcher 3 looks *INCREDIBLE* with RT; I mean the difference is breathtaking. Unfortunately playing at 1080P with balanced upscaling absolutely destroys faces and various body parts of monsters, at least on a 32" 1440P display. If I could lock in 30FPS by setting my frame limit there, I would absolutely take 1080P/High Native-1440P/Medium over 1440P/High-Ultra at 90-130FPS. Not possible due to the stutters though, and like I said, I don't even know if it is fixable at this point. They'd basically have to remaster the game from scratch and have a team working on it from day 1 for it to make sense, as I can't imagine there is any way to _make it go_ correctly at this point. I mean they gave it the old college try with 5 patches and it is still a mess, at the expense of the rest of the game, so eventually they have to cut their losses, undo what they've recently broken and call it a day, accepting that anyone not on a 3090/Ti, 7900XT/X, 4080 or 4090 is SOL. 3080Ti? Pft, no RT in TW3 for you. 4070Ti? Nope, you are out too. 12GB of VRAM? Not for 1440P with *ANY* settings if you want to use RT, my friends, at least in newer titles that are aimed at consoles first.
    Best Used GPUs right now:
    $100 USD 8GB RX 580 (you can actually get them new for $90)
    $150 RX 5600, GTX 1660Ti
    $175 RX 5700/XT
    $200 RTX 2060 Super (slower than the 5700XT by *A LOT*)
    $300 RTX 3070, RX 6700XT
    $400-$425 RX 6800
    $425-$450 RX 6800XT
    $475-$525 RTX 3080 10GB (only buy if you plan on playing multiplayer almost exclusively)
    $500-$575 RTX 3080/Ti 12GB, RX 6900 (much better purchase since that 12GB of VRAM on the RTX cards won't allow you to use Raytracing and the 6900 is slightly better at rasterization).
    $600-$700 3090/Ti: I'd be torn at this price point, since you can get a new 6950XT for $650, but you actually can start to use raytracing *in most* titles with the 3090/Ti, though you have to be OK with either 1080P, 1440P/Medium settings, and/or 30FPS at 4K, depending on what level of upscaling you are personally alright with.
    There are also the professional use cases, though if it's content creation for YT, TT, FB etc., the differences between Ampere and Navi 2 are minimal at best now. If you are using 3ds Max, Maya, or CAD software that has a crush on CUDA, well, that's a different story entirely. If you're getting into AI programming/training, then Nvidia is the way to go at this point (you may even want to look into their all-in-one boards, which run between $100-$2000 and come with everything from a dual core ARM CPU and a Maxwell GPU up to a 6-core/12-thread chip with a 4048 Ampere GPU and 32GB of GDDR5)...
    And on that note, I'm getting off the topic of GPUs, but I can 100% assure you, that were I to do it all over again, I would have bought a used 6800XT, even if it was the same price.
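
    The one-memory-pool point above can be made concrete with a toy model. This is a minimal sketch, not a measurement: the 14GB console budget comes straight from the comment ("staying under 14GB or so"), while the 5GB that a careful port offloads to system RAM is purely an assumed figure for illustration.

    # Toy model of the console-port VRAM argument above (Python).
    # CONSOLE_POOL_GB comes from the comment; the 5.0 GB a careful
    # port offloads to system RAM is a made-up illustrative figure.

    CONSOLE_POOL_GB = 14.0

    def fits_in_vram(vram_gb: float, offloaded_to_system_ram_gb: float) -> bool:
        """True if what's left of the console memory pool fits in VRAM."""
        return CONSOLE_POOL_GB - offloaded_to_system_ram_gb <= vram_gb

    for vram in (8, 10, 12, 16):
        lazy = fits_in_vram(vram, 0.0)      # port keeps the whole pool in VRAM
        careful = fits_in_vram(vram, 5.0)   # port splits it, Cyberpunk-style
        print(f"{vram:>2} GB card | lazy port: {'ok' if lazy else 'stutters'} | "
              f"careful port: {'ok' if careful else 'stutters'}")

    On this toy model, a 10GB card is fine whenever the port offloads aggressively and falls over whenever it doesn't, which is roughly the experience described above.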

    • @LaggyMo
      @LaggyMo ปีที่แล้ว +1

      How and why Tf did u write this in like 30 min? Appreciate the info tho👍

    • @GhOsThPk
      @GhOsThPk ปีที่แล้ว +1

      8-12GB cards are DOA for RT or 4K Ultra textures.
      NVIDIA won't stop; they're going to keep doing it until people stop buying their shit: an RTX 4050 with 6GB when the RTX 3050 had 8GB, a 4060/Ti with 8GB when the RTX 3060 had 12GB.
      An RTX 4070/Ti with 12GB when AMD is giving 20GB with the 7900XT, which is even more than the $1200 RTX 4080 (the funny detail is that the MSRP of the RTX 3080 was $700, meaning NVIDIA is milking 70% more).
      The RX 6950XT 16GB at $650 kills the entire Ada lineup, starting from the 4070/Ti 12GB at $800.
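
      The "70% more" figure in that parenthesis checks out; here is a one-line sanity check using only the two MSRPs quoted above ($700 and $1200):

      # Percent increase from the RTX 3080's $700 MSRP to the RTX 4080's $1200 MSRP.
      def percent_increase(old: float, new: float) -> float:
          return (new - old) / old * 100.0

      print(f"{percent_increase(700, 1200):.0f}%")  # prints 71%, i.e. roughly "70% more"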

  • @cash_banooca17
    @cash_banooca17 ปีที่แล้ว +1

    I bought a 6650 XT as a stopgap GPU between my 1070 and what eventually ended up being a 4070 Ti, and it was okay, but I'm not buying another AMD GPU for at least a couple of generations. The almost instant nail in the coffin was how AMD Adrenalin has almost no legacy features and no proper forcing of AF in games. By far my favorite thing about NVIDIA is that I can force AO and AA in older games, and force 16x AF in games with broken or poor texture filtering. I found AF broke a lot while using my 6650 XT as well. AMD is super streamlined and easy, but NVIDIA gives me way more tools to fuck with and make games look better. Also, I had way too many random driver-crash-related PC restarts: in the month using my 6650 XT I had like 4 or 5; since buying my 4070 Ti a couple of months ago I've had maybe 1 or 2. And one of the AMD restarts was so bad my driver corrupted and I had to completely reinstall it. Overclocking stability wasn't great either; I know, silicon lottery and all, but my 6650, my laptop's RX 580, and my brother's 5700 XT all had instability with super basic overclocks, while my 750 Ti, 1070, and 4070 Ti overclock crazy well with little to no issues. AMD just never did anything to convince me they're worth switching over to full time, despite being a better raw-power value.

  • @Strongholdex
    @Strongholdex ปีที่แล้ว

    Why was AMD not there in 2016? Why did you buy a GTX 770 if the RX 480 beat it? I don't understand.

  • @1isaacgardner
    @1isaacgardner ปีที่แล้ว +1

    I've mostly used AMD up until the RX 480. I had huge problems with the drivers; I had to reformat my Windows install out of legacy mode and do a bunch of other things for it to POST. After that I went Nvidia. But I may go back to AMD graphics cards in the future. We'll see.

  • @projectc1rca048
    @projectc1rca048 ปีที่แล้ว

    I like the Hitchhiker's Guide to the Galaxy reference at the end there.

  • @chridi3
    @chridi3 ปีที่แล้ว +1

    You asked for opinions, so here's mine: I like to see your face, especially when you're sharing deep and important thoughts. 70% face and 30% cool gameplay would be good for me. Keep rocking!

  • @alexanderneuhold3048
    @alexanderneuhold3048 ปีที่แล้ว +2

    I got the RTX 3080 10GB version, and it runs every game perfectly fine at 1440p. And yes, I play with all graphics options at the highest possible. If the games are not esports-relevant, they run at around 60-80 FPS (even with raytracing, for the ones which support it), and that is totally OK for me.
    For example:
    Hogwarts Legacy - Ultra settings with raytracing on Balanced -> 60-70 FPS
    Cyberpunk 2077 - Ultra settings with raytracing (one below Psycho; I don't know yet what it's called) -> 70-80 FPS
    All other games I play run around 90-120 (in some cases even more).
    This is the third Nvidia card for me (after a GTX 750 Ti and a 1070 Ti), and I don't even think about buying AMD. Even if YouTubers tell people to consider AMD, I would never recommend it, because the friends of mine who bought AMD STILL have a lot of issues with it.

  • @Lev-The-King
    @Lev-The-King ปีที่แล้ว +2

    The GPU market has me more interested in consoles than ever before... PC master race at what cost?

    • @Lev-The-King
      @Lev-The-King ปีที่แล้ว

      @@narwhal9852 Considering this is predominantly a gaming channel, that's why I wrote that comment.

    • @vextakes
      @vextakes  ปีที่แล้ว

      New consoles might be a great value rn, but it might change in a few years

  • @fabbri2183
    @fabbri2183 ปีที่แล้ว

    I have a half-broken Radeon X800 GTO which I used for gaming until I was given a PC with a Radeon HD 5750. There's a big difference between the two, but the quality of both cards impressed me. When you read "AMD" on a GPU, you're literally getting something which won't ever die... at least that's been my experience so far in more than 10 years of use.

  • @macsdf1
    @macsdf1 ปีที่แล้ว +1

    Not me, I got a 7900 XTX.

  • @The_Chad_
    @The_Chad_ ปีที่แล้ว

    Dude, you made the right decision going with the 3080, especially since you play esports titles and use creative applications. I bought a 6900 XT and a 3090 as soon as they launched a couple of years ago. I made all kinds of excuses for why I would use the Radeon card in my main gaming rig for over a year. I finally got sick of the driver issues and started seeing just how bad AMD was as a company, especially with their GPUs. I firmly believe that most people who say they have made the switch to AMD after using GeForce for years and are happy with it either aren't power users and are using lower-end or midrange cards, haven't had their new Radeon cards long enough, or are lying to themselves to make a poor purchasing decision feel better.
    They really are worse, and I believe that without question. I build computers all the time. I have close to 20 PCs running in my house that I regularly use, and they all have different hardware configs. The PCs with Intel CPUs and Nvidia GPUs are just better, with fewer problems. And I'm not fanboying; I don't like any of these companies. I would actually say I was an AMD fanboy for a long time. I'll also say AMD looked like they were starting to get really competitive when they started releasing their last-gen products. But it seemed like halfway through their last product stack, they just started cutting every corner they could get away with. It's only gotten worse since then. No, AMD isn't going to get past the reputation they have, because they have fully embraced it at this point. They aren't even trying to be competitive or release the best products they can. If you ask me, it looks like they are purposefully losing market share, probably for some crooked, greasy reason.
    I've been seeing a lot of people recommend AMD lately, but I think that advice is about 2 years too late. Two years ago, when they actually were competitive (perf per watt, not features), nobody was mentioning AMD as an option unless it was a last resort. The only reason they sold so many cards was mining. Gamers REALLY wanted GeForce, so much so they were willing to pay 2x more for the same performance. That's probably why AMD threw in the towel and decided to start focusing on selling fanboys overpriced, underperforming garbage.

  • @FriarPop
    @FriarPop ปีที่แล้ว +1

    Nvidia blows AMD out of the water; only 2% go AMD for a reason. AMD is also always getting caught lying about performance metrics.

    • @WhyName
      @WhyName ปีที่แล้ว

      You get all your info from GPU benchmarks or something? The 7900 XT and 4070 Ti, at least, are practically 1:1 in performance per dollar.
      I kinda wish you were right; it would make it a lot easier lmao
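
      A performance-per-dollar claim like that is easy to check yourself; a minimal sketch follows, where the FPS and price figures are placeholders rather than benchmark results; substitute numbers from whatever review you trust:

      # Performance per dollar = average FPS / street price.
      # The numbers below are PLACEHOLDERS for illustration, not measurements.
      cards = {
          "RX 7900 XT":  {"avg_fps": 120.0, "price": 800.0},  # placeholder figures
          "RTX 4070 Ti": {"avg_fps": 118.0, "price": 790.0},  # placeholder figures
      }

      for name, c in cards.items():
          ppd = c["avg_fps"] / c["price"]
          print(f"{name}: {ppd:.3f} FPS/$")  # near-equal ratios would back the 1:1 claim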

  • @Spork888
    @Spork888 ปีที่แล้ว +1

    I like a mixture of relevant gameplay, face cam, and screenshots / photos. Just whatever is most relevant at the time. I especially like when you name a product or game and put it on screen.

  • @Shrektus
    @Shrektus 8 หลายเดือนก่อน

    No. You:
    bought Nvidia > found reasons to defend your choice
    not
    found reasons to choose Nvidia > chose Nvidia

  • @RN1441
    @RN1441 ปีที่แล้ว +2

    I used AMD (formerly ATI) almost 100% since the late 90s, but switched over to Nvidia in the 1080 generation because AMD was just not keeping up. I skipped the 20 series, then went Nvidia again in the 30 series because the AMD 6000 series was just not available, but downgraded to the 'midrange' 60 Ti after having been a high-end buyer for decades, as the pricing/segmentation was getting out of control. Now, in this generation, both AMD and Nvidia have priced me out of the market, and I've decided the best 'current generation' card is actually a used 3080. Good luck AMD and Nvidia: the Steam hardware survey shows that less than 3% of the market is running 4K or higher resolution, and VR is only 2%. Almost nobody needs the high end anymore, and I'm not going to pay $800 for a "midrange" card.

  • @rtyzxc
    @rtyzxc ปีที่แล้ว

    Actual Nvidia advantages:
    - Reflex (NOT equivalent to the common driver low-latency modes; do your research)
    - DLSS has somewhat better quality than FSR
    - Better stability/overclocking. I still see some AMD GPU users complain about random crashes (which also means a full PC crash instead of just reloading the driver like on Nvidia). Some people say it's due to overclocking, but my last two Nvidia GPUs have run stable at a +200MHz core overclock, which makes the practical performance+stability better than what's represented in benchmarks.

  • @PoweredByLS2
    @PoweredByLS2 ปีที่แล้ว +1

    This sounds a lot like the Android vs iPhone argument, specifically the people who always buy iPhones because to them it's a fashion accessory. They don't care about Android phones having great/better features and more innovation, because what they want is to be seen with an Apple logo by their friends. It's all about fronting. Nvidia is the iPhone of the GPU world, and it'll be a tough nut for AMD to crack. But it's definitely not impossible; it was only a few years ago that Intel held that coveted position in the CPU world, and then Ryzen exploded onto the scene.

  • @dawienel1142
    @dawienel1142 ปีที่แล้ว +1

    This is exactly my point w.r.t. RT.
    3080 Ti owner here, and I also somewhat regret not going 6900 XT since it was about 30% cheaper, but what brought me over the edge was the warranty on Palit vs XFX (3 years vs 2).
    Great video; you deserve more views with your content quality.

    • @Miraihi
      @Miraihi ปีที่แล้ว

      If you ever try to get into machine learning, your regrets will instantly vanish, since Nvidia dominates that niche almost completely. The only promising alternative we have for AMD is DirectML, but it's still in its infancy.

    • @dawienel1142
      @dawienel1142 ปีที่แล้ว +1

      @@Miraihi Not something I'd ever get into, so it's not part of my decision-making factors.
      I basically only play games on my PC.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      ​@@dawienel1142 You are wasting your 3080 Ti, which can do more than just gaming. By the way, the 3080 Ti still beats a 6900 XT at 4K; only the 6950 XT from that AMD lineup can be faster.

    • @dawienel1142
      @dawienel1142 ปีที่แล้ว +1

      @@saricubra2867 Agreed, but the 6900 XT is significantly cheaper and has more VRAM, which is clearly becoming a problem even for 12GB high-end GPUs.
      In a nutshell, the 3080 Ti is faster until it's NOT, because it runs out of VRAM and becomes a stuttering mess. Ironically, the areas where it beats the 6900 XT, like 4K/RT, are exactly where its VRAM becomes a limitation, so the 6900 XT may well overtake it in the heaviest games at 4K already.
      Lastly, you're saying I'm wasting my GPU by not doing something I never intended or wanted to do with it. If you want to be critical, the better viewpoint would be that I bought a product with use cases I will never utilize, in which case the better argument is that I should have bought a product that fits my needs and saved a significant amount of money in the process.
      I guess it was the warranty that gave me the peace of mind to go for Palit over XFX; I personally like the Palit brand a lot.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      @@dawienel1142 The 3080 Ti is twice as fast at 4K RT as the 6900 XT, even when the VRAM is maxed out. 12GB is not an issue; take a look at the 1080 Ti. By the way, the 3080 Ti uses the 3090 die, and the 6900 XT isn't as powerful.
      The 6900 XT is significantly cheaper with crappy ray tracing, whereas you don't sacrifice anything with the 3080 Ti in general. Just like the Turing marketing from Jensen Huang: "It just works".

  • @Varue
    @Varue ปีที่แล้ว

    Facecam during gaming is epic! It lets people see genuine reactions.
    Just discovered your channel, and it's good content. I look forward to your videos.

  • @xRlly
    @xRlly ปีที่แล้ว

    POV: u want to get sponsored by AMD
    MY OWN EXPERIENCE: I bought an AMD GPU 2 months ago, an RX 6600 XT that was supposed to deliver almost on par with the RTX 3060 / Ti. After I installed the GPU in my PC, installed the drivers, uninstalled the Nvidia drivers, etc., I noticed a big performance difference; the GPU was not what I expected. It had huge delay in Fortnite, in Rust, in almost any game I played: the latency was around 9ms, while my previous Nvidia GPU (a GTX 1070) was giving me 1-3ms.
    The AMD GPU had huge stutters in most games, the GPU was at 99% most of the time, and the performance overlay was very buggy; I couldn't get it to open correctly even after checking that everything was fine with the drivers and settings.
    Anyway, I ended up returning it and getting an RTX 3060 Ti, and it still serves me well. I get 110 FPS in Days Gone at ultra graphics settings, no lag spikes, no delay; in fact, my PC's latency dropped by up to 95%. In Fortnite I have about 0.4-2ms render latency and I can play my favorite games. Yeah, the 3060 Ti could use 12GB of VRAM, but we at least have to appreciate that Nvidia pays so many people to give us features AMD doesn't, and yet AMD still charges almost the same price as Nvidia.

    • @brianchan8
      @brianchan8 11 หลายเดือนก่อน

      The 6700 XT is the one competing with the 3060 Ti. Also, you could have gotten a lemon. At least the 3060 Ti isn't too bad of a deal.

  • @od13166
    @od13166 ปีที่แล้ว +2

    In my case I had to keep using Radeon, since FreeSync on NVIDIA doesn't work that well.
    Especially if you go into 60Hz mode with FreeSync turned on, my monitor loses its signal, which doesn't happen on AMD.

    • @-Ice_Cold-
      @-Ice_Cold- ปีที่แล้ว

      Yeah, monitors with FreeSync cost significantly less than NVIDIA's G-Sync ones.

  • @elvertmack5039
    @elvertmack5039 10 หลายเดือนก่อน +1

    I can only speak for myself. The first PC I bought was a prebuilt from Micro Center before the crypto boom. It came with a 10900K and an RTX 3080. Great machine, but then I wanted more because my monitor got better and bigger: a Samsung G9 49-inch, and the 3080 wasn't built to run raytracing and everything else I wanted on it at that resolution. So now I'm at a 13700K with a 4080 and love it. But I said all this to say that, in my opinion, AMD makes a great second choice for most people. It's a tinkering card, whereas Nvidia is just a work card; Nvidia is like Apple and AMD is like Android. I got my 7900 XTX and immediately had to replace the thermal paste because PowerColor didn't do it right from the factory. Great card once I repasted, but I shouldn't have to. Second: resale value. People will buy an Nvidia card on resale over an AMD card for their first card, not because it may be better or worse for them, but simply because of the myth around AMD. And I can't blame them: when they're paying hard-earned money for a card that may or may not still be under warranty, they just want a card that works.

  • @KraNisOG
    @KraNisOG ปีที่แล้ว +2

    I'd love to buy an AMD GPU, but the performance hit going from my 3080 to the 7900 XTX is wayyy too large. Really just looking forward to the next generation at this point.
    Which, according to their roadmap, should be next year: either a revamp with Super versions of the cards like the 20 series, or the 50 series releases.
    Maybe AMD will come out with a really good 7950 version that does really well in Blender.

  • @techwandoUS
    @techwandoUS ปีที่แล้ว

    Vex, your videos are great, man!
    What camera do you use? It's very crispy.
    Also, to add on to the DaVinci thing:
    I loved my time with the 6900 XT, but it really made my DaVinci experience suffer. I experienced freezing and lag in the timeline, whereas on Nvidia cards, even going back to a 1060 I used as a drop-in while waiting for a 40-series card, I had no issues.

  • @davidnolde4598
    @davidnolde4598 ปีที่แล้ว

    I prefer it when you talk directly to the camera. Right now there are too many channels that just use gameplay footage, and you kind of lose the sense of uniqueness of those channels.

  • @kevinjeon8824
    @kevinjeon8824 ปีที่แล้ว +1

    I currently own an RTX 4090 for gaming and an RTX A6000. I do deep learning stuff, so I kinda have to use NVIDIA... I initially bought the 4090 for both gaming and deep learning, and eventually got the RTX A6000 for the 48GB of VRAM, which helps a lot during training.

  • @lucaoru502
    @lucaoru502 ปีที่แล้ว +2

    I have an RTX 3060 Ti and I regret it. I bought it at an inflated price due to shortages because I wanted to try ray tracing; a few months later a local store sold the RX 6700 XT for $150 less than my 3060 Ti, with better performance and lower power consumption.

    • @GhOsThPk
      @GhOsThPk ปีที่แล้ว +1

      It's still cheaper today; you can buy a 6750 XT 12GB (~$380 on PCPartPicker), which shits on the 3060 Ti 8GB (~$410) at raster.
      The funny part is that in Unreal Engine 5 with Lumen + Nanite, the 6750 XT is faster at RT too; watch Daniel Owen's "3060Ti vs 6700XT" video.

    • @arenzricodexd4409
      @arenzricodexd4409 ปีที่แล้ว +1

      The 6700 XT uses more power than the 3060 Ti. Performance-wise it depends: generally the 6700 XT is faster, but the gap isn't big unless the game is very biased towards AMD's architecture. The 3060 Ti and 3070 differ by around 10%, and the 6700 XT usually sits in the middle between the two.

    • @-Ice_Cold-
      @-Ice_Cold- ปีที่แล้ว

      @@arenzricodexd4409 12GB of VRAM is a benefit.

    • @arenzricodexd4409
      @arenzricodexd4409 ปีที่แล้ว

      @@-Ice_Cold- Yes. But VRAM has now become a marketing thing. Years ago people complained when Crysis 2 ended up using heavy tessellation even on flat surfaces. Now textures that could be done well under 8GB even at 4K are not being optimized, so a bajillion gigabytes of them are needed.

    • @-Ice_Cold-
      @-Ice_Cold- ปีที่แล้ว

      @@arenzricodexd4409 This is no more a marketing thing than RT. Doom Eternal will not let you use maximum settings if you have less than 8GB of VRAM. Halo Infinite will not run on a GPU with less than 4GB. And now The Last of Us
      requires a ridiculous amount of VRAM; 4GB GPUs can't even manage the minimum settings.

  • @swifty1969
    @swifty1969 ปีที่แล้ว

    I bought the exact same card used for $556. Are you telling me that the RE4 remake will crash due to its insufficient 10GB of VRAM?

  • @danmar007
    @danmar007 7 หลายเดือนก่อน

    Every graphics card I’ve bought since 1989 was an ATI/AMD except for the first and last ones. The first was a Matrox and the last was an Nvidia.

  • @justhitreset858
    @justhitreset858 ปีที่แล้ว +1

    I've had a lot of GPUs over the years, and in general Nvidia's drivers are more stable. That's not to say AMD's are bad, just not as consistent, and they tend to have more destructive driver issues that gain a lot of attention.
    AMD absolutely nails drivers the second time around though, like with the RX 500 series and R9/R7 300 series: refreshes with various improvements on the same architecture. The only AMD GPU family I had zero issues with was actually Vega in their APUs; again, Vega had been around for a while at that point, so AMD had put more development time into the drivers.
    AMD seems to lack the ability to keep working on drivers for non-current products, or at least tends to severely reduce the resources put into them.

  • @MaycroftCholmsky
    @MaycroftCholmsky ปีที่แล้ว +1

    The thing with AMD GPUs is that their biggest edge is monetary value. They might not be the best of the best in terms of raw power, performance in work tasks or the accompanying technologies, but they strike the optimal balance between performance and cost, especially in the lower-end segment. Look at the 6600 vs the 3050, the 6700 vs the 3060, or the 6800 vs the 3070, and the AMD cards are the clear winners for their cost. The 6700 XT, 6800 XT and, to a lesser extent, the 6900 are also great cards that pack quite a punch, and the extra VRAM lets you run at higher resolution and go for higher settings overall, while Nvidia only starts going over 8GB of VRAM in the 3080 price segment.
    But beyond that, of course, the technological edge and flagship GPU performance are currently championed by Nvidia. They make great cards, but only if you have the money or need them for specific tasks optimized for Nvidia hardware.

  • @Legendxtoli
    @Legendxtoli ปีที่แล้ว +1

    I went for the 4070 after having the GTX 1080 for 5 years. I do creative work, so most of my software relies on NVIDIA's CUDA cores. Ngl, I think I made the right choice for my budget. AMD cards are great for gaming, but isn't that just it? Just great for gaming?

  • @Deeps__
    @Deeps__ ปีที่แล้ว +1

    See, I did that and took the leap to a 5700 XT Nitro+. I then had the black screen issues and spent the majority of my time following fixes people posted; none resolved it, so I sent it back for a 2070 Super.

    • @arfianwismiga5912
      @arfianwismiga5912 ปีที่แล้ว

      That's what frightened me the most; I ended up upgrading to an RTX 3060 Ti.

  • @LaTrancheDePain
    @LaTrancheDePain ปีที่แล้ว

    The only reason people crash in the RE4 remake is that they don't know how to configure their graphics settings; it runs perfectly fine on a 1070.