The brand loyalty phenomenon is interesting. It's like sports or national pride: you pick a brand and follow it to the end, even when presented with arguments against it, which you try to explain away while defending your brand of choice. It truly is interesting what human psychology can do.
I go for price. I have a 4070 Ti because someone on Marketplace wanted to swap for a lower card plus some cash, so I traded my 3060 Ti and £200 for the 4070 Ti.
An overlooked part of it is the effect of retailers and retail employees. I sent my cousin in to buy an AMD card specifically for best value, and the Micro Center worker ended up convincing him to get a WORSE Nvidia card for MORE money. I had forgotten to warn him that commission often forces them to act just like car salesmen, and that they aren't necessarily all experts anyway. When retail workers have brand loyalties, it impacts sales, especially to newcomers.
@@acid3129 Same here. I always check benchmarks and reviews and then factor in price. Been doing that since the late '90s when I bought my first ever GPU. I have no clue why anyone WOULDN'T do that, but some blindly follow a company no matter what, even buying massively inferior cards while doing so. The last two GeForce cards I owned were a GTX 970 (cos it actually won vs the R9 290, but then a week later AMD dropped the price by 30%, DOH!) and the GTX 980 Ti, which a friend sold me for £250 and I couldn't resist. Ever since then, every card has been AMD cos they simply smashed Nvidia at my price point every single time. And with the 7800 XT, it looks like I'm going that way again.
I upgraded from a 3080 10GB to the 7900 XTX and I have been supremely pleased. This is the first time I've ever been able to absolutely crank texture settings at 4K without losing a high refresh experience. Nvidia's new tech does look cool, but I have not regretted my purchase and I hope you don't regret yours either!
Same thing here. Pretty happy with mine. I'm excited to see what AMD will bring against the 50 series. Rumors are the 5090 will be a huge generational leap.
@@FantasticKruH It's just a stock model shroud made by PowerColor. Somewhat unassuming, but I dig the look. I got fairly lucky in the silicon lottery, as I can undervolt a good amount while cranking the memory frequency and the power limit. The hottest it ever gets is the low 70s.
Same. Replaced my 1070 with an RX 6800 earlier this year. I just couldn't bring myself to buy a 70-class GeForce with the SAME VRAM as two gens ago! The last Radeon I owned was an R9 270X back in 2013, lol.
I've had Nvidia cards for the last 10 years. I decided that enough was enough and tried AMD. I bought a brand new 6700 XT for £276 (it actually cost me £230 because it was bought through a company, which took off the VAT). I really can't fault the card much; the only issue is that idle power consumption is high at 34 watts, compared with my 1070, which idled at 10 watts.
Also upgraded from Nvidia to an RX 6800, though shortly after release. We didn’t get hit as hard with pricing during the crisis in NZ. I had no issue playing games like Hogwarts Legacy or The Last of Us at release, and my only real “sacrifice” was having to turn off RT. I also dual boot in Linux, so AMD made that a lot easier. I was convinced enough that I upgraded to a 7900 XTX after getting a higher res screen.
This might be the most criminally underrated tech review channel on YouTube, if not the entire internet. From the intro and outro to the quality of your graphs and the depth of your research, the channel feels like ten-million-sub quality.
I wouldn't go that far, but it is good content for sure. I feel as if he is selling emotion while claiming not to have any himself. I like the videos, but they could be less about what could be and more about what is, what matters, and the reality that RTX is a gimmick. BECAUSE RTX IS A GIMMICK.
What CPU did you pair it with? I'm looking to get a new PC with an AMD GPU (a 7800; the 7900 XTX is way too much for my needs) to replace my i7-7700K with a GTX 1060. And btw, what's your distro of choice?
I switched to a Linux-based operating system, and the drivers for my GTX 1080 gave me grief on numerous occasions. So when I built my new PC I grabbed a 7900 XTX, and it's been beautifully seamless thanks to AMD's open-source drivers.
That's exactly my experience, with a 1080 too. Haven't gotten to the "buy a Radeon" part yet, though. I'm very sad to say Nvidia has ruined my experience with Linux. X11 with Nvidia sucks on things like KDE Plasma, Wayland sometimes works and sometimes doesn't, and video decoding in browsers is completely unsupported, unlike with Intel. It made me switch back to Windows after 2 years as a rampant Arch Linux fanboy (btw). Feels like it killed a hobby for me. On the other side, though, I can't deny that if I could get an RTX card, I would, despite my experiences with Nvidia on Linux. So it's kind of my own fault as well.
@@Noodles1922 If I were a developer putting my effort into a technology (DLSS) that not only works on just one of the three card brands, but whose versions won't even all work on those cards depending on how old they are, I think I'd be more inclined to use something more open with better coverage. AMD's and Intel's technologies work on all brands, no matter how new or old. Considering both the Xbox and PlayStation are AMD as well, it makes even more sense not to cater to that nonsense. Support will come eventually for Nvidia hardware, just not on day one.
I bought a 6950XT after I sold my 4080 for a minimal loss. The 6950XT does the job for me, but I do miss DLSS since FSR can be a shimmering mess at times. Otherwise, the AMD card is fine.
@@Tnasucks2 It's just a waste of money. I bought an RX 7900 XTX for 980, realised I didn't need it, so I sold it for 965 and bought a second-hand RX 6800 for 350.
@@leroyjenkins0736 Comments like yours sum up this generation: both Nvidia and AMD are asking far too much money for most people at the top end. The sad thing is that Ada Lovelace is Nvidia's best architecture in terms of gen-on-gen improvements going back to Tesla and the G80, but they've made it unaffordable to most this time around. I would disagree about it being a waste of money only insofar as more games are going to need higher system requirements now that cross-gen is done. Not to mention Nanite and Lumen in UE5, which are going to destroy a lot of GPUs when they become commonplace 😂.
I switched to AMD once, after being an Nvidia user all my life. Within three days, my dog died. It hung itself in shame. Then both my parents died in a car crash. After the funeral, the bank repossessed the family home, and I was about to live on the street. But just before I was evicted, the AMD card exploded, destroying everything for 11 blocks and killing hundreds of people. GeForce is better. Buy a 4060 Ti. In fact, buy 10 of 'em. You'll save a fortune! -Jensen
I used Nvidia from 2001-2012, then switched to AMD from 2012-2023, and in May this year I got an Nvidia GPU again, though an "older" one, so this year or next I'll get a new one. I considered waiting for the 5070 to come out next year, but if I find a 4080 at a fair price I'll grab it. I want to play three games: Starfield, Cyberpunk 2077, and RDR2, and only RDR2 is ready. CP2077 is still being updated and patched, and Starfield won't really be playable until around 2025, so I can wait. For now I'm getting ready to play NWN2 from 2007 and then PoE, so I have games to keep me going until I get a new GPU :)))
I've been on an RX 580 since before the video card crisis started and have been very pleased. I only miss team green's more robust NVENC encoders for streaming, but AMD has made me a believer and I'm still holding out to pick up a new decent "5-year AMD card." Very glad to hear that the DaVinci Resolve support is only getting better!
I had no NEED to upgrade from my 6700 XT to a 7900 XTX, but I did. The 6700 XT is a beautiful tank. I think the upper-tier 6000 series cards are the new Polaris, for lack of a better term: they will be good for years of solid 1440p gaming, as the 12GB of VRAM is excellent at the price and AMD keeps improving FSR. A lovely grab these days for around $300 new. @@redpill2634
At around 9:30: the 7900 XTX isn't around 8 frames anymore. I have everything maxed and it averages 41 frames. I think with FSR 3 it'll hit the 60 fps mark for sure.
I switched to AMD Radeon just this weekend, for the first time in my life. All of Nvidia's current offerings are either underwhelming or massively overpriced, or both. AMD's new RX 7800 XT ticked all of the boxes for me: good performance, decent price (compared to the alternatives anyway), plenty of VRAM going forward, and RT performance is not *that* far behind. Sure I miss out on DLSS, but it feels to me that DLSS is now being used mostly as a tactic from Nvidia to prey on people's FOMO, to get them trapped in vendor lock-in, and to cover up the fact that their new GPUs are under-performing. Nvidia are becoming an increasingly scummy company and I don't feel like rewarding their behavior at all.
I don't think they're using DLSS as FOMO; I think they genuinely believe DLSS is the way forward. The cards in the 40-series lineup heavily incorporate hardware features for it, and they really only reach good performance levels (with ray tracing) with DLSS turned on. I think they might be on to something. AI generation is the future and will make higher frame rates much more accessible, but people are going to have to accept that with frame generation turned off their cards won't be as impressive.
AMD cards are over-marketed. That's the problem right now. They are great cards, don't get me wrong. But having owned both kinda lets you come to your own conclusions, man.
I bought myself an RX 6950 XT about a month ago. It was brand new on Newegg for a bit under $600 after taxes and shipping (plus it came with Starfield). I gotta say, other than a few minor gripes (which are mostly about RT performance), I love the card and it's been serving me very well. It was a huge upgrade in performance, stability, and efficiency over the A770 I was previously burdening myself with. Hopefully you find yourself as happy with your choice as I was with mine. Cheers!
I don't think you'll regret it. I've been using AMD since the HD 5870, and outside of a period when the 5700 XT had broken drivers and was black-screening a lot, it's been great across all models. And I'd assume that by the time developers are bold enough to fully transition to ray-traced-only games, RT will be something even budget GPUs do well.
Dunno why Iceberg said that RDNA3 has RT performance similar to the RTX 3000 series, when the 7900 XT is faster than the 4070 Ti in many games using RT. Similar with the 7900 XTX and the 4080. Of course Radeon GPUs will lose in games like CP77 (one of the few really good-looking RT games) because they are made for Nvidia. But look at what happens in games like Metro Exodus, Resident Evil, Dead Space, and even Control, where Nvidia did a lot to make RT work best on their GPUs, and somehow AMD GPUs still handle ray tracing very well.
@@MacTavish9619 The 4070 Ti losing to the 7900 XT is karma for being stingy with VRAM (for lols, the 3070 gets beaten by the 6800). The 7900 XTX is about a 3090 in RT, and the XT is about as fast as a 3080 from what I see. So not bad, but not a match for top Lovelace: the top Radeon card is about 25% down on the 4080 and 60% down on the 4090.
My first GPU was a Voodoo card (twin GPUs!). After that it was all Nvidia, all the way up to the RTX 2080 Ti, which was the first card that disappointed me. But between the absurd price of the 4090 and the loss of EVGA as a board partner, I decided to finally give AMD a go and went for an RX 7900 XTX (I game in 4K). So far I've been very happy with it. If Nvidia hadn't so grossly overcharged for the 4090 and screwed over EVGA, I might never have changed over.
Are you the one person who bought a 5900 GTX? Lol. The 8800 GTX was woeful too. There have been some truly awful Nvidia cards over the years, but people are brainwashed.
I had a Red Devil 6900 XT and loved it. It is now in my friend's PC after I had it for 2 years and upgraded to a 4090 (AMD didn't make anything to compete with it, and it is amazing for DCS VR). I do miss AMD's drivers, though. ReLive and the tuning built into them were awesome.
The 7900 XTX rivals the 4090 in raw rendering performance, without DLSS or ray tracing on, for half the price. It beats or matches the 4090's fps in every game except ones that rely heavily on DLSS.
@@sankaplays3098 No, it doesn't. It is a bit better than the 4080. I play a lot of Halo Infinite and the performance lines up with the 4080. Also, VR is way better on the 4090; that is why I have it. Halo Infinite is usually 144+ fps at 4K. It doesn't have DLSS and I have it rendering native 4K for my Samsung G70A.
Congratulations on the upgrade, man! I hope it works out for you. When I built my rig I went with an RX 6800 myself and a Ryzen 7 7700X. Stellar little system so far.
The biggest problem for a lot of people who switch between AMD and Nvidia is that they don't properly uninstall the old drivers, so when the switch happens they get all kinds of driver and performance issues.
Last time I had both sets of drivers installed nothing happened. Somehow they played nice together... I completely forgot I did that and it stayed like that for a couple of years. Then I realized and uninstalled the Nvidia drivers.
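For the Windows side, running DDU in safe mode is the usual advice, but you can at least audit which driver packages are still registered by parsing `pnputil /enum-drivers` output. A rough sketch; the field names follow modern Windows 10/11 `pnputil` output, and the sample record below is made up:

```python
def leftover_display_drivers(pnputil_output: str) -> list[str]:
    """Return published names (oemNN.inf) of display-class driver
    packages found in `pnputil /enum-drivers` style output."""
    found, record = [], {}
    # Records are "Key: value" lines separated by blank lines.
    for line in pnputil_output.splitlines() + [""]:
        if ":" in line:
            key, _, value = line.partition(":")
            record[key.strip()] = value.strip()
        elif record:  # blank line closes the current record
            if record.get("Class Name") == "Display adapters":
                found.append(record.get("Published Name", "?"))
            record = {}
    return found

# Fabricated sample: one GPU driver package, one audio package.
sample = (
    "Published Name:     oem42.inf\n"
    "Provider Name:      NVIDIA\n"
    "Class Name:         Display adapters\n"
    "\n"
    "Published Name:     oem7.inf\n"
    "Provider Name:      Realtek\n"
    "Class Name:         Sound, video and game controllers\n"
)
print(leftover_display_drivers(sample))  # ['oem42.inf']
```

Anything listed could then be removed with `pnputil /delete-driver oemNN.inf` before installing the other vendor's package, which is roughly the part of the job that DDU automates (along with registry and service cleanup).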
And RTX cards run better on Intel CPUs, yet a lot of people get an AMD CPU and an RTX card, then moan that they're getting lower FPS than they would with an AMD CPU and an AMD card.
Funnily enough, my dad is doing the opposite. He's been on AMD for a while now, and he's apparently sick of driver issues. I've no real reference myself; my only experience with AMD GPUs is benchmarking the odd collector's piece now and then, like a 295X2 and a Radeon VII.
Tbh, I switched away from Intel when they stagnated too hard while way ahead of AMD, and I turned away from Nvidia, once the new standard form-factor cards with coolers became common, due to horrid pricing and driver issues. Now AMD runs both my CPUs and GPUs. They make amazing software, and their components work together to squeeze out an extra 10~25% performance. The "FineWine" effect means drivers keep making the GPUs perform better: a Vega GPU that ran Space Engineers at 80~90fps at 1080p on purchase runs it at 140fps today.

The driver rumors are very old, dating from a generation when Nvidia also had bad drivers. Nvidia fixed most of their driver issues when they switched to the current form factor, since those cards were basically just powerhouses of raw performance with no other tech in them. But AMD has had an extra three generations to fix their driver problems and build extra tech and software around their CPUs and GPUs. Now it's Nvidia I see having driver issues all the time, trying to keep all their third-party stuff up to date, while almost everything AMD ships is made in-house. Not to mention better prices. The encoders are now on par with Nvidia's, and the CPUs are so much better now that they match or outperform Intel in the same bracket.

The only areas where I see Nvidia and Intel pulling ahead are their flagship extreme parts, the ones that cost at least $2000 each: raw power. But raw power isn't what's needed right now. We don't need the next five generations of CPU or GPU power; gaming is still trying to catch up to THIS generation, and it's not doing so well. Starfield is the FIRST ever game that REQUIRES an SSD, and SSDs are 15-year-old tech.
Great vid. I recently switched to AMD as the price-to-performance ratio was so much better than Nvidia's, but what I wasn't expecting was how much better the AMD software was for tuning and displaying on-screen metrics. A welcome plus.
That's so true. I went the other way: I've used Radeon since the HD 4870 but recently inherited an Nvidia card. I was looking for the settings and saw that the control panel looked like Windows 95 and instantly thought "Oh, that can't be it," but yes... yes it is. 🤣
I thought that at first when I bought a 6800 XT around 9-10 months ago. But since then I've realised that, just like GeForce Experience/Control Panel on my old 2070 Super, it's only on very rare occasions that I actually have to go into Adrenalin to tweak anything anyway. It's great that it's there, and it's far better looking and easier to navigate than Nvidia's dated offering, but it's barely ever used, in the same vein as "App Centre" for my mobo or "Ryzen Master" for my CPU. Given the choice, I would switch to Nvidia's Control Panel & DLSS (which I sorely miss) over Adrenalin's better UI & FSR any day of the week. Price to performance was the only reason I reluctantly chose AMD over Nvidia in the first place, TBH. BTW, I used to use DLSS fairly regularly on my 2070 Super, but I find FSR either a poor-quality or downright unusable alternative to it. RT, on the other hand, was/is barely usable on either GPU.
@@quarkidee2878 Everything is still there, possibly with more options TBH; it's just terrible to navigate and looks dated. You are right on your plus point, though: it's nowhere near as heavy on resources as Adrenalin is.
I upgraded my 3060 Ti to an RX 6800 and it's been fantastic, actually maxing out games at 1440p without having to worry about stuttering when it runs out of VRAM.
I've had a 4090 since December, but a few days ago I picked up an open box 7900 XTX for my second machine, and I will say it's been flawless. It's clear AMD has cleaned up some issues over the past several months because everything I've run on it has been great and I have 0 complaints about the card. I'm actually more intrigued by the 7900 than I am with the 4090 because everyone already knows the 4090 is king, but I like how the 7900 punches above its weight in some games, and some people talk about "AMD driver issues" where I've experienced none. Also, it was $828 vs $1699 I paid for the 4090, so there's that. Overall, I haven't been this impressed with an AMD card since the HD7950 like 11 years ago.
My only issue with the 7900XTX was the idle power draw. With two monitors and a high refresh rate it sucked 140W from the wall, constantly. I waited a few months but the drivers did not improve. Sold it and went for a 4090 and now it draws 20-30W when idle.
@@Teh-Penguin - Yeah I heard about the multi-monitor idle power draw issue with the XTX. Can't blame you for getting rid of it, I would've too under those circumstances.
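For anyone wanting to verify this kind of idle draw themselves on Linux, the amdgpu driver exposes the board power sensor in sysfs. A small sketch; the hwmon index varies per machine, and the conversion assumes the kernel hwmon convention of reporting `power1_average` in microwatts:

```python
import glob

def microwatts_to_watts(raw: str) -> float:
    """sysfs power sensors report microwatts as a plain integer string."""
    return int(raw.strip()) / 1_000_000

def read_gpu_power(card: str = "card0") -> float:
    """Read the average GPU board power from the card's hwmon node
    (amdgpu only; raises if no sensor is present)."""
    paths = glob.glob(
        f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_average"
    )
    if not paths:
        raise FileNotFoundError("no power1_average sensor found")
    with open(paths[0]) as f:
        return microwatts_to_watts(f.read())

# e.g. a raw reading of "140000000" corresponds to the 140 W idle
# draw mentioned above:
print(microwatts_to_watts("140000000"))  # 140.0
```

Polling this in a loop while plugging in a second high-refresh monitor makes the multi-monitor idle issue easy to confirm without a wall meter.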
I had no issues with the 7900XTX on a brand new rig....until I updated drivers a few times without DDU, and issues arose. Never had those problems before on previous gen AMD or Nvidia cards.
I spent over a decade with Nvidia GPUs (which were good, no complaints) and in late 2023 bought AMD for the first time. I was somewhat worried at first because of what I heard and read online, but... it works, performance-wise it's fine, and the Adrenalin software is decent. So... yay.
Gahaha I love the little text Easter eggs you put into the vids. “One of the Steves” at 7:45 followed up by “The Other Steve” at 7:59 got a pretty good air-out-of-nose outta me. Keep up the good work, Iceberg
Honestly, the 7800 XT is really not a bad choice here: it uses ~60W less power, it's new, smaller in size, and gives almost the same performance with all the new features. Idk, maybe it's just my view on it.
It's also more expensive than a used 6900 XT. Performance-wise it's fairly close, but it's generally closer to the 6800 XT, a card that's also cheaper on the second-hand market.
@@sosukelele But... that's a used 3-year-old card. The difference between a 6900 XT / 6800 XT / 7800 XT is not that big at all. The extra $100 or so for a brand new card is well worth it, imho.
I think it's all about the price difference between the 7800 XT and the 6800 XT. At similar prices the former is a no-brainer, but if the 6800 XT is like $100 cheaper, then you are better off buying the 6800 XT.
This is my first time seeing any of your videos, and I gotta say, this was great. Your realistic approach to all the gimmicks card manufacturers pull to try to make their stuff seem better was refreshing. I've been running a 6900 XT for most of 2023, using it as my living room 4K TV GPU, and I absolutely love it. I'm glad to see someone giving it its due. Subscribed.
Your video has struck something in me more than any other tech YouTuber's. It wrenches my heart not seeing you more popular with this level of editing quality and presentation. You are quite literally the MatPat of tech.
I can get a 6800 XT for 300 dollars on Facebook Marketplace, and I've seen a 7900 XTX for 700 when they were 800 new this summer. If they drop to that or lower on Black Friday or Cyber Monday, I'll buy.
After owning an RTX 3060 Ti and a 3070, I've realized most of the time I didn't really need to use the RTX features, which made me realize how much value AMD can give on price to performance (that's why I'm saving for a 7900 XT instead of a 4080; the prices here in the PH are outrageous).
If you had the Gaming OC RTX 3070, then we had the same GPU 😅 I however chose to take a different path: I purchased an ASRock Phantom Gaming 7800 XT as a replacement on Saturday. It was one of 3 options: that card for $530 USD, a 6800 XT for $500 USD, or a used RTX 3080 12GB for $550 USD. It came down to future driver support for me, with the inclusion of Starfield as an added bonus. All in all, I would say the 7800 XT has been a worthwhile upgrade so far, and I look forward to driver development for the card, with the general understanding that it will be supported longer than the other 2 options.
@@newax_productions2069 Not even close to true. I've owned a 580 and a 5700 XT and never had driver issues; then I got the 2070 Super and had nothing but driver issues, and got the 4070 and had frequent driver issues. It's more like Nvidia's drivers have gotten worse: the Dead Space remake was unplayable on the 4000-series GPUs and still is, even with the latest drivers.
Unfortunately, for people like me from central Europe, the slight upcharge on Nvidia is worth it because of the lower power consumption, since our energy costs have skyrocketed in the last few years.
I don't have any brand loyalty, but I think it's simply inarguable right now that Nvidia has the better product and has had for a while. If you care about ray tracing (and you should, if you're interested in pushing modern features in modern games like Cyberpunk, Control, Spider-Man, Fortnite, Alan Wake 2, Ratchet and Clank, etc.), then that literally leaves only one manufacturer producing cards capable of running them well. I'm not happy with that reality, but that is the case. Compounding this are features like DLSS and FG, which help push the feature envelope further and have become the de facto standard (see: inclusion in base standards, plus AMD's rush to haphazardly roll out pseudo-responses). So, if one brand is setting the global standards and driving feature adoption in the latest titles, most users are rightfully looking at the market and concluding the other one is not an option. Now, RT isn't great on lower midrange cards, but from the 4070 up, you can ace RT and get good path tracing performance, which has been a game changer in Cyberpunk and Alan Wake 2. I would say if you're in the lower-midrange tier, where RT and PT are off the table, AMD makes a potential purchase. But then you get back into DLSS and FG and the argument kind of falls apart again. And this is before the many, many, many quirks of the Radeon software stack, like incompatibility with some monitors. It's just too much for most consumers to bother with.
I bought a 4070 back in May, upgraded from a 1660 Ti. This 4070 is my first experience with a high-end, ray-tracing-capable GPU. I admit I had a good bit of fun enabling RT in games that support it, just for the "wow" factor: Doom Eternal, for example, or Quake II RTX and Portal RTX. I have two things to say about my experience with ray tracing. First, now that I'm used to the look of ray-traced lighting, shadows, and reflections, it's difficult to go back; when I turn off ray tracing in Doom Eternal, all the missing reflections become very noticeable, and it's a bit jarring. However, my second point: at this point in time, until DLSS 3.5 is released, ray tracing is extremely 'noisy'. I actually thought there was an issue with my PC or GPU, because Quake II RTX and Portal RTX have very noisy ray tracing; when light reflects off a metal surface, it doesn't look good. Anyone who's had personal experience with ray tracing knows what I'm talking about. So this is another reason ray tracing is not all that useful right now, until DLSS 3.5, which massively improves denoising quality, is implemented in games. For now, what's the point in using ray tracing if it often just looks *bad*?
I get your point, but I have a 3070 Ti and I see no difference between RTX on and off. I played the entire Cyberpunk storyline with RTX off (on a 5700 XT), and when I got the 3070 Ti I was disappointed to see no difference between rasterized and ray-traced, lol... might just be me, though. I never played the games you mentioned, so it might be just my case, though Nvidia markets Cyberpunk as their go-to ray-traced experience.
I honestly don't care about brand; I care more about price to performance. I switched from a 1050 Ti to an RX 6700 XT this summer, and I have 0 issues with it and, most importantly, am very satisfied with it. Also switched from Intel to AMD Ryzen.
Hey, welcome to Team Red! There are a few things we wanted to make a bit clearer here. 1. AMD has separate ray tracing units, and the patent doesn't seem to cover the way things work 100%. There is some load applied to the shader units (SMs) rather than the RT units, but it is purely connected to BVH calculations and has nothing to do with the rays-per-triangle calculations. We've talked about that in our video on how ray tracing differs between vendors, if you're interested. 2. While it is true that AMD tends to be slower in RT, this is not the case 100% of the time. Obviously there are cases where VRAM plays its role, since RT is VRAM-demanding as well, but there are also cases where rays-per-triangle calculations become more important than rays-per-BVH-box, and since AMD usually has a comparable number of CUs at higher clocks, their rays-per-triangle rate is faster. That is also explained in our video about RT. 3. Don't hurt the baby Series S!!! P.S. Team Red currently sucks as well with the same 8GB cards :(
I have been on AMD since 2017 but bought a laptop late last year that had a (mobile) 3070. Honestly, losing "Chill" and AMD's software suite has been the biggest nuisance and it's a shame that my favorite feature (Chill) is almost never mentioned in any GPU review.
I wasn't upgrading but buying a new PC instead, so it's less relevant, but my thought process was essentially the same as yours. Being given (by myself, lol) a higher budget, I went with a 7900 XT and a 1440p monitor: slightly longer future-proofing in terms of VRAM amount and bus width, plus a tiny bit of extra performance in heavy RT titles for when I want to play something like the full path-traced mode of AW2 when it comes out (probably at 1080p with upscaling, lol).
Personally speaking I'd love to see your video essays! I also started out with an R9 but mine was a 280X (used ex-mining card; my cats really loved the fact it made my PC double as a space heater lol). When it started artifacting in certain games, I got a GTX 1060 6GB. I'm currently on a RTX 2060 Super but I've always been RTX agnostic at best... maybe because I have an RTX 2060 Super lol but also nothing I really go for gaming-wise has had RTX added yet, so I can't yet render (cough) the enthusiasm. I still play at 1080p (I have a 1440p monitor, it's 75hz and I use it for work) and the games I play the most are relatively undemanding (read: old) so I'm okay for now... my very vague non-specific not at all subject to change plan is to upgrade my GPU before GTA6 so I hopefully I can wring another year or two out of the old girl yet lol
The R9 280X is a rebranded 7970 GHz Edition with a more mature BIOS and driver support. Interestingly, it was already ancient (in GPU terms) when Maxwell was established, yet it totally destroyed the GTX 960. Now sure, the GTX 960 was a heavily gimped card (disgusting specs, no SLI support, massively weak compared to the 970), but the R9 280X (decent Sapphire models) was selling for £170 at the time the GTX 960 came to market at £240. Imagine the 3-year-old 7970 GHz beating the brand spanking new 960 by around 30%, despite costing 30% less. I had a few 280Xs; only one ended up artifacting, but I never took care of that one and it was heavily used for many years. GREAT cards. Absolutely untouchable in value by the end of their shelf time.
I believe it's not always about brand loyalty with content creators, but rather pure absolute performance. Since tech is most of their life, and since, as creators, especially the big ones, spending the extra money on the best component is a no-brainer for them (not to mention some even get things for free). I myself have been with team green ever since the 8800 GTX era. Life took me away from tech for a couple of years, and I returned to the scene with a 780 Ti, then a 980 Ti, and finally a 1080 Ti. The latter was my main card throughout the early RTX era, since the GPU scene in general slowed down in generational improvements and prices got stupidly high for high-end parts. I recently upgraded and built yet another new PC with an RTX 4090; sure, they are even more ridiculous in pricing, but my 1080 Ti was starting to struggle with modern games. As you might have noticed, I only buy the best card of each generation, and Nvidia is still the one on top, so my choice is very simple. Why am I telling you guys this? Well, since I only purchase the best card on the market, I would like to think a lot of people, especially creators, do the same. Sure, there are brand loyalists and fanboys, but one can't deny Nvidia has been the best for a very long time, and being the best comes with its own special perks within the enthusiast community.
Upgraded from Intel UHD 630 to a Radeon HD 6370D, to Vega 3, and currently a 5700 XT. No issues with AMD GPUs. The HD 6370D outperformed the UHD 630 despite being on an E2-3200 from 2011. My experience with AMD has been bug-free and great value for how long they have lasted.
It’s great to hear from another POV-RAY user from the 90s. That was my first raytracing experience too. :) As far as AMD in DaVinci Resolve, I’ve been benchmarking a lot of GPUs in the last week or two in both Resolve and Premiere Pro and found that in some cases for multi-layer editing it can make good use of multiple cards, spreading the FX processing across multiple cards. I have not had a chance to test the Arc A770 + Radeon combo yet (waiting on another PSU cable because my used one came with too few) but it might be an option for getting similar decode performance to the UHD770 Intel iGPU for decoding. In the meantime, let me know if there is any specific testing with a Radeon + iGPU vs Radeon alone that you would like me to do. I’d even be happy to process any example files if there are specific examples that are posing a problem.
Don't. Don't miss it. I currently have an RX 570 and am about to upgrade to an RX 6700 XT to get out of this hellhole of a card. It has so many problems, mainly the drivers. The hardware isn't bad, but running anything on it is. I get random driver crashes that just cut my monitor output, sometimes until a hard restart of my computer, though sometimes it comes back automatically. It even has poor performance for what it is. On top of this, I have problems running certain games due to the absolutely garbage driver updates it gets. As an example, last summer it got a driver update that broke something in the GPU software and would cause your game to crash if it tried to use it. It was such a specific error: the ONLY thing I could find that wasn't working was Minecraft Forge for 1.20.1. Every other version of the same mod loader worked, every other mod loader worked, and even the vanilla game worked. I couldn't find any other GPU with the same issue, just the RX 570. But of course, in true YouTube fashion, I found an Indian YouTube tutorial that explained the issue and how to fix it, which was to roll back to a previous driver.
I'm still on my RX 570, been 6 years now, and I only had one problem once, but it wasn't the GPU; it was actually a defective RAM module that was causing Windows to crash. Other than that, 100% fine. I'm planning to upgrade to an RTX card at some point. A friend of mine has both AMD and Nvidia, and according to him both had problems; there is not a single brand free from issues.
The only reason I use Nvidia GPUs isn't brand loyalty, it's that they get better optimization than AMD in most games. If that changes and they both get the same optimization, then I'm going straight for the AMD cards.
I upgraded to a 6900 XT early this year (got a sweet deal!) from an RX 570. I later realized that there isn't much of a difference as you go up in quality, but I don't regret it, as I know this will last me longer than the RX 570 did.
I made the switch when the 6900 XT came out, upgrading from a 1080 Ti. Ray tracing wasn't on my mind at all; it was purely because I game on ultrawide 1440p and the 1080 Ti was starting to struggle at 60fps in newer titles at the time. I then had some regrets over time. Immediately, I noticed that it struggled in VR for some reason. Nothing demanding. Games like Beat Saber would stutter randomly. Hopefully that's fixed now, but I can't test it as I don't have the space for VR at the moment. The other regret came somewhat recently. I decided to get back into rendering and, after setting up a scene and heading to render, immediately remembered that I was used to using Nvidia Iray, which isn't supported on AMD cards. Now a render that would have taken me 1 minute takes 10 on my CPU. I won't be switching back soon thanks to the crazy prices. But I will be going back to team green whenever the money lines up and the latest cards are worth it.
I've flipped a few times. The only issue that makes me wary of going back to AMD these days is the quality of the drivers. I had to return a 6600XT bought for my niece's PC because it would blackscreen on PC start.
I was having similar issues with my 6700 XT, and then I found out the BIOS was not updated (should have checked that first), plus Windows was replacing my drivers for whatever reason. I don't have those issues anymore. I honestly think it's worth giving it a shot again; just make sure a few things are configured properly.
Honestly, you deserve way more audience and recognition! Great videos and commentary 👌 Also the budget part hits well cos I'm always on the poor side and always come looking at budget options 😅
Great card choice. I have no doubt that with something as powerful as the 6900 XT you will itch to play at 4K from here on out. I hope you can upgrade your CPU soon as well, so editing is faster than before in all aspects and you can extract as much performance out of the GPU as possible. I myself got a 6700 XT not too long ago and it runs like a charm.
One of my biggest issues with AMD is their software. Both the actual drivers as well as the user software just seems inferior in multiple ways that make me want to not switch.
It's been a great year for video game releases. Which is ultimately why I decided to upgrade my 2070 super to a 7900 XTX, haven't had any regrets so far.
I bought my 6950 XT just a bit before the launch of the 7900 series and kind of regret it (even though my wallet would've been pretty mad at me for buying one of those), but the card still does everything I need.
I upgraded recently from a 2060 to a 3060 Ti. I considered getting a 6750 XT, but the truth is Nvidia GPUs, mostly due to brand loyalty, are far easier to resell, and reselling your old card is a crucial part of upgrading.
Depends on where you live; on eBay in the US, Radeon cards hold value better than Nvidia does. During the pandemic the 6700 XT was actually cheaper than the 3060 Ti, for example, yet the 3060 Ti sells for $200-220 while the 6700 XT goes for $270-300. I recently got a good deal on a 3070 Ti, upgrading from my 5700 XT, and I kinda miss it. DLSS is so worth it, but I only play at 1080p, so I'm probably just going to sell the 3070 Ti and get a 6800.
I'd work on switching that case ASAP. No mesh front plus no space for the GPU to breathe is a problem for temps. 84°C edge, 100+°C junction, and almost 3000 RPM sounds like a nightmare.
I upgraded from a 1080 Ti to a 7900 XTX, and it was NOT worth it. My Radeon 7900 XTX is so inconsistent and extremely "grainy". Too many AI features that just don't work. The Adrenalin software does seem more intuitive than the Nvidia Control Panel, though. I just want consistently high frames because my eyes are great; my high refresh monitor is only utilized sometimes with this AMD card.
I myself used only Nvidia for years. My experience with them over AMD was positive, plus I loved SLI (RIP). I went from a 2080 Ti to a 7900 XTX and it's a great product. The differences are noticeable for sure, and figuring out how to overcome them is fun. Nice choice, sir.
The one true SLI is 3dfx Scanline Interleave. I had a GTX 295, which is a dual-GPU card with internal SLI, and absolutely hated it; software support was abysmal, where you needed just the right patch level of everything, and it just did not feel good. The frame pacing was atrocious, as was the latency. What do I even care how many frames per second it says it does if it doesn't feel right? I mostly used it in single-GPU mode, and I liked dedicating one of the split GPUs to PhysX; that worked very nicely.
One of us! One of us! I stopped being one of Nvidia's customers when they released the 40-series and started selling DLSS instead of actual hardware. AMD seem to be the least profit-obsessed whilst also delivering a good product.
The only 40 series GPU that sells FG instead of actual hardware is arguably 4060 ti, which is still a big improvement compared to 3060 ti in terms of power draw. Everything else is significantly faster and draws less power.
I switched to AMD and have been having problems with DX11 games and its drivers. Who would have thought AMD drivers would have problems… Anyway, I tried some registry edits and it seems good now.
My only reason not to use Radeon GPUs is CUDA and AMD's underperforming ROCm stack, especially on Linux. Encoding has improved on Radeon in the last year, and rasterization is just excellent.
@@knowingbadger It's not only that; newer Nvidia drivers literally limit my older laptop's 3050 Ti power. Newer ones won't let it draw more than, I think it was, 60W, while the older ones allowed up to 85W.
@@AurasV Well, I feel like that also applies to AMD laptops, to be fair. It just sounds like a way for them to conserve battery life, since it's a laptop.
I ran my GTX 980 from 2014 to 2021, then switched to my RX 6900 XT in 2022 and have used it since. I can confirm, lads, it's definitely worth it: better than Nvidia, with friendlier, easier-to-use software, and a less greedy company.
Nice to see a little more AMD showing. Never owned one but I had an amazing experience with a 6800xt system for a few weeks and the experience overall was butter smooth.
I am using DaVinci Resolve for my son's online physiotherapy, uploading the videos captured on my phone to the therapist. My question is: why, when encoding an MP4 video, does the 5600G have all cores at 100% while the RX 570 barely hits 10% utilization? I'm making videos at 360 x 640, 500 kbps video and 150 kbps audio.
Just my guess, with experience from using a CPU with an iGPU (11th-gen Intel with Iris Xe integrated graphics), but the CPU probably has to work its cores much harder than the GPU would because it's not designed for video encoding and rendering. I noticed that if I use Intel Quick Sync (the iGPU's dedicated encoding hardware), CPU usage suddenly drops from 100% to ~50% while the iGPU is only at ~10-15%. So basically, GPUs are just much more efficient at video processing work?
I am going to get an AIO-cooled 7900 XTX this fall with a 7800X3D so I can use SAM, and with a nice overclock I will have 4090 speeds or faster for less than the price of a 4090. It's crazy that a 7900 XTX + 7800X3D costs less than a 4090 alone, with better performance coming in with every driver update; in the games I play, the AMD GPU is faster anyway. I am currently using a 5800X3D and a 3080 FTW3 Ultra with DDR4 CL16 3466, and I don't think the performance of the upgrade to current gen will be too much at 1440p/1080p, but my nephew needs a new computer, so I am going to build one before I need to so he can get this one :)
I don't know, man. I'm not a fanboy of a brand, but since the GT 240, GTX 650 Ti, and 2060 Super I've never had a problem with Nvidia drivers. I used an AMD HD 6000 once and it gave me headaches, and I'm an IT guy who built his own first computer. Nope, I'm not gonna have that experience again. And I'm a busy 8-5 employee; I'd rather give Nvidia $100 more for less headache.
Built my PC in 2022 with a 3070, then upgraded to a 6800 XT and watercooled it. Love it. I made it pull over 400W in FurMark the other day using MorePowerTool. The only strange issue is that I have to clear my BIOS every time I disconnect the card for some reason, but my mobo BIOS isn't up to date, so maybe that's it, who knows. Minor issue. Building a PC for my partner's brother right now; his build will also be full red. It's just better value and performance for his budget.
I upgraded from an X79 1680 v2 and 980 Ti to a 6700 XT last year; I also got a good deal on an X570 motherboard that was "broken" for $20 and put a 5800X3D in it. No regrets. With a little tweaking, Starfield runs on mostly high settings at about 60fps average, no mods. I'll honestly probably go AMD again, especially if the price to performance is there like it seems to be; Nvidia really has meager offerings the past few gens in the actually affordable price ranges most people can dish out the cash for.
I went from a 1650 super, to a 2070 to a RX 6900 XT and I don't regret a thing. AMD Adrenaline alone is 1000% better than GeForce Experience. Plus, I just don't believe we should expect Nvidia to keep making gaming GPUs with how much they're raking in on AI cards.
Bought my 6900 XT used once prices started coming down from the crypto boom; got an XTXH bin for under 600 USD, and it's been perfectly fine for 4K. It's a bit overkill for the vast majority of games I play, and for the heavier games it's adequate at optimized, but mostly high, settings. It could use a repaste (and maybe some added thermal pads between the PCB and the backplate), but it still runs a stable overclock of just over 2700 MHz. Truly seems like one of the best value cards to get in 2023.
I currently have a GTX 750 Ti. I upgraded my PC and only need a graphics card, but I am undecided between the RTX 4060 Ti, 4070, RX 7700 XT, and 7800 XT; they all vary by about 150€ from each other, ranging from roughly 550€ to 700€ in my country. Any tips on what to choose? I'm currently leaning towards the ASRock Radeon RX 7800 XT Phantom Gaming OC 16GB, as it only costs 630€. I have: CPU: i7-13700K, PSU: Corsair RM850x, Motherboard: MSI PRO Z690-A WIFI DDR4, RAM: 2x8GB G.Skill, Case: Cooler Master TD500 Mesh V2. Tips?
I tried out an RX 6600 as I felt my 1060 was getting a bit old. I ended up with a pretty unstable card that had lag spikes all over the place, would sometimes freeze my PC in certain games, and generally just did not feel stable at all. I returned the card for a 4060 instead and have had none of those issues. Since then it's been extremely hard for me to consider another AMD card like the 7800 XT, despite their prices, because I am very worried it's gonna burn me again :/ Also, I feel like FSR, even on Quality, is far below DLSS in terms of image quality, especially in the background, though that may be different at higher resolutions?
I myself am switching to my first all AMD build. Going with a Ryzen 9 7950x paired with a Sapphire Tech Nitro+ RX 7900 XTX. Coming from an RTX 2070 laptop. This will be my first top of the line build. My best build before this not counting the laptop had an i5 4570, 16GB DDR3 1600MHz HyperX Fury RAM, and a Sapphire Tech RX 470 4GB. A fairly competent 1080p build.
My first rig used a Sapphire RX 5700 AIB card and was plagued by driver issues until after I'd ordered a replacement card, a 3080 Ti. Haven't had a single issue with the 3080 Ti beyond trying to manipulate the PCIe clip on the mobo between the card and the NH-D15 CPU cooler. Performance was outstanding on the 5700 non-XT, but the driver issues were only solved literally years after release.
The brand loyalty phenomenon is interesting. It's like sports or national pride: you pick a brand and follow it to the end, even when presented with arguments against it, which you try to explain away to defend your brand of choice. It truly is interesting what human psychology can do.
Apple:
It's just like arguing about Playstation vs N64 on the playground as a kid.
I go for price. I have a 4070 Ti because someone on Marketplace wanted to swap for a lower card and some cash, so I swapped my 3060 Ti and £200 for the 4070 Ti.
An overlooked part of it is the effect of retailers and retail employees. I sent my cousin in to get him an AMD card specifically for best value, and the Microcenter worker ended up convincing him to get a WORSE Nvidia card for MORE money. I had forgotten to warn him commission often forces them to act just like car salesmen - and that they aren't necessarily all experts anyway.
When retail workers have brand loyalties, it impacts sales especially to newcomers.
@@acid3129 Same here. I always check benchmarks and reviews and then factor in price. Been doing that since the late 90s when I bought my first ever GPU. I have no clue why anyone WOULDN'T do that, but some blindly follow a company no matter what, even buying massively inferior cards while doing so.
The last 2 GeForce cards I owned were a GTX 970 (cos it actually won vs the R9 290 - but then a week later AMD dropped the price by 30%, DOH!) and the GTX 980 Ti, which a friend sold me for £250 and I couldn't resist.
Ever since then, every card has been AMD cos they simply smashed Nvidia at my price point every single time. And with the 7800XT, looks like I'm going that way again.
I just upgraded from my 2060 to an rx 6800 mostly because I wanted more VRAM so I fully understand. It’s been a great card so far!
same situation, went from a 1060 6gb to a 6600, immediately traded that and $100 for a 6800, its been an amazing card so far
I went from an Rx 550 to an rx 6800. Couldn’t be happier
So, from what I've heard, basically a 7800 XT.
@@Order..66 You went from integrated graphics to a powerful GPU 😂
When I tried an rx 580 it ran super hot and loud compared to my 1050ti
i upgraded from a 3080 10Gb to the 7900 xtx and I have been supremely pleased. This is the first time I've ever been able to absolutely crank texture settings at 4k without losing a high refresh experience. Nvidia's new tech does look cool but I have not regretted my purchase and I hope you don't regret yours either!
Same thing here. Pretty happy with mine. I'm excited to see what AMD will bring against the 50 series. Rumors are the 5090 will be a huge generational leap.
Which model did you go with?
@@FantasticKruH it’s just a stock model shroud made by power color. Somewhat unassuming but I dig the look. I got fairly lucky in the lottery as I can undervolt a good amount while cranking the memory frequency and the power limit. Hottest it ever gets is low 70s.
How much was it?
@@matthewzavala6491 I paid about $1000 for mine but that's before shipping and taxes.
Came back to AMD after over a decade. Couldn't be happier
Same. Replaced my 1070 with an RX 6800 earlier this year. I just couldn't bring myself to buy a 70 class Geforce with the SAME VRAM as two gens ago! The last Radeon I owned was an R9-270X back in 2013 lol
I've had Nvidia cards for the last 10 years. I decided that enough was enough and tried AMD. I bought a brand-new 6700 XT for £276 (it actually cost me £230, as it was bought through a company which took off the VAT). I really can't fault the card much; the idle power consumption is high at 34 watts compared with my 1070, which idled at 10 watts.
@@Jimmy-6300 I had a 270X Toxic and that thing punched pretty damn hard!
@@Jimmy-6300 Same here. I replaced my GeForce GTX 1080 2 years ago with an AMD Radeon RX 6800, which is twice as fast and has twice the VRAM.
Also upgraded from Nvidia to an RX 6800, though shortly after release. We didn’t get hit as hard with pricing during the crisis in NZ. I had no issue playing games like Hogwarts Legacy or The Last of Us at release, and my only real “sacrifice” was having to turn off RT. I also dual boot in Linux, so AMD made that a lot easier.
I was convinced enough that I upgraded to a 7900 XTX after getting a higher res screen.
This might be the most criminally underrated tech review channel on YouTube, if not on the whole internet. From the intro and outro to the quality of your graphs and the depth of your research, the channel feels like ten-million-sub quality.
I wouldn't go that far, but it is good content for sure. I feel as if he is selling emotion and then claiming not to have any himself. I like the videos, but they could be less about what could be, and more about what is, what matters, and the reality that RTX is a gimmick. BECAUSE RTX IS A GIMMICK.
@@Machistmo Ray tracing is one of the most overhyped niche technologies I've ever seen.
That's crazy.
Agree
What CPU did you pair it with? I'm looking to get a new PC with an AMD GPU (a 7800; the 7900 XTX is way too much for my needs) to replace my i7-7700K with a GTX 1060. And btw, what's your distro of choice?
I agree. Glad I landed here.
I switched to a linux based operating system, and the drivers for my GTX1080 gave me grief on numerous occasions. So when i built my new pc i grabbed a 7900xtx, and it's been beautifully seamless thanks to AMD's open-source drivers.
That's exactly my experience, with a 1080 too. Haven't gotten to the "buy a Radeon" part yet, though. I'm very sad to say Nvidia has ruined my experience with Linux. X11 and Nvidia suck ass on things like KDE Plasma. Wayland sometimes works, sometimes doesn't. Video decoding in browsers is completely unsupported, unlike with Intel. It made me switch back to Windows after 2 years as a rampant Arch Linux fanboy (btw). Feels like it killed a hobby for me.
On the other side tho, I can't deny that if I could get an RTX card, I would do so, despite my experiences with nvidia on Linux. So it's kind of my own fault as well.
Yeah, Linux runs great with AMD graphics cards. Nvidia never liked Linux, and AMD have always made great drivers for it.
@@Noodles1922 Drivers being open source has nothing to do with game specific features.
@@Noodles1922 If I were a developer putting my effort into a technology (DLSS) that not only works on just one of the three card brands, but whose versions won't even all work on those cards depending on how old they are, I think I'd be more inclined to use something more open with better coverage. AMD's and Intel's technologies work on all brands, no matter how new or old. Considering both the Xbox and PlayStation are AMD as well, it makes even more sense not to cater to that nonsense. Support will come eventually for Nvidia hardware, just not on day one.
@@Noodles1922 Love how everyone jumps on "AMD fucked us in Starfield." Literally proves who's never played a Bethesda game in their life.
I bought a 6950XT after I sold my 4080 for a minimal loss. The 6950XT does the job for me, but I do miss DLSS since FSR can be a shimmering mess at times. Otherwise, the AMD card is fine.
Why? Isn't the RTX 4080 better?
@@Tnasucks2 It's just a waste of money. I bought an RX 7900 XTX for 980, realised I didn't need it, so I sold it for 965 and bought a second-hand RX 6800 for 350.
@@Tnasucks2 Pretty much the same as the other person. Sold it and bought a 6950 XT for £530 after selling the Starfield code.
@@leroyjenkins0736 lucky bastard
A used 6800 non-XT is £400 here.
@@leroyjenkins0736 Comments like yours sum up this generation - both Nvidia and AMD are asking far too much money for most people at the top end. The sad thing is that Ada Lovelace is Nvidia's best architecture in terms of gen-on-gen improvement going back to Tesla and the G80, but they've made it unaffordable to most this time around.
I would disagree about it being a waste of money, though, only in so much as more games are going to have higher system requirements now that cross-gen is done. Not to mention Nanite and Lumen in UE5, which are going to destroy a lot of GPUs when they become commonplace 😂.
I switched to AMD once, after being an Nvidia user all my life. Within three days, my dog died. It hung itself in shame. Then both my parents died in a car crash. After the funeral, the bank repossesed the family home, and I was about to live on the street. But just before I was evicted, the AMD card exploded, destroying everything for 11 blocks and killing hundreds of people.
Geforce is better. Buy a 4060Ti. In fact, buy 10 of em. You'll save a fortune!
-Jensen
Remember, the more you buy, the more you save😎
@@jthedood1605 Exactly!
1. Upscaling is the new Native!
2. Fake Frames are better than REAL FRAMES!
3. AMD cards explode! (protect ur family)
Dramatic as it sounds, a concerning amount of people talk about AMD cards like this. Switching to AMD for me was pretty uneventful.
@@yasu_red Yeah, it is mostly Ngreedia fanboys hating on people for buying smart and thinking with their brains and not their wallets.
I used Nvidia from 2001 to 2012, then switched to AMD from 2012 to 2023, and in May this year I went back to an Nvidia GPU, though an "older" one, so this or next year I'll get a new one. I considered waiting for the 5070 to come out next year, but if I find a 4080 at a fair price I'll grab it. I want to play three games: Starfield, Cyberpunk 2077, and RDR2, and only RDR2 is ready; CP2077 is still being updated and patched, and Starfield will be playable around 2025, so I can wait. For now I'm getting ready to play NWN2 (2007) and later POE, so I have games to keep me going until I get a new GPU :)))
I've been on an RX580 since before the video card crisis started and have been very pleased. I only miss the more robust nvenc encoders from team green for streaming, but AMD has made me a believer and I'm still holding out to pick up a new decent "5yr AMD card." Very glad to hear that the Davinci Resolve support is only getting better!
I honestly feel like the 7800 XT is that 5-year card, but it remains to be seen how well it holds up over the years.
6700xt is aging like a fine wine
@@redpill2634 Right?! Every 6700 variant seems to beat the 7600... It's a hard choice to go with last year's model, but it may be the right one.
@@redpill2634 Definitely, it's what I upgraded to from a 580, and it's been performing really well.
I had no NEED to upgrade from my 6700 XT to a 7900 XTX, but I did. The 6700 XT is a beautiful tank. I think the upper-tier 6000-series cards are the new Polaris, for lack of a better term: they will be good for years of solid 1440p gaming, as the 12GB of VRAM is excellent at the price and AMD keeps improving FSR. A lovely grab these days for around $300 new. @@redpill2634
At around 9:30, the 7900 XTX isn't at around 8 frames anymore. I have everything maxed and it averages 41 frames. I think with FSR 3 it'll hit the 60 fps mark for sure.
I switched to AMD Radeon just this weekend, for the first time in my life. All of Nvidia's current offerings are either underwhelming or massively overpriced, or both. AMD's new RX 7800 XT ticked all of the boxes for me: good performance, decent price (compared to the alternatives anyway), plenty of VRAM going forward, and RT performance is not *that* far behind. Sure I miss out on DLSS, but it feels to me that DLSS is now being used mostly as a tactic from Nvidia to prey on people's FOMO, to get them trapped in vendor lock-in, and to cover up the fact that their new GPUs are under-performing. Nvidia are becoming an increasingly scummy company and I don't feel like rewarding their behavior at all.
Same. Might try undervolting my 7800 XT.
I don't think they're using DLSS as FOMO; I think they genuinely believe DLSS is the way forward. The cards in the 40-series lineup heavily incorporate hardware features for it, and they really only reach good performance levels (with ray tracing) with DLSS turned on. I think they might be on to something. AI generation is the future and will make higher frame rates much more accessible, but people are going to have to accept that with frame generation turned off their cards won't be as impressive.
AMD cards are over-marketed. That's the problem right now. They are great cards, don't get me wrong. But having owned both kinda lets you draw your own conclusions, man.
@moonasha Then they should charge less if the cards themselves aren't impressive without relying on frame generation.
I bought myself an RX 6950 XT about a month ago. it was brand new on Newegg for a bit under $600 after taxes and shipping (plus it came with Starfield).
I gotta say other than a few minor gripes (which are mostly about RT performance) I love the card and it's been serving me very well.
It was a huge upgrade in performance, stability, and efficiency over the A770 I was previously burdening myself with.
Hopefully you find yourself to be as happy with your choice as I was with mine, cheers!
I don't think you'll regret it. Been using AMD since the HD 5870, and outside of a period where the 5700 XT had broken drivers and was black-screening a lot, it's been great across all models. And I'd assume that by the time developers are bold enough to fully transition to ray-traced-only games, RT will be something even budget GPUs do well.
Dunno why Iceberg said that RDNA 3 has RT performance similar to the RTX 3000 series, when the 7900 XT is faster in many RT games than the 4070 Ti. Similar with the 7900 XTX and 4080. Of course Radeon GPUs will lose in games like CP77 (one of the few really good-looking RT games) because they're made for Nvidia. Look at what's happening in games like Metro Exodus, Resident Evil, Dead Space, and even Control, where Nvidia did a lot to make RT work best on their GPUs, and somehow AMD GPUs are working very well with ray tracing.
@@MacTavish9619 The 4070 Ti losing to the 7900 XT is karma for being stingy with VRAM (for lols, the 3070 gets beat by the 6800). The 7900 XTX is about a 3090 in RT, and the XT is about as fast as a 3080 from what I see. So not bad, but not a match for top Lovelace: the top Radeon card is about 25% down on the 4080, and 60% down on the 4090.
My first GPU was a Voodoo card (twin GPUs!). After that it was all Nvidia, all the way up to the RTX 2080 Ti, which was the first card that disappointed me. But between the absurd price of the 4090 and the loss of EVGA as a board partner, I decided to finally give AMD a go and went for an RX 7900 XTX (I game in 4K). So far I've been very happy with it. If Nvidia hadn't so grossly overcharged for the 4090 and screwed over EVGA, I might never have changed over.
You made this 2 months ago; still happy with the 7900? Genuinely curious, as I'm considering making the switch.
Are you the one person who bought a 5900 GTX? Lol. The 8800 GTX was woeful too. There have been some truly awful Nvidia cards over the years, but people are brainwashed.
I had a Red Devil 6900 XT and loved it. It is now in my friend's PC after I had it for 2 years and upgraded to a 4090 (AMD didn't make anything to compete with it, and it is amazing for DCS VR). I do miss AMD's drivers, though. ReLive and the tuning in them were awesome.
The 7900 XTX rivals the 4090 in raw rendering performance, without DLSS/ray tracing on, for half the price; it beats or matches the fps of the 4090 in every game except ones that rely heavily on DLSS.
@@sankaplays3098 No, it doesn't. It is a bit better than the 4080. I play a lot of Halo Infinite and the performance lines up with the 4080. Also, VR is way better on the 4090; that's why I have it. Halo Infinite usually runs at 144+ fps at 4K. It doesn't have DLSS and I have it rendering native 4K for my Samsung G70A.
@@sankaplays3098 CAP
Congratulations on the upgrade, man! I hope it works out for you. When I built my rig I went with an RX 6800 myself and a Ryzen 7 7700X; stellar little system so far.
Same, got myself a Ryzen 5 7800X with a 6700 XT; will upgrade the GPU later on when the prices come way down.
Man, your videos keep getting better and better. Keep it up, you're gonna blow up.
The biggest problem for a lot of people who switch between AMD and Nvidia is that they don't properly uninstall the old drivers, and when the switch happens they get all kinds of driver and performance issues.
Last time I had both sets of drivers installed nothing happened. Somehow they played nice together... I completely forgot I did that and it stayed like that for a couple of years. Then I realized and uninstalled the Nvidia drivers.
That's because the average person is statistically proven to be an idiot.
And RTX cards run better on Intel CPUs, yet a lot of people pair an AMD CPU with an RTX card, then moan that they're getting lower FPS than they would with an AMD CPU and an AMD card.
@@Muddy.Teabagger not sure this is true at all.
Funnily enough, my dad is doing the opposite. He's been on AMD for a while now and he's apparently sick of driver issues. I've no real reference myself; the only experience I have with AMD GPUs is benchmarking the odd collector's piece now and then, like a 295X2 and a Radeon VII, etc.
Tbh, I switched away from Intel as they stagnated too hard when they were way ahead of AMD.
I turned away from Nvidia when the new standard form-factor cards with coolers became common, due to horrid pricing and driver issues.
Now? AMD runs both my CPU and GPU. They make amazing software. Their components all work together to squeeze out an extra 10~25% performance. The FineWine effect means drivers keep making GPUs perform even better: a Vega 2 GPU at purchase could run Space Engineers at 1080p at 80~90fps; today it's running 140fps.
The driver rumors are very old, coming from the same generation in which Nvidia also had bad drivers. When Nvidia switched to the current form-factor cards, they fixed most of their driver issues, as the cards were basically just powerhouses of performance; no other tech went into them.
But AMD has since had an extra 3 generations to fix their driver problems, plus extra tech/software shaping for their CPUs and GPUs. Now it's Nvidia I see having driver issues all the time, trying to keep all their 3rd-party stuff up to date, while everything AMD brings is made in-house. Well, the majority of it.
Not to mention better prices. The encoders are now on par with Nvidia's. The CPUs are so much better now, and even outperform Intel in the same bracket, if not match it.
The only areas where I see Nvidia and Intel pulling ahead? Their flagship extreme parts, the ones that cost at least $2000 each. Raw power.
But raw power isn't what's needed right now. We don't need the next 5 gens' worth of CPU or GPU power; gaming is still trying to catch up to THIS current gen, and it's not doing so well.
Starfield is the FIRST ever game that REQUIRES an SSD. SSDs: 15-year-old tech.
Great vid. I recently switched to AMD as the price to performance ratio was so much better than Nvidia's, but what I wasn't expecting was how much better the AMD software was for tuning and displaying on-screen metrics. A welcome plus.
That's so true; I went the other way. I've used Radeon since the HD 4870 but recently inherited an Nvidia card. I was looking for the settings and saw that the control panel looked like Windows 95 and instantly thought "Oh, that can't be it," but yes... Yes, it is. 🤣
@@oblivieon1567 I mean, it isn't really bad. Plus point: it runs well with a shitty CPU.
I thought that at first when I bought a 6800 XT around 9-10 months ago. But since then I've realised that, just like with GeForce Experience/Control Panel on my old 2070 Super, it's only on very rare occasions that I actually have to go into Adrenalin to tweak anything anyway. It's great that it's there, and it's far better looking and easier to navigate than Nvidia's dated offering, but then again, it's barely ever used. Just in the same vein as "App Centre" for my mobo or "Ryzen Master Utility" for my CPU, which are hardly ever used. Given the choice, I would switch to Nvidia's Control Panel & DLSS (which I sorely miss) over Adrenalin's better UI & FSR any day of the week. Price to performance was the only reason I reluctantly chose AMD over Nvidia in the first place, TBH.
BTW, I used to use DLSS on a fairly regular basis on my 2070 Super; however, I find that FSR is either a poor-quality or downright unusable alternative to it. RT, on the other hand, was/is barely usable on either GPU.
@@quarkidee2878 Everything is still there, possibly with more options TBH, it's just terrible to navigate and looks dated. You are right on your plus point though: it's nowhere near as heavy on resources as Adrenaline is.
Though for me Adrenaline keeps resetting my UV profile and randomly crashing (and when I switched to AMD I did a clean Windows install, twice...)
I upgraded my 3060 Ti to an RX 6800 and it's been fantastic, actually maxing out games at 1440p without having to worry about stuttering when it runs out of VRAM.
I've had a 4090 since December, but a few days ago I picked up an open box 7900 XTX for my second machine, and I will say it's been flawless. It's clear AMD has cleaned up some issues over the past several months because everything I've run on it has been great and I have 0 complaints about the card. I'm actually more intrigued by the 7900 than I am with the 4090 because everyone already knows the 4090 is king, but I like how the 7900 punches above its weight in some games, and some people talk about "AMD driver issues" where I've experienced none. Also, it was $828 vs $1699 I paid for the 4090, so there's that.
Overall, I haven't been this impressed with an AMD card since the HD7950 like 11 years ago.
My only issue with the 7900XTX was the idle power draw. With two monitors and a high refresh rate it sucked 140W from the wall, constantly. I waited a few months but the drivers did not improve. Sold it and went for a 4090 and now it draws 20-30W when idle.
@@Teh-Penguin - Yeah I heard about the multi-monitor idle power draw issue with the XTX. Can't blame you for getting rid of it, I would've too under those circumstances.
@@Teh-Penguin From what I've heard it's been fixed by now, but I don't have a 7900 XTX so I'm not certain.
@@zkrrx good to know! Although I have a 4090 now so it doesn't matter to me anymore ^^
I had no issues with the 7900XTX on a brand new rig....until I updated drivers a few times without DDU, and issues arose. Never had those problems before on previous gen AMD or Nvidia cards.
I spent over a decade with Nvidia GPUs (which were good, no complaints) and in late 2023 bought AMD for the first time.
I was somewhat worried at first because of what I heard and read online but... it works, performance-wise it's fine, the Adrenalin software is decent.
So... yay
Gahaha I love the little text Easter eggs you put into the vids. “One of the Steves” at 7:45 followed up by “The Other Steve” at 7:59 got a pretty good air-out-of-nose outta me. Keep up the good work, Iceberg
Honestly, the 7800 XT is really not a bad choice here: it's ~60W less power usage, it's new, smaller in size, and almost the same performance while giving all the new features. Idk, maybe it's just my view on it.
It's also more expensive than a used 6900 XT. Performance-wise it's fairly close, but it's also generally closer to the 6800 XT, a card that's also cheaper on the second-hand market.
@@sosukelele But... that's a used 3-year-old card. The difference between a 6900 XT / 6800 XT / 7800 XT is not that big at all. The extra $100 or so for a brand-new card is well worth it imho.
@@sosukelele It has AI accelerator cores
I agree the RX 7800 XT is the better choice: about the same performance as the RX 6900 XT but better power usage, better temps, and newer features.
I think it's all about the price difference between the 7800 XT and 6800 XT.
At a similar price the first is a no-brainer, but if the 6800 XT is like $100 cheaper then you're better off buying the 6800 XT.
This is my first time seeing any of your videos, and I gotta say, this was great. Your realistic approach to all the gimmicks card manufacturers pull to try and make their stuff seem better was refreshing. I've been running a 6900 XT for most of 2023, using it as my living room 4K TV GPU, and I absolutely love it. I'm glad to see someone giving it its due. Subscribed.
Your video has struck something in me more than any other tech YouTuber's. It wrenches my heart not seeing you more popular with this level of editing quality and narration. You are quite literally the MatPat of tech.
I can get a 6800 XT for 300 dollars on Facebook Marketplace, and I've seen the 7900 XTX for 700 when they were 800 new this summer. If they drop to that or lower on Black Friday or Cyber Monday, I'll buy.
Yeah, I switched from a GTX 1650 Super to an RX 480 8GB. To say 4GB cards aren't enough is the understatement of the century.
After owning an RTX 3060 Ti and a 3070, I've realized most of the time I didn't really need to use the RTX features. It made me realize how much value AMD can give if it's price-to-performance you're after (that's why I'm saving for a 7900 XT instead of a 4080... the prices here in the PH are outrageous).
In Australia I saw an RTX 4070 for 1700 dollars, ridiculous.
I loved my old Sapphire 6800XT, the Sapphire 7900XTX I replaced it with seems to have covered up any small holes the 6000 series had.
Old? 😂 Says me, living the R9 280X-on-Linux dream with a 10-core Xeon.
@@79huddy I went back and benched my 290X just 2 weeks ago and was still impressed.
I also have a Sapphire 7900 XTX, same - couldn't be happier.
If you had the Gaming OC RTX 3070, then we had the same GPU 😅 However, I chose to take a different path: I purchased an ASRock Phantom Gaming 7800 XT as a replacement on Saturday. It was one of 3 options: that card for $530 USD, a 6800 XT for $500 USD, or a used RTX 3080 12GB for $550 USD. It came down to future driver support for me, plus the inclusion of Starfield as an added bonus. All in all I would say the 7800 XT has been a worthwhile upgrade so far, and I look forward to driver development for the card, with the general understanding that it will be supported longer than the other 2 options.
Sadly AMD drivers are complete trash.
@@newax_productions2069 Not anymore
@@nayxyann2374 i literally just sold a 6800XT due to driver issues
@@newax_productions2069
Not even close to true. I've owned a 580 and a 5700 XT and never had driver issues; got the 2070 Super, nothing but driver issues; got the 4070, frequent driver issues. It's more like Nvidia's drivers have gotten worse, as the Dead Space remake was unplayable on the 4000-series GPUs and still is, even with the latest drivers.
Unfortunately for people like me from central EU, the slight upcharge on Nvidia is worth it because of the lower power consumption, since our energy costs have skyrocketed over the last few years.
I don't have any brand loyalty, but I think it's simply inarguable right now that Nvidia has the better product at the moment, and has for a while. If you care about ray tracing (and you should, if you're interested in pushing modern features in modern games like Cyberpunk, Control, Spider-Man, Fortnite, Alan Wake 2, Ratchet and Clank, etc.), then that literally only leaves one manufacturer producing cards capable of running them well. I'm not happy with that reality, but that is the case. Compounding this are features like DLSS and FG, which help further push feature envelopes and have become the de facto standard (see: inclusion in base standards + AMD's rush to haphazardly roll out pseudo-responses). So, if one brand is setting the global standards and driving feature adoption in the latest titles, most users are rightfully looking at the market and concluding the other one is not an option. Now, RT isn't great on lower midrange cards, but from the 4070 and up, you can ace RT and get good path tracing performance, which has been a game changer in Cyberpunk and Alan Wake 2.
I would say if you're in the lower-midrange tier where RT and PT are off the table, AMD makes a potential purchase. But then you get back into DLSS and FG and the argument kind of falls apart again. And this is before the many, many, many quirks of the Radeon software stack, like incompatibility with some monitors. It's just too much for most consumers to bother with.
I bought a 4070 back in May, upgraded from a 1660 Ti. This 4070 is my first experience with a high-end, ray tracing-capable GPU. I admit I did have a good bit of fun enabling RT in games that support it, just for the "wow" factor. I'd turn on ray tracing in a game like Doom Eternal, for example, or Quake II RTX or Portal RTX, and I'd play for a while. I have two things to say about my experience with ray tracing. First: now that I'm used to the look of ray-traced lighting, shadows, reflections, etc., it's difficult to go back. When I turn off ray tracing in Doom Eternal, for example, all the missing ray-traced reflections become very noticeable. It's a bit jarring.
However, my second point: at this point in time, until DLSS 3.5 is released, ray tracing is extremely 'noisy'. I actually thought there was an issue with my PC or GPU, because Quake II RTX and Portal RTX have very noisy ray tracing. Like, when light reflects off a metal surface, it doesn't look good. Anyone who's had personal experience with ray tracing knows what I'm talking about. So this is another reason why ray tracing is not all that useful right now, until DLSS 3.5 is implemented in games, which massively improves the ray tracing denoising quality. For now, what's the point in using ray tracing if it often just looks *bad*?
I get your point, but I have a 3070 Ti and I see no difference between RTX on and off. I played the entire Cyberpunk storyline with RTX off (on a 5700 XT), and when I got the 3070 Ti I was disappointed to see no difference between rasterized and ray-traced, lol... might just be me though. I've never played the games you mentioned, so it might be just my case, though Nvidia markets Cyberpunk as their go-to ray-traced experience.
I honestly don't care about brand; I care more about price-to-performance. I switched from a 1050 Ti to an RX 6700 XT this summer, I've had zero issues with it, and most importantly I am very satisfied with it.
Also switched from Intel to AMD Ryzen.
Hey, Welcome to Team Red!
There are a few things we wanted to make a bit clear here.
1. AMD has separate ray tracing units, and the patent doesn't seem to cover the way things work 100%. There is some load applied to the SMs rather than these units, but it is purely connected to BVH calculations and has nothing to do with the rays-per-triangle calculations. We've talked about that in our video about ray tracing differences between vendors, if you're interested.
2. While it is true that AMD tends to be slower in RT, this is not the case 100% of the time. Obviously there are cases where VRAM plays its role, since RT is VRAM-demanding as well, but there are also cases where rays-per-triangle calculations become more important than rays-per-BVH-box, and since AMD usually has a comparable number of CUs but at higher clocks, their rays-per-triangle is faster. That is also explained in our video about RT.
3. Don't hurt the baby Series S!!!
P.S. Team Red currently sucks as well with the same 8GB cards :(
Fug team red, be TEAM CONSUMER, yo
lol
@@mukkah didn’t you read the comment till the end?
Fug Team blue - red - green
Team Consumer YOOOOOOO @@HardwareLab
I went from an RX 580 to a 4080 until it melted (it worked for 1 month, RIP), then moved back to a 7900 XTX.
I have been on AMD since 2017 but bought a laptop late last year that had a (mobile) 3070. Honestly, losing "Chill" and AMD's software suite has been the biggest nuisance and it's a shame that my favorite feature (Chill) is almost never mentioned in any GPU review.
I wasn't upgrading but buying a new PC instead, so it's less relevant, but my thought process was essentially the same as yours. Being given (by myself lol) a higher budget, I went with a 7900 XT and a 1440p monitor for slightly longer future-proofing in terms of VRAM amount and bus speed, as well as to squeeze a tiny bit of performance out of these heavy RT titles when I want to play something like the full path-traced mode of AW2 when it comes out (probably at 1080p with upscaling lol).
Personally speaking I'd love to see your video essays! I also started out with an R9 but mine was a 280X (used ex-mining card; my cats really loved the fact it made my PC double as a space heater lol). When it started artifacting in certain games, I got a GTX 1060 6GB. I'm currently on an RTX 2060 Super but I've always been RTX agnostic at best... maybe because I have an RTX 2060 Super lol, but also nothing I really go for gaming-wise has had RTX added yet, so I can't yet render (cough) the enthusiasm. I still play at 1080p (I have a 1440p monitor, it's 75Hz and I use it for work) and the games I play the most are relatively undemanding (read: old) so I'm okay for now... my very vague, non-specific, not at all subject to change plan is to upgrade my GPU before GTA6, so hopefully I can wring another year or two out of the old girl yet lol
The R9 280X is a rebranded 7970 GHz Edition with a more mature BIOS and driver support. Interestingly, it was already ancient (in GPU terms) when Maxwell was established, yet it totally destroyed the GTX 960. Now sure, the GTX 960 was a heavily gimped card (disgusting specs, no SLI support, massively weak compared to the 970), but the R9 280X (decent Sapphire models) was selling for £170 at the time the GTX 960 came to market at £240. Imagine the 3-year-old 7970 GHz beating the brand-spanking-new 960 by around 30%, despite costing 30% less.
I had a few 280X's, only one ended up artifacting but I never took care of that one, and it was heavily used for many years. GREAT cards. Absolutely untouchable in terms of value by the end of their shelf time.
Switched from 3070 to 7900xt and I'm very happy.
I believe it's not always about brand loyalty with content creators, rather pure absolute performance. Since tech is most of their life, and as a creator, especially the big ones, spending the extra money for the best component is a no-brainer for them, not to mention some even get things for free.
I myself have been with team green ever since the 8800 GTX era; life took a couple of years of hiatus from tech, and I returned to the scene with the 780 Ti, 980 Ti and finally the 1080 Ti. The latter was my main card throughout the early RTX era, since the GPU scene in general slowed down in generational improvements and prices got stupidly high for the high-end parts. I recently upgraded and built yet another new PC with an RTX 4090; sure, they are even more ridiculous in pricing, but my 1080 Ti was starting to struggle with modern games. As you might have noticed, I only buy the best card of each generation, and Nvidia is still the one on top, so my choice is very simple.
Why am I telling you guys this? Well, since I only purchase the best card on the market, I would like to think a lot of people, especially creators, do the same. Sure, there is brand loyalty and there are fanboys, but one can't deny Nvidia has been the best for a very long time. Being the best does come with its own special perks within the enthusiast community.
Upgraded from Intel UHD 630 to a Radeon HD 6370D to Vega 3, and then currently a 5700 XT. No issues with AMD GPUs. The HD 6370D outperformed the UHD 630 despite being on an E2-3200 from 2011. My experience with AMD has been bug-free and great value for how long they have lasted.
What a coincidence, I just upgraded from a 2080 ti to a 6950 xt today! I'm excited to see the improvement!!
Why upgrade from a 2080 Ti?
@@evaone4286 The 6950 XT is on average 50% faster. That's quite an improvement.
It’s great to hear from another POV-RAY user from the 90s. That was my first raytracing experience too. :) As far as AMD in DaVinci Resolve, I’ve been benchmarking a lot of GPUs in the last week or two in both Resolve and Premiere Pro and found that in some cases for multi-layer editing it can make good use of multiple cards, spreading the FX processing across multiple cards.
I have not had a chance to test the Arc A770 + Radeon combo yet (waiting on another PSU cable because my used one came with too few) but it might be an option for getting similar decode performance to the UHD770 Intel iGPU for decoding.
In the meantime, let me know if there is any specific testing with a Radeon + iGPU vs Radeon alone that you would like me to do. I’d even be happy to process any example files if there are specific examples that are posing a problem.
I used an RX 570
then upgraded to a GTX 1660 Super
and now I own an RTX 3060 Ti
I do sometimes miss my RX 570 tho
Don't. Don't miss it.
I currently have an RX 570 and am about to upgrade to an RX 6700 XT to get out of this hellhole of a card. It has so many problems, mainly the drivers. The hardware isn't bad, but running anything on it is.
I get random driver crashes that just cut my monitor output, sometimes until a hard restart of my computer, sometimes it comes back automatically. It even has poor performance for what it is.
On top of this, I have problems running certain games due to the absolutely garbage driver updates it gets. As an example, last summer it got a driver update that did something with the GPU software that would cause your game to crash if it tried to use it. It was such a specific error, since the ONLY thing I could find that wasn't working was Minecraft Forge for 1.20.1: every other version of the same mod loader worked, every other mod loader worked, and even the vanilla game worked. Forge was the only thing that couldn't be run because of this driver. There was no other GPU I could find that had the same issue, just the RX 570.
But of course, in true YouTube fashion, I found an Indian YouTube tutorial that explained the issue and how to fix it, which was to roll back to a previous driver.
Was my first GPU, solid option in late 2019.
Made me £200 when I sold it to a miner as well
I'm still on my RX 570, it's been 6 years now and I've only had one problem once, but it wasn't the GPU; in fact it was a defective RAM module that was causing Windows to crash. Other than that, 100% fine. I'm planning to upgrade to an RTX card at some point. A friend of mine has both AMD and Nvidia, and according to him both had problems; there is not a single brand free from having issues.
@@Clarkane Yeah nothing is perfect and I hope you get a great GPU my friend
good luck!
I have been, and currently am, running the RX 570 for 1.6 years now, and the only problems I've had are with driver updates.
I just went from an EVGA GTX 1060 (the same one in the video lol) -> RX 7800XT. Loving it so far.
The only reason I use Nvidia GPUs is not brand loyalty; it's because they get more optimization than AMD in most games. If that changes and they both get the same optimization, then I'm going straight for the AMD cards.
I upgraded to a 6900 XT early this year (got a sweet deal!) from an RX 570. I later realized that there isn't much of a difference as you go up in quality, but I don't regret it, as I know this will last me longer than the RX 570.
Jensen will remember this betrayal
Nothing hits harder than that sound at the start of the episode!!
@@GrainGrown dude it was 5 months ago wtf
@@GrainGrown who cares
@@GrainGrown l + ratio + no maidens
@@GrainGrown 💀
I made the switch when the 6900XT came out, and upgraded from a 1080ti.
Ray tracing wasn't in my mind at all, it was purely because I game on ultrawide 1440p and the 1080ti was starting to struggle at 60fps in newer titles at the time.
I then had some regrets over time. Immediately, I noticed that it struggled in VR for some reason. Nothing demanding; games like Beat Saber would stutter randomly. Hopefully that's fixed now, but I can't test as I don't have the space for VR at the moment.
The other regret came somewhat recently. Decided to get back into rendering and immediately remembered after setting up a scene and heading to render that I was used to using Nvidia Iray which isn't supported on AMD cards. Now a render that would have taken me 1 minute takes 10 on my CPU.
I won't be switching back soon thanks to the crazy prices. But I will be going back to team green whenever the money lines up and the latest cards are worth it.
I've flipped a few times. The only issue that makes me wary of going back to AMD these days is the quality of the drivers. I had to return a 6600XT bought for my niece's PC because it would blackscreen on PC start.
I was having similar issues with my 6700 XT, and then I found out the BIOS was not updated (should have checked that first), plus Windows was replacing my drivers for whatever reason. I don't have those issues anymore. I honestly think it's worth giving it a shot again, just make sure a few things are configured properly.
Commenting to boost the algorithm. You deserve the world my friend.
Honestly, you deserve way more audience and recognition! Great videos and commentary 👌 also the budget part hits well cos I'm always on the poor side and always come to look at budget options 😅
Great card choice. I have no doubt that with something as powerful as the 6900 XT you will itch to play at 4K from here on out. I hope you can upgrade your CPU soon as well, so editing is faster than before in all aspects and you can extract as much performance out of the GPU as possible. I myself got a 6700 XT not too long ago and it runs like a charm.
In my country: 7900 XTX $1000, 4070 Ti $1100, 4080 $1500 and a 4090 $2100... hmmm, what would I buy as an upgrade... the choice is obvious.
One of my biggest issues with AMD is their software. Both the actual drivers as well as the user software just seems inferior in multiple ways that make me want to not switch.
I also went from a 3070 to a 6900. For me it was just a free upgrade, since I sold my 3070 for 500 and bought the 6900 for the same price.
😂😂 What freaking idiot bought that thing for 500 off you 😱🤪🤦🏻♂️🤦🏻♂️ wtf..
It's been a great year for video game releases, which is ultimately why I decided to upgrade my 2070 Super to a 7900 XTX. I haven't had any regrets so far.
I bought my 6950 XT just a bit before the launch of the 7900 series and kind of regret it (even though my wallet would've been pretty mad at me for buying one of those), but the card still does everything I need.
I upgraded recently, from a 2060 to a 3060 Ti. I considered getting a 6750 XT, but the truth is Nvidia GPUs, mostly due to brand loyalty, are far easier to resell, and reselling your old card is a crucial part of upgrading.
Naaa, that's not a thing anymore, people are buying AMD cards these days! I sold my 6800 XT very quickly.
Just wait for the 7700xt to drop in price and you should be good.
Depends on where you live; on eBay in the US, Radeon cards hold value better than Nvidia's do. During the pandemic the Radeon was actually cheaper than the 3060 Ti, for example, yet now the 3060 Ti sells for $200-220 while the 6700 XT goes for $270-300. I recently got a good deal on a 3070 Ti, upgrading from my 5700 XT, and I kinda miss it. DLSS is so worth it, but I only play at 1080p, so I'm probably just going to sell the 3070 Ti and get a 6800.
I'd work on switching that case asap. No mesh front plus no space for the GPU to breathe is a problem for the temps. 84°C edge and 100°C+ junction and almost 3000 RPM sounds like a nightmare.
I upgraded from a 1080 Ti to a 7900 XTX; it was NOT worth it. My Radeon 7900 XTX is so inconsistent and extremely "grainy". Too many AI features that just don't work. The Adrenaline software does seem more intuitive than the Nvidia control panel. I just want consistently high frames, because my eyes are great and my high-refresh monitor is only utilized sometimes with this AMD card.
I myself used only Nvidia for years; my experience with it over AMD was positive, plus I love SLI (RIP). I went from a 2080 Ti to a 7900 XTX and it's a great product. The differences are noticeable for sure, and understanding how to overcome them is fun. Nice choice, sir.
The one true SLI is 3Dfx Scanline Interleave.
I had a GTX 295, which is a dual-GPU card with internal SLI, and absolutely hated it. Software support was abysmal, where you needed just the right patch level of everything, and it just did not feel good; the frame pacing was atrocious, as was the latency. What do I even care how many frames per second it says it does if it doesn't feel right? I mostly used it in single-GPU mode, and I liked dedicating one of the split GPUs to PhysX; that worked very nicely.
One of us! One of us! I stopped being one of Nvidia's customers when they released the 40-series and started selling DLSS instead of actual hardware. AMD seem to be the least profit-obsessed whilst also delivering a good product.
The only 40 series GPU that sells FG instead of actual hardware is arguably 4060 ti, which is still a big improvement compared to 3060 ti in terms of power draw. Everything else is significantly faster and draws less power.
@@yarost12
The 6900 XT is a beast! I've had mine since day 1; it was £1130. Luckily it paid for itself during that mining period 😅
It's crazy that I'm looking at a 6950 XT for $629; times are crazy.
I switched to AMD and have been having problems with DX11 games and its drivers. Who would have thought AMD drivers would have problems... anyways, tried some registry edits and it seems good now.
My only reasons not to use Radeon GPUs are CUDA and AMD's underperforming ROCm stack, especially on Linux. Encoding has improved on Radeon in the last year, and rasterization is just excellent.
I chose to finally switch to full AMD for my laptop, and to be honest the drivers and everything else are a lot better than I expected.
Many people report more driver issues on Nvidia lately than AMD.
The stigma about AMD having bad drivers is long outdated. Now they are on par with Nvidia.
@@knowingbadger It's not only that; newer Nvidia drivers literally limit my older laptop's 3050 Ti power. Newer ones won't let it draw more than, I think it was, 60W, while the older ones allow up to 85W.
@@AurasV Well, I feel like that also applies to AMD laptops, to be fair. Just sounds like a way for them to conserve battery life, since it's a laptop.
@@knowingbadger The laptop was working properly before and after; I did tests with old and new drivers for a week each, and nothing changed except fps.
I ran my GTX 980 from 2014 to 2021 and switched to my RX 6900 XT in 2022. Running it till now, I can confirm, lads, it's definitely worth it: better than Nvidia, with friendlier, easier-to-use software, and a less greedy company.
Nice to see a little more AMD showing. Never owned one but I had an amazing experience with a 6800xt system for a few weeks and the experience overall was butter smooth.
I am using DaVinci Resolve for my son's online physiotherapy, uploading videos captured on a phone to the therapist. My question is: why, when rendering an mp4 video, does the 5600G have all cores at 100% utilization while the RX 570 barely uses 10%? I'm making videos at 360x640, 500 kbps video and 150 kbps audio.
Just my guess, with experience from using a CPU with an iGPU (11th-gen Intel with Iris Xe integrated graphics), but the CPU probably has to work its cores much harder than the GPU would, because general-purpose cores aren't designed for video encoding and rendering. I noticed that if I use Intel Quick Sync (their dedicated iGPU encoding), CPU usage suddenly drops from 100% to ~50% while the iGPU is only using ~10-15%.
So basically, GPUs are just much more effective at video processing work?
I am going to get an AIO-cooled 7900 XTX this fall with a 7800X3D so I can use SAM, and with a nice overclock I'll have 4090 speeds or faster for less than the price of a 4090. It's crazy that a 7900 XTX + 7800X3D is less than a 4090, with better performance coming in with every driver update; in the games I play the AMD GPU is faster anyway. I am currently using a 5800X3D and a 3080 FTW3 Ultra with DDR4 CL16 3466, and I don't think the performance uplift of upgrading to current gen will be too much at 1440p/1080p, but my nephew needs a new computer, so I am going to build one before I need to so he can get this one :)
I don't know man, I'm not a fanboy of a brand. From the GT 240 to the GTX 650 Ti to the 2060 Super, I never had a problem with Nvidia drivers. I used an AMD HD 6000 once, it gave me headaches, and I am an IT guy who built my own first computer; nope... I ain't gonna have that experience again. And I'm a busy 8-5 employee; I'd rather give Nvidia $100 more for less headache.
"Linus somebody" 💀 💀 💀
😂😂😂
He’ll switch back to RTX in 2 years
Why does AMD do such a good job with RT on a console but not for PC? Doesn't make sense
Built my PC in 2022 with a 3070, upgraded to a 6800 XT and watercooled it, love it. I made it pull over 400W in FurMark the other day using MorePowerTool.
The only strange issue I have is that I have to clear my BIOS every time I disconnect the card for some reason, but my mobo BIOS isn't up to date, so maybe that's it, who knows. Minor issue.
Building a PC for my partner's brother rn; his build will also be full red, it's just better value and performance for his budget.
Well, Alan Wake 2 and RoboCop are here 😀
Can you tell me the performance at 4K?
I agree 100% with you sir... I went from a 1080 to a 5700 XT... then up to a 6800... I like where I'm at at the moment.
I really enjoyed the edit you did with music when transitioning to 2023. I know it's petty, but I liked it :D
I upgraded from an X79 1680 v2 and 980 Ti to a 6700 XT last year; I also got a good deal on an X570 motherboard that was "broken" for $20 and put a 5800X3D in it. No regrets: with a little tweaking, Starfield runs on mostly high settings at about 60fps average, no mods. Honestly I'll probably go AMD again, especially if the price-to-performance is there like it seems to be; Nvidia really has meager offerings the past few gens in the actually affordable price ranges most people can dish out the cash for.
I went from a 1650 super, to a 2070 to a RX 6900 XT and I don't regret a thing. AMD Adrenaline alone is 1000% better than GeForce Experience. Plus, I just don't believe we should expect Nvidia to keep making gaming GPUs with how much they're raking in on AI cards.
Bought my 6900 XT used once the prices started going down from the crypto boom; got an XTXH bin for under 600 USD, and it's been perfectly fine for 4K. It's a bit overkill for the vast majority of games I play, and for the heavier games it's adequate at optimized, mostly high settings. It could use a repaste (and maybe some added thermal pads between the PCB and the backplate), but it still runs a stable overclock of just over 2700 MHz. Truly seems like one of the best-value cards to get in 2023.
I'd like to watch the RT video essay. If possible, put it on this channel in a video essay playlist to get more eyes on it.
My 3070 just broke, so would a 6700 XT be an upgrade for competitive shooter games, where 1080p fps is what matters the most?
I currently have a GTX 750 Ti. I upgraded my PC and only need a graphics card, but I am undecided between the RTX 4060 Ti, 4070, RX 7700 XT or 7800 XT; they all vary by about €150 in price from each other, ranging from €550 to €700 ish in my country. Any tips on what to choose? I'm currently leaning towards the ASRock Radeon RX 7800 XT Phantom Gaming OC 16GB, as it only costs €630.
I have:
CPU-i7-13700K
PSU-Corsair RM850x
Motherboard-MSI PRO Z690-A WIFI DDR4
Ram-2x8GB G.Skill
Case- CoolerMaster TD500 Mesh V2
Tips?
What song is that starting at 6:24? I've been searching through the YouTube audio library and can't seem to find it.
Good to see Iceberg and the changes on the channel. He even says the funny number in English now, instead of soixante neuf cents
What do you mean the 7800 doesn't hold up? You said you're getting 60-70 fps at 1440p high; I get 144 fps at 1440p max with the 7800.
I tried out an RX 6600 as I felt my 1060 was getting a bit old.
I ended up with a pretty unstable card that had lag spikes all over the place,
would sometimes freeze my PC in certain games,
and generally just did not feel stable at all.
Returned the card for a 4060 instead and haven't had any of those issues since.
Since then it's been extremely hard for me to consider another AMD card like the 7800 XT, despite their prices, because I am very worried it's gonna burn me again :/
Also, I feel like FSR even on Quality is far below DLSS in terms of image quality, especially in the background,
though that may be different at higher resolutions?
Did you use DDU?
I myself am switching to my first all AMD build. Going with a Ryzen 9 7950x paired with a Sapphire Tech Nitro+ RX 7900 XTX. Coming from an RTX 2070 laptop. This will be my first top of the line build. My best build before this not counting the laptop had an i5 4570, 16GB DDR3 1600MHz HyperX Fury RAM, and a Sapphire Tech RX 470 4GB. A fairly competent 1080p build.
Switched from NVIDIA, loving my new 7800XT, drivers, software, performance and all
That second channel idea sounds really cool, please do it.
So here is my dilemma: I have a 3080 FTW3 Ultra, it works fine, but I always want more. Do I wait for the new generation of cards or get a 7900 XTX?
I don't care which team makes my GPU, as long as it's the best bang for the buck for my usage. At the moment I'm happy with my 5800X3D and 6800 XT.
My first rig used a Sapphire RX 5700 AIB card, and it was plagued by driver issues up until I'd ordered a replacement card in the 3080 Ti. I haven't had a single issue with the 3080 Ti, beyond trying to manipulate the PCIe clip for the card on the mobo between the card and the NH-D15 CPU cooler.
Performance was outstanding on the 5700 non-XT, but the driver issues were only solved literally years after release.
For me it's hard to go to AMD because of response times and DLSS; I've heard that AMD's alternatives aren't as good.