Sorry I accidentally got the prices the wrong way around in the graph, so much to test and so little time! Will just blur that bit out so it's not confusing.
'techtubers' these days are just corporation shills pushing their overpriced slow obsolete tech. 7 fps in 1080p with ray tracing. it's obsolete garbage. just like GPUs that push 30 fps @1080p when 1440p 165-180Hz monitors cost 200-300 bucks and are actually the norm. the 4090 can't even do more than 1440p 100fps without upscaling and glitchy fake frames. the 4090 is just a 1440p midrange slow af GPU for $200-300 budget-midrange monitors and should be priced accordingly. 4k GPUs don't exist at all because corporations are milking the market with 0-progress obsolete tech to sell more overpriced af obsolete tech. selling the same shiz year after year after year...
'yesonomics', lol. you're just defending corporations milking the market, actually their margins are higher than ever = more billions in profits than ever. all GPUs are actually overpriced 3-4 times. $300 before crypto was the price range of xx70 cards that were 30% slower than $600 top cards. the 4080 is actually a xx70 card and should be priced at $300. the 4090 should be a $600 card by now. watch Dr Ian Cutress' videos on the cost to make Ryzen CPUs, then calculate GPU prices yourself if you don't believe me. and look at the record profits of corporations.
How am I pushing this product for ray tracing? I said get Nvidia if you want to use ray tracing. "their margins are higher than ever" actually no, they are not for gaming graphics cards. You should look at the financials before commenting. @@rawdez_
@@techyescity ask Dr Ian Cutress about how much it costs to make an ngreeedya GPU chip. according to wafer calcs a 4070/ti chip should cost around 50 bucks to make, a 4080 chip ~$100, a 4090 chip ~$200 now. everything else on a card is 50-150 bucks. all corporations are just milking the market with overpricing.
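For anyone who wants to sanity-check this kind of "wafer calc" themselves, here is a rough sketch of the standard dies-per-wafer estimate. Every input below is an assumption for illustration (wafer prices, die sizes and yields are not public), and it covers only the bare die, not R&D, packaging, VRAM, board, cooler or retail margins:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Standard approximation for whole dies fitting on a round wafer."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float, wafer_cost_usd: float,
                      yield_rate: float) -> float:
    """Wafer cost spread over the dies that actually work."""
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Assumed inputs: ~$17,000 per 5nm-class wafer, AD103-sized die (~379 mm^2),
# 80% yield. All three numbers are guesses, not confirmed figures.
print(round(cost_per_good_die(379, 17_000, 0.8), 2))  # ballpark ~$140/die
```

Plugging in different assumed wafer prices or yields moves the result a lot, which is why estimates like these vary so widely between commenters.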
When the rx580 came out in 4 and 8 gb flavors, they tested almost identical. Today, after almost 10 years, you can still game at 40-60 frames in any game at 1080p if you have the 8 gb version.
'after almost 10 years' which actually shows that there's like 0 actual progress in gaming in 10 years. same ol' obsolete crappy graphics with godrays slapped on top. rays can't improve obsolete low-VRAM textures and models and are overpriced af, which forces game devs to keep making games on obsolete engines for affordable OBSOLETE hardware.
there's 0 progress when you compare 'new' GPUs to 'old' ones. e.g. the 6700xt is cheaper and faster than the 7600xt, the 4060ti isn't faster than the 3060ti, the 7800xt isn't faster than the 6800xt. AFTER YEARS of 'progress'. corporations are milking the market and killed progress to not drop prices on obsolete tech that they keep selling year after year after year...
@@lharsay Navi 32 uses MCDs like Navi 31, so no? Each MCD is 64-bit, so a 7700 XT just has 3 working MCDs instead of the full 4 of the 7800 XT. AMD just had to use one more working MCD on the 7700 XT to bring it to 16GB, which they should have done to make it good at $450 (and call it the 7800). It would be like 10% worse than the 7800 XT but also 10% cheaper with the same VRAM capacity (just slightly slower), and it would be a good way to use cut-down Navi 32 dies without making a garbage product that everyone hates.
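The bus-width and capacity arithmetic in this thread is easy to check. A minimal sketch, assuming (as commonly listed for these cards) one 64-bit GDDR6 controller and 4 GB of memory per active MCD:

```python
# Sketch of the Navi 32 chiplet math from this thread. Assumption: each
# active MCD contributes a 64-bit GDDR6 controller with 4 GB attached
# (two 2 GB packages), matching commonly listed specs.

def navi32_config(active_mcds: int) -> tuple[int, int]:
    """Return (bus width in bits, VRAM in GB) for a given MCD count."""
    return active_mcds * 64, active_mcds * 4

print(navi32_config(4))  # full config, 7800 XT style: (256, 16)
print(navi32_config(3))  # cut-down config, 7700 XT style: (192, 12)
```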
@@krazybonnie5523 I'm almost certain it's the GCD of those GPUs not being able to handle all 4 MCDs at once which is why they replaced one with a dummy die.
I feel like gpu's are becoming more and more like sports hatchbacks. You get a bit more from the newer specs however should you pay for the premium brand you'll get a nicer experience. Drip feeding people is not a good way to keep customers, but if you wish to abuse your position then this makes sense. Great video!
@@8020Alive people ALWAYS SKIPPED GENERATIONS, get real. Very few people upgrade their computers every two years; that has NEVER been true. This generation from Nvidia was a massive jump for AI and ray tracing; raw rasterization improvements depended on Moore's Law, which is over. Whiners on social media don't represent actual consumers in the real world, 1 in 10 of whom have actually bought the new RTX 40 cards. Don't equate AMD's disappointments with that, that's cope.
@@Wobbothe3rd no, Moore's Law is actually alive and well. it's just way more profitable for corporations to milk the market with 0-progress obsolete tech so prices don't drop. so that's what they do while making up fairy tales for noobs about inflation and 'Moore's Law is dead' to justify 0 progress and insane 3-4-10x overpricing. all to make record profits, which they all actually are making. because noobs eat their marketing fairytales for breakfast.
While 16GB is very useful, especially as the card ages, the 128-bit memory bus is a huge issue. Keep in mind that even the ancient GTX 970 would do around 256GB/s on many non-reference models, and with some overclocking would do around 275-280GB/s, and the card benefited greatly from such an overclock. The RX 7600 XT does around 288GB/s. If a card that offers around 1/3rd the performance experiences VRAM throughput bottlenecks, imagine what it is doing to the RX 7600 XT.

More VRAM does help a card age better. For example, cards with 8GB lasted far longer than ones with 4GB, even in cases where the 8GB card had a slightly slower GPU. When a card offers more VRAM at ample throughput, you get to crank up VRAM capacity and bandwidth intensive settings with virtually no frame rate drop. For example, if your card has enough VRAM running at a fast enough speed, you can crank up textures with negligible impact on frame rates, thus getting a better looking game. On the other hand, if a game needs a lot of VRAM at very high throughput, then having extra VRAM is not as effective, since the throughput bottleneck will cause a large performance drop.
True, although a small nitpick is that the 970 is really on an effective 224-bit bus because of the 3.5GB VRAM segment, meaning you were really getting 224GB/s. The memory bus width being split up is the entire reason you couldn't really use the last 0.5GB of VRAM on the card.
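For reference, the peak bandwidth both comments are quoting is just bus width (in bytes) times the per-pin data rate. A quick sketch, with data rates assumed from commonly listed specs (7 Gbps GDDR5 on a reference 970, 18 Gbps GDDR6 on the RX 7600 XT); overclocked cards run the memory faster, which is where the bigger figures come from:

```python
# Peak memory bandwidth = bus width (bytes) x per-pin data rate.
# Exact numbers shift with the memory clock of the specific card.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 7.0))   # GTX 970, full 256-bit path: 224.0
print(bandwidth_gbs(224, 7.0))   # GTX 970, fast 3.5GB segment: 196.0
print(bandwidth_gbs(128, 18.0))  # RX 7600 XT: 288.0
```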
Great job Brian with this GPU comparison Benchmark! 👍 Glad you included the RTX 3060 12GB, the GPU I use. It's bottom of the barrel but still works, even in newer games. 🙂
I have the 3060 too. I think it's a great card for the amount I paid for it. $230 used, last summer. I play at 1440p ultrawide and yes, I have to turn down settings, but it can still get high framerates, and is good for video editing and some casual ai stuff on the side.
Previous generation 60 class is decent. Remember there are still people on Pascal. I currently game at 1440p with an RTX 2070 which is similar performance and it's decent. DLSS really has extended the longevity of my card. I'm gonna get the 4080 Super when it comes out and pair it with a new PC build.
@@MisterTonyG Nice, the 4080 Super will be a massive upgrade. That's the card I want to upgrade to, but I don't really have cash for a new GPU at the moment. DLSS has been a lifesaver for the longevity of GPUs. It helps me not feel as bad about sticking with the 3060 for a while longer.
@@MisterTonyG It really is. I play a lot of Halo Infinite, and idk if you've noticed but Hardware Unboxed have done enough testing to notice that after 20-30 mins of playing Halo Infinite, it starts to exceed 8 gigs of VRAM. Highest I've seen it at 3440x1440 is up to 11 gigs. So yeah the 12gb is nice. I hope you enjoy the 4080 upgrade when you get it :D
They did not have many options… the 7600 already has the full die, the 7700 XT is already heavily cut down, and the 7800 XT offers AMD much better profit than the 7700 XT, so cutting it more makes zero sense. So now they have a niche option: a 16GB version of the 7600 for those who really need more than 8GB of VRAM… I am sure AMD does not expect to sell a lot of these 16GB versions, but it is there if someone really needs it.
I mean considering only 100mhz was added to the boost clock and nothing else is different other than the 8gb vs 16gb and max power usage, this performs a lot better compared to the vanilla 7600 than i thought it would. It's not a crazy bad product and as with 99% of Amd Gpu's launched in the past couple of years the price will drop within 6months - which if it gets to around $300 USD it will be looking like a much better product. Performance isn't trash either (apart from RT) , the raster is solid - not amazing. I had a Rx 6600 and it had no performance issues in the games i played and i didn't feel like i was barely scraping by to play games at only min settings - the 7600 is a chunk faster than the Rx 6600 (and the xt a chunk higher again) so it will perform well enough as long as you keep your expectations in check.
@@Alakazam551 "I mean considering only 100mhz was added to the boost clock and nothing else is different other than the 8gb vs 16gb and max power usage, this performs a lot better compared to the vanilla 7600 than i thought it would. " No it's not, it's like 2-4% faster. Watch GamersNexus review on the 7600 XT. The 6600 XT was 15% faster than the 6600. "It's not a crazy bad product and as with 99% of Amd Gpu's launched in the past couple of years the price will drop within 6months - which if it gets to around $300 USD it will be looking like a much better product. Performance isn't trash either (apart from RT) , the raster is solid - not amazing" Yes it is a bad product, don't even try to cope it's not. Putting 16 GB of VRAM on a crippled 128-bit bus because apparently "more VRAM = good GPU" won't actually make it a good GPU. Also the RX 6700 XT is at the same price as this GPU and the 6700 XT is actually faster. There are literally no scenarios where the 7600 XT would need 16 GB unless you do AI or whatever. Bottom line, this GPU is an absolute joke which no one asked for.
This is a crazy time to be the guy out looking to get a new GPU to upgrade your rig with. Thanks for doing your best at pointing people in the right direction.
Cyberpunk 2077 on ultra graphics settings at 2K resolution: 24 fps on the RTX 4060 and 48 fps on the RX 7600 XT is all you need to know about these graphics cards. The processor was a Ryzen 7 5700X3D.
Bryan, I hope you can make a video about the current market and the coming NAND flash "shortage" we are going to see again. Imo I smell another DRAM supplier collusion happening soon, like with the massive DDR4 adoption in 2016, since now everything is starting to use DDR5 and NVMe drives.
Just like any other GPU out there, whether it's AMD or Nvidia, after a few driver updates is when the card really starts to shine, and I feel like the 7600 XT is one of those cards.
Hi @techyescity, love the video, great info. Still torn as to what to do though. I'm currently wanting to upgrade from an RX 570 4GB which is getting a bit long in the tooth, paired with an R5 2600 which is still fine I think. I'd like to play stuff at 1440p at 60Hz at a decent quality, as I have a Samsung smart TV which is also used by the kids for console stuff. I'm considering a 4060 Ti 8GB or possibly the RX 7600 XT since the price is similar. Am I barking up the wrong tree with this plan, or is there a better way? Am I being unreasonable to think the R5 2600 plus the updated card can deliver that sort of performance, or would an upgraded CPU also be in order? I also only have a 550W PSU, a gold rated Seasonic unit, which I think is enough for a 4060 Ti but might be a bit optimistic for an AMD card. Love your work and the channel.
This was my struggle for a while, but some RTX 4060 Tis are now already starting to go on sale because the card is "older" now. Unless you play a lot of games that require that much VRAM, the RTX 4060 Ti is just way better right now, and if you can visit the used market, the 3070 Ti for example is damn cheap. I'd still go with the RTX 4060 Ti though, because I don't want to go without warranty. On that note, these entry level cards are ridiculous, no matter which one you pick it won't be perfect. My pure advice would be the 4070 Super and the 7700 XT, the only cards from either side that won't disappoint you, and the 7700 XT is cheaper as well. And if you want more, the 7800 XT, good luck.
rly good video as usual. i hoped they would sell the 7600 for 230 and the 7600xt for around 280. i don't know how much they earn from that card, but for me personally this card is overpriced like all the other new cards out there. i run a small business and dudes come in, throw 500-700 bucks on average on the table and want a nice little gaming pc, and i would love to build it with those types of cards... i still go with the 6600 and 6600xt (not the 6650xt, same price problems), buying refurbished or used. i still love these cards at that pricepoint. maybe with the next generation we can get a used or refurbished one of these for around that money. best regards from germany.
I have a 5700XT currently, is this even worth an upgrade at all? The only reason I want to upgrade is the version I have has noisy fans. Or is a 4060ti just a better buy all around.
well... it's the best card I can afford considering the low availability of the 6800xt. plus I love the look. glad it's not all "bad decision", thank you and have a great day
Funnily enough, the Arc a770 will be significantly better in real workloads. It has a 256 bit memory bus while the 4060 ti 16gb and this card have 128bit bus. In workloads that require the extra vram, the a770 will be much better. In fact the 128bit bus makes those other cards completely unusable. They are e-waste essentially.
I bought the ASRock Steel Legend RX 6650 XT to fit a mostly white theme build for my nephew. I did a small easy overclock with an undervolt and turned off the silent mode. I found the card itself to be real good all around. I don't like the zero fan RPM mode and turned it off because the ramp from zero to whatever % just seemed too harsh, so leaving the fans on all the time helped. I would say the cooler is good, not great, but at the price I can't really complain; I think a denser fin stack would've made this a top tier cooler. Max temp under 100% stress was around 70C in like a 24C environment, a little loud but by no means terrible. Obviously with a headset you can't hear it.
We saw this with the 6500 XT 4GB vs. the limited run of 8GB cards. At least the 7600 XT is a different SKU, so it's designed for 16GB, rather than just being a 7600 with double the RAM.
It's not; it uses the same expensive clamshell design Nvidia used for the 4060 Ti 16GB to double the VRAM while utilising the same memory controller, resulting in 2 VRAM packages sharing the same bus.
This is such a rush job of a product. AMD just took the RX 7600, overclocked it, then doubled the VRAM just so they could prevent Nvidia from hogging the news cycle with the launch of their SUPER refresh.
Thousands of years ago (in computer years) the 4gb rx580 was deemed equal to the 8gb rx580. The extra 4gb was "unnecessary." Today, thousands of years later, the 8gb rx580 is still a viable gaming card.
Yesterday I got the same Steel Legend brand new from fb marketplace around $100 cheaper than the market price here. Really happy with how it looks, and that in gaming I don't need to turn the volume to max so I don't hear the fans, like with my old RX 5700 XT 😂🎉
Maybe it was just coincidence in this particular test, but it seems the nvidia cards are much more stable than their AMD counterparts if you look at the average fps and the lows.
VRAM talk is getting out of hand. 12GB is enough for 4K and 8GB for 1440p. 6-8GB is more than enough for 1080p. People should stop complaining about things they don't even need or know how they even work.
Nah, just look at RE4 and TLOU and their performance on the 3070 Ti, and just how bottlenecked that card is because of its VRAM. 12 is good for 1440p while 16 is necessary for 4K nowadays.
In new benchmarks the 7600 XT is getting better performance than the 6700 and is almost up to the 6700 XT; the new driver is improving a lot. That's why the 7600 XT is running out of stock.
AMD is trying to give consumers GPUs at a decent price point. If you only play maybe one or two of the latest titles, you will be satisfied with a GPU that you can buy at a decent price. Now with the RX 7600 XT numbers up, this GPU, which only has x8 lanes, performed pretty well in some game titles. If I were in the market to buy the RX 7600 XT 16GB for playing retro and RPG titles, I'd be very happy with this GPU on my 1080p 60Hz TV monitors.
As bad as it is with USD it’s still better than many other currencies. My currency has dropped 40% in value compared to the USD in the last years so even at the same USD price graphics cards are 40% more expensive because they are imported goods and thus heavily dependent on the currency value. You could get a GTX 580 back in 2011 for 4000 and now 4000 gets you a 4060. I used to always spend around 2000 which was about 250 bucks and get good cards. Now 2000 is more like 150 bucks and you can’t get anything for that except the crappy 6500XT.
I remember the lukewarm reception the RX 6600 had when released, but it later became one of the best value for money cards you could buy. I think we may see the same with the RX 7600 XT, when its price inevitably drops to $300 or below, and games get that much more demanding in VRAM going forward. Could be one to review again in a year's time.
That's why I don't get the people complaining that there are no good cards under $200 when the 6600 has been there for more than a year now. And a year or more later, the 7600 should hopefully replace that sub $200 price and the 7600xt drops to less than $250
This card does not deserve the XT at the end of the name at all. It is ONLY an overclocked 7600 with more ram. As bad as Nvidia is, at least the 4060 ti has more ray tracing and cuda cores than the regular 4060.
@@tonydavis8696 I don't see as it really matters. Both companies are guilty of stupid naming schemes for their cards over the years. Yeah, it's an overclocked 7600 with 8GB more VRAM, but at least the public can differentiate that the XT version is the one with more VRAM. Unlike Nvidia and their 1060 3GB and 6GB, which had entirely different specs and not just in the VRAM. Same goes for the RTX 3060 12GB and 8GB models.
When this gets down below $265, I'll pick one up... Maybe Prime Day in June? If they came out with a single slot, blower style card, I would pick that up for $330 for machine learning in my home lab...
They have been creeping back up but I have picked up 2 used 6700xt cards for 240 and 250 USD respectively in the last 6 months. Kids' PCs are set till College with those bad boys. The 7600xt is just too expensive for what it is even with the added Vram IMHO.
Seems like AMD are trying to bamboozle less knowledgeable people with "LOOK, 16GB!" who don't realise that the 128-bit memory bus and lack of CUs are going to prevent all of that VRAM from really even being utilised in gaming. A bit of a cheap move if you ask me.
I'm actually planning to buy one; it's now the cheapest way to get 16GB of VRAM. For uses other than gaming it's great. Arc still has many issues :/
I could have gotten a 6750xt for about €380 around black friday last year... probably about as good as it gets for the foreseeable future. The 6800 is about another 100 euro, so worth it IF you have the money.
Regardless of your budget, if you need an upgrade, better get on with it because prices are going back up in response to the AI boom. There are millions of cheap used 3060 12GB cards out there going for around $200 in the US. The one I have is equivalent to a 4060 with an overclock - EVGA, Samsung die, Samsung VRAM. I'll be using mine for another year or two. I'll probably buy a used 7900XT, 4080 or 3090Ti, depending on budget. Supply and demand should equalize by then.
Man, if they had used the same die as the 7700xt/7800xt this could have been something special. Hell, if they had launched this for even $10-$20 less, it might have gotten better reviews as it is. The really sad thing is, if you don't have at least $400 to spend, and you want something that will have that VRAM moving forward and you want current gen...then this is pretty much the only option on the market.
This is NOT a bad card at all, it's just a bit mismanaged. You either price this @ 299, or they could have given it 64MB of cache and more MHz to get it to match the 6700xt. In a world where the 6700xt exists, why should I get this card, at the same price, for 10-20% less performance? It leans a lot on selling VRAM you'll probably never max out.
Barring another crypto craze, the price is only going to get lower on the 7600 XT and it'll become better value. I reckon it'll be at the magic $300 by year's end, or there will at least be deals for it at that price or lower.
I only just recently broke the 8GB mark in 1440p gaming, ironically with the HD texture pack on FC6, a game that is more than 2 years old. Haven't played AW2, but of the new releases I did play last year, AC Mirage, Starfield and the total abortion of a game, Motorfest, all run comfortably with settings maxed at 1440p on an 8GB GPU.

From the benchmarks I've seen of AW2, you need more than 8GB to run it at 1080p. Now that they have broken that barrier, other devs will have no qualms about following suit. AW2 is entirely playable on a 3060 12GB, even at 1440p. If we look at AW2 as the new Crysis and expect titles in the near future to have similar resource demands, a GPU of this class with 12GB VRAM is the least you should be buying new in 2024, even for 1080p. The RTX 3060 is the new budget standard. Goodbye 1650, it was nice knowin' ya.

They could have gone with 12GB for this GPU and they actually would have had a viable competitor to the 4060, because if you are gaming at 1440p there will likely be upcoming releases you can't run on an 8GB GPU. Instead, they ended up at too high a price point because of the extra 4GB of VRAM which you definitely won't ever be able to use. By the time you need 16GB, the 7600XT will be as useless as a 16GB RX580 would be now. You're probably not going to be running GTAVI at 1440p on an 8GB GPU, and maybe not even 1080p, assuming they weren't lying through their teeth about the trailer being actual in-engine rendering. The Indy game also looks like it is going to be VRAM hungry.

As for RT, it is kinda useless on a GPU of this class even if it is an Nvidia card. There are some games with limited RT that will work, like FC6, FH5 and AC Mirage. But games like Cyberpunk and AW2, and basically any new release moving forward that supports RT, forget it. I think your assertion that 8GB is still fine is the result of a conflict of interest between your YT channel and your PC flipping business.
AMD should just make two serious ray tracing models with the next gen. They'll probably drop a ton of cards like they have been, with the same average ray tracing performance they've been doing, but just pick 2 cards to really excel in ray tracing for the people who want RT and want to support AMD. They could drop an 8900 xt and an 8900 rt, for example, for 4k gaming, and then an 8700 xt and rt for 1440p.
AMD made 4060 look good with this gpu. Amazing work as always. The only worthwhile gpu this generation from AMD is 7800XT, everything else is just plain bad.
@@ShinyHelmet not entirely correct, entry level cards are dogshit that no one should ever buy, since you can just get an APU that outperforms them. That being said, they might make a nice paperweight.
@@ShinyHelmetNo, these cards are literal e-waste. 128 bit memory bus with 16gb of memory is idiotic. Nvidia did the same thing with the 4060 ti 16gb. Useless cards.
@@PainX187 Well that of course depends whether you need to buy a new cpu or not. If your cpu is good enough for what you do, then why waste money on an APU when getting a shitty used card would do? Like I said, it all comes down to the price.
Are people really using ray tracing other than in Minecraft and games like that?? Because I have never turned it on. I am always looking for the best high to ultra raster I can get, and everything always looks great.. IDK, that's just me..
I lucked out and got a 6700 xt for $200 a few weeks ago and I am blown away by its performance, especially for the price. Now I'm even more amped it's discontinued; the next card will be a full next gen at this point, so I'm going to enjoy my buttery smoothness. Thanks for the comprehensive review as always!
yeah. I have been looking for an upgrade and the best upgrade for me having a 6700XT was to upgrade my CPU to a 7800X3D. My 6700XT seems like it got an upgrade.
The problem with buying this for things outside of gaming is that Nvidia is far, faaaaaar better for basically anything not gaming. Stable diffusion is a great example, it does not even work on AMD without a lot of tinkering and the performance is bad compared to Nvidia even when you get it to work. Cuda gets used a lot outside of gaming and its exclusive to Nvidia.
Kinda slots in with all its budget buddies fine. I can see some people thinking the 16gb on 7600XT is impressive, not realizing the bus isn't great etc, seems okay but not wow.
should've been $300, and the 7600 non-xt should've been officially dropped to $250. The RX 7600 is such a joke at 270 when the $300 4060 has 10% more raster perf, is way more efficient and has all of Nvidia's features.
This thing doesn't quite have the processing power for 1440p gaming - and certainly not for 4k - so the 16gb isn't really necessary. It would be nice for AI, but at the moment you'd really want an Nvidia card for that because most AI is written with them in mind.
How does AI dabbling compare on Nvidia vs. AMD? It would be an interesting video to compare the top 5 AI projects running on an RTX 4060 Ti 16GB vs. an RX 7600 XT 16GB.
This card should be fine for stable diffusion with ROCm using a smaller 13 million parameter model from huggingface. There is a specific kernel version you need to use, according to what Wendell at the Level1Techs forums was testing on his linux channel. Once DirectML from Microsoft launches, then we could see more improvement.
The 4060 will smoke it in almost any AI application. To be fair, there have been new improvements for AMD with AI, and the 7000 cards do perform much better than the 6000 cards. Also, more VRAM is only helpful if the model actually uses it! Not all models scale up to make use of the full 16GB!
Giving 16GB of VRAM to a card that actually needs it for RT and high resolution gaming:
AMD & Nvidia: 🙅
Giving 16GB of VRAM to an entry card that will never use it:
AMD & Nvidia: 🤝
Right here lmao. When you can get better performance at $350 for an AMD card than every single NVIDIA card at the same price, yeah AMD is ruling the GPU game right now.
The power requirements are way exaggerated by AMD. I assume it's just to cover themselves legally from insurance claims if a dodgy PSU goes poof. But a good 500W unit will easily run this, maybe even 450W depending on your other hardware.
I don't believe they can't cut margins. How can Intel sell a card with an over 400mm2 die, made on the same TSMC 6nm process, with the same 16GB of memory but a 256-bit memory bus twice as wide, for the same price? For those that don't know, this means you can make multiple 7600XT dies for a single Arc A770/A750 die, the board is more expensive, and it even needs better cooling because Intel's cards are less efficient. Unless you think Intel is losing $150+ per card sold, this makes no sense.
Intel is taking a hit and eating the cost; it's what needs to happen for their product development and for the drivers to be ready. Only once they are proven in the consumer space for 3-4 years will business folks be willing to spend any money on Arc Flex offerings in the enterprise.
They sell at a loss, because the 770 and 750 were (and maybe still are) so bad. Expect the Battlemage versions to double the price (unless they are also really bad).
TLDR: Yes it’s a slightly older card but it’s fully capable of gaming in 1440p - 60+fps - high settings It’s a fantastic card for the money and there’s no reason to go Nvidia when you have this amazing gpu for under $350… Video fixed
1440p Gaming??? Damn... I guess I will have to wait a few more years before I fork out another whopping chunk of money to replace my aging GTX 1080Ti and fully utilise my old 4K monitor's abilities. Yup, my old GTX 1080Ti games and live streams @4K. Granted, no ray tracing, but at least live streaming performance was at an acceptable 50~60 FPS @4K, and yes... I have proof of that... Yeah nah NOPE! I may as well also hold off on an 8K monitor, which would provide a Super UHD HDR delight, especially if it were complemented with ray tracing. Alas, with graphics cards it seems it's just not worth the bang for buck yet, and they are still incapable of providing better performance with this ray tracing feature enabled, so really the investment in a new card is just not appealing enough. I wanna see at least 4K @100fps gaming and streaming performance with all graphics settings maxed and ray tracing enabled before I take the plunge and upgrade my old GTX 1080Ti.
You did a little Linus over there whoopsie.
Great point
'after almost 10 years' which actually shows that there's been like 0 actual progress in gaming in 10 years. same ol' obsolete crappy graphics with god rays slapped on top. rays can't improve obsolete low-VRAM textures and models, and they're overpriced af, which forces game devs to keep making games on obsolete engines for affordable OBSOLETE hardware.
there's 0 progress when you compare 'new' GPUs to 'old' ones. e.g. the 6700 XT is cheaper and faster than the 7600 XT, the 4060 Ti isn't faster than the 3060 Ti, the 7800 XT is barely faster than the 6800 XT. AFTER YEARS of 'progress'. corporations are milking the market and killed progress so they don't have to drop prices on the obsolete tech they keep selling year after year after year...
lmao
More like 30 FPS in CP 2077 DLC, and 2 FPS in Alan Wake 2.
If only AMD hadn't launched the 7700 XT, they could have said something like "all of our cards over $300 have at least 16GB of VRAM". What a missed chance.
Bud, why are you trying to cope for AMD... They should've launched the 7700 XT for $380 and never launched the 7600 XT, because it's basically a 6600 XT + 16GB.
Bruh, what difference would it make? This doesn't change the fact that nobody asked for a 16GB 7600.
Maybe a lot of Navi 32 GPUs had partially faulty memory controllers, which forced them to release a 12GB model in large volume.
@@lharsay Navi 32 uses MCDs like Navi 31, so no?
Each MCD is 64-bit, so a 7700 XT just has 3 working MCDs instead of the full 4 of the 7800 XT.
AMD just had to use more working MCDs on the 7700 XT to bring it to 16GB, which they should have done to make it good at $450 (and also call it the 7800).
It would be like 10% worse than the 7800 XT but also 10% cheaper, with the same VRAM capacity (just slightly slower).
But it would be a good way to use cut-down Navi 32 dies without making a garbage product that everyone hates.
@@krazybonnie5523 I'm almost certain it's the GCD of those GPUs not being able to handle all 4 MCDs at once which is why they replaced one with a dummy die.
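The MCD arithmetic in this thread can be written down directly. A minimal sketch of the RDNA 3 Navi 32 configuration math, assuming the publicly described layout (each memory cache die carries a 64-bit GDDR6 controller, with a 2 GB chip per 32-bit channel in the non-clamshell configs):

```python
MCD_BUS_BITS = 64          # one 64-bit GDDR6 controller per MCD
GB_PER_32BIT_CHANNEL = 2   # one 2 GB GDDR6 chip per 32-bit channel

def navi32_config(active_mcds):
    """Bus width and VRAM capacity implied by the number of active MCDs."""
    bus_bits = active_mcds * MCD_BUS_BITS
    vram_gb = (bus_bits // 32) * GB_PER_32BIT_CHANNEL
    return bus_bits, vram_gb

print(navi32_config(4))  # 7800 XT: (256, 16)
print(navi32_config(3))  # 7700 XT: (192, 12)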
I feel like GPUs are becoming more and more like sports hatchbacks. You get a bit more from the newer specs, but if you pay for the premium brand you'll get a nicer experience. Drip-feeding people is not a good way to keep customers, but if you wish to abuse your position then this makes sense. Great video!
Indeed. And that's why so many will skip generations - and only upgrade when drastic discounts apply
@@8020Alive people ALWAYS SKIPPED GENERATIONS, get real. Very few people upgrade their computers every two years; that has NEVER been true. This generation from Nvidia was a massive jump for AI and ray tracing; raw rasterization improvements depended on Moore's Law, which is over. Whiners on social media don't represent actual consumers in the real world, only 1 in 10 of whom have actually bought the new RTX 40 cards. Don't equate AMD's disappointments with that, that's cope.
@@Wobbothe3rd ok bro, it's every 3 years, whoop de fakin dooooo
@@Wobbothe3rd no, Moore's Law is actually alive and well. it's just way more profitable for corporations to milk the market with 0-progress obsolete tech so prices don't drop. so that's what they do, while making up fairy tales for noobs about inflation and 'Moore's Law is dead' to justify 0 progress and insane 3-4-10x overpricing. all to make record profits, which they all actually are making, because noobs eat their marketing fairy tales for breakfast.
While 16GB is very useful, especially as the card ages, the use of a 128-bit memory bus is a huge issue. Keep in mind that even the ancient GTX 970 would do around 256GB/s for many non-reference models, and with some overclocking would do around 275-280GB/s, and the card benefited greatly from such an overclock. The RX 7600 XT does around 288GB/s. If a card that offers around 1/3rd the performance experiences VRAM throughput bottlenecks, imagine what it is doing to the RX 7600 XT.
More VRAM helps a card age better. For example, cards with 8GB lasted far longer than ones with 4GB, even in cases where the 8GB card had a slightly slower GPU. When a card offers more VRAM at ample throughput, you get to crank up VRAM capacity and bandwidth intensive settings with virtually no frame rate drop. For example, if your card has enough VRAM, running at a fast enough speed, you can crank up textures with negligible impact on frame rates, thus getting a better looking game. On the other hand, if a game needs a lot of VRAM at very high throughput, having extra VRAM is not as effective, since the throughput bottleneck will cause a large performance drop.
True, although a small nitpick: the 970 is really on an effective 224-bit bus because of the 3.5GB VRAM issue, meaning the fast partition was really getting about 196GB/s at stock. The memory bus being split up is the entire reason you couldn't really use the last 0.5GB of VRAM on the card.
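For anyone wanting to check the bandwidth figures in this thread themselves, the math is just bus width times per-pin data rate. A quick sketch; the data rates used are the commonly quoted stock figures, treat them as assumptions:

```python
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """GDDR bandwidth in GB/s = bus width in bytes * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(128, 18))  # RX 7600 XT: 128-bit @ 18 Gbps -> 288.0
print(mem_bandwidth_gbs(256, 7))   # GTX 970 full bus @ 7 Gbps     -> 224.0
print(mem_bandwidth_gbs(224, 7))   # GTX 970 fast 3.5GB partition  -> 196.0
```

This also shows where the numbers above come from: the 970's famous "224 GB/s" is the full 256-bit bus at stock 7 Gbps, while the effective 224-bit path to the fast 3.5GB partition lands at 196 GB/s.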
So what should I buy? Lipika from Assam
price swap on the 7600xt and 7600, solid video!
What would be the best pairing for Valorant with a 7800X3D CPU? GPU price max 550 USD.
4070 Super is on sale for $569 and 7900 GRE is ~$529
Great job Brian with this GPU comparison Benchmark! 👍
Glad you included the RTX 3060 12GB, the GPU I use. It's bottom of the barrel but still works, even in newer games. 🙂
I have the 3060 too. I think it's a great card for the amount I paid for it. $230 used, last summer. I play at 1440p ultrawide and yes, I have to turn down settings, but it can still get high framerates, and is good for video editing and some casual ai stuff on the side.
Previous generation 60 class is decent. Remember there are still people on Pascal. I currently game at 1440p with an RTX 2070 which is similar performance and it's decent. DLSS really has extended the longevity of my card. I'm gonna get the 4080 Super when it comes out and pair it with a new PC build.
@@MisterTonyG Nice, the 4080 Super will be a massive upgrade. That's the card I want to upgrade to, but I don't really have cash for a new GPU at the moment. DLSS has been a lifesaver for the longevity of GPUs. It helps me not feel as bad about sticking with the 3060 for a while longer.
@@nicksterba For sure. Those 12 GBs of VRAM on your 3060 are nice. There are some games where I max out my 8 GB on my 2070.
@@MisterTonyG It really is. I play a lot of Halo Infinite, and idk if you've noticed but Hardware Unboxed have done enough testing to notice that after 20-30 mins of playing Halo Infinite, it starts to exceed 8 gigs of VRAM. Highest I've seen it at 3440x1440 is up to 11 gigs. So yeah the 12gb is nice. I hope you enjoy the 4080 upgrade when you get it :D
It needs more Australian Export Multipurpose Spray.
This is why I can't wait for the next gen cards to come out.. I have a 6800XT and I am still doing very well with it..
The 7600 XT is literally the 7600 but with double the VRAM and a 100 MHz higher boost clock. AMD has officially lost their mind.
They did not have many options…
The 7600 already uses the full die. The 7700 XT is already heavily cut down, and the 7800 XT offers AMD a much better profit than the 7700 XT, so cutting it more makes zero sense.
So now they have a niche option: a 16GB card at 6700-class performance for those who really need more than 8GB of VRAM… I am sure AMD does not expect to sell a lot of these 16GB versions, but it is there if someone really needs it.
I mean, considering only 100 MHz was added to the boost clock and nothing else is different other than 8GB vs 16GB and max power usage, this performs a lot better compared to the vanilla 7600 than I thought it would.
It's not a crazy bad product, and as with 99% of AMD GPUs launched in the past couple of years, the price will drop within 6 months; if it gets to around $300 USD it will look like a much better product. Performance isn't trash either (apart from RT); the raster is solid, not amazing.
I had an RX 6600 and it had no performance issues in the games I played, and I didn't feel like I was barely scraping by at only min settings. The 7600 is a chunk faster than the RX 6600 (and the XT a chunk higher again), so it will perform well enough as long as you keep your expectations in check.
@@Alakazam551 "I mean considering only 100mhz was added to the boost clock and nothing else is different other than the 8gb vs 16gb and max power usage, this performs a lot better compared to the vanilla 7600 than i thought it would."
No, it doesn't; it's like 2-4% faster. Watch GamersNexus' review of the 7600 XT. The 6600 XT was 15% faster than the 6600.
"It's not a crazy bad product and as with 99% of Amd Gpu's launched in the past couple of years the price will drop within 6months - which if it gets to around $300 USD it will be looking like a much better product. Performance isn't trash either (apart from RT) , the raster is solid - not amazing"
Yes, it is a bad product; don't even try to cope that it's not. Putting 16 GB of VRAM on a crippled 128-bit bus, because apparently "more VRAM = good GPU", won't actually make it a good GPU. Also, the RX 6700 XT is at the same price as this GPU and is actually faster. There are literally no scenarios where the 7600 XT would need 16 GB unless you do AI or whatever.
Bottom line, this GPU is an absolute joke which no one asked for.
So higher VRAM and a higher clock speed are bad? What am I missing here?
Uhhh what? lol Yes, VRAM costs money... and you're getting it basically at cost. 8GB is not viable anymore.
Yes man posted gotta watch before I sleep :]
This is a crazy time to be the guy out looking to get a new GPU to upgrade your rig with. Thanks for doing your best at pointing people in the right direction.
Love the design of the Steel Legend GPU, definitely looking to buy one to match my Steel Legend motherboard.
Cyberpunk 2077 on ultra graphics settings at 2K resolution: 24 fps on the RTX 4060 and 48 fps on the RX 7600 XT is all you need to know about these graphics cards.
The processor was a Ryzen 7 5700X3D.
Mate, you swapped the prices between the 7600 and 7600 XT on the first table
Sorry in such a rush, just going to blur that bit out so it doesn't confuse.
Tech Yes City King keep it up! My 2070S FE says HELLO 2024
Bryan, I hope you can make a video about the current market and the coming NAND flash "shortage" we are going to see again. IMO I smell another DRAM supplier collusion happening soon, like during the massive DDR4 adoption in 2016, since now everything is starting to move to DDR5 and NVMe drives.
Love your content, most honest and helpful 👍👍
Just like any other GPU out there, whether it's AMD or Nvidia, after a few driver updates is when the card really starts to shine, and I feel like the 7600 XT is one of those cards.
Hi @techyescity, love the video, great info. Still torn as to what to do though. I'm currently wanting to upgrade from an RX 570 4GB, which is getting a bit long in the tooth, paired with an R5 2600 which is still fine, I think. I'd like to play stuff at 1440p at 60Hz at decent quality, as I have a Samsung smart TV which is also used by the kids for console stuff. I'm considering a 4060 Ti 8GB or possibly the RX 7600 XT since the price is similar. Am I barking up the wrong tree with this plan, or is there a better way? Am I being unreasonable to think that the R5 2600 plus the updated card can deliver that sort of performance, or would an upgraded CPU also be in order? I also only have a 550W PSU, a gold-rated Seasonic unit, which I think is enough for a 4060 Ti but might be a bit optimistic for an AMD card. Love your work and the channel.
This was my struggle for a while, but some RTX 4060 Ti cards are now starting to go on sale because the card is "older" now. Unless you play a lot of games that require that much VRAM, the RTX 4060 Ti is just way better right now, and if you can visit the used market, a 3070 Ti for example is damn cheap. I still went with the RTX 4060 Ti though, because I don't want to go without warranty.
On the note that these entry-level cards are ridiculous: no matter which one you pick, it won't be perfect. My pure advice would be the 4070 Super and the 7700 XT, the only cards from either side that won't disappoint you, and the 7700 XT is cheaper as well. And if you want more from AMD, the 7800 XT, good luck.
I upgraded everything and went from 4GB to this one. I'm really happy with it; I can play what I want on good settings.
Really good video as usual. I hoped they would sell the 7600 for 230 and the 7600 XT for around 280. I don't know how much they earn from that card, but for me personally this card is overpriced, like all the other new cards out there. I run a small business, and dudes come in, throw 500-700 bucks on average on the table, and want a nice little gaming PC; I would love to build it with those types of cards... I still go with the 6600 and 6600 XT (not the 6650 XT, same price problems), buying refurbished or used. I still love those cards at that price point. Maybe with the next generation we can get a used or refurbished one of these for around that money. Best regards from Germany.
I currently have a 5700 XT; is this even worth an upgrade at all? The only reason I want to upgrade is that the version I have has noisy fans. Or is a 4060 Ti just a better buy all around?
6700xt or 4060ti is a good buy
Well, I was in the market and tldr I ended up buying a vega 56 instead of a 7800xt or 4070 😂
7700XT
@@FunkyTechy Even the 8GB model?
3060 Ti is the exact same as 4060 Ti
well... it's the best card I can afford considering low availability of 6800xt. plus I love the look
glad it's not all "bad decision", thank you and have a great day
You should do a Radeon RX 7600 XT go head-to-head against the ARC A770 16GB.
Funnily enough, the Arc A770 will be significantly better in real workloads. It has a 256-bit memory bus, while the 4060 Ti 16GB and this card have a 128-bit bus. In workloads that require the extra VRAM, the A770 will be much better. In fact, the 128-bit bus makes those other cards nearly unusable; they are essentially e-waste.
They would just be guessing, since idk how the cards perform; the lowest end is easier to judge, but well, not enjoyable.
The previous video was something midrange, idk, probably a 6700 XT.
I like how you look at it from an economic perspective. We have the 4050 renamed as the 4060, and the 4060 as the 4070.
Great graphics card review 👍
I bought the ASRock Steel Legend RX 6650 XT to fit a mostly white-themed build for my nephew. I did a small, easy overclock with an undervolt and turned off the silent mode.
I found the card itself to be really good all around. I don't like the zero fan RPM mode and turned it off, because the ramp from zero to whatever % just seemed too harsh, so leaving the fans on all the time helped. I would say the cooler is good, not great, but at the price I can't really complain; I think a denser fin stack would've made this a top-tier cooler. Max temp under 100% stress was around 70C in about a 24C environment, a little loud but by no means terrible. Obviously with a headset you can't hear it.
I just got Robocop Rogue City on the radar so it's awesome to see benchmarks for it. Another banger video, cheers for the testing on this! 😎
The 16GB of VRAM is really just for future-proofing, and it can help the 1% lows in games.
We saw this with the 6500 XT 4GB vs. the limited run of 8GB cards. At least the 7600 XT is a different SKU, so it's designed for 16GB, rather than just being a 7600 with double the RAM.
It's not; it uses the same expensive clamshell design Nvidia used for the 4060 Ti 16GB to double the VRAM while utilising the same memory controller, resulting in 2 VRAM packages sharing the same bus.
This is such a rush job of a product: AMD just took the RX 7600, overclocked it, then doubled the VRAM just so they could prevent Nvidia from hogging the news cycle with the launch of their SUPER refresh.
Thousands of years ago (in computer years) the 4gb rx580 was deemed equal to the 8gb rx580. The extra 4gb was "unnecessary." Today, thousands of years later, the 8gb rx580 is still a viable gaming card.
Yesterday I got the same Steel Legend, brand new, from FB Marketplace around $100 cheaper than the market price here. Really happy with how it looks, and in gaming I don't need to turn the volume to max so I don't hear the fans, unlike my old RX 5700 XT 😂🎉
Maybe it was just coincidence in this particular test, but it seems the nvidia cards are much more stable than their AMD counterparts if you look at the average fps and the lows.
VRAM talk is getting out of hand. 12GB is enough for 4K and 8GB for 1440p; 6-8GB is plenty for 1080p. People should stop complaining about things they don't even need or know how they work.
Exactly 🤣
Nah, just look at RE4 and TLOU and their performance on the 3070 Ti, and how bottlenecked that card is because of its VRAM. 12 is good for 1440p, while 16 is necessary for 4K nowadays.
So which GPU should I buy if I have the same budget as the RX 7600 XT but I want a new GPU?
The 7600 XT's new benchmarks are getting better performance than the 6700, almost up to the 6700 XT; the new drivers are improving a lot. That's why the 7600 XT is running out of stock.
AMD is trying to give consumers GPUs at a decent price point. If you're playing maybe one or two of the latest titles, you will be satisfied with a GPU you can buy at a decent price. With the RX 7600 XT's numbers up, on a GPU that only has x8 lanes, it performed pretty well in some titles. If I were in the market for the RX 7600 XT 16GB for playing retro and RPG titles, I'd be very happy with this GPU on my 1080p 60Hz TV monitors.
As bad as it is with USD, it's still better than many other currencies. My currency has dropped 40% in value against the USD in recent years, so even at the same USD price, graphics cards are 40% more expensive because they are imported goods and thus heavily dependent on the currency's value. You could get a GTX 580 back in 2011 for 4000, and now 4000 gets you a 4060. I used to always spend around 2000, which was about 250 bucks, and get good cards. Now 2000 is more like 150 bucks, and you can't get anything for that except the crappy 6500 XT.
I remember the lukewarm reception the RX 6600 had when released, but it later became one of the best value-for-money cards you could buy. I think we may see the same with the RX 7600 XT when its price inevitably drops to $300 or below, and games get that much more demanding on VRAM going forward. Could be one to review again in a year's time.
Yes, my friend. After some driver updates and a price drop, this card will be like the RX 6600.
That's why I don't get the people complaining that there are no good cards under $200 when the 6600 has been there for more than a year now. A year or more later, the 7600 should hopefully take over that sub-$200 price, and the 7600 XT drop to less than $250.
Everyone said the RX 580 8GB was a waste... That card lasted a LONG time compared to the 4GB.
This card does not deserve the XT at the end of its name at all. It is ONLY an overclocked 7600 with more RAM. As bad as Nvidia is, at least the 4060 Ti has more ray tracing and CUDA cores than the regular 4060.
@@tonydavis8696 I don't see that it really matters. Both companies are guilty of stupid naming schemes for their cards over the years. Yeah, it's an overclocked 7600 with 8GB more VRAM, but at least the public can tell that the XT version is the one with more VRAM. Unlike Nvidia and their 1060 3GB and 6GB, which had entirely different specs, not just different VRAM. Same goes for the RTX 3060 12GB and 8GB models.
"Price will go up and never go down". So True!
No way the RTX 3060 consumes only 30W more than the 4060. Weird numbers.
I have an HP Victus with a 5700 and a 6600 XT, 16GB. Is it worth upgrading to 32GB? If so, what brand and model number?
I think you have the prices the wrong way round. I would like extra vram at a cheaper price though lol.
Sorry in such a rush, just going to blur that bit out so it doesn't confuse.
@@techyescity No worries. I was excited for a moment haha.
When this gets down below $265, I'll pick one up... Maybe Prime Day in June? If they came out with a single slot, blower style card, I would pick that up for $330 for machine learning in my home lab...
The 4060 would be a much better choice for AI. Stop the denial and AMD cope.
Can an RX 7600 8GB Steel Legend with a Ryzen 5 5600G run GTA 6 at good/decent settings?
F*** now i'm more confused
I was waiting for 7600 xt and now i don't know what to get anymore
They have been creeping back up but I have picked up 2 used 6700xt cards for 240 and 250 USD respectively in the last 6 months. Kids' PCs are set till College with those bad boys. The 7600xt is just too expensive for what it is even with the added Vram IMHO.
Seems like AMD is trying to bamboozle less knowledgeable people with "LOOK, 16GB!" who don't realise that the 128-bit memory bus and lack of CUs are going to prevent all of that VRAM from really being utilised in gaming.
A bit of a cheap move if you ask me.
I'm actually planning to buy one; it's now the cheapest way to get 16GB of VRAM. For uses other than gaming, it's great.
ARC has many issues still :/
Yeah, it will run Stable Diffusion but you'd better be patient. Might be some benefit in video editing if you are working with 8k video.
I could have gotten a 6750 XT for about €380 around Black Friday last year... probably about as good as it gets for the foreseeable future. The 6800 is about another 100 euros, so worth it IF you have the money.
Regardless of your budget, if you need an upgrade, better get on with it because prices are going back up in response to the AI boom. There are millions of cheap used 3060 12GB cards out there going for around $200 in the US. The one I have is equivalent to a 4060 with an overclock - EVGA, Samsung die, Samsung VRAM. I'll be using mine for another year or two. I'll probably buy a used 7900XT, 4080 or 3090Ti, depending on budget. Supply and demand should equalize by then.
Man, if they had used the same die as the 7700xt/7800xt this could have been something special. Hell, if they had launched this for even $10-$20 less, it might have gotten better reviews as it is. The really sad thing is, if you don't have at least $400 to spend, and you want something that will have that VRAM moving forward and you want current gen...then this is pretty much the only option on the market.
I have a AMD Ryzen 7 5800X processor, can I use an AMD Radeon RX 6700 XT graphics card with it?
This is NOT a bad card at all; it's just a bit mismanaged. You either price this at $299, or they could have given it 64MB of cache and more MHz to match the 6700 XT.
In a world where the 6700 XT exists, why should I get this card at the same price for 10-20% less performance? It leans heavily on selling VRAM you'll probably never max out.
Barring another crypto craze, the price on the 7600 XT is only going to get lower and it'll become better value. I reckon it'll be at the magic $300 by year's end, or there will at least be deals for it at that price or lower.
They cannot add cache, because the normal 7600 already uses the full die…
Is 6750XT worth £30 more than 6700XT? £300 vs £330
Damn, my RX 6700 XT is still killing it. Can't wait to upgrade in 3-5 years; might upgrade to an AMD-based CPU later on, since I've heard the new one is great too.
I only just recently broke the 8GB mark in 1440p gaming, ironically with the HD texture pack on FC6, a game that is more than 2 years old. Haven't played AW2 but of the new releases I did play last year, AC Mirage, Starfield and the total abortion of a game, Motorfest, all run comfortably with settings maxed at 1440p on an 8GB GPU.
From the benchmarks I've seen on AW2, you need more than 8GB to run it at 1080p. Now that they have broken that barrier, other devs will have no qualms about following suit.
AW2 is entirely playable on a 3060 12GB, even at 1440p. If we look at AW2 as the new Crysis and expect titles in the near future to have similar resource demands, a GPU of this class with 12GB VRAM is the least you should be buying new in 2024, even for 1080p. The RTX 3060 is the new budget standard. Goodbye 1650, it was nice knowin' ya.
They could have gone with 12GB for this GPU and they actually would have had a viable competitor to the 4060, because if you are gaming at 1440p there will likely be upcoming releases you can't run on an 8GB GPU. Instead, they ended up at too high a price point because of the extra 4GB of VRAM, which you definitely won't ever be able to use. By the time you need 16GB, the 7600 XT will be as useless as a 16GB RX 580 would be now.
You're probably not going to be running GTA VI at 1440p on an 8GB GPU, and maybe not even at 1080p, assuming they weren't lying through their teeth about the trailer being actual in-engine rendering. The Indy game also looks like it is going to be VRAM-hungry.
As for RT, it is kind of useless on a GPU of this class even on an Nvidia card. There are some games with limited RT that will work, like FC6, FH5 and AC Mirage. But for games like Cyberpunk and AW2, and basically any new release moving forward that supports heavy RT, forget it.
I think your assertion that 8GB is still fine is the result of a conflict of interest between your YT channel and your PC flipping business.
so, for ray tracing, nvidia is still better?
This is disappointing.
AMD are actually listening to the reviewers that were moaning about VRAM.
And when they listen, they still get poor reviews.
AMD should just make two serious ray tracing models with the next gen. They'll probably drop a ton of cards like they have been, with the same average RT performance they've been doing, but just pick 2 cards to really excel at ray tracing for the people who want RT and want to support AMD. They could drop an 8900 XT and an 8900 RT, for example, for 4K gaming, and then an 8700 XT and RT for 1440p.
AMD made 4060 look good with this gpu. Amazing work as always.
The only worthwhile gpu this generation from AMD is 7800XT, everything else is just plain bad.
No bad cards, just bad prices.
@@ShinyHelmet Not entirely correct; entry-level cards are dogshit that no one should ever buy, since you can just get an APU that outperforms them. That being said, they might make a nice paperweight.
@@ShinyHelmet No, these cards are literal e-waste. A 128-bit memory bus with 16GB of memory is idiotic. Nvidia did the same thing with the 4060 Ti 16GB. Useless cards.
@@PainX187 Well, that of course depends on whether you need to buy a new CPU or not. If your CPU is good enough for what you do, then why waste money on an APU when a cheap used card would do? Like I said, it all comes down to the price.
@@username8644 That's your opinion, to which I disagree.
There is a 4050, but it's only for mobile use. TSMC charges more money for smaller process nodes.
Cheers Bri !
is the 6700 xt still a better deal today?
Are people really using ray tracing other than in Minecraft and games like that? Because I have never turned it on. I am always looking for the best high-to-ultra raster I can get, and everything always looks great. IDK, that's just me.
Still got shitty memory bandwidth. Everyone has cut that down this gen, hidden restrictions most people don't look at.
very nice undervolting results
I lucked out and got a 6700 XT for $200 a few weeks ago and I am blown away by its performance, especially for the price. Now I'm even more amped it's discontinued; my next card will be a full next-gen one at this point, so I'm going to enjoy my buttery smoothness. Thanks for the comprehensive review as always!
yeah. I have been looking for an upgrade and the best upgrade for me having a 6700XT was to upgrade my CPU to a 7800X3D.
My 6700XT seems like it got an upgrade.
The problem with buying this for things outside of gaming is that Nvidia is far, faaaaaar better for basically anything not gaming.
Stable Diffusion is a great example: it does not even work on AMD without a lot of tinkering, and the performance is bad compared to Nvidia even when you get it to work.
CUDA gets used a lot outside of gaming, and it's exclusive to Nvidia.
A future-proof 1080p card, especially for fancy games played at ultra eye candy. The price should be 200-300 bucks, though.
I think the 7600 XT was brought about by the marketing department instead of engineers and testers who knew how it would perform.
Kinda slots in fine with all its budget buddies. I can see some people thinking the 16GB on the 7600 XT is impressive, not realizing the bus isn't great, etc. Seems okay, but not wow.
It should've been $300, and the 7600 non-XT should've been officially dropped to $250. The RX 7600 is such a joke at $270 when the $300 4060 has 10% more raster performance, is way more efficient, and has all of Nvidia's features.
Sounds like this GPU has been shrinkflated.
How about playing at mid resolution on a 4K monitor/tv?
Just look at the 1440p results and couple them with FSR or DLSS... that's what I generally do.
This thing doesn't quite have the processing power for 1440p gaming - and certainly not for 4k - so the 16gb isn't really necessary. It would be nice for AI, but at the moment you'd really want an Nvidia card for that because most AI is written with them in mind.
Why is everyone comparing the 7600 to the 6700? Shouldn't this card be compared to the 6600?
How does AI dabbling compare on Nvidia vs. AMD?
It would be an interesting video to compare the top 5 AI projects run on an RTX 4060 Ti 16GB vs. an RX 7600 XT 16GB.
The big problem for gaming is that the 128-bit bus is what is strangling the performance of the card, regardless of the 16GB of VRAM.
You don't use AMD for AI.
This card should be fine for Stable Diffusion with ROCm using a smaller 13-million-parameter model from Hugging Face. There is a specific kernel version you need to use, according to what Wendell at the Level1Techs forums was testing on his Linux channel. Once DirectML from Microsoft launches, we could see more improvement.
The 4060 will smoke it in almost any AI application. To be fair, there have been new AI improvements for AMD, and the 7000 cards do perform much better than the 6000 cards. Also, more VRAM is only helpful if the model actually uses it! Not all models scale up to use 16GB!
How does this 16GB card play Escape from Tarkov? That game needs a ton of video memory.
We need a Dad-man cameo soon, Bryan. It's been too long.
MSI Afterburner still working?
Is this card PCI Express limited?
*Giving 16 GB of VRAM to a card that actually needs it for RT and high-resolution gaming:*
-AMD & Nvidia: 🙅
*Giving 16 GB of VRAM to an entry card that will never use it:*
-AMD & Nvidia: 🤝
where are amd fanboys 😂
Right here, lmao. When you can get better performance from a $350 AMD card than from every single Nvidia card at the same price, yeah, AMD is ruling the GPU game right now.
The power requirements are way overexaggerated by AMD. I assume it's just to cover themselves legally from insurance claims if a dodgy PSU goes poof. But a good 500W will easily run this, maybe even 450W depending on your other hardware.
lessgo yes man
I don't believe they can't cut margins. How can Intel sell a card with an over-400mm2 die, made on the same TSMC 6nm process, with the same 16GB of memory but a doubly wide 256-bit memory bus, for the same price?
For those that don't know, this means you can make roughly two 7600 XT dies for a single Arc A770/A750 die (more once yield is considered); the Arc board is more expensive, and it even needs better cooling because Intel's cards are less efficient.
Unless you think Intel is losing $150+ per card sold, this makes no sense.
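The area comparison behind this argument is easy to write down. A rough sketch; the die sizes are the commonly quoted public estimates (assumptions here), both parts on TSMC N6:

```python
NAVI33_MM2 = 204   # RX 7600 / 7600 XT die, commonly quoted estimate
ACM_G10_MM2 = 406  # Arc A770 / A750 die, commonly quoted estimate

# Raw silicon-area ratio between the two dies.
ratio = ACM_G10_MM2 / NAVI33_MM2
print(round(ratio, 2))  # -> 1.99
```

So by raw area it is about two Navi 33 dies per A770 die, not four; defect yield favors the smaller die further, so the effective per-die cost gap is somewhat larger than the area ratio alone suggests.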
Intel is taking a hit and eating the cost; it's what needs to happen for their product development and drivers to mature. Once they are proven in the consumer space for 3-4 years, only then will business folks be willing to spend any money on Arc Flex offerings in the enterprise.
Intel is in fact HEAVILY subsidizing the cost of the Arc cards.
@@Wobbothe3rd Do you think it's reasonable to assume they're losing at least $100 on every Arc card sold?
They sell at a loss because the A770 and A750 were (and maybe still are) so bad.
Expect the Battlemage versions to double the price (unless they're also really bad).
I'm not complaining. I have the 6700 XT, but I'm looking to get another card and give my 6700 XT to my daughter.
Used RDNA 2 and RTX 3000 cards are pretty sweet ATM. They spank the new stuff for way less money 🤷
8 GB is going to be obsolete; 16 GB might be too much, but more is better than less. It's not about FPS but about loading textures, as Hardware Unboxed has shown.
TLDR:
Yes, it's a slightly older card, but it's fully capable of gaming at 1440p, 60+ FPS, high settings.
It's a fantastic card for the money, and there's no reason to go Nvidia when you have this amazing GPU for under $350…
Video fixed
Most AI models require CUDA, though. So it's not as useful as an RTX 4060 Ti 16GB.
This card sets a new standard for the lowest tier. That's good. Never again will we see a new card under 16 GB.
Maybe Nvidia will launch an RTX 4060 Super 16GB card at $399 and discontinue the RTX 4060 Ti.
I have a theory that this is an entry-level AI card, with AMD trying to enter the consumer AI market.
How do you change the RGB color on the RX 7600 XT? There's no info on this.
The RX 7600 XT costs as much as a 4060 Ti here... so I would not even consider the 7600 at all, lol.
1440p gaming??? Damn... I guess I will have to wait a few more years before I fork out another whopping chunk of money to replace my aging GTX 1080 Ti to fully utilise my old 4K monitor's abilities. Yup, my old GTX 1080 Ti games and live streams at 4K. Granted, no ray tracing, but at least live-streaming performance was an acceptable 50-60 FPS at 4K, and yes, I have proof of that. Yeah nah, NOPE! I may as well also hold off on an 8K monitor, which would be a Super UHD HDR delight, especially if it were complemented with ray tracing. Alas, with graphics cards it seems it's just not worth the bang for the buck yet; they're still incapable of delivering decent performance with this ray tracing feature enabled, so the investment in a new card just isn't appealing enough. I want to see at least 4K @ 100 FPS gaming and streaming performance with all graphics settings maxed and ray tracing enabled before I take the plunge and upgrade my old GTX 1080 Ti.
The RX 6800 costs 20 percent more at $399 and has 50% more performance. Same VRAM.
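Taking the figures in this comment at face value (+20% price, +50% performance, same VRAM), the perf-per-dollar comparison works out as follows. This is just the arithmetic spelled out, not a benchmark:

```python
# Relative perf-per-dollar gain given a price ratio and a performance
# ratio, both relative to the cheaper card (here: RX 6800 vs 7600 XT,
# using the comment's claimed +20% price and +50% performance).
def perf_per_dollar_gain(price_ratio: float, perf_ratio: float) -> float:
    return perf_ratio / price_ratio - 1.0

# 1.5 / 1.2 - 1 = 0.25 -> the 6800 delivers ~25% more frames per dollar.
print(round(perf_per_dollar_gain(1.20, 1.50) * 100))  # -> 25
```

In other words, if those numbers hold, the RX 6800 is the better value despite the higher sticker price.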