Hi guys! I misspoke at 5:42 - I meant the Radeon VII
🔥🔥🔥THE BEST VIDEO CARDS FOR 2024🔥🔥🔥
✅Budget graphics cards:
RTX 4060 8GB amzn.to/4dUE5mH
RX 7600 8GB amzn.to/4hcspyG
RX 7600 XT 16GB amzn.to/40fdy0m
RTX 4060 Ti 8GB amzn.to/48d0rPf
RTX 4060 Ti 16GB amzn.to/4098YRl
✅Mid-range graphics cards:
RX 7700 XT 12GB amzn.to/3C2HD9e
RTX 4070 12GB amzn.to/4eOw5VG
RX 7800 XT 16GB amzn.to/4fg0ZX1
RTX 4070 SUPER 12GB amzn.to/4dTH3YL
RTX 4070 Ti 12GB amzn.to/48hdxek
✅Top-tier graphics cards:
RX 7900 GRE 16GB amzn.to/3YiLxSW
RTX 4070 Ti SUPER 16GB amzn.to/40dpvnp
RX 7900 XT 20GB amzn.to/3YACn4D
What music do you use in the background of this video? I really like it.
If I had a nickel for every time AMD released a technology that stacks memory on their products, I'd have two nickels. Which isn't a lot, but it's funny that it happened twice.
They probably took the knowledge gained from developing HBM and baked it into the 3D V-Cache technology.
3D V-Cache is a technology from TSMC; AMD is just using it for their chips. So it wasn't actually developed by AMD. TSMC created this packaging tech and AMD decided to use it on their CPUs.
When they announced the X3D chips, many said it would fail like HBM. AMD is the only one brave enough to do something like this.
Well, the last time Intel tried doing something unique with memory/storage, it was ultimately too expensive and got discontinued...
Also, HBM still exists; it's on its third iteration and is being used in AI products due to its massive bandwidth compared to GDDR6 and 7.
@ Yes, it still exists. I'm using an RX Vega GPU with HBM2.
AMD has never been afraid to experiment!)
@@livegamesii What's the )?
It's not a failure, it's just too expensive for consumers.
I absolutely love those generations of AMD and Nvidia cards.
I built my very first gaming PC in winter 2016 with my best friend,
and I bought a Radeon R9 390X Nitro+ from Sapphire, which looked like a spaceship to us.
At that time I had the best GPU in my whole friend circle (when $300 was considered a TON of money for a graphics card),
and that thing was crammed into my super cheap, tiny Sharkoon case :D
I would definitely love to own a Fury X as well. The integrated water cooler is definitely a good thing,
and I always liked how tiny they made the card itself.
I can just imagine the heat coming out of a case that has 2x Fury X cards running in CrossFire!
And with Moore's law no longer holding up, I could totally see CrossFire and SLI making a comeback,
especially since we have NVLink, which is WAY faster, works better, and actually doubles the VRAM of each card.
Maybe they will bring back cards that have two GPUs on them running over NVLink, kind of like the epic 295X2 and other cards did back then.
I had a 390X build, a black and red MSI Gaming X, with an i5 4690K.
I had the same card with an FX 8350. Blew my mind back then.
@@BatmanandRobinVsLarryHoover I was also an AMD boy. Mostly for money-saving purposes I took the FX 6300 and overclocked it to a healthy 4.7 GHz. Together with the overclocked 390X Nitro, I felt like I had a nuclear reactor in my room haha.
But games played so well despite the CPU definitely being the weak spot.
Thanks for the interesting story! In 2015-2016 I had an R9 270X, which ran all games at medium and sometimes high settings, coupled with the FX 6300 processor. And I also dreamed of an R9 Fury X back then)) And now, after so many years, it's in my hands!
As for the future development of video cards, we will probably see a multi-chip layout like Ryzen processors in the future🤔
@max.racing I managed to overclock my FX 6300 to 4.6 GHz, which also greatly accelerated the system😅
If only they had gone to 8 GB of HBM1.
I'd argue that even if it had been 6 GB, it could have done very well.
And the price would have been lower than Nvidia's))
To be fair, memory compression is a thing, and the HBM card performed better capacity-wise than other 4 GB cards, coming close to the 6 GB on the 980 Ti.
Yes, this also played a significant role in how fast it ran.
Both of these cards aged extremely well. If you also mined crypto when not playing games, the investment would have easily paid for itself more than twice over. My R9 Nano paid for itself 10+ times mining Ethereum (30 MH/s) from 2017 to 2020.
Electricity was way cheaper then... now it's very bad value to keep this card even for mining; you have way better alternatives.
HBM2 is used in the server segment. Higher bandwidth does not necessarily mean higher TFLOPS, but it has massive throughput that multiple AI machines benefit from. In the gaming segment you need fast memory with low latency, and GDDR has quite a bit lower latency than HBM. The second problem is the GPU chip itself: it is not fast enough to process over 500 GB/s of data. If you paired HBM2 memory with an RTX 4090, the card would be maybe 25% faster but almost twice as expensive.
Yes, that's exactly it!
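For anyone curious where those bandwidth figures come from, here's a minimal sketch of the usual back-of-the-envelope formula (bus width × per-pin data rate). The per-pin rates below are typical values assumed for illustration only; exact rates vary by product.

```python
# Rough memory bandwidth estimate: bus width (bits) x per-pin data rate (Gbit/s) / 8 bits per byte.
# Per-pin rates are typical/assumed values, not exact specs for any one card.
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

configs = {
    "HBM1, 4 stacks x 1024-bit @ ~1 Gbps (Fury X class)": (4096, 1.0),
    "HBM2, 1 stack x 1024-bit @ ~2 Gbps":                 (1024, 2.0),
    "GDDR6, 256-bit @ ~16 Gbps":                          (256, 16.0),
    "GDDR6X, 384-bit @ ~21 Gbps (RTX 4090 class)":        (384, 21.0),
}

for name, (width, rate) in configs.items():
    # HBM gets its bandwidth from a very wide, slow bus; GDDR from a narrow, fast one.
    print(f"{name}: ~{bandwidth_gb_s(width, rate):.0f} GB/s")
```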
I've got a liquid-cooled R9 290X, and I'm really happy 😊
Yes, a great card for its age, and it runs many modern games 😃
I had the Nano, where the coil whine was louder than the fan.
It's amazing that they actually made these and the subsequent Vega cards as well.
AMD was struggling financially at that time and this was a big risk for them.
These (and Vega) sadly had more downsides than positives. They were expensive at the time, drew a lot of power, and the air-cooled models ran hot (especially the VRMs). Reliability in the longer term was also an issue. AMD's drivers were not as good back then as they are today.
Vega was a lot better. It still had issues, but it was competitive with Nvidia for the most part and endlessly tunable.
Yes, I totally agree with you!
Skipped the Fury; went from a 6970 to a 290, 390, 390X, then a Vega 56 (hated that card), then a 1080, 2080 Ti, and a 1080 Ti (used, very cheap, 50 USD). Sold them all and got a 7900 XTX and an A770.
Why didn't you like Vega 56?
@@ut2k4wikichici I'm sorry to hear your V56 experience was so bad. I've kept my V56 because it hits above its class, and its resale value isn't enough to make me want to sell it.
I actually got myself a perfectly working Vega 56 on the side as a backup card, but also as a reminder that AMD had the balls to manufacture an HBM card.
@@livegamesii Because at that time the Vega 56 was way more expensive than the 1070, with fewer frames, more wattage, and higher temps... During the mining era of AMD cards, even the RX 580 was 100-200 dollars more than the 1060.
For a nearly ten-year-old GPU, it does alright. If your tastes lean towards undemanding games, it is still viable to some degree.
The Fury, loved it; it still has a special spot in my hardware heart.
You have to hand it to AMD. They are the ones who came up with HBM, tried it out and innovated the technology... as well as MCM... and APUs and other similar technologies which have pushed tech forward over the years. All while being the underdog.
I loved this card so much because it was a different approach. I still want one.
I dreamed about this GPU as I configured my first PC online. A couple of years later I bought it and was so hyped. A founders-edition-style water-cooled GPU, I mean, how much more chad can it get? I still love it.
17:07: NFS Unbound*, not Heat
Could you test it with the custom NimeZ drivers?
A video card ahead of its time! The review is very interesting, I love watching things like this 😃👍
Thank you, glad you liked it!
what drivers did you use?
I used two drivers: 1) the latest official one, and 2) the latest from NimeZ (for new games: Alan Wake, Horizon FW).
@@livegamesii Thanks! I get massive issues in the new cod warzone, could you maybe try to test that next?
@@livegamesii Do you happen to have a link to the nimeZ drivers?
sourceforge.net/projects/radeon-id-distribution/files/
Probably the problems also arise from the amount of memory 🤔 A driver won't help with that.
Time proved that R9 390X was a better buy.
But this was clear from the start :)
Underrated channel. Good work.
Thank you very much! Subscribe to our channel! :)))
I almost went and got a non-X several years ago, but that 4 GB of VRAM made me forget about it.
I came just to see benchmarks but damn the history of the gpu was great to see.
I had two laptops before my PC; they had a 950M and a 1050 respectively. They were both awful, but the upgrade to the 7900 XTX was much needed :)
I've had terrible luck with AMD's HBM cards. First with an R9 Fury Nano, and then with a Vega 64. Both seemed to suffer from progressive degradation of the HBM modules, despite my best efforts to keep them as cool as possible (short of replacing the original air coolers with a custom liquid cooler). And even now, whenever I build custom gaming rigs for people, I always stay away from any of the HBM cards as a precaution. Which is a shame, because HBM could've been awesome.
There are too many tests; they could be presented in a more condensed form. Otherwise, I like this kind of approach to using older hardware with the right games.
Imagine current AMD GPUs with HBM3E; it can do over 1 TB/s.
If they could have done 6-8 GB of HBM, this GPU would still be amazing; it's limited by its memory.
I own both a Fury X and an R9 Nano. These cards really punch above their weight, tbh; not even a 1.55 GHz RX 590 can touch a Fury X in Time Spy.
I absolutely love that generation of AMD cards, but it's so sad to see how a once-flagship now barely keeps up with (or can't even launch some games, lol) the modern Radeon integrated graphics in newer Ryzen APUs in handhelds...
I remember this one; I was dreaming of buying it one day, but the best I could afford was a GTX 960 STRIX.
4:38 And 9 years later, we have the RTX 4070 Super, which has even less memory bandwidth than the Fury X.
I have both and I'll take the 4070 Super 😂. It's kind of like the argument: I have this old V8 truck, but now I can only get a V6 twin-turbo that's faster and more efficient. The Fury sure can heat your house in the winter, though.
Go get a Fury X then. What are you waiting for? Go on, buy one.
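As a sanity check on the bandwidth comparison above, here's a tiny calculation using the commonly cited specs (4 × 1024-bit HBM1 at roughly 1 Gbps per pin for the Fury X, 192-bit GDDR6X at 21 Gbps for the 4070 Super); treat the figures as approximate.

```python
# Quick check of the "Fury X has more memory bandwidth than the RTX 4070 Super" claim.
# Specs are the commonly cited ones and should be treated as approximate.
fury_x_gb_s = 4 * 1024 * 1.0 / 8       # 4 HBM1 stacks, 1024-bit each, ~1 Gbps/pin -> 512 GB/s
rtx_4070_super_gb_s = 192 * 21.0 / 8   # 192-bit GDDR6X at 21 Gbps/pin -> 504 GB/s

print(f"R9 Fury X:      ~{fury_x_gb_s:.0f} GB/s")
print(f"RTX 4070 Super: ~{rtx_4070_super_gb_s:.0f} GB/s")
```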
My guest PC has a Vega 56; it works excellently for playing games.
I have the Fury X myself, and it's a nice card, but I have to say the coolest one was the Nano; it would have been an awesome flex if it had been able to beat Maxwell.
Yes, these are great graphics cards. But in my opinion, the reason they did not find demand was the high price. If they had cost $100 less, they would have become much more popular.
How come they never released a special Ronaldo Nazário version of these R9 cards?
That isn't a Radeon 8. It's VII. 5+1+1=7.
Yes, I misspoke. And the script said “VII” 😅
No overheating for this card. The temp is not higher than 60 degrees 😯
Nice video! I enjoyed the little history lesson on the GPU; very good card for its time :)
Thank you very much, I'm glad you liked it! Subscribe to the channel, there will be a lot of interesting videos🙂
What's the difference between the Asus ProArt RTX 4080 Super and the Zotac 4080 Super? Why is the ProArt more expensive than the Zotac? Please help.
The ProArt uses better power delivery than the Zotac and can be pushed harder and faster.
They differ in the cooling system and power delivery components; they have the same graphics processor. In operation, you will not notice the difference between them. The Asus may be quieter under load.
Such a unique card ngl
Ha, I have two of these in my closet from an old CrossFire system I built and barely used. I still can't believe I spent over $1500 on them, and now I can't even give them away.
And now they are almost worthless 🥲
@@Al-USMC-RET Send them to me! I do micro-soldering and I would love to have some HBM tiles to use for... science. Each chip has a 1024-bit bus and 128 GB/s of bandwidth. Fun stuff to use for projects, especially AI; a huge inference gain because of the bandwidth.
That Fury X load temp is lower than my RX 570's idle temp 😂
"Budget Builds Official has entered the chat."
The biggest problem with old high-end cards is the amount of memory. But the new generation is of course better: lower TDP, higher performance.
If it weren't for the damn miners, gaming would be cheap.
Sir, did we talk to each other on Discord?
it's unlikely 🤔
Radeon 8... is Radeon 7
Yes, I misspoke. But it was written correctly in the script 😅
The biggest problem AMD encountered was indeed game developers' political relationship with NVIDIA. This is more apparent than ever if you compare games: the likes of Skyrim and Fallout 4 favored NVIDIA, while Starfield now favors AMD.
Engineers lacking the resources to optimize drivers for the generations before RDNA1 led to many AMD GPU owners giving up their cards in favor of NVIDIA, because they could not play their favorite games on day one, and some had to wait months before things got fixed. Square Enix's past ties with NVIDIA, as in FF12, FF13, FF14 and FF15, are a good example.
If AMD had not won the PS4, PS5, and Xbox One/Series consoles as their exclusive SoC manufacturer, and had not now won over the Linux community, things would not favor AMD today and it would have remained a niche alternative to NVIDIA.
And yes, I have owned GPUs from both AMD and NVIDIA for three decades: Riva TNT2, GeForce2 MX400, 3dfx Voodoo4 4500, GeForce 9400 GT, ATi HD 4850, GT 630, GTS 450, RX 460, GTX 1660 Ti, RX 6700 XT.
Yes, you are right, I agree with you!
Wow, you had a lot of video cards😮 By the way, subscribe, there will be a lot of interesting videos about video cards and processors!
Subbed :)
What game is at 4:12? Also, great video.
Cyberpunk 2077. Thank you! Subscribe, there will be many interesting videos about video cards and processors!)
Every time I see these videos, I'm reminded of people complaining that their top-of-the-line 2024 GPU can't play at 4K 155 fps on ultra graphics with ray tracing and no DLSS or frame gen... and then I look at these cards struggling to reach 60 fps at 900p-1080p.
Yes, that's true. But for some reason I personally find it more interesting to watch old video cards try to run games than new ones, where it's already clear they will run all games at high-ultra settings.
radeon 890m performs basically like this
Maybe 🤔 It needs to be checked
I dreamed of one like this... but I only had enough money for a gtx 750 ti 😁
When it came out, the 750 ti was a great card for its price👍
Remember when the RX 570 was $99.99 bundled with 3 games worth over $100? Will we ever see something like that again?
@@RoerDaniel i wish
@RoerDaniel Was there really such a deal? 😅
@@livegamesii I bought the card just for the games; I had a GTX 970 at the time. And let me tell you, that RX 570 aged so well, and that price was just months before the mining nightmare 😔
It seems like all the BF series games here are pirated copies, because the Russian language option has been added, and as we know, the cracked versions of these games were mostly Russian-language.
No, not all. Only BF 4. The rest I bought from EA App.
Why are you deleting my comment about the patent that TSMC holds?
I didn't delete your comment 🤔
YouTube is probably responsible for that. Their bots will flag comments for no reason at all and sometimes delete them entirely.
@@livegamesii This is so freaking weird, all the time on YouTube. I know that YT also shadowbans me. Often I read a comment with stuff that is obviously and usually not allowed on YouTube (like having a conservative opinion on something), but the comment stays because it's probably a new account that hasn't been flagged yet for being "problematic" in the eyes of the tech oligarchs. I can copy and post the exact same comment 1:1, in the same comment section - then I refresh the page once and the comment is gone.
But now I think it was maybe deleted because I posted the patent number, and there are a lot of numbers, which could result in a false positive for being a scam phone number. It's just CRAZY what is happening on YT.
Good to know you don't delete comments; it just didn't make any sense that it got deleted every time.
@@livegamesii See, the comment I just made is gone again. I think both of you should have gotten a notification with the comment; you can read it there once. If you have e-mail notifications activated, you probably also got the email with the reply. But I think only @livegamesii got an e-mail because I only tagged him.
🔥🔥✨omg
What, Crysis 3 is optimized af
Nvidia always had better flagship GPUs... I remember since the 780 Ti and 980 Ti, and after that it was an Nvidia monopoly... I don't remember before those GPUs because I was a kid back then.
music is annoying.
A flagship in name only; its performance was marginally better than the cheaper air-cooled version, which was considered a step down. And definitely not as good as what Nvidia had out at the time.
Don't get me wrong, I wish it had been better.
Yes, it is. But this AMD graphics card was almost the same in performance as the GTX 980 Ti; the Fury X lagged only slightly behind it.
👍🏾
Buying a new GPU these days feels like buying a faster car without fixing the roads first.
Who can enjoy a Ferrari when it's driving on an UGLY ENGINE 5 road?
Me still using R9 370🥲