During the GPUpocalypse, this was a great alternative from the used market.
I snagged a 3070 at MSRP right before prices got ridiculous.
For my GTX 970 (EVGA SSC), I used the vBIOS editor to enable a slightly higher GPU voltage, as well as a higher power limit of 230 watts. That allowed for 1.5GHz on the GPU and a 600MHz VRAM overclock.
Built a PC for my GF with my old GTX 970 from 10 years ago. I'm still incredibly impressed by this little card.
Still rocking it! It can play less demanding games pretty well! Visions of Mana runs on high, 1080p, between 30 and 50 fps. It feels pretty smooth, even when it hits 30.
Just got a PC with this and a 6th-gen i5, and it's great; I'd been running a laptop with only a Pentium, so it's mind-blowing. I also got it for only 200 quid, so it's a steal, I say.
Built loads of systems with these a couple of years ago!
Hats off to AMD for making FSR available to everyone, keeping even 10-year-old GPUs able to run newer games. It might not be much, but at least it gives gamers one or two more years of performance before they upgrade. That alone is enough for me to sell my soul to AMD; they actually care.
I loved my 970 from MSI until my PSU fried my entire system and I had to shift over to a 1660 Ti after a while; otherwise I probably would have rocked it until the current gen.
10-year anniversary for the GTX 970 - test it with an i7-4790 from the same year.
It's really a great GPU. I still have an i5-7500, 16GB of RAM, and a Zotac GTX 970 here in 2024, and it works great.
I used MSI's GTX 970 until Aug 2022. Bought it used for $75 in early 2019 as a transitional GPU while waiting for the new RTX 3000 series (2020) or a price drop on the 2000 series (2018)... and then prices went insane, with even the GTX 970's asking price reaching 3x what I'd paid. To be honest, I was perfectly fine with the 970 (paired with a 4570 for two years, then with a 12600K for a year and a half). I mostly played Paladins, which capped out at the game's 175 fps limit even on a GTX 650 Ti Boost, plus some Ghost Recon Wildlands, Dying Light, Assetto Corsa, GTA 5, and War Thunder, but even in the then-new Modern Warfare (end of 2019) it managed fine. The trouble started with Warzone 1 in the Caldera era (end of 2020): no matter how you set the game up, it would use more than 3.5GB of VRAM, and that's where the limited bandwidth of the last 0.5GB caused some choppiness. If only Nvidia hadn't cheaped out on that VRAM configuration, the card would have held up much longer, especially since I was still able to play Warzone somewhat reasonably thanks to Nvidia Reflex - that tech is magic when you run into a GPU bottleneck. Because of my experience with Reflex on the 970, I went for a 3070 Ti next instead of a Radeon, despite the Radeon seemingly being better value for money. With current prices, having the cushion of Reflex, which can help you put off buying a new card, is considerable value in itself, IMO. Honestly, that 970 is probably the best money I've ever spent on computer hardware, and I started back in the TNT2 32MB era, with those Celerons and, soon after, Socket A Durons overclocking by 50%.
I do have one and run it as the GPU in my Xeon 12c/24t rig, which I use for esports games like Fortnite and CS2. Runs great; would also recommend 👍
At 1525 core / 4000 memory, mine still gets the job done.
Are you gonna review the new 8000 GPUs, sir?
Do you mean the 8000 series APUs from AMD or the next generation of GPUs from Radeon?
@@TechLabUK Yep. I see zero vids on YT for the 8500G.
Gap in the market ;)
I have exactly the same card in my Win 7 machine. Picked it up for 20 euros a couple of months ago. A bit disappointed that you didn't show GPU power usage; mine usually goes up to 170W, and temps are exactly the same, 76-78°C. I wonder if changing the thermal paste would help.
All in all, nice video.
I've also overclocked it a bit. Without additional voltage, +150MHz core and +300MHz memory is nice and stable. With a card as old as this one, every extra frame helps, although I tend to play games from the Win 7 era on this computer, which (usually) are not that demanding.
I still use it to edit 1080p videos. I don't edit 4K videos, cause my eyes can't tell the difference between 1080p and 4K.
970 XTREME 4GB - very happy with it. It performs reasonably, with most games hitting 60+ fps on average, playable (without stutters) at 80-95 fps, on Auto OC (two modes, great for beginners). With Manual OC, running at a 1340MHz core clock (underclocked to achieve this), temps are great and only ever go above 35°C under load, with the three fans running at 80-100%. Overall a great card with only one limitation: VRAM.
0:58 Good looking? Oh! The orange pipes 😀
Most GPUs have copper heat pipes.
@@TechLabUK OMG! Really...
I would really like to see a 970 battle the worst 1060.
4GB of VRAM is a bit low; I'm planning to order a 6GB GPU. The AMD system I'm going to build would have 8GB of DDR3 alongside 24GB of DDR4 for better efficiency.
3.5GB VRAM!
Why is it that its 4GB of VRAM is actually 3.5? I vaguely remember the internet drama back when this card launched because of the "only 3.5 gigs!" 🙂
Not 100% sure tbh, just read that the card uses 0.5GB for something else. Didn't cause us any issues while testing though.
@@TechLabUK Curious. I think it's the only time I have heard of such a GPU
The card's RAM was split into two pools: 3.5GB ran at the fast speed reflected in the spec sheet, around 196 GB/s, while the remaining 0.5GB ran at a fraction of that, about 20GB/s.
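To see why that slow tail matters more than its size suggests, here's a quick back-of-envelope sketch in Python using the rough figures above, assuming reads across the two pools are serialized (real access patterns overlap, so treat it as a worst case):

```python
fast_gb, fast_bw = 3.5, 196.0   # GB, GB/s (main pool)
slow_gb, slow_bw = 0.5, 20.0    # GB, GB/s (crippled last pool)

# time to sweep the full 4GB once, if the two pools are read back-to-back
t = fast_gb / fast_bw + slow_gb / slow_bw
print(f"last 0.5GB alone: {slow_gb / slow_bw * 1000:.0f} ms "
      f"vs first 3.5GB: {fast_gb / fast_bw * 1000:.0f} ms")
print(f"effective bandwidth across all 4GB: {(fast_gb + slow_gb) / t:.0f} GB/s")
```

By these numbers, reading the last 0.5GB takes longer than reading the first 3.5GB combined, and the effective bandwidth over the full 4GB drops to roughly 93 GB/s.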
The uproar was in response to Nvidia's barefaced lies. They didn't think they would be caught out, and probably wouldn't have been if Skyrim modding weren't a thing. Either way, people lost a lot of faith in Nvidia through that - long before the threatening of reviewers, the mining antics, and other anti-consumer practices.
It didn't help that people felt 4GB was low for a card of that performance, while AMD cards had 8GB, which at the time was considered excessive.
In all honesty, with the advantage of hindsight, the card was fine, even for games released years later. It probably did fly close to the limit, but Nvidia did get it right. Modding was one of the few scenarios that pushed it over, which did affect some people, but probably a smaller group than the vocal majority led you to believe.
The 3.5GB issue was that the VRAM arrangement Nvidia went with led to the last 512MB package sharing one crossbar port on the GPU, thus needing a context switch to fully utilize it. From people using CUDA-based benchmarking tools on that 512MB pool, it topped out at around 26GB/s at stock speeds, though with a 600MHz or so overclock of the VRAM you could push the main pool to around 277GB/s and the 512MB pool to about 35GB/s. The separate pool introduces a small latency penalty in addition to the throughput penalty for using that last 512MB, which would lead to a performance hit of around 5% to 10%, depending on the game and how much VRAM it actively needed. Since the Nvidia drivers treated the last 512MB as a second memory tier, the first 3.5GB was prioritized; if a game then needed the full 4GB, the driver would allocate it but push less-used data to the 0.5GB pool to minimize the performance impact.
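As a rough illustration of that tiering behavior, here's a toy placement policy in Python. The pool sizes and the hot-data-first idea match what's described above, but the policy itself is a simplification, not Nvidia's actual driver logic, and the demo buffer names are made up:

```python
FAST_POOL_MB = 3584   # 3.5GB pool, ~196 GB/s
SLOW_POOL_MB = 512    # 0.5GB pool, ~20 GB/s

def place_allocations(allocs):
    """allocs: (name, size_mb, access_freq) tuples.
    Hot buffers claim the fast pool; colder ones spill to the slow pool,
    then to system RAM over PCIe."""
    fast_used = slow_used = 0
    placement = {}
    # sort hottest-first so frequently touched buffers win the fast pool
    for name, size, freq in sorted(allocs, key=lambda a: -a[2]):
        if fast_used + size <= FAST_POOL_MB:
            placement[name] = "fast 3.5GB pool"
            fast_used += size
        elif slow_used + size <= SLOW_POOL_MB:
            placement[name] = "slow 0.5GB pool"
            slow_used += size
        else:
            placement[name] = "system RAM (PCIe spillover)"
    return placement

# hypothetical working set for a game that wants ~4.2GB
demo = [("framebuffer", 512, 100), ("hot textures", 2800, 80),
        ("cold textures", 500, 5), ("streaming cache", 400, 1)]
for name, where in place_allocations(demo).items():
    print(f"{name:15s} -> {where}")
```

With this working set, the hot buffers land in the fast pool, the cold textures land in the slow pool, and the least-used buffer spills to system RAM - the same ordering of penalties the driver was trying to arrange.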
The GTX 980's memory system used eight L2/ROP partitions, with a 512MB memory package behind each; when Nvidia cut the chip down to make the GTX 970, they disabled one of those partitions outright rather than just shader hardware. They then had one of the remaining partitions connect to two packages, which required a context switch for that partition to access the other package, rather than the more efficient way a memory system normally handles more than one chip or DIMM per channel, where operations are spread evenly across the modules and interleaved to avoid slowdowns.
Where the 3.5GB claim really held true - enough for a class-action lawsuit that led to Nvidia refunding around $30 per card - was the spillover behavior when a game needed to actively use more than 4GB.
For most games that actively needed more than 4GB, you would see the card gradually drop to 3.5GB of VRAM use and then begin allocating system RAM once the actively used spillover reached 768MB or more - active use, not idle allocation, such as when a game preloads assets for a level that isn't touched until you teleport to a different location.
The overall design Nvidia used made the card behave as if it only had 3.5GB once it needed to actively use close to 5GB or more of video memory; this avoided the latency penalty of three context switches to juggle three pools of memory at once.
PS: spillover to system memory for GPU use carries a performance hit of around 20%, up until the PCIe interface gets saturated in one direction. After that, you run into significantly larger performance drops, as well as major hitching and stuttering. PCIe 3.0 x16 typically means you can get away with around 768MB of actively used spillover before hitching starts, and on a card with PCIe 4.0 x16 you can typically go to 1.5-2GB before you encounter hitching and other major performance issues.
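The scaling in those thresholds roughly matches the bus math. A small sketch using published per-lane PCIe throughput (~0.985 GB/s for 3.0 and ~1.969 GB/s for 4.0, one direction); the "per-frame budget" framing is my own illustration, not a measured figure:

```python
links = {"PCIe 3.0 x16": 16 * 0.985,
         "PCIe 4.0 x16": 16 * 1.969,
         "PCIe 4.0 x4":   4 * 1.969}   # GB/s, one direction
fps = 60

for name, bw in links.items():
    # how much spilled data could be re-read every frame before the bus is full
    per_frame_mb = bw / fps * 1024
    print(f"{name}: {bw:5.1f} GB/s -> ~{per_frame_mb:.0f} MB of spilled data "
          f"re-readable per frame at {fps} fps before the bus saturates")
```

Doubling the link roughly doubles the tolerable spillover, which lines up with the 768MB vs 1.5-2GB rule of thumb above, and shows why an x4 card has almost no headroom at all.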
The trend these days, though, is for game devs to blacklist the shared memory pool in their engines. For example, the launch version of Hogwarts Legacy would try to actively use well over 8GB of VRAM at high settings, and cards like the RTX 3070 would run into major performance issues as the game tried to use over 10GB. The developers then patched the game to stop loading more textures once the dedicated VRAM pool was filled and actively used, which meant users saw some muddy or blurry textures in places rather than major hitching.
This has now become the default for many major releases, since both Nvidia and AMD have started shipping cards with PCIe x8 interfaces, and in some cases even x4. A card on an x4 interface would likely choke the moment any shared memory is actively used, as those cards run near saturation just from the basic commands sent by the CPU.
@@Javadamutt No, it wasn't even fine for its time, and it's not fine now. It's a stuttery mess; if you look closely, it's dropping textures to try to maintain fps, and the frametime is all over the place. I've seen older videos of other games, like GTA V during long play sessions, where the game becomes unplayable due to the VRAM limit.
Try a GTX 760 next.
Best £300 build next?
FSR 2? I thought that was for AMD cards, and that Nvidia uses DLSS for upscaling. Never seen an FSR option in a game with my 4070.
FSR is made by AMD, but they allow all brands of GPU to use it; in fact, their latest FrameGen technology can even work alongside Nvidia's DLSS, which is awesome.
@@TechLabUK Still confused. Are you saying you run AMD drivers on an Nvidia card? Where is the setting in the Nvidia drivers to turn FSR on? I've never seen it; I do see DLSS in some games.
I have a few old 970s; the key with those was keeping games at 3.5GB or lower. It's a 4GB card, but really 3.5 due to the way they implemented the memory.
Does FSR reduce ram usage too?
@@Physics072 No, you don't need AMD drivers. FSR is built into games, so in titles that support it you can enable and use it regardless of your GPU. FSR won't reduce VRAM usage, but it lets you raise performance at settings that stay within that VRAM budget. FSR 3 with Frame Generation will generate frames, so performance increases a lot.
@@TechLabUK Didn't know that - just never noticed it as an option. Must not be in many games. So FSR frame generation is just more frame generation: you still get the input lag of the base fps, as if FSR were off. No free lunch, but it can look smoother, just not "feel" smoother.
@@Physics072 FSR is in a lot of games and is generally added to most new titles because of how open-source-like it is. FSR itself is the upscaling technology, basically like DLSS, but the latest version, FSR 3, supports Frame Generation as well; developers can implement FSR 3 without FrameGen, though, if they want to.
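For anyone wondering what FSR upscaling actually buys you: the game renders at a lower internal resolution and scales the image up to your output resolution. A quick sketch using the commonly published FSR quality-mode scale factors (treat them as approximate; exact behavior varies by game and FSR version):

```python
modes = {"Quality": 1.5, "Balanced": 1.7,
         "Performance": 2.0, "Ultra Performance": 3.0}
out_w, out_h = 1920, 1080   # e.g. a GTX 970 targeting 1080p

for mode, scale in modes.items():
    # internal render resolution = output resolution / per-axis scale factor
    w, h = round(out_w / scale), round(out_h / scale)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{mode:17s}: renders {w}x{h} (~{saved:.0%} fewer pixels shaded)")
```

Quality mode at 1080p, for instance, renders about 1280x720 internally - roughly 56% fewer pixels to shade - which is where the fps gain comes from on a VRAM- and bandwidth-limited card like the 970.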
I guess back in the day it was 4GB of GDDR5.
RoboCop: Rogue City is a bad joke! It doesn't look that good - almost the same as Terminator: Resistance - and the requirements went off the charts because of UE5!
Ahem, 3.5GB card.
This just isn't playable unless you have nothing else. I don't know how people can live ten years in the past. I upgrade every generation.
Except the video shows it's playable in a lot of titles. Funny how people still play a PS4 putting out 30fps, but a 970 achieving 60fps or more in the same games "isn't playable".
@@roasthunter Only if I was broke and had no other option. Buy a 1080 Ti off the used market for 100 bucks, dude.
@@veilmontTV A 1080 Ti in the UK is about 3x the price of a 970. Not everyone has the money, or the need, for a more powerful GPU. I could buy any GPU I want, but currently my only PC with a discrete GPU has a 750 Ti, and it still turns in 150fps in Fortnite at 1080p performance mode.
@@roasthunter You must not value your hobbies, or must not game a lot. I'm gaming at 4K 240Hz, and that is an experience worth paying for. 1080p is a resolution that should be discarded.
@@veilmontTV Brah, you don't value gaming enough - you should be pushing 360fps at 8K; anything less is trash.
It's one of the worst cards ever made - fake VRAM = stuttery mess. Why anyone wanted this card is beyond me; the 390 8GB was the card to buy.
It's the best card ever, you fool 🤦
@@antoniocalimero1173 No, the VRAM is trash; it's a garbage-tier card.
@@mercurio822 It is better to search for more information and see that everyone thinks it is the best card ever, one that is still perfectly usable after 10 years. That's why I still play on it and get the same framerate in Squad as people with a modern 1000-euro card. All good games can be played perfectly well with it.
@@antoniocalimero1173 lol, you're such an Nvidiot.
@@antoniocalimero1173 th-cam.com/video/Gd1pzPgLlIY/w-d-xo.htmlsi=5T83rbpJ6OPAI_zs - the 970 doesn't even have 4GB of VRAM; it's worse than garbage tier.
The 970 was garbage back when it came out... Still garbage today.
It's the best card ever, you fool 🤦