Not gunna lie, you using a screwdriver to scratch off thermal paste got me pretty tilted xD
It's not recommended to scrape it off, but later I did give it a bit of a proper clean up using some solvent, and it wasn't particularly marked, it's mainly small shavings of thermal paste giving that scratched look.
still, find yourself a bit of plastic to scrape with, we get offended easily on YouTube! I wonder why I spend so much time looking at old hardware. Ahh nostalgia!
Budget-Builds Official Whatever you're using to remove thermal paste doesn't seem to work very well. What you need to buy is a can of some electronics cleaner/contact cleaner. It completely dissolves thermal paste, drastically decreasing the amount of effort you have to put into it, and you sure as hell won't need a screwdriver. It even gets every last bit of paste off of the resistors around the GPU die.
You really need to get some, it's literally the best thing I've ever used when it comes to removing thermal paste. I had a thermal paste stain on my case that had been there for a couple years and I could never get it off using alcohol, but the contact cleaner got it off within 30 seconds.
Sam T: This thermal paste was baked on, it hadn't been changed in over 10 years, but I may look into getting some.
:S I was just making a joke, guys. Didn't seem to cause any damage and the product isn't worth anything :P
I'm surprised GTA V ran at all, good video btw
it ran at 30FPS on the Xbox 360 at 720p on roughly medium-equivalent settings, so yes and no.
It was natively designed for 256 mb of VRAM, so not surprising really.
Uhm dude the Intel HD Graphics is much newer....
The game was made for Xbox 360 which is as old as the card is.
8600 GT (except with 256 MB VRAM) is the equivalent of PS3 GPU.
I can't believe that GTA V still plays at all with 256mb vram. Shows just how well this game scales for all kinds of hardware.
I play it on a 1080ti @1440p (max settings but 2x msaa) and it still heats up my gpu a lot maintaining 144fps to match my monitor.
Yet at the other end it kinda handles 256mb cards. That is incredibly good scaling!
MaTtRoSiTy sorry but this is just so much flex xD, and im really jelly
It was on last gen consoles which is why
Edmundo studios that is a good theory
it just eats my fury x's HBM, but actually leaving a fair bit left ~0.3GB
very nice
Very nice man! Those fortnite results were very impressive considering it's an 11 year old card. Keep it up!
480p though
still impressive
It's Epic. Everything they do engine-wise is very impressive.
HoppsTech also there are some good players who swear by the game, it was also very impressive considering they are 11 year old kids
On fortnite my ati radeon hd 3450 gets 5fps at 360p lowest lol but it plays minecraft at 1080p
Basically 8$ for a gpu with the power of an xbox 360?
Yes basically lol
Bought an alienware laptop with these two bad boys in SLI. $1800 in 2006. Feels good knowing the cards are worth the price of a beer
@@LostPie F
LostPie expensive beer you are buying
Cocktail Music and now in English please
I love this channel, the accent of the narrator, the creative B-roll, the hardware, its the best channel a nerd on a budget can visit
joshcogaming paging Nerd On A Budget
joshcogaming p
RE:Skyrim- Last year my previous GPU died on me and I pulled my old 8800, 512MB version, out of a drawer to use as a temp card (ended up using it for nearly 3 months), I never really figured out why but it hated the Low settings preset for Skyrim but found it playable on mixed low/medium settings. Otherwise I was happily surprised by how well it held up after so many years, it could play pretty much any game up until ~2013 or so, even if it did need to be on the lowest settings for many of them.
I've found that a few games seem to work better when on slightly higher graphic settings, especially Skyrim and Fallout. I'm guessing it's something to do with the game engines not utilizing all that it has on lower settings.
It must be an engine problem. I also experience a lot of issues with Skyrim and Fallout 4 on low settings, like low framerate and random crashes, while on the highest settings they run like a charm.
GLQuake had this issue. If you run the game at the lowest settings it runs like garbage, but it runs perfectly on the max settings
With those games, everything is about the resolution.
If you pick a low resolution, you can max out everything else and the game will run smoothly.
If you pick a resolution that is "too high", there is no help... Turning off all the effects is not gonna give you more than a few fps. It sucks...
throw the old gpu in the oven or reflow it with a heatgun.. also use flux (NC-559-ASM-UV) under the chip, it will work fine.. and with flux it will last longer.
Ahh the 8800 GT 256. I remember replacing my two 7800 GT CO editions with one. The performance was amazing, even over the two 7800 GT’s in SLI. Such a great card.
2:04 OH MY GOD! AN ANT
This is the RTX 2080 Ti, 11 years on from 2018.
I doubt it.
@@Wetballs me too
the gtx 750ti, a 6 year old card, still runs current titles, so why wouldn't the 2080?
@@maizomeno comparing a 2007 gpu with a 2014 gpu is like comparing a PS3 with a PS4, both of them are from different gens.
@@syahrizkyathaullahanandisa9814 take the gtx 480 as an example then, 11 years after release its still chugging along okay, certainly better than this card
But can it run crysis
2?
I had the hd9600 and could, almost at minimum
why not?
calm down, little degenerate. I asked you a simple question. Tell me why it shouldn't run.
Most systems that can run Crysis can run Crysis 2 as well. I had the Quadro FX3600M in my HP workstation 8710w that was the developer card based right off the 8800GT architecture and played both Crysis and Crysis 2 quite nicely. If you use the card with Vista or 7 and allow Crysis 2 to run in DX10 mode, it will run awesome on that card, believe it or not! Hope this helps :P
Obviously, Crysis 2 was less demanding than the first one.
Who remembers this as the high end 8800gtx’s little brother that you dreamed to have and play crysis on it?
Well i did, and i dreamed of the new amazing core2quad q6600 the best cpu in the world, with 4 cores!!!
Feels in 10 years you’ll have the memory of the oh so old ryzen 1700 with its 16 threads while today’s cpus have like 64? Hell idk
i was playing Crysis on an Intel Pentium 4 at 3.06 GHz, 2GB of RAM and a 7300 LE, all low, windowed. after that I got a Phenom II 955 BE and overclocked it to 4.5 GHz. however i never got a decent GPU till later
Ryosuke Takahashi i had that cpu overclocked to 3.2ghz paired with a BFG 8800GT OC. Good times !
Back in those days I was dreaming on the 2900XT Crossfire setup I had seen on TH-cam. It didn't even look like it ran the game that well, but it was a damn sight better than the Celeron D (underclocked Pentium 4 with no hyper threading) + integrated graphics I was rocking at the time. I do miss those grimy days of PC building
I have an old 9800 GT lying around as a spare (and it's roughly comparable to the 8800 GT due to being the underclocked eco version) and I can pretty much confirm most of the experiences since I had to unpack it recently (I couldn't use my usual 570 GTX due to a PSU issue; it overheated when too much power was drawn and eventually shut down) and GTA V was right on the border of being playable, and playing Crysis back in the day also left me at a decent enough 30-40 fps with medium settings.
Pretty surprising what these old cards can still do.
If it is underclocked, then surely you can just re-clock it with MSI Afterburner.
@@8-bitcentral31 Not necessarily. The 9800 GT eco has a lot of under-the-hood changes like a slightly different chip design and different power delivery. It's possible it can reach a stock 9800GT in clocks, but even that is doubtful, and it certainly couldn't catch up to an OCed 9800GT.
This channel should be called Bargain Bin-Builds or NoBudget-Builds. Never seen so many of these videos on one channel that feature whacky PC builds using parts that cost as much as a used postage stamp. Love it, some of my favorite PC build videos.
MrBurtbackerack: Cheers man, much appreciated.
ahh the same card I bought quite a few years back but I had the 512 mb version
Are you still rich? LOL
I have the 1024mb gt8600
I used to have a 512mb 8500 GT. Then I upped to the 9800 GT, then when it broke I got a 9800 GT eco from warranty (slightly downclocked version with less power usage and no 6 pin PSU connector) because the standard one wasn't sold anymore, and by now I have a 570 GTX. Either way, I have had 3 cards that were pretty comparable to this one.
Mind you, having 512 mb of vram on the 8500 was of no help whatsoever since the GPU was so ridiculously weak that it had plenty of trouble with games that even back then weren't exactly new, so a bit of extra memory doesn't help much.
I had the 8800GT 512mb Too 😉👍🏻
I had the 9800 GT 512MB variant soft modded to boost its clocks closer to the GTX counterpart. I remember playing Mass Effect 3 at 1080p at max settings while averaging around 60fps, then Crysis 2 also at max settings at 1080p with great framerates.
I thought this was the card that could handle everything I've thrown at it.
Funnily enough, my Core 2 Duo E4500 running at 3.2GHz was actually the bottleneck in my system, so horrible that graphical settings didn't change framerates AT ALL.
You might not read this comment at all but i just want to tell you how i feel:
Every video you make is just perfect. Just the way you put everything together. The editing, the cleaning of the video card. The installation process. And finally the testing. You are a one of a kind TH-cam channel, your content is always surprisingly fun to watch. And you're testing the same games most of the time. Which is a good thing. It gives a classic feel to your videos, a comforting feel. It's always nice to see you play CS: GO and Skyrim on your budget builds. Also it's a great way to compare video cards with each other. Please, and i BEG YOU, please never change your videos. Keep doing what you're doing. You are awesome ❤️
Thanks man, greatly appreciate the kind words. Not often I get a genuine comment like that. Cheers. And have a good Evening/Day/Morning wherever you may be.
I had the 8800GT 1GB model for a long time and it held up very well running games from several years after its release 👌 A friend of mine has it now and is very happy with it 😁
What an amazing card, my own 8800GT blew a MOSFET while trying to play Minecraft at 1080p almost 7-8 years or so ago now. An amazing piece of hardware that sparked my interest in PC building right at the start of secondary school.
It has more potential than my gt720 2GB mobile GPU LOL
I got a gt 320 my guy
I Got Gt720M lol
so addicted to your channel. i personally download them, that way i can watch them while having a beer without any annoying ads or buffering popping up. keep them up mate. awesome stuff
Well I think cs go can literally run on anything now
CSGO is more bloated now than it has ever been.
Agreed but I can still run it on my 5th gen i3 processor on my old laptop which is impressive considering
Budget-Builds Official what do you mean by "bloated"
r8573 yeah run it at 250fps don't think so
Source is old. It's pretty inefficient these days.
A screwdriver to clean the cooler... A FUCKING SCREWDRIVER TO CLEAN THE COOLER !!! YOU ARE A MONSTER !!!
Right
probably the 9800GT performed worse in most scenarios because there are tons of 9800GTs underclocked out of the box, which have a 600MHz core clock, which is way too low, while my Gigabyte and a few others had a 740MHz core clock out of the box. my 9800GT also had a 9463 graphics score in 3DMark Cloud Gate, which the 8800GT and 8800GTS can't achieve, and only the 8800GTX/Ultra and 9800GT~9800GTX+ can
it's just pretty sad how many of the cards were shipped with a 600MHz core clock, especially the 1gb versions, you can see that sad story right here www.techpowerup.com/vgabios/?architecture=NVIDIA&model=9800+GT&page=1 , some even have a 550MHz core clock which is even worse, then you can see that single Gigabyte one with 740. you could call me lucky, but i'm absolutely certain that every single one of these cards right there could get 740 as well, and even 765~785
yes, some rx560 versions come with 896 stream processors, and some with 1024 like asus strix oc, which is also pretty sad
nice pfp
cskillers1: The RX560 with the shaders cut was meant to be sold as the RX560D, however some manufacturers pulled a fast one and tried to sell us cut down variants, which AMD resolved.
A better example would be the gimped GTX1060 3GB, and the incoming GT1030 with DDR4 (admittedly not as dramatic as everyone seems to make it).
my friend bought Zotac 1030 it has a vga. Pray for zotac, long life for them!
You should not use sharp objects when cleaning old TIM on the core or heatsink since it will damage the surface contacts. Even though it looks fine, it is already damaged microscopically which may affect heat dissipation after reapplying new paste. I learned this the hard way on my laptop heatsink lol. Nice video btw.
I still have my 8800GTX's, had 2 in SLi, back in 2007 I felt like someone with 2080Ti's now.
I think the Skyrim result is an engine limitation. It's known for having a ridiculous amount of draw calls, and the shadows are a good example of how the engine murders hardware - shadow distance mods that dynamically adjust based on fps can totally cure low fps in complex scenes. That, in combination with it having to constantly access system RAM because of the small frame buffer, was causing it to run at the mercy of the available bandwidth at all times. So basically everything was waiting on communication back and forth between the GPU and system RAM, which also ties up the CPU's access to system RAM.
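As a rough illustration of that spill-over cost, here is a minimal arithmetic sketch. The bandwidth figures are approximate spec-sheet numbers for an 8800 GT and a PCIe 2.0 x16 slot, not measurements from the video:

```python
# Rough arithmetic only: why textures that spill past the 256 MB frame buffer hurt.
vram_bandwidth_gbs = 57.6   # 8800 GT reference: 256-bit GDDR3 at ~1.8 Gbps effective
pcie_x16_gbs = 8.0          # theoretical one-way PCIe 2.0 x16; real-world is lower

slowdown = vram_bandwidth_gbs / pcie_x16_gbs
print(f"Assets pulled from system RAM arrive roughly {slowdown:.0f}x slower than from VRAM")
# ~7x slower in theory, and worse in practice, since the GPU and CPU are also
# fighting over the same system-memory bus - the stall described above.
```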
Wow !! That's awesome lol 2007 was quite a year
12:35 stutter XD good video as usual :)
Thanks for saving my boring evening. Also, could you do a video on modern gaming on a Pentium 4 with an overpowered (for the system) GPU so it'll definitely get bottlenecked? I think it might be fun.
I may revisit the Pentium 4 when I get another LGA775 Motherboard.
I might hit you up with one, I've got a whole system laying around somewhere.
1:58 Ah yes, good old 10 year old heat pads. These have been the bane of my existence forever.
I'm still using my 256mb card and it runs fine!
rip for you
Better than any version of intel HD.
If that comment is meant seriously (I think so), that's a good example of different personal requirements - I already have my first game where the 4GiB VRAM of the GTX 970 is not really enough.^^
But well, I own a 1440p monitor and want to run things in native resolution as everything below doesn't look nice.
I know that, I'm not a layman regarding hardware, and still it's 4 GiB of VRAM. It's just that the 3.5 GiB can be accessed with high bandwidth/speed and the other 0.5 GiB is very limited, because of a different memory controller configuration.
The situation improved quite a bit with driver updates, though, I would say. I even have a video of Watch Dogs running on my PC: th-cam.com/video/1CCAykxUgMk/w-d-xo.html
And yeah, there sure are some games, but I don't own all of those like Mirror's Edge Catalyst or CoD Black Ops III. The only ones that take up more than 3.5 GiB are Forza Horizon 4 (3700-4000 MiB, slightly less but still above 3500 @ FHD), Watch Dogs (3300-3800 @ FHD) and DOOM (2900-3600) IIRC. And I don't have problems regarding stuttering or lags in any of them, I only have texture swaps sometimes (high-quality textures being replaced with lower quality ones, or vice-versa) in FH4, but that's okay and doesn't hurt the gameplay (although I would prefer to not have them^^). :)
I've actually also owned GTA V for about a month now, but haven't tried it yet - that will also fill the VRAM, though, I'm curious.
So yeah, it is limited and hurting the performance theoretically, no discussion, but it hasn't been as bad as it was at release for quite some time now.
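For anyone curious about the numbers behind that 3.5 + 0.5 GiB split, here is a minimal sketch using the figures from Nvidia's public explanation of the 970's memory layout; treat them as approximate:

```python
# GTX 970 memory layout, back-of-the-envelope (approximate figures).
gddr5_gbps = 7                       # memory data rate per pin
controller_width_bits = 32           # each of the 8 memory controllers
per_controller_gbs = gddr5_gbps * controller_width_bits / 8   # = 28 GB/s

fast_segment_gbs = 7 * per_controller_gbs   # the 3.5 GiB partition
slow_segment_gbs = 1 * per_controller_gbs   # the remaining 0.5 GiB
print(f"3.5 GiB segment: ~{fast_segment_gbs:.0f} GB/s, 0.5 GiB segment: ~{slow_segment_gbs:.0f} GB/s")
# ~196 GB/s vs ~28 GB/s, and the two can't be read in parallel, which is why the
# driver tries to keep hot data out of the slow half - matching the point above
# about things improving with driver updates.
```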
Ah ok, didn't know about Rise of the Tomb Raider but knew about Shadow of Mordor.
I know, I already thought about upgrading to a GTX 1070 after I got my 1440p monitor (which was almost two years ago).^^ It's just that I wasn't that active in gaming in the last ~2 years and my current games are mostly running relatively well, which is why I decided to keep my GTX 970 for now. :)
Actually I'm thinking about upgrading to a GTX 1070/1080, or Navi (new AMD architecture coming second half of 2019) this year, as I just got more into gaming again since November.
Those Skyrim results were rather shocking. I never had an 8800 GT, I had the 384MB 8800GS "Alpha Dog Edition" from XFX, and it ran Skyrim *extremely* well. Noticeably bottlenecked at 1080p though, but perfectly playable at 720p with high details all around.
Looks like myth was busted... Or not?)
8:30 I finished crysis 1 demo with those frames... at least half a dozen times in 2008
No overclocking done? ;0 it should give some nice boost if you did the core.
MelodyZE Didn't look overclockable
yeah it will give you 10fps over 9fps on Skyrim
with such a relic of a card i'd not find it safe to overclock it since it could fucking explode or something
@@paoloh885 then it would be trash
Keep up the good work man!
8:16 "It just works."
I ran an integrated mobile version of the 8800 for YEARS. A 7870 upgrade in 2013 seemed like the best thing ever.
I had the 8800GS from ASUS with an incredible 384MB of VRAM in my first gaming pc 😂 (same cooler)
Good old days, overclocking it and my Athlon64 X2 3200+ by 20% overall just to reach 60FPS in FarCry2 😂
I love your videos man, trying out old shit for fun.
Ok fine but
*Does it run minesweeper?*
My potato clock runs crysis.
I had the 8800GTX, which came out about a year before the GT; it had 768MB VRAM and to my surprise it would run Fallout 3 on max settings at 1080p, it had occasional FPS drops on big explosions but ran well besides that.
I knew I heard palm tree panic!
NULL Such Edge That's exactly what I said lol
I had 2x 8800GTS 320mb in sli back in the day. Think I used to play games at 1680x1050 back then. Ended up with a 8800GTX in the end :) great video
I'm still rocking with my vmodded ATI HD4670 512mb and 19'' 1280x1024 monitor. I can play AAA games well like The Elder Scrolls: Oblivion/Skyrim, The Witcher 2, Dragon Age 1/2/3 and Fallout 3/NV/TTW. Too bad I can't play Fallout 4 or The Witcher 3, because my GPU doesn't support DX11, but I can watch ''Let's play'' videos and it's almost like playing the game. I have my studio tour and gaming setup video on my channel.
coming closer and closer to 100k.. I bet till the end of the next month you'll get there 😉
06:27 Gerdek-basanFlorasanHasan :D we make our thirstiness obvious everywhere :D
Exactly 😂
I had one of these the summer after they were released, saved all my money from 2 jobs to pay about £90 on special offer for it, absolutely blew my mind. Proper poverty spec, definitely the market equivalent of the gtx 1050, but when you think the prices until the mining boom were relatively similar, it does make you laugh when you see people complaining about prices of gaming components.
Good times man. Thanks for this video. Oh I had that Skyrim issue with 8800m gt sli, I flashed them both with the 9800gt bios and it fixed it, was able to run fairly well, although whether the fully fledged version would work in the same one is anyone's guess.
WOW... 70 fps on Fortnite... 11 year old card... *IMPRESSIVE*
the fortinte players ain't much older either
Sweet workbench man.
I like how in your GTA 5 Benchmark Video you just casually try to murder a single mother on her way to soccer practice, a feat only surpassed by you failing to do that and then get run over by her SUV. Good for her, i guess.
and in Far Cry 3 you roast civilians with your flamethrower
Nowadays software is way more efficient with hardware resources, plus SSDs and system RAM are way faster than before; that alone helps a lot with the lack of video RAM. Heck, nowadays some high-end CPUs can even run games without a GPU.
12:20 Whatever you Find Cheapest , Whatever you Find Cheapest. xd
You're Awesome! just waiting for your videos.
Cheers man
:D
For skyrim
You should have tried Fallout 3 or NV to see if you got the same fps; if you didn't, it must have just been a game support issue
Budget-Builds Official umm ik, i'm saying if Skyrim ran like ass cause of your drivers then you should have tried Fallout 3 or NV since they run on the same engine, and if it's still not good then it's an engine bug
Only Skyrim and fallout 4 use the creation engine...
I've still got my BFG 8800GT OC in a drawer, the single slot cooler used to sound like a banshee mind but i still loved it !
Arch Linux makes better use of RAM and VRAM than Windows does. that 256MB GPU is still useful
as a desktop accelerator ? Because not for games...
@@xolox2k if you only play Valve games, then it could actually be smarter to use linux
I had the 512mb version of that particular one... What a beast. Remember playing Mass Effect on 1600x1200 :D
Is it true that the PS3 only has 256MB of VRAM? If so, how could it possibly run games like The Last of Us?
Optimisations. When you have low level access to hardware you can pull off a lot more. However the PS3 was very hard to develop for, as seen by a lot of the rough ports including the likes of Half Life 2 on the PS3.
I believe the Cell Architecture worked similarly to having One Core, and 7 Threads from a Development Standpoint, the Xbox 360 instead utilised 3 Cores with HT to essentially have 6 Threads, meaning it was much nicer to develop for. The PS3 when pushed to its max could output some great stuff, but most of the time developers didn't have the knowledge, time, or financial incentive to utilise all this power.
While the PS3 does have a 6 core CPU for games, the X360 has a triple core with hyperthreading. So the difference is not really that big.
Rares Macovei oh, i never knew the ps3 had HT, i love learning new things like this
the PS3 also has 256MB of MAIN RAM!!!
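To make the Cell vs Xbox 360 discussion above a bit more concrete, here is a loose Python analogy (explicitly not how the real console SDKs work): symmetric threads can chew on shared memory directly, while Cell-style workers only see a tiny local store, so the main core has to slice the job up and shuttle the data around itself.

```python
# Loose analogy only: symmetric threads vs workers with small local memories.
from concurrent.futures import ThreadPoolExecutor

DATA = list(range(1_000_000))    # pretend this is a big buffer in main RAM
LOCAL_STORE = 65_536             # stand-in for an SPE's tiny local store (in elements)

def work(chunk):
    return sum(x * x for x in chunk)   # whatever per-element job the game needs

def xenon_style(threads=6):
    # "Xbox 360 style": each thread just indexes its slice of shared memory.
    step = -(-len(DATA) // threads)
    parts = [DATA[i:i + step] for i in range(0, len(DATA), step)]
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return sum(pool.map(work, parts))

def cell_style(workers=6):
    # "Cell style": the main core slices the job into local-store-sized pieces
    # and explicitly copies each one out to a worker (standing in for DMA transfers).
    chunks = (list(DATA[i:i + LOCAL_STORE]) for i in range(0, len(DATA), LOCAL_STORE))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(work, chunks))

if __name__ == "__main__":
    assert xenon_style() == cell_style()
    print("Same result either way - the difference is who manages the data shuffling.")
```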
I actually have a 9800gt identical in design to this one. I always knew the 9xxx is a rebrand series, but I wasn't expecting them to be identical.
But does it run crysis 2?
it makes sense it runs Dolphin well, I myself had it running on an Intel Celeron 2981U with iGPU back some time ago and it already ran some games decently, like Mario Kart Wii, DBZ BT3, Smash or Kirby's Return to Dreamland
VRAM is mostly a non-issue because most cards seem to max out their potential by the time they reach the memory limit.
DoubleBubble28 It also depends on the resolution you are playing these games at. It’s complete overkill for that card to be running anything much newer than games from 2010 on low/medium at 720p for the ones you had listed.
To be running BF3 on high settings 1080p you would need at least a 6870 or 560Ti and a 660TI/7950 for FC3 on ultra at 1080p. The VRAM usage would be lower of course at 720p because there is less stress on the GPU.
I haven't owned the 8800 personally, but every card I've had usually maxes out its worth at the VRAM limit before it's better off to just upgrade, is maybe what I'm saying.
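As a quick worked example of the resolution point above (buffer counts and bytes-per-pixel are illustrative; real engines vary a lot):

```python
# Back-of-the-envelope render-target sizes at different resolutions.
def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    color = buffer_mib(w, h)            # one RGBA8 colour buffer
    depth = buffer_mib(w, h)            # 32-bit depth/stencil
    total = 2 * color + depth           # double-buffered colour + depth, no MSAA
    print(f"{name}: ~{total:.1f} MiB for basic buffers alone, before a single texture")
# Roughly 10.5 MiB at 720p vs 23.7 MiB at 1080p - a small slice of a 4 GB card,
# but a meaningful chunk of 256 MB once MSAA and extra render targets pile on.
```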
Video ram is one of the most important parts of the video card. Your GPU can't do its job if the data isn't getting to it fast enough. And if the ram on your card has to keep reloading, your GPU is just sitting around twiddling its thumbs.
People gave me shit for waiting for and spending extra money on the 4GB 680gtx when it came out. But you know what? Joke's on them, because that extra 2GB has kept this GPU alive. How would extra ram extend the life of a gpu?
Every time a new game sets the bar for graphics, your resolution isn't the only factor. Texture resolutions also increase. And I knew that. I'm still gaming at 1080p, but even though my resolution hasn't changed since 2012 the texture resolutions have (except for the case of crysis 2, where high resolution textures to them were 240p). And at 1080p GTA V and BF4 will both use all 4GB of my card (and GTA V uses almost all 12GB of my system ram) and I can still manage over 45fps, constant 60 in most areas.
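And to put the texture-resolution argument in numbers, a minimal sketch: uncompressed RGBA8 with a full mip chain is assumed here, while real games use compressed formats like DXT/BCn that cut these figures by roughly 4-8x.

```python
# How fast texture memory grows with resolution (illustrative numbers).
def texture_mib(size, bytes_per_pixel=4, mip_overhead=4/3):
    return size * size * bytes_per_pixel * mip_overhead / 2**20

for size in (512, 1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size):.1f} MiB")
# 512: ~1.3 MiB, 1024: ~5.3 MiB, 2048: ~21.3 MiB, 4096: ~85.3 MiB.
# Each jump in texture resolution quadruples the cost, which is exactly how a
# game's texture set outgrows a 2 GB card at 1080p while a 4 GB one keeps going.
```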
Shabuti R18 I still have a 770 2GB card, which you probably know is pretty much the same as the 680 2GB. I'm definitely due for an upgrade soon but I haven't had any major issues with VRAM on the card for 1080p. Maybe I've had to turn texture quality down from high to medium in a few new games, but at this point even a 1050Ti or maybe even the 1050 are more efficient cards with better performance and much lower power usage.
Edmundo studios Yes, power usage is becoming an issue for me. I'm looking into building a new machine after nvidia releases their next gpu. My i7 970 with HT off is pulling almost 150w and I'm sure my video card is probably doubling that. Looking at new 8 core CPUs that have a TDP of 65w, half of what my 6 core is.
Pinku Pantsu No... HT consumes A LOT of power. It's one of the main reasons the Prescott ran so much hotter than the Northwoods back in the day of the P4.
Also, I have 6 cores. I don't really have a "need" for HT in gaming that would warrant a 15c temp increase.
I'm happy running 6 cores/6 threads at stock frequency and voltage with room temp idle temps and around 50c gaming temps.
The PC I use the most is an upgraded Dell XPS M1730, with the 9800M GTX SLI. It does everything you tried in this video, but better! And Skyrim runs like a dream. It's weird it didn't work.
what the hell, i can't even play Fortnite correctly or even csgo with my 400 dollar laptop from 2017
That's because new laptops are expensive... I got a laptop with an i5, 8 GB ram, a GT 720M and a 250 GB SSD for about 300 euros when i bought it used, and it runs gta v nicely at 30 fps, rarely 50 fps.
@@lonttugamer2939 Tell me that you're playing GTA on high or higher settings, if not then that's pretty bad.
@@lonttugamer2939 Use it while charging it will boost the fps😊
Really surprised it does so well in new titles! :o I suspected as much, it being down to the API, and this just confirms the suspicion
Lol it still games better than my 2108 MacBook Pro
u come from the future bro??
macbook suck for gaming
I was playing skyrim on 8800 GTS 320 MB so that was a budget version of the 8800 and it was smooth :) that card is a legend :)
Can it run crysis- oh... Yes it can!
Yay early! Love u budget builds keep up the good work and stay awesome
Now SLI...lol
Amazing that some of these games are actually playable on such an old card.
probably does better than a MacBook
Seeing you put ancient GPU’s on a moist rock makes me anxious, great video though!
But does it run roblox
Very nice review, as far as price goes, where I live they cost as much as 5x the price you found yours ;) It's insane how big of a difference sometimes location has on pricing.
can it run DDLC?
legoboy2003 Minecraft What is DDLC?
+Sheamus cz -_-
oh, ok NOW I KNOW
+Sheamus cz it's the best game of 2017
Yeah, i know. But it's probably not gonna run on this card.
This takes me back... I remember my old 8800GT 512mb card when I first upgraded from my Radeon X1950gt on my old AMD Opteron 165. Everything seemed godly by comparison. THEN, I got another 8800gt and SLI'd them. Ohhhh 2007... I had upgraded to a E8400 wolfdale system and rode those 8800gt's until I grabbed a Radeon 6950 with 1GB of VRAM in 2011. I gamed sporadically on this dated system until September of this year when I finally splurged and got the system I had always dreamed of. $4000 later... and 1080ti with an 8700K. Love these old cards though. Shit is nostalgic. Takes me back. Thanks.
Do a review on the fake gt 645
Back in the day I had two 512MB EVGA 8800GTs in SLI. They replaced a Radeon 9700 pro. They were driving two 19" NEC "flat" CRTs. Wow, that was a long time ago.
I still have 64mb bro
I have been using a computer with intel hd graphics for a few years -.- only now I discovered what vram is, I'm so dumb lol
ahhh far cry 3 i remember playing that game with a used gtx 480 i had back in the day
Lol Fortnite looks so bad normally that even on the lowest settings on 480p it barely looks any different.
I remember eagerly awaiting the evga version of this card and getting it for myself for christmas 2007. It was between this, 8600gts, 8800gts 320mb and the 8800gt 512mb. This card let me play crysis the way it was meant to be played(7100gs before).
I still have it and up until a couple years ago I would periodically place it into the pc just to see how it was holding up. Fun times.
PEople say Fortnite is badly optimized lol
If it can run on this it should work fine on my phone but it won't even install so it is bad
I actually recently sold my 8500 GT 512mb DDR2 card for an upgrade, and thanks to Budget Builds the card I upgraded to is the 9500 GT, so it's great to see that someone has made a video about it
Built my first computer today. The building process was smooth and it booted the first time. Using it to type this comment.
Edit: I just made a video on it if you want to watch it. th-cam.com/video/7qL5-doS5Kk/w-d-xo.html
Built my first computer 10 years ago, with the mighty 8800gt. Have fun dude!
Congratulation
Yeah, cable management sucks. mainly because of the case having no space for anything to go around the back.
Even though I'm never planning on buying budget parts, (cuz I have da money, duh) it's still interesting to watch ur vids, keep up the great work,😁
DUDE!! I HAVE 1 GB GRAPHICS CARD SUPPORTS ONLY DIRECT X 10 AND FORTNITE AT LOWEST LOWEST GRAPHICS IT RUNS AT *1-3 FPS* !!!! WTFFFFFFFFFFFFF
IT'S GeForce 9400 GT
Edee Champian: The 9400GT is much worse than the 8800GT.
@Budget-Builds Official It rly is? but it's 1 GB....
that's the problem.
I can't buy online, I got no visa, and the Shipping in Saudi Arabia is VERY EXPENSIVE
He's also paired it with a Ryzen R5 1600 CPU for what it's worth.
As I remember, the 256MB 8800GT had some really bad VRAM overflow issues and underperformed the HD3850 256MB, but it seems like that was fixed with driver updates.
Also, the 8800GT was never a high-end card. The G92 GPU may have been on the 8800GTS 512MB, but on the GT it had fewer shader processors. The 8800GT always was, and was always intended to be, a mid-range graphics card.
Fortnite looks amazing since its trash it also looks amazing when you toss it in the garbage can
@@lucki2112 if you mean mad because the game itself is bad, that'd be correct. it's literally a rip off of like 30 other games with a garbage generic "calarts" esque style to it. It doesn't deserve a tenth of the fans or publicity it has.
@@lucki2112 it's ugly and battleroyale is done to death. join the FGC if you want to play games that take actual skill fam. I don't hate it for the way it plays, i hate it because it's another cashgrab on a dead genre that people have flocked to because of memes.
@@lucki2112 how am i being edgy? I legit hate the artstyle and the gameplay, it has literally fuckall to do with being edgy or being on a bandwagon. I hate it for my own reasons. Get fucked fanboy.
@@lucki2112 You literally have no idea how fighting games work if you think that's all there is to it. End of. You clearly haven't played any current street fighter iteration or much of any 2D fighter if you think all it is is "keeping the enemy up in the air with a combo".
Hating on the game doesnt make you cool..
I had exactly the same card, it was excellent in its day, moved to a Radeon 4890, then to a 5870, which I still have in another machine, from there I skipped a gen and got 2 7870 Joker edition cards for my first and last go at crossfire, my friend got one of those and I traded the other with some cash for a 290x, that lasted me about 4 years until I swapped it out for the card I have now, a Palit gtx 1070.
It's amazing what you remember!
Can it run ROBLOX?
No
@@nothing.1240 thats sad, i guessed it would run since roblox's graphics are pretty shit. Oh well
Intel hd graphics
Can barely handle it
Scratching thermal paste of a heatsink with a screwdriver... Good job!
It worked well.
Of course it works, but it would scratch the heatsink quite a bit (can be fixed by using more thermal paste though). And did you really use a philips head screwdriver? A flat head would work better...
B_Chan: The heatsink wasn't scratched...
it's more powerful than my gpu (9600GT)
it's almost sad
I have a 9600 GT as well, and it's not more powerful imo. I played Crysis 2 at 30-55 fps, never dropping below 30.
Hey I have the same exact card. Bought it for $4 on ebay a while ago. Overclocked it quite a bit when testing it.
Is better than intel hd graphics 630 :'v
But you can play skyrim with hd 630.
@@myolgiden how about intel hd graphics family?
At the time, I actually had the 8600 gt which also has 256 mb vram that I bought in 2007. Still performed better than the 7600 gs that I had before.
I have a 512MB 8800GT and Skyrim runs just fine at a combo of medium and high settings. I cannot imagine that the 256MB could be the culprit, I mean after all, it is a port from 7th gen consoles.
8800 was a beast. no contest. it took intel about 7 years to come up with equivalent integrated graphics (such as HD4600).