I can’t believe it’s been almost 2 years since nortyx first discovered the card in your server, time flies lol
Also for those wondering, the reason my name is on the card is that I requested it be sent to nortyx afterwards. I asked Dawid to sign it too, so nortyx could have a cool little piece of GPU history as a keepsake
Lastly, I want to thank you a lot for everything you’ve done for me over the years, I wouldn’t be where I am today without your advice and the twitch thumbnail gig.
hello
Dapz! Thanks for being a core member of the Dawid Does Tech Stuff family.
This collab is awesome.
Weird seeing you here after only knowing you from the Consider discord, you got the beefiest of graphics tho man.
its cool how i watch both of yall
Excited to see the future of this card! Hope it has a long lived life!
After seeing the massive power of the GT 1010, I did 7 push-ups instead of my usual 4.
this is the most based comment ever
I've got lagging lats and I can still do as many pullups at 20% bodyfat as you can do pushups! More pullups now at 17%, even while cutting!
@@TheBcoolGuy Over the head! Hahahahahahah. You go girl!
@@PDXCustomPCS "I was merely pretending to be weak."
Yeah, kick rocks.
I laughed so hard at how the 1010 is twice as fast as the 1030D4 =))
Shows how much a card can be held back by shitty bandwidth. I mean, look at how badly the RX 6400 does on PCIe 3.0, or how the fastest GT 730 is not the one with the beefiest chip but the one with the fastest memory.
@RETRY it's not so bad, i run my stuff at 1080p medium to high settings and it runs fine, 60 fps
@@mparagames but it doesn't do 1080p 144 locked. Not worth it (says the internet)
@@mikeymaiku depends on graphical settings, what game, what api, and what clever rendering/upscaling techniques you're using or not
i could get forza horizon 4 to run at 120fps high settings at a 1080p output, but it impacted the image quality to a significant extent, by using FSR quality + half rate shading (the latter, on windows, is only really available in a few games; you can't force it via driver, sadly)
but yeah, it only does 144hz if you're playing a light game or willing to make sacrifices to the resolution or visuals.
but it technically *can*
meh, ddr4 holds your mom back, so of course it will hold any gpu back from giving its all at gaming, period🤣🤣🤣
in a future video, you should run 3 gt 1030’s together and see if they equal the performance of a 3090, because adding 1030+1030+1030=3090
This man is too smart
@@soundeffectscentral1235 R.I.P. OP. Jensen put out a note on f4keinternetgril to end that line of inquiry.
Jensen needs a new leather jacket.
200 IQ comment
@@Grimmwoldds what do you mean?
@@thatunknownfella f4keinternetgril got isekai'ed by truck-kun. Coincidence that Jensen's 1600 USD 4090 was about to get eclipsed by 4 1030s.
whenever you hear dawid say "in this box" you know it's gonna be good or give you severe trauma that will take countless therapy sessions to combat
I love Dawid's mystery box adventures
That is not mutually exclusive.
Always wondered what happened to these things after Nvidia put up drivers for the 1010 and 1020 on their website and said they were launching the cards soon, but never did anything with them
I’m pretty sure the 1010 (didn’t know there was a 1020) was only meant for work office computers and extremely low power OEM consumer computers. I won’t be surprised to see bargain bins of these GPUs in 3-5 years time.
@@zgoaty9235 Could you imagine someone fretting over whether to go with a 1010, or *go all out* and get a 1030?
This hypothetical person makes the *reasonable choice* of going with a 1020. They would like more than the 1010 offers. But, the 1030 really seems like overkill to them.
1020?!
@@JordanRichardson9 i think we have a new graphics card to search for
@@adamfra64 Yes
"it can barely render the warning"
and I thought my gt 630 was slow 💀
That GT 1010 is still more impressive than the ridiculous GT 1630 with 2 fans and 6 pins connector. Btw, I remembered that @OzTalksHW was trying to get that card and he also made a video about the possible specs a year ago. I hope he will see this video and get his own GT 1010 one day. 😀
Wait, there's a 1630??
@Captain_Morgan bruh. What the fuck lmao why would they do that? Is it like half the performance of the 1650 that was already bad enough? At least the 1650 doesn't need any external power
If the 1630 is cheaper to produce than the 1030 I could see why. But if not, might as well use the old Samsung 14LPP process instead of TSMC 12FFN and keep making the 1030. I mean the 710 & 730 were made for far longer than they deserved. Usually the older processes are put into use for less demanding parts like chipsets or controllers, because they do fine on a bigger node.
What. We have clearly been in North America for too long. Dawid says ‘Thirrrrty’ now.
That was the first thing i noticed. Scrolled through all the comments to look for someone mentioning this.
Half-Life 2 looks amazing for a 2004 title! Still holds up perfectly today! A testament to Valve's developers!
It has been periodically updated, but it just shows art style and attention to detail makes all the difference. Plus it is the game that launched Steam, so valve definitely went all in.
And half life Alyx is still gonna be the best looking VR game five years from now. Valve does some serious dirt when they actually deign to come down from their steam throne in the clouds and actually make a damn game.
I somewhat recently played hl2 on an old laptop (i5-3230m with no dGPU). The game looked old, mostly due to low polygon counts, for all of 10-20 minutes, after which I completely forgot about it being nearly 20 years old at this point
@@jonathanellis6097 Launched Steam? What are you talking about? It didn't launch Steam. Steam was a thing before HL2. It probably brought a lot of people to Steam though.
Edit: Steam officially released a year before HL2, but it seems like it was around longer than that to me; I guess I used the beta version of it back in 2002.
HL2 VR is amazing.
The 10 series is wild, going from a GT 1010 all the way to the still quite mighty 1080 Ti
11:00 The problem with FSR is that it requires a bit of performance to actually apply the sharpening filter, which in this case is probably taking the GPU over the edge (could also be VRAM related?).
Both FSR and DLSS drive up RAM usage, so I wouldn't be shocked if that was the problem.
It's not compute limited, it's VRAM & memory bandwidth limited. Temporal upscaling has a small but notable cost to VRAM & memory bandwidth.
After all of the RTX 4080 and 4090 videos, this is sooooo refreshing. Thanks Dawid!
This isn't a "gaming" GPU, this is a "video player" for old PCs. But GTA 5 at 4K on that? That's a surprise...
this is probably the first youtube video in existence that actually tests the 1010. thank you for the review!
8:25 Wow, I had no idea that DDR 5 vs DDR 4 gpu memory made such a difference!
Gddr5*, ddr5 will still choke the gpu
Going by the numbers here, DDR5 would be on par with GDDR5
@@talibong9518 only with 128 bit bus, but the ddr4 version uses 64 bit (single channel)
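For anyone who wants to sanity-check the DDR4 vs GDDR5 gap in this thread, here's a back-of-the-envelope peak-bandwidth calculation. This is a quick sketch: the transfer rates below are ballpark figures for illustration, not measured specs.

```python
# Rough peak-bandwidth sketch:
# bandwidth (GB/s) = effective transfer rate (MT/s) * bus width (bits) / 8 / 1000

def peak_bandwidth_gbps(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return transfer_rate_mts * bus_width_bits / 8 / 1000

# GT 1030 DDR4 variant: ~2100 MT/s on a 64-bit bus
print(peak_bandwidth_gbps(2100, 64))   # ~16.8 GB/s
# GT 1030 GDDR5 variant: ~6000 MT/s on a 64-bit bus
print(peak_bandwidth_gbps(6000, 64))   # ~48.0 GB/s
# Hypothetical DDR5-4800 on a 128-bit (dual-channel) bus
print(peak_bandwidth_gbps(4800, 128))  # ~76.8 GB/s
```

So the GDDR5 card has roughly 3x the bandwidth of the DDR4 one on the same 64-bit bus, which lines up with the claim that plain DDR5 only catches up if you give it a 128-bit bus.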
I once bought the DDR3 version of an HD 7000 series card by accident, because hey, it's the same GPU but cheaper!
Got about half the framerate I did on the 3 year old GPU it was supposed to replace. I've learnt my lesson.
Which one? I remember them having like a hundred models due to low yields; each range had at least 4 models, but I never heard of the same model with different memory on ATI.
@@gorky_vk powercolor, with shit specs on the radeon hd3870 and hd3850: one with 1gb ddr2 and one with 512mb gddr4. i bought the 1gb version because of the quantity of ram. in the end the extra ram saved me in later games, but performance was degraded with the ddr2 at 1000 mhz even on OC
Can't help but think back to when HL2 came out and was such a demanding game for the time. I had just gotten my first own PC that wasn't a hand-me-down and it had a Pentium 4 CPU, I forget the GPU (ATI model), but it ran HL2 pretty smoothly and I remember thinking what a beast of a system it was, haha. Yet this little 1010 absolutely dunks on it.
Less than a year ago people would have been gouging each other's eyes out in the street for that beast
Holy crap! The 1010 running gta v on 4k? I am impressed
When I saw the notification I thought, “wow unlike Dawid to use a good graphics card…” and then I saw the thumbnail
If this exists, there should be a prototype RTX 4010 somewhere...
a 3030 might be more of a possibility .
Presetting to Ultra could be because the game didn't recognise the card? I had that happen with The Sims 4 and my Radeon 680M (tho that ran fine on Ultra, but I knocked it down to High just to be safe)
1:43 "involved smuggling inside multiple ups delivery people" bro what 🤨🤨🤨🤨
I put a 1030 in a new build during the gpu shortage but made sure it was the ddr5 version. At 1080p it was quite decent to be honest.
I'm so glad this GT 1010 story concluded!
but can it run crysis?🤣
Linus: Which high-end 16 or 24 core CPU will I use to play CSGO? Dawid: How will this GT 1010 perform in Cyberpunk at 4K?
Never change, I love your videos.
2:15 haha yeah.... that joke sure never gets old............
The pink madness build in the "canfsdhfahh zmdiuhfdth " case. Got it. The 1010 is on pcie x4? WTF
what the crack? You were calling the GT 1030 a GT Ten Thirty in this video.......I'm so used to hearing GT Ten Thiddy that my whole world is in question. I mean am I even breathing oxygen anymore or is it some obscure gas some weirdo with 8 heads and 50 eyes cooked up in a lab inside my ear canal
What the...
@@beataoo Glad you understand me
“I’m still not entirely sure that there’s not just a fart in this box”. LMAO. Dawid is the only YouTuber that can make me laugh. Nobody else comes close to his absurd humour
Scotty Kilmer
Are you sure about that?
10:30 FSR 2.2 failed in this case because it put a little more strain on your already exhausted 2GB VRAM buffer. When VRAM is full, the system uses RAM, which is extremely slow for the GPU to access since it has to go through the CPU, which is basically ages away. FSR increased your VRAM usage (and thus RAM usage) by around 700MB, which is a lot for a 2GB VRAM buffer. 14300MB of RAM to 15000MB is a really noticeable increase for 1 setting. It's probably not meant to upscale to 4K anyway, which is why RAM usage was so high; it's more for 720p → 1080p
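To put rough numbers on that, here's a toy sketch. Only the ~700MB jump comes from the comment above; the base usage figure is an assumption for illustration.

```python
# Back-of-the-envelope sketch of why a temporal upscaler can push a
# 2GB card over the edge at a 4K output. Numbers are illustrative.

VRAM_BUFFER_MB = 2048        # GT 1010 / 1030-class cards
BASE_USAGE_MB = 1900         # assumed: game already near the limit
UPSCALER_OVERHEAD_MB = 700   # observed jump in usage with FSR enabled

total = BASE_USAGE_MB + UPSCALER_OVERHEAD_MB
spill = max(0, total - VRAM_BUFFER_MB)
print(f"Needs ~{total} MB, spills ~{spill} MB into system RAM")
# Spilled allocations travel over PCIe (here only x4) at a fraction of
# local-VRAM speed, so the "upscaled" result ends up slower than native.
```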
@6:10, Dawid. You took the cooler off that GPU to use it on a CPU, to see if it would keep the CPU from thermal throttling.
1:55 Value is nicely demoed by the grocery bag: no attempt made to preserve it at all costs. CSGO will start on a GeForce 6000 series card from 2004 (not playable) and play acceptably on a 7300.
Will you do a video with the ASL GTX 1650 4G Battle Knife single slot low profile?
I think that you should try a 4gb GT 1030 from AliExpress at 4K.
The 2GB of VRAM is too little 🤣
that is if he can get a gddr5 version at 4 gb cause if it's 4 gb of ddr4 then it won't be worth it at all🤣
4gb of gddr5 would be nice. And a modded bios, cause when trying to overclock the 1030 you run into the power limit before even getting into overvoltage, heat issues, or instability, and the bios doesn't allow you to increase it. On a decent 1030 with a fan, maybe even copper in the heatsink, i reckon somewhere close to 50w is possible thermally, maybe less depending on how much overvoltage would be needed
I was always curious to know how the mysterious GT 1010 would perform; honestly, it can run old games like Battlefield 4 and Battlefield 1 just fine.
problem is that any card you can find for $20 can run these "fine".
12:10 To be fair, Intel only puts the "good" (well, better at least) iGPUs on their mobile processors, it seems. The i7-12700H (and the 3 processors above it) has an iGPU far ahead of the one on the desktop i9-12900K, for example.
Can't be as exclusive as a wish graphics card
meh the gt 1010 looks to be beating out the gt 710🤣🤣🤣
New subscriber here. Can’t believe I’ve only just found your channel, but you have instantly catapulted to my number one, all time favourite channel. Have just spent the last day watching pretty much all of your content and, other than running off to the local medical clinic for VD inoculations, have loved every single minute of it. I’m hooked. Thank you!
the gddr5 1030 was like "so you throw me away like trash for the ddr4 version, and now you want me to do shit for you? well, I will show you"🤣🤣🤣
It should be noted that FSR 2.x is significantly worse for budget cards than FSR 1.0. FSR 1.0 was a simple spatial upscaler, while 2.0 and 2.1 use "smart algorithms" that actually require a powerful GPU to execute.
have you seen these HD 6770 4GB cards? They actually sell them on eBay. They have a GPU-Z screenshot of 4GB DDR5 Samsung memory. very curious. 🤷‍♂️
Budget Builds' pronunciation of “GT 1010” sounds a lot better, but your “1030” is unbeatable
there had better be a t shirt in the store that says "Don't jabait me bro" or "Bro... do you even jabait"
You know it is going to be another GREAT Dawid video when there is a "Low Video Memory" warning in the first minute AND it looks like it was done on a kid's toy, not a real monitor (that you can see IS A REAL MONITOR)!!
Edited to add:
Achievement Unlocked: GT 1010 running FORZA at 4k extreme low settings and not having a coffee while doing one track!
At first I thought you were pranking us by using a higher-power graphics card than the gt1010, but I hadn't realized how far these GT cards have come since I had a GT640.
The GTX 1080 is about 3x as fast as a GTX 680. Scaling it down and you still have a card that’s decently fast. Crazy how fast technology used to progress when AMD was actually competitive. Hope AMD lights a fire under NVIDIA’s ass this generation, so I can get a well priced Blackwell card.
I'm watching this while trying to fix my pc. It keeps saying to plug in pcie power cables, but I have. I don't know what to do; my ram light also lights up. I wonder if you could help.
If it's a modular PSU, is the PCIe power plugged into the PSU?
I love seeing Dawid torture random and obscure pieces of tech like this! *Insert evil laugh* :D
Well, I can give Nvidia some credit here: the GT1010 is not a rebranded card like the GT710, but a watered-down version of the 1030. And pretty good performance out of the GT1010 at 1080p Low settings in the games that you tested. Hell, not bad performance out of the GT 1010 @4K in some games either.
Holy shit it's finally here, the day I've been waiting for, GT 1010 benchmarks.
But now that I've seen them, I'm a bit disappointed. It's just so reasonably decent for what it is.
Had this card been more available I think it would be a great option for entry level gaming paired with an older system, since if it was available it'd have to be even cheaper than the 1030.
Sadly it's relegated to being an obscure piece of tech history.
people in 1900 seeing this: truly beautiful
people from 1010 seeing this: truly now
Hahah loved that you uploaded at the same time as dapzz
I can't believe the 1010 can really play those games at those FPS numbers. I'm blown away. This can't be true.
The missing heatsink on the gt 1030 was probably used in the video where he put a gpu heatsink on a cpu
the ddr4 1030's vram is only like 1060mhz while the 1010's vram was like 3000mhz, hence double the fps.
What’s the programme he uses to see the GPU and Frames? Cheers
MSI Afterburner
@@Njazmo Thanks man!
10:38 yeah I don’t use it for that reason. My 3080 pats when I turn it on. So I leave it off. I mean I am still getting almost 200fps lol
I know what happened:
Nvidia didn't want to cannibalize their performance tiers. They should've just kept making the 1010 GDDR5.
10:45 Temporal upscaling (TAS, FSR 2/3, DLSS 2/3, XeSS, etc...) has a small but notable VRAM & memory bandwidth cost that these mere 2GB cards simply cannot cope with at an output resolution as high as 4K. 🤷
That sounds like the most dawid video idea ever
I always enjoy watching Dawid torture small GPUs. I know it's sick but I can't help myself.
Me learning there is a crappy version of the 1030 after I buy a second hand 1030 with ddr4: 😐
here from dapz! this is truly an interesting story
Wonder how it has the dinosaur era video port as AFAIK 10 series doesn't support analog output. Unless there's a DAC.
I've been able to connect a VGA monitor to it on my personal card, so it does function. I do wonder if I could use it to passthrough from a newer GPU to an old CRT lol.
'Use it unreasonably' should be a shirt.
The 1010 was only available to OEMs for business machines.
fsr takes a fixed amount of graphics processing to do the rescaling. let's say it takes 100bp to use fsr at 4k from whichever resolution you're scaling from.
if a 130bp card like the 1030 does it, it's left with 30 bp to render the actual game. which is why it's slower than native even though the resolution is lower.
now for a decent card, say a 3060, that has 1000 bp. the 100bp it takes to do the rescaling is a drop in the ocean and is easily compensated by the reduced load of rendering in a lower internal resolution.
fsr should only improve performance for the 1010 in resolutions below 1080p
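Turning that "budget point" model into a tiny calculation makes the crossover obvious. A quick sketch of the commenter's reasoning; all the bp figures are the illustrative ones from the comment, not benchmarks.

```python
# Toy model: the upscale pass is a fixed cost, while render cost
# scales with internal resolution. All numbers are illustrative.

UPSCALE_COST_BP = 100  # assumed fixed cost of the FSR pass at a 4K output

def throughput(card_budget_bp: float, render_cost_bp: float, use_fsr: bool) -> float:
    """Relative throughput left over after (optionally) paying the FSR tax."""
    budget = card_budget_bp - (UPSCALE_COST_BP if use_fsr else 0)
    return max(0.0, budget / render_cost_bp)

# GT 1030-class card (130 bp): 4K native costs 100 bp,
# a lower internal resolution costs 50 bp
print(throughput(130, 100, use_fsr=False))   # native 4K: ~1.3
print(throughput(130, 50, use_fsr=True))     # FSR to 4K: ~0.6 -- slower!
# 3060-class card (1000 bp): the fixed cost is a drop in the ocean
print(throughput(1000, 100, use_fsr=False))  # native 4K: ~10.0
print(throughput(1000, 50, use_fsr=True))    # FSR to 4K: ~18.0 -- faster
```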
This video made me think that getting a notebook with a 940mx ddr4 (instead of a 940mx gddr5, which is much rarer) was a big mistake gpu-performance-wise.
Basically, things manufacturers won't tell you.
1x frames per second is the ultimate cinematic experience. Who agrees?
I have a small correction: the 1030 is not all digital. DVI includes analog signals, and a simple adapter or a DVI-to-VGA cable can feed Mesozoic-era monitors.
partially correct,
DVI-D, which is what that connector is, removed all the analog signals, as evidenced by the missing pins around the flat '--' ground pin. Whether the monitor was single link or dual link determined which of the pins would feed data once the monitor was detected and the GPU was told what type it was.
DVI-I incorporates both digital and analog signals and also includes single and dual link modes.
DVI-A is literally just the analog pins in DVI format; those pins were later converted to digital in the DVI-D spec.
Quoting:
"Digital Visual Interface (DVI) is a video display interface developed by the Digital Display Working Group (DDWG). The digital interface is used to connect a video source to a display device, such as a computer monitor. It was developed with the intention of creating an industry standard for the transfer of digital video content. The interface is designed to transmit uncompressed digital video and can be configured to support multiple modes such as DVI-D (digital only), DVI-A (analog only), or DVI-I (digital and analog). Featuring support for analog connections, the DVI specification is compatible with the VGA interface. This compatibility, along with other advantages, led to its widespread acceptance over competing digital display standards Plug and Display (P&D) and Digital Flat Panel (DFP). Although DVI is predominantly associated with computers, it is sometimes used in other consumer electronics such as television sets, video game consoles and DVD players."
@@CapStar362 Awesome. Thanks for the explanation.
GT 10 "Thirty" ??
Hi Dawid and everyone, my brother has an HP pavilion gaming PC with the motherboard tg01-2 b550a US and he’s looking to get a ryzen 7 5700g to upgrade the CPU, are these compatible? I’m trying to figure it out but I can’t find a definitive answer
i love how its the size of a smol circuit board, a little lovely friend
I run a GT 1030 DDR5, it's been a great card since 2017. Not the most powerful of course because it can't run anything real demanding like Battlefield or Forza, but it's quite impressive for $70.
This is really cool, but it makes me so absolutely thankful I have a 4090 and don't have to deal with this pedigree of hardware.
VGA is still common in servers. DVI is still useful, easy to convert to HDMI
Umm, servers don't need monitors, and VGA is so 90's.
@@Njazmo You definitely want your server to have access to a monitor. If something goes wrong and you can't connect, it's the only way to fix things.
Hence, that's exactly why VGA is used. It's easy to get a little portable VGA monitor.
Kinda reminds me of the agp 3870 or the 4850 powercolor with gddr5
How do you not have a million subs yet? Your content is FAR more unique than the average tech YouTuber's.
Wow I have the same desktop image at 5:03, neat!
3:24 me: oh is that the wifi card?
WOW! I always used to joke about a 1010 Ti, but it's a real thing, it seems. Now I'm waiting for my RTX 4010 Ti (tie!)
For a mythical card you can see where Nvidia was aiming, and the basic architecture went on to create the 10 series. I have to say it's impressive for what it is today; back in the day it would have been a great budget card
I believe this was actually the last 10 series card to be introduced. It's still made new today (albeit still exclusive to China) and mostly seems to be a way to get rid of lower-binned/partially defective GT1030 cores
what program is in the top left for mhz, fps etc?
MSI Afterburner
Thanks. I had it, just couldn't get it to work in games. Figured it out tho
PCIEX4.... VGA.... LOW PROFILE..... THIS IS BETTER THAN I COULD HAVE EVER IMAGINED
YAY, 1010! Finally someone got it!
dawid video before going to bed ...... count me in
You should have tested the games on a CRT with that VGA.
10:54 Surprised that worked at all; that's an AMD graphics card setting.
It doesn't surprise me that the GT 1010 is limited to four lanes, as the GT 1030 is as well. And both use the full-size slot anyway.
If anything, I thought maybe the GT 1010 would use two lanes.
well, the 6500 XT also uses only 4 lanes, with performance not that far from a 3050 Ti.
Wait, pascal doesn't output any analogue signal? I mean it isn't supposed to.
Yeah it's been noticed that this is the only pascal GPU with an analog out. I'll hafta take a closer look as it may have a DAC onboard
@@cpufreak101 I thought about that too but that would be even more nonsensical than actually producing the gpu.
Anybody have any potential explanations for why FSR made it perform worse? My guess is maybe it has an effect on memory bandwidth and since we're now dipping into the system RAM and using that as VRAM FSR has a detrimental effect.
@Anna, has Dawid got a Santa Claus side hustle up soon?
That beard, a ginger thick boy!
In GTA 5, it was visible that the DDR4 GT1030 was using around 15 watts, and the DDR5 version about double that.
hey dawid! may i ask what speaker are you using?
3:30 It was apparently released in January 2022
That was the date of a paper launch. Its true release to market is unknown. They just started popping up very infrequently until a couple months ago, when a boxed release was confirmed.
@@cpufreak101 Ok that's quite interesting. Thanks for telling me
FSR still needs a relatively good GPU to be utilized. FSR 2: AMD Radeon™ RX 6000, RX 5000, and RX Vega series graphics cards, plus the Radeon™ RX 590, if the minimum requirements of the game are met. FSR 1.0 was much more lenient on the minimum specs.
I think I have never seen input lag that can be measured in minutes before.
But a game disconnecting because the framerate is too low is something I experienced before.
You know what's harder than finding one GT 1010? Finding 9 of them. All I want is to get some 9-way SLI going so I can create the RTX 9090 and game at 8K 690Hz
So far I know of about three in existence, good luck finding the other 6
How about comparing to the AMD integrated gpu since apparently all new Ryzen chips have one? Are they any good compared to 1010/1030?
Ryzen 5 5600g trades blows with the 1030 gddr5 version.
The new AMD chips don't have an integrated GPU, they have an integrated display adapter. It's deliberately underpowered, and essentially just enough to let you configure the BIOS/EFI or other POST-time configured accessories. That it can maybe run Windows acceptably is more of an afterthought.
Looking at Forza, I now understand and am grateful that Spider-Man has the ability to set graphical settings before loading up the game.