@@adamplechaty Interlaced resolutions are not supported on Nvidia GPUs past the 1080 Ti, I've heard. I don't know if the Titan X Pascal works for them though. People seem to forget about the Titan X cards, including in this comment section, even though the Maxwell one is in the god damn video. Maybe they haven't mentioned the Pascal because they're forgetting about the Titan X cards. I've heard rumors, however, that it's possible to use one new card for processing and then run it through to an old card with analog out. My problem is that I've only heard rumors, and these people never explain how to actually do it. I don't even know if they can do interlaced or not! It's so infuriating to spread interesting rumors like this and completely ignore the fact that YOU STILL HAVEN'T EXPLAINED HOW TO DO IT. I've looked it up for so long and can't find how.
The best GPU that has analog is the GTX 980 Ti, u can get that over DVI-i. As for composite and component the best GPU is GTX 285, u can get that over S-Video.
980 ti isn't the best, as it's basically a trimmed down titan x (maxwell). Then there is another graphics card that is the same, hardware wise, as a titan X (maxwell) but with twice the vram.
@@3hited I'm talking about standard GPUs. Titans are collectible items; they cost way more than a GTX 980 Ti but provide little gains. Besides, for CRTs a GTX 980 Ti is way more than enough.
@@rodrigofilho1996 I think you're missing the point of this video. It's not strictly about the best value; it's about what is literally the best performer(s), which also depends on what games you're playing! I also made this video with the expectation that it would be useful much later in the future, when the cost of these best GPUs becomes much less significant. That's why I listed the Quadro card, which is much, much more expensive than a Titan X. Also, a Titan X genuinely has a lot of value over a 980 Ti: if you have a very high-end CRT you might be doing 2048x1536 at 80/85Hz, and VRAM usage in that situation can go up pretty quickly. I bought one knowing that I would be sticking to a CRT for a long time and wouldn't be able to really upgrade again. I also know people who are quite into CRTs telling me they got a Titan X. But yes, I agree a 980 Ti is a better value.
AMD had analog up to the 300 series, not just the 200. Though with the 300 series it was hit or miss; it was pretty much up to the brand releasing the card. Still, there's a good chunk of 380X models with DVI-I. There are even 390 models with DVI-I. So you are wayyyyyyy off with your resolution limit
I'm not waaaaaay off. It's a firmware limit, not some limit from lack of GPU power. Also, I hope you can understand that if you make a video about something and say something *might* have analog out, you will proceed to get spam messaged by every person who's unsure. And from that era of graphics cards, DVI-I ≠ analog out; they needed a RAMDAC as well. Do you know of any specific 390 models that have confirmed analog out?
I would say almost certainly not, but if you have both you could try it. Basically the way using a 2nd gpu works is just with nvidia's physx. You wouldn't necessarily be using physx but physx lets you use almost any nvidia gpu for physx stuff (which isn't even used much anymore). But one of the unintended results of that is it kind of enables that other GPU which lets you use it for more video outputs. I'm still not 100% sure if this works myself, but one of my viewers did contact me and said it worked for them, but this sort of stuff is always finicky.
I use an MX4000 to connect to a CRT over S-Video; it was the only one I found that does 60Hz over S-Video. I bought a GeForce 6200, but its S-Video output is 30Hz interpolated, and other GeForces are 30Hz too. Do you know any newer-generation GPU that does 60Hz to a CRT over component?
I saw for a while they were at $500-600 but because of crypto mining they went up a lot. IIRC for some mining just having a lot of VRAM is super important, so the 24GB versions are really good for mining.
So I'm in a bit of a unique situation, I guess. I use Windows for my Plex server (I could probably migrate it to Linux, but I can't be bothered). I should've just kept using my 5700 XT and then bought an older card: process on the 5700 XT and output through one of the CRT emudriver cards. BUT I needed some hardware decoding, so I went with a Titan X (Maxwell) without thinking about it, lol. Oh well. I really just wanted to play old SNES games and stuff, so I'm gonna run RGB2COMP from retroink through it all while retaining my Plex server's decoding. I should've waited til the 5090, since I'm swapping out my main rig's 3090... but hey, a Titan X Maxwell for $120, why not, will be fun :)
So, my roomate has a 960 gtx, with DVI-I, will that output analog?
Also wanted to ask, are there any potential issues with any VGA to DVI-I adapters I'd buy? Any possible limits on refresh, etc. I got a decent CRT, and really wanna run interlaced and high refresh for gaming. Since it's just converting one analog to another analog input, I assume I shouldn't have to worry about limitations.
I have a 1060 gtx, and my motherboard has two pci express slots, so I plan on trying to use both cards, see what the hell happens.
Make sure to get a passive adapter and not an active adapter. And let me know how this works I'm very curious.
@@3hited
www.amazon.com/Cable-Matters-VGA-DVI-DVI-I/dp/B078PS8LZ6/ref=sr_1_10?crid=7CQ1V3MDR9Y6&keywords=vga+female+to+dvi+male&qid=1569966211&s=electronics&sprefix=vga+female+to+dvi+%2Celectronics%2C194&sr=1-10#customerReviews
www.amazon.com/StarTech-com-Slimline-HD15-Gender-Changer/dp/B00066HJBG/ref=sr_1_11?keywords=vga+female+to+female&qid=1569967620&s=electronics&sr=1-11
Ya, I'm about to order these, so I can hook it all up. I'm a little worried about bandwith limitations. So, idk, we'll see I suppose.
@@copper4eva That should be good, but the listing is a bit weird because it says it supports up to 1080p. It should support higher unless it is in fact an active converter. That being said, it looks like a passive converter. I would assume they just said up to 1080p because they think the common use for the cable is LCD monitors that support VGA input.
@@3hited
It worked. I got 137Hz now on my CRT monitor. Although I don't seem to be able to get my max resolution of 1600x1200, I wonder if that's because of a limit in the adapter.
What exactly would you use to hook up VGA to DVI-I?
Also, it seems with this setup you can completely run a game on the 1060 GTX and have it output through the 960 GTX into the CRT. So you can basically use the 960 as a PhysX card, an over-glorified analog adapter.
Point is, I don't think you really need to get a special analog-out GPU; any will work if it's in tandem with a newer, better card.
For arcade gaming (GroovyMAME) I think the best options are the Radeon HD 7970 GHz Edition, the R9 380 4GB, or the R9 285 OC. Those are the last cards AMD made with analog ports. If you're using it with a CRT TV, DVI should do the trick with adapters to component video or SCART. And CRT emudriver with VMMaker. 👌🏼
The R9 380X and 980 Ti are the best analogue video cards for CRTs. I hope these don't go up in price, at least until a good true-RGB OLED screen comes along. That could be 10 years from now. But until then, CRTs are the way to go for purists.
I thought titan-x is best? It says in the video wtf?
@@socaranectien1933 It should be. I'm assuming he said 980 Ti because that card is a lot more popular and was way more suitable for gaming back when it came out.
@@socaranectien1933 my 1080ti has DVI lol
What you described as a possible solution (using the older card as a video output only, by configuring it as a PhysX processor): something similar can be achieved with integrated graphics, so you don't need two 16x PCIe slots, e.g. on an ITX motherboard. So if the motherboard has analog out (be it D-SUB or DVI, assuming it has a decent RAMDAC, which I have no reference for), you could have the video card process the picture and pass it through the motherboard's output.
This requires: 1) A motherboard that allows the integrated graphics to stay enabled even when a video card is installed. 2) Either hacked laptop drivers to get Nvidia Optimus working in an iGPU+dGPU config, or a duplicated-screen configuration in Windows with regular drivers, with one screen plugged into the GPU and the other into the motherboard (you effectively get limited to 1 monitor for your PC).
The laptop driver hack is the one used for getting mining GPUs to run. The screen-duplication method is nice for when you want to use a USB-C dock to output video from a desktop (but don't have an RTX card with a USB port).
Thanks for your video, that is one of the really few informative ones on the topic.
Let me add some points.
There is only one graphics card (that I know of, and I've checked most) putting out more than 400MHz: a black-and-white card made by Barco with a 550MHz RAMDAC in the 90s.
400MHz (the RAMDAC frequency all the others use) is the limit. Don't think you can calculate like this: H-resolution x V-resolution x vertical frequency = pixel clock. There's something called "porches" which needs to be in the calculation as well.
Usually you need about 25% (20 to 30%, depending on your monitor or projector) more time for the retrace (V and H) plus sync width. In fact you'll end up with something like 1920x1080x85x1.25 = 220MHz instead of 1920x1080x85 = 176MHz. That 25% is the porches for horizontal and vertical retrace. You can make them visible on your CRT if you shrink the picture all the way to 0% in both directions and turn brightness up to max: you'll see a faint raster surrounding the actual picture. You can try to lower the porches to a minimum and save bandwidth that way. There is even a way of calculating the minimum porches needed for a specific resolution and monitor, feel free to ask, but that is kind of complicated.
Timings including porches are standardized in the GTF (use PowerStrip by EnTech Taiwan, or CRU as shown in the video, for this). GTF settings work well with any CRT I know of. So GTF timings are a good starting point if you want to experiment.
I had to go a different route: I wanted the fastest card supporting TWO analogue monitors (CRT projectors in my case), and the best I found was a GTX 480 with two DVI-I.
It's not the fastest card and a big room heater, even in desktop use. But yeah, it works and I can play games.
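(Editor's note: the porch arithmetic above can be sketched in a few lines of Python. The 25% blanking overhead is the commenter's rule of thumb, not an exact GTF computation, and the 400MHz ceiling is the RAMDAC limit they mention.)

```python
# Rough pixel-clock estimate for an analog CRT mode: active pixels x refresh,
# plus ~25% blanking overhead (front/back porches and sync width).

def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead=0.25):
    """Return an estimated pixel clock in MHz for a CRT video mode."""
    active_clock = h_active * v_active * refresh_hz        # visible pixels only
    total_clock = active_clock * (1 + blanking_overhead)   # add retrace time
    return total_clock / 1e6

# 1920x1080 @ 85 Hz: ~176 MHz of active pixels becomes ~220 MHz once
# blanking is included -- still comfortably under the 400 MHz RAMDAC limit.
print(pixel_clock_mhz(1920, 1080, 85))   # ~220.3
```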
AMD actually stopped supporting DVI-I the same year as Nvidia, in 2016.
The R9 380x, which is the highest end AMD card with DVI-I, came out in 2015. The same year as the Titan X and 980 Ti.
The issue isn't not having DVI-I, the issue is not having analog out.
And DVI-I was the only way of getting analog video out of discrete GPUs after VGA connectors were removed.
From what I've heard, you can output an analog signal up to the R9 380X with a passive DVI-I to VGA adapter. Also, the CRT emudrivers work only on AMD cards.
Can you see the difference between 200Hz interlaced and 100Hz progressive?
The best GPU with analog out is the GTX Titan X (Maxwell); the problem is that all Titans have a premium price because they are collectible items. In reality, the best GPU with analog out for gaming is a GTX 980 Ti. Another reason to go 980 Ti is that u can find custom models with very good cooling, and u can OC the 980 Ti to become faster than a Titan X with the Nvidia cooler.
Yo, so what emu driver would you use for a Titan X on windows 10?
@@o_Juffo_o No need for emudriver, just use a 31kHz CRT PC monitor.
@@rodrigofilho1996 It's for a Hantarex Polo MTC900 arcade monitor, popping straight into the arcade cabinet. So the Titan will output 31kHz without emudriver?
@@o_Juffo_o Sure, any GPU with VGA analog out can display 31kHz; in the case of the Titan u will use DVI-I to VGA. Just remember that VGA uses RGBHV, while SCART or RGB-modded TVs use RGBS.
The main idea of using a Titan or GTX 980 Ti is to run modern games on VGA CRT monitors. If u are only going to run old games (RetroArch) with 15kHz CRT TVs, I think the best way would be Emudriver and some old, cheap HD 5000 GPU.
In the case of your arcade monitor, the first thing u need to know is what type of signal it runs; from there u find the best approach to get a good image to it.
Even something like this is not that bad: th-cam.com/video/GSBxHvrR7sw/w-d-xo.html
Just remember that in all cases lag is introduced when going from digital to analog. I've seen great and terrible DisplayPort to VGA converters; some introduce near 0.1ms of delay, others up to 30ms. That's why a GPU with native analog out is preferred.
@rodrigofilho1996 Alrighty, I will get a Titan X then, as obviously it's for modern games along with RetroArch & MAME. I have a 14TB HyperSpin drive with all kinds of emulators. I will need to purchase a J-PAC for the controls; just trying to figure out what audio amp to get for 2.1 speakers & woofer.
What's the best GPU with TV out? I'm trying to use it with my 480i component CRT TV.
Short write up in the description
Active converters are also capped at 60 Hz.
Edit: the 60 Hz max is for 1080p only. Mine runs at 85 Hz when the res is dropped to 1200 x 1080, like it says in the manual. Adapters are fine.
Mine outputs 85hz, but I had to mess with the custom resolution tool.
@@fatfalcon3434 Good point. I played around with the resolutions after posting, and the lower ones can go up to 85 Hz.
What's interesting is that it basically means that both AMD and Nvidia stopped supporting analog signals at the same time they phased out Windows XP drivers, at some point they basically said screw the past.
I build a lot of hybrid systems that dual-boot Windows XP and Windows 7 or 10, and for those, an AMD 7990 or an Nvidia 980 Ti (both require a slightly tweaked driver) are very good choices in a fully XP-compatible system, such as one with a Sandy/Ivy Bridge CPU.
Windows 7 is obsolete and has no compatibility advantage for older games over Windows 10 (that I'm aware of).
The 980ti (and the top of the line Sandy/Ivy CPUs) are still passable, but of course you get performance that would pass for low-end new hardware.
Newer GPUs cannot output an analog signal over DVI; however, there are transcoders available that in theory add a tiny bit of lag. The fact is, for a strictly Windows 10 build, you can go for newer hardware with a transcoder and still use a CRT.
My current build is a 3770K w/ 980ti OC & 32gb and Auzentech Hometheater HD for EAX compatible Audio over HDMI.
It dual boots Windows 10-64 and XP-32, and is connected to three displays:
A 3D Vision compatible LCD monitor, a 3D Active 4K HDR TV (the last Panasonic TV to do so) and a FW900 CRT monitor.
It's all a bit long in the tooth, but I'll upgrade all of it after a while when all the next-gen consoles are on the market.
I will of course not throw away this awesome high compatibility equipment.
I don't have experience with interlaced custom resolutions, though I know some people have had success with them. Google is your friend for that. Good luck.
I would definitely tweak the driver to support XP and dual boot the two operating systems, it should still work under a Ryzen motherboard.
I'm never letting a Maxwell card go to waste when it comes to fast XP gaming.
I don't know; supposedly no Pascal card natively supports VGA. I had a look on Google, and there are indeed some 1030s with a D-sub connector, but those may have an internal DAC that adds a tiny bit of lag, don't quote me on that.
Anyway, the thing is that if you look around, you can still find some 950 cards for cheap, and they perform faster than a 1030, with guaranteed native analog over DVI. Some don't even require an additional power connector, so if price and efficiency are an issue, I would hunt for one of those instead. But again, I'm a stickler for XP compatibility as well, which you might not be.
I went for a R9 270x with a passive adapter and emudrivers, as AMD graphics cards can output analog signal through DVI-I up to the R9 380x.
I'm getting a new build and am looking at a GTX 1660 super.
How well would it compare to a Titan X for gaming on a CRT?
Very informative video, thanks for making it!
Now if only I could find a high-end 19-22 inch CRT monitor that can do 2048x1536@85Hz.
I found Iiyama 510 Pro for 50 eur recently. Close enough. Although I wouldn't want to run it higher than 1600x1200 due to dot pitch limitations.
The Dell P1130 can do that progressive, and it can do it at 96Hz on a 3060 Ti. Tested myself.
This is why I'm glad I bought a new 980 Ti in 2016, for $400, instead of waiting for the 1070s to come back in stock.
Still informative all these years later, thank you. I still have my GTX 980 and it seems to support analog out. My question is does it help image quality? I've been using a VGA to HDMI converter and it seems a little blurry even at supported resolutions.
It would have been nice to show some budget-friendly options too. These GPUs seem like ridiculously overkill for a retro game setup.
980 ti is very cheap now. This video is nearly 7 years old
@@pimplepimp Believe me: not so cheap here in Brazil.
@@ph_stuff can't account for prices all over the world, just general areas.
I mean he showed the list in the video, retro gaming doesn't require this OP setup. For me it was $130 for titan Maxwell, but I can only imagine what that'd be elsewhere.
You don't need a ton of horsepower for retrogaming just something that does analogue signal really..
you can input your analog into my output any day
mattguad miss you guad
I’ve heard that the 1080 Ti was the last card to support interlaced. I want to go with that, but is it a good idea? I use a CPD-G420S. The only game I play competitively is SSBM, so I’d just use progressive for that. But I play Minecraft a bunch and would like “higher” resolution, so interlaced seems appealing. Also, how do I go about finding the highest/best interlaced resolution for my monitor? I do have CRU, btw.
I found out how to find the best resolution. Just use CRU, put 4:3 resolutions in, and find the highest one whose resulting horizontal frequency is lower than your monitor's maximum horizontal frequency. Thanks for nothing, anyone who saw this and knew how but didn’t say anything. It took me months to find out, assholes.
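(Editor's note: the search described above can be sketched in Python. The resolution list, the 5% vertical-blanking factor, and the 96 kHz example limit are all illustrative assumptions; use your own monitor's spec-sheet value.)

```python
# Step through common 4:3 resolutions and keep the highest one whose
# horizontal scan rate fits the monitor. Horizontal rate ~= total
# scanlines x refresh; total lines approximated as active lines x 1.05.

def max_4x3_mode(max_h_khz, refresh_hz, v_blank_factor=1.05):
    candidates = [(800, 600), (1024, 768), (1152, 864),
                  (1280, 960), (1600, 1200), (2048, 1536)]
    best = None
    for w, h in candidates:
        h_freq_khz = h * v_blank_factor * refresh_hz / 1000  # kHz
        if h_freq_khz <= max_h_khz:
            best = (w, h)          # list is sorted, so keep the largest fit
    return best

# A monitor limited to ~96 kHz tops out at 1280x960 for an 85 Hz mode.
print(max_4x3_mode(max_h_khz=96, refresh_hz=85))   # (1280, 960)
```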
still need help?
@@Polowogs Nah, I know how to now, but wow, I don't remember writing so angrily lol. Also, btw, the 1080 Ti is not the best card that does interlacing; it's the Titan X Pascal. Not much better than the 1080 Ti, but still better.
@@socaranectien1933 You can use interlacing through an Intel integrated GPU and render with an RTX 30 series. It's a bit limited, but you can do high res like 1920x1440i@140Hz, 2048x1536i@120Hz, 1440x1080i@200Hz, 1280x960i@240Hz. Lemme know if that sounds interesting.
@@Polowogs wow, really? That does sound interesting. How do you achieve this?
Do you still stand by these recommendations in 2021? Thank you for this.
Yes, except that you can likely pair a newer Nvidia card with an older analog-out card set as the PhysX card, and then use the video out on the older one. I haven't tested it personally, but copper4eva did and what he described sounded like it works. Also keep in mind that if you go that route, you'll probably have a lot more troubleshooting to do than with one of the graphics cards I mentioned.
What if you wanted to use two CRTs? The 980 Ti and 280X both have only the one analog output. What are the best cards with multiple analog outputs?
make an audiobook so i can listen to your voice while i fall asleep daddi
Very informative man
Hey man, I got a 1050 Ti and a Samsung SyncMaster 160Hz CRT monitor.
Can you tell me which adapter or cable I should use for this? VGA to DisplayPort or VGA to DVI-D? Plz tell
Does anyone know how to get interlaced resolutions working in GTA 4? I've been able to make it work with CS:GO, but idk how to do it with GTA 4.
What's the lower bound for interlaced resolutions using an Nvidia card?
I might have to opt for an R9 380X because of Linux. Thoughts on that GPU?
Can I use a DVI-I to VGA adapter on my 980 Ti for a CRT monitor?
Fascinating, thank you! I just got my 19-inch ViewSonic and intend to use it with my RTX 3080 laptop via a USB-C to VGA converter. Do you think that would be viable?
Your bandwidth will be capped by a lot, but it depends on what that CRT is meant to do.
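A quick way to see whether a converter's bandwidth cap matters is a back-of-envelope pixel-clock estimate for the mode you want. The blanking ratios (~30% horizontal, ~5% vertical) and the 200 MHz cap below are assumptions for illustration, not specs for any particular USB-C or HDMI-to-VGA DAC; check the actual converter's datasheet if it publishes one.

```python
# Back-of-envelope pixel clock for a VGA mode, compared against an
# assumed DAC limit. Blanking overheads and the cap are guesses,
# not measured values for any real adapter.

def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float) -> float:
    h_total = h_active * 1.30   # active pixels + estimated horizontal blanking
    v_total = v_active * 1.05   # active lines + estimated vertical blanking
    return h_total * v_total * refresh_hz / 1e6

ASSUMED_DAC_CAP_MHZ = 200  # hypothetical cap for a cheap external DAC

clk = pixel_clock_mhz(1600, 1200, 85)
print(f"{clk:.0f} MHz, fits: {clk <= ASSUMED_DAC_CAP_MHZ}")  # -> 223 MHz, fits: False
```

This lines up with reports in this thread of high-refresh 1600x1200 not working through cheap adapters even though the GPU and monitor both support it: the mode's pixel clock can exceed what the converter's DAC will pass.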
@@3hited it's the G90FB, maxes out at 97 kHz, so no unicorn, unfortunately. Turns out I can max it out as per the spec sheet, but for some reason interlaced mode won't work, not even at low res, which kinda sucks. Any ideas?
@@adamplechaty I've heard from some people that the workaround to get interlaced mode working with Nvidia was broken at some point. It's kind of unfortunate; the only solution is probably to use really old drivers, if available.
@@3hited I guess some retro PC with VGA out would solve this... thanks!
@@adamplechaty interlaced resolutions are not supported on Nvidia GPUs past the 1080 Ti, I’ve heard. I don’t know if the Titan X Pascal works for them, though. People seem to forget about the Titan X cards, including in this comment section, even though the Maxwell one is in the damn video. Maybe they haven’t mentioned the Pascal because they’re forgetting the Titan X line entirely.
I’ve heard rumors, however, that it’s possible to use one new card for processing and then route its output through an old card with analog out. My problem is that I’ve only heard rumors, and these people never explain how to actually do it. I don’t even know whether they can do interlaced or not! It’s so infuriating to spread interesting rumors like this while completely ignoring the fact that YOU STILL HAVEN’T EXPLAINED HOW TO DO IT! I’ve looked it up for so long and can’t find how.
What would be reasonable prices for a Titan X or 980 Ti?
Do the RX 570 or RX 580 not work on a CRT monitor, even with an interlaced signal?
How do you get the new drivers for amd cards on windows 8.1?
The best GPU that has analog out is the GTX 980 Ti; you can get it over DVI-I.
As for composite and component, the best GPU is the GTX 285; you can get those over S-Video.
980 Ti isn't the best, as it's basically a trimmed-down Titan X (Maxwell). Then there's another graphics card that is, hardware-wise, the same as a Titan X (Maxwell) but with twice the VRAM.
@@3hited I'm talking about standard GPUs. Titans are collector's items; they cost way more than a GTX 980 Ti but provide little gain. Besides, for CRTs a GTX 980 Ti is way more than enough.
@@rodrigofilho1996 I think you're missing the point of this video. It's not strictly about the best value; it's about what is literally the best performer, which also depends on what games you're playing! I also made this video expecting it to be useful far in the future, when the cost of these best GPUs becomes much less significant. That's why I listed the Quadro card, which is much, much more expensive than a Titan X.
Also, a Titan X genuinely has a lot of value over a 980 Ti: if you have a very high-end CRT you might be running 2048x1536 at 80/85Hz, and VRAM usage in that situation can climb pretty quickly. I bought one knowing that I would be sticking with a CRT for a long time and wouldn't really be able to upgrade again. I also know people who are quite into CRTs telling me they got a Titan X. But yes, I agree a 980 Ti is the better value.
The king of CRTs
lol tyvm
AMD had analog up to the 300 series, not just the 200, though with the 300 it was hit or miss. It was pretty much up to the brand releasing the card, but there's still a good chunk of 380X models with DVI-I. There are even 390 models with DVI-I. So you are wayyyyyyy off with your resolution limit.
I'm not waaaaaay off: it's a firmware limit, not a limit from lack of GPU power. Also, I hope you can understand that if you make a video and say something might have analog out, you will get spam-messaged by every person who's unsure. And in that era of graphics cards, DVI-I ≠ analog out; they needed a RAMDAC as well. Do you know of any specific 390 models with confirmed analog out?
What GPU do you advise for CRT use now? I'm thinking about selling my GTX 1060 (it has DVI-D, HDMI and DP) and buying something to play CS:GO at 100+ Hz. Thanks.
Did u figure it out bro?
So, for example, if I have a 1080 Ti connected to my main ultrawide monitor and an R9 280X to a CRT, will it work?
I would say almost certainly not, but if you have both you could try it. Basically, the way using a second GPU works here is through Nvidia's PhysX. You wouldn't necessarily be using PhysX itself, but PhysX lets you assign almost any Nvidia GPU as a PhysX card (and PhysX isn't even used much anymore). One of the unintended results is that this kind of activates the other GPU, which lets you use it for more video outputs. I'm still not 100% sure this works myself, but one of my viewers did contact me and said it worked for them. This sort of thing is always finicky, though.
I use an MX4000 to connect to a CRT over S-Video; it was the only card I found that does 60Hz over S-Video. I bought a GeForce 6200, but its S-Video output is 30Hz interpolated, and other GeForces are 30Hz too. Do you know of any newer-generation GPU that does 60Hz over component to a CRT?
I'm using AdvMenu because my old PC doesn't work with RetroArch.
I'm from the future, 2021: nah, the Quadro M6000 is still hellishly expensive.
I saw them at $500-600 for a while, but because of crypto mining they went up a lot. IIRC, for some mining workloads just having a lot of VRAM is super important, so the 24GB versions are really good for mining.
3dfx voodoo ... :)
Just bought an R9 270x for 60 euros.
Can I play on a CRT at 144Hz with black bars?
So I'm in a bit of a unique situation, I guess: I use Windows for my Plex server (I could probably migrate it to Linux, but I cba to do that).
I should've just kept using my 5700 XT and bought an older card: process on the 5700 XT and output through one of the CRT Emudriver cards. BUT I needed some hardware decoding, so I went with a Titan X (Maxwell) without thinking it through, lol. Oh well, I really just wanted to play old SNES stuff anyway.
So I'm gonna run RGB2COMP from retroink through it all while retaining my Plex server's decoding. I should've waited till the 5090, since I'm swapping out my main rig's 3090, but hey, a Titan X Maxwell for $120 — why not, it'll be fun :)
what are your specs
tl;dw: Titan X
R u on drugs?