There was this weird period where everyone was stuck at 60 Hz with LCDs for over a decade. I really did miss gaming at 85 Hz on a CRT, but the sharpness of LCDs was just so much more compelling. Glad we now have 120 Hz and OLED tech to match the best qualities CRTs had.
we don't. not even close. you can't beat 0 input lag and 0 motion blur for FPS gaming. OLED is the only thing that will even come close to rivaling CRT technology, and it's not even as good.
@@BradTate there is some input lag even on a CRT, but it's analog and effectively less than 1 frame. A digital signal could also render within 1 frame as long as no buffering of the data stream takes place. I'm not sure if such a protocol exists though.
A "crappy HDMI to VGA adapter" may not be good enough for high resolutions and refresh rates. Most of them have a maximum pixel clock of 165MHz (which is about what you get with 1920x1080@60Hz). Older video cards could do up to 400MHz on their VGA outputs. A CRT monitor is basically limited by the horizontal and vertical frequencies. Newer models (like yours) detect if the signal is out of range and show and error message (older ones either show a distorted picture/nothing at all or just blow up). So, the problem with image quality that probably is because your HDMI to VGA converter is overclocked and producing those problems. As far as know, Delock 62967 (Display port -> VGA) is one of the few converters that can do high pixel clocks. Though it has problems with my KVM switch (plugging the monitor straight to it works fine).
@@laserfights Looks like the 87685 is also lower frequency. From the Delock site: "VGA resolution up to 1920 x 1200 @ 60 Hz" - though that's better than the usual HDMI to VGA adapters that only go up to 1920x1080@60Hz. It's probably 200MHz compared to the usual 165. So you may have the same problem at higher resolutions or refresh rates, at least according to official specs. Maybe the device can actually work at higher frequencies than the specs say though.
@@Pentium100MHz i was under the impression that the 87685 is comparable to the DPU3000. Though, I have only heard that on r/crtgaming, I do not have the DPU3000 in hand to test.
@@laserfights The specs for DPU3000 do not explicitly say what's the maximum pixel clock for VGA, however, apparently it supports "max 2560x1600piex@60Hz each monitor." for "DisplayPort1.2 MST source: (multi video/audio steams)" which would be about 350MHz. I don't know if it's possible to have the MST source for a single monitor, but maybe.
I never had the same feeling on LCD that I had with a CRT. I would pay a lot of money to have a new CRT: the best experience, a sharp picture in quick movement.
I remember playing 1.6 at 1024x768@75Hz. It was so much better and smoother than the 1920x1080@60 7ms TN display I'm currently using to play Global Offensive.
LCDs have an underappreciated problem. The illumination on an LCD is constant, whereas a CRT briefly illuminates very brightly for a few tens of microseconds when the beam scans a pixel (effectively it's like a rolling shutter scanning across the screen). If your eye is tracking an enemy that moves across your screen side to side in 1 second (which is not that fast, see e.g. an arena shooter like Quake III or UT) on a 1920x1080 monitor at 60 Hz, this means that your eyes move 32 pixels in a frame in a continuous motion. The image, however, moves in discrete steps of 32 pixels per frame. This mismatch between continuous analog eye motion and the discrete, discontinuous motion of the picture on the screen smears the image as your eyes scan across those 32 pixels smoothly. The effect is as if you had a 32 pixel "linear blur" filter in Photoshop, and it looks clear as mud. If you have a CRT however, the image briefly flashes in less time than it takes your eye to move 1 pixel; the afterimage on your retina is stationary, but your eyes are moving, giving no persistence blur at all and the illusion of smoother movement than the screen is capable of actually displaying. Flashing the image briefly (BFI etc.) is very helpful as long as it occurs faster than the flicker fusion frequency, around 75-85 Hz for most people, so that the screen doesn't flicker noticeably.
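The 32-pixel smear above falls out of a one-line calculation. A sketch using the comment's own numbers; the `persistence` parameter and the ~10% CRT duty cycle are my assumptions for illustration:

```python
def eye_tracking_blur_px(speed_px_per_s, refresh_hz, persistence=1.0):
    """Width of the smear (in pixels) on the retina while the eye tracks
    motion on a display. persistence is the fraction of the frame the
    image stays lit: 1.0 for a sample-and-hold LCD, near 0 for a CRT flash.
    """
    return speed_px_per_s / refresh_hz * persistence

# Enemy crossing a 1920-wide screen in 1 second on a 60 Hz sample-and-hold LCD:
print(eye_tracking_blur_px(1920, 60))        # 32.0 px of smear
# Same motion with a brief CRT-like flash (~10% of the frame lit, assumed):
print(eye_tracking_blur_px(1920, 60, 0.1))   # 3.2 px
```

This is also why BFI and backlight strobing help on LCDs: they shrink the `persistence` term, at the cost of brightness and (as noted elsewhere in the thread) sometimes added processing lag.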
I have a 200hz HP CRT. Man, it looks amazing. You just can't beat that 3D look. Those amazing colors. And thirdly, the fucking lack of motion blur and the response time. Oh wait... the no pixels part? Leaving everything looking seamlessly natural to the naked eye.
I spent £700 back in 2002 importing a Diamond Mitsubishi 22" SB flatscreen CRT from the USA to the UK. I was exclusively playing Quake III OSP back then, mostly as a dueller but also in Division 1 TDM Euro Cup. Needed a good monitor and that was considered one of the best. That thing was insane - absolutely huge but amazing display quality. Most importantly it could do 1024x768 @ 160Hz. It pushed my Geforce 3 to its limits and I had to buy a Geforce 4 Ti 4800SE the following year just to feed the screen with enough FPS. The max resolution was also insane, well beyond what was usable or feasible, especially for gaming. So much desktop space! And on older titles like Starcraft and Diablo II, hitting that "Super Bright" button lit the extra bulbs inside the monitor and gave them a completely different look and feel. I kept it until 2007, about the time when I stopped taking Quake seriously and started playing only for fun. Sold it to a friend of mine who kept it for many years. I moved over to a 24" Asus LCD and eventually an LG 42" OLED in recent years. But I do miss the precision and the speed and the general awesomeness of that Diamond CRT! Greatest peripheral I ever bought. Except maybe the Razer Boomslang....
@@JinnaiT That is untrue mate, CRTs emit only X-rays, not radioactive radiation like alpha, beta and gamma rays. Still ionizing radiation though, but radiation poisoning is not even remotely possible, lead-infused or not (you can't get poisoned by X-rays; they don't emit particles, except photons). Also, less than 1% of the total radiation is X-rays, most of which radiate in the opposite direction from the viewer's position (better not stand behind it in the case of more "ancient" models). More modern TCO standards also made it completely safe to use certified CRTs...
@@sentheaS It can be, but it isn't like photons = gamma rays. We have radio waves and microwaves, infrared and (ultraviolet) light, X-rays and gamma rays. All of them are made up of photons.
When running interlaced, the higher the refresh rate, the better it works and the less likely you are to notice a difference. This monitor should be capable of 1600x1200 interlaced at 180hz, more fluid than any other display I know of, even 120hz OLED, 240hz IPS or even 360hz VA. Try it. For pixel timings choose CVT: Reduced Blanking. Try it with CS:GO. This can actually give you a competitive advantage. If you have problems, try a different converter or an RX 390X
Great video, I wish I had that monitor (I have a Viewsonic A90, which is not bad but not as good as that one), but just letting you know that monitor can do way higher refresh rates than that. It has a horizontal scan rate of 110khz, so if you do the math - 110000 divided by the vertical resolution you want to use, times 0.95 - that will give you the max refresh rate for that resolution in progressive mode, not interlaced. So let's say you wanna do 1024x768p: you take 110000/768x0.95 = 136.07, so you could run 1024x768 at 136hz max without using interlaced resolutions. Only problem is that I don't know if your hdmi to vga adapter can handle the bandwidth. That adapter might be the only thing holding you back. Look into getting a higher quality one, or if you have a usb c out on your gpu, look for a usb c to vga 4k adapter.
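The rule of thumb above is easy to wrap in a helper. A sketch of the comment's own formula; the 0.95 factor is their vertical-blanking allowance, not a manufacturer spec:

```python
def max_refresh_hz(h_scan_khz, v_lines, overhead=0.95):
    """Max progressive refresh rate for a CRT, per the comment's rule of
    thumb: horizontal scan rate / vertical lines, with ~5% left over
    for vertical blanking."""
    return h_scan_khz * 1000 / v_lines * overhead

# The 110 kHz monitor from the comment:
print(round(max_refresh_hz(110, 768), 1))   # 136.1 Hz at 1024x768
print(round(max_refresh_hz(110, 1200), 1))  # 87.1 Hz at 1600x1200
```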
@@laserfights Hey np, here's a great guide to using CRTs that someone made and posted here it helped me learn a lot about mine linustechtips.com/main/topic/1050162-navs-complete-crt-guide/
@@laserfights Also, I've been looking for more content about modern gaming on CRTs; most people that are into these monitors are into retro gaming. I've already posted a couple of short benchmark videos on my channel just to get some more stuff out there, but I was thinking you should also focus more on CRTs and modern games. I think people would really enjoy that kind of content (I know I do) and it helps spread the word about how good a nice CRT really can be. It's a really unique experience.
@@TripleMoonPanda I would absolutely be interested in posting more crt content. If you have any suggestions or even places/guides/etc for more information I would greatly appreciate it.
@@laserfights Well, Digital Foundry's videos on the subject are a great place to start, but you might've already seen those. I'm in this facebook group called the CRT Collective, and there are a lot of great people there who love CRT monitors and TVs; there's also the CRTGaming subreddit. Those are the best places I know to go for information. I'm going to start trying to post some more videos on the subject, I just don't know where to start, I'm still learning about all this stuff too. But if you have some simple questions just throw them my way and I'll do my best to help out.
(I have been corrected, read the edit.) Please be careful. Overclocking a CRT can be EXTREMELY DANGEROUS. Unlike the flat screens of today, CRTs can explode if put under too much stress. They’re vacuum sealed, and full of energy. (I have been corrected. I confused overvolting, and overclocking. The original comment is archived to avoid confusion.)
@@WhoTnT I thought CRTs were volatile when too much voltage was applied. There was a guy who did a lot of overclocking videos. He did a few involving CRTs, but I found him through his video involving a washing machine. The overclocking itself didn't break it, but he threw a brick into it and it began tearing itself apart. The CRT video featured a TV getting more and more voltage until eventually the screen stopped working with a loud bang. Perhaps he put some kind of explosive in it, but this was from a long time ago.
What memories the CNET site brings back. A few years ago they removed all traces of CRT monitors, and many of these monitors now have almost no information on the Internet :'/
The worst part is many of those pages were not even saved to the Wayback Machine, so they really are gone. I had one saved in my bookmarks for my Compaq 21" CRT many years ago, and now I wanted to check the specs again and boom, it's gone - no trace of it, not even in the archive's Wayback Machine. The manuals are gone too. It reaaaally sucks.
for the green tint and artifacts, im not sure if it helps but setting the color space from Full to Limited fixed my issue. anyone who is having this issue might wanna try that
The green issue is because of the analog bandwidth exceeding the capability of the hdmi to vga adapter. A solution would be to use a Sunix DPU3000, which allows you to use 600mhz of analog bandwidth (above even native analog cards like the 980ti at 400mhz). It maxes out any crt except some crt projectors, even enabling 4:3 4k60 on the monitors which can scan it, like the 2070sb. By the way, this is not overclocking. You call it overclocking with fixed pixel displays because they're designed to hit one specific refresh rate at native res, but can sometimes go above that refresh rate, hence the "overclocking" moniker. Crts just have a minimum and maximum vertical refresh rate (hz), and a minimum and maximum horizontal scan rate (khz, determined by vertical resolution and vertical refresh rate). Anything within those two ranges will work; it's designed to. Overclocking a crt's refresh rate would be exceeding those specifications by modding the monitor in some way. Cheers!
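The "within spec, not overclocking" point can be made concrete with a small range check. A sketch; the 5% vertical-blanking allowance and the example ranges are assumptions - real numbers have to come from the monitor's own manual:

```python
def mode_in_range(v_res, refresh_hz, v_range_hz, h_range_khz,
                  v_blank_frac=0.05):
    """True if a mode sits inside a CRT's rated vertical refresh range (Hz)
    and horizontal scan range (kHz). Anything inside both ranges is a
    designed-for mode, not an overclock. v_blank_frac is an assumed
    vertical-blanking allowance."""
    h_scan_khz = v_res * (1 + v_blank_frac) * refresh_hz / 1000
    return (v_range_hz[0] <= refresh_hz <= v_range_hz[1]
            and h_range_khz[0] <= h_scan_khz <= h_range_khz[1])

# Hypothetical monitor rated 50-160 Hz vertical, 30-110 kHz horizontal:
print(mode_in_range(768, 136, (50, 160), (30, 110)))   # True
print(mode_in_range(1200, 120, (50, 160), (30, 110)))  # False: ~151 kHz needed
```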
The people at Digital Foundry use a USB-C to VGA adapter to connect their C.R.T. computer monitor to an RTX 2080 Ti. Do you know if USB-C to VGA adapters suffer from this green screen coloring, due to a lack of bandwidth? How does DVI-I/DVI-A to VGA compare to USB-C, lag-wise? One is analog to analog, the other is digital to analog.
@@badreality2 yes, at a certain point they can, but usually they'll just cut out, which an hdmi to vga will do as well if you go too far above the bandwidth capability. Dvi-i to vga is lagless because it's analog to analog. The pins for vga are on the header of dvi-i, so it's not even a conversion, simply a cable adapter. With digital to analog transcoding (no scaling, just signal conversion, which is what these devices do), added lag is under 1ms, so subjectively zero.
@@copper4eva Unfortunately, because it went out of stock a few months back, yes, it is rare and expensive. A good price would be 60-80 usd, which is what it cost originally, but now that it's hard to find I've seen ridiculous prices. There may be rebrands of it with the same chip floating around the net somewhere.
@@displaytalk That original price is very fair. Too bad I suppose that it's now being sold higher. I think people should just get cheap analog graphics card and run it along with their modern card (as I mentioned in other comments). Just a cheap and for sure way to hook up a CRT.
Interlaced works in something called "fields". It displays half of the image in one field, and the other half in a different field. That's what causes the "flicker" associated with it.
If you want, you can buy an older gtx card (like a 900 series or whatever) that has analog out. And if you have two pci express slots on your motherboard, you can run it together with your 1080 Ti. Like, your 1080 Ti will still be doing all the work, the 900 series will literally just be outputting analog for you. Basically an over glorified digital to analog converter. You can get an older Nvidia card pretty cheap of course. That CRT is a beast. You can definitely get higher refresh rates with a native analog connection rather than your hdmi converter. I was able to get 1280x1024@137Hz, and my CRT ain't half what yours is. I wish I could get one like yours. These things can be overclocked. th-cam.com/video/Ya3c1Ni4B_U/w-d-xo.html This dude got his running at 2496x1872@60Hz. Completely insane what some of these things can do.
@@webb4158 Funny you should ask - on the crtgaming reddit, somebody said in the comments that they had done it with AMD graphics cards. So, if he is to be believed, then yes, you can. It works rather well, if you have a desktop that can support it obviously. You might have trouble getting your OS to figure out that you have two GPUs, but once you do, you're good.
@Big Smoke I have a 5 monitor setup with a 960 GTX and 1060 GTX. So multiple monitors are fine. Only one analog out though sadly. I guess you'll have to look at older cards, which might have multiple analog outputs, but I'm not sure.
via chat gpt: can ur crt monitor explode if you over clock it too much?
ChatGPT said:
While it's unlikely that a CRT monitor would literally explode from overclocking, overdriving it can cause serious problems. Overclocking a CRT monitor means increasing its refresh rate beyond its intended specifications, which could lead to:
Overheating - Components inside the monitor can overheat, potentially causing a failure.
Degraded image quality - Flickering, distortion, or misalignment of the image may occur.
Shortened lifespan - The higher strain on components can reduce the monitor's lifespan.
Electrical failure - In rare cases, pushing a CRT too far could cause internal components, like capacitors or transistors, to fail, possibly with smoke or sparks, but not an explosion.
It's best to stick within the manufacturer's specifications to avoid damage.
you can lower or alleviate any harm done when changing the operating resolution and frequency by offsetting the values when you make changes. what i mean by that is if you want more resolution, then drop the refresh rate. that way, the bandwidth you are pushing can still be comparable to what you had before. there are actually a number of specs from the factory that do come like this. The listed Maximum Refresh Rates from the manual are:
1600 x 1280 NI @ 71 Hz
1600 x 1200 NI @ 76 Hz
1280 x 1024 NI @ 88 Hz
1024 x 768 NI @ 116 Hz
800 x 600 NI @ 146 Hz
640 x 480 NI @ 180 Hz
640 x 400 NI @ 180 Hz
And then you can just play in between these values and still have a pretty safe use.
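Those factory maxima illustrate the offsetting idea: each listed mode converges on roughly the same horizontal scan ceiling, which is the budget actually being traded between resolution and refresh. A sketch; the 5% vertical-blanking allowance is an assumption:

```python
def scan_rate_khz(v_lines, refresh_hz, v_blank_frac=0.05):
    """Approximate horizontal scan rate a mode demands, with an assumed
    ~5% vertical-blanking allowance."""
    return v_lines * (1 + v_blank_frac) * refresh_hz / 1000

# The manual's maximum modes all land near the same ~91-96 kHz ceiling:
for v, hz in [(1200, 76), (1024, 88), (768, 116), (600, 146), (480, 180)]:
    print(f"{v} lines @ {hz} Hz -> {scan_rate_khz(v, hz):.1f} kHz")
```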
We threw two perfectly functional CRTs in the local dump - that's just what you did when everyone moved to LCD TVs and LCD monitors, thinking CRTs were old news. I kept my CRT TV for years, but ultimately it stopped working, as it was a normal cheap TV as opposed to an actually decent monitor.
It looks smooth and it has a certain fidelity to it that really makes you feel like you're playing a video game. Curious as to how I can explore the possibilities with this. Doom Eternal is certainly the perfect game for this, and it is a very fun game.
watching this in 2024 with my old Sony Trinitron at 1280 x 960 and 85hz. just swapped from my main 165hz monitor and i can tell that the 165hz LCD has never been smoother than this 85hz CRT. if i could get my hands on something like even a 21" CRT, i'd probably stick with that as long as it has time left
The main downside is they use VGA, or in the case of the high end models with crazy resolutions, DVI - both of which have been discontinued. Thankfully, as you said, because they interlace differently, using a converter doesn't impact the video quality. We're talking interlaced VGA and DVI here, which translates crisper than even analog RGB - not something like composite. Yeah, with this sort of stuff it's all analog; we're talking wave signals here, not digital LCD pixel signals.
If you use any adapter then you will have input lag and it won't work properly at high hz. You have to use VGA ports, and the last video card with VGA on it is the GTX 750 Ti. I use that graphics card with a CRT and it feels so smooth, and it runs at 130hz. So amazing!
when was NFS Most Wanted at its peak?! i still had a 1024x768 crt at that time and the integrated nvidia 6600 or something like that was so bad i had to play it at some x800 resolution, and it still looked freeking awesome. Bring crt back!!!!!
i can push my old Visio tv at native 1360 by 768 to 144hz. not the smoothest and it def tears, but it's pretty cool to think about the refresh rates i could have been getting when it was my main monitor.
I know this is an old video, but I have the same monitor and ran into the green artifacts in blacks at high resolutions. It's because you're outputting full RGB. From what I've read, a lot of hdmi converters don't like the wider darkness range of full RGB. If you switch your output to limited RGB, the green will go away.
There are IBM and Sony professional Trinitron monitors that could do 120hz 1600x1200 progressive. I know, i had one. It was a 21 inch flat screen Trinitron and it was massive, weighed darn near 100lbs. Was amazing for motion. Just took up too much space to be practical. The only monitor that has kept up with it is my newest 144hz LG UltraGear ultrawide, and it cost nearly 1000.
I wonder if you can use a program on your desktop called "Custom Resolution Utility" with this CRT monitor. you can go pretty high with it, i guarantee you that, but I would first watch how to do it and understand it
My monitor can do 800x600 120 hertz / 1024x768 100 hertz / 1366x768 86 hertz / 1920x1080 60 hertz and many other customized resolutions, but those are the ones i mostly use... motion clarity is at another level compared to LCD, LED, OLED... that said, i must confess i prefer using LED over CRT because of the flickering; it's so hard on the eyes that it makes you want to go running back to LED/LCD-IPS.
oh that’s unfortunate, I’ve been daily driving an OLED for awhile now. Are you prone to headaches or what’s happening? i haven’t even really considered the flickering but I also don’t really hear CRT whine and people comment about it a lot haha
@@laserfights Yes man, sometimes i get headaches when i use CRTs, mostly as a result of the flickering. i love CRTs but I guess I'm not used to them anymore. see, I grew up with these monitors and used them on a daily basis until 2007.. Now i can't look at the screen for long because it hurts my eyes badly and feels hard on the eyes... even at high refresh rates. I love CRT motion clarity, but i feel that's not a big deal to me anymore since modern LED displays are doing very well on motion clarity and response time. When I want to revisit old games that fit better on CRT, I use scanlines and dithering to emulate a CRT, and i'm starting to enjoy that more than playing on a native CRT for some reason... I still have 3 of those big boys here but i never hook them up anymore...
Download CRU (Custom Resolution Utility). You can set resolution and timing up to the max of your horizontal frequency - 240 or something for your monitor; I'm on a mobile phone so I can't check it well. You can also enable interlaced resolutions, which double the achievable screen height, so you can have the same resolution at a max of 160hz or however much your monitor supports. The downside of an interlaced image is that it's bad for text in Windows, but it's good for games and movies.
There's a better tool for overclocking monitors, called "Custom Resolution Utility" or CRU. It's much more complicated but it gives you access to things like custom resolutions, refresh rates, and even detailed timing parameters that go beyond what the standard control panels from NVIDIA or AMD allow. However, its flexibility comes with risks, such as potential damage if you push settings too far beyond the monitor's specs. For advanced users who understand CRT limitations, CRU is a powerful tool, but it requires careful handling.
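The interlaced option mentioned in the comments above works because each pass only draws every other line, halving the horizontal scan rate a mode demands. A sketch with an assumed 5% vertical-blanking allowance:

```python
def h_scan_demand_khz(v_lines, refresh_hz, interlaced=False,
                      v_blank_frac=0.05):
    """Horizontal scan rate (kHz) a mode asks of the monitor. Interlacing
    halves it, since each field carries only half the lines."""
    lines_per_pass = v_lines / 2 if interlaced else v_lines
    return lines_per_pass * (1 + v_blank_frac) * refresh_hz / 1000

# 1600x1200 at 120 Hz, progressive vs interlaced:
print(round(h_scan_demand_khz(1200, 120), 1))                   # 151.2 kHz
print(round(h_scan_demand_khz(1200, 120, interlaced=True), 1))  # 75.6 kHz
```

That halving is why a monitor that tops out around 110 kHz progressive can still run the same resolution interlaced at much higher refresh rates.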
oh wow. i kinda want this monitor. but wouldn't the phosphors be active too long and give you ghosting? this is a very interesting video. i want to know how long the actual pixels glow. i have a lot of questions about the mechanics of this monitor.
Damn, no matter what game it is and how modern, it still gives me Counter Strike 1.6 vibes because it's coming out of that massive tube. Also would have been better to see it at night with no reflections. And would be cool to see it compared to an OLED.
I thought only Sony Trinitrons had HD CRTs. Wow. Edit: I thought 1440p 120hz was a 2020+ thing they had just invented. Cool to know it was already available waay back in the day.
I could never go back to a CRT as I have no room for it and my eyes are getting quite bad so a 21" monitor is too tiny for my eyes. Even my 43" is a little too small. I am on the hunt for a 60", but they are too expensive at the moment. But, I am glad you are happy with yours.
I think the custom resolution part was the overclocking part. Though there wasn't some super balancing act of 10 sliders to get the extra hz, it still did the job to get interlaced 87fps (2x the refresh rate).
The Xbox Series X will only support up to 120hz, and I was wondering if a CRT is faster than a 240hz monitor when both are at 120hz. Would the 240hz monitor at 120hz still outspeed a CRT at 120hz? I know for sure the 240hz at 240hz is faster, but I was wondering this since I'm buying a monitor for the Xbox Series X.
@@User_3303_ there are lots of factors with things like ghosting, response time, and etc that aren't accounted for via refresh rate numbers like 240hz and 120hz. If you are buying a decent 240hz monitor, then I would think it would be much easier to have support for newer games, and I would recommend going with something newer based on that alone.
*i mean, it's a curved screen, who doesn't like a curved screen?*
Curved the other way
@@x256h *well its still a curved screen*
Ronaldo Ross me
@YDBoss youre dumb
me
imagine telling ur friends that ur monitor is also overclocked
gotta go fast
that's true bro, i have a lot of monitors in my house and one day I tested them all, and my old 15 inch crt was the best looking. playing cs go at a high level i can truly tell, this monitor has an incredible architecture, high quality picture and excellent refresh rates
I overclocked my laptop monitor to do 90hz. It's not uncommon, even for LCD displays.
rn i got a smooth 75hz
That Plotagon profile pic 👍🏻
No input lag, no motion blur... I would take it.
If the resolution is too high it has input delay
@@IAm-zo1bo Nope.
@@IAm-zo1bo there's no kind of buffer for that to happen, or anything really - pure analog tech. 4k 60hz? then you get a 4k image every 16ms; want it sooner? crank the refresh rate up.
There is motion blur.... the phosphors don't instantly go back to black. Whether it's better than high-end LCDs is debatable nowadays
@@Purple__ motion blur on CRTs is better than on most LCDs. Ironically, the best LCDs with motion-blur-reduction techniques have insane input lag caused by strobing backlight drives and BFI (black frame insertion) processing, which can add half a frame of lag or worse, and it all adds up cumulatively. CRTs are the gold standard for motion blur: phosphor excitation and decay happen in well below a millisecond. You can't say that about the black-to-black/grey-to-grey/fall-to-rise on any LCD. Maybe OLEDs get closer, but they still suffer from BFI-induced processing lag that CRTs simply don't have to worry about, thanks to the nature of an electron scanning beam and a phosphor-coated vacuum tube drawing persistent images humans can make illusory motion out of.
I just managed to get an insane resolution at 60hz: 2560x1920. Unreal. The old CRT I grew up with had the power to beat the 1440p monitor I just bought. This is lunacy.
We just never had cards that could render 4k at all to show the potential of CRTs in the early 2000s.
I remember laughing at lcd gamers back in the day. Lcds didn't match crt until never. Oled is the only comparable tech to crt.
@@zapa47 not even. They need to reach 1000hz to start simulating CRT strobing. At least they'll be here in 2028. Not too far off
@@jwhi419 not even? my new 32 inch ASUS PG32UCDM 4k 240hz QD-OLED will look better than any CRT ever did or will. CRT is good, but OLED, especially QD-OLED (Samsung tech) or WOLED (LG tech), is insanely good. It's a lot better looking than any other tech atm. Micro LED (not Mini LED, not any current tech at least) could be the holy grail, but it doesn't exist yet in any purchasable form. So for the time being, the panel in my monitor is the best you can buy at the moment. There is a burn-in risk with OLED (but CRTs also have a risk of burn-in afaik); it's still a possibility, but ASUS gives a 3 yr warranty and the amazing picture quality is worth the risk to me.
so yes, 4k240hz QD-OLED isn't just comparable to CRT, it's the superior looking tech; no CRT monitor on the market looks better.
would be really cool tho to get the best possible CRT monitor, a newly built one with 4k res. If they could build a 4k 240hz CRT, I would gladly compare it to my QD-OLED. But it isn't viable to make a 4k CRT: it would be too expensive to make, it would be big and heavy, and as I previously said, it would probably also have a burn-in risk (CRTs apparently have this risk innately, according to my searches). They emit a lot of electromagnetic waves (and are also very sensitive to magnetic fields themselves), many people could hear a type of coil whine, and they use a fckton of power. Yes, they do have very fast response times (like OLED), but nowadays OLEDs are the way to go; you aren't missing anything with a GOOD OLED screen.
@@zapa47 lol
I remember doing this as a kid :D
Lemme try to explain how the whole interlaced vs progressive thing works:
With interlaced framing/refresh, you're effectively halving the vertical resolution to achieve a higher scan refresh rate.
In the case of a 1440-line interlaced vertical resolution, each displayed frame is effectively only 720 lines per scan-pass, and each scan-pass alternates back and forth between the odd and even scan lines on the display. Due to the nature of CRTs' inherent softness and quick response, the alternation between the odd and even scan lines is effectively imperceptible to the naked eye.
With progressive framing/refresh, each frame utilizes the entire vertical resolution of the display, and the scan lines for each frame are 'progressively' drawn one at a time instead of alternating between odd and even lines. On CRTs, especially at higher resolutions, progressively scanned images will generally appear brighter, but they may also tend to appear to flicker more prominently at lower refresh rates.
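The odd/even alternation described above can be sketched in a few lines of Python (a toy model, not tied to any real driver; the 1440-line figure comes from the comment):

```python
# Toy model of interlaced vs progressive scanning on a CRT.
# Each interlaced "field" carries only half the lines, so two
# passes (odd lines, then even lines) make up one full frame.

def fields(total_lines, mode):
    """Yield the line numbers drawn in each scan-pass."""
    if mode == "progressive":
        yield list(range(total_lines))        # every line in one pass
    else:  # interlaced: odd field, then even field
        yield list(range(1, total_lines, 2))  # odd lines
        yield list(range(0, total_lines, 2))  # even lines

passes = list(fields(1440, "interlaced"))
print(len(passes))     # 2 passes per full frame
print(len(passes[0]))  # 720 lines per field
```

So at the same line rate, interlacing trades half the lines per pass for twice the passes per second, which is why it lets a CRT reach higher apparent refresh rates.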
I miss my ViewSonic so bad. I should never have changed to a flat panel. CRTs were heavy but worked forever. That's why they had to be replaced with thousand-dollar TVs that need to be replaced every 5 years.
More like 2 years... my 55" Samsung 4K TV lasted exactly that long; it broke 2 months after the warranty ended (purple spots all over the panel). And that was already the second panel; my original one was pretty much broken the day I bought it... clouding and dark spots. I got it replaced 6 months after I had it
Are you joking? CRT get screen burn-in pretty easy and the brightness is terrible.
I've never had a flat screen last more then like 2 years with out breaking in some way
@@張彥暉-v8p you're wrong, CRTs can be very bright, and screen burn-in isn't really that easy. An example of a bright CRT is the IBM 5151.
@@ducksonplays4190 The IBM 5151 is a low-resolution, single-color display, and you're comparing it with full-color CRTs and other display tech? LOL.
There was this weird period of time where everyone was stuck at 60 Hz with LCDs for over a decade. I really did miss gaming at 85 Hz on a CRT, but the sharpness of LCDs was just so much more compelling. Glad we now have 120 Hz and OLED tech to match all the best qualities CRTs had.
we don't, not even close. You can't beat 0 input lag and 0 motion blur for FPS gaming. OLED is the only thing that will even come close to rivaling CRT technology, and it's not even as good.
@@BradTate there is some input lag even on a CRT, but it's analog and effectively less than 1 frame. A digital signal could also render within 1 frame as long as no buffering of the data stream takes place. I'm not sure if such a protocol exists though.
@@BradTate also, a good OLED or LCD should have no more motion blur than a CRT now, since they have no phosphors that retain an image.
LCD is shitty.
@@BradTate facking true.
Peasant : Overclocked CPU
Gigachad: Overclocked monitor
Omnichat: Overclocked CMOS battery
@@chidubem826mirachad :
Hi..
The best budget gaming monitor you can get, coming to an attic near you
Aww man. I miss CRT's so much!
A "crappy HDMI to VGA adapter" may not be good enough for high resolutions and refresh rates. Most of them have a maximum pixel clock of 165MHz (which is about what you get with 1920x1080@60Hz). Older video cards could do up to 400MHz on their VGA outputs.
A CRT monitor is basically limited by its horizontal and vertical frequencies. Newer models (like yours) detect if the signal is out of range and show an error message (older ones either show a distorted picture, show nothing at all, or just blow up). So the image-quality problem is probably because your HDMI to VGA converter is being pushed past its rated pixel clock and producing those artifacts.
As far as I know, the Delock 62967 (DisplayPort -> VGA) is one of the few converters that can do high pixel clocks. Though it has problems with my KVM switch (plugging the monitor straight into it works fine).
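For reference, a pixel clock is just total pixels (active plus blanking) times refresh rate. A rough Python sketch using the standard CEA/CTA-861 totals for 1920x1080@60 (2200x1125), which is why that mode sits just under a 165 MHz adapter cap; other modes use different blanking, so treat this as an approximation:

```python
# Rough pixel-clock estimate: total pixels per frame (active
# area plus blanking intervals) times the refresh rate.
# 2200x1125 are the standard CTA-861 totals for 1920x1080@60.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2200, 1125, 60)
print(round(clk, 1))  # 148.5 MHz -- under a 165 MHz cap, but not by much
```

Push the resolution or refresh rate higher and the required clock quickly blows past what a cheap converter's DAC can handle.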
I've actually had a Delock 87685 in hand for a while; been wanting to make a follow-up video
@@laserfights Looks like 87685 is also lower frequency
From the Delock site "VGA resolution up to 1920 x 1200 @ 60 Hz" - though it looks better than the usual HDMI VGA adapters that only go up to 1920x1080@60Hz. It's probably 200MHz compared to the usual 165.
So you may have the same problem at higher resolutions or refresh rates, at least according to official specs. Maybe the device can actually work at higher frequencies than the specs say though.
@@Pentium100MHz i was under the impression that the 87685 is comparable to the DPU3000. Though, I have only heard that on r/crtgaming, I do not have the DPU3000 in hand to test.
@@laserfights The specs for DPU3000 do not explicitly say what's the maximum pixel clock for VGA, however, apparently it supports "max 2560x1600piex@60Hz each monitor." for "DisplayPort1.2 MST source: (multi video/audio steams)" which would be about 350MHz. I don't know if it's possible to have the MST source for a single monitor, but maybe.
lol, blow up? I'd like to see that
I never got the same feeling on an LCD that I had with a CRT. I would pay a big amount of money to have a new CRT: the best experience, the sharpest picture in quick movement.
totally agree, hopefully OLED starts to get close
I remember playing 1.6 1024x768@75Hz. It was so much better and smoother than the 1920x1080@60 TN with 7ms display I'm currently using to play Global Offensive.
I still use a CRT at 160Hz. LCD is a waste of money.
Probably because you didn't disable vsync
@@ChineseNinjaWarrior wtf
LCDs have an underappreciated problem. The illumination on an LCD is constant, whereas a CRT briefly illuminates very brightly for a few tens of microseconds as the beam scans a pixel (effectively it's like a rolling shutter sweeping across the screen).
If your eye is tracking an enemy that moves across your screen side to side in 1 second (which is not that fast; see e.g. an arena shooter like Quake III or UT) on a 1920x1080 monitor at 60 Hz, your eyes move 32 pixels per frame in a continuous motion. The image, however, moves in discrete steps of 32 pixels per frame. This mismatch between continuous analog eye motion and the discrete, discontinuous motion of the picture on the screen smears the image as your eyes sweep smoothly across those 32 pixels: the effect is as if you had applied a 32-pixel linear blur filter in Photoshop, and it looks clear as mud. On a CRT, however, the image flashes in less time than it takes your eye to move 1 pixel; the afterimage on your retina is stationary while your eyes move, giving no persistence blur at all and the illusion of smoother movement than the screen is actually capable of displaying. Flashing the image briefly (BFI etc.) is very helpful as long as it occurs faster than the flicker fusion frequency, around 75-85 Hz for most people, so that the screen doesn't flicker noticeably.
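The 32-pixel figure above is simply tracking speed divided by refresh rate; a quick Python check using the comment's numbers:

```python
# Sample-and-hold smear: if the eye tracks an object crossing
# the full screen width in 1 second, the per-frame step (and
# hence the perceived blur width) is speed / refresh rate.

def smear_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

print(smear_px(1920, 60))   # 32.0 px of hold-type blur on an LCD
print(smear_px(1920, 240))  # 8.0 px -- higher refresh shrinks the smear
```

This is also why strobing/BFI helps: it shortens the time each frame is lit, which shrinks the smear independently of the refresh rate.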
CRT better due to zero motion blur. 🤗
I have a 200Hz HP CRT. Man, it looks amazing. You just can't beat that 3D look, those amazing colors, and thirdly the fucking lack of motion blur and response time. Oh wait... the no-pixels part? Leaving everything looking seamlessly natural to the naked eye
crts are amazing but oled is amazing as well
Whats the model number so I can look up specs.
I'm only aware of an Iiyama exceeding 200Hz.
I think this is clickbaity... the title says "1440p 120hz", making it seem he was able to run that res and Hz at the same time...
I spent £700 back in 2002 importing a Diamond Mitsubishi 22" SB flatscreen CRT from the USA to the UK. I was exclusively playing Quake III OSP back then, mostly as a dueller but also in Division 1 TDM Euro Cup. I needed a good monitor and that was considered one of the best. That thing was insane: absolutely huge but amazing display quality. Most importantly, it could do 1024x768 @ 160Hz. It pushed my GeForce 3 to its limits and I had to buy a GeForce 4 Ti 4800SE the following year just to feed the screen with enough FPS.
The max resolution was also insane, well beyond what was usable or feasible especially for gaming. So much desktop space! And on older titles like Starcraft and Diablo II, hitting that "Super Bright" button lit the extra bulbs inside the monitor and gave them a completely different look and feel.
I kept it until 2007, about the time when I stopped taking Quake seriously and started playing only for fun. Sold it to a friend of mine who kept it for many years. I moved over to a 24" Asus LCD and eventually an LG 42" OLED in recent years. But I do miss the precision and the speed and the general awesomeness of that Diamond CRT! Greatest peripheral I ever bought. Except maybe the Razer Boomslang....
LG C4?
@@SS-uq6wv C3!
A screen with an ELECTRON GUN! those were the days.
Could also give you radiation poisoning, if there wasn't lead infused into the glass on the front!
@@JinnaiT That is untrue, mate. CRTs emit only X-rays, not radioactive particle radiation like alpha, beta, and gamma rays. Still ionizing radiation, but radiation poisoning is not even remotely possible, lead-infused or not (you can't get poisoned by X-rays; they don't emit particles except photons). Also, less than 1% of the total radiation is X-rays, most of which radiate in the opposite direction from the viewer's position (better not to stand behind it in the case of more "ancient" models). More modern TCO standards also made certified CRTs completely safe to use...
Unless you spill liquid uranium on your crt and lick it up, then you can get radioactive poisoning from your crt haha... but better don't do it ;)
@@MrRobsn89 photons ARE gamma rays, just the ones in this case are fairly low energy. also, non-radioactive radiation is an oxymoron.
@@sentheaS It can be, but it isn't like photons = gamma rays. We have radio waves and microwaves, infrared and (ultraviolet) light, X-rays and gamma rays. All of them are made up of photons.
You can set it to progressive scan in the nvidia menu. I'm running 1024x768 at 116hz progressive on my NEC950!
Interlaced is better on CRT TVs and Monitors.
@@Miawzhies right, I think it's like the analog equivalent of doubling resolution or refresh rate without changing the bitrate...
I was in love with my old ViewSonic E50c that I got back in 2004 with my 1st PC ! That was a thing of beauty !
yeah, I remember how my CRT monitor wasn't plugged into a surge protector like my PC was, and it caught fire during a huge lightning storm.
:(
I'd shit myself 😂😂
Sorry to hear that :-(
This video is incredible
watching this from my SyncMaster 794v OC at 113Hz :) still rocking with my old CRT!
50fps interlaced is not 100fps; in fact it is 50 half-frames per second (every second line), so 50i uses the same bandwidth as 25p
cool bro. just wanted some1 to counter argue you but no1 did.😥
@@AugustoV8Cesar lmaooo
Try 1280x960 Progressive, and try to get it to 120Hz.
Progressive > Interlaced
Does that really matter with CRT? (I don't know, I haven't used a CRT for PC gaming ever)
@@muizzsiddique Yes it does. It's less important on CRTs, but a progressive image will be cleaner when the image is moving, even on a CRT.
@@rodrigofilho1996 Oh okay. Thanks.
what you mean? how is it better
@@AugustoV8Cesar Interlaced is actually half the refresh rate of progressive.
There was just something so truly special about CRTs you know that's why they had to take them away
When running interlaced, the higher the refresh rate, the better it works and the less you are likely to notice a difference.
This monitor should be capable of 1600*1200 interlaced at 180hz, more fluid than any other display I know of, even 120hz OLED, 240hz IPS or even 360hz VA.
Try it. For pixel timings choose CVT:Reduced blanking.
Try it with CS:GO. This can actually give you a competitive advantage. If you have problems, try a different converter or an R9 390X
Great video, I wish I had that monitor (I have a ViewSonic A90, which is not bad but not as good as that one). Just letting you know, that monitor can do way higher refresh rates than that. It has a horizontal scan rate of 110kHz, so the math is 110,000 divided by the vertical resolution you want to use, times 0.95, and that gives you the max refresh rate for that resolution in progressive mode, not interlaced. So let's say you want to do 1024x768p: you take 110,000 / 768 x 0.95 = 136.07, so you could run 1024x768 at 136Hz max without using interlaced resolutions. The only problem is that I don't know if your HDMI to VGA adapter can handle the bandwidth. That adapter might be the only thing holding you back. Look into getting a higher-quality one, or if you have a USB-C out on your GPU, look for a USB-C to VGA 4K adapter.
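That rule of thumb can be wrapped in a tiny helper; the 0.95 blanking factor is the estimate from the comment above, not a published spec:

```python
# Estimate the max progressive refresh rate a CRT can scan:
# horizontal scan rate / vertical lines, minus ~5% headroom
# for vertical blanking (the 0.95 factor is a rough estimate).

def max_refresh_hz(h_scan_hz, v_lines, blanking_factor=0.95):
    return h_scan_hz / v_lines * blanking_factor

print(round(max_refresh_hz(110_000, 768), 1))   # ~136.1 Hz at 1024x768
print(round(max_refresh_hz(110_000, 1200), 1))  # ~87.1 Hz at 1600x1200
```

The second figure lines up with the 1200p @ 87Hz result mentioned elsewhere in the thread, which suggests the monitor was already scanning near its horizontal limit at that resolution.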
Wow I had no idea, I'm going to have to do some testing. Thanks!
@@laserfights Hey np, here's a great guide to using CRTs that someone made and posted here it helped me learn a lot about mine linustechtips.com/main/topic/1050162-navs-complete-crt-guide/
@@laserfights Also I've been looking for more content about modern gaming on CRTs, most people that are into these monitors are mostly into retro gaming. I've already posted a couple short benchmark videos on my channel just to get some more stuff out there, but I was thinking you should also focus more on CRTs and Modern games I think people would really enjoy that kind of content (I know I do) and it helps spread the word about how good a nice CRT really can be. It's a really unique experience.
@@TripleMoonPanda I would absolutely be interested in posting more crt content. If you have any suggestions or even places/guides/etc for more information I would greatly appreciate it.
@@laserfights Well Digital foundrys videos on the subject are a great place to start, but you might've already seen that. I'm in this facebook group call the CRT collective and there are a lot of great people who love CRT monitors and TVs also there's the CRTGaming subreddit. That's most of the best places I know to go for information. I'm going to start trying to post some more videos actually on the subject I just don't know where to start, I'm still learning about all this stuff also. But if you have some simple questions just throw them my way and I'll do my best to help out.
(I have been corrected, read the edit.) Please be careful. Overclocking a CRT can be EXTREMELY DANGEROUS. Unlike the flat screens of today, CRTs can explode if put under too much stress. They’re vacuum sealed, and full of energy. (I have been corrected. I confused overvolting, and overclocking. The original comment is archived to avoid confusion.)
I really hope this is a troll comment because nothing you said is correct.
@@WhoTnT I thought CRTs were volatile when too much voltage was applied. There was a guy who did a lot of overclocking videos. He did a few involving CRTs, but I found him through his video involving a washing machine. The overclocking itself didn't break it, but he threw a brick into it, and it began tearing itself apart. The CRT video featured a TV getting more and more voltage until eventually the screen stopped working with a loud bang. Perhaps he put some kind of explosive in it, but this was from a long time ago.
@@WhoTnT it's not; a CRT screen is basically a vacuum tube
@@beanmchocolate3900 Overclocking is different from Overvolting.
So you wrote this comment with zero knowledge?? Source: trust me bro
you can do 1920x1440i at 140+Hz on that display; you need to download CRU and use an interlaced resolution
It looks so smooth and flawless.
What memories, the CNET site. A few years ago they removed all traces of CRT monitors, and many of these now have almost no information left on the Internet :'/
:(
The worst part is many of those pages were not even saved to the wayback machine so they really are gone, I had one saved in my bookmarks for my Compaq 21" CRT many years ago, and now I wanted to check the specs again and boom it's gone, no trace of it not even in the archive's wayback machine, the manuals are gone too, it reaaaally sucks.
Looks great. But I think this is 1440i not 1440p. The i stands for interlaced.
for the green tint and artifacts, im not sure if it helps but setting the color space from Full to Limited fixed my issue. anyone who is having this issue might wanna try that
ill look into it thanks
The green issue is the analog bandwidth exceeding the capability of the HDMI to VGA adapter. A solution would be to use a Sunix DPU3000, which allows you to use 600MHz of analog bandwidth (above even native analog cards like the 980 Ti at 400MHz). It maxes out any CRT except some CRT projectors, even enabling 4:3 4K60 on the monitors that can scan it, like the 2070SB. By the way, this is not overclocking. It's called overclocking with fixed-pixel displays because they're designed to hit one specific refresh rate at native res but can sometimes go above that refresh rate, hence the "overclocking" moniker. CRTs just have a minimum and maximum vertical refresh rate (Hz) and a minimum and maximum horizontal scan rate (kHz, determined by vertical resolution and vertical refresh rate). Anything within those two ranges will work; it's designed to. Overclocking a CRT's refresh rate would mean exceeding those specifications by modding the monitor in some way. Cheers!
The people at Digital Foundry use a USB-C to VGA adapter to connect their CRT computer monitor to an RTX 2080 Ti. Do you know if USB-C to VGA adapters suffer from this green screen coloring due to a lack of bandwidth?
How does DVI-I/DVI-A to VGA compare to USB-C, lagwise? One is analog to analog, one is digital to analog.
@@badreality2 yes, at a certain point they can, but usually they'll just cut out, which an HDMI to VGA will do as well if you go too far above its bandwidth capability. DVI-I to VGA is lagless because it's analog to analog: the VGA pins are on the DVI-I header, so it's not even a conversion, simply a cable adapter. With digital-to-analog transcoding (no scaling, just signal conversion, which is what these devices do), added lag is under 1ms, so subjectively zero.
Isn't the sunix dpu3000 pretty expensive?
@@copper4eva Unfortunately because it went out of stock a few months back yes it is rare and expensive. A good price would be 60-80 usd, which is what it cost originally but now that it's hard to find I've seen ridiculous prices. There may be rebrands of it with the same chip floating around the net somewhere.
@@displaytalk
That original price is very fair. Too bad it's now being sold for more. I think people should just get a cheap analog graphics card and run it alongside their modern card (as I mentioned in other comments). Just a cheap and sure way to hook up a CRT.
I may be nooby but this is a fire idea, it's creative and I love experimental stuff.
Thanks a lot I've been googling this for weeks.
Damn, this monitor has a soul! I need one.
Interlaced works in something called "fields". It displays half of it in one field, and the other in a different field. So it causes the "flicker" associated.
If you want, you can buy an older GTX card (like a 900 series or whatever) that has analog out. And if you have two PCI Express slots on your motherboard, you can run it together with your 1080 Ti. Your 1080 Ti will still do all the work; the 900-series card will literally just output the analog signal for you, basically an overglorified digital-to-analog converter. You can get an older Nvidia card pretty cheap, of course.
That CRT is a beast. You can definitely get higher refresh rates with an analog connection rather than your HDMI converter. I was able to get 1280x1024@137Hz, and my CRT ain't half what yours is. I wish I could get one like yours. These things can be overclocked.
th-cam.com/video/Ya3c1Ni4B_U/w-d-xo.html
This dude got his running at 2496x1872@60Hz. Completely insane what some of these things can do.
Is that a Nvidia only thing or will it work with AMD too? I couldn't find any info online about it.
@@webb4158
Funny you should ask, on the crtgaming reddit somebody said in the comments that they had done it with AMD graphics cards. So, if he is to be believed, then yes, you can.
It really works rather well if you have a desktop that can support it, obviously. You might have trouble getting your OS to figure out that you have two GPUs, but once you do, you're good.
Bro, thanks for the advice, I should try it myself. Thanks again and good luck =)
@Big Smoke
I have a 5 monitor setup with a 960 GTX and 1060 GTX. So multiple monitors are fine.
Only one analog out though sadly. I guess you'll have to look at older cards, which might have multiple analog outputs, but I'm not sure.
@@copper4eva how do you do that
that radioactivity is going hard 💥
☢
I remember my crt in 2005 I felt like a god.
You should try 900x720, cuz that's half the resolution, at 144Hz
via chat gpt: can ur crt monitor explode if you over clock it too much?
ChatGPT said:
While it's unlikely that a CRT monitor would literally explode from overclocking, overdriving it can cause serious problems. Overclocking a CRT monitor means increasing its refresh rate beyond its intended specifications, which could lead to:
Overheating - Components inside the monitor can overheat, potentially causing a failure.
Degraded image quality - Flickering, distortion, or misalignment of the image may occur.
Shortened lifespan - The higher strain on components can reduce the monitor's lifespan.
Electrical failure - In rare cases, pushing a CRT too far could cause internal components, like capacitors or transistors, to fail, possibly with smoke or sparks, but not an explosion.
It's best to stick within the manufacturer's specifications to avoid damage.
You can lower or alleviate any harm done when changing the operating resolution and frequency by offsetting the values when you make changes.
What I mean by that: if you want more resolution, drop the refresh rate. That way, the bandwidth you are pushing stays comparable to what you had before.
there are actually a number of specs from the factory that do come like this. The listed Maximum Refresh Rates from the manual are:
1600 x 1280 NI @ 71 Hz
1600 x 1200 NI @ 76 Hz
1280 x 1024 NI @ 88 Hz
1024 x 768 NI @ 116 Hz
800 x 600 NI @ 146 Hz
640 x 480 NI @ 180 Hz
640 x 400 NI @ 180 Hz
And then you can just play in between these values and still have a pretty safe use.
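A quick sanity check on those factory modes: each one implies roughly the same horizontal line rate (vertical lines x refresh), which is the real limit the tube scans against. Blanking is ignored here, so the implied figures land a bit below the monitor's true kHz ceiling:

```python
# Each factory-max mode from the manual should imply roughly the
# same horizontal scan rate: active lines per second = vertical
# lines x refresh. Blanking is ignored, so these sit somewhat
# below the tube's real kHz limit.

modes = [(1280, 71), (1200, 76), (1024, 88), (768, 116), (600, 146), (480, 180)]

def active_khz(v_lines, refresh_hz):
    return v_lines * refresh_hz / 1000

for v_lines, hz in modes:
    print(f"{v_lines:>4} lines @ {hz:>3} Hz -> {active_khz(v_lines, hz):.1f} kHz")
```

All six modes come out in a narrow band (roughly 86-91 kHz of active lines), which is why trading resolution against refresh rate, as suggested above, keeps you within spec.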
for amd you need to install amd adrenaline and make a custom resolution or use third party software
144hz on crt is insane. I mean i play ps1 games on a 60hz crt tv and most of the games run at 30 fps and that is butter smooth
We threw two perfectly functional CRTs in the local dump; that's just what you did when everyone moved to LCD TVs and monitors, thinking CRTs were old news. I kept my CRT TV for years, but it ultimately stopped working, as it was an ordinary cheap TV as opposed to an actually decent monitor.
Im watching this on an oled screen rn. Looks pretty good ngl
lucky you enjoying the nostalgia again :(
It looks smooth and it has a certain fidelity to it that really makes you feel like you're playing a video game. Curious as to how I can explore the possibilities with this. Doom Eternal is certainly the perfect game for this, and it is a very fun game.
watching this in 2024 on my old Sony Trinitron at 1280x960 and 85Hz. Just swapped from my main 165Hz monitor, and I can tell that the 165Hz LCD has never been smoother than this 85Hz CRT. If I could get my hands on even a 21" CRT, I'd probably stick with that as long as it has time left
watching on my Trinitron MultiscanG400
That **is** a really good monitor for the price (fckn free)
These were the ones we had at my local cybercafe in '04-'06
The main downside is they use VGA, or in the case of the high-end models with crazy resolutions, DVI; both formats have been discontinued. Thankfully, as you said, because of how they handle interlacing, using a converter doesn't impact the video quality. We're talking interlaced VGA and DVI here, which translate far crisper than something like composite. With this sort of stuff it's all analog; we're talking wave signals, not digital LCD pixel signals.
If you use any adapter you will have input lag and it won't work properly at high Hz. You have to use native VGA ports, and the last video card with VGA on it is the GTX 750 Ti. I use that graphics card with a CRT and it feels so smooth, and it runs at 130Hz. So amazing!
when was NFS Most Wanted at its peak?! I still had a 1024x768 CRT at that time, and the integrated NVIDIA 6600 or something like that was so bad I had to play it at some x800 resolution, and it still looked freaking awesome.
Bring crt back!!!!!
oh yeah, you can use CRU(custom resolution utility) as well for setting up custom resolutions and refresh rate
1:24 Im watching on a 4k HDR plus 10 bit color 1444 nits peak brightness 120hz phone lol. Its a 4 year old almost 5 year old OnePlus 8 pro
I can push my old Vizio TV at native 1360x768 to 144Hz. Not the smoothest, and it definitely tears, but it's pretty cool to think about the refresh rates I could have been getting when it was my main monitor.
I know this is an old video, but I have the same monitor and ran into the green artifacts in blacks at high resolutions. It's because you're outputting full RGB. From what I've read, a lot of hdmi converters don't like the wider darkness range of full RGB. If you switch your output to limited RGB, the green will go away.
I'll look into that. I also have a better hdmi adapter now
Haha I enjoyed this! And learned a lot
Glad you enjoyed it!
There are IBM and Sony professional Trinitron monitors that could do 120Hz 1600x1200 progressive. I know, I had one. It was a 21-inch flat-screen Trinitron and it was massive; weighed darn near 100lbs. It was amazing for motion, just took up too much space to be practical. The only monitor that has kept up with it is my newest 144Hz LG UltraGear ultrawide, and it cost nearly 1000.
SONY FW900
It would be cool if CRT monitors made a comeback like flip phones did
Huge YouTube rec pull. I shot one of these with my BB gun fully pumped and it barely dented the TV; lucky I didn't lose an eye. I was a stupid kid
Perfect 2K gaming monitor
I don't think this is a 4:3 aspect ratio as 1280x1024 is a 5:4 resolution, just like 1800x1440
So maybe set the game to 5:4 instead of 4:3
I want just one company to realize there's a CRT market ready for the taking
I don't know why I'm here; I just watched to listen to the advice. I'd already figured it out.
Imagine even having a working CRT.
Last time it happened to me was with a PS790 in May 2022, and it only lasted 12 months.
I wonder if you can use a program on your desktop called "Custom Resolution Utility" with this CRT monitor. You can go pretty high with it, I guarantee you that, but I would first watch how to do it and understand it first
Why is the video title 1440p 120hz while the most you were able to get was 1200p 87hz?
old but gold
My monitor can do 800x600 @ 120Hz, 1024x768 @ 100Hz, 1366x768 @ 86Hz, 1920x1080 @ 60Hz, and many other customized resolutions, but those are the ones I mostly use... motion clarity is on another level compared to LCD, LED, and OLED... that said, I must confess I prefer using LED over CRT because of the flickering; it's so hard on the eyes that it makes you want to go running back to LED/LCD-IPS.
oh that’s unfortunate, I’ve been daily driving an OLED for awhile now. Are you prone to headaches or what’s happening? i haven’t even really considered the flickering but I also don’t really hear CRT whine and people comment about it a lot haha
@@laserfights Yes man, sometimes I get headaches when I use CRTs, mostly as a result of the flickering. I love CRTs, but I guess I'm not used to them anymore. See, I grew up with these monitors and used them on a daily basis until 2007. Now I can't look at the screen for long because it hurts my eyes badly and feels hard on them, even at a high refresh rate. I love CRT motion clarity, but it's not a big deal to me anymore, since modern LED displays do very well on motion clarity and response time. When I want to revisit old games that fit better on a CRT, I use scanlines and dithering to emulate one, and I'm starting to enjoy that more than playing on a native CRT for some reason... I still have 3 of those big boys here, but I never hook them up anymore...
It looks pretty good.
1080p has been pretty standard for a few years now, 1440p has become almost standard, but most TVs are moving to 4K now
0:23 ...hold on
I laughed, I subbed. I'm a simple bro
Download CRU (Custom Resolution Utility). You can set the resolution and timings up to the max of your horizontal frequency, 240 or something for your monitor (I'm on a mobile phone so I can't see it well). You can also enable interlaced resolutions, which doubles the line count so you can have the same resolution at a max of 160Hz or however much your monitor supports. The downside of interlaced is that it's bad for text in Windows, but it's good for games and movies.
There's a better tool for overclocking monitors, called "Custom Resolution Utility" or CRU. It's much more complicated but it gives you access to things like custom resolutions, refresh rates, and even detailed timing parameters that go beyond what the standard control panels from NVIDIA or AMD allow.
However, its flexibility comes with risks, such as potential damage if you push settings too far beyond the monitor's specs. For advanced users who understand CRT limitations, CRU is a powerful tool, but it requires careful handling.
Fun fact. Widescreen 1440 adds 547,200 pixels per side.
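That arithmetic checks out if the comparison is 1800x1440 against widescreen 2560x1440; the 1800x1440 base is my assumption, since the comment doesn't name it (it is the 5:4 resolution mentioned elsewhere in the thread):

```python
# Pixels added per side when widening 1800x1440 to 2560x1440.
# The 1800x1440 base resolution is an assumption; the comment
# doesn't say which 4:3/5:4 mode it's comparing against.

base_w, wide_w, height = 1800, 2560, 1440
per_side = (wide_w - base_w) // 2 * height
print(per_side)  # 547200
```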
The things I would do to have a CRT like this
pure insanity!
more crts getting thrown away = MORE FLYBACKS YEA BOIIIIIII
oh wow, I kinda want this monitor. But wouldn't the phosphors stay active too long and give you ghosting? This is a very interesting video. I want to know how long the pixels actually glow; I have a lot of questions about the mechanics of this monitor.
@jamestmartin wow that is so frikin cool man. thank you so much for that amazing knowledge. you frikin rock james
Why does it look way better than new monitors? It makes me want to play games more than I usually do.
this "green" stuff is a bad RGB signal.
change to YCbCr444 in your NVIDIA settings and you're safe 😉
it really works, I changed my monitor to 129Hz, it's amazing
Damn, no matter what game it is and how modern, it still gives me Counter Strike 1.6 vibes because it's coming out of that massive tube. Also would have been better to see it at night with no reflections. And would be cool to see it compared to an OLED.
I've been daily driving an OLED for awhile now, i might make an update video.
Anything you'd like to see in it?
@@laserfights mainly how they perform in ideal viewing conditions (dark room). CRT enjoys always raving on about it, would be cool to see.
It's beautiful. Must get.
All your videos are GOOD
I don't get it; you set up the custom resolution for 1024x768, so how is it being used for 1440p gaming?
WarHammer Darktide looks amazing on a crt
i’ll have to add it to my list. thanks!
I thought only Sony Trinitrons had HD CRTs. Wow.
Edit: I thought 1440p 120Hz was a 2020+ thing they had invented. Cool to know it was already available way back in time.
Hello me from the past
I could never go back to a CRT as I have no room for it and my eyes are getting quite bad so a 21" monitor is too tiny for my eyes. Even my 43" is a little too small. I am on the hunt for a 60", but they are too expensive at the moment. But, I am glad you are happy with yours.
Where's the part where you're overlocking it ? Am I missing something
I think the custom resolution part was the overclocking part. Though there wasn't some super balancing act of 10 sliders to get the extra Hz, it still did the job of getting interlaced 87fps (2x refresh rate).
you can do that on AMD too btw
Best monitor for my gaming.
You can use custom resolution utility also on any card
when crt monitors have better resolution and refresh rate than your lcd screen:
Xbox series x will only support up to 120hz and was wondering if a CRT is faster than a 240hz monitor when both are at 120hz. Would the 240hz still on 120hz outspeed a CRT at 120hz? I know for sure the 240hz at 240hz is faster but was wondering this since I’m buying a monitor for Xbox series x.
i would go with the 240hz monitor
@@laserfights is the 240hz faster than CRT at 120hz?
@@User_3303_ there are lots of factors, like ghosting, response time, etc., that aren't captured by refresh-rate numbers like 240Hz and 120Hz. If you are buying a decent 240Hz monitor, it will be much easier to have support for newer games, and I would recommend going with something newer based on that alone.
beautiful
Can you still download and install the monitor "driver", either as an installer or just the .icc, .icm (or .inf) files?
not on windows 10 but it won't make a difference anyways
Nice video bro. Is there a way of getting 100-120Hz working on PS5/Xbox Series X at a resolution of 1800x1440 to 1600x1200?
Sadly no