11:11 I own both a CRT display @ 60Hz (it's a Sanyo VM4209) and a gaming monitor (a BenQ EX2780Q @ 144Hz), and I can affirm your claim. They both feel about as smooth to use despite being made about 45 years apart.
Great video! I just honestly wouldn't downplay how great the BFI modes are in a select few recent OLED TVs, especially the LG CX/C1, and in a few PC monitors tuned by Blur Busters. They can absolutely reach or surpass a CRT's motion clarity. An LG C1 at 120Hz with BFI maxed is incredible for videogames! It took basically two decades, but we're getting there, finally.
Actually, power consumption differences are negligible in favor of flat-panel displays at best. That's only if you're comparing square inches of display per watt, using only their peak wattage draw, for like the extent of 2-4 hours. The problem with the comparison is that CRTs are only at their peak power draw for the first few seconds of startup, and their continuous wattage draw is a significantly smaller fraction of that. Whereas flat panels generally just reach their max power draw and stay there roughly continuously. Monochrome CRTs generally use roughly a third of what color ones do, even. Basically, leave them plugged in side by side on a power consumption monitor, and eventually the CRT will probably end up consuming less power in continuous, long-haul usage.
There's truth to what you say. I have a backroom LG 1440p 144Hz monitor and I'll put the wattmeter on it tomorrow to see its average. But I know for sure OLED will eat more power than a high-refresh CRT. My buddy's 27-inch 1440p 240Hz OLED was using 55 watts idle at the desktop (dark background), around 86 watts with frequent spikes to 100 watts while gaming at his settings (110 watts during bright scenes), and 120 watts on white-background webpages. I don't know how the manufacturers are cooking the numbers in the manual, but that's fairly high.

Whereas my Sony G520 in the living room (21-inch CRT) at 1440p 85Hz (128.5kHz tube speed) will spike to 133 watts during the degauss cold startup, then average 87 watts gaming and 96 watts in bright scenes, with the occasional 101-watt spike on an almost full-white website page. Next to it (dual CRT setup for convenience lol), the Fujitsu/Siemens 21-inch (shadow mask) at 1200p 90Hz (112kHz tube speed) spikes to 126 watts on a degauss cold start, then averages between 80-91 watts while gaming depending on the scene, and 104 watts on a mostly white internet page.

Before I got the Siemens, I was given a Sony HMD-A100 (15-inch) by the same OLED buddy. It looks spectacular after a 10-minute warmup at 768p 85Hz. It used a very low amount of power and surprised me. I put it in the shop as the music/info terminal. Lastly, my Samsung 997DF (19-inch) at 1200p 72Hz is oddly close to the Sony G520 in power consumption but looks great. I keep it in a vented box with a big moisture-eater bag from work; I check the bag and bake it every 7 months to keep the Samsung in good shape as an emergency backup.

Sadly I don't have data on the last two CRT monitors saved on my computer, but I did test them when the wattmeter arrived. Sorry for the long text, but it's rare to see someone with modern experience with older display tech. I daily the CRT monitors (typing this comment out on the big Sony) because they're vastly superior to my not-so-old 32-inch 1440p LCD in every aspect besides screen size. That's why I pick up 17-inch+ roadside/dump tube monitors every chance I get and test them; LCD seems terrible in comparison and I like having backups. Would buy an OLED but it's just too expensive.
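To put those wattages in yearly terms, here is a quick Python sketch; the hours per day and the electricity price are my assumptions, not numbers from the comment above:

```python
# Back-of-the-envelope yearly energy from the wattages reported above.
# HOURS_PER_DAY and PRICE_PER_KWH are assumed values, not from the comment.
HOURS_PER_DAY = 4.0
PRICE_PER_KWH = 0.15  # assumed tariff, USD

measurements = [
    ("Sony G520 21in CRT, gaming average", 87),
    ("27in 240Hz OLED, gaming average",    86),
    ("OLED on a white webpage",            120),
]

for name, watts in measurements:
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    cost = kwh_per_year * PRICE_PER_KWH
    print(f"{name:36s} ~{kwh_per_year:6.1f} kWh/yr (~${cost:.0f})")
```

At these duty cycles the gaming averages land within a few kWh of each other per year, which matches the point that the real difference shows up in continuous long-haul usage, not in the spec-sheet peaks.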
@@insurgentlowcash7564 Naw, no worries. That was insightful and kind of illustrates basically what I was getting at, with data, however anecdotal it is. Reading is no difficulty to me. I read and write effortlessly, by fortune of being effectively literate. 😄
If you ever want to use these, I suggest getting sunglasses and eye drops to use every 5 minutes, because I remember these were painful to look at for more than an hour.
DLSS and FSR do wonders for the frame rate in GPU-limited scenarios. But yeah, I wish I could get a good 1080i CRT, but those are so expensive… I wish someone started producing CRTs again. And I mean, they are freaking particle accelerators, which is awesome.
16:18 My SyncMaster 955DF is an 1856x1392 VGA CRT monitor. I counted twice the number of phosphor dots for text versus pixels on my 1080p LCD; it's basically a true 4:3 1440p monitor. Not only is there no aliasing on my monitor, but text is way, WAY sharper than on my 1080p LCD.
I love CRT TVs. I still have my 40-inch Toshiba CRT and play my PS4 on it. The image clarity, sound (bass/treble/theater-quality surround sound), color depth, contrast, etc. are amazing to me. One of the many things I love about my CRT (and it's a major plus) is that the image and sound quality have not decreased at all over the years, nor has the image started to look washed out or overlit. Every HD TV I've had (top brands too) has lost image and sound quality as it aged and always ends up looking washed out. My CRT Toshiba still looks and sounds just as great as the day I bought it. Great video!
Bro... not saying I don't like CRTs, but I'm not sure I can take someone seriously who talks up the sound from TV speakers... why not just get an actual amplifier and speakers and use those instead?
Yeah, when you go back to a CRT after playing on a flat panel, plugging in, say, a Mega Drive and pressing the jump button feels like the character jumps slightly before you press it. Not a joke, it genuinely feels like that. Obviously impossible, but it's more of a feeling thing; it must be something to do with your brain being adjusted to lag.
The main 'latency' of a CRT is the time the phosphor needs to illuminate, so nothing truly has zero latency, but it's pretty much instant compared to any kind of digital image transfer.
These are arguably the best tubes Mitsubishi ever produced, with better fidelity and image quality than the slightly newer 2070SB tube, much like Sony's GDM-5002PT9 being better than the GDM-F520 despite being older. The phosphor quality and electronics are slightly better in the slightly older models for some reason.
If you are OK with CRTs or backlight strobing on modern displays, I honestly envy you. Here is my story: I used CRTs in school and university and got headaches after some time. Bigger and flat ones were better for me, but still painful after an hour. Some of them ran at 100Hz at 768p; that didn't help much either. It didn't happen with TVs, but I viewed them from quite far away. I never had that on TN, IPS, or OLED. A friend's VA was also okay, but I never owned one. On the other hand, my cheap 144Hz TN has noticeable motion blur. If I use backlight strobing, it does reduce the blur significantly, almost to none, but my eyes hurt after a few minutes. I guess I'm sensitive to that sort of thing.
I used a 17" CRT monitor next to a 22" LCD a few months back. The CRT would often develop a painful high-pitched squeal. The CRT was dimmer and took a few seconds to reach peak brightness. The CRT's text was blurry. The CRT's image needed a bunch of settings tweaked to be mostly square and centered, and changing the resolution meant all those settings went wonky. I didn't really care about the weight and bulk when I later replaced the CRT with another LED monitor.
Believe me when I say this: I love CRTs. I have a 29-inch Trinitron, RGB-modded, hooked to an RGB-Pi arcade setup (OS4). CRTs are great for retro gaming. That said, CRTs, generally speaking, are NOT better; there are just more cons than pros at this point. I just pray my Trinitron lasts many more years. Even after doing a recap, I know it's not forever (it's already 15 years old).
60hz flicker looks awful on a CRT, it was the bane of many a worker's existence if they were staring at it all day for work. 60hz on an LCD doesn't flicker the same way at all and modern zero flicker backlights are much nicer as well.
Excellent points all around, MXDash2! Thoroughly enjoyed how you put all this together, and you're spot on with every point. I spent a pretty high fortune on my whole PC setup, and especially when running my older games I miss my 1600x1200 CRT. OLED at the moment gives me most everything I want and it's impossible for me to want to go back, but I haven't forgotten the perks of the old CRT. Back in the 90s I actually ran my TV and VCR through my GPU because everything looked so much better on the monitor. Somewhere around 2005, I think, was when I stopped using my Radeon AIW 9800XT (if I remember correctly), where I had all that set up, and everything just felt great. These days it's amazing when it all works, but I'm plagued by games that don't work with Windows, or multi-core CPUs are the issue, or graphics drivers are obsolete... a decent number just flat out don't work, and even some more recent titles don't.
The solution should have been SED/FED tech at 60-75Hz, with its individual electron emitters doing the flicker naturally......... I'm really bummed they never made it to production 😢☹️
LCD/LED has always been inferior, but could you imagine how deep a 100” CRT would need to be? I still have a CRT monitor for retro gaming, but you need a deep desk; modern monitors take up so little room.
If manufacturers had continued building CRTs, they would've improved the technology the same way that LCDs continued to improve. Samsung (before canceling it) got a 30" CRT to be only a few inches thick. That being said, I think the industry needs to keep working on OLED technology to reach the reduction in motion blur that CRTs were able to achieve.
@@GTXDash Can't wait for multi-stack RGB-OLED with native HDR-RBFI, which will tide us over until eQD which will hopefully have a native rolling-scan modulation method.
Nothing beats a CRT! And I have an OLED C1, a CRT, and a plasma, so I can say it because I am always comparing... especially camera movements, which are very noticeable on OLED/LCD but not at all on a CRT...
I have a CRT monitor I got free from someone and I love it. It has downsides, like, yeah, size and danger if dropped or if I ever need to repair it, but it works better for me. I game at 1024x768 at 118Hz (the best I can get out of it), and it actually works *very* well; I oftentimes forget that it's not 1080p. As long as I have antialiasing enabled, at least. I did have to scrub the damaged anti-glare coating off it because otherwise it was awful, so I can't really use it without the curtains closed in my room. I will not be getting any new LCD or other flat-panel monitor until I can get something that is as good as a CRT in picture quality and feel. And I do wish I could exchange resolution for refresh rate on modern monitors.
Pioneer and Panasonic plasmas are very close to CRT when it comes to motion clarity. I owned a Panny plasma for 12 years and gaming on my consoles was always a smooth experience. Keep in mind that the vast majority of these games were running at 30 FPS. Eventually my plasma burned out and I replaced it with an LG C2 OLED and I couldn't believe how bad the motion was. 30 FPS was simply unplayable because of the stutter, and even 60 FPS looked very blurry. And it wasn't just the motion clarity. Somehow the colors looked more natural on the plasma, and that was "only" in SDR. And then there's banding and uniformity. Both are still superior on a plasma. Anyway, I sold my C2 and went back to plasma. I bought an excellent refurbished Kuro and it's literally the best picture I've ever seen. I don't care if it's "only" SDR and 1080p. The motion clarity and natural colors are simply stunning.
Someone where I live is trying to sell a 720p Pioneer Kuro... it's tempting, but I'm not sure it can fit in my room. I'd mainly use it just to game on 360 and PS3 and watch movies via Blu-ray. My Panasonic CRT is decent but 480i, not 480p like the later HD models. It has component and, overall, I love it. It was free from a church, and given that I run low contrast and only a tiny bit higher brightness than factory, I'd say it has very low hours.
@@Stoddardian pdp-1130HD. Should be a 50". Has the box, and display. Speakers I can get on ebay for 20 bucks. I'm mostly concerned about hours on it. Ironically I have a cheaper pioneer set but the power board is nla and it'd cost more to get the board than a whole tv...lol.
@@roveradventures If it's cheap I would go for it. I got really lucky with mine. It was refurbished by a guy who worked as a Pioneer technician for 15 years. It still had over 90% of its original brightness left. It's an LX6090.
@@Stoddardian Indeed, it was 120. But if I go for it I'll try to drop the price, since the speakers are about $20 on eBay. Ideally they didn't run it at max contrast or brightness; on factory settings, 'Picture' is usually maxed out, which can really decrease CRT life.
Your point around the 7-minute mark is exactly why I push for black frame insertion in modern displays, because these MFs don't realize increasing the refresh rate doesn't matter if you can't drive it. On the oled_gaming subreddit right now you can find people literally buying $800-1400 OLED displays whose computers can't even drive them properly. It's just insane. I could drive a 360Hz OLED playing Counter-Strike 2 or Hunt: Showdown or World of Warcraft, but most gamers cannot.

What would be superior is an LCD/OLED that mimics a CRT in terms of pixels on vs. pixels off, giving our eyes that same illusion of perfect motion clarity. If they developed an OLED that was, say, 240Hz and included a true black frame insertion mode that kept the full refresh rate, it would be pretty sweet. As it stands, modern BFI will generally double your effective motion performance: 120Hz + BFI looks exactly the same as native 240Hz, and 240Hz + BFI would look like native 480Hz. And obviously, at lower fps, BFI helps (see the sketch below).

Sure, new 4K monitors are coming that are 240Hz with a 120Hz BFI mode (it runs in 240Hz mode, but only every other frame is displayed: frame, black, frame, black, and so on). For console, that 120Hz + BFI mode might be awesome, but most people already have 120Hz OLED gaming TVs with BFI... not to mention that television BFI is superior, because both LG and Sony (the top brands for OLED TVs) use a "rolling scan" black frame insertion, and it works. Hell, my Sony Xperia 1 Mark II uses rolling-scan BFI, and it's only 60Hz but feels like 120Hz. It's amazing. Watching movies/anime on my phone in bed is way nicer than on my PC gaming monitor, simply because my AW3423DW 3440x1440 175Hz OLED doesn't have BFI, so you get extreme judder in panning shots, and it's clear as day in anime... yet on my phone, which is OLED + BFI, you can't even tell it exists.

I would kill for a proper 240Hz + BFI OLED gaming monitor. Sadly, we aren't getting one yet. I had that shoddy Blur Busters-certified 1080p IPS ViewSonic, which had BFI at 60, 120, and 240Hz. 60Hz BFI made my eyes hurt; it was atrocious. 120Hz + BFI wasn't bad and looked like native 240Hz. And 240Hz vs. 240Hz + BFI showed no difference, because the pixel response wasn't good enough to show one, so it's effectively capped at 240Hz with or without BFI. OLED, however, has true sub-1ms pixel response, so BFI at any level would be amazing. I really want true 240Hz + BFI.

A solution to the aliasing that CRTs handle natively, for LCD/OLED? A honeycomb subpixel structure. Instead of red/green/blue subpixels, we need a technology where one subpixel can produce the entire range of color, laid out in a honeycomb/hexagonal grid. And since the subpixels would be so small, you'd end up with a sharper display, as the PPI would be much, much higher.
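To put rough numbers on the "120Hz + BFI looks like native 240Hz" claim above, here is a toy persistence calculation; the panning speed is an assumed value, not anything from the video:

```python
# Toy persistence model: perceived smear while eye-tracking a moving object is
# roughly (on-screen speed) x (time each frame stays lit). Speed is assumed.
SPEED_PX_PER_S = 960.0  # e.g. a pan crossing half a 1920px-wide screen per second

def smear_px(refresh_hz: float, duty: float) -> float:
    """Motion smear in pixels; duty = fraction of each refresh the frame is lit."""
    visible_time_s = duty / refresh_hz
    return SPEED_PX_PER_S * visible_time_s

for label, hz, duty in [
    ("120 Hz sample-and-hold", 120, 1.0),
    ("120 Hz + 50% BFI",       120, 0.5),
    ("240 Hz sample-and-hold", 240, 1.0),
]:
    print(f"{label:24s} -> {smear_px(hz, duty):.1f} px of smear")

# 120 Hz + 50% BFI and native 240 Hz both keep each frame lit for ~4.17 ms,
# which is why they look similarly clear in this simple model.
```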
Yeah, I picked up 3 CRTs on the side of the road, and the 32-inch Sony Trinitron I got is just amazing. Plugging in a Super Nintendo with a composite cable looks almost crystal clear; there's no latency and no blurring or ghosting. It just looks incredible, and it has a warmer glow that feels more alive than the artificial light my LCD gives off.
I'm still using my CRT for playing games. Sometimes I use the LED ones for certain games and applications such as Adobe Animate, Blender, and Premiere, but my main gaming monitor is the CRT. Motion clarity even at 60Hz is on another level; there is no modern screen that comes even close to what CRTs can do. There's no input lag, the colors are great, natural bloom, black levels... CRT is the king.
My dream is for some company to one day create a modern 4:3 monitor that perfectly replicates (in hardware) a CRT. I wonder what modern technology could do to replicate the cathode-ray tube but without the weight, the size, and maybe without the glass.
I like CRTs for that same exact reason, but I also like CRTs 'cause of the shape, and 'cause of how big they are, 'cause I like how they look: the design has a lot of detail, like my favorite type of detail. But it would still be cool if they made a flat screen that replicates CRT computer monitors, like a flat-screen version of a CRT monitor!!
Pumping up the resolution basically makes it easier to display non-native ones. Compare footage of an old game on a new phone screen to a large monitor: the phone will look better because the density of pixels hides the distortions.
There was supposed to be a flat-screen version of the CRT... it was called SED, or FED. Instead of an electron gun, there were individual electron emitters for each subpixel. Its downfall was caused by company greed.
I didn't really get why OLED can't come close to matching CRT in motion clarity; the near-instantaneous pixel response time plus black frame insertion would create a very similar effect, at the cost of brightness, while keeping all those other benefits. And a 4K display can drop to 1080p losslessly, just displaying 4 of the same pixel. Yes, OLED will never look exactly like a CRT, but is CRT the end-all be-all best technology?
The problem is that they aren't doing that. That's exactly the issue. I've been chomping at the bit for OLED with heavier BFI for ages because I'm in a light-controlled environment where I won't mind the brightness lost so much and it just isn't available still.
I forget the name of the tech... it was supposed to replace CRT with a pixel grid structure like LCD, but each pixel was a very small cathode-ray-tube-like emitter... that would've been a good replacement, I think. Maybe, though, motion blur would still be a problem with that as well.
@@ShankMods Makes me sad every time I think about SED and FED. Even Pioneer's Kuro tech got shut down. It seems like quality displays are only destined for the professional-grade markets.
I still have a PC CRT monitor in my gear, and for 2D platformers, old FPS games, 3D platformers, and other games where low input lag and perfect motion clarity are a massive bonus, I always use the CRT.
It is funny to talk about the motion quality of displays when game developers still fok up their camera movement into an unnatural, jerky, headache-inducing shock experience. I've yet to see the first FPS with camera movement that does justice to anything above 24fps.
I hope so. If it's a high-refresh-rate display where 5 out of 6 frames are black, the one lit frame has to be driven roughly six times brighter to compensate for the darkening caused by those 5 black frames. Sounds outrageous, but that's really what it's going to take to reach the level of zero motion blur that CRTs are known for.
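Rough math on that trade-off, assuming a 360Hz panel showing a 60fps source (my illustrative numbers, just to show the duty-cycle idea):

```python
# Duty-cycle arithmetic for the "5 of 6 frames black" scheme described above.
# Assumes a 360 Hz panel displaying a 60 fps source: 6 refresh slots per frame.
panel_hz = 360
slots_per_frame = 6
lit_slots = 1

duty = lit_slots / slots_per_frame       # fraction of time the image is lit
persistence_ms = lit_slots * 1000 / panel_hz
brightness_boost = 1 / duty              # peak brightness needed to keep the same
                                         # average luminance as a full-persistence image

print(f"duty cycle:        {duty:.3f}")               # 0.167
print(f"persistence:       {persistence_ms:.2f} ms")  # ~2.78 ms, near CRT territory
print(f"needed peak boost: {brightness_boost:.0f}x")  # ~6x brighter lit frame
```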
Standard BFI, that is, straight impulse modulation, cannot produce the same type of motion IQ and performance that raster-scan CRTs can. So for BFI to make CRTs obsolete, they will need to come up with an effective "race the beam" algorithm to simulate raster scanning. This will also need to simulate the incredibly fast phosphor decay times, the phosphor glow, and all that good stuff that CRTs have. The hardest thing to match will be the CRT's incredible native image depth, which with a good-quality input source can look almost three-dimensional; looking-glass (light-field) tech is a good candidate.
11:05 PAL 50Hz televisions are notorious for their noticeable flicker, even on consumer TVs with relatively slow-decaying phosphors (which were, however, good enough to display 60Hz content without flicker). This was also the reason 100Hz European televisions with digital preprocessing were made (which is an absolute bummer, and also the reason HDTV CRT televisions were primarily an NTSC-country thing). There's an inherent trade-off here: the slower the phosphor decay, the less noticeable the flicker at lower refresh rates, and the more noticeable the blur trails on high-contrast moving content (a mouse pointer on a black background, for example). Still much more bearable than any sample-and-hold LCD, though.
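A crude way to see that trade-off in numbers, treating the phosphor as a single exponential decay (real phosphors have multiple decay components, and the tau values below are illustrative guesses, so take the exact figures lightly):

```python
# Single-exponential phosphor model for the flicker-vs-trails trade-off above.
import math

def residual_at_next_refresh(tau_ms: float, refresh_hz: float) -> float:
    """Fraction of light remaining when the next refresh arrives (more = less flicker)."""
    period_ms = 1000 / refresh_hz
    return math.exp(-period_ms / tau_ms)

for label, tau_ms in [("fast monitor phosphor", 1.0), ("slow TV phosphor", 5.0)]:
    for hz in (50, 60):
        r = residual_at_next_refresh(tau_ms, hz)
        print(f"{label:22s} @ {hz} Hz: {r:6.2%} residual light")

# The slower phosphor flickers less at 50 Hz, but its motion trail is longer:
# trail length scales roughly with (object speed) x tau.
```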
When I switched from CRT to LCD back in the day, the lack of flicker on the LCD was the main benefit. Even at 120Hz a CRT flickers more than an LCD at 60Hz. You claim 48Hz is enough not to see flicker, but the 50Hz TVs we have here flicker a lot to me. Playing at 60 in games that support PAL60 helps, but it doesn't get rid of the flicker. So, flicker or motion blur? Best would be to have neither. When I game on a CRT monitor, I prefer 85Hz minimum for my eyes not to hurt.
I might have missed it, but another advantage CRTs have over new monitors is that, because of the lower resolutions you can choose, games benefit in performance at those resolutions while also looking better than they would at the same resolution on a newer monitor, because of how a CRT works. The only downside is maybe HUD elements programmed with higher resolutions in mind ending up out of proportion.
Have you dabbled with black frame insertion on OLED monitors? I have an LG C1 TV that does it. It darkens the picture a lot and the flicker feels noticeable, but it seems to help a lot with blur. I've been thinking about this quest we've been on to get back to where we were 30 years ago, and it makes me mad lol. These $1000 alternatives are only just now approaching CRT levels of clarity, and it's very unsustainable when you account for new games with better visuals and higher pixel counts. The industry is full of snake oil too, with things like frame generation, which increases input lag and decreases responsiveness while appearing better.😢 I'd buy a new CRT if a company started making them again.
@testtube173 Even I play newer games on a modern display. It really depends on where the tradeoff of motion clarity for image sharpness/resolution stops being worth it. BFI is better than nothing, but even without it, I still generally only play games on my CRT if they predate the 2010s.
I am crying right now. My dad used to have a huge-ass Sony CRT TV in the basement, which he threw out a week ago. Now that I've watched this video, I found out that it was the Sony KV-40XBR700, which is one of the Holy Grails of CRTs.
Damn, man. Sorry to hear that. One of the best for playing retro consoles.
I know what it feels like because my dad threw away a ViewSonic desktop CRT like 5 years ago. I feel sick every time I think about it.
RIP for the Trinitron🫡😔
40inch 300 pound Godzilla Trinitron RIP🫡🥲🫡
Don't be too upset. The KV-40XBR isn't that great. Anything that's not 1080i is scaled, which adds quite a bit of lag and doesn't look that great. Basically all 1080i content is 16:9, and the 40XBR is 4:3, so you're essentially guaranteed to have either lag or letterboxing. It is by no means a Holy Grail, and honestly it's pretty awful for gaming in general unless you're going deep down the scaler rabbit hole.
@@ShankMods yeah you gotta find out how to put it in 540p mode in the service menu.
The difficult part isn't believing in the superiority of CRTs… the difficult part is going to a store and buying one. Not to mention finding a shop to service an old one. The last shop I knew of that did it… has been closed for a couple of years.😢
Getting your hands on a working CRT is difficult, but the very next video I'm working on covers some of the most effective ways of getting a CRT. I didn't mention it in this video because it was beyond the scope of what I wanted to focus on. But expect a video soon that covers this.
@@GTXDash I don't think it's hard to get a CRT; it's hard to find a high-quality CRT. Ideally you'd want something capable of pushing 1080p+, and as of right now you'll pay hundreds to get ahold of one.
@PJxBuchwild Yep. I don't have a CRT that can do 1080p. However, I wouldn't call them "low quality".
They are inferior, dude
Tell me about it! I've had a Street Fighter 2 cabinet with a vertically collapsed screen for years, and i just cannot find anyone to service it.
Great video. To my understanding, the reason CRT computer monitors have more visible flicker at 60Hz than their television counterparts is the choice of phosphors. Shorter-persistence phosphors don't leave as long trails, but result in more visible flicker.
I would love to see a new 1440p CRT at 200hz. I'd buy it
Won’t happen, unfortunately. It’s like going to the moon, we don’t know how to make CRTs anymore. The assembly lines don’t exist anymore.
@@WalnutOW Nah, we know how to make them. It's not as if over a century's worth of R&D just up and disappeared. The community behind CRTs would somehow have to convince a company (like Sony or Samsung) that it would be worth their time to re-invest in production.
One of my goals is to make a 4K 200Hz CRT. It would require a ton of circuit design work, but I think it's possible. As far as I know, most CRT monitors could display higher resolutions, but the circuitry inside driving the electron gun wasn't designed for those inputs.
Same
The dream monitor.
I've been using CRT monitors since 2011, when I discovered refresh rates. I have played modern titles on ultra presets on my GTX 980 Ti. I play competitive games too.
The research this guy has done is 100% accurate. I am now well experienced with modern displays and CRT monitors. I'd rather use my 20-inch Sony CRT monitor than my 2K 160Hz LG UltraGear.
My dad repaired CRT TVs and monitors, so I saw many different models and pictures. When LCDs started taking over, I never understood why people were switching from CRT to LCD. The picture quality was so much worse, but people just didn't care. Thanks to my dad, I have 4 CRT TVs saved.
Subpixel light scatter is something no other technology has been able to recreate... there is something so realistic and true about the light that emits from the phosphor grid... skin looks like skin in a way that has depth, unlike OLED or even LED. It's like comparing the light of an incandescent light bulb with an LED bulb that runs at the same color... one fills you with warmth (yes, physically also, but... not my point), and the other just feels cold, even though the color of the light is warm. Incandescent light bulbs emit more of the spectrum... making them also healthier for us, and... I wonder if light from a cathode ray tube also has that...
Check out plasma TVs, especially the 9.5g KURO Pioneers. They also use phosphors.
You're referring to the natural incandescent quality of CRTs, which is incredible, but you are mistaken that it's exclusive to CRTs. The last few generations of plasmas exhibited this quality quite strongly, not as much as CRTs, but the last two generations of Pioneer Kuro plasma displays especially had beautiful incandescence.
I didn't stop using CRTs because of the size; I stopped because they were headache-inducing, super washed out, and hard to even see unless the lights were off. I had to spend 8 hours a day working on a CRT for 10 years, and the introduction of LCD monitors was the best thing ever for me. For gaming, the only thing I ever cared about was input lag, but that was only a problem like 10 years ago; it isn't a thing anymore.
Yeah. That's why we only use CRTs for specific things. For other stuff, especially productivity, you need a flat panel.
My favourite feature of CRTs was turning them off to make the screen go static, then holding it and touching someone to give them a static shock.
That was truly the best feeling. Also I learned that rubbing your hand across a SCART connector gives you a small shock.
A lot of my CRT experiences involve electric shocks.
I sit down at the PC, grab the mouse, get shocked in the finger by the mouse. The USB hub reboots, the monitor blanks. Then it all restores.
The amount of hate in the comments is a bit sad. CRTs totally rock!
I hate that they didn't keep even one factory running... we get freaking TN panels that are basically e-waste, but not CRTs, which are basically required hardware for retro gaming and TV.
@hunn20004
They still make tubes overseas.
There may not be CRT repair techs per se, but specialized video game and pinball repair shops will often re-tube them and do the alignment/calibration work. It won't be cheap, but the local pinball shop I work at has started doing them due to increased demand, as we are already used to working around high-voltage electronics.
CRTs are not ugly. They are very charming.
Awesome video! I only have a couple things to add:
Regarding your comment that CRTs have next to no latency: the latency to the center of the screen on a CRT is affected by the scanout behavior. Since a CRT scans the image line by line from top to bottom, the latency to the center of the screen is approximately half the frame time. At 60Hz, the frame time is 16.67 milliseconds (1000ms / 60Hz), so the latency to the center of the screen is around half of that, approximately 8.33 milliseconds. This accounts for the time it takes the electron beam to reach the center of the screen during the scanning process. This is not much faster than most gaming monitors at 60Hz if also measuring to the center of the screen, so this is not really an advantage of CRTs. And 360Hz+ gaming monitors will still have lower latency than basically all CRTs.
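Spelled out as a quick calculation (pure arithmetic restating the half-frame-time point above, nothing extra assumed):

```python
# Mid-screen scanout latency: the beam (or panel scanout) reaches the vertical
# centre of the frame after roughly half the frame time.
def mid_screen_latency_ms(refresh_hz: float) -> float:
    frame_time_ms = 1000 / refresh_hz
    return frame_time_ms / 2

for hz in (60, 120, 240, 360):
    print(f"{hz:3d} Hz -> {mid_screen_latency_ms(hz):5.2f} ms to screen centre")
# 60 Hz -> 8.33 ms, matching the figure above; 360 Hz scans to centre in ~1.39 ms.
```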
I still think that impulsed display technology (CRT or LCD/OLED BFI) is a bandage for truly 'retina' motion. I agree with your point that it is impossible to run most modern games at extremely fast refresh rates natively. But for motion to look realistic, the gold standard should be native 1000Hz+ without any strobing. I don't think this is unrealistic to achieve in the near future either. Lossless Scaling already supports 3x AI frame generation in any game to make this easier. Just because 60Hz has been around since Atari does not suggest that the gaming industry is not undergoing a paradigm shift towards higher refresh rates in recent years. We already have 480Hz OLEDs, which is already 50% of the way to retina motion clarity. I think that with the help of AI frame generation and other technologies, the future will be 1000Hz+ OLED/micro-LED displays without any BFI or flickering, and this is probably not as far away as many people think. BFI and CRTs are just bandage solutions to make lower framerates look good, and they inherently cannot fix temporal aliasing (wagon wheel effect). Only true high-Hz can fix that.
This guy knows tech. holy smokes lol
@@ethanwright752 Unlike the guy making the video, who thinks people buy 360+ Hz monitors to play triple-A games at high res 😂😂😂
Lossless Scaling is expensive and degrades image quality, when it's not stuttering all the time. I'd rather use 120Hz backlight strobing.
The bad part is that people like us who like zero lag and blur are labeled as picky and fussy. The very first time I used an LED TV, I felt the horrible lag.
Unfortunately, in the past my family forced me to get rid of my CRT monitor because of space. It was an LG Flatron EZ 17" (T730SH). I feel sad about it, but thinking it over and over will only make me feel worse. I will eventually buy another one.
I hope the industry will still manufacture some CRT monitors and TVs as retro slowly grows popular. The problem is that what dictates the trends is the masses, and most don't care about purity of gaming experience. Hope never dies; I keep positive.
Finally I understood the benefit of black frame insertion, subscribed.
I remember when LCD monitors started to become widespread in offices in the early 2000s. Boy, were they shitty: grainy, blurry, and unresponsive. They only made the beancounters happy, due to less energy consumption (also for the air con) and more real estate on the desk.
I think the answer is in gray-to-gray ghosting. LCDs ghost because of the time it takes to change a pixel from one color to another. Sometimes the time it takes a pixel to change into whatever the next frame requires is longer than the time it takes for yet another frame instruction to arrive.
This is true. But even fast flat panels that only take a few milliseconds for a frame to change still suffer immensely from motion blur.
@@GTXDash What I was thinking is that there might actually be a much larger delay per frame than the few milliseconds we measure. For one, a detector might measure the time between one moment and when there has been enough change for it to register as a new frame, even though the pixels haven't reached the desired color yet. This might not in itself be as big a problem if the content being recreated is only 23.97 fps, since the time between each frame is so long that the pixel will reach the correct color. But try to pump 300 fps of fast-moving pictures through an LCD panel, and one could imagine the picture getting a little blurry. And ultimately, seeing this transition take place before your eyes, instead of a crisp image strobing, could also add to the blurry effect?
CRT pictures don't have the problem of liquid crystal filters that physically have to change alignment to block light; they only have the rise and decay of the phosphor light when it's activated, which I think only translates to overall perceived image brightness?
There is also the whole thing of CRTs being additive light and LCDs being subtractive light, which adds to the magic experience of our beloved CRTs.
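A toy first-order model of that settling problem (the time constants below are made-up examples, not measured panel figures):

```python
# Model the LCD pixel as a first-order system with time constant tau and ask
# how much of a grey-to-grey transition completes before the next frame lands.
import math

def settled_fraction(tau_ms: float, refresh_hz: float) -> float:
    """Fraction of a grey-to-grey transition completed within one frame time."""
    frame_ms = 1000 / refresh_hz
    return 1 - math.exp(-frame_ms / tau_ms)

for tau_ms in (2.0, 8.0):          # snappy vs. sluggish transition
    for hz in (60, 300):
        f = settled_fraction(tau_ms, hz)
        print(f"tau={tau_ms} ms @ {hz:3d} Hz: {f:4.0%} settled per frame")

# A sluggish 8 ms transition only gets ~34% of the way there per frame at 300 Hz,
# so new frame data arrives before the old transition finishes: visible ghosting.
```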
It's that gray area my dad always talked about....
I wish I lived in an alternate reality where SED/FED tech took over LCD in 2005 and then the tiny electron emitters mimicked CRT strobing to lower motion blur.
I remember when we all started to transition to LCDs from CRTs back in the early 2000s. Sure, LCDs were new, sleek, and opened up your whole desk, but it was generally accepted that they looked way worse. The biggest issues were the worse black levels and the motion blur. There really wasn't much of a competition, though; the heft of a CRT was just untenable. We all just stopped acknowledging that CRTs even existed.
What if you had a CRT with the electron gun where the stand for a LCD monitor would be, and reflected the image with a mirror?
One thing this video doesn't mention among the disadvantages of CRTs is their power consumption and the influence of the environment on image quality. I remember from my childhood that a CRT, in comparison with even early LCDs, looked like washed-out crap with non-existent black levels and pain-inducing flickering. The reason is that during daylight, CRTs could not produce a bright enough image to win against the power of the sun, so the whole image became a grey mess. It wasn't an issue at nighttime, but let's be honest: as a kid you could not use a PC that late, or even be awake, and most people work during daylight hours. So the crappy image quality was prevalent in exactly the circumstances when CRTs were meant to be used.
The minimum acceptable refresh rate on a 17" CRT was 70Hz. Anything less had obvious flicker. LCDs don't flicker, because they don't rely on phosphor persistence. Then there is the much lower resolution and smaller screen size of any kind of affordable CRT, combined with the huge amount of space they consume. The blurriness of a CRT is of course an advantage when playing ancient consoles, but you can just use an emulator instead with an LCD.
Edited the dat file on my monitor to run at 240hz 480p and the smoothness is insane. I wish CRT tech kept being improved.
Phosphors. After glow. CRTs use very different technology. Games look spectacular on CRT
Plasmas used phosphors too, and after CRT, were the best for motion clarity and natural colors.
120Hz BFI on an OLED was the closest we get to good motion resolution. Guess what, they dumped it...
Wait... what? Who did? The manufacturers?
@@GTXDash The LG C1/G1 are the latest OLEDs that support 120Hz BFI; even then, they have (slightly) lower motion resolution than the CX and GX models. Following models (C2, G2, C3, G3, etc.) have BFI, but it does not work in 120Hz mode.
No other major OLED TV manufacturer has good BFI modes. Aside from some OLED BVMs, that's as good as it gets.
@@Heymisterbadguy The RetroTink 4K has full-fat hardware-accelerated 120Hz and 240Hz BFI with HDR injection, so there is no brightness/luminance loss when using BFI. It is the best BFI algorithm to date, except for Sony's RBFI in their BVM OLED monitors, as you already stated, which is able to get smooth motion at just 60Hz.
@@Wobble2007 Well, its BFI is really, really nice, but I don't know if it can reach the motion resolution of the C1/G1, which is more than just double motion resolution (as the 4K would be, since it caps at 240Hz).
Also, the RetroTink at 240Hz caps at 1080p.
@@Wobble2007 When I mentioned 120Hz BFI, I meant BFI for 120Hz sources.
I wonder how awesome CRT monitors would be today if they were still in development. It would be awesome if there were brands that would take up manufacturing CRT monitors again and put time into developing them to be thinner and lighter than they used to be. Would be cool if it were possible.
It's a nice thought, but the fundamental way in which a CRT works means there's a limit to how thin and light you can make one. Keep in mind that the picture is drawn by a particle beam fired from a single emitter at the back of the tube. It's basically the center point of a giant sphere, with the screen being the outside surface of said sphere. The thinner you make it (that is, the smaller the radius of the sphere), the smaller and more curved the screen has to be to get a legible picture. All this has to be done while maintaining a vacuum inside the tube, and I'm pretty sure the bulk of a CRT's weight back in the day already came from needing to use materials strong enough to do that.
Could we improve them with the technology we have now? Probably a little, but I imagine part of the reason we stopped developing CRTs is because we were already seeing diminishing returns.
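To illustrate the sphere point above with rough numbers (the screen width and depths are assumed values, just to show the trend):

```python
# How hard the beam has to bend as a tube gets shallower, for a fixed screen width.
import math

SCREEN_WIDTH_CM = 40.0  # assumed visible width of a ~20in 4:3 tube

def deflection_half_angle_deg(depth_cm: float) -> float:
    """Half-angle the beam must swing from the gun to reach the screen edge."""
    return math.degrees(math.atan((SCREEN_WIDTH_CM / 2) / depth_cm))

for depth_cm in (40, 25, 10):
    angle = deflection_half_angle_deg(depth_cm)
    print(f"tube depth {depth_cm:2d} cm -> beam deflection of about ±{angle:.0f}°")

# Halving the depth pushes deflection toward ±60 degrees and beyond, where focus
# and geometry get much harder to control -- the wall the "slim" tubes ran into.
```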
At the very end, they finally found a way to emit particle beams without the need for a deep tube, which was about to lead to actual flat panels like LCDs.
man i love watching stuff on the old tube, they should really start making CRTs again
They should. Update the technology. Samsung and Sony back in the 2000s were working on new CRT tech that made the tube just a few inches thick. Would've been awesome.
Good job dude this is actually great. Definitely will be checking in from time to time
I used to use a CRT as a 2nd monitor on my main PC; then it died, and I did not feel like frying my insides just to keep using it.
@@teh_supar_hackr fry your insides?
@@GTXDash capacitors hold charge. If its not discharged and you open it, it'll discharge into you.
Same for microwaves. Be careful trying to fix them
@@ChArLie360115 Oh yeah. The CRT's anode terrifies me.
@@GTXDash Repairing a CRT to me is like a death wish I don't feel like risking.
Same here. Mine is a CTX EX701F, which just a few days ago stopped going into the menus, reset all the picture settings to defaults, and shows up on my PC as an "LXH-GJ769IIF", which makes me think it's some sort of firmware corruption. I don't want to get rid of the CRT, but I might have to replace it.
The best thing about crts is the thing you can't explain to someone who hasn't experienced it: the immersion in content. It's almost like vr when compared to lcd
It's the brain being saturated with information one pixel at a time, drawn by a single electron beam on a screen inside a vacuum tube.
And you definitely cannot explain it to people who weren't visually trained to recognize frames at 1/20th of a second.
0.005 seconds was where I began to notice anti-gravity effects on a broad scale... my goodness, outer space not having gravity means that light goes both directions so fast that it alters gravity itself. That is what the physics engine RealSpace2 produced into the world... the theory became a reality 100%.
The biggest drawback of CRT monitors is IMHO eye strain. I still remember having headaches and sore eyes after spending a late night in front of the PC during CRT days. Having to deal with image geometry and soft corners was also a bit of a pain with CRTs.
Did you have your CRT at 85Hz? I get the same issue when I run my CRT at 60Hz, but I don't get it at 85Hz+.
@@jskilabe5986 I don't remember for sure, but I think it was around 75 Hz. With 60 Hz the flickering was unbearable even for shorter periods.
CRT monitors are not bad and they are not ugly. They have personality and they are robust. People say they are fragile, but they are not; you just have to be careful not to drop them, the same way you have to be careful not to drop your flat monitors. Honestly, I am wondering if I should invest in a nice CRT monitor, because I rarely play any FPS games.
Just get any old 17 or 19 inch of any brand that does 1280x1024 @ 85hz; it should be cheap. I have a 1280x1024 Dell M990 from 2000 and it's great.
CRTs with these specs are amazing: they give you the flexibility of higher definition up to a decent refresh-rate ceiling, while also hitting insane refresh rates at lower resolutions. Plus, anything bigger than 19 inches is IMO too big and heavy unless you have the space.
@@tommynobaka and a GTX 1080
Everything you said is indeed fact APART FROM..... when you said CRTs are ugly. They are beautiful goddesses.
They were cool but my eyes never want to go back to flickering. I am perfectly happy with my 165hz IPS monitor
motion blur is just "reverb" for video games. I love it!
I also like blur but only the intended rendered motion blur that gives the cinematic look.
@@GTXDash Hello
@@GTXDash You didn't like my comment? I'm not referring to the comment I'm making now, or the other one I made that says Hello; maybe you didn't see it. I made a comment to you and I hope you see it. It's not hate.
I love your videos. Keep in mind that I will keep up with them, but I won't always have time.
@@AlexBernard777 Yeah, sometimes I accidentally miss comments, or TH-cam doesn't show them right when they're posted. But thanks for the comment.
So to sum it up: you can push as many new frames to the display as you want, but if it's not clearing/blanking the old frames first, you've got motion blur.
Exactly. Especially if the blanking lasts longer than the time a frame is visible. BFI is very limited in this regard. This is why backlight strobing is what display manufacturers should continue to improve on.
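For anyone who wants the arithmetic behind that: the usual rule of thumb is that perceived smear while eye-tracking equals motion speed times how long each frame stays lit. A minimal sketch, assuming a 960 px/s panning speed (a common test value, not from the video):

```python
def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Perceived smear while eye-tracking: motion speed times how long
    each frame stays lit -- independent of pixel response time."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s, assumed panning speed
print(blur_px(speed, 16.7))  # 60 Hz sample-and-hold: ~16 px of smear
print(blur_px(speed, 1.0))   # ~1 ms CRT phosphor flash: ~1 px of smear
```

That 16-to-1 ratio is the whole argument: the CRT wins not by refreshing faster, but by keeping each frame lit for only a fraction of the frame time.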
Came for the CRT commentary, stayed for being a fellow ZA boy.
Thank you so much; finally somebody is able to explain this. I am sick and tired of hearing people on TH-cam saying "oh, this is a 360 Hz monitor, this is a 180 Hz monitor, this is a 270 Hz monitor." I mean, who gives a hoot if you can't put that many frames out that fast? Meaningless.
I think one of the reasons CRTs got replaced very quickly in offices is that LCDs have a clearer, less blurry picture, which helps with text.
I think CRT monitors are beautiful, unironically.
I made a Pong... on LCD I got a square for a ball, on CRT it was round. Love the CRT's blurriness for games. LCD is good for text and fine-line technical drawings... CRT for visuals. Beautiful colors and a natural look. This is why I still sport my CRT and will never get rid of it.
CRT for visuals
My 20 year old 17 inch Dell CRT looks smoother than a 180hz freesync LCD that I just bought and returned. It's really frustrating.
Such a good video my guy, ty for explaining this
11:11
I own both a CRT display @ 60 Hz (it's a Sanyo VM4209) and a gaming monitor (BenQ EX2780Q @ 144 Hz), and I can affirm your claim.
They both feel about as smooth to use despite being made about 45 years apart.
12:04 I didn't stop using CRT because of what you said. I had to buy LCD because there was no CRT to buy.
Great video! I just honestly wouldn't downplay how great the BFI modes are in a select few recent OLED TVs, especially the LG CX/C1 and a few PC monitors tuned by Blur Busters. They can absolutely reach or surpass CRT motion clarity. An LG C1 at 120hz with BFI maxed is incredible for video games! It took basically two decades, but we're getting there, finally.
Finally a clear and easy video for learning
I always ask myself why I enjoyed watching movies on CRT TVs and monitors so much.
Something was right.
you were younger back then.
Life felt more wonderful in general.
Now you’re old, experienced and bored.
@@maalikserebryakov interesting perspective.
I'm just 38.
The end-all video on CRT.
Actually, power consumption differences are negligible, in favor of flat-panel displays at best. That's only if you're comparing square inches of display per watt, using only the peak wattage draw, over a stretch of like 2-4 hours. The problem with that comparison is that CRTs are only at their peak power draw for the first few seconds after startup, and their continuous wattage draw is a significantly smaller fraction of that, whereas flat panels generally just reach their max power draw and stay there roughly continuously. Monochrome CRTs generally use roughly a third of what color ones do, even. Basically, leave them plugged in side by side on a power consumption monitor, and a CRT will probably end up consuming less power in continuous, long-haul usage.
There's truth to what you say. My backroom LG is a 1440p 144hz monitor, and I will put the wattmeter on it tomorrow to see its average. But I know for sure OLED will eat more power than a high-refresh CRT. My buddy's 27 inch 1440p 240hz OLED was using 55 watts idle at the desktop (dark background), around 86 watts with frequent spikes to 100 watts while gaming at his settings (110 watts during bright scenes), and 120 watts on white-background webpages. I don't know how the manufacturers are cooking the numbers in the manual, but that's fairly high.
Whereas my Sony G520 in the living room (21 inch CRT), at 1440p 85hz (128.5KHz tube speed), will spike to 133 watts during the degauss cold startup, then average 87 watts gaming and 96 watts in bright scenes, with the occasional 101-watt spike on an almost fully white website page.
Next to it (dual CRT setup for convenience lol), the Fujitsu/Siemens 21 inch (shadow mask) at 1200p 90hz (112KHz tube speed) spikes to 126 watts on a degauss cold start, then averages between 80-91 watts gaming depending on the scene, and 104 watts on a mostly white internet page.
Before I got the Siemens, I was given a Sony HMD-A100 (15 inch) by the same OLED buddy. It looks spectacular after a 10-minute warmup at 768p 85hz. It used a surprisingly low amount of power. I put it in the shop as the music/info terminal.
Lastly, my Samsung 997DF (19 inch) at 1200p 72hz is oddly close to the Sony G520 in power consumption, but looks great. I put it in a vented box with a big moisture-eater bag from work inside. I check the bag and bake it every 7 months to keep the Samsung in good shape as an emergency backup. Sadly I don't have data for the last 2 CRT monitors saved on my computer, but I did test them when the wattmeter arrived.
Sorry for the long text, but it's rare to see someone with modern experience of older display tech. I daily the CRT monitors (typing this comment out on the big Sony) because they're vastly superior to my not-so-old 32 inch 1440p LCD in every aspect besides screen size. That's why I pick up 17 inch+ roadside/dump tube monitors every chance I get and test them; LCD seems terrible in comparison, and I like having backups. I'd buy an OLED, but it's just too expensive.
@@insurgentlowcash7564 Naw, no worries. That was insightful and kind of illustrates what I was getting at, with data, however anecdotal it is. Reading is no difficulty to me; I read and write effortlessly, by fortune of being effectively literate. 😄
@@insurgentlowcash7564 You can run a 30" 4:3 CRT on 100w. With a much bigger picture.
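To make the long-haul claim concrete, here's a minimal sketch of the energy math. The wattages are illustrative, loosely echoing the meter readings above; the point is that a seconds-long degauss spike barely moves a session total:

```python
def session_wh(startup_w: float, startup_s: float, steady_w: float, hours: float) -> float:
    """Energy for one viewing session: a brief startup spike plus steady
    draw. The spike lasts seconds, so it barely moves the total."""
    return startup_w * startup_s / 3600.0 + steady_w * hours

# Illustrative figures only:
crt = session_wh(startup_w=133, startup_s=10, steady_w=87, hours=4)
oled = session_wh(startup_w=0, startup_s=0, steady_w=95, hours=4)
print(f"CRT: ~{crt:.0f} Wh, OLED: ~{oled:.0f} Wh over a 4-hour session")
```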
If you ever want to use these, I suggest getting sunglasses and eye drops to use every 5 minutes, because I remember these were painful to look at for more than an hour.
DLSS and FSR do wonders for the frame rate in GPU-limited scenarios. But yeah, I wish I could get a good 1080i CRT, but those are so expensive… I wish someone started producing CRTs again. And I mean, they are freaking particle accelerators, which is awesome.
16:18 My SyncMaster 955DF is a 1856x1392 VGA CRT monitor. I counted twice as many phosphor dots for text as pixels on my 1080p LCD; it's basically a true 4:3 1440p monitor.
Not only is there no aliasing on my monitor, but text is way, WAY sharper than on my 1080p LCD.
I love CRT TVs. I still have my "40 inch" Toshiba CRT and play my PS4 on it. The image clarity, sound (bass/treble/theater-quality surround sound), color depth, contrast, etc. are amazing to me. One of the things I love about my CRT (and it's a major plus) is that the image & sound quality have not degraded at all over the years, nor has the image started to look washed out or overlit. Every HD TV I've had (top brands too) has lost image & sound quality as it aged and always ended up looking washed out. My Toshiba CRT still looks & sounds as great as the day I bought it. Great video!
Sound from a CRT TV is always perfect
Bro... not saying I don't like CRTs, but I'm not sure I can take someone seriously who praises the sound from TV speakers... why not just get an actual amplifier and speakers and use those instead?
Extremely underrated channel.
Yeah, when you go back to a CRT after playing on a flat panel, plugging in something like a Mega Drive and pressing the jump button feels like the character jumps slightly before you press it. Not a joke, it genuinely feels like that. Obviously impossible, but it's more of a feeling thing; it must be something to do with your brain being adjusted to lag.
The main 'latency' of a CRT is the time the phosphor needs to illuminate, so nothing truly has zero latency, but it's pretty much instant compared to any kind of digital image transfer.
So what analog signal are you using with your CRT? 🤔👍
@@MikJames-d1g Well VGA on PC Monitors (obviously) and RGB SCART on CRT TVs if possible.
Great video. I have a Mitsubishi 2060u, 22 inch, 120khz. Running it at 800x600 at 160hz, it has the smoothest image I've ever seen in person.
These are arguably the best tubes Mitsubishi ever produced, with better fidelity and image quality than the slightly newer 2070SB tube, much like Sony's GDM-5002PT9 being better than the GDM-F520 despite being older. The phosphor quality and electronics are slightly better in the slightly older models for some reason.
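Those figures line up with the usual CRT timing arithmetic: the vertical refresh ceiling is the horizontal scan rate divided by the total lines per frame. A minimal sketch, assuming ~5% vertical blanking overhead (a typical ballpark, not a spec):

```python
def max_refresh_hz(hscan_khz: float, active_lines: int, blanking: float = 1.05) -> float:
    """Vertical refresh ceiling = horizontal scan rate / total lines per
    frame (active lines plus an assumed ~5% vertical blanking overhead)."""
    return hscan_khz * 1000.0 / (active_lines * blanking)

print(max_refresh_hz(120, 600))   # ~190 Hz ceiling at 800x600 on a 120khz tube
print(max_refresh_hz(120, 1200))  # ~95 Hz ceiling at 1600x1200 on the same tube
```

So 800x600 at 160hz sits comfortably inside a 120khz tube's budget, which is also why lower resolutions unlock those insane refresh rates.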
Honestly I would pay real money for a nice CRT monitor. I rarely play super competitive games.
If you are OK with CRTs or backlight strobing on modern displays, I honestly envy you. Here is my story.
I used CRTs in school and university, and I got headaches after some time. Bigger and flat ones were better for me, but still painful after an hour. Some of them ran at 100hz at 768p, which also didn't help much. It didn't happen with TVs, but I viewed those from quite far away. It never happened with TN, IPS, or OLED either; a friend's VA was also okay, though I've never owned one personally.
On the other hand, my cheap 144hz TN has noticeable motion blur. If I use backlight strobing, it does reduce the blur significantly, almost to none, but my eyes hurt after a few minutes. I guess I'm sensitive to that sort of thing.
Yeah, I never had headaches looking at CRTs growing up in the 90s, but I got headaches a lot staring at LCD monitors once those became a thing.
I wish I could get my hands on a decent quality CRT that wasn't over $100
@@bylectricagarmoniya they aren't
CRT tech was improved upon in the form of FED and SED, but those never came to fruition.
It's a shame CRT monitors are no longer being produced or improved on
The best thing about CRTs over the other types: no latency. Zero.
I used a 17" CRT monitor next to a 22" LCD a few months back. The CRT would often develop a painful high-pitched squeal. The CRT was dimmer and took a few seconds to reach peak brightness. The CRT's text was blurry. The CRT's image needed a bunch of settings tweaked to be mostly square and centered, and changing the resolution meant all those settings went wonky. I didn't really care about the weight and bulk when I later replaced the CRT with another LED monitor.
That just sounds like a crappy CRT.
Not to mention how much cheaper a much bigger/higher resolution LCD would be...
@@GTXDash That just sounds like moving the goalposts.
I wish I still had one. I used to have a 21" Mitsubishi. I miss it. I would even take a 21" ViewSonic, that's how much I want one.
165hz feels just perfect
Believe me when I say this: I love CRTs. I have a 29 inch Trinitron, RGB-modded, hooked up to an RGB-Pi arcade setup (OS4). CRTs are great for retro gaming.
That said, generally speaking, CRTs are NOT better; at this point there are more cons than pros. I just pray my Trinitron lasts many more years. Even after doing a recap, I know it won't last forever (it's already 15 years old).
60hz flicker looks awful on a CRT, it was the bane of many a worker's existence if they were staring at it all day for work. 60hz on an LCD doesn't flicker the same way at all and modern zero flicker backlights are much nicer as well.
My friend had a giant ViewSonic CRT and its still one of the best pictures Ive ever seen.
Excellent points all around, MXDash2! Thoroughly enjoyed how you put all this together, and you're spot on with every point. I spent a pretty high fortune on my whole PC setup, and especially when running my older games I miss my 1600x1200 CRT. OLED at the moment gives me most everything I want, and it's impossible for me to want to go back, but I haven't forgotten the perks of the old CRT. Back in the 90s I actually ran my TV and VCR through my GPU because everything looked so much better on the monitor. Somewhere around 2005, I think, was when I stopped using my Radeon AIW 9800XT (if I remember correctly), where I had all that set up, and everything just felt great. These days it's amazing when it all works, but I'm plagued by games that don't work with Windows, or multi-core CPUs being the issue, or obsolete graphics drivers... A decent number just flat out don't work, and even some more recent titles don't.
Honestly the video capture of the CRT screen makes it look like complete garbage. It doesn't really deliver the point being made.
Even OLED isn't really experienced until seen in real life.
The solution should have been SED/FED tech at 60-75Hz, with its individual electron emitters providing the flicker naturally......... I'm really bummed they never made it to production 😢☹️
Sad, ikr..
LCD/LED has always been inferior, but could you imagine how deep a 100” CRT would need to be?
I still have a CRT monitor for retro gaming, but you need a deep desk, modern monitors take up so little room.
If manufacturers had continued building CRTs, they would've improved the technology the same way LCDs continued to improve. Samsung (before canceling it) got a 30" CRT down to only a few inches thick. That being said, I think the industry needs to keep working on OLED technology to reach the motion-blur reduction that CRTs were able to achieve.
@@GTXDash i agree that OLED is the kind of tech we should be supporting.
@@GTXDash Can't wait for multi-stack RGB-OLED with native HDR-RBFI, which will tide us over until eQD which will hopefully have a native rolling-scan modulation method.
So it wasn't just nostalgia, games DID look better and were more "dynamic" and "fluid" back in the good old days :D
Nothing beats a CRT! And I have an OLED C1, a CRT, and a plasma, so I can say that because I'm always comparing... especially camera movement, which is very noticeable on OLED/LCD but not at all on a CRT...
In the Philippines there are still CRT shops in the rural areas; they repair and sell used CRT TVs.
I have a CRT monitor I got free from someone, and I love it. It has downsides like, yeah, size, and danger if it's dropped or if I ever need to repair it, but it works better for me. I game at 1024x768 at 118Hz (the best I can get out of it), and it actually works *very* well; I often forget that it's not 1080p, as long as I have anti-aliasing enabled at least. I did have to scrub the damaged anti-glare coating off it, because otherwise it was awful, so I can't really use it without the curtains closed in my room.
I will not be getting any new LCD or any flat panel monitor until I can get something that is as good as a CRT in picture quality and feel. And I do wish I could exchange resolution for refresh rate on modern monitors.
Pioneer and Panasonic plasmas are very close to CRT when it comes to motion clarity. I owned a Panny plasma for 12 years and gaming on my consoles was always a smooth experience. Keep in mind that the vast majority of these games were running at 30 FPS. Eventually my plasma burned out and I replaced it with an LG C2 OLED and I couldn't believe how bad the motion was. 30 FPS was simply unplayable because of the stutter, and even 60 FPS looked very blurry. And it wasn't just the motion clarity. Somehow the colors looked more natural on the plasma, and that was "only" in SDR. And then there's banding and uniformity. Both are still superior on a plasma. Anyway, I sold my C2 and went back to plasma. I bought an excellent refurbished Kuro and it's literally the best picture I've ever seen. I don't care if it's "only" SDR and 1080p. The motion clarity and natural colors are simply stunning.
Someone where I live is trying to sell a 720p Pioneer Kuro... it's tempting, but I'm not sure it would fit in my room.
I'd mainly use it to game on the 360 and PS3 and to watch movies on Blu-ray.
My Panasonic CRT is decent, but it's 480i, not 480p like the later HD models.
It has component and, overall, I love it. It was free from a church, and given that it has low contrast and brightness just a tiny bit higher than factory, I'd say it has very low hours.
@@roveradventures How big is it?
@@Stoddardian PDP-1130HD. Should be a 50". Has the box and display. Speakers I can get on eBay for 20 bucks. I'm mostly concerned about the hours on it. Ironically I have a cheaper Pioneer set, but the power board is NLA and it'd cost more to get the board than a whole TV... lol.
@@roveradventures If it's cheap I would go for it. I got really lucky with mine. It was refurbished by a guy who worked as a Pioneer technician for 15 years. It still had over 90% of its original brightness left. It's an LX6090.
@@Stoddardian Indeed, it was 120. But if I go for it I'll try to knock the price down, since the speakers are about $20 on eBay.
Hopefully they didn't run it at max contrast or brightness. On the Panasonic factory settings (or the ones I saw), "Picture" is usually maxed out, which can really shorten a CRT's life.
Your point around the 7 minute mark is exactly why I push for black frame insertion in modern displays, because these MFs don't realize increasing refresh rate doesn't matter if you can't drive it. On the oled_gaming subreddit right now, you can find people buying $800-1400 OLED displays whose computers can't even drive them properly. It's just insane. I could drive a 360hz OLED playing Counter-Strike 2 or Hunt: Showdown or World of Warcraft, but most gamers cannot.
What would be superior is an LCD/OLED that mimics a CRT in terms of pixels-on vs pixels-off, giving our eyes that same illusion that gives us such perfect motion clarity. If they developed an OLED that was, say, 240hz and included a true black frame insertion mode that kept the full refresh rate, it would be pretty sweet. As it stands, modern BFI will generally double your visual performance: 120hz + BFI looks exactly the same as native 240hz, and 240hz + BFI would look like native 480hz. And obviously, with lower fps, BFI helps even more.
Sure, new 4k monitors are coming that are 240hz with a 120hz BFI mode (it runs in 240hz mode, but only every other frame is displayed, so frame, black frame, frame, black frame, and so on). For console, that 120hz + BFI mode might be awesome, but most people already have 120hz OLED gaming TVs with BFI. Not to mention television BFI is superior, because both LG and Sony (the top brands for OLED TVs) use a "rolling scan" black frame insertion, and it works. Hell, my Sony Xperia 1 II uses a rolling-scan BFI and it's only 60hz, but it feels like 120hz. It's amazing. Watching movies/anime on my phone in bed is way nicer than on my PC gaming monitor, simply because my AW3423DW 3440x1440 175hz OLED doesn't have BFI, so you get extreme judder in panning shots, clear as day in anime; yet on my phone, which is OLED + BFI, you can't even tell it exists. I would kill for a proper 240hz + BFI OLED gaming monitor. Sadly, we aren't getting one yet.
I had that shitty IPS ViewSonic Blur Busters-certified 1080p display which had BFI at 60, 120, and 240hz. 60hz BFI made my eyes hurt; it was atrocious. 120hz + BFI wasn't bad and looked like native 240hz. And 240hz vs 240hz + BFI showed no difference, because the pixel response wasn't good enough to show one, so it's effectively capped at 240hz with or without BFI. OLED, however, has true sub-1ms pixel response, so BFI at any level would be amazing. I really want 240hz + BFI... true 240hz BFI.
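The "120hz + BFI looks like native 240hz" rule of thumb falls straight out of persistence math, assuming near-instant pixel response (which OLED roughly has). A minimal sketch:

```python
def persistence_ms(refresh_hz: float, duty: float = 1.0) -> float:
    """How long each frame stays lit. duty=1.0 is plain sample-and-hold;
    duty=0.5 means every other refresh is a black frame (simple BFI)."""
    return 1000.0 / refresh_hz * duty

print(persistence_ms(240))       # native 240hz: ~4.2 ms per frame
print(persistence_ms(120, 0.5))  # 120hz + BFI:  ~4.2 ms -> the same smear
print(persistence_ms(60))        # plain 60hz:   ~16.7 ms, much blurrier
```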
A solution to the native aliasing of CRT displays, for LCD/OLED? A honeycomb subpixel structure. Instead of red/green/blue subpixels, we'd need a technology where one subpixel can produce the entire range of color on its own. Arrange them in a honeycomb/hexagonal layout, and since the subpixels would be so small, you'd end up with a sharper display, as the PPI would be much, much higher.
You're making me want to give my CRTs another try. I already knew about what you covered in this video, but sometimes an ultrawide monitor is good.
Yeah, I picked up 3 CRTs on the side of the road, and the 32 inch Sony Trinitron I got is just amazing. Plugging in a Super Nintendo over a composite cable looks almost crystal clear; there's no latency and no blurring or ghosting. It just looks incredible, and it has a warmer glow that feels more alive than the artificial light my LCD gives off.
I'm still using my CRT for playing games. Sometimes I use the LED ones for certain games and applications such as Adobe Animate, Blender, and Premiere, but my main gaming monitor is the CRT. Motion clarity even at 60 hertz is on another level; there is no modern screen that comes even close to what CRTs can do. There's no input lag, the colors are great, natural bloom, black levels... CRT is the king.
I really hope we'll get SED/FED monitors one day although I doubt it... Great video btw!
My dream is some company that will one day create a modern 4:3 ratio monitor that perfectly replicates (in hardware) CRTs. I wonder what modern technology could do to replicate the cathode-ray tube but without the weight, size and maybe without the glass
I like CRTs for that same exact reason, but I also like CRTs because of the shape, and because of how big they are, because I like how they look: the design has a lot of detail, my favorite kind of detail. But it would still be cool if they made a flat screen that replicates a CRT computer monitor, like a flatscreen version of one!!
Pumping up the resolution basically makes it easier to display non-native ones. Compare footage of an old game on a new phone screen to a large monitor: the phone will look better, because the density of pixels hides the scaling distortions.
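A tiny sketch of why density hides the damage: with nearest-neighbour scaling, a non-integer ratio forces uneven pixel runs, and the unevenness shrinks (relatively) as the target resolution grows. The 480-line source and the target heights below are just illustrative:

```python
from collections import Counter

def line_heights(src: int, dst: int) -> Counter:
    """Nearest-neighbour scaling: how many output pixels each source
    scanline covers. Uneven runs are the visible scaling distortion."""
    edges = [round(i * dst / src) for i in range(src + 1)]
    return Counter(edges[i + 1] - edges[i] for i in range(src))

print(line_heights(480, 1080))  # 2.25x: lines 2 or 3 px tall -> 50% jumps
print(line_heights(480, 2160))  # 4.5x:  lines 4 or 5 px tall -> 25% jumps
print(line_heights(480, 4320))  # 9x:    every line exactly 9 px -> uniform
```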
There was supposed to be a flatscreen version of the CRT... it was called SED, or FED. Instead of one electron gun, there were individual electron emitters for each subpixel. Its downfall was caused by company greed.
I didn't really get why OLED can't come close to matching CRT motion clarity; the near-instantaneous pixel response time plus black frame insertion should create a very similar effect, at the cost of brightness, while keeping all those other benefits. And a 4k display can drop to 1080p losslessly by just displaying 4 of the same pixel. Yes, OLED will never look exactly like a CRT, but is CRT really the be-all and end-all of display technology?
Nobody said that.
The problem is that they aren't doing that. That's exactly the issue.
I've been chomping at the bit for an OLED with heavier BFI for ages, because I'm in a light-controlled environment where I won't mind the brightness loss so much, and it just isn't available still.
10:50 Brazil doesn't use PAL but rather PAL-M, which is basically NTSC's 30-frame (60i) timing with PAL color encoding. All the TVs I owned supported both NTSC and PAL-M.
Yeah, our TVs can do both 50i and 60i. I did not know there was a version of PAL that could only do 60i (30 full frames). Interesting.
I never stopped using them
I forgot the name of the tech... it was supposed to replace CRT with a pixel-grid structure like an LCD, but each pixel was a very small cathode-ray-tube-like element... that would've been a good replacement... I think. Maybe, though, motion blur would still be a problem with that as well.
FED / SED
@@ShankMods thank you
@@ShankMods Makes me sad every time I think about SED & FED; even Pioneer's Kuro tech got shut down. It seems like quality displays are only destined for the professional-grade markets.
@@Wobble2007 I have high hopes for Electroluminescent Quantum Dot and MicroLED
I still have a PC CRT monitor in my gear, and for 2D platformers, old FPS games, 3D platformers, and other games where low input lag and perfect motion clarity are a massive bonus, I always use the CRT.
reasons why i like crts
- uses 4:3
- looks (and is) retro
It is funny to talk about the motion quality of displays when game developers still fok up their camera movement into an unnaturally jerky, headache-inducing shock experience. I've yet to see a single FPS whose camera movement does justice to anything above 24fps.
Black frame insertion done well on modern tvs would definitely make crt obsolete
I hope so. Picture a high-refresh-rate display where 5 out of every 6 frames are black, while the 1 lit frame is bright enough to compensate for the 6x darkening that duty cycle causes. Sounds outrageous, but that's really what it's going to take to reach the level of zero motion blur that CRTs are known for.
@@GTXDash yes
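Sketching the numbers behind that: lighting 1 subframe out of 6 cuts persistence to a sixth of the frame time, but the panel then needs roughly a sixfold brightness boost to look equally bright on average. All figures illustrative:

```python
def bfi_tradeoff(content_fps: float, lit_subframes: int, total_subframes: int):
    """Aggressive BFI: light only some subframes of each content frame.
    Persistence shrinks by the duty cycle; average brightness must be
    boosted by its inverse to compensate."""
    duty = lit_subframes / total_subframes
    persistence = 1000.0 / content_fps * duty  # ms each frame stays lit
    boost = 1.0 / duty                         # required brightness multiplier
    return persistence, boost

# 60 fps content with 1 lit subframe out of 6 (e.g. on an assumed 360hz panel):
print(bfi_tradeoff(60, 1, 6))  # ~2.8 ms persistence, 6x brightness needed
```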
Standard BFI, that is, straight impulse modulation, cannot produce the same type of motion IQ and performance that raster-scan CRTs can. So for BFI to make CRTs obsolete, they will need to come up with an effective "race the beam" algorithm to simulate raster scanning. It will also need to simulate the incredibly fast phosphor decay times, the phosphor glow, and all that good stuff CRTs have. The hardest thing to match will be the CRT's incredible native image depth, which with a good-quality input source can look almost three-dimensional; Looking Glass-style light-field tech is a good candidate.
CRTs started to die when the attempt at 29 inch TVs didn't last longer than the smaller ones.
If only modern gpus had vga output
There is always a DisplayPort-to-VGA adapter.
VGA to hdmi works just fine
@@hehashivemind6111 yeah but you introduce some lag
Display port to VGA works, the issue is GPU drivers for custom resolutions.
@@ksysinf It's generally such a small amount on most of them that it realistically shouldn't matter.
I liked the great explanation of motion blur.
Great visuals and response times; however, the way they worked would strain the user's eyes.
11:05
PAL 50hz televisions are notorious for their noticeable flicker, even on consumer TVs with relatively slow-decaying phosphors (which were, however, good enough to display 60hz content without flicker). This was also the reason 100hz European televisions with digital preprocessing were made (which is an absolute bummer, and also the reason HDTV CRT televisions existed primarily in NTSC countries).
There is an additional tradeoff here: the slower the phosphor decay, the less noticeable the flicker at lower refresh rates, and the more noticeable the blur trails on high-contrast moving content (a mouse pointer on a black background, for example).
Still much more bearable than any sample-and-hold LCD, though.
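That tradeoff can be put in toy-model terms by treating the phosphor as a simple exponential decay with time constant tau. The tau values below are illustrative, not measurements of any real phosphor:

```python
import math

def phosphor_tradeoff(tau_ms: float, refresh_hz: float, speed_px_s: float = 960):
    """Toy exponential-decay phosphor model. Returns (brightness fraction
    left when the next refresh arrives, rough trail length in px at the
    given motion speed). Slower phosphor: less flicker, longer trails."""
    residual = math.exp(-(1000.0 / refresh_hz) / tau_ms)
    trail_px = speed_px_s * tau_ms / 1000.0
    return residual, trail_px

print(phosphor_tradeoff(1.0, 50))   # fast phosphor at 50hz: ~0 left -> deep flicker, ~1 px trail
print(phosphor_tradeoff(10.0, 50))  # slow phosphor at 50hz: ~14% left -> milder flicker, ~10 px trail
```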
When I switched from CRT to LCD back in the day, the lack of flicker on the LCD was the main benefit. Even at 120hz, a CRT flickers more than an LCD at 60hz. You claim 48hz is enough not to see flicker, but the 50hz TVs we have here flicker a lot to me. Playing at 60 in games that support PAL60 helps, but it doesn't get rid of the flicker. So, flicker or motion blur? Best would be to have neither. When I game on a CRT monitor, I prefer 85hz minimum so my eyes don't hurt.
I might have missed it, but another advantage CRTs have over new monitors is that, because you can choose a lower resolution, games gain performance at that resolution while also looking better on a CRT than they would at the same resolution on a newer monitor, because of how a CRT draws the image. The only downside is maybe HUD elements designed with higher resolutions in mind ending up out of proportion.
You are a hero
Have you dabbled with black frame insertion on OLED monitors? I have an LG C1 TV that does it. It darkens the picture a lot and the flicker feels noticeable, but it seems to help a lot with blur. I've been thinking about this quest we've been on to get back to where we were 30 years ago, and it makes me mad lol. These $1000 alternatives are only just now approaching CRT levels of clarity, and it's very unsustainable when you account for new games with better visuals and higher pixel counts. The industry is full of snake oil too, with things like frame generation, which increases input lag and decreases responsiveness to appear smoother. 😢 I'd buy a new CRT if a company started making them again.
@testtube173 Even I play newer games on a modern display. It really depends on where the tradeoff of motion clarity for image sharpness/resolution stops being worth it.
BFI is better than nothing, but even without it, I still generally only play games on my CRT if they predate the 2010s.
I can see easily 60Hz flicker on an LED light but not an incandescent.