@@InvictusCore There are great ones; it's just that with many products you have to sort through the good and the bad ones. The Startech DP2HD20 VGA adapter is one of the best ones out there. I also have another adapter that doesn't flicker.
The best option is a GTX 900 series card: they support analogue out but still get driver updates. I think the GTX 980 Ti is the best card that properly supports VGA (via DVI-I).
I regret getting rid of my CRTs. LCDs were a false promise. "At some point they're all gonna die and we're not gonna get to see CRTs anymore"... that physically hurt me.
I still think LCDs have looked miles better than CRT for quite some time now, especially the semi-gloss and full-gloss ones: much better colors, blacks, contrast and brightness... The problem is that LCD really needs high-resolution content, which is totally normal, since a CRT draws the image on phosphor behind thick glass, which softens everything and makes really old low-resolution games look so much better. The main issue with LCD is really motion, but by now there are surely some models that have it mostly figured out and are so fast that the difference is small; they also cost a ton of money, and their priority is motion and low input lag over probably everything else.

I think I still have my father's last CRT somewhere, a 19 inch something, but my eyes still thanked me a lot when I moved to a much crappier LCD back in 2004 with less resolution, a lower refresh rate, and bad blacks... At least my eyes stopped burning playing through the night, and when I could run games at its full resolution it had much better sharpness than the CRT, since the PPI was probably about the same: the LCD, even at a lower resolution, was also only 15 inch compared to the 19 inch CRT.
@@guily6669 that is just wrong, lcds suck. it's literally the worst tech out there, behind plasma oled and crt. wait, no. projection tvs are the worst BY far. but lcds are the worst among the non puke-covered ones.
@@GraveUypo I don't think they look that bad in terms of image quality, especially the latest VA panels giving almost inky OLED blacks on some models, very wide colour gamuts, 1500+ nits and so on... Also, ADS Pro panels will arrive soon, making IPS-like panels with much more contrast, brightness, better blacks and so on... LCD is definitely showing some good value, at least on some TV models, and in terms of monitors it has also shown it can be quite fast with the fastest recent models. But it sure has its downsides, which take a lot of money to overcome: you need crazy fast hardware for processing, quantum dot and mini-LED backlights with many dimming zones, and a good, fast, reliable dimming algorithm with clever software. But when done right it works amazingly.
Thirded. I had a similar one I tried on my Gamecube (GCHD Mk II) and PS3. No matter what I tried it with it blacked out like that. I went through two of them like that and just wound up getting my money back. Need to find a model that doesn't totally suck, and the only good one I've heard of isn't even available anymore.
That is a good point indeed. I had a generic display adapter for my Windows Phone that I used with a CRT, and the bandwidth was also wonky, resulting in the same problem.
Display tech is just now catching up to CRTs, yes, this is well known. I think once microLEDs finally go mainstream we'll have something better in an absolute sense.
The only thing CRTs have going for them is their high 200Hz refresh rate. They are drastically worse at everything else. That's why CRTs died instantly the second LCDs appeared cuz only gamers cared. Actually not even they cared.
@@roller4312 It was more that LCDs were novel and cool and didn't take up your entire desk. They weren't better as displays though, worse in fact, and usually not much bigger at first, if at all. Where they came into their own was 19+ inch displays, which weren't commonly found as CRTs since they'd be too large for desktop use. CRTs stayed in vogue in commercial environments far longer, mind. Video production and anywhere that needed very accurate color reproduction ran on CRTs for ages after they stopped being the desktop norm.
@@jttech44 Even the trashiest LCD is better than any good CRT; even with their crazy 5:4 aspect ratio, they effortlessly took over. Just the image clarity and proper geometry is enough to destroy any CRT. And S-IPS took the color accuracy crown almost immediately; EIZO started making them in like 2002. I suppose one could make an argument that a very expensive CRT with proper maintenance could rival a mediocre LCD in some areas back in the day, but it was unsustainable in the long run, and certainly not in 2023, or any other date really.
This video just got me missing 2 of my favorite CRT monitors that I owned wayyyyy back: the 22 inch ViewSonic and the 21 inch LG. Their picture quality was damn decent, come to think of it, but they were a pain in the a$$ to transport to LAN parties. Awesome video man.
If you get the chance, I recommend trying out a high-end CRT monitor like the 21" Eizo Flexscan F980. They feature great resolutions and refresh rates: 1024x768@160Hz and all the way up to 2048x1536@85Hz, according to the manual. It's been years since I actually used one. It would probably look amazing with modern 3D graphics.
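A rough sanity check on those quoted modes: a CRT's real ceiling is its maximum horizontal scan frequency, so every advertised resolution/refresh combination should need roughly the same number of kHz. A minimal Python sketch, where the ~5% vertical blanking overhead is an assumed ballpark rather than a figure from the F980 manual:

```python
# Estimate the horizontal scan frequency a mode needs:
# total scanlines per second = (active lines + blanking) * refresh rate.
def horizontal_scan_khz(v_active, refresh_hz, blanking_factor=1.05):
    total_lines = v_active * blanking_factor   # assumed ~5% vertical blanking
    return total_lines * refresh_hz / 1000.0

for v_active, hz in [(768, 160), (1536, 85)]:
    print(f"{v_active} active lines @ {hz} Hz -> ~{horizontal_scan_khz(v_active, hz):.0f} kHz")
# Both quoted modes land in the same ~130-140 kHz ballpark, i.e. the same
# tube limit expressed two different ways.
```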
Yep those are awesome. Full HD fits in a window on mine. The mouse cursor becomes almost invisible at that resolution though. Upside: you won't need any anti-aliasing at that pixel density. And it doubles as a heater.
I'd like to add that CRT monitors are made with relatively simple components. Replacing a few dying parts isn't an issue at all. I think CRTs are gonna be around for a looong time in caring hands.
The only thing that is depressing is that the tubes themselves are wear items :( They get dimmer over time, eventually becoming unusable. If no new tubes get made, we’ll eventually run out of good tubes. If you only use your TV/monitor sparingly, I think they’ll last a long time but your tube wearing out is inevitable.
I wish I had known that back in the day. I got rid of some amazing high-end CRTs all because they died prematurely, when it was probably a bad capacitor or some other small problem while the tube itself was immaculate and barely used; especially a 21" Sony Trinitron monitor that I gave to a recycling facility. I think the biggest problem with CRT repair is the fear surrounding it, since we all know they're high-voltage devices that can be lethal, and that stops many from even trying. And of course there's the fact that repair shops for CRTs are gone now and the people who knew how to repair them are retired or no longer with us, so you have to learn from scratch at home.
@@AJ-po6up Man, that's what gets me too. I already can't stand trying to work on a car's electrical components or batteries; can't imagine how dangerous a CRT is.
Shadow detail in darker scenes is one of a CRT's strongest points, especially compared to LCD. Not sure what was going on with the Dell here, but it makes me wonder if that little adapter he used had something to do with it.
The thing he is talking about is because he has a very bright room. Think of it like, as an exaggeration, pointing a flashlight at a CRT at an angle: you are lighting up the screen itself from the front. The fact that he didn't realize this and compare them in a dark room is rather odd if you ask me.
The crushed black levels should be able to be fixed in the monitor's settings. The brightness setting on mine controls the black level, and contrast controls the white level. Try to set the brightness so that black is right on the edge between black and an extremely dark gray. Also CRTs generally look their best in a dark environment so the display itself being less dark isn't an issue (you can shine a flashlight at it to see what I mean, it makes black look like a light gray.)
I'd actually recommend something else: set contrast between 80-90% and brightness to 30%. Put a fully white picture fullscreen, and then zoom out the image on screen. Increase the brightness until you see an edge between the unlit area of the screen and the black "overdraw" area. Once you see that, reduce the brightness until it goes away again. Now zoom the image back to your normal settings and, if using Windows, run the display calibration tool to sort out the gamma and colour. I get the best results by first making sure the monitor is set to the middle-most display temperature setting (something like 5600K, I think). sRGB will usually crank the contrast to maximum, which will wear the phosphor out quicker.
@@blunderingfool Since he's using an adapter for VGA, Windows might have HDR enabled for that monitor connection. For native VGA Windows doesn't normally let you enable it, but that adapter might be confusing Windows. So it's a good idea to check the color settings and make sure HDR is off. HDR is not a thing with CRT monitors, and that can result in blacks being too dark as well. ;)
@@MagnumForce51 There's also the chance he's using an HDMI to VGA converter. HDMI is known to have 2 different black level settings, for TVs and for monitors. When the wrong one is used, the black levels and highlights are crushed.
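For anyone curious what that mismatch does to the actual pixel values, here is a minimal Python sketch of the limited-range (16-235) vs full-range (0-255) issue described above. The 16/235 codes are the standard video levels; the sample inputs are just illustrative:

```python
# What a display (or converter) expecting limited-range video does to each
# incoming code: expand 16-235 out to 0-255 and clamp anything outside.
def shown_by_limited_range_display(code):
    return max(0, min(255, round((code - 16) * 255 / 219)))

for code in (0, 8, 16, 24, 128, 235, 255):
    print(f"input {code:3d} -> displayed as {shown_by_limited_range_display(code):3d}")
# Full-range PC output fed to that expectation maps codes 0-16 to the same
# black and 235-255 to the same white: crushed shadows and highlights.
```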
@@rdoursenaud Oh yeah, forgot about that. Some HDMI converters can be kinda crappy, which is why I have been reluctant to get a modern graphics card. Currently got a 780 Ti. I recall the Titan X was the last Nvidia card that had native VGA output (via the DVI-I port with a passive adapter).
@@raven4k998 the frame rate isn't at all the benefit. The response time and motion performance is the difference. A modern display can beat CRTs in frame rate alone. The point is that latency is extremely low even with a low frame rate
@@blunderingfool Ages??? Nah, it was just one console generation, the Xbox 360 and PS3 era (2005-2013). Sure, it was the longest-lived generation of consoles, I'll give you that, but still just one generation. The Xbone and PS4 both aimed for 60 fps. When the 360 launched, LCD and plasma TVs still cost an arm and a leg and made up only 12-14% of TVs in homes. Hell, the 360 didn't even have HDMI when it launched in 2005. Netflix still rented out DVDs for the most part and hadn't funded a single self-made show yet, Blockbuster was still relevant, YouTube hadn't been bought by Google yet and was barely catching on, and 720p LCD TVs sold for upwards of 4000 dollars. So yeah, why would they aim for more than 30 fps that gen when most households were still on CRT? The next-gen consoles after them got hammered by critics because they would drop resolution down to 1600x900 to maintain 60 fps. So no, it didn't take them (game devs) ages to catch up; it took hardware manufacturers (Sony and MS) ages to replace their aging consoles. Those consoles couldn't do 720p at 60 fps, so no way in hell were they gonna do 60 fps.
Most old console games were running at 60 FPS, it was just an interlaced signal, and CRTs were absolutely amazing at handling that. My Sony Trinitron looks better than my 4K TV when it comes to old games, and that's 100% because those games were designed around a CRT's natural pixel blur, due to how it works.
@@joeykeilholz925 Frame rate is absolutely the benefit if you can save resources spent on it to draw a better picture. Latency affects response time equally regardless of the frame rate, which is something people have a hard time grasping.
The degaussing and how long it took to warm up really hit that nostalgia I was after. I didn't buy an LCD until they were pretty well established tech, both for budgetary and performance reasons, and it took a long time for them to equal the quality of an average CRT. Great video as always!
same dude I was a poor kid forced to use a shitty sony CRT and I DESTROYED online because everybody switched to LCDs early on and had tons of input lag. This video made me feel heavy nostalgia for CRT because I honestly enjoyed those times and CRT does feel really nice and snappy. Better than even my nice modern screens.
I had a Compaq CRT that I bought second hand around 2001 or so. It was a HUGE screen as far as CRTs go and probably weighed around 60 lbs... but it was originally built as a high-end professional graphic design / CAD monitor... and damn did it bring the smooth gaming thunder when you could crank the refresh rate. I stopped being Sweaty McTryhard on the gaming side around 2006 or 2007 and eventually went with LCD for the space savings... but as far as lack of input lag and insanely smooth refresh go... it would still be competitive today.
@@control_the_pet_population Oh man, mine was like a 24 inch television set with the color badly off in the upper corner 😅 Somehow I was in the top 10 spots on the leaderboards of SnD in CoD. I quit all that nonsense a loooong time ago, a few years later than you, but man, those were the days weren't they
Yeah, it is so nostalgic to see a CRT working again. Mind you, that was how I learned IT: from Windows 95's training software running on the old CRT of the first computer I grew up on.
I still have an old CRT monitor from before I was even born 21 years ago, and it still works like a charm. We have this old 32" JVC CRT TV that we found on the curb too, working perfectly fine with a little tilt inward on the top right but barely noticeable, and it's still a pleasant experience. I love old tech. The whine is normal.
Yeah, not really. I managed to get about 2h of screen time on a high-end Sony CRT without getting a serious headache. Today I can spend the whole day in front of the monitor without my head exploding and without my skin feeling like it's going to peel off any second.
@@smeezekitty True. After 25 years we're finally at a point where LCD image quality rivals or beats CRT. The greatest advantage of LCD is size; imagine the bulk of a 65" wide-screen CRT.
I downgraded from a 21" (1600x1200 native) Dell flat-screen CRT (Trinitron tube) to an Acer 1680x1050 LCD and I hated the LCD for so long. I actually went back to the CRT for a while. I think eventually I forgot how much I loved that CRT, but this video brought back some memories for sure.
Honestly, I think I lucked out with my first LCD. It was a 19", 75 Hz screen. Obviously it wasn't as good as CRTs, but it had a fast response time and it wasn't freaking 60 Hz. That was in 2005. It wasn't until like 2013 that non-60 Hz LCDs started to become more common. It was almost a decade of painfully inadequate screens.
I had one of the latest CRT monitors for a long time. It was a Philips and it had a refresh rate of either 120 Hz or 145 Hz (not sure) with a lot better blacks. It was absolutely HUGE, bigger than equally sized CRT monitors, but it had a high res with a high refresh rate and better blacks; it was pretty amazing actually. I used that screen until like 2013 or so. Sometimes I wish I still had it.
i had a philips that sounds similar to that one, you could crank down the resolution to 640x480 and get like a 200+hz refresh rate out of it or crank the resolution up to 1600x1200 at 30hz, it was a beast that LCDs haven't really caught up with
Y'know, as I get older I kinda want to get a retro PC and play some old RPGs like Fallout, Baldur's Gate, and Jagged Alliance 2 on it. Experience what those games were like on the hardware that was around when they were released. @@Grau85
High-end CRTs were absolutely incredible. I still remember when the first LCDs came out and they had atrocious colors and motion blur; I found it hard to understand why people went to LCDs so quickly. Even now LCDs aren't that good, and even OLED cannot beat the motion clarity of CRTs. Maybe with the new QLED technology we will finally see the motion clarity of CRTs again.
Crazy to think that in college I used a Sony PVM at work to help convert VHS tapes to DVD with a video capture PC. And now retro gamers are picking up these exact monitors like they're going out of style.
I've heard Black Frame Insertion can help with motion clarity on LCDs but honestly I've never had luck with it. Even at 120hz BFI dims the screen too much and is noticeably flickery.
@@VexAcer Flickering is a result of BFI; the only way to mitigate that would be either less aggressive BFI settings or higher refresh rates. That said: my first high refresh rate screen was a BenQ 1080p 144 Hz TN, one of the first with BFI support. The motion clarity was insane on that thing, not quite as good as a CRT, but still really good. I often turned it on when playing Warframe and you could see each individual frame, without blurring. To give you an idea how good it was: imagine flying past an enemy with a name tag above him, and you can only see the name tag for 2-3 frames and still read it perfectly without any blur.
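Back-of-envelope numbers for that trade-off: lighting the frame for only part of each refresh cuts hold-type blur in proportion, but average brightness drops by the same factor. A small Python sketch with illustrative duty cycles, not measurements of any particular monitor:

```python
# BFI / backlight strobing: each frame is only lit for duty_cycle of its slot.
def bfi(refresh_hz, duty_cycle):
    frame_ms = 1000.0 / refresh_hz
    persistence_ms = frame_ms * duty_cycle   # how long each frame actually glows
    return persistence_ms, duty_cycle        # duty cycle ~ fraction of brightness kept

for hz, duty in [(120, 1.00), (120, 0.50), (144, 0.25)]:
    p, b = bfi(hz, duty)
    print(f"{hz} Hz at {duty:.0%} duty -> {p:.2f} ms persistence, ~{b:.0%} of full brightness")
# Getting anywhere near a CRT's ~1 ms persistence means keeping the panel dark
# most of the time, which is exactly the dimming and flicker people notice.
```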
Like others said, the crushed black levels are something you'll have to adjust in the monitor's settings. I think once you get it dialed in, you'll be impressed by the quality of blacks in the image. As for the screen shutting off, I wonder if it has to do with your VGA to modern input solution? The monitor may have trouble recognizing the signal periodically for some reason, so it's dropping the image.
@@Pazuzu- Okay, but an issue like this can be highly dependent on the user's setup specifications and settings, as well as the particular hardware itself. I imagine there are many different VGA adaptors on the market, so the exact cause of his issue could be very specific to how his individual adaptor works, or how it interacts with his specific setup. No offence meant here, but your comment is like if someone was having problems with the transmission in their 2002 Honda Civic and someone else responded with "The transmission in my 2008 Toyota Camry works fine!". Like... sure, that's great, but how much does that assist us here? If we were talking about verifiably similar or even identical setups with confidence, sure, but we're not, so... you see what I mean?
5:20 - "85 Hz on a CRT looks like 240 Hz on an LCD" As someone with the Samsung SyncMaster 997MB, I 100% agree with that statement. High-end CRT monitors are basically budget gaming monitors if you can find one for cheap.
I have a 955DF, and the image quality is absurd; I can't believe it can do 4:3 1440p (almost: 1856x1344). If I use the typical 1600x1200 at 67 Hz, it still looks better than any 100+ Hz LCD I've ever seen; all of them look absolutely garbage in motion. There's no aliasing either, and the colors beat my AMOLED phone (for extremely bright colors like the highlights in Tron Legacy, my AMOLED struggles with color saturation, meanwhile my CRT is basically analog HDR).
I've seen many WOLEDs and QD-OLEDs in person; they look worse than AMOLED and CRTs. WOLEDs in particular struggle with warm colors; QD-OLEDs are better, but they look oversaturated. AMOLED screens are my favorite: pure RGB, no quantum dots, crazy pixel density (no aliasing). My 955DF was abandoned for 10+ years; I turned it on and the colors blew me away. Looks like it was perfectly calibrated for graphics design. I've seen Mac screens and they look the same, and they are calibrated for graphics design as well. My 955DF aged like fine wine, what a timeless beast. I use it for Blu-ray movies, YouTube, work, gaming; it's a perfect image.
This is Dawid's redemption arc. Nice to see you giving the PC monitors a shot. It's always cool to see CRTs getting some love. There is only one issue I have with the video: you didn't test the monitor in the dark. That's when brightness becomes less relevant and the black levels look actually black. If you want to have lights in the room, then never expose the tube to direct light. If you avoid that, the grey color of the tube won't be as noticeable ^^
By the way, I think that converter is causing the blanking. One of my converters starts doing that when too much is on screen while the refresh rate is high.
The "eeeeeeeeeee" noise is the 15.625 kHz (15,625 Hz) whine from the high-voltage transformer putting out 25 kV (yes, 25,000 V). This was true for TVs back when I was a kid fixing them!
Man, back when I was a teenager I worked as a "technician" for the school computer lab in exchange for a discount, and my favorite thing to do was degauss monitors. I remember receiving a pallet of Dell computers that were all set to 240 V and thinking "omg, they are all broken". Simpler times. I remember the old guy teaching me how to set an IP address because he didn't know DHCP existed.
DHCP didn't work very well on large (8+ machine) network setups, especially on Windows XP. It would just freak out and assign a 169.254.something.something address, so manual IP addressing was the thing back in the day for stability reasons. I mean, not everyone (schools especially) could afford Cisco equipment, and the cheaper stuff had its glitches.
Man, CRT gaming takes me back to LAN parties, and I genuinely miss it. I hope CRTs make a comeback, just more compact and with modern connections. They feel incredibly good to play on.
I remember at work in the mid 90's we were given Compaq 21" CRT monitors. We found we could crank the refresh rate from the default 50Hz (you could spot any sub-60Hz monitor if you vibrated your tongue against the roof of your mouth, as they flickered like crazy... top tip) up to 85Hz. So we did. Everyone was very happy. Cue masses of dead monitors a few months later... No, we didn't let on.
@@GraveUypo Yeah, this was the mid 90's and we weren't tech people. The IT team had left most of them at migraine-inducing 50Hz, so go figure. But if they can't hack it, then they shouldn't let you set them that high. But at the end of the day... they weren't our monitors.
Very probable. Sadly, I think the last GPUs with a DVI-I output (which includes DVI-A, analog over DVI, and can be fitted with a passive VGA adaptor) were the Nvidia 9xx series.
The thing that fascinates me about CRTs is that, if you look at them in slow-motion footage, every frame is drawn by a single dot zigzagging across the screen from top to bottom 😮 Edit: saw that in a SlowMoGuys video. I had no idea, and I grew up with CRTs everywhere in my house 😂
That high-pitched whine that you heard on the CRT TV has a pitch of about 15.6 kHz, which is consistent with the horizontal scan rate of a 480i CRT. All CRTs at that resolution make that sound, and it doesn't mean the little Sony is close to dying, thankfully. As we age, most people lose the ability to hear that frequency; it lies just outside the average adult's range. The fact that you can hear it is good news that you still have good ears! And your TV is fine!
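Where that ~15.6 kHz figure comes from: the horizontal scan rate is just total scanlines per frame times frames per second. A quick Python check; the SD line counts are the broadcast standards, while the ~806 total lines for the PC mode is an assumed typical figure including blanking:

```python
# Horizontal scan rate = total lines per frame * frames per second.
modes = {
    "PAL 576i (625 lines, 25 fps)":           (625, 25.0),
    "NTSC 480i (525 lines, ~29.97 fps)":      (525, 30000 / 1001),
    "PC 1024x768 @ 85 Hz (~806 total lines)": (806, 85.0),
}
for name, (total_lines, fps) in modes.items():
    print(f"{name}: {total_lines * fps / 1000:.3f} kHz")
# The SD modes land right around 15.6-15.7 kHz, inside the audible range;
# PC modes are up near 68 kHz, far beyond hearing, hence no whine.
```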
The display sometimes going off might be due to your HDMI to VGA adapter, or maybe some power supply circuitry is aging inside the CRT, which would be very fixable!
The flickering may very well be due to the adapter, not the monitor itself. I've had this happen with cheap HDMI to VGA adapters. Good quality DP to VGA adapters tend not to do this.
In my personal experience, when CRTs start doing that "off and on again" thing, it's usually a sign of the flyback arcing, which could be because there's too much dust caked onto the boards inside the monitor, or because the flyback is dying. Unfortunately, if the flyback is dying, the only way to get a replacement that I know of is with a donor board. If it's because of dust, a good cleaning and reflowing of the solder will fix it.
Personally, I wish they had kept up with plasma development. A 4K plasma at over 100 Hz would've been very affordable and amazing looking compared to any LCD.
Oh thank goodness, I'm so happy you managed to do it right this time. XD Big pro tip for CRTs, by the way: if you want to run one at 1280 pixels wide, do it at 1280x960, not 1280x1024. The former is a proper 4:3 resolution, matching the physical aspect ratio; the latter is 5:4 and gives a slightly squished image... plus you can run it a bit faster. You absolutely need to 1) make sure the monitor is handling display scaling, and 2) fine-tune things in the graphics driver. I suspect the flicker and intermittent dropouts you were seeing may be due to those reasons; modern GPUs just don't behave nicely with CRTs.

Also also! Another bonus tip for anyone watching: you can get very good black levels (which Dawid did not) by zooming out the picture until you see the end of the 'overdraw' area, where there's a highlighted black versus a 'true' black. Reduce contrast to 80-90% at this point, then reduce brightness until you see no division; an all-white image on screen helps with this since you have a solid border. Once you have the black levels right, just zoom back in and adjust your geometry as needed! =D
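A quick check on the aspect-ratio point, plus why the 4:3 mode can also be driven a little faster on the same tube. A small Python sketch; the 1.05 blanking factor and the 115 kHz horizontal scan limit are assumptions for illustration, not this Dell's actual spec:

```python
from fractions import Fraction

# 1280x960 is true 4:3, matching the tube; 1280x1024 is 5:4 and gets squished.
for w, h in [(1280, 960), (1280, 1024)]:
    print(f"{w}x{h} -> {Fraction(w, h)} aspect ratio")

# Fewer active lines also leaves more refresh headroom under a fixed scan limit.
assumed_max_hscan_khz = 115.0
for lines in (960, 1024):
    ceiling_hz = assumed_max_hscan_khz * 1000 / (lines * 1.05)  # ~5% blanking assumed
    print(f"{lines} active lines -> refresh ceiling around {ceiling_hz:.0f} Hz")
```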
What even is this recommendation? Just look at a shadow clipping test and raise/lower brightness until it's set correctly... if that doesn't help, you need a full calibration/G2 adjustment.
@@Polowogs Sure, if you want your gamma and colour balance to be fucked. We're using worn out phosphor most of the time, it needs a little more care to really shine.
You can easily fix it by adding a custom grounding cable that goes directly to earth. The amount of electrical noise in an old CRT is just insane, but that doesn't mean it's unfixable.
The random dropouts are caused by a bad HDMI-to-VGA converter, or a bad connection to it. It's nothing to worry about. If you plug it in with a genuine VGA cable it'll be okay. Good luck finding a modern card for that, though.
Indeed. When YouTubers do these kinds of videos, I wish they'd just buy a 980 Ti for native VGA support; that should be the most powerful card that supports analog video natively.
@@butre. I did say either a bad converter or a bad connection to it. You can tell it wasn't the monitor, because CRTs lose brightness and sharpness when they start to fail.
Some 750 Ti cards still have the VGA port on them, like the Asus one I used to have. It's not exactly "modern", but it still has driver support for a few more years, and it can definitely run the majority of retro games and older 3D titles.
As you talked about your Japan PC not having video connectivity anymore, my monitor had a random flash of a grey area, and later again, only twice so far but maybe the radiant energy killed my monitor or GPU too.
Yep, can confirm. Old games look much better on CRT for many reasons. I grabbed an old Smasnug a few months ago, and games like Jedi Knight, Baldur's Gate, Half-Life, Quake, Doom, Blood - all look really, really good on it. I even tried to play Cyberpunk 2077 at 1024x768 and it ran really well, even in RT Overdrive on my 3080, lol. The only game that didn't like it was Quake 2 RTX, because it didn't like my multi-monitor setup :D
Oh nooo. Not the little Japan PC! Also, I was significantly more impressed with this CRT than the previous one. But the eye fatigue is serious. Also Dawid's sketch about haters was so funny. Such commitment with that spit-take haha :D
I remember there were special light-filtering screens you could hang over your monitor to cut down the brightness. They were simpler times, haha...
@@joefish6091 Eye fatigue for me has only ever been with low refresh rate monitors lol. 60-70Hz feels like someone is throwing thumbtacks into my eyes, but as long as it's 86Hz and above, I can use it for HOURS and not notice a thing haha. Feels buttery smooth.
The CRT is THE way to go for your retro gaming, especially 98/XP gaming, albeit not for the newer stuff (and DOS gaming is a given). And yes, Dawid, that CRT is sadly not long for this world unless the capacitors get swapped. At least, in my experience, that's a sign of the capacitors going bad, which in turn burns up the projector. If it just goes kaput all at once, in my experience, about 90% of the time, it's the fuse, which usually costs less than a buck. I saved so many of them from the trash and eBay'd them lol.
When I replaced my CRT back in 2006 or so, it was 100% due to the fact that LAN parties were still a thing in my group of friends, and an LCD was infinitely more desirable in this situation. That was it. I had no care for color, refresh rate, or anything like that. It was just due to size and convenience, and I would imagine that accounts for the vast majority of people who upgraded at that time. Little did we know we doomed ourselves to 20 years of worse image quality that's only just now catching back up to where we left off. Imagine what gaming might be like if CRTs had instead been continued to be developed on for the last 20 years. Sure, they might not ever be pocketable, but I imagine they'd be far thinner and lighter in size by now, and drastically better in terms of image quality.
Man, you know what's sad? Just a few years ago my mother was cleaning the attic and asked me whether we still needed the old CRT monitor we had. I decided to turn it on again to see if it worked, and found out that my monitor had had a 75Hz refresh rate the whole time and I didn't know it. 75Hz for some reason felt so smooth, I was really, really surprised.
Oooh! I have the iiyama Vision Master Pro 455 (Or, MM904UT), 18inch viewable, great colour and black levels. The phosphor is still in great condition and it looks great at 1600x1200@75hz.
LCDs do, but CRTs don't know exactly where the image they're showing ends up. They can estimate it based on the signal timings, but being analog, it isn't always accurate.
One thing where CRT is much better than any type of modern 2024 TV or monitor: CRT has no latency, no ghosting, and no motion blur. Which is crazy, because modern technology should be much better.
Back then the gaming universe was really awesome and immersive. The quality of games was pretty high while people were using CRT monitors (early 2000s). Considering the gaming industry went into permanent decline in the last decade, playing games on a CRT monitor was a unique, unforgettable experience. It still is :)
CRT monitor lol, "unique, unforgettable experience", ok dump your modern monitor now. I will tell you this: they were clunky, too big, hurt your eyes, took ages to turn on, and used too much power as well. I love my flat-screen 32 inch curved monitor. Oh, and a CRT monitor is curved away from you, not in. As for the gaming industry, yep it went downhill, but remember this: people were new to games, so not many knew what a good game was. Now people know.
I really missed the high refresh rate of CRTs when I switched to LCD in 2007. I used a Sony that allowed 120hz at 1024x768, it was incredible. The colors were much more pleasant, 100% black tones, but I was forced to switch to LCD because my pair of monitors literally cooked my eyes, and at the end of the day, I was looking like a stoner. LCDs allowed me to stay in front of the screen up to 24 hours a day without headaches, or "sand in my eyes". Only now, 16 years later, are there LCD monitors with decent refresh rates, and some color depth at affordable prices, but even so, I feel like it's not as good as CRT.
I've been watching your videos for pure entertainment for a while now. And I belong to the (perhaps few?) ones that played games in the 90's (and early 00's), and it's impressive to me what kind of performance you're referring to these days. I remember picking up a 21 inch CRT from a storage locker and I was the envy of my crew! It was MASSIVE! And nowadays you have more RAM in your graphics card than I had in my hard drive; one GB was HUGE in 1995! Keep up your charismatic attitude, that is what will gain you views; content is second! Cheers from Sweden!
My theory is that your CRT might have a cracked solder connection on the high voltage circuit and that is why it's cutting out every once in a while. If I am right about that, then that could be fixed.
Motion rendering? These analog CRTs don't do anything except display the signal that comes through the cable, pixel by pixel, line by line. There is zero processing. And also zero delay. It's instant and direct. I had a flight simulator running on 3 x 20" CRTs that together weighed about a metric ton.
The grey display background on the CRT is from the light you are shining onto the monitor. If you are in a dark room, there is no grey background on CRT or plasma monitors.
The black crush could be caused by:
a) shining light over the screen - better to keep subdued lighting behind the monitor
b) a bad adapter - idk about DisplayPort, but like 99% of HDMI→VGA adapters are not doing full RGB properly, so whatever you do it'll lose colors compared to VGA→VGA. But in a pinch, setting the output in the Nvidia control panel to 'YCbCr444 limited' instead of RGB helps somewhat.
c) brightness on the monitor - 'contrast' is like "luminosity" or power, and 'brightness' is like a black-level/gamma setting. I keep contrast @ 100 and brightness @ 40-50.
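A toy model of what those two knobs in point c) do, under the usual convention that contrast acts as gain on the signal and brightness as a black-level offset; the numbers are purely illustrative and the model ignores gamma:

```python
# Relative light output for a 0.0-1.0 input signal, clamped at zero.
def crt_output(signal, contrast=1.0, brightness=0.0):
    return max(0.0, contrast * signal + brightness)

for label, offset in [("brightness too low", -0.10),
                      ("about right",         0.00),
                      ("brightness too high", +0.10)]:
    shadows = [round(crt_output(s, 1.0, offset), 2) for s in (0.00, 0.05, 0.10)]
    print(f"{label}: near-black inputs 0/0.05/0.10 -> {shadows}")
# Too low and the first few steps all clamp to 0 (crushed shadows);
# too high and "black" glows grey instead.
```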
@@startedtech CRTs are curved, so light reflects off the edges. The shine can be really annoying, especially if you have the sun coming in through a window and hitting it on one side. Because the image appears more "in the screen" due to the thick glass, it can really look washed out on one side. I don't usually like a matte screen, but with a CRT they are useful unless you're sitting in total darkness or without side reflections. Even something like a desk lamp trashes the image.
@@u3pyg Thankfully it doesn't matter, since the PC only sees a digital display via the adapter; the adapter then feeds the analogue signal to the monitor.
I used a Dell (Sony) Trinitron 8 years into the introduction of LCDs to the market. I lugged that same exact monitor to countless LAN parties to play Battlefield 1942. It was quite the feat for a short high schooler. Thanks for taking me back to a very special time in my life, when I had friends who would always be down for slumber parties, pizza, soda, LAN parties, and riding dirt bikes and quads.
Oh man Battlefield 1942! Loved that game. I made mods for it for well over 10 years. Battlefield Heroes'42 and BF2'42 were the last 2 mods I made for that game before I finally moved on. :P
I bet that USB-C to VGA adaptor added some latency. I wonder what the last video card to have a DVI port (with the analog pins for the native adaptor) was...
For Nvidia it would be Maxwell, I believe, so a 980 Ti or Maxwell Titan X would be the fastest cards available; for AMD it's the R9 380X, I believe, but I know less about the older AMD cards since I didn't own any during that period.
The fastest card I own with the analog DVI port is my GTX 980 Ti Lightning from Msi. I have a much better card in my main rig but I love having this card for my CRT. GTX 970s and 980s are getting dirt cheap now so those are solid options for gaming on a CRT
@@AJ-po6up No, it's still very slow. A CRT draws each scanline in microseconds. It's physically impossible for BFI and strobing tech to match how fast the magnets on a CRT change the position of the electron beams.
@@AJ-po6up The issue with "close" is that close can never hit 0; it can only approach it. At that point, though, you'd need 1000 frames a second as well. No game runs at that, especially no modern game, which is the big issue with high refresh rate panels. 500Hz is fast, but unless you have 500 frames it doesn't really matter. If you are locked to 60fps, which many games are, you might as well be on a true-to-spec 60Hz panel (frame pacing is better, but that's a micro factor). For this reason CRTs will always be king, until an entirely new tech is created.
6:23 The reason for this is that these older games were made with CRT displays in mind; the display would smooth out the rougher-looking parts that you weren't really supposed to see! The best examples of this I can think of are Dracula's talk sprite in SotN, where his eyes are just one red pixel each but the CRT blends them to look more natural, and the waterfalls in Sonic 1-3, where it does a similar thing to give them a natural-looking transparency.
The problems I had with CRT gaming were the screen flicker, the huge back for the tube, the high-pitched whistle that comes from most CRT screens, and the headaches it caused. But CRT does look sharper even at lower resolutions. I think LCD looks more jagged because of the pixel layout.
Hearing that whine only means that the CRT is displaying a standard-definition resolution, i.e. 240p or 480i. SD resolutions have a horizontal scan rate of about 15kHz, which is inside the human range of hearing. The monitor does not whine because the higher resolutions it displays have horizontal scan rates above 31kHz, outside the human range of hearing. Therefore, the whine does not indicate it is dying.
Ngl, there is just something about CRTs that will always make me love them. They just have that nostalgic vibe to them. These days everything is so perfect and you can spend 5K-10K on the perfect setup with a 240Hz 4K OLED display, but there's just something about CRTs that will always be awesome to me. They should bring back a modern run of CRT displays, that would be pretty cool imo.
When I was a kid around 2000 or so, my friend had his Dreamcast hooked up to his otherwise unused Commodore 64 monitor. It was one of my favorite gaming experiences. Everything was clear, colorful, and really showed off Dead or Alive’s independent boob physics. Extremely important at that age.
Yeah, those old CRTs had so much smoother frame rates than nowadays' crappy LCD, LED, and OLED displays; they just cannot compete with those old beasts for smoothness in gaming. I wish they still made CRT monitors 😭
I truly wonder how many of us grew up to be boob men just because of DOA
@@iDreamOfWeenie Pre DOA= Pam Anderson.
...independent _what_ physics?
@@greatwavefan397 and a new DOA fan is born
_beautiful_
I genuinely wish we could get a company to produce new CRTs - just once!
When there is demand, there will be supply. Don't worry, CRT stuff will get made again by some niche companies but don't expect them to be cheap lol :P
@@Saturnit3 It's pointless
@@joefish6091 Do you mean CRT? Some people like them, it was a part of our childhood.
CRT televisions are being re-manufactured in 2023, including replacement parts. However, it's doubtful that CRT monitors will experience the same resurgence. The costs associated with manufacturing, tooling, and various regulations are considerably high, making it a challenging endeavor.
Their prices can vary from £80 to £300 (depending on size) when re-manufactured. Honestly, customers are better off purchasing a pre-owned one from eBay for a lower cost, rather than burning a hole through their wallet.
See, the interesting thing was, CRTs died off because they were so heavy and bulky, not to mention very expensive to manufacture. The cost of everything was just increasing, so they sought out a different technology and settled on LCD. It just took us this long to catch back up.
Many CRTs would have a capacitor that would fail.
You also had to live with the fact that you were sitting in front of a giant electron gun. After a while, particularly with large monitors, you could actually get what was essentially a sunburn.
Well, and people were just infatuated with a flat TV screen. You were the poor neighbor if you still had a CRT in your living room. It started with plasma screens, which were cool but had terrible burn-in, then went to LCDs and so on.
@@escapetherace1943 On the TV side, yeah. But when it came to monitors, they were just too expensive to keep making. CRT TVs were noticeably lower quality than CRT monitors.
@@escapetherace1943 My 1080p 2010 plasma is still going strong in my living room. It's better than any LCD I've ever owned, and unlike the LCDs of the era, it has almost no input latency. I remember the 40" 3D Samsung I bought in 2013; that was the laggiest screen I've ever owned, with its lowest latency being 60ms in PC mode. I dunno how I managed to play Team Fortress 2 on that thing for two whole years before getting a 144Hz screen.
I think the reason the screen keeps turning off and on is the HDMI-to-VGA conversion. It usually happens whenever there is a sudden change in refresh rate, or something about the color limit of the monitor.
you can remove or bend the EDID pin (pin 12) on the vga cable and this problem goes away lol
Maybe he's using a converter with no power supply cable; that could be what's turning off the monitor. I had this issue; when I bought another converter with a power cable, the problem was gone. My LG CRT monitor does 1440x900 at 75 hertz, 1920x1080 at 60 hertz, 1360x768 at 86 hertz, 800x600 at 120 hertz... Sure, you can pick other resolutions and refresh rates by setting custom resolutions; these are the ones I mostly use for different games, and the image quality is so much better than any modern display. Games run much nicer than on shitty modern displays.
Yeah, I once had the exact same issue. Thought for sure it was the capacitors on the power supply of one of my monitors (had 3 back then), but it was apparently the DisplayPort to DVI adapter that was causing the issue, not the display. So all in all, I'm pretty sure I would first go and check another adapter, give it a go, and see if it works better before you put the monitor to shame. ;) Maybe it will have years and years left to kick ass.
@@Postal2Dude intriguing.
Part of the issue with gamma is that most new games aren't developed with CRTs in mind at all, so they work within the limitations (and advantages) of LCD and OLED panels, which is why older games were easier to dial in compared to Escape from Tarkov, for example.
As others have mentioned, you need to adjust brightness, not contrast, to fix the crushed black levels, and only then adjust in-game gamma options to suit the individual game.
Also, he should be changing the settings on the monitor first and then on the PC afterwards if he wants it brighter, rather than doing it all in the game and display settings first.
The first thing he should have done is turn off the lights and play at night. Then he would have seen that the black is truly black.
This is ALSO why old gaming consoles work SOOOO much better with CRT screens of any kind vs LCD/OLED etc.! They are designed for interlaced, not progressive, output: 1080i vs 1080p as a screen resolution. Both are HD, but the WAY they read and display it is VERY different depending on the source and the screen type on the output.
After all, he did that for the oled so....lol@@BOYSSSSS
Also, CRTs are not meant to be used under studio lighting; the screen is only as dark as the room, so the lights are making it look greyer than it usually is.
Imagine there was a point in time where I couldn't even give CRT monitors away for free.
oh there was. now it may soon get to the point where they might have to make new ones.
We need a benevolent billionaire to throw away 200 million dollars by refurbishing the old factory Trinitrons were made in and cranking out the highest-end models they ever made, at a large loss.
Yes, I've thought about this also.
I kick myself in the ass thinking about the CRT monitor I put a free sign on and left outside when I moved cross country years ago.
That time is right now. 99.9% of people don’t want CRTs. I don’t want an old ass monitor that takes up half my desk.
Quake II RTX on an old CRT monitor attached to a modern PC is some kind of timeline colliding stuff.
Clearly, Dawid is actually a Time Lord.
😂😂 so true
Just so you know, pin number 12 on the VGA cable needs to be removed; that's what is causing the screen to lose sync. It's an issue with the VGA-HDMI adaptor. I would suggest buying a VGA extension cable and removing pin 12 from that, so you aren't damaging the original VGA cable. The image will be stable and look crisper.
he thought the crt was dying and it turned out it was just the pc dying🤣🤣🤣
too bad he doesn't do a follow up to this video, didn't realize this video was quite old.@@SaraMorgan-ym6ue
Dude, you just helped me A LOT. Been having a flickering issue using an HDMI->VGA adapter and had no idea why.
Probably best to just buy a better adapter. HDMI to VGA adapters generally can't do the maximum refresh rates; the StarTech DP2VGA is cheap and easy to get, and it will do high refresh rates. Another thing he neglected to do in this video was set up custom resolutions; to really get the most out of higher-end monitors, you have to set up custom resolutions.
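A rough illustration of why the cheap converters cap out: the DAC in the adapter has a maximum pixel clock, and a mode's clock is roughly total pixels per line times total lines times refresh rate. The blanking overheads and the 165 MHz / 350 MHz-class limits below are assumed ballpark figures, not the spec of the DP2VGA or any other particular adapter:

```python
# Pixel clock estimate: active pixels plus assumed blanking overheads.
def pixel_clock_mhz(width, height, refresh_hz, h_blank=1.20, v_blank=1.05):
    return width * h_blank * height * v_blank * refresh_hz / 1e6

for w, h, hz in [(1024, 768, 85), (1600, 1200, 85), (2048, 1536, 80)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fine on a ~165 MHz dongle" if clk <= 165 else "needs a ~350 MHz-class DAC"
    print(f"{w}x{h} @ {hz} Hz -> ~{clk:.0f} MHz pixel clock ({verdict})")
```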
I want his monitor so bad, trying to find one somewhere though is a pain in the arse since shipping is ridiculous for any CRT monitor these days.@@Ty-sm9cv
If I recall, some of the most modern CRTs could auto-resize the screen too, so you didn't have to do it manually every time. They also started quicker, though sometimes a slow startup was a sign of wear...
I didn't have one of those; what I had was a regular CRT that you adjusted once and it was saved in memory. It would save at least 10-15 different resolution and refresh rate combinations.
One thing about CRT monitors is that you need to play in a completely dark room. That will give you much better image (especially in dark scenes).
Yes, and you need to keep your eyes about 2 inches away from the screen and keep the monitor brightness very high. And to see how much better that is you should play a video game with extremely rapid flashing bright colors
LCD is the exact opposite, having a better image in a bright room (and it still looks horrible), but Slender Man is no longer scary. I only use a lamp with my CRT TV when I do retro gaming.
I was hoping he was gonna call the monitor a tv for jokes 😭😂
And he certainly delivered
1:40 he did.
@@namele55777 No way! The commenter certainly was not referring to that!
It's sobering how, among the variety of tech channels churning out the most cutting-edge tech vids, Dawid brings us (also bespoke, cutting-edge) relaxing vids one can enjoy with chai.
Love ya dude, keep pumping out great content!
U mean tea?
I agree with everything in this comment
Don’t watch techmoan and cathode ray dude?
@@zakkazz1201 Yeah, but why would you randomly mix in another language like that? That implies a special kinda tea.
He seems like a really lovable dude!
Really belies his alter ego as a super-villain...
But you win some you lose some :-)
So the conclusion is: we need OLED monitors to start becoming affordable.
We need OLED monitors to stop having burn in. Then I'd pay any price for one.
But the future of PC monitors is MicroLED. Samsung presented it at CES 2023.
@@kyoudaiken CRTs had even worse burn-in. It's a gamble buying an old monitor, because it'll either have the Windows taskbar or Apple menu bar etched into the phosphor, or be totally fine.
@@benanderson89 Not really; shitty consumer-grade TVs had burn-in, not the PC area, and especially not the pro area. OLED, though, has burn-in no matter the price range, same with LCDs and plasmas.
@@snintendog PC monitors absolutely do get burn in. What do you think screen savers were for, shits and giggles?
The other thing about CRTs is that you can set them to basically any resolution and they'll still look very good.
And anti-aliasing is less important too.
LOL, that's a big negative on both counts.
@@mrsleep0000 Not requiring anti-aliasing is a huge positive for performance, especially for older graphics cards. :P
As for the resolution, CRT's will always have this fuzzy, organic appearance... Which is why anything above a certain resolution, won't really affect image quality all that much.
Sure, it's not as clear or crisp as an LCD or OLED, but it's more than clear enough for you to make out what you're looking at. :P
Perfection is the enemy of good enough!
At worst, it's a small negative, not a big one. :P
Besides, you get better motion-rendering on a crt, movements look way smoother... So 90 hz on a CRT feels like 144 hz on an LCD, and so on, and so forth... And THAT is definitely a BIG positive. ;)
@@MyouKyuubi all resolutions on a CRT are equivalent to 1000 Hz on a traditional sample-and-hold display (assuming the FPS of your content matches the FPS of the CRT). The reason is that a CRT always has about 1 ms of image persistence, and motion resolution is dictated by persistence; a 144 Hz display has persistence of around 7 ms. It's pretty apparent when you have them side by side.
As far as sharpness/clarity, it really depends on the pitch of the mask/grille in the tube itself, as well as the electronics (the electronics dictate the "spot size", i.e. the size of the area where the electron beam touches down). We have X-ray monitors at my work that can do resolutions close to 4K, and they're sharper for X-ray viewing than their LCD counterparts!
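A quick back-of-the-envelope on the persistence numbers above, treating a sample-and-hold LCD as holding each frame for the full refresh period and a CRT phosphor as lit for roughly 1 ms; both are simplifications, real phosphor decay curves are messier.

```python
# Persistence math behind the "CRT feels like 1000 Hz" claim above.
# Assumption: a sample-and-hold LCD shows each frame for the whole refresh
# period, while a CRT phosphor spot is only lit for ~1 ms.

def hold_time_ms(refresh_hz):
    """How long a sample-and-hold display keeps each frame on screen."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240, 1000):
    print(f"{hz:4d} Hz sample-and-hold: {hold_time_ms(hz):5.2f} ms per frame")

crt_persistence_ms = 1.0   # assumed phosphor-lit time
print(f"equivalent sample-and-hold rate: ~{1000 / crt_persistence_ms:.0f} Hz")
# 144 Hz works out to ~6.9 ms of hold time, which is where the "~7 ms" figure
# comes from; ~1 ms of persistence is what a 1000 Hz display would give.
```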
@@Ty-sm9cv "all resolutions on CRT are the equivalent to 1000hz on a traditional sample and hold display"
Hah, not quite, that good, no. :P
Due to phosphor decay, CRTs hold the previous image a bit while the new image ALSO shows up at the same time, making the transitions between each frame soft... That, coupled with how the cathode draws one line at a time, adds another softness to the transition between frames.
This is what causes the "motion rendering" to be so good.
But make no mistake, you can still see flickering at 60 hz... At least my eyes can... i can see flicker at even 240 hz, lmao... But usually only when i'm turning fast in a game, or moving the mouse real fast.
I don't know what it is, but it's likely that i have some kind of medical condition... I am rather photosensitive... So even if i'm playing at like 240 hz, if i'm playing something competitive like Team Fortress 2, or Overwatch... I feel like i'm going blind with all the afterimages i see when i turn around.
It gets even worse when i turn on black frame insertion/backlight strobing. Good lord, i dunno how people live with that sht... xD
I know they've made some 500 hz monitors, though i haven't tried those to see if i can see any flicker in those... would like to test one out some time, to see. :3
Dawid, your exploration of CRT gaming in 2023 is a delightful thing. It's great to see you testing these old monitors and sharing your candid experiences. Keep it up
I think that the reason for the random flickering on and off is the VGA cable. I had a really cheap HDMI to VGA adapter a while ago, and it would do this while gaming. I'm no monitor expert though, so I could be wrong.
Why are all adapters so crappy? If u need an adapter for something u can't really find a decent one.
Moreread is just a cheap brand; Startech though have really good adapters for VGA @@InvictusCore
@@InvictusCore There are great ones, it's just that, as with many products, you have to sort through the good and bad ones. The Startech DP2HD20 VGA adapter is one of the best ones out there. I also have an adapter that doesn't flicker, as well.
The best option is a GTX 900 series card, they support analogue out but still get driver updates. I think the GTX 980 Ti is the best card that properly supports VGA (DVI-I).
Either that or some of the caps need changing.
I regret getting rid of my crts. Lcds were a false promise.
"at some point they're all gonna die and we're not gonna get to see crts anymore"
That physically hurt me.
with the rise in crts i won't doubt it if they bring back the crt, at least for a niche market, like a 4k 120hz slim crt
I've thought LCDs look miles better than CRTs for quite some time already, especially the semi-gloss and full-gloss ones: much better colors, blacks, contrast and brightness...
The problem with LCD is that it really needs high-resolution content, which is totally normal, since the CRT's beam naturally softens everything and makes really old low-resolution games look so much better.
The main issue with LCD is really motion, but by now there are surely some models that have it figured out and are so fast that the difference is small, though they also cost a ton of money and prioritize motion and low input lag over probably everything else.
I think I still have my father's last CRT somewhere, something around 19 inches, but my eyes still thanked me a lot when I moved to a much crappier LCD back in 2004 with less resolution, lower frequency and bad blacks... At least my eyes stopped burning playing through the night, and when I could run games at its full resolution it had much better sharpness than the CRT, since the PPI was probably about the same: the LCD, even at a lower resolution, was also only 15 inches compared to the 19-inch CRT...
@@guily6669 that is just wrong, lcds suck. it's literally the worst tech out there, behind plasma oled and crt.
wait, no. projection tvs are the worst BY far. but lcds are the worst among the non puke-covered ones.
@@GraveUypo I don't think they look that bad in terms of image quality, especially the latest VA panels giving almost inky OLED blacks on some models, very wide colour gamuts, 1500+ nits and so on...
Also, ADS Pro panels will arrive soon, which are IPS-like but with much more contrast, brightness, better blacks and so on...
LCD is definitely showing some good value at least on TVs on some models and in terms of monitors it also shown it can be quite fast with the fastest latest models.
But it sure has its downfalls which takes a lot of money to overcome by needing crazy fast hardware for fast processing, quantum dot and miniled backlight with many dimming zones and a good fast and reliable dimming algorithm and clever thinking software, but when done right it works amazingly.
It’s degauss time
That flickering could also be the HDMI/DVI to VGA adaptor. It has a certain bandwidth/refresh rate it can support.
It def is that adapter.
Seconded.
Thirded. I had a similar one I tried on my Gamecube (GCHD Mk II) and PS3. No matter what I tried it with it blacked out like that. I went through two of them like that and just wound up getting my money back. Need to find a model that doesn't totally suck, and the only good one I've heard of isn't even available anymore.
That is a good point indeed. I had a generic display adapter for my Windows Phone that I used with a CRT and the bandwith was also wonky, resulting in the same problem.
Yep I've used those adapters before, they will indeed cause a delay. The issue is locking into the vertical sync of the monitor.
Display tech is just now catching up to CRT's, yes, this is well known. I think once microled's finally get mainstream we'll finally have something better in an absolute sense.
True, and True. Can't wait, for microLED.
don't hold your breath on that one, it took OLED over 10 years to become economically viable and there still aren't many monitors to choose from yet
The only thing CRTs have going for them is their high 200Hz refresh rate. They are drastically worse at everything else. That's why CRTs died instantly the second LCDs appeared cuz only gamers cared. Actually not even they cared.
@@roller4312 It was more that LCDs were novel and cool and didn't take up your entire desk. They weren't better as displays though, worse in fact, and usually not much bigger, if at all, at first.
Where they came into their own was 19+in displays, which weren't commonly found in CRT's as they'd be too large for desktop use.
CRT's stayed in-vogue in commercial environments far longer mind. Video production and anywhere that needed very accurate color reproduction was run on CRT's for ages after they weren't a desktop norm
@@jttech44 Even the trashiest LCD is better than any good CRT; even with their crazy 5:4 aspect ratio, they effortlessly took over. Just the image clarity and proper geometry is enough to destroy any CRT. And S-IPS took the color accuracy crown almost immediately; EIZO started making them in like 2002. I suppose one could make an argument that a very expensive CRT with proper maintenance could rival a mediocre LCD in some areas back in the day, but it was unsustainable in the long run, and certainly not in 2023, or any other date really.
This video just got me missing 2 of my favorite CRT monitors that I owned wayyyyy back: the 22-inch ViewSonic and the 21-inch LG. Their picture quality was damn decent, come to think of it. But they were a pain in the A$$ to transport to LAN parties. Awesome video man.
If you get the chance, I recommend trying out a high-end CRT monitor like the 21" Eizo Flexscan F980. They feature great resolution and refresh rate. 1024x768@160Hz and all the way up to 2048x1536@85Hz - according to the manual. It's been years since I actually used one. It would probably look amazing with modern 3D graphics.
Yep those are awesome. Full HD fits in a window on mine.
The mouse cursor becomes almost invisible at that resolution though. Upside: you won't need any anti-aliasing at that pixel density. And it doubles as a heater.
I have a different one but mine can do FHD at 1080p 70hz
@@crestofhonor2349 4:3 HD 1440x1080? I have never seen one of those. Neat.
@@crestofhonor2349 70 is too low. Eyes get very tired from low hz CRT
The Eizo F980 is absurdly rare, to the point of there only being a few photos of them online.
Dawid’s gleeful reaction to the monitor degaussing is literally the same I had as a 12 year old when I first switched a monitor with that feature
I'd like to add that CRT monitors are made with relatively simple components. Replacing a few dying parts isn't an issue at all. I think CRTs are gonna be around for a looong time in caring hands.
The only thing that is depressing is that the tubes themselves are wear items :(
They get dimmer over time, eventually becoming unusable. If no new tubes get made, we’ll eventually run out of good tubes. If you only use your TV/monitor sparingly, I think they’ll last a long time but your tube wearing out is inevitable.
I wish I had known that back in the day. I got rid of some amazing high-end CRTs all because they died prematurely, when it was probably a bad capacitor or some other small problem while the tube itself was immaculate and barely used, especially a 21" Sony Trinitron monitor that I gave to a recycling facility.
I think the biggest problem with CRT repair is the fear surrounding it since we all know they're high voltage devices that can be lethal and that stops many from even trying and ofc there's the fact that repair shops for CRTs are gone now and the people that knew how to repair them are retired or no longer with us so you have to learn from scratch at home.
@@AJ-po6up man that's what gets me too. I already can't stand trying to work on a car with electrical components or batteries, can't imagine how dangerous a crt is.
“Since I got a TV last time, I got a monitor this time”
1:40 “now the TV in question…”
I honestly love Dawid’s humour 😂
Had to pause the video for a second because that made me laugh pretty hard :D
I laughed so hard at that :P
Shadow details in darker scenes is one of CRT's strongest points especially compared to LCD, not sure what was going on with the Dell here but makes me wonder if that little adapter he used had something to do with it.
the thing he is talking about happens because he has a very bright room - think of it like, as an exaggeration, pointing a flashlight at a crt at an angle - you are lighting up the screen itself from the front. the fact he didn't realize this and compare them in a dark room is rather odd if you ask me
The crushed black levels should be able to be fixed in the monitor's settings. The brightness setting on mine controls the black level, and contrast controls the white level. Try to set the brightness so that black is right on the edge between black and an extremely dark gray. Also CRTs generally look their best in a dark environment so the display itself being less dark isn't an issue (you can shine a flashlight at it to see what I mean, it makes black look like a light gray.)
I'd actually recommend something else, set contrast between 80-90% and brightness to 30%. Put a fully white picture fullscreen, and then zoom out the image on screen.
Increase the brightness until you see an edge to the unlit area of the screen and the black "overdraw" area. Once you see that reduce the brightness until it goes away again.
Now zoom the image back to your normal settings and, if using windows, run the display calibration tool to sort out the gamma and colour. I have the best results by making first sure the monitor is set to the middle-most display temperature setting (Something like 5600k I think.). sRGB will usually crank the contrast to maximum which will wear the phosphor out quicker.
@@blunderingfool Since he's using an adapter for VGA, Windows might have HDR enabled for that monitor connection. For native VGA Windows doesn't normally let you enable it, but that adapter might be confusing Windows. So it's a good idea to check the color settings and make sure HDR is off. HDR is not a thing with CRT monitors, and it can result in blacks being too dark as well. ;)
@@MagnumForce51 Ooh, there's a good point, too!
@@MagnumForce51 There's also the chance he's using an HDMI to VGA converter. HDMI is known to have 2 different black level settings for TVs and monitors. When the wrong one is used, the black levels and highlights are crushed.
@@rdoursenaud Oh yeah forgot about that. Some HDMI converters can be kinda crappy which is why I have been reluctant to get a modern graphics card. Currently got a 780Ti. I recall the Titan X was the last Nvidia card that had native VGA output (via DVI-A port with passive adapter).
This is why, with CRTs, 30fps console games were a FAR better experience than with modern TVs.
@@raven4k998 the frame rate isn't at all the benefit. The response time and motion performance is the difference. A modern display can beat CRTs in frame rate alone. The point is that latency is extremely low even with a low frame rate
Unfortunately, game devs didn't get with the program for AGES and target 60fps once we were on LCDs.
@@blunderingfool ages??? nah, it was just one console generation, the xbox 360 and PS3 era (2005-2013), and sure it was the longest proliferated generation of consoles, i'll give you that, but still just one generation. the xbone and PS4 both aimed for 60fps. when the 360 launched, LCDs and plasma tvs still cost an arm and a leg and made up only 12-14% of TVs in homes. hell, the 360 didn't even have HDMI when it launched in 2005. Netflix still rented out DVDs for the most part and had not funded a single self-made show yet, Blockbuster was still relevant, YouTube hadn't been bought by google yet and was still barely catching on. and yeah, 720p LCD tvs sold for upwards of 4000 dollars. so why would they be aiming for more than 30fps that gen when most households were still on CRT?
the next gen consoles after them got hammered by critics because they would drop resolution down to 1600x900 to maintain 60fps.
So no, it didn't take them (game devs) ages to catch up, it took hardware manufacturers (sony and MS) ages to replace their aging consoles. those consoles couldn't do 720p at 60fps, so no way in hell were they gonna do 60fps.
Most old console games were running at 60 FPS, it was just an interlaced signal and CRTs were absolutely amazing at handling that. My Sony Trinitron looks better than my 4K TV when it comes to old games, and that's 100% because those games were designed around a CRT's natural pixel blur, due to how it works.
@@joeykeilholz925 Frame rate is absolutely the benefit if you can save resources spent on it to draw a better picture. Latency affects response time equally regardless of the frame rate, which is something people have a hard time grasping.
The degaussing and how long it took to warm up really hit that nostalgia I was after. I didn't buy an LCD until they were pretty well established tech, both for budgetary and performance reasons, and it took a long time for them to equal the quality of an average CRT. Great video as always!
same dude I was a poor kid forced to use a shitty sony CRT and I DESTROYED online because everybody switched to LCDs early on and had tons of input lag. This video made me feel heavy nostalgia for CRT because I honestly enjoyed those times and CRT does feel really nice and snappy. Better than even my nice modern screens.
I had Compaq CRT that I bought second hand around 2001 or so. It was a HUGE screen as far as CRTs and probably weighed around 60lbs.... but it was originally built as a high end professional graphic design / CAD monitor... and damn did it bring the smooth gaming thunder when you could crank the refresh rate. I stopped being Sweaty McTryhard on the gaming side around 2006 or 2007 and eventually went with LCD for space savings... but as far as lack of input lag and insanely smooth refresh... it would still be competitive today.
@@control_the_pet_population oh man, mine was like a 24 inch television set with the color badly off in the upper corner 😅 somehow I was in the top #10 spot on the leaderboards of SnD in CoD. I quit all that nonsense a loooong time ago, a few years later than you, but man those were the days weren't they
yeah it is so nostalgic to see a crt working again; mind you, that was how I learned IT, from windows 95's training software running on the old crt of my first computer
I had the 21 inch version of the Trinitron monitor and I loved it with all my heart.
I still have an old CRT monitor from before I was even born 21 years ago, and it still works like a charm. We have this old 32" JVC CRT TV that we found on the curb too, working perfectly fine with a little tilt inward on the top right but barely noticeable, and it's still a pleasant experience.
I love old tech.
The whine is normal.
When LCD monitors came out, image quality jumped back decades. We've been trying to get back to where we were in the mid 90's ever since.
We wanted ideal grid and flatness back then, and only then we realised how terrible the tradeoff was.
Yeah not really. I managed to get about 2h of screen time on a high-end Sony CRT without getting a serious head ache. Today I can spend the whole day in front of the monitor without my head exploding and without my skin feeling like it's going to peel off any second.
Older LCDs certainly do suck but modern ones definitely rival CRTs in quality with several advantages
@@smeezekitty True. After 25 years we're finally at a point where lcd image quality rivals or is better than crt. Greatest advantage of lcd is size - imagine the bulk of a 65" wide-screen crt
That's why I kept my CRT Flexscan as long as possible, went to Eizo S-PVA to LG OLED - skipping all the crap.
I downgraded from a 21"(1600x1200 native) dell flatscreen CRT (trinitron tube) to an Acer 1680x1050 lcd and I hated the lcd for so long. I actually went back to the CRT for a while. I think eventually I forgot how much I loved that CRT, but this video brought back some memories for sure.
I had that Dell monitor too, held onto it for years because lcd quality was worse.
honestly, i think i lucked out with my first lcd. it was a 19" 75hz screen. obviously it wasn't as good as CRTs, but it had a fast response time and it wasn't freaking 60hz. that was in 2005. it wasn't until like 2013 that non-60hz lcds started to become more common. it was almost a decade of painfully inadequate screens.
So many memories brought back just by the best setting in the monitor menu.
OOOOOH
1:40 Still proceeds to call the monitor a TV. Thank you for that chuckle.
Damn what a nostalgic, yet modern, trip! Surreal experiencing your childhood as an adult.
I had one of the latest CRT monitors for a long time. It was a Philips and it had a refresh rate of either 120hz or 145hz (not sure) with a lot better blacks.
It was absolutely HUGE. Bigger than equally sized CRT monitors, but it had a high res with high refreshrate and better blacks, it was pretty amazing actually. I used that screen until like, 2013 or so.
Sometimes wish I still had it.
i had a philips that sounds similar to that one, you could crank down the resolution to 640x480 and get like a 200+hz refresh rate out of it or crank the resolution up to 1600x1200 at 30hz, it was a beast that LCDs haven't really caught up with
I just bought a 109b40 for my retro build. Its huge!
Y'know, as I get older I kinda want to get a retro pc and play some old rpgs like Fallout, Baldur's Gate, and Jagged Alliance 2 on it. Experience what those games were like on the hardware that was around when they were released.@@Grau85
I have a 22-inch Philips Brilliance 202P4 Diamondtron CRT and I'm very happy.
High end CRTs were absolutely incredible. I still remember when the first LCDs came out and they had atrocious colors and motion blur; I found it hard to understand, why people went to LCDs so quickly. Even now LCDs arent that good and even OLED cannot beat the motion clarity of CRTs. Maybe with the new QLED technology, we will finally see the motion clarity of CRTs again.
I remember everyone going to LCDs so quickly because they gave more desk space but mostly heard "it uses less power, so better for the planet."
@@mrgw98 - Nobody in human history chose a flatscreen TV over a CRT “for the planet”.
Crazy to think that in college I used a Sony PVM at work to help convert VHS tapes to DVD with a video capture PC. And now retro gamers are picking up these exact monitors like they're going out of style.
I've heard Black Frame Insertion can help with motion clarity on LCDs but honestly I've never had luck with it.
Even at 120hz BFI dims the screen too much and is noticeably flickery.
@@VexAcer Flickering is a result of BFI, only way to mitigate that would either be less aggressive BFI settings, or higher refresh rates.
That said: my first high refresh rate screen was a BenQ 1080p 144hz TN, one of the first with BFI support.
The motion clarity was insane on that thing, not quite as good as a CRT, but still really good. I often turned it on when playing Warframe and you could see each individual frame, without blurring.
To give you an idea how good it was: imagine flying past an enemy with a name tag above him and you can only see the name tag for 2-3 frames and still read it perfectly without any blur
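For anyone wondering why BFI dims the image and flickers, as mentioned a few comments up: the image is only visible for a fraction of each refresh, so perceived brightness scales with that duty cycle, while motion clarity scales with the shortened hold time. A toy model with made-up duty cycles, assuming a simple on/off strobe (real panels modulate the backlight, but the trade-off is the same):

```python
# Toy model of black frame insertion / backlight strobing on a sample-and-hold LCD.
# `duty` is the fraction of each refresh period during which the image is visible.

def bfi(refresh_hz, duty):
    frame_ms = 1000.0 / refresh_hz
    visible_ms = frame_ms * duty   # effective persistence (lower = clearer motion)
    brightness = duty              # fraction of the original brightness that survives
    return visible_ms, brightness

for hz, duty in [(120, 1.0), (120, 0.5), (120, 0.25)]:
    vis, bright = bfi(hz, duty)
    print(f"{hz} Hz, {int(duty*100)}% duty: {vis:.1f} ms persistence, {int(bright*100)}% brightness")
# 120 Hz at 25% duty gets close to CRT-like ~2 ms persistence, but only a quarter
# of the brightness is left -- hence "dims the screen too much" -- and the strobe
# itself is what reads as flicker.
```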
Like others said, the crushed black levels are something you'll have to adjust in the monitor's settings. I think once you get it dialed in, you'll be impressed by the quality of blacks in the image.
As for the screen shutting off, I wonder if it has to do with your VGA to modern input solution? The monitor may have trouble recognizing the signal periodically for some reason, so it's dropping the image.
Yea, it's likely the adapter, but the amount of time it took to output an image is suspiciously long. My CRTs output a signal in 5 seconds or less.
@@joelrodriguez9611 They did feel a bit long, but perhaps that delay is mostly on the adapter's end?
@@the_real_Kurt_Yarish The image flickering is likely the adapter, but not the long time it took to show image when turning the monitor on.
I use both gtx 1660 ti and a rtx 2060 on VGA monitors through a hdmi/vga adapter, absolutely zero issues.
@@Pazuzu- Okay, but an issue like this can be highly dependent on the user's setup specifications and settings, as well as the particular hardware itself. I imagine there are many different VGA adaptors on the market, so the exact cause of his issue could be very specific to how his individual adaptor works, or how it interacts with his specific setup.
No offence meant here, but your comment is like if someone was having problems in the transmission in their 2002 Honda Civic and someone else responded with "The transmission in my 2008 Toyota Camry works fine!". Like... sure, that's great, but how much does that assist us here? If we were talking about verifiably similar or even identical setups with confidence, sure, but we're not, so... you see what I mean?
5:20 - "85 Hz on a CRT looks like 240 Hz on an LCD"
As someone with the Samsung syncmaster 997MB, I 100% agree with that statement.
High-end CRT monitors are basically budget gaming monitors if you can find one for cheap.
I have a 955DF, the image quality is absurd, i can't believe it can do 4:3 1440p (almost, 1856x1344).
If i use the typical 1600x1200 at 67Hz, it still looks better than any 100+Hz LCD i've ever watched; all of them look absolutely garbage in motion.
There's no aliasing as well and the colors beat my AMOLED phone (for extremely bright colors like the highlights in Tron Legacy, my AMOLED struggles with color saturation there, meanwhile my CRT is basically analog HDR).
I've seen many WOLEDs and QD-OLEDs in person, and they look worse than AMOLED and CRTs. WOLEDs in particular struggle with warm colors; QD-OLEDs are better but they look oversaturated.
AMOLED screens are my favorite: pure RGB, no quantum dots, crazy pixel density (no aliasing).
My 955DF was abandoned for 10+ years; i turned it on and the colors blew me away. Looks like it was perfectly calibrated for graphics design.
I've seen Mac screens and they look the same, and they are calibrated for graphics design as well.
My 955DF aged like fine wine, what a timeless beast. I use it for Blu-ray movies, YouTube, work, gaming; it's a perfect image.
@@saricubra2867
What is the contrast ratio of a crt ?
I don't remember seeing black even with the screen turned off
@@Charly_dvorak It depends on the screen tint; i have a black-tint CRT monitor, basically the same as the OLED but with better resolution in the shadows.
I really miss my 21" Trinitron CRT. It was such a good monitor. I wish I had kept it, but it was really bulky and heavy.
This is Dawid's Redemption Arc. Nice to see you giving the PC monitors a shot.
Its always cool to see CRTs getting some love
There is only one issue I have with the video:
You didn't test the monitor in the dark. That's when brightness becomes less relevant and the black levels actually look black. If you want to have lights in the room, then never expose the tube to direct light. If you avoid that, the grey color of the tube won't be as noticeable ^^
By the way, I think that converter is causing the blanking. One of my converters starts doing that when too much is on screen while the refresh rate is high
@@voodfernichterofficial4745
Came down to the comment to suggest the same thing; the adapter is likely the blinking culprit.
The "eeeeeeeeeee" noise is the 15.625KHz (or 15625Hz) from the high voltage transformer putting out 25KV (yes 25000V).
This was true for TV's back when i was a kid fixing them!
Yeah those capacitors are more dangerous than even power supply capacitors. Those could throw you across the room if you touch them
@@frf5000 the CRT itself is the biggest issue, as it acts like a capacitor when turned off. They can give quite the shock if not discharged.
@@EvilTurkeySlices or tickle you by making your hair stand on end.
man, back when I was a teenager I worked as a "technician" for the school computer lab in exchange for a discount, and my favorite thing to do was degauss monitors. I remember receiving a pallet of dell computers that were all set to 240v and thinking "omg they are all broken". simpler times. I remember the old guy teaching me how to set an IP address because he didn't know dhcp existed.
i still set fixed IPs to my xbox and my desktop pc on my lan. makes it way easier to forward ports and do lan stuff. everything else goes on dhcp tho
dhcp didn't work very well on larger (8+ machine) network setups, especially on windows xp. it would just freak out and assign a 169.254.something.something address, so manual IP addressing was the thing back in the day for stability reasons. I mean, not everyone (schools especially) could afford cisco equipment, and cheaper stuff had its glitches.
new CRT's built to be compatible with modern PCs could actually be really nice
Man CRT gaming takes me back to LAN parties, and I genuinely miss it. I hope CRTs make a come back, just more compact with modern connections. They feel incredibly good to play on.
I remember at work in the mid 90's we were given Compaq 21" CRT monitors. We found we could push the refresh from the default 50Hz (you could spot any sub-60Hz monitor if you vibrated your tongue against the roof of your mouth, as they flickered like crazy... top tip) up to 85Hz. So we did. Everyone was very happy. Cue masses of dead monitors a few months later... No, we didn't let on.
never use the maximum refresh you can get. if you had dropped to 75hz they wouldn't have broken
@@GraveUypo Yeah, this was the mid 90's and we weren't tech people. The IT team had left most of them at a migraine-inducing 50Hz, so go figure. But if they can't hack it then they shouldn't let you set them that high. At the end of the day... they weren't our monitors.
oof... shoulda just gone for 60, or a lower resolution.
I wonder if the converter to VGA is causing it to blank
It is
Shitty converters always have the same problem, even on LCD monitors
Yes thats likely the cause. One of my worse converters usually cut out the picture and had audio issues if too much was happening on the screen
Shitty adapters can definitely cause this, but it wouldn't explain why the pc didn't work on the OLED monitor as well.
Can confirm that shit hdmi to vga adapters cause blanking issues.
very probable.
Sadly, I think the last GPUs with a DVI-I output (which includes DVI-A, analog over DVI, and can be fitted with a passive VGA adaptor) were the nvidia 9xx series.
The thing that fascinates me about CRTs is that, if you look at them in slow-motion footage, every frame is drawn by a single dot zigzagging across the screen from top to bottom 😮
Edit: saw that in a SlowMoGuys video, i had no idea, and i grew up with CRTs everywhere in my house 😂
I mean, it IS a cathode ray tube (CRT). xD
So basically CRT's are playing Pac-Man for you to play Pac-Man on em.
You didn't learn about electron guns in science class?
Idk about him but i never did @donov25
That high-pitched whine that you heard on the crt tv has a pitch of about 15.7 kHz, which is consistent with the horizontal scan rate of a 480i crt. All CRTs running that resolution make that sound, and it doesn't mean the little sony is close to dying, thankfully. As we age, most people lose the ability to hear that frequency; it lies right at the edge of the average adult's range. The fact that you can hear it is good news that you still have good ears! And your TV is fine!
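The arithmetic behind that, for the curious: horizontal scan rate is just total scanlines per frame times frames per second. A quick check using the standard NTSC/PAL/VGA line counts (nothing here is specific to the monitor in the video):

```python
# Why SD content whines audibly but higher PC resolutions don't:
# horizontal scan rate = total scanlines per frame * frames per second.

def h_scan_khz(total_lines, fps):
    return total_lines * fps / 1000.0

print(f"NTSC 480i : {h_scan_khz(525, 30000/1001):.3f} kHz")   # ~15.734 kHz
print(f"PAL  576i : {h_scan_khz(625, 25):.3f} kHz")           # 15.625 kHz
print(f"VGA  480p : {h_scan_khz(525, 60):.3f} kHz")           # ~31.5 kHz
# ~15.7 kHz sits right at the edge of (young) human hearing, which is the whine;
# anything from ~31 kHz up is ultrasonic, so PC resolutions are effectively silent.
```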
I bought my last (Used) CRT in 2005 to play WOW. It was a 22" Viewsonic that took up most of my desk. I loved that monitor.
The CRT that you really want is a 19" 1600x1200 one with a Mitsubishi Diamondtron tube. It was the Alpha CRT before LCD's took over.
I have one of those, an iiyama vision master pro... Honestly, I should make a video on it.
The display sometimes going off might be due to your HDMI to VGA adapter, or maybe some power supply circuitry is aging inside the CRT, which would be very fixable!
The flickering may very well be due to the adapter, not the monitor itself. I've had this happen with cheap HDMI to VGA adapters. Good quality DP to VGA adapters tend not to do this.
Just once I'd be curious to see what a modern CRT with a few more years of R&D behind it would be like.
In my personal experience, when CRTs start doing that "off and on again" thing, it's usually a sign of the flyback arcing, which could be because there's too much dust caked onto the boards inside the monitor, or the flyback dying.
Unfortunately if the flyback is dying the only way to get a replacement that I know of, is with a donor board.
If it's because of dust, a good cleaning and reflowing of the solder will fix it.
It would be interesting to see what a modern CRT would be like 🤔
Personally, I wish they kept up with plasma development. A 4k plasma with over 100hz would've been very affordable and amazing looking compared to any LCD
And probably some sort of a filter layer could have kept you from getting a tan from using one as a monitor.@@sinephase
@@EsaKarjalainen explain, whatever you're on about sounds ridiculous
Plasma TVs emit a small amount of UV light. There was some, mostly fearmongering, about that way back.@@sinephase
Oh thank goodness, I'm so happy you managed to do it right this time. XD Big pro tip for CRTs by the way: if you want to run one at 1280 pixels wide, do it at 1280x960, not 1280x1024. The former is a proper 4:3 resolution, matching the physical aspect ratio; the latter is 5:4 and gives a slightly squished image... plus you can run it a bit faster (some quick numbers on that after this comment).
You absolutely need to 1) make sure the monitor is handling display scaling, and 2) fine-tune things in the graphics driver. I suspect the flicker and intermittent dropout you were seeing may be due to those reasons; modern GPUs just don't behave nicely with CRTs.
Also also! Another bonus tip for anyone watching: you can get very good black levels (which Dawid did not) by zooming out the picture until you see the end of the 'overdraw' area, where there's a highlighted black versus a 'true' black. Reduce contrast to 80-90% at this point, then reduce brightness until you see no division; an all-white image on screen helps with this since you have a solid border. Once you have the black levels right just zoom back in and adjust your geometry as needed! =D
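On the 1280x960 vs 1280x1024 point above: 960 active lines means fewer total lines per frame, and since the tube's maximum horizontal scan rate is the hard limit, fewer lines buys you a higher vertical refresh. Rough sketch, assuming a 96 kHz monitor and ~5% vertical blanking (both made up for illustration):

```python
# Why 1280x960 can refresh faster than 1280x1024 on the same tube:
# max vertical refresh ~= horizontal scan rate limit / total lines per frame.
# The 96 kHz limit and the 5% blanking overhead are illustrative assumptions.

H_SCAN_LIMIT_KHZ = 96.0
V_BLANK = 1.05

def max_refresh(active_lines):
    total_lines = active_lines * V_BLANK
    return H_SCAN_LIMIT_KHZ * 1000 / total_lines

for lines in (960, 1024):
    print(f"{lines} active lines -> max ~{max_refresh(lines):.0f} Hz")
# -> 960 lines: ~95 Hz, 1024 lines: ~89 Hz. And 1280/960 = 1.333... (true 4:3)
#    while 1280/1024 = 1.25 (5:4), hence the slight squish on a 4:3 tube.
```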
what even is this recommendation? just look at a shadow clipping test and raise/lower brightness until it's set correctly... if that doesn't help, you need a full calibration/G2 adjustment
@@Polowogs Sure, if you want your gamma and colour balance to be fucked. We're using worn out phosphor most of the time, it needs a little more care to really shine.
It is still not right because he used a digital to analog converter rather than a native analog vga display adapter.
@@blunderingfool Uhhh, all the gamma and white point are perfect, either 2.2 or 2.4, on my 13 pc crt. You can calibrate it as long as there are controls
@@auteurfiddler8706 That's the point of calibration, and a good converter like what I have. :P
the worst thing about crts for me is that weird "eeeeeeeeeeee" sound, it's so annoying and bothersome 😭
I honestly miss the high-pitched whine of technology. It's nostalgic to me.
Actually it kinda grew on me when I was a kid
You can easily fix it by adding a custom grounding cable that runs directly to earth.
The amount of electrical noise in an old CRT is just insane, but that doesn't mean it's unfixable
🤔I guess my Industrial Deafness does have a benefit as I can't hear it anymore when Retro Gaming on my XP PC with CRT or my PS2 on CRT Tv
I watched so much tv i cant hear it at all, im so used to it.
I had a 21" dell trinitron crt monitor back in the day. I loved it. Ran it at 1600x1200 at 75Hz looked superb.
My hand still hurts from hitting the top of my CRT trying to fix the image.
The random dropouts are caused by a bad HDMI-to-VGA converter, or a bad connection to it.
It's nothing to worry about. If you plug it in by genuine VGA cable it'll be okay. Good luck finding a modern card for that, though.
I've got the same connector and it's fine. I think it's probably just a corroded VGA connector
Indeed. When YouTubers do these kinds of videos I wish they'd just buy a 980 Ti for native VGA support; it should be the most powerful card that supports analog video natively.
@@butre. I did say either a bad converter or bad connection to it.
You can tell it wasn't the monitor because CRTs lose brightness and sharpness when they start to fail.
some 750ti cards still have the vga port on them, like the asus one i used to have. it's not exactly "modern" but it still has driver support for a few more years, and it can definitely run a majority of retro games and older 3d titles
A (IIRC) Startech DP>VGA2HD20 will do the job. Make sure the monitor is set to handle display scaling, not the GPU.
As you talked about your Japan PC not having video connectivity anymore, my monitor had a random flash of a grey area, and later again, only twice so far but maybe the radiant energy killed my monitor or GPU too.
Yep, can confirm. Old games look much better on CRT for many reasons. I grabbed an old Smasnug a few months ago and games like Jedi Knight, Baldur's Gate, Half-Life, Quake, Doom, Blood - all look really really good on it. I even tried to play Cyberpunk 2077 at 1024x768 and it ran really well, even in RT Overdrive on my 3080, lol.
The only game that didn't like it was Quake 2 RTX, because it didn't like my multi-monitor setup :D
You probably meant Samsung? Old SyncMaster CRTs were/are quite nice.
@@karl-erikkald8876 Smasnug is kind of a meme name for Samsung, I believe it came from a DankPods video.
Smasnug is pretty hilarious though tbh :P@@karl-erikkald8876
@@karl-erikkald8876 it's a meme
IT'S NOT GOING TO FUCKING EXPLODE, THAT'S JUST THE FLYBACK
Oh nooo. Not the little Japan PC!
Also, I was significantly more impressed with this CRT than the previous one. But the eye fatigue is serious.
Also Dawid's sketch about haters was so funny. Such commitment with that spit-take haha :D
Eye fatigue with CRT monitors used to affect me badly. half a day at work on a PC then all evening glued to a monitor. midnight AGHHHH.
I remember there were some special screens to filter light you could hang on your monitor screen to get less brightness. They were simpler times, haha...
@@joefish6091 eye fatigue for me has always only been with low refresh rate monitors lol. 60-70Hz feels like someone is throwing thumbtacks into my eyes, but as long as it's 86Hz and above, I can use it for HOURS and not notice a thing haha. Feels buttery smooth.
@@kllause6681 75Hz is good enough as well; i agree about 60Hz, but it depends if you have a lot of white.
Dawid found an actually decent CRT to test this time.
There are some even better examples out there but this one is enough to prove a point.
Yeah, i like that monitor, the resolution and aspect ratio is good. I think tweaking the colors for calibration can improve the image.
The CRT is THE way to go for your retro gaming, especially 98/XP gaming, albeit not for the newer stuff (and DOS gaming is a given). And yes, Dawid, that CRT is sadly not long for this world unless the capacitors get swapped. At least, in my experience, that's a sign of the capacitors going bad, which in turn burns up the projector. If it just goes kaput all at once, in my experience, about 90% of the time, it's the fuse, which usually costs less than a buck. I saved so many of them from the trash and eBay'd them lol.
When I replaced my CRT back in 2006 or so, it was 100% due to the fact that LAN parties were still a thing in my group of friends, and an LCD was infinitely more desirable in this situation.
That was it. I had no care for color, refresh rate, or anything like that. It was just due to size and convenience, and I would imagine that accounts for the vast majority of people who upgraded at that time.
Little did we know we doomed ourselves to 20 years of worse image quality that's only just now catching back up to where we left off. Imagine what gaming might be like if CRTs had instead been continued to be developed on for the last 20 years. Sure, they might not ever be pocketable, but I imagine they'd be far thinner and lighter in size by now, and drastically better in terms of image quality.
Man, you know what's sad. Just a few years ago, my mother was cleaning the attic and asked me whether we still needed the old CRT monitor we had. I decided to turn it on again to see if it worked, and found out that my monitor had a 75hz refresh rate the whole time and I didn't know it. 75hz for some reason felt so smooth, I was really really surprised.
I had an iiyama Vision Master Pro 22" back in the day. That thing was heavy!! That degaussing sound when powering on brings back memories!!!
Oooh! I have the iiyama Vision Master Pro 455 (Or, MM904UT), 18inch viewable, great colour and black levels. The phosphor is still in great condition and it looks great at 1600x1200@75hz.
I miss the CRT screens lol
4:23 Any vga monitor normally has an automatic calibration button/option in the menu.
not crts, only lcd
LCDs do, but CRTs don't know exactly where the image they're showing ends up. They can estimate it based on the signal timings, but being analog, it isn't always accurate
That's degaussing.... Not "automatic calibration" whatever that means
The flickering, Dawid, is from the vga adapter that you're using; try a powered one, those usually do the trick. nice vid btw =D
Awesome video. I miss CRTs. Playing Half-Life and Unreal Tournament on them was amazing.
One thing where CRT is much better than any type of modern 2024 TV or monitor: CRT has no latency, no ghosting, and no motion blur. Which is crazy, modern technology should be much better.
There's no aliasing as well, since you have dots on the screen instead of pixels.
Back then the gaming universe was really awesome and immersive. The quality of the games was pretty high while people were using CRT monitors (early 2000's). Considering the gaming industry went into permanent decline in the last decade, playing games on a CRT monitor was a unique, unforgettable experience. It still is :)
CRT monitor lol, unique unforgettable experience, ok dump your modern monitor now. I will tell you this: they were clunky, too big, hurt your eyes, took ages to turn on, and used too much power as well. I love my flat-screen 32 inch curved monitor. Oh, and CRT monitors curved away from you, not in. As for the gaming industry, yep it went downhill, but remember, people were new to games back then so not many knew what a good game was. Now people know.
The gaming industry in permanent decline? Have you even played any video games recently or are you just in self inflicted denial?
i still use a 19inch crt as my main monitor to this day! ;3
I really missed the high refresh rate of CRTs when I switched to LCD in 2007. I used a Sony that allowed 120hz at 1024x768, it was incredible. The colors were much more pleasant, 100% black tones, but I was forced to switch to LCD because my pair of monitors literally cooked my eyes, and at the end of the day, I was looking like a stoner.
LCDs allowed me to stay in front of the screen up to 24 hours a day without headaches, or "sand in my eyes". Only now, 16 years later, are there LCD monitors with decent refresh rates, and some color depth at affordable prices, but even so, I feel like it's not as good as CRT.
I've been watching your videos for pure entertainment for a while now. And I belong to the (perhaps few?) ones that played games in the 90's (and early 00's), and it's impressive to me what kind of performance you're referring to these days. I remember picking up a 21 inch CRT in a storage locker and I was the envy of my crew! It was MASSIVE! And nowadays you have more ram in your graphics card than I had in my hard drive; one GB was HUGE in 1995!
Keep up your charismatic attitude, that is what will gain you views, content is second! Cheers from Sweden!
My theory is that your CRT might have a cracked solder connection on the high voltage circuit and that is why it's cutting out every once in a while. If I am right about that, then that could be fixed.
I have another Dell cry monitor and it also does that...
@@Karman7 Hehe, that typo. "CRT monitor? More like CRY monitor when you drop it on your foot"
@@NaoPb Yeah true
Motion rendering? These analog CRTs don't do anything except display the signal that comes through the cable, pixel by pixel, line by line. There is zero processing. And also zero delay. It's instant and direct.
I had flight simulator running on 3 x 20" CRTs that together weighed about a metric ton.
That "beige haziness", in my opinion, really affects how immersive the CRTs can be.
Higher-end ones don't really have that, since they had better-coated flat glass fronts.
The display background grey on the crt is from the light you are shining on to the crt monitor. If you are in a dark room there is no grey background on crt or plasma monitors.
the black crush could be caused by:
a) shining light over the screen - better to keep subdued lighting behind the monitor
b) bad adapter - idk about displayport, but like 99% of hdmi→vga adapters are not doing full RGB properly, so whatever you do it'll lose colors compared to VGA→VGA. but in a pinch, setting the output in the Nvidia control panel to 'ycbcr444 limited' instead of RGB helps it somewhat (there's a quick sketch of the limited vs full range math after this list)
c) brightness on monitor - 'contrast' is like "luminosity" or power, and 'brightness' is more like a black-level setting. i keep contrast @ 100 and brightness @ 40-50
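To spell out what "limited" means in (b): limited-range video maps black to code 16 and white to 235 instead of 0-255, so if something in the chain mislabels the range you get crushed blacks or grey blacks. A minimal sketch of the expansion math, just to illustrate the idea (not any specific adapter's behaviour):

```python
def limited_to_full(value_8bit):
    """Expand a limited-range (16-235) video level to full range (0-255)."""
    expanded = (value_8bit - 16) * 255.0 / (235 - 16)
    return max(0, min(255, round(expanded)))  # clamp: sub-black/super-white codes exist

# A "black" pixel sent as limited-range 16 should display as 0, white 235 as 255:
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255
# If the display instead treats the signal as full range, everything at or below
# code 16 collapses to the same black -- that's the crushed-shadows look.
```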
If you really want your mind to be blown, get a filter for the CRT that dulls the glare on the glass and makes it appear matte. ;)
That sounds awful, I hate the haze of matte displays.
@@CasepbX agree
Tf are you talking about? Strong anti glare makes ANY display look worse lmao
@@startedtech agree
@@startedtech CRTs are curved, so light reflects off the edges. The shine can be really annoying, especially if you have the sun coming in through a window and hitting it on one side. Because the image appears more "in the screen" due to the thick glass, it can really look washed out on one side. I don't usually like a matte screen, but with a CRT they are useful unless you're sitting in total darkness or without side reflections. Even something like a desk lamp trashes the image.
I hope someday we bring CRT production back.
the monitor occasionally going black might have something to do with the displayport/HDMI to VGA adapter you're using.
Could also be a driver issue... I don't think modern drivers like VGA.
@@u3pyg Thankfully it doesn't matter, since they only see a digital display via the adapter, the adapter then feeds the analogue signal to the display.
''I actually got a monitor this time''
10 seconds later: ''Now the TV in question.''
I used a Dell (Sony) Trinitron 8 years into the introduction of LCD’s in the market. I lugged that same exact monitor to countless LAN parties to play Battlefield 1942. It was quite the feat for a short high schooler. Thanks for taking me back to a very special time in my life. When I had friends who would always be down for slumber parties, pizza, soda, LAN party and riding dirt bikes and quads.
Oh man Battlefield 1942! Loved that game. I made mods for it for well over 10 years. Battlefield Heroes'42 and BF2'42 were the last 2 mods I made for that game before I finally moved on. :P
I bet that USB-C to VGA adaptor added some latency. I wonder what the last video card to have a DVI port (with the analog pins for the native adaptor) was...
I have an old PC with a Nvidia 560Ti, I can use a HDMI/DVID adaptor
GTX 980 Ti and its Titan equivalent.
For Nvidia it would be Maxwell I believe so a 980ti or Maxwell Titan would be the fastest cards available, for AMD it's the R9 380X I believe but I know less about the older AMD cards since I didn't own any during that period.
The fastest card I own with the analog DVI port is my GTX 980 Ti Lightning from Msi. I have a much better card in my main rig but I love having this card for my CRT. GTX 970s and 980s are getting dirt cheap now so those are solid options for gaming on a CRT
My last card with DVI was Radeon RX480
CRTs have 0 motion blur btw. even modern 500hz monitors still have some; a CRT display is better in that regard
CRTs also don't have aliasing. So why use anti-aliasing algorithms like FXAA, MSAA, TAA?
The only thing that could come close, theoretically, is a 1000hz OLED/microLED with BFI/strobing, but those don't exist yet, at least commercially.
@@AJ-po6up No, it's still very slow. A CRT draws each scanline in microseconds.
It's physically impossible for BFI and strobing tech to match how fast the magnets on a CRT change the position of the electron beams.
@@AJ-po6up They have to bring back CRTs and use extremely light glass, that's how the weight is reduced.
@@AJ-po6up The issue with "close" is that close can never hit 0, it can only approach it. At that point, though, you'd need 1000 frames a second as well. No game runs at that, especially no modern game, which is the big issue with high refresh rate panels. 500hz is fast, but unless you have 500 frames it doesn't really matter. If you are locked to 60fps, which many games are, you might as well be on a true-to-spec 60hz panel (frame pacing is better, but that's a micro factor). For this reason CRTs will always be king, until an entirely new tech is created.
games look better on CRT. it's just fact.
The ULTIMATE sleeper monitor build
6:23 the reason for this is that these older games were made with crt displays in mind; the display would smooth out the rougher-looking parts that you weren't really supposed to see! the best examples of this i can think of are dracula's talk sprite in sotn, where his eyes are just one red pixel each but the crt blends them to look more natural, and the waterfalls in sonic 1-3, where it did a similar thing to give them a natural-looking transparency
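A crude way to see the blending effect described above: average neighbouring pixels of a dithered row, the way a CRT's spot and the analog signal smear them together. This is a toy box blur, not a real CRT model, and the numbers are just illustrative.

```python
# Toy illustration of why dithered "transparency" (e.g. the Sonic waterfalls)
# reads as a solid translucent layer on a CRT: the beam spot / analog signal
# blurs neighbouring pixels together. A simple 1D box blur stands in for that.

def blur_row(row, radius=1):
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

dither = [255, 0] * 8      # alternating bright/black pixels (checkerboard row)
print(blur_row(dither))    # the hard 0/255 stripes collapse toward mid greys (85/170);
                           # a wider spot (radius=2+) pushes them further toward ~127
# On a razor-sharp LCD the same row stays a hard stripe pattern, which is why the
# old waterfall trick looks wrong on modern panels.
```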
Guy counts ppi for crt, that has no pixels 💀💀💀
The problems I had with CRT gaming were the screen flicker, the huge back for the tube, the high-pitched whistle that comes from most CRT screens, and the headaches it caused.
But CRT does look sharper even at lower resolutions. I think the reason LCD looks more jagged is the fixed pixel layout.
Love the constant calling of the monitor a TV and the previous TV a monitor. Great stuff. Thank you for the laugh. I needed that this morning.
1:41 "The TV in question"
lol
hearing that whine only means that the crt is displaying a standard-definition resolution, i.e. 240p or 480i. the horizontal scan rate of sd resolutions is about 15kHz, which is inside the human range of hearing. the monitor does not whine because the higher resolutions it displays have a horizontal scan rate above 31kHz, outside the human range of hearing. therefore, the whine does not indicate it is dying.
Ngl there is just something about CRT's that will always make me love them. They just have that nostalgic vibe to them.
These days everything is so perfect and you can spend 5K-10K on the perfect setup with a 240Hz 4K Oled display but there's just something about CRT's that will always be awesome to me.
They should bring back a modern run of CRT displays, that would be pretty cool imo.
Yea, I know what you mean...