Nice ad for ULMB2. Blurring is not only monitor-dependent. It can indeed also come from the GPU, as shown by some titles literally looking crisper on one GPU vs another (as demonstrated by tests done by monitor review tech channels comparing the 3090 vs the 5700 XT). Same monitor, same game, all equal except the GPU. Now does that mean it is "only" the GPU? No. The point is, it's a combination of factors. It could be the GPU, and/or the monitor, and/or even the game itself and how it's "coded" to run better on a specific brand of GPU. What matters most is knowing where the blurring is coming from and whether it's severe enough to warrant expensive upgrades. Cheers 🍻
This video fails to mention a few important details about ULMB. For one, ULMB cannot be used at the same time as VRR features like FreeSync and G-Sync. I don't know if this is a technical limitation, or if manufacturers worry that combining VRR with ULMB at low framerates could induce seizures in people sensitive to strobing and headaches in others. Sub-60Hz refresh rates on CRTs used to give me headaches; as a result I required my CRT monitors to run at least 75Hz or I would refuse to use them. Which raises the parallel question: do you maintain a high strobe rate regardless of the refresh rate, or strobe in sync with it? I don't know the definitive reason; all I know is I've yet to see a monitor that allows ULMB with VRR enabled. Many, myself included, may not feel that's a worthwhile sacrifice. For me, VRR is the far more useful feature.

The other thing it doesn't really address (it kinda does, but only vaguely in passing) is that ULMB dims the screen a lot. They mention the dimming, but not how badly it affects features like HDR if you have a real interest in gaming or consuming HDR content. ULMB will drastically and negatively affect HDR performance due to the significant hit to brightness, quite likely to the point of making HDR practically unusable or pointless. I haven't seen anything about ULMB2 that addresses either of these two major shortcomings in any meaningful way.

Top that off with ULMB2 only appearing on really expensive monitors and it forces the question: if I'm paying that much for a monitor, what's more important to me, ULMB2 or VRR+HDR? Am I really going to spend that much on that high-end a monitor and then choose not to use two signature features of that bracket for the sake of one that isn't all that important by comparison? Honestly, at that price point, if motion clarity really were that important to me, I'd rather bite the bullet and buy an OLED. OLED already has vastly faster response times than LCD, resulting in much better clarity; not as good as an LCD with ULMB, but nowhere near as bad as LCD on its own. So I get most of the way there on clarity, while also using VRR and getting a great HDR experience thanks to OLED's per-pixel brightness control. The only thing to be mindful of is burn-in, but with reasonable steps that risk should be manageable. Either way, ULMB2 just doesn't seem all that appealing. There are way too many downsides and not enough upsides to justify it, especially when far better options are already on the market.
No thank you, my 27-inch 1080p 60Hz monitor I bought second-hand for 40 bucks with a one-year warranty works fine, and since I would have to pay about 600 bucks to get a decent 32/34" 1440p monitor for my 3080, I'll just stick with the one I have. I don't need more than 60Hz for WoT anyway, and I mostly play VR games.
People are complaining about blur and low refresh rates, but for me 60 is perfect; it feels almost like 120 or higher, and just eats less power. Also, why care if your reaction time is a bit slower? You just shoot where the enemy will be, not where the enemy is.
You don't need 4K... 1620p needs to be normalized, but 99.9% of the population acts like it doesn't exist and treats 1440p as the in-between. That said, the few 1620p monitors out there, all LCD, are already expensive because they're kind of a niche. - Thanks, consumers!
Really curious about the "military grade" thing (mentioned in the sponsor spot). Where did it come from, and what does it actually mean? I kinda understand that it means "rugged", but is there actually a standard for this kind of thing?
I feel they use it for marketing, as just a term to "wow" people, which is often the case. There IS a real military standard for ruggedness (MIL-STD-810 in the US), but marketers rarely say which of its tests, if any, the product actually passed. I would ask Volta directly if they can back it up. Generally, don't take it too literally; just consider it as "strong", and check the warranty. Some of these brands for things like cables and cases make big durability claims and will even exchange a product when something goes bad, for about any reason, within a certain period. - Not sure about Volta, though.
I mean, if you're gonna spend a lot on a monitor you may as well just get yourself an OLED. They have improved a lot for gaming, and they're being released with 240Hz, HDR, G-Sync, and all those sweet extras which will never make your aim better.
Don't forget depth-of-field. I just got an RTX 4060 and was able to crank up CoD DMZ. It looked like absolute blurry garbage. I experimented and figured out that the ever-present culprit, depth-of-field, was the problem. Turn that trash OFF. In some games it looks OK, but in others it destroys the visuals outright.
The biggest problem is the lack of money for a better monitor
fr
You can buy a CRT monitor for less than $100, or even get one for free.
It's time to get your money up then😢
Skill Issue
"get yo money up, not your funny up" - some yt comment I saw somewhere sometime
This video neglects to mention the primary cause: eye-tracking-induced blur. Your eyes are moving while the frame is being shown, and that movement smears the image. Backlight strobing combats this by flashing the frame for a short time and then showing just black. Your eyes' image persistence keeps the image visible, and it moves with your eyes, so there's no blur. While backlight strobing also hides pixel transition artifacts, that's not its main purpose. A CRT is effectively strobed, which is why CRTs give such blur-free motion. A higher frame rate helps, but strobing can act like a much higher frame rate without the need for massive GPU processing. (Rough numbers in the sketch below this thread.)
Indeed. And if they're going to title their video like that, they should really go into what makes a good monitor: color gamut, brightness, contrast, HDR, color accuracy/out-of-box calibration/delta E error, plus what makes a good response time, what refresh-rate compliance is, how BS advertised response times are, resolution vs. monitor size (PPI) vs. viewing distance, what acceptable overshoot is, and on and on!
@@Keivz this is tech"quickie". Never intended to make in-depth analysis. Just a quick and dirty info video
@@queueeeee9000 So it is. Missed that and got fooled (baited) by the title (not a subscriber, just into tech).
@@Keivz yeah, 99% of YT videos are titled that way. Understandable you got baited
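A minimal sketch of the persistence math behind the top comment in this thread, assuming the widely cited Blur Busters rule of thumb (1 ms of visible persistence ≈ 1 px of blur per 1000 px/s of tracked motion); the 960 px/s panning speed and the millisecond values are assumed example numbers, not measurements:

```python
# Rough calculator for eye-tracking blur: perceived smear is about
# (motion speed) x (time each frame stays visible). Illustrative only;
# real panels add pixel response time on top of this.
def blur_px(speed_px_per_s: float, visible_ms: float) -> float:
    return speed_px_per_s * visible_ms / 1000.0

SPEED = 960  # px/s, an assumed moderate panning speed

print(f"60 Hz sample-and-hold (~16.7 ms lit): {blur_px(SPEED, 16.7):.1f} px of smear")
print(f"240 Hz sample-and-hold (~4.2 ms lit): {blur_px(SPEED, 4.2):.1f} px of smear")
print(f"Strobed backlight (~1 ms flash):      {blur_px(SPEED, 1.0):.1f} px of smear")
```

Which is the sense in which a 1 ms strobe "acts like" a much higher frame rate: it cuts the visible persistence without rendering more frames.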
You need to match the game's fps to the refresh rate of the monitor to take advantage of strobing and avoid artifacts. Usually you want your games to run at 120fps minimum, with 240fps being the sweet spot for strobing. So it's really only useful in well-optimized shooters, unless you're running a NASA PC. (Illustrated in the sketch below this thread.)
You probably also need one of those supported monitors that actually performs well, not one that just says it's "compatible" or something. I'm thinking of the countless "HDR" monitors that barely perform any differently in HDR mode; they physically can't do HDR remotely well, but are advertised as HDR monitors.
They are advertised as HDR monitors because they support a 10-bit signal, and that's it. TV manufacturers have been pushing HDR for years, but on PC nobody cares, and companies spend more time designing RGB lights on the back and other nonsense instead of focusing on improving panel picture quality. We need mini-LEDs to become cheap, and then we might finally get good full-array local dimming and real HDR on monitors.
@@rodryguezzz some monitors say "HDR ready" lol
@@rodryguezzz The OLEDs can do it right now. And competition has been really ramping up over the last year with those.
@@Volker_A4 That's true, but unfortunately they are still too expensive, and most people would rather have a cheap higher-refresh-rate IPS monitor than a cheap 120Hz OLED screen, if those existed, even though OLEDs have much faster response times than LCDs.
The keyword you're looking for is VESA DisplayHDR certification. VESA has a list on their website of all DisplayHDR-certified monitors.
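Back to the fps-matching point at the top of this thread: if the game renders fewer frames than the panel strobes, the same frame gets flashed more than once, and a tracking eye sees each repeat as a separate image. A tiny sketch with assumed numbers (the 240 Hz strobe rate is just an example, not any specific monitor's spec):

```python
# Counts how many backlight flashes repeat one rendered frame. Any frame
# flashed more than once appears as a double/triple image to a tracking
# eye, which is the strobing artifact described above.
def flashes_per_frame(fps: float, strobe_hz: float) -> float:
    return strobe_hz / fps

for fps in (60, 120, 240):
    rep = flashes_per_frame(fps, strobe_hz=240)
    note = "clean single image" if rep == 1 else f"~{rep:.0f} images per moving edge"
    print(f"{fps:3d} fps on a 240 Hz strobed panel -> {note}")
```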
Built my computer in 2015 with a 4790K; upgraded the GPU to a 1080 Ti. Still great for 1080p.
This has been bugging me for a long time, and I literally just discovered this feature yesterday. ULMB1 does wash out the colors and adds some ghosting, but I can finally read moving text.
You should do a test to see whether people can tell the difference.
If you know what to look for you can 100% notice ghosting when turning quickly. Don’t really need a video to confirm it, it’s really that obvious
Your phone is probably an OLED at this point. Ever feel like your monitor looks like trash in comparison?
Because I've felt that way for years.
I do notice. In the past I would double, triple, quadruple check the video settings to see if motion blur was on, and take screenshots to discern whether the game was actually blurring the scene, because I couldn't understand why it was happening. While it's not a huge amount of blur, I still find it annoying even on high-refresh-rate screens.
I have an Acer 27-inch 2K 144Hz monitor, and the ghosting I get in nearly all games is extremely annoying. I don't need motion blur turned on in any game because that's just my display's default for anything semi-fast-moving lol
@@Archer957 Most Acer monitors are usually good.
Are you sure your display settings are set to 144Hz? Do you have Nvidia low latency mode on, with things like G-Sync, ULMB, etc.? Or AMD's equivalent?
This is something I realized long ago, probably also because of LTT and other tech reviews. The monitor is the very thing we look at when we use a computer, yet for budget and even mid-range builders it's one of the components we cheap out on. Several friends and I built mid-to-high-end PCs but bought cheap monitors, and the experience as a whole suffered. So while there are over-the-top, really expensive monitors out there, you really can't just buy the cheapest monitor with ________ (insert whatever feature you're looking for, like refresh rate, size, etc.).
Cheap monitors (130€) can be great for budget builds. I got a Samsung 60Hz 1080p one for that price and it was phenomenal, until I upgraded the hardware and needed a higher refresh rate and G-Sync. Now for 180€ I have a 170Hz G-Sync-compatible 2K VA monitor and it's great.
I have recommended the Acer SA220Q to a few people for budget builds. It's surprisingly good for the price, offering 1080p 75Hz with an IPS panel. As long as the response time and color reproduction are good, it probably won't affect your experience much. That said, I regret cheaping out on my 4K monitor. Sure, it's 144Hz, but somehow the backlight bleed and color reproduction are worse than my old 1440p panel, even though it's from the same brand and even uses the same IPS technology. I might get an OLED, as I don't see local dimming as a good solution.
I bought a £600 4K @ 144Hz Acer monitor and have never regretted it. I agree, it is one of the most important components in your set up.
@@Killamarshian Is it an XV282K? I kind of regret mine, since Acer released a new version with local dimming, the XV275K, for not much more.
I have a Zowie 2566K. Absolutely an awesome monitor for FPS games. I have never seen something so clean in motion; not even my S95B QD-OLED comes close to it.
Only CRT legends know the struggle ;))
CRT monitors were so good. Pretty much no latency, no motion blur, dark blacks, and even entry level monitors ran above 60 Hz.
Downsides were burn-in, size, weight, and general blurriness.
@@KillFrenzy96 Good monitors had no burn-in and flat, sharp-as-a-tack screens. I know because I've got one :)
Just watched this on a dumpster dive Dell 16" tube.
Another hard part of using one of these in the modern era is that they're analog, and require the VGA port.
The best response times could only be achieved when the GPU could natively output the analog signal.
These days though, no GPU manufactured since the Nvidia 900 series has native VGA support.
Meaning you need to add an adapter. Meaning latency. Usually about one frame's worth.
And also a hit to reliability. Last time I ran a CRT on a modern PC, my adapter burned out in less than a year.
I honestly noticed little difference switching... The bigger issue being the viewing angle and consistency and all that.
@@Hawxsn Jesus Christ 🤓
Techquickie: Your GPU Isn't The Problem. Your Monitor Is.
Monitor:
I'd say the minimum for competing is 1080p120 with an advertised 1ms response time.
Not too expensive and still good enough clarity.
I think what cooljosh said is about right
@fln0 I favor 1440p myself but 1080p is a fair starting point (but at least get 120hz refresh rate or more)
@@gokublack8342 People don't understand that screen size is a factor. On a 24-inch, 1080p can look good, but it can look bad at 32-inch, where you should step up to 1440p or 4K for the bigger screen size.
@@dominicshortbow1828 I think over 27" for 1080p is a dumb move since you get less pixel density; might as well switch to a higher resolution. I just bought a 24" 1080p after using a 21" 768p monitor for over 3 years, and I think 24" is big enough for 1080p, especially if you have very limited space on your desk.
@fln0 Tell that to CSGO pros playing at less than 1080p.
I have this old monitor, an HP EliteDisplay 232, and for some reason it's reeeally good, like high quality, and it came from around 2013-14.
Biggest problem is greedy companies that always make worse products to save money.
I mean yeah capitalism bad but what’s new
And mislabel said bad products to sell them for higher prices.
I don't have money. I game on a 1080p monitor with a refresh rate of 60Hz, powered by an RX 6700 XT. I frame-limit games to 60fps where possible too, because my monitor can't display more than 60fps, which I guess saves me some electricity (world's most expensive power here in the UK!). I am quite happy with this.
Using a higher framerate than your display's refresh rate still reduces latency, apparently. So you could cap it a little above that (with some kind of sync feature to help where it doesn't tear). But yea...
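A back-of-the-envelope sketch of why that works, under my own simplified assumptions (steady frame delivery, no render queue or sync effects): the frame the display scans out is, on average, about half a render interval old, so rendering faster keeps it fresher even above the refresh rate.

```python
# Illustrative only: average age of the newest completed frame when the
# display grabs it, assuming frames complete at a steady rate.
def avg_frame_age_ms(fps: float) -> float:
    return 1000.0 / fps / 2.0  # on average, half a render interval old

for fps in (60, 120, 240):
    print(f"{fps:3d} fps -> newest frame is ~{avg_frame_age_ms(fps):.1f} ms old at scanout")
```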
Thanks LTT Team
ULMB is the perfect addition to a flicker-free, current-controlled backlight 🤟
Zowie XL2456K 240Hz / XL2566K 360Hz. Near-zero motion blur and the fastest non-OLED response times of any panel, period. Few gamers make the association between Zowie's motion blur reduction and old higher-end CRTs, but it's comparable. If you need higher resolution, look at the Alienware OLEDs.
Motion blur has not always been a problem. Only since switching to LCD monitors. CRTs never had this problem.
Why do CRTs not have the issue?
@@Michael-fc7no Saw this in a different comment, but it's because they also do strobing afaik.
I've been saying this forever: monitor companies lie, and even when the spec is good it's either not true, simulated, or some other factor stops it from achieving the performance it should. These companies have had zero accountability; it's time to stand up for consumers. I have a great monitor, but it took over a week of research, reviews, and monitor testing websites to cut the BS and find a good product.
Which monitor is it?
What's your monitor?
"Here at LTT we're very sorry for our mistakes. But do you know who isn't sorry? Our sponsor for this video"
Motion blur does suck something fierce and should be combated, but you'd be amazed by just how big a difference a properly colour-calibrated screen makes; it's like unlocking a whole new world.
I blew multiple relatives' minds when they thought the washed-out appearance was just 'the normal thing'. They had the display connected to their computer via HDMI, so Windows/the Nvidia control panel defaulted to limited-range RGB, resulting in washed-out blacks and whites.
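For the curious, here's roughly what that limited signal does. Limited-range RGB (the broadcast video convention) puts black at code 16 and white at 235 instead of 0 and 255; if one end of the chain assumes the wrong range, blacks turn grey and whites go dull. A minimal sketch of the mapping:

```python
# Limited-range ("TV range") RGB squeezes the full 0-255 range into 16-235.
# A display expecting full range shows code 16 as dark grey, not black.
def full_to_limited(v: int) -> int:
    return round(16 + v * (235 - 16) / 255)

for v in (0, 128, 255):
    print(f"full-range {v:3d} -> limited-range {full_to_limited(v)}")
```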
Everyone knows ULMB actually stands for Ultra Low Monitor Brightness
Ahahah, I turned on ULMB because my screen is too bright even at the lowest brightness setting.
@@ArifKamaruzaman Maybe your eyes are just too dark?
Biggest reason for blur is sample-and-hold on current display technologies (LCD and OLED alike).
Hardly anyone properly identifies this problem. It's hard to admit that the cause isn't a lack of technological advancement, but an unavoidable trade-off. You either need strobing (and its associated flicker) or a very high frame rate. At today's frame rates, you can't have each frame displayed for the entire frame time and not have motion blur.
Well, there are the CRTs, if you can find the right type and you're willing to put up with the bulk and weight. Not to mention the wrangling with the image settings and adaptors for the older ports.
Not many people fit into that very narrow category these days.
This is absolutely true, I bought a 7900xtx and not until I bought a new monitor did I feel it
Wait a second…..
@@iluvpandas2755 aight, it's been 3 weeks, you should finish your thought now.
I read it as RTX 7900.
That is not a real GPU
A thing not mentioned: a lot, if not most, monitors with backlight strobing do not allow it to be enabled together with adaptive sync / G-Sync.
Techquickie: your gpu isn't the problem your monitor is
My monitor laughing at my intel hd 630
As a casual Faceit level 6 player who plays CSGO with motion blur on, on a 60Hz monitor: it's a skill issue, boys.
how in the hell do you get motion blur in csgo?
@@Sefibid there's a motion blur setting in the game options.
4:29 Oh man, that hits hard 😂
🥲
You can see the blur by just taking your browser window and staring at some text as you drag the window around.
Sigh, the old days of the CRT.
But the first LCD monitors of the 90s were completely awful.
Now I'd like to see something related to this, but about how it applies to different display types. Riley briefly mentioned OLED, and I know OLED can "actually" hit 1ms (or less), unlike many edge-lit or backlit LED displays that claim 1ms. I've also seen some claims about mini-LED having bad response times or smearing. I was hoping mini-LED would be a good alternative to OLED for gamers (cheaper, no burn-in), but I may as well skip mini-LED if the response times suck and just get something like an LG C2/C3 or B2/B3. I just wish the price for 55"+ would reliably drop below $1000 freedom bucks.
One of you needs to do a follow up either on here or on LTT about the different display types specifically as it pertains to gaming and use as a monitor. Oh, almost forgot, I also read something about the OLED subpixel layout not being great for text. Maybe it's BGR or something. Sure wish a tech/gaming channel would cover it rather than the TV review channels... 😉
or maybe just go for some new 1440p 240Hz OLED monitors that were released this year :D
You know, there's hundreds of channels and videos that could answer all the questions you have; you don't need to wait for LMG to do a video 😅
Hey Riley, do some side by side comparisons of different monitors to see if there is that much of a difference in quality of picture!🎉
There is. A good monitor vs a bad monitor is just as big a change or bigger than going from low settings to high.
Go to a store and look at the TVs; it's the same.
I'm well aware; my hardware is also falling behind all the dogshit that passes for game development these days.
half these kids spend all that to play a free game
3:44 Too much overdrive could be bad for monitors, but Hamon Overdrive, it sure helped Jonathan & Joseph
I love my Gigabyte Aorus FO48U OLED monitor. 😍Once you go OLED, you never go back!
till u get burn in
@@Owy. or you can just take care of it and ensure burn in doesn't happen. :) Toolbar is hidden. Wallpapers are dynamic.
@@imjody i didnt think a year was all it took for my phone to burn in
@@Owy. I have never experienced burn in on any of my phones
Insurance is 36€/year.
Always felt that when I went from CRT to flat screen, I started getting issues I don't remember having with CRTs.
I don't think calling it "real" motion blur is correct, since it's still simulated motion blur whether it's done in software or hardware. Real motion blur is the blurring of a moving object as seen or photographed, which is the concept both of those effects emulate, whether on purpose or as a consequence of hardware limitations.
Guys, I need some advice about watching 1080p videos on 1440p monitors, something no one is talking about!
I tried some high-spec 1440p models, but all of them were blurry, shockingly worse than native 1080p monitors when playing 1080p videos.
Then I unfortunately learned that the scaler can't precisely map the 1080p image onto the 1440p screen. The resulting blur and other visual artifacts make the video look substantially worse than on a native 1080p monitor, and it's not related to bitrate or compression etc.
I mainly watch 1080p shows and movies (not streaming on YouTube/Netflix etc.) and have a huge archive... I tried 4K monitors/TVs with 1080p videos and they look just fine, equal to native 1080p! 4K is exactly four times the 1080p resolution; for every pixel in 1080p there is a clean 2x2 block of pixels in 2160p, so each pixel just gets four times bigger. It looks exactly like a 1080p display. (See the row-mapping sketch at the end of this thread.)
I want to upgrade to 2K for games, web & work, but "watching 1080p videos blurry with artifacts" is holding me back.
4K is hard to drive and expensive, but it's definitely clearer than 1440p!
- Should I skip 1440p & invest in 4K monitors? Can you do a video about this elephant in the room? 🤔
Watching at 720p will scale better but be less detailed. New Nvidia cards have upscaling for browser videos, but honestly there's no perfect solution.
If you really can't stand it, just watch 1080p videos in a smaller window that fits a 1080p sized portion of your 1440p screen
There is no elephant in the room. 1080p video shouldn't be blurry on a 1440p screen. Video games do look shitty at 1080p on a 1440p monitor. That's why you pair a graphics card and CPU that can easily handle 1440p.
UHD all the way.
4K?
Hey man, I own both 1080p and 1440p LG UltraGear monitors and Sony TVs. I also have a huge library like you. I'll be honest: there is no easy way. If you want to watch 1080p shows at their best, play them on a TV. The TV's resolution can be anything, because TVs have built-in upscaling software for this; Sony and Samsung in particular are very, very good at it, and I'm using it myself. Otherwise, set a dark desktop wallpaper and play the video on the monitor in a 1080p-sized window. It's not a bad experience, trust me. The sole purpose of getting a 1440p 27- or 32-inch monitor is the huge improvement in browsing, writing, and office apps. And with HDD prices going down, you can slowly upgrade your library to include a few 4K shows.
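To illustrate the integer-scaling point from the question above, here's a sketch of nearest-neighbor row mapping (illustrative math, not any specific monitor's scaler). On a 2160p panel every 1080p row covers exactly two rows; on a 1440p panel the coverage alternates between one and two rows, so a real scaler has to blend, which is the softness being described:

```python
# Which source row of a 1080p frame each destination row samples from,
# and how evenly the source rows are covered.
import numpy as np

def mapped_rows(dst_height: int) -> np.ndarray:
    return np.arange(dst_height) * 1080 // dst_height

for h in (2160, 1440):
    counts = np.bincount(mapped_rows(h), minlength=1080)
    coverage = sorted(set(counts.tolist()))
    print(f"{h}p: each 1080p row covers {coverage} destination row(s)")
```

Running this prints [2] for 2160p (perfectly even) and [1, 2] for 1440p (uneven coverage).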
Btw, some monitors let you change it. My Samsung monitor has 3 options: standard, faster, fastest. I use faster just because it does feel more responsive.
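Those standard/faster/fastest settings are overdrive levels. A toy model of what overdrive does (my own simplified math, not Samsung's actual algorithm): the pixel approaches each new value exponentially; overdrive commands a value past the target for one refresh to speed the transition, and too much gain overshoots the target, which shows up as inverse ghosting:

```python
# Toy pixel-response model: the level moves toward the commanded value each
# frame. gain=1.0 is no overdrive; higher gain overdrives the first frame.
def step_response(target=1.0, start=0.0, gain=1.0, tau=1.5, frames=5):
    level, trace = start, []
    for f in range(frames):
        cmd = start + gain * (target - start) if f == 0 else target
        level += (cmd - level) / tau
        trace.append(round(level, 2))
    return trace

print("no overdrive  :", step_response(gain=1.0))  # slow approach to 1.0
print("mild overdrive:", step_response(gain=1.3))  # faster, no overshoot
print("too much      :", step_response(gain=2.0))  # shoots past 1.0 first
```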
“… in majestic 4K UHD”
But actually it is 2560 x 1440
I love the look of disgust when showing the effect of software motion blur.
I still have my VG278H. I use it alongside my main screen on occasion (it's off most of the time). It has two uses: portrait as an extra screen when I'm programming (more screen, more better), and pinball games (portrait and pinball just work).
I arrived at my "ideal" PC sound system in 2006, after about 12 years of struggling.
I arrived at my "ideal" PC screen in 2022, after 28 years of struggling. I could go on forever about this LG OLED, ngl.
Actually, in my case the GAME is the problem. Persona 4 Golden has an after-image on everything that moves on the screen. It's even in screenshots if you take them while moving. The game itself is doing this, and it drove me crazy trying to diagnose it. It seemingly does it on every system, and you can't turn it off in the game settings.
@@figweb Sadly I'm playing on Switch through a capture card, so I'm stuck with it.
Actually it's my glasses. The new pair should be in next week.
This made things a lot clearer + thanks for the clarity!
ngl i keep motion blur turned on in games cuz it helps mask this effect on my shitty monitor
It took me ages to find the monitor I needed. The smearing I noticed was insane. I ended up with my current LG UltraGear 2K with an IPS panel, and it wasn't that expensive.
NOPE. Most games this console generation force TAA in order to use cheap, inaccurate workarounds for lighting and post-processing effects. Because of this, you typically CANNOT turn off the 'feature'... but that's exactly why you should look for mod workarounds and the like! TAA-based shadow and lighting enhancements universally run a frame behind, comparing last-frame information against current-frame data to generate the desired effects, with the downside of adding a ghosting blur that behaves identically to artificial motion blur. Yeah, I really wanted that extremely minor amount of chromatic aberration at the expense of everything smudging together, developers.
TAA is evil, and the blur it introduces is only the tip of the iceberg.
I'm waiting for an OLED-type 4K 240Hz monitor. ULMB2 would be quite the bonus.
Once that tech is out from several competitors, I can start dreaming of being able to afford one.
120/240Hz OLEDs with BFI are all I want: 120-165Hz for the budget side and 240Hz+ for the high end.
Me with a 4070 and a 1080p 60hz monitor from 2008 😎
Me: has a 4090 and the Alienware QD-OLED.
*Interesting*
Fun fact: The HDD shirt you're wearing is one I got in "mystery" t-shirt order!
You should do a video on Nvidia super resolution, because it's sweet and no one is talking about it. Real-time upscaling for watching old movies and streaming YouTube... and other tubes... is really awesome. I use less internet as a result as well.
They did. I tested it a lot as well, and it's only good for cartoons and very simple anime at 360-720p. For real-life content and games it's only good if the video is heavily compressed. At higher bitrates it's pointless, and at higher resolutions it eats up more GPU power (sometimes by a factor of 4) for less sharpness than free alternatives. MadVR gives more sharpness at resolutions above 720p, for less GPU power.
You can use both on the same system: have a portable player set up for one and a regular player for the other, so you get the best of both worlds. For anyone interested, there are guides for it, but it doesn't take long to set up.
I've never understood the fascination with pixel response times when everyone just ignores how blurry modern TAA makes games in motion look anyway.
You remove ghosting entirely from your monitor? Great, get ready to see it anyway with TAA
And ghosting with DLSS 😮💨
@@xDUnPr3diCtabl3 Admittedly, DLSS and DLAA have next to no ghosting relative to any other temporal option.
They still get softer in motion, no doubt, but they have far fewer artifacts than TAA, TSR, FSR, etc.
If you're having ghosting problems with DLSS, update the DLL file.
I always have this experience while playing Fallout 76, whenever I look down from a cliff at a thick forest and then move into it, or when I plow into thick tall grass and shrubs.
It's not as simple as "just don't buy a shitty monitor" when in some countries the shitty monitors all come from "reputable" brands. When upgrading from 24" 1080p, I went with a 27" 1440p 165Hz curved monitor from "Gamemax" because it was the cheapest option available at that resolution and very popular over here. It turns out a R$ 3000 ($631) monitor is outperformed by a 1080p MSI laptop screen (looks better, is what I mean). Actually, my monitor is now double the price for some reason, while an MSI equivalent (not a laptop) is about R$ 4900 ($1000).
our awesome brazillian taxes 💪
import stuff
@@estevaoang gotta keep it a 3rd world country by force
@@apache937 Doesn't change much since there's 60% tax on top
Maybe u should have done some research, there are way better 1440p 200hz+ monitors for around 300-400.
Can't wait for OLED monitors to become better and cheaper!
me with my overclocked 60hz to 75hz smushy blur of a $10 used ebay screen
Really surprising they don't just make LCDs that render the whole image at once and flicker their own backlight, so they look more like a CRT.
I mean, that's what this is, except it doesn't make it look more like a CRT. CRTs had a natural blending of the phosphor dots as the electron beam ran across them, making something like Dracula's single red eye pixel look more spread out and menacing. Kind of a natural AA that was just a byproduct of the technology.
You mean backlight strobing, as mentioned?
They do make backlight-strobing-free monitors - to reduce eye strain.
I don't think you can simulate cathode-ray tube function on a liquid crystal display.
@@Xfade81 I mean that they should develop LCDs that backlight-strobe automatically, without needing a GPU and special software to do it. Similarly, you could design the monitor to display all of its data at once, instead of scanning across the frame pixel by pixel. Or at least divide the monitor into 8 subsections (almost like 8 separate monitors stuck together), then update each section simultaneously for much less crosstalk. This could allow the backlight to be on for more of each frame, as sketched below.
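Some back-of-the-envelope numbers for the segmented idea above. This is entirely hypothetical arithmetic, assuming a 144 Hz panel and a 3 ms pixel settling time: a single global flash must wait for the whole frame to scan out and settle before lighting up, while each independent segment only waits for its own rows.

```python
# Hypothetical: how long the backlight could stay lit per frame if the
# panel were split into independently strobed segments.
REFRESH_HZ = 144
FRAME_MS = 1000 / REFRESH_HZ   # ~6.9 ms per frame
SETTLE_MS = 3.0                # assumed pixel response/settling time

def lit_time_ms(segments: int) -> float:
    scan_ms = FRAME_MS / segments            # scanout time for one segment
    return max(FRAME_MS - scan_ms - SETTLE_MS, 0.0)

for segs in (1, 8):
    print(f"{segs} segment(s): up to {lit_time_ms(segs):.1f} ms lit per {FRAME_MS:.1f} ms frame")
```

Under these made-up numbers, a global flash has essentially no time left to be lit, while 8 segments each get about 3 ms, which is the brightness/crosstalk trade the comment is pointing at.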
I have one of those fancy LG UltraGear 27" 240Hz OLED monitors, and I am beyond furious that they don't have black frame insertion or some other motion-blur-reduction tech... OLEDs are *by far* the most capable screens for this tech due to their low response times; you could theoretically tweak the on-off ratio with exceptional precision. But monitor manufacturers just refuse to implement it, and I cannot for the life of me understand why.
Not so sure why it is that important, when the vast majority of gamers would never use it in the first place.
@@killertruth186 but for the ones that would
I'd suggest getting an LG C1 or CX. 120Hz at 4K, OLED, and it has black frame insertion. Monitors kinda suck at this point compared to TVs in the mid-to-high-end price range.
@@killertruth186 The vast majority of gamers have never seen how incredibly strongly it affects motion clarity, so of course they don't have much of an opinion on it. It's like with 120Hz gaming a few years back when everybody was yelling "the human eye can only see 24 fps anyway!!! we don't need anything above 60Hz!", because they had just never seen how much of a difference it makes
@insu_na - who said the human eye can only see 24fps? That's hilarious.
I like my Iiyama Vision Master Pro 455, no blur because it's a CRT. Downsample 1600x1200 to 800x600@144 and it looks great. I may try downsampling TF2 from 3200x2400 (4x SSAA) to see if it'll run well.
I just got a Vision Master Pro 512 yesterday and it's amazing. CRTs are the future lol
@@jackn4908 Oh? You'll want a good adapter. The StarTech DP2VGA2HD (may not have the "HD", not sure) is pretty much the last word in compatibility and pixel clock, unless you're flash enough to get a high-grade Delock.
Cant wait to see Riley say "the world isnt the problem, your existence is"
Hi bot, it looks like you didn't buy enough upvotes, so sad :(
1:18 Damn right we are! GAMERS! GAMERS! GAMERS!
If you don't want to spend hundreds, you can just get a higher-Hz monitor and tune the black levels. You still have a little ghosting, but beyond 165Hz it becomes less noticeable. I tried 120Hz, then 144Hz, then 240Hz, and it reduced a lot, even in competitive titles like CSGO/Valorant/etc.
Motion blur (in-game, not from display device) is a preference, I like it, others may not, so there needs to be a demand from users to devs to implement actually useful settings (like strength, camera and world motion blur etc). Everyone should have a sweet spot. I play some competitive games with motion blur, roast me if you so desire.
my 24" 144hz LG monitor has a motion blur reduction function that is crazy good, i was amazed looking at the ufo test and text scrolling, you can read text as it's moving up the screen holding the down arrow
and it's cheap too (lg 24gn600)
that readability and smoothness reminds me of CRT monitors
finally modern screens are starting to achieve what CRTs were great on
i imagine more expensive hdr oled screens hit all the marks but those are way above my budget
Funny enough, among 24" 144Hz monitors the 24GN600 is one of the less good options; the contrast ratio is only around 800:1 and the color quality is average.
I guess panel tech is getting so good that even the average ones are very good, like you said.
CRTs had no motion blur and instant response time , by default.
Remember what we lost to get shitty LCDs.
1:08 Correction: anything that isn't OLED, CRT, or plasma :p
Can't wait for my monitor to give me nausea from all the flickering. No thanks, I'd rather have the smearing.
In fact, I like the motion blur generated by my monitor.
It makes the graphics look better when you have to play at a lower resolution due to a moderate graphics card.
That acer monitor they showed for $500 is not the right one. The one that supports ulmb2 is $900+
I still use a 10-year-old 60Hz 1440p office monitor. Just don't really see the need to upgrade. Games look good, especially with DLDSR.
It's more for really fast motion, when you need to keep track of things. You notice it when you want to keep an eye on a lot of detail but have to move quickly and turn your aim a lot, whether in a multiplayer game or even just a casual adventure game. Action is simply harder to track at 60/75Hz. And yes, it's PLAYABLE, but there IS a blur that messes up your vision in motion. Even reading text on a slowly scrolling page, which you can test right now, is hard because of the blurring and the relatively low refresh rate.
All of that annoys me personally, and I can't wait to at least jump to 120Hz for a doubling. Going higher isn't realistic for a lot of games anyway, because you need the framerate to back it up.
@@michaelmonstar4276 I've been thinking about it for a long time but well, for one, I have a really hard time finding small 1440p high refresh rate monitors in my area.
It seems they are either small and 60/75Hz, or 27" and bigger, which would mean a downgrade in pixel density. That would bother me, as aliasing is the bane of my existence.
Which leads me to the other thing: DLDSR. It doesn't play nice with high refresh rates. So then I think, okay, let's look at 4K monitors so I can run natively; I do have a 4090 after all.
But then it's like, okay, what do I do when a game has bad or no AA? Right now it's DLDSR + ReShade. If I'm already at 4K, I don't know if rendering at an even higher resolution is realistic. And a 30" or 32" 4K monitor would have about the same pixel density as my current monitor. I remember when I upgraded to 1440p, people were saying "you won't even need AA anymore"; yeah, that was totally not true. And if you go back far enough, people even said that about 1080p.
And then there's the fact that there will come a time when playing at 4K with a 4090 is no longer realistic. Running a 4K monitor at 1440p is going to look like shite, whereas returning a 1440p monitor, previously downsampled, back to its native resolution is doable to extend the life of the card.
So with all this to think about, I just end up doing nothing. Year after year. Lol.
Got my OLED monitors almost a year ago. I'm never looking back.
They are better than most people! 1:15 Homelander vibe, I am better, I am better! 😁
Rye-Lee is so smart! His genius can't be tracked!!
Volta Spark. The tips are not reversible!
Must say I was disappointed by early backlight strobing - namely that Asus TUF 27" 1440p 165Hz monitor featured on LTT several years ago.
The image was much dimmer, yes. But I didn't expect to get whacking headaches after 30-45min!
Almost unrelated point: I love using the 3D on my 3DS and never had/have a problem with it.
Except that I have the original, less powerful 3DS and the frame rate chugs more than normal...
Legend has it that Riley's moustache once stopped a would-be purse thief. The details are classified but the truth is undeniable.
Seriously... I've just solved a problem a bit like this, but it was the difference between HDMI and DisplayPort on the back of monitors.
OLED … no motion blur to speak of. Love mine!
Not true, there is still motion blur, because OLED is a "sample and hold" display: each frame stays lit for the entire refresh, so your eyes smear it while tracking motion, even with near-instant pixel response.
Get a CRT for better motion clarity.
Nice ad for ULMB2.
Blurring is not only monitor dependent. It can indeed also come from the GPU, as shown by some titles literally looking crisper on one GPU versus another (for example, tests by monitor review channels comparing the 3090 vs the 5700 XT).
Same monitor, same game, everything equal except the GPU.
Now, does that mean it is "only" the GPU? No.
Point is, it's a combination of factors. It could be the GPU, and/or the monitor, and/or even the game itself and how it's coded to run better on a specific GPU brand.
More important is to know where the blurring is coming from and whether it's severe enough to warrant expensive upgrades.
Cheers 🍻
And that’s why I still use a CRT for gaming
This video fails to mention a few details about ULMB that I believe are important. For example, ULMB cannot be used at the same time as VRR features like FreeSync and G-Sync. I don't know if this is down to a technical limitation, or if manufacturers worry that using VRR and ULMB together in a demanding game at low framerates could risk inducing seizures in those sensitive to strobing, and headaches in others. Sub-60Hz refresh rates on CRTs used to give me headaches; as a result, I required my CRT monitors to be capable of at least 75Hz or I would refuse to use them. Which raises the parallel question: do you maintain a high strobe rate regardless of the refresh rate, or do you strobe in sync with it? I don't know the definitive reason; all I know is that I've yet to see a monitor that allows ULMB with VRR enabled. Many, including myself, may not feel that's a worthwhile sacrifice. For me, VRR is the far more useful feature.
The other thing it doesn't really address (it kinda does, but only vaguely in passing) is that ULMB dims the screen a lot. They mention the dimming, but not how much it will affect features like HDR if you have a real interest in gaming or consuming HDR content. ULMB will drastically and negatively affect HDR performance due to the significant brightness hit it incurs, to the point where it would quite likely make HDR practically unusable or pointless.
I haven't seen anything about ULMB2 that addresses either of these two major shortcomings in any meaningful way. Top that with ULMB2 only appearing on really expensive monitors, and it forces the question: if I'm paying that much for a monitor, what's more important to me, ULMB2 or VRR+HDR? Am I really going to spend that much money on that high-end a monitor and then choose not to use two signature features of that bracket for the sake of one that isn't all that important "by comparison"? Honestly, at that price point, if motion clarity really were that important to me, I'd rather bite the bullet and buy an OLED. OLED already has vastly faster response times than LCD, resulting in much better clarity. Not as good as an LCD with ULMB, but nowhere near as bad as LCD on its own. So I can get most of the way there on clarity, but also get to use VRR and have a great HDR experience thanks to OLED's per-pixel brightness control from self-lit pixels. The only thing I have to be mindful of is burn-in, but if I take reasonable steps, I should hopefully be able to manage that risk.
Either way... ULMB2 just doesn't seem all that appealing. There are just too many downsides and not enough upsides to justify it, especially when far better options are already on the market.
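As an aside on the VRR question above: one plausible reason strobing and VRR don't mix is that a strobe backlight fires one fixed-length pulse per refresh, so perceived brightness tracks the frame time. A minimal simulation (all numbers illustrative, not from any specific monitor's firmware) shows how variable frame times would translate into visible brightness flicker:

```python
# Sketch: why a fixed-width strobe pulse clashes with VRR.
# Perceived brightness per frame is roughly pulse_time / frame_time,
# so if VRR varies the frame time, brightness varies with it,
# which shows up as flicker. All numbers below are illustrative.

PULSE_MS = 1.0  # hypothetical fixed strobe pulse length

def perceived_brightness(frame_ms: float) -> float:
    """Backlight duty cycle for one frame."""
    return PULSE_MS / frame_ms

# Fixed 144Hz refresh: every frame is ~6.94ms, so brightness is constant.
fixed = [perceived_brightness(1000 / 144) for _ in range(4)]

# VRR with frame times bouncing between ~48 and ~145 fps:
vrr = [perceived_brightness(ms) for ms in (6.9, 11.2, 20.8, 8.3)]

print([f"{b:.1%}" for b in fixed])  # constant: ['14.4%', '14.4%', ...]
print([f"{b:.1%}" for b in vrr])    # jumps: ['14.5%', '8.9%', '4.8%', '12.0%']
```

A monitor could compensate by scaling the pulse width with the frame time, but that trades back the motion clarity the strobe exists to provide, which may be part of why vendors simply lock the two features out of each other.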
No thank you, my 27-inch 1080p 60Hz monitor I bought second-hand for 40 bucks with a one-year warranty works fine. Since I would have to pay about 600 bucks to get a decent 32/34" 1440p monitor for my 3080, I'll just stick with the one I have. I don't need more than 60Hz for WoT anyway, and I mostly play VR games.
People are complaining about blur and low refresh rates, but for me 60 is perfect; it feels almost like 120 or higher and eats less power. Also, why care if reaction time is slower? You just shoot where the enemy will be, not where the enemy is.
Lmao hitting home with the ghosting joke at the end 🤣
It's almost like you've never even heard of Blur Busters Approved monitors. The ViewSonic XG2431 and XG270 have pretty much perfect blur reduction.
@3:08 This needs to be made into a message tone!
The biggest problem is the price for a high refresh rate 4k OLED monitor
You don't need 4K... 1620p needs to be normalized, but 99.9% of the population acts like it doesn't exist and treats 1440p as the in-between.
That said, most of the few 1620p monitors, all LCD, are already expensive because they're kind of a niche. - Thanks consumers!
nice pun on the ghosting
ULMB2: "We put the W I D T H in PWM."
Really curious about the "military grade" thing (mentioned in the sponsor spot). Where did it come from and what does it actually mean?
I kinda understand that it means "rugged", but is there actually a standard for this kind of thing?
I feel they use it for marketing, just a term to "wow" people, which is often the case. There IS a real military standard for ruggedness (MIL-STD-810), but marketers rarely say which of its tests, if any, a product actually passed. I would ask Volta directly if they can back it up...
But generally, don't take it too literally; just consider it to mean "strong", and check the warranty. Some of these brands for things like cables and cases make big durability claims and will even exchange a product that goes bad, for about any reason, within a certain period. Not sure about Volta, though.
Maybe add monitors to your PC budget build guides from now on, then??
Also, ULMB2 has issues with not rendering the full frame.
I mean, if you're gonna spend a lot on a monitor, you may as well just get an OLED. They have improved a lot for gaming, and they're being released with 240Hz, HDR, G-Sync, and all those sweet extras which will never make your aim better.
lol
Rule Nr. 0: don't go competitive without a TN panel monitor.
2:51 skip ad
Don't forget depth of field. I just got an RTX 4060 and was able to crank up CoD DMZ, and it looked like absolute, blurry garbage. I experimented and figured out that the ever-present culprit of depth of field was the problem. Turn that trash OFF. In some games it looks okay, but in others it destroys the visuals outright.