The G-Sync module is dead! Long live the G-Sync module!
The module as we know it is dead*
Fixed
I'm still rocking a G-Sync monitor and like the fact it can sync all the way down to 1Hz (especially with games like Final Fantasy 16 on PC, which has 30fps cutscenes, so they look smooth).
@@legendp2011 There's no reasonable purpose to scale down to 1Hz. it's not going to make 1 frame per second any less torn.
@@yerdude My G-Sync monitor was just so much better in that regard than any VSync-only monitor I had.
One minute I held the key
Next the walls were closed on me
And I discovered that my castles stand upon pillars of salt and sand
Oh boy, all the 300 new OLED monitors this year are going to get new versions next year lol
So? The old ones keep working just fine. The G-Sync module is next to useless, as "G-Sync Compatible" is all you need.
@@TechTusiast That's exactly the point lol, wonder how many are going to buy this newer "better" version because of their severe FOMO
@@TechTusiast I do wish we got OLED Pulsar. I have a C1 with BFI and it's insanely good, but being locked to 120Hz means I have to keep a lot of headroom in case I get a demanding scene. So often I could run maybe 200fps average but still only get 120fps at the 1% lows. BFI is too good on OLED and really unexplored on monitors as it is now.
Good. I can buy a great used one that’s basically as good as the newest
I predicted that Nvidia will have dp2.1 gsync monitors ready for Blackwell so it's possible.
Sounds like a good move for nvidia. Remove their hardware cost, work with partners, turn it into a software license fee.
It is still a hardware solution. The hardware costs are reduced, but not zero.
@@PrefoX Not a cost to Nvidia. They just license to MediaTek; monitor manufacturers buy from them.
@@keyboard_g Why are you talking like you own Nvidia? Consumers are still charged extra for it, and that's what matters. So shoo, shoo.
If only they'd do this with their graphics cards
@@user-ew8lt6yi2i You get what you pay for.
I have an older Acer Predator monitor with G-Sync.. If that thing has a fan, it's the quietest fan I've never heard.
The G-Sync fan inside the LG 38-inch UltraGear ultrawide from 2020 is horrible, a constant high-pitched whine
same. always shocked to hear that some monitors are actively cooled and loud...
Wait wait wait.... it's been 10 years since G-Sync was released?! 😮 I feel old now 😂😂
@@buddybleeyes Don't blink 😉!
@aberkae Literally man 🤣🤣🤣
haha, I feel you on that! I still have one of the OG BenQ 144hz 1080p G Sync panels on my desk, hard to believe how far monitors have come since then
Worse, I'm still using one of the original G-Sync monitors🤣
Grindr-Sync
I am so lost, so many concepts
Man, I remember when I got the world's first G-Sync IPS monitor, the Acer 27" Predator XB270HU, in July 2015.
I still have that monitor. Still no dead pixels and brightness is the same. A pretty well-built monitor. Purchased mine in Nov 2015.
G-Sync Compatible monitors are validated to be artifact-free, not validated *and* certified as shown in this video.
G-Sync monitors with the module are validated (artifact-free) and certified (300+ tests).
Current monitors with modules couldn't receive firmware updates. I imagine going the MTK route will change that.
The Alienware OLED DW measured way better than the DWF for VRR flicker in the RTINGS test. Whether that's due to the G-Sync module is far from proven, as it's the only OLED monitor with the module... but it's an interesting thing I wish could be investigated more. The DW measured by far #1 for VRR flicker.
I've noticed the flicker maybe once so far and when I do it's usually a really dark and demanding scene. Thankfully the flicker isn't that big of an issue for me
I've had the DW for almost 2 years now and have never seen any VRR flicker in any games, from darker games like Dead Space Remake or Resident Evil 4 to games like BF2042 where it's maxing out the monitor's Hz.
@@craig9365 the flicker happens to me whenever there is a micro stutter, and depending on game/engine that can happen very often.
Of course it's because of the module..
@@craig9365 you can see VRR flicker right there in this video. At 4:43, for example.
They used just an FPGA for G-Sync for quite a while, hence the cost and the need for active cooling.
They just used something they designed...
@@DaKrawnik Later on sure, but all the previous models didn't. And even if it's an FPGA, it still means they'll have to design it, since that's how they work. The same HDL code tested on the FPGA with timing and routing constraints is then put into an ASIC without those constraints.
@@DaKrawnik The very first module was an Altera Arria V GX FPGA.
Can't wait to see another hundred-dollar difference for a g-sync monitor, now with a new sticker.
MediaTek scalers don't cost hundreds of dollars.
@@Raivo_K But Nvidia branding and a new sticker do, which is what he's implying
@@scarletspidernz That's speculation at this point. We'll see what the prices will be.
G-Sync TVs soon then
@scarletspidernz My prediction: Nvidia will have DP 2.1 monitors available for a Blackwell Titan RTX, for the whales. So after one or two quarters of craze, prices will drop significantly, especially by Black Friday 2025.
I guess it's... good? Seeing as there were some monitors with the G-Sync module that could cause compatibility issues with Radeon cards despite having a FreeSync/Adaptive-Sync certification.
No, it's terrible. In my opinion it will prevent FreeSync from progressing further.
@@kesamek8537 How so? FreeSync hasn't exactly been innovating recently anyway. The only really good updates have been HDR support and G-Sync Compatible support
@@kesamek8537 Wow, are you for real? Like FreeSync has made any improvements since its release, right? If there is room for improvement without a dedicated module, sure, they can improve and make this new module obsolete like they did back in the day; but if there is no room for improvement, this module doesn't prevent any improvement whatsoever. I can't even process how you got to the conclusion that this prevents others from going further. It's completely flawed.
@@kesamek8537 FreeSync is still stuck unable to sync below 40fps in many cases, whereas G-Sync refreshes all the way down to 1fps (1Hz). Important for slower computers that can only hit 30-ish fps (or games that have cutscenes capped at 30fps)
@@legendp2011 What nonsense, both G-Sync and FreeSync support LFC below 30 FPS.
They're pretty much taking notes from smart TV design. TVs have had discrete video processing chips for ages now; it's practically standard design for them. It's an interesting move.
Get ready for those trash MediaTek chips. I remember it took Sony forever just to get VRR working through HDMI 2.1
I remember that, and it's probably more on Sony for over-promising and under-delivering... just to sell more TVs.
I've used TVs with the newer, more capable MediaTek chips and they work pretty darn well.
@mikewmoran84 Because they designed the chip? Come on dude, spread the blame on that one. I'm sure Sony would have wanted a flawless announcement and release.
Interested to see Nvidia's pulsar in action. Hoping these monitors will actually be more affordable.
They announced some monitors with Pulsar, but unfortunately no dates or prices.
But I looked up one of those monitors, and its current version without Pulsar is 1 grand. I don't think the Pulsar version is going to be cheaper.
Hope this means we'll see laptops with this new MediaTek scaler and access to all the previously module-exclusive features.
There haven't been any laptops with the G-Sync module, so they've all been just "G-Sync Compatible", but now we should be able to get full G-Sync or even G-Sync Ultimate displays! Totally psyched.
Almost certainly.
In theory, the power consumption of the MediaTek chip should be negligible. Ideally you also get 2048+ backlight zones, not only for deeper blacks but also for better power efficiency.
Bummer that G-Sync Pulsar is not supported on any of the recently announced 1440p 480Hz OLED monitors. I guess we will have to wait until next year for that.
OLEDs don't have a backlight and are not good candidates for G-Sync Pulsar. On OLEDs you can only do BFI, and you can't dynamically change the persistence by varying the strobe-length duty cycle. And you need that for Pulsar.
@@hastesoldat I thought so too at first, but listen to what Tim says at 5:17: "I asked Nvidia about support for OLEDs with this new G-Sync scaler and they confirmed that theoretically OLED panels will be supported through this partnership, although the initial focus is on high-end LCD panels." I'm sure they'll figure out how to combine VRR and BFI.
@johansjostrand6026 Yeah true, although BFI isn't nearly as useful, but it would still be nice tbf. Honestly it should be way easier to implement actually
"a decade ago" Wow has it been already 10 years?
Oh, I never noticed that I had the only OLED monitor with G-Sync on the market.
I love the wide thing, can't wait to upgrade in 5-10 years.
Just remembering how long I had to wait for a decent gaming OLED monitor... the industry really is milking the snail's pace they are working at.
Hopefully the new module will be better equipped to mitigate vrr gamma flicker on the oled side of things. 🙏
VRR just blew my mind when it came out. Once I experienced those first 120Hz screens I needed that buttery smooth performance, and without VRR any dips below 100fps felt woeful with stutter; once G-Sync arrived I could handle dips down to as low as 80fps without being too pissed off.
Glad we now have FreeSync/Adaptive-Sync, so good.
It also annoys me that people focus so much on VRR stopping tearing... come on, Adaptive V-Sync did that. It's the removal of stutter that's the game changer.
And input delay
Tandem OLED with FPGA-accelerated Nvidia Pulsar and HDR-RBFI injection is when things finally get interesting for me and worth replacing my fancy Sony CRT gaming monitor, which looks incredible with RTX, with perfect motion clarity and input fidelity that standard sample-and-hold OLED can't get anywhere near.
I've been saving up for a full G-Sync monitor for 3 years now.
Seems I'm now saving for this.
This will require the upcoming 50 series cards, right?
Interesting update. I would have thought this would be a bigger deal for OLED monitors, as they have a focus on image quality.
I loved my early 120Hz G-Sync monitors and 3D gaming 🥳💪🥰👍. I still have my 3D glasses set and components and would be interested to see if these work on the new G-Sync modules coming out 🤩😁
Still running the almost decade-old ROG PG279Q. It was an absolute beast back in 2015 when I got it; my PC wasn't even powerful enough to hit 165fps at 1440p, and I guess you could say the same about these 240Hz 4K monitors today, since even a 4090 can't touch that reliably without significant compromises to image quality.
These high-end monitors are only really useful for esports games that run fine on ancient hardware. Since then CRT tech has been my go-to, putting out a far superior gaming experience that can scale with PC power and the type of game being played. The world's first native multi-display-mode monitors. Size and desk space are the only downsides really.
In general this means more, lower-cost VRR LCD monitors with variable overdrive and strobing, as well as good FALD algorithms for mini-LED, which is a good thing.
Personally though, I'm using OLED, and whether this means OLED BFI without a refresh rate compromise dictates whether this is extremely exciting or mildly boring.
Now with the word Pulsar, it sounds fancy and space techy, it must be good.
I don't know if this is related to the G-Sync module in my LG OLED monitor or just the monitor itself, but I specifically bought this monitor because it was "G-Sync" enabled, to use with my RTX 4080 Super. Playing games like Starfield and Cyberpunk 2077 caused a lot of flickering with G-Sync on; turning it off stopped the flickering.
Always appreciate these technical breakdowns.
Great, I'll see this tech 5-8 years from now, when I buy a new monitor.
Any plans to possibly review the Samsung 57" Super Ultrawide (which is like two 32" monitors side by side) with its dual-4K resolution? I think its only 'downside' is being mini-LED instead of OLED, but I would love to see your review of it... it has a 1000R curve, but that's also what I'm used to (I have a 49" G9 Odyssey currently). I'm hoping Samsung/LG will make an OLED version of the 57" Super Ultrawide monster.
Ehh this is neat. But it was announced sounding like an update to existing monitors.
I was hoping vendors like Samsung would update their Odyssey line for better compatibility
Holy shit 10 years of g-sync
This took a while to happen. See, my DWF display flickers with adaptive sink, and my previous display, the Samsung CHG90, also flickered with adaptive sink enabled. Both have traditional scalers and Samsung panels. Going two for two, I cannot help but mistrust traditional scalers on Samsung panels. I feel that, in order to ensure that my next display supports adaptive sync properly, I will go for something G-Sync branded.
Adaptive sink, is that something that comes with modern bathrooms?
Why don't any 4K 240Hz QD-OLED monitors come with the G-Sync module?
The original G-Sync modules were like 20nm (I believe) Altera FPGAs. They also used DDR3 and only supported DP 1.2a.
Did the hardware really never change, with only software updates over time to allow for things like DP 1.4? Because if so... no wonder hardware G-Sync has a large cost penalty and requires active cooling. That's some seriously out-of-date hardware that can't be very efficient or affordable (I'd imagine DDR3 and those old-ass 20nm chips aren't exactly mass produced these days?)
I'm curious about the frequency range for the new G-sync module because that was a big advantage compared to other VRR implementations. Having VRR on games running less than 45fps can make a huge difference to perceived smoothness
Nvidia has been setting an unprecedented bar recently. What was once completely proprietary is now going, maybe not completely open source, but they're at least making tools available to create the stuff they've created in the past. Truly innovative stuff, and as far as my knowledge goes, they're the only company to actually do this with their software. It's genuinely surprising that this is the direction Nvidia chose to go; kinda makes me wish more companies did this.
This is great! I've used non-G-Sync-Compatible monitors and the VRR flickering is annoying. This channel told me it didn't matter nowadays which adaptive sync to use with whatever GPU, but I've had to return one better monitor to get a lower quality monitor which is officially G-Sync Compatible. It still has some flickering, but it's minor.
Is it a VA panel? Because every VA panel I tried had some kind of flickering with G-Sync. Had no problem whatsoever with IPS or TN panels
It's a VA or OLED problem
@@A.Froster Good point, yes it was a VA monitor. It was my first time experiencing a VA monitor and I'm not a fan. I don't like the motion blur issues, even on the latest VA monitors. I replaced the VA monitor with an IPS monitor.
FreeSync VRR can only sync from 40Hz and above. G-Sync can sync at much lower Hz (all the way down to 1Hz). Huge difference for me
@@legendp2011 Good to know, thanks, but the Freesync monitor that I was using had issues even when the frame rate only went down to about 60 FPS. It still flickered until I turned off VRR.
I was kinda hoping to hear something about fixing VRR flicker on OLED. Until that's dealt with, VRR is DOA on OLED. I won't be purchasing another monitor until this issue is fixed with the new RTINGS test performed to prove it.
This could mean MediaTek could be strong-armed into de-featuring the scalers they put into products that don't carry the full Nvidia branding, in my opinion.
edit: therefore making FreeSync relatively gimped going forward.
We'll probably see G-Sync TVs soon
I could see VRR working with AMD... but not Pulsar (VRR + strobing).
Yep, hence the problem with Nvidia vertically integrating into everything. Nvidia already strongarms AIBs due to their market power so I wouldn't be surprised if they did the same with other spaces they are creeping into. They pull the same BS in the AI space. Nvidia is essentially a horizontal and vertical monopoly in the GPU market.
LGs have already had G-Sync for years @@dagnisnierlins188
Yea I don't like it, yet people in the comments are cheering for it. Gotta love a monopoly.
Would love to get a 480Hz 1440p 27" glossy OLED with Pulsar/ULMB2, this would be the greatest esports monitor
Wonder if they will release 32-inch 4K OLED monitors with this new G-Sync technology
Still rocking my Asus PG278Q (first Gsync monitor?) nearly 10 years later, no active cooling AFAIK.
Good. As a French customer I can tell you that these monitors with the module from Nvidia are nearly impossible to buy in Europe. I guess that means it'll be more accessible to European people; we have to wait and see on the price though...
Not really. Monitors with G-Sync hardware have never been common at all, that isn't a Europe-exclusive thing. I'm from EU as well and wanted a G-Sync monitor a few years back. Had no trouble finding one that fit my specs and buying it.
MediaTek has come a long way... now they're dominating TV and mobile SoCs
Adaptive-Sync and backlight strobing simultaneously are already on some monitors, like the Gigabyte M34WQ
Finally. Though I'm still hard pressed to find a monitor to replace my Predator X35.
Should see some exciting monitor changes at CES next year
Good point. I do like that you've updated the community with this. I wouldn't have known without someone telling me. I can pass this on to my brother too. He's gotten more, newer monitors since, but he still daily drives his first "big expense" monitor from 2014, back when G-Sync was still relatively new.
I guess that means he'll have to wait another couple of years. His wife gets the last say lol.
The G-Sync module meant my Alienware OLED can't be firmware updated by the user.
TBH with how bad the VRR flicker is on my $1100 qd-oled, something that was supposed to be the end all be all of monitors, all of my interest in new monitor technology has effectively evaporated.
I'm currently using a BenQ EX240 monitor which allows VRR and backlight strobing at the same time
Thanks H.U.; I had no idea the Nvidia scaler module required a G-Sync monitor to include a fan!
While Nvidia GPU users wait on the new tech to reach us, should we basically just buy a G-Sync COMPATIBLE monitor? Can you recommend me an IPS that at least has a wide color gamut, like DCI-P3 or Adobe RGB? I will need to use Nvidia DSR to 'hopefully' upscale my 480-720p movie collection and display it well enough on a 32" 1440p monitor. That would be greatly appreciated!
Thanks again for the news flash!
Once this makes it to OLED panels, then I'll be interested. Can't use anything else compared to it.
I just want more devices with a physical G-Sync chip so I can turn my monitor down to 24Hz, so I can watch movies/shows *truly* without any sort of judder.
"G-Sync Compatible" the disclosure of "compatible" I feel has been hidden/absent in most cases of "g-sync compatible" monitors. Weather brands deliberately and maliciously "forget" to advertise the word "compatible", or innocently and ignorantly leave the word out, thinking it doesn't mean anything- probably both. But Nvidia are very clever, and I would be shocked if it wasn't by design.
I'm a happy FreeSync user (NVIDIA GPU with an LG OLED TV). What am I missing out on? It seems that G-Sync's advantages are very niche. What are the use cases that would benefit from an "upgrade" to a G-Sync setup?
A non-OLED monitor always benefits from an actual G-Sync module. If you are just running a standard VRR monitor, you have to manually change the overdrive settings every time you play something at a different framerate if you want to avoid blur and overshoot, and even then it's not ideal.
With that said, it's not a bad experience, and most people aren't aware of how overdrive influences their image, but when you do know what's going on, it's hard to ignore.
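To make the variable overdrive point above concrete, here's a rough Python sketch of what a module can automate: picking an overdrive level that matches the current VRR refresh rate instead of one fixed manual setting. The lookup table and function name are invented for illustration, not any real monitor's tuning.

```python
# Toy illustration of variable overdrive: with VRR the refresh rate follows the
# framerate, so the "right" overdrive strength changes from moment to moment.
# The table below is an invented example, not real panel tuning data.

OD_TABLE = [          # (refresh the setting was tuned for, overdrive level)
    (60, "Off"),
    (100, "Normal"),
    (144, "Fast"),
    (240, "Extreme"),
]

def pick_overdrive(current_refresh_hz: float) -> str:
    """Return the strongest overdrive level tuned at or below the current
    refresh rate, so lower framerates don't get an overshooting setting."""
    level = OD_TABLE[0][1]
    for tuned_hz, name in OD_TABLE:
        if current_refresh_hz >= tuned_hz:
            level = name
    return level

# A fixed manual setting is only correct near one framerate; hardware that
# re-runs this choice as the VRR refresh moves around avoids the compromise.
print(pick_overdrive(90), pick_overdrive(200))   # -> Off Fast
```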
At last! 😅
Thanks for the video! 😊
Waiting for the Monitor Tests.
Will it solve the VRR flicker in OLED models?
How can we know if a monitor has the old G-Sync module or the new one? Does the name change?
Probably you'll have specific branding like "G-Sync Pulsar" that implies there's a new MediaTek G-Sync chip inside.
We know for sure that any new monitors released from this point either have, or don't have, the MediaTek scaler. There's no third option with the old module anymore.
And as always you should read/watch reviews. I very much doubt there would be new monitors released with the same name anyway.
Will be interested to see if it is as smooth as the current g-sync module.
I have both the DW (gsync) and DWF (non gsync) and when it comes to g-sync/freesync/adaptive sync, the DW is generally smoother (on nvidia card at least)
I've got a theory this module is also to set up Pulsar working on OLED in the future.
OLEDs don't need it since their response times are over ten times as fast as even the best LED monitors.
@@selohcin They can still benefit from black frame insertion to reduce perceived motion blur
10 year anniversary of the best monitor tech since CRT's were invented.
So what does that mean for just G-Sync Compatible screens? Like the LG C2?
G-Sync Compatible just means Nvidia signed off on a FreeSync display, so nothing would change, as it doesn't have a G-Sync module
@@SpyderTracks It has low framerate compensation when it's labeled Compatible
@@Heisen_burger-dude FreeSync is software driven by a standard VRR controller. It only has a low frame rate compensation window if it's a crappy display (gaming-wise); that's not related to the controller, as the controller can only address what it's connected to at a hardware level. This is a primary reason why you never buy a TV for gaming; they're substandard at an actual gaming performance level. But it's irrelevant anyway, as it's not a G-Sync monitor, so it isn't affected by this.
@@SpyderTracks Little nitpick: G-Sync Compatible is a certification for Adaptive-Sync. "FreeSync" is a brand from AMD for their VRR ecosystem. For example, you could theoretically have a non-FreeSync-branded monitor that is certified "G-Sync Compatible".
@@hastesoldat Well no, in some future land yes, of course you could, but there are to date no G-Sync Compatible monitors that aren't FreeSync. But yes, if Intel came to the party and branded some monitors under their own VRR moniker, then those could become G-Sync Compatible too if they reached the required bar. But word on the street is that Intel's days in the GPU space are done and Battlemage is DOA. It's not really a nitpick, as it's highlighting something that doesn't exist.
Thanks for the video :)
What is the probability that we see this tech in 34" QLED and mini-LED monitors later this year?
What happens then with the OLEDs? They only come as G-Sync Compatible and I don't see any with ULMB2. Should we wait for new OLED monitors to come out?
Yes, but will we still have OLED flicker issues? I assume yes, because it's on the OLED side.
Best move ever. Everyone is a winner with this one
Turned out to be one of the best things Nvidia has released. Some monitors from the early G-Sync era could be very easily hacked into doing fully unlocked BFI/backlight strobing at any framerate, especially at 60FPS, with a very low strobe length making them look incredibly clear, much more so than most monitors today that still do not allow this feature at all! Not only that, but the module apparently could mitigate the VRR flickering, and some G-Sync modules could even do BFI+VRR.
So, I'm actually sad to see this.
Honestly it's stupid of companies not to offer strobing/BFI; BFI on OLED is so easy to implement compared to the fine tuning needed with IPS, yet only ASUS does it...
60FPS is not clear... by definition.
@@rnf123 60 hz *can* be perfectly clear in motion. Virtual reality headsets are proof of that.
@@houssamalucad753 Strobing hurts my eyes.
12 ppl did that. Cool.
AMD forced them to do this!!! 👏 👏 👏
I might be the exception, but after moving to OLED I left the horrible experience of VRR behind and moved toward frame capping whenever possible...
I don't know how you came to that conclusion. Frame capping literally pairs perfectly with VRR. Without VRR, frame capping will either tear if VSync is off, or judder like crazy if VSync is on.
If you're not using VRR/GSync/Freesync then you're doing something wrong.
On my 144Hz GSync Compatible monitor I often cap to 90FPS. So my monitor matches at 90Hz. That still requires GSync to be working (along with VSYNC to prevent tearing if frametime changes too much).
I suspect you mean you've left behind wild swings in the FPS, but again you should be using VRR. And probably are. Frame capping + VRR is the ideal solution. Digital Foundry has a bunch of good videos on this.
(and not all frame capping is equal. I cap in-game if possible, but if not I use NVidia Control Panel. I have ALL games using an FPS cap. Some of them are old games that run at 138FPS as I have Ultra Low Latency enabled globally in the NCP.)
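As a small illustration of the general "cap below max refresh" advice being described in this thread, here's a quick Python helper. The 3fps margin is just the commonly cited rule of thumb for staying inside the VRR window, not an official Nvidia number, and the 48Hz floor in the comment is an assumed example.

```python
# Keep the fps cap a few frames under the panel's max refresh so frametime
# spikes stay inside the VRR window instead of hitting the VSync ceiling.
# The margin is a rule-of-thumb assumption, not anything official.

def vrr_frame_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    """Suggested fps cap for a VRR display with the given maximum refresh."""
    return max_refresh_hz - margin_fps

print(vrr_frame_cap(144))  # -> 141, comfortably inside an assumed 48-144 Hz VRR range
```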
I am still waiting for the "Minotaurs on box" line of clothing 😂😂😂
Should I wait for an OLED G-Sync module or pull the trigger on one without it that's on the market right now?
But what about G-Sync Ultimate modules? Isn't one of the bigger features better HDR performance with VRR?
Can you guys make a video on how annoying DSC (Display Stream Compression) is when tabbing out of and into games? Is this ever gonna get fixed by NVIDIA, or is there a solution (other than running games in borderless; the big issue is games don't minimize to the taskbar in borderless)? Deal breaker and annoying asf
Monitors just recently peaked with high refresh rate OLEDs, and now they have to add a new feature.
What's the Alienware monitor that was being shown in the video?
Will existing monitors without the G-Sync module see a benefit from this in any way?
No. This only applies to future monitors featuring the new scalers.
In short, no. In detail, still no, not for the existing monitors. Maybe if somebody wants to compete with this module by improving things with software, or by creating new modules for future monitors or updating old ones, but solely relying on software will only get you so far.
Will the module have any impact on G-Sync flicker on OLED monitors?
G-Sync Compatible is good enough for me, but would be nice to have variable overdrive and VRR below 48Hz range, a game like Destiny 2 can choke my poor old hardware hard enough for FPS to drop that low in some scenarios
But so far this is limited to IPS panels? Nothing coming soon for OLED ones?
OLED turns individual pixels on and off, so you can't pulse a backlight. The main trick for reducing persistence is to NOT show light for a fraction of the time, to allow your brain to "reset". Old PROJECTORS had a spinning disc that periodically blocked the light coming from the lamp... with OLED you use "BFI" (black frame insertion). BFI and backlight strobing are similar, but I'm not sure of the OLED difficulties... The LG CX OLED TVs had BFI but then it got dropped (I doubt it was used with VRR though).
With the LG CX BFI the light level dropped a lot because the black frame lasted a long time, but I see no reason you couldn't have a hardware chip doing better control (i.e. light pulsed OFF for only 1/24th of the time or whatever makes sense).
This technology WILL come to OLED, but not exactly in the same way. You'd turn the pixels OFF for a small amount of time, every so often, dictated by the refresh rate. I'm not sure if this is possible with existing OLED panels or not. It seems like it SHOULD be, since OLED has a near-instant response time. I'm ignorant of the issues, so to me it seems like it should be easier to do than variable backlight strobing.
I suspect everything they learn getting VRR+strobing working would make OLED BFI fairly easy to implement.
(I'm guessing OLED is being looked at NOW. OLED is pretty new to desktops, so maybe Nvidia is focused on what sells the most. Or maybe we need new OLED panels.)
Yes, Nvidia themselves said that for now their focus is on LCD, not OLED.
Also I would not say two years is new; that's how long mainstream OLED has been available on desktop.
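For anyone who wants rough numbers on the persistence vs brightness trade-off described in the long BFI comment above, here's a back-of-the-envelope Python sketch. All values are illustrative assumptions, not the behavior of any real panel or of Nvidia's implementation.

```python
# Rough sketch of the persistence/brightness trade-off of strobing or BFI:
# light each frame for only a fraction of its refresh period and the image
# gets clearer in motion, but dimmer on average. Illustrative numbers only.

def bfi_tradeoff(refresh_hz: float, on_fraction: float) -> dict:
    """For a BFI/strobing scheme that lights each frame for only a fraction of
    its refresh period, return the persistence (visible time per frame) and
    the average brightness relative to full sample-and-hold."""
    frame_ms = 1000.0 / refresh_hz            # full refresh period in ms
    persistence_ms = frame_ms * on_fraction   # how long the image is actually lit
    relative_brightness = on_fraction         # average light output vs. 100% hold
    return {
        "frame_ms": round(frame_ms, 3),
        "persistence_ms": round(persistence_ms, 3),
        "relative_brightness": relative_brightness,
    }

# e.g. 120 Hz with a 25% duty cycle: ~2.08 ms of persistence, but only ~25% of
# the sample-and-hold brightness, which is why BFI dims the image so much.
print(bfi_tradeoff(120, 0.25))
```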
I really hope Pulsar is not vendor-locked to Nvidia GPUs. After all, the Asus solution did not require talking to the GPU and can just hook into adaptive sync, but knowing NVIDIA I doubt it.
Now they'll just come up with another price gouging feature to milk the Pc Gamers' never-ending supply of FOMO if the G-Sync well has run dry.
Just like AMD cloning everything Nvidia and Intel is doing?
like every other corporation ever
@@Vipppala "How dare this company not spend tons of money to help other corps grow"
Nvidia seems terrified of competition , not inspired by it.
DLSS3
At least on my old VG259QM and my new 271QRX, VRR is pointless. If you run the VG259QM at 280Hz there's no tearing, ever.
G-Sync/VRR just makes it less smooth, because response times get worse as the refresh rate drops on LCD
... So, the monitors on the market with G-Sync Modules built-in will have their prices cut, right? Right??
Will every single game I fire up still suffer from G-sync gamma flicker? 😒
Am I wrong in thinking that this should now be much cheaper to implement, and therefore (hypothetically) shouldn’t raise the price by so much?
It should result in cheaper monitors than those with the FPGA modules. But there will likely still be a premium for the inclusion of the feature.
Ultimately those scalers don't cost that much (unlike the FPGA module), but it's still the monitor vendors who decide the final price at the end of the day.
YES and NO.
I expect a few hundred dollar premium to start, but within two years I think we'll start seeing prices drop quite a bit. There will be a lot of factors such as how many backlight ZONES there are.
Competition will drive this, so yes, the chip shouldn't add a lot to the price (Bill of Materials). Not sure on the LICENSING part though.
Makes sense. Thanks guys. Kind of a shame, but I guess you’re right that as it’s still a differentiating feature that people want, it’s going to be used in more expensive monitors. I was hoping that it would be a much more universally adopted chip, and not used as a premium upsell
Just got my hands on an LG 32GS95UE. Any chance of a general Windows HDR guide / HDR terminology guide? Had a real pain trying to get Netflix 4K working too, with the need to download HEVC, and it still ended up not working.
You need HEVC, and to plug the monitor in using HDMI instead of DisplayPort.
Hoping for ULMB2 for OLED, finally. What does OLED have a pixel response time of 0.03ms for, if it can't even do intra-frame black microframe insertion?
To me, it sounds like a relatively easy problem at some point. I'm honestly not sure what the issues are. I'd imagine that you'd just have a MediaTek chip doing something similar to VRR strobing, but in this case just turning the pixels off every so often... but screen technology can be pretty complicated, so I'm unclear if there are any issues that would make this far harder. Is it a simple on/off duty cycle? Or do you need to quickly analyze the data (like how much WHITE there is) and adjust the duty cycle further? I have no idea.
It just won't be called ULMB2, because it's no longer ultra low motion blur. It's not clearer than running at max refresh. You have to halve the refresh rate for BFI on OLED, remember?
@@Frozoken Please read again, thank you.
@@insu_na I did. I'm saying it's just no longer as relevant that's why
@@Frozoken You're completely misunderstanding the point. OLED has a pixel response time of 0.03ms. There's not a single reason why it would have to halve the refresh rate, because it can absolutely insert a black frame at basically any time. If you maxed an OLED out it could flicker between black and white at 33kHz.
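A quick timing-budget check of the argument above, in Python; the numbers are illustrative assumptions that carry over the 0.03ms figure from the thread and ignore real-world OLED driving constraints.

```python
# If pixel response is on the order of 0.03 ms, a black period can in principle
# fit inside a single refresh window instead of eating a whole frame.
# Illustrative math only; real panel driving has constraints this ignores.

PIXEL_RESPONSE_MS = 0.03   # assumed response time, taken from the comment above

def bfi_budget(refresh_hz: float, lit_fraction: float) -> dict:
    frame_ms = 1000.0 / refresh_hz
    lit_ms = frame_ms * lit_fraction
    dark_ms = frame_ms - lit_ms
    # both transitions (on->off and off->on) have to fit in the dark window
    transitions_fit = dark_ms > 2 * PIXEL_RESPONSE_MS
    return {"frame_ms": round(frame_ms, 3),
            "lit_ms": round(lit_ms, 3),
            "dark_ms": round(dark_ms, 3),
            "transitions_fit": transitions_fit}

# At 240 Hz with the image lit for half of each ~4.17 ms frame, the ~2 ms dark
# window dwarfs the 0.03 ms response time, so no refresh-rate halving is needed.
print(bfi_budget(240, 0.5))
```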
I want OLED Pulsar. I have a C1 with BFI and it's insanely good, but being locked to 120Hz means I have to keep a lot of headroom in case I get a demanding scene. So often I could run maybe 200fps average but still only get 120fps at the 1% lows; having BFI+VRR would allow me to increase graphics and still get the BFI benefits. BFI is too good on OLED and really unexplored on monitors as it is now.
Curious if the AW3225QF will get support. It uses the MediaTek MT9810.
I have one of those AW3423DW OLED monitors with a hardware g-sync module. It supports zero features that my previous "g-sync compatible" monitor didn't. No ULMB, no reflex, nothing. The VRR doesn't work any better either.
Oh look, a collector's item now, like 3D TVs & Nvidia 3D.
Will the new solution still support refresh rates down to 25fps? (Ideally down to 1fps)
Definitely.
In fact, lower-FPS flickering issues are part of what the VRR strobing feature is designed to help solve. You don't need a chip for "25FPS" anyway, provided the monitor's refresh range is sufficient, as you just send the same frame multiple times (i.e. 25FPS is sent to the monitor as 50FPS, so the monitor is now at 50Hz but drawing each frame TWICE) using the video SOFTWARE drivers... but the lower the FPS relative to the monitor, the more blur issues you get (Blur Busters has a page on this).
*So to be clear, 25FPS on an Nvidia Pulsar monitor is going to be the best way to experience low-FPS gaming.
**It can NOT solve anything related to STUTTER that's caused by a computer NOT creating a new frame quickly enough, though. So if you're talking about game stutters like that, then NOTHING can solve that beyond a faster CPU and/or GPU etc.
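Here's a minimal Python sketch of the frame-repeating (low framerate compensation) behaviour described above. The 48-240Hz window is just an example range I've assumed, not any specific monitor's spec.

```python
# Minimal sketch of low framerate compensation: when the game's fps falls below
# the panel's minimum VRR rate, the driver repeats each frame enough times to
# push the effective refresh back into the supported window.

def lfc_multiplier(fps: float, vrr_min_hz: float = 48, vrr_max_hz: float = 240) -> int:
    """Smallest whole-number repeat count that lifts fps into the VRR window."""
    multiplier = 1
    while fps * multiplier < vrr_min_hz and fps * (multiplier + 1) <= vrr_max_hz:
        multiplier += 1
    return multiplier

# 25 fps content on an assumed 48-240 Hz panel: each frame is shown twice, so
# the panel runs at an effective 50 Hz while the game still delivers 25 new
# frames per second.
fps = 25
m = lfc_multiplier(fps)
print(m, fps * m)  # -> 2 50
```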
I think the market simply produces fewer shit scalers now.
The G-Sync module was great back then; it wasn't perfect, but at least it drove today's scalers to be good