@PCMonitors great! Really appreciate you investing your time in this! I recently got the monitor and so far really do love it. I ended up using DP as I found HDMI caused random artifacting, which I read was a problem but only with the refresh rate set at 240Hz. 120Hz eliminated the artifacting, at least with my monitor.
Recently bought an AW3423DWF, loving the HDR Peak 1000 mode, but in some games, like Battlefield 5, Cyberpunk in daylight or other bright games, ABL makes it almost unusable. Enabling source tone mapping helps in some games (like Ghost of Tsushima) but the dimming is brutal. Is there a solution? Maybe a firmware update? Do you always play in Peak 1000 mode?
How do you find those scenes with the 'DisplayHDR True Black 400' style setting(s)? The ABL behaviour is there due to power limitations with the technology, it's not going to change with a firmware update. The 'DisplayHDR True Black 400' mode is more accurate with its PQ tracking and will show what the panel can technically do in terms of brightness in such scenes. If you still find things too dim that's truly the ABL behaviour you're observing, if not then it's because of the over-dimming of mid-tones with the 'HDR Peak 1000' setting.
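For context on the 'PQ tracking' mentioned in the reply above: under HDR10 the signal level maps to an absolute target luminance via the SMPTE ST 2084 (PQ) EOTF, and a mode 'tracks PQ' well when its measured output follows those targets until the panel runs out of headroom. A minimal Python sketch of the standard curve (the constants come from ST 2084; the sample signal levels are arbitrary):

```python
# SMPTE ST 2084 (PQ) EOTF: normalised signal level E in [0, 1] -> target luminance in nits
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(e):
    """Target luminance for a PQ-encoded signal level e (0..1)."""
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

for e in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {e:.2f} -> {pq_to_nits(e):.1f} nits")   # 0.5 is ~92 nits, 0.75 is ~1000 nits
```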
45:50 This might sound like a bunch of copium, but we might actually benefit from ABL more than we lose. Sure, in a fully lit and sunny room, fullscreen 1000 nits would be rad. But at night it would result in a flashbang whenever you look at the sky or play in a snow-covered landscape like this Battlefield example. On my current monitor, the BenQ EX2780Q, I have BI+ enabled for this exact reason. 400 nits hurts my eyes when I use my PC at night for gaming, and even more so for productivity.
I don't personally find bright-dominant scenes overwhelming on Mini LED solutions which will raise brightness much more than on this model. Even there the whole scene is not at a high brightness, it's just certain sections of it. Because of the contrast you could also argue that smaller bright highlights (which QD-OLED presents with very high brightness) surrounded by darkness are more impactful or potentially uncomfortable to look at. Everyone has their own sensitivity to brightness though, I think multiple options with different brightness levels is the best approach. I don't like the representation of HDR on models like the EX2780Q without local dimming (or with very ineffective local dimming) because the whole scene is flooded and everything is raised up more than it should be, not just the brightest elements. I don't think the experience is really comparable to a 'proper' HDR experience even if just considering viewing comfort.
@@PCMonitors Interesting, I have no reference to actual HDR monitors, apart from my evo OLED TV, so I can't comment on that. However, I ordered the AW3225QF as soon as I finished your review :) Can't wait to finally have a look at content in 4K and HDR on the computer *_*
If that's their estimate, that's their estimate. It entirely depends on when you order and how their demand and supply curves are shaping up. Mine arrived slightly before the estimate, but I ordered it as soon as it was available.
Because it's "cool" and the kids love it! But as I say in the review, it's pretty subtle. Draws you in a little bit, can aid immersion slightly. But I personally agree the curves are best kept to ultrawides. On 16:9 they're either subtle with limited effect (like this) yet potentially still annoying to some on the desktop... Or they can be too steep and annoying a lot of the time.
agree, (subtle) curves have their place, but not on a 32" (imo). I will say I am on a 42 C2 and I *definitely* miss the slight curve my old 34" ultrawide had...
I would say I am 99% sure this was done by Dell/Alienware in the 32 inch format for marketing purposes only. To stand out from the crowd and have one more reason to sell it as a gamer monitor. It is a good thing since you have a lot to choose from - like the chosen one, the PG32UCDM - but if you can get this for $800-900 compared to $1300 for the ASUS, with 3 years of advanced warranty, I would say the curve is not important then 😂
Back up a bit and you'll see I was showing and talking about 2560 x 1440 (QHD) which is not the native resolution of the monitor. So what I'm saying is when the AW3225QF is running 2560 x 1440, it's softer than a monitor that has a native resolution of 2560 x 1440 and is the same size.
@@bledarrm7976 Sure, it will run up to the monitor's highest resolution, so anything the game supports up to that. There are differences in performance for different resolutions though.
That's a very broad question. I'd say it's better under HDR due to superior colour volume, it's a bit more vibrant under SDR due to gamut and the tighter pixel density is very nice.
I usually sit and game at 1.2m from my 65 inch Sony A95L qd-oled tv, which can reach 1500 nits on 1% window and 1300 nits on 10% window. I wonder how drastic the brightness reduction will look to my eyes if I switch to these 32" qd-oled monitors instead. I've always liked to play games on monitors more due to the closer distance (which allows me to see more details I feel) and much higher pixel density (which I hope can compensate for the blurriness of taa/dlss), but peak brightness figures seem very disappointing compared to bigger tvs.
The only time you notice it is when you're doing browsing or office stuff, and it's actually nice because you don't burn your eyes out when some app/website that doesn't have dark mode lights up. DLAA and low or off aniso looks best imo
Tough to say for sure, but I suspect the closer viewing distance will help provide a suitable perceived brightness and 'pop' which won't disappoint compared to your TV.
@@PCMonitors thanks for the reply. Are the color gamut/volume on these monitors also worse compared to tvs? I wonder why. They have gen 3 qd oled panels while tvs only have the gen 2.
@@vmd1293 The tuning seems to be slightly different, perhaps. DCI-P3 is similar, ~99% for both. Adobe RGB and Rec. 2020 coverage is also slightly lower on this one compared to the PG49WCDM I looked at. Visually there's very little difference you'd notice, though. Different software and devices can measure a slightly different gamut, but all of the QD-OLEDs (TVs or monitors) are ~80 - 85% Rec. 2020 so within a fairly similar range really.
Apples to oranges really, depends on your own preferences for screen surface and QD-OLED vs. WOLED characteristics. I'd recommend waiting for our review of the ASUS PG32UCDP, which is their version of that one using the same panel.
Is there something similar with a more normal aspect ratio/full 4K resolution? Preferably not curved. Similar or better anti-reflection coating. Edit: Disregard that. Amazon UK is directing me to the wrong monitor model which has a weird widescreen resolution. Looks like they don't have this one. Why is it so hard to find a decent OLED monitor in the UK!?
Hey, I am running my games on a 7900 XTX. Will I be able to utilize FreeSync while playing in HDR 1000/TB 400? Because I can't wrap my head around Dell offering a 4K HDR flagship without official FreeSync Premium Pro support
Yes. 'FreeSync Premium Pro' is just a confusing AMD-specific certification which sometimes involves its own 'HDR pipeline' - this just makes a few tweaks to the HDR experience (not usually good ones). You do NOT need FreeSync Premium Pro or even any specific FreeSync certification to use HDR + VRR with an AMD GPU, so your experience will be similar to an Nvidia user here.
I'm considering this monitor, but I would like to clarify one thing: how does the ~250 nits figure relate to the brightness you'll get in SDR games and video? If 250 nits is the value for a white area, then will a light blue area (e.g. the sky in a game) put out less than 250 nits? Or will the display notice the content has lower luminosity and turn up the brightness so that the sky is at 250 nits? Put another way: if I normally game on a monitor adjusted to 500 nits white-rectangle-brightness (which is too bright for desktop, but good for daytime gaming), then if I switch to the AW3225QF, will my games look 50% dimmer?
Yes, it's the brightness of white and any deeper shade will have a lower brightness. That's the maximum brightness that any shade will appear on the monitor (it's the same way on any monitor, regardless of panel technology). Games won't necessarily look 50% dimmer, though, because perceived brightness involves a lot more than just measured white luminance. For example the screen surface and exceptional consistency plus exceptional contrast of this model works in its favour to enhance perceived brightness, in the right lighting.
@@PCMonitors Thanks for the clarification. I thought maybe ABL would make the difference between a white rectangle and the sky a bit smaller, but I guess those new OLEDs don't use ABL anymore. When I got the AOC Q27G3XMN (mostly due to your review), I realized I like ~70% brightness for daytime gaming, which is ~400 nits white-rectangle brightness. I tried setting 45% brightness to simulate a QD-OLED (I used a light meter app to estimate that 45% brightness puts out about 250 nits on white) and while I can see everything just fine, I find the image much duller and less lifelike. I am flabbergasted by this, because I've always pooh-poohed high brightness on displays for ergonomic reasons. Turns out what's too bright for desktop use can be too dim for gaming/video use!
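To put a rough number on the point above that deeper shades sit below the white luminance: the relative luminance of an sRGB colour is a weighted sum of its linearised channels, so even a pale sky blue lands well under the white point. A small Python sketch (the 250-nit white level and the example colour are just assumptions for illustration):

```python
def srgb_to_linear(c8):
    """Linearise one 8-bit sRGB channel (0-255)."""
    c = c8 / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r, g, b):
    """Rec. 709 / sRGB relative luminance, where 1.0 = full white."""
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

white_nits = 250                      # assumed SDR white luminance
sky_blue = (135, 206, 235)            # a typical light 'sky blue'
print(relative_luminance(*sky_blue) * white_nits)   # roughly 140 nits, not 250
```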
As noted in the video, it's very subjective and depends on the scenario and how much the frame rate is fluctuating. I don't find it bothersome - I do on some VA models, for example, where it's stronger and I'd sometimes even want to disable VRR simply because of the flickering.
Could still give a really nice experience. Basically as shown but limited to 120Hz maximum (and generally lower frame rates - depends on game). So a lot of excellent image quality characteristics to enjoy, still.
It covers ~80% Rec. 2020 which is high for an OLED but not as high as some LCDs (with QD LED backlights). So the representation is reasonable but it can't display the most saturated shades.
Honestly - it makes no difference to the experience, it's just a certification used for marketing purposes. Actually sometimes being 'FreeSync Premium Pro' certified can be a negative as it sometimes causes the monitor to use a separate and usually worse-calibrated HDR pipeline.
You might be used to bright monitors and may need to adjust to a more limited brightness level. Or try to. It's also worth working on room lighting if you can, perhaps that's too bright.
"Desktop" is just another preset of the monitor, briefly covered in the "Best Settings" video - th-cam.com/video/N8k03ZM_BvY/w-d-xo.html. Dell Display Manager isn't specific to this model and not covered in these videos. Though is briefly touched upon in the OSD video - th-cam.com/video/ZYfIYV4LqQI/w-d-xo.htmlsi=hxjHUHA95SX39SG_&t=1406.
@@PCMonitors Hmmmmm. I guess I'm looking for a deep-dive summary table of nits and color accuracy like you did for Creator and Standard, as it relates to your PC settings. Like comparing your findings for Creator mode and HDR with leaving Standard and toggling display settings in PC settings (brightness, HDR on/off, ICC changing system profiles). On the Display Manager side, I would like to see if application mode and dynamic sync work with this model.
@chevonpetgrave4991 Right, you're looking for something reviewers don't realistically have time or inclination to cover. It's not practical or particularly useful to take measurements using every preset - they typically make a few (sometimes undesirable) changes to the image or are so similar to the standard or default setting that they aren't worth exploring. The presets I focus on are selected for good reason. The impact of the full native colour gamut and how it impacts the image, which is extremely important, is explored in the review. As are the characteristics of the only SDR presets which don't use that gamut (Creator Mode colour space settings). Display Manager is software which is available for a range of monitors and isn't a core aspect of the monitor itself or its performance.
Totally get that. I guess I was asking conversationally, not for a review video. I'm testing the viability of these for a color-managed editing workflow that depends on high contrast ratios, high peak brightness and specific black levels, which this monitor checks off at a reasonable price. Just wanted to pick your brain on what I would be getting myself into workflow-wise. From my research on another Dell monitor (UltraSharp series) I envision using Dell Display Manager to auto-sync ICC profiles with the system. Making sure any monitor profile is synced with what the system is outputting is critical for working in DaVinci Resolve. I do it manually (first from my system settings and then the same on the display OSD) on my lame monitor now. But BenQ and Dell have series that will match the counterpart automatically when you switch on either the OSD end or the system settings end. Additionally, based on the application you're in, you can even switch as you open a new app to jump into a new color space and sync all in one move (open the application). I was going to go the UltraSharp route for price-for-features, but saw that this monitor does OLED and HDR. At first I was like, oh, that's a gaming monitor, it probably doesn't have the capability to work with Display Manager. But when I saw in the spec sheet that it is compatible, I started to wonder what is possible. That's where I'm coming from, for some context 😂
Yeah, that makes sense. Well it definitely works with Dell Display Manager and as far as I'm aware associated features should work. The ICC profile switching section of DDM was shown in the OSD video, it's definitely accessible.
Assuming you're actually running at 540fps, the ASUS should have an edge overall because of much lower perceived blur due to eye movement. Might be some transitions where the OLED would have an edge simply due to the visually instantaneous pixel responses vs. slightly below optimal for some transitions on the ASUS, though.
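A rough way to see where the 540fps edge being described comes from: on a sample-and-hold display, eye-tracking motion blur scales with how long each frame is held on screen, so at the same tracking speed the smear width shrinks roughly in proportion to refresh rate (assuming the frame rate keeps up). A small Python sketch with an arbitrary scroll speed:

```python
def perceived_blur_px(scroll_speed_px_s, refresh_hz):
    """Approximate eye-tracking motion blur on a sample-and-hold display:
    each frame is held for 1/refresh_hz seconds while the eye keeps moving."""
    return scroll_speed_px_s / refresh_hz

for hz in (240, 360, 540):
    print(f"{hz}Hz: ~{perceived_blur_px(960, hz):.1f}px of smear at 960px/s tracking")
```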
@@ployth9000 Probably not as it's an LG and they don't have a functional UK PR department. Can't really justify purchasing that one myself at the moment.
@@PCMonitors OK, well what about the Samsung Odyssey Neo G9 57? I have been waiting for either you, TFT Central or Monitors Unboxed to review it. I have been using it and enjoying it, but it would be nice to see more people reviewing it.
Hi. Will the FreeSync option on my computer with Ati Radeon 7900XT work on this monitor? Some reviews/specs say no and some say yes, so I am confused. Thank you.
@@TheNikolinho Yes, per the review VRR is supported including Adaptive-Sync. Meaning you can use FreeSync. The monitor isn't 'FreeSync certified' but that doesn't matter.
@@PCMonitors Thank you for that answer! The last thing - customers are complaining that the DisplayPort is obsolete - 1.4 instead of 2.1. My GPU does support 2.1. I'm ignorant so idk if having DP 1.4 does something to the quality of the screen, picture, watching movies, or playing games? Thank you!
Makes no difference whatsoever to the quality of the image or really to the functionality. With the exception that DSC is required for the signal, which locks off access to some Nvidia features such as DSR and DLDSR. And alt-tabbing in and out of games can take longer.
Apparently this and the 27" model have a 10% off deal if bought with an accessory through Dell. Does anyone know if this deal is available outside the US?
@@PCMonitors I'm sorry for not being clear enough. The sentence at 3:03, "if you use HDMI you get up to 240Hz", sounds to me as if you can't get 240Hz on DP. Sorry for that one sir. Just wanna make sure before I buy something :D
I wish all monitors had the option to be bought without the stand; such a waste of money paying for something you will never use, and of course it either sits in a wardrobe or goes in the bin....
It was fine with our RTX 3090, not sure what the issue would be (if there is one). Maybe something Dell needs to work out with Nvidia if so. But PC users can use DP 1.4 without issue anyway, so I don't see the problem.
It depends how stable the frame rate is, not specifically how low it is. The main problem you'd notice there would be if it frequently dips below and rises above the LFC boundary (so it's dipping below perhaps 55fps and rising above that).
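For context on the 'LFC boundary' mentioned above: when the frame rate drops below the panel's minimum VRR refresh rate, Low Framerate Compensation repeats each frame so the refresh rate stays inside the supported range, and hovering around that threshold is what produces the abrupt refresh rate (and potentially brightness) jumps. A minimal Python sketch of the idea, assuming a 48 - 240Hz VRR window (the exact range and switchover point are assumptions):

```python
def lfc_refresh(fps, vrr_min=48.0, vrr_max=240.0):
    """Return (frame multiplier, resulting refresh rate) under simple LFC logic."""
    multiplier = 1
    while fps * multiplier < vrr_min:   # repeat frames until inside the VRR window
        multiplier += 1
    return multiplier, min(fps * multiplier, vrr_max)

# Hovering around the boundary: 50fps runs natively, while 45fps gets doubled to 90Hz,
# so a frame rate bouncing across the floor keeps toggling between very different refresh rates.
for fps in (60, 55, 50, 45, 30):
    print(fps, lfc_refresh(fps))
```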
Very much the same as in games, which are explored in great subjective detail here. That's under SDR at least - refer to the HDR section for discussion on 'Dolby Vision' vs 'HDR 10' as it applies to movies.
The monitor doesn't "filter" the content in any special way. Unlike TVs monitors don't include 'denoise' filters etc... So it entirely depends on the content. You still benefit from the exceptional contrast and colour performance, even if you see some compression artifacts in places. The gamma tracking on the monitor is good and it masks these artifacts 'appropriately' with the '2.2' standard in mind. It also offers alternative gamma settings if you wish to mask them more.
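A quick illustration of why the gamma setting affects how well near-black compression artifacts are masked: under a pure power-law gamma a low signal level maps to a tiny fraction of white luminance, and a higher exponent pushes those shades even closer to black, so blocky noise in dark areas stands out less. A small Python sketch (the signal levels and 250-nit white point are arbitrary):

```python
def signal_to_luminance(level, gamma, white_nits=250.0):
    """Map an 8-bit signal level to luminance under a pure power-law gamma."""
    return (level / 255) ** gamma * white_nits

for level in (10, 20, 30):
    row = ", ".join(f"gamma {g}: {signal_to_luminance(level, g):.2f} nits" for g in (2.0, 2.2, 2.4))
    print(f"level {level} -> {row}")
```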
The TV can add a lot of processing and filtering (per my previous reply), the monitor shows things more as intended. But both monitors and TVs have various settings that can be configured to make things more or less "as intended".
➡ Amazon link: geni.us/iUjbJ
As an Amazon Associate I earn from qualifying purchases made using the "Amazon link" above.
Introduction: 0:00
Refresh Rates & Scaling: 2:38
Features & Aesthetics: 5:45
Subpixels & Calibration: 12:26
Contrast & Brightness: 19:42
Colour Reproduction: 26:04
HDR (High Dynamic Range): 33:11
Responsiveness (General): 57:21
Responsiveness (VRR): 1:03:00
Conclusion: 1:06:57
It doesn’t show on Amazon UK do you know when it will be available?
@@CurleeToes Appreciate the interest, but I'm afraid not. It's common for Dell to hold Alienware flagships like this as exclusives on their own site for a few months before supplying additional retailers.
It doesn’t show on Amazon Canada do you know when it will be available?
@@orvilleswearing334 Afraid not. As usual for flagship Alienware monitors it'll be available from Dell directly for a little bit before reaching Amazon.
I ordered mine from Dell on March 21st coming April 16th :)
One of the best and unfortunately also most underrated channels on YT. Thanks for another great review!
Yes, it is true. Maybe if this channel were funnier, it would be more popular (I don't mind the presentation; for me, it's excellent), but most people demand excitement, a lot of entertainment.
@@socialreport2836 Also, not everyone has 1 hour to watch a video about a monitor on a channel they're seeing for the first time.
Yup, I make content I enjoy in the style I like and don't chase the algorithm. This does reduce the potential audience (monitors + very detailed and long videos = a niche within a niche), but it is sustainable and I appreciate the community I generate from it here. 🙂
No disrespect, there is still one guy who in my opinion is the most accurate reviewer on YT for TVs and monitors, search for HDTVTest 👍🏼 Still, this is also a great review
Indeed the best monitor channel on TH-cam. I love the way everything is described in detail. That's why I've been subscribed for years.
I wanted to subscribe after 3min of the video, but I was busy watching it for 1h10min more. If anyone is in need of some good information, then this might be the best source for it. What a great channel. Hope you keep yourself busy with other upcoming oled monitors as well.
Great to see, glad you appreciate all the information. 😄
so, have you subscribed or not?
@@Xpert45 Definitely did, it just took me 1:14:00 to do it. Still trying to decide what monitor to go with. The new 34" Samsung seems like a good one, but I am kinda hoping for someone to bring out a good 38" option
I hope one day this channel will have 300k+++ subs. Top reviews always, calm and no bullshit ❤
This video has been a godsend, especially the part on the scaling into 1440p. Thank you so much!
Glad you liked it!
I recently got this monitor and love it. I wanted a flat screen, but was tired of waiting for the MSI or Asus versions to come in stock. The curve doesn’t bother me at all. Going from a good IPS to QD Oled was a huge jump. The only downside so far is my other monitor now feels washed out by comparison.
"I'm keeping it" says it all. Thanks for another amazing review!
Thanks for being the only reviewer to acknowledge and demonstrate the VRR flicker issue. Another typical OLED problem that would be good to demonstrate is the vertical banding visible during horizontal camera panning in dark game scenes - particularly in games that use elevated blacks. Callisto Protocol, Cyberpunk, Alan Wake, etc.
I haven't observed anything OLED-specific of that nature so I'm not sure what you're describing there. Vertical banding could be related to high frequency low amplitude oscillation of the backlight. Or just vertical banding due to alternating darker and somewhat brighter shades and there being less perceived blur to mask eye movement on an OLED. Or possibly you're describing a DSE (Dirty Screen Effect) phenomenon which IS quite common though not unique to OLED - and as covered in the conclusion not an issue on this one.
VRR
Isn't that present only with G-SYNC enabled?
@@urkent4463 VRR = Variable Refresh Rate. So G-SYNC is an example of that, yes.
@@PCMonitors Sorry for the late response. I was referring to poor uniformity on dark colors such as a 5% gray field. Most OLED TVs and monitors will show darker or lighter gray streaks running up and down the screen instead of a uniform gray across the whole panel. Such uniformity issues become very obvious when panning camera in many dark games like the examples I listed. This OLED issue is rarely mentioned by anyone but HDTVtest and Rtings. Hopefully more awareness will push display manufacturers to improve in this area. Thanks.
@@jackw9568 So that's DSE, which I do assess and, where applicable (some VA and OLED models), cover and demonstrate. I didn't have an issue with it on my unit, as mentioned in the conclusion. 1:09:07.
This is the perfect monitor for me - I was able to get it for $952 with various CC credits and cashbacks.
Lol the fuck. I'm a little jealous but I have been gaming on it since it came out
Thanks for another comprehensive review! I love that we're getting proper OLED options on the desktop.
Great in-depth review as always! Your effort is much appreciated.
I've got a hunch I'll end up buying this monitor myself! Thanks for the great review...and greetings from Europe!
Hey, a reviewer on YouTube that isn't just an advert/algo farming! Subbed.
Superb review. A keeper for sure. I am tempted to buy it even though I prefer and already own a QD UltraWide.
Good tidbit about the sharpening if using 1440p scaling. I’ve never owned a 4k monitor and this will be my first one. Can't wait!
It’s been great so far!! Few little things still need to be fixed but the first update did help
What about the 27 version or 32 msi?
@@mjregan88 I only have experience with the 32 inch AW, forgive me
Excellent review. Particularly because it goes into detail about calibration/configuration display settings etc.
excellent review. I ordered one after watching this and should be here this week.
Appreciate the long format detailed review. Wanted to ask though, are the qd oled magenta raised blacks similar in magnitude compared to the AW3423DWF? Or have they improved it?
It has been very slightly improved, but as a previous owner of the AW3423DW I can't say the difference is at all dramatic. I did some side by side testing and the screen surface looked extremely similar in most cases and I'd say the size difference accounted for a lot of screen surface behaviour difference I might've observed.
@PCMonitors thank you for your input! The raised blacks drove me insane since I was unable to control my room's ambient light, so I ended up selling my AW3423DWF. Also have you noticed the qd oled TVs tend to perform quite a bit better compared to the monitor versions with this issue?
I don't have experience with the TV side, but the panel technology and lack of outer polarizer is the same and based on testing from RTINGS etc. it seems to suffer to the same degree. It could be more about the size and curvature of the screen.
I returned my PG32UCDM. I wanted to love it but it was just too dim. Any bright scenes just looked so dim. It looked incredible when playing in a darker environment but any snow maps or sunny days looked really unnaturally dim.
Thank you for your review! Also really cool to see that you test scaling, show default resolutions, really in-depth stuff that I don't see getting featured in other reviews! On the topic of scaling, if you send the monitor a signal with a custom resolution like 3200x1800, does it not support it so you get letterboxing (like with 1080p 240Hz)? You can always use GPU scaling, but that probably has some issues with DSC and 240Hz as well, right?
Yeah, custom resolutions are complicated due to DSC. Which is why Nvidia Control Panel doesn't even allow creation of them on monitors like this with DSC active. But if you use CRU or an alternative means to create them there will be restrictions.
@@PCMonitors Ah ok, thanks! For Nvidia GPUs, you can actually activate the old Image Sharpening menu instead of the new Image Scaling. It requires a small edit in the registry. This older technique also comes with Lanczos as the scaling filter, so it's pretty great. Does this perhaps work instead? I never had a DSC monitor so I unfortunately can't test :/
Finally a serious and passionate Reviewer.
Great work. Really appreciate your passion and effort in showing to the world your findings.
brother this review is no jokes. Thank you for your service.
Just for anyone on the fence about it: there is a 10% discount if you add any cheap $1 cable to the cart.
Disclaimer: not sure how long this will last, and this is on the US site. Not sure if it's the same internationally.
I think I am going to buy this. I want it primarily for gaming, and my GPU can't push this screen, but I will be upgrading my GPU when the next generation of GPUs comes out.
I bought a 4080 super when it came out to push this thing.
@@veilmontTV I bought a 7900 XTX and it works well. I'll definitely be getting a 5080 to keep pushing this awesome monitor.
Great video. I'm looking for a monitor for my PS5 - would you recommend this one or one of the other ones coming out this year?
I think this monitor gives a very nice experience even if you just consider it limited to 120Hz. And that's what you'd get on the PS5. There aren't any models coming out this year that would provide a superior experience, really.
Thanks for the quick reply I will pick one up soon.
Very nice Video! Is there any content planned for the Asus PG32UCDM?
No, won't be looking at that model. Just this one and the MSI version planned for now.
Outstanding review. I've just ordered one.
I just bought this monitor but I'm not a gamer....what's the best setting for video editing and productivity? New subscriber 🎉 Or perhaps you can recommend another Mac-friendly 4K monitor to use? It may be overkill for what I need it for...also wasn't really looking for a curved monitor but this was the only 4K monitor in the store and on sale*
I'd recommend looking at the 'Best Settings' video, but really you need to adjust according to your own unit and preferences - th-cam.com/video/N8k03ZM_BvY/w-d-xo.html. 'Creator Mode = sRGB' is really going to be most appropriate if your video editing is in the sRGB colour space. The monitor could work very well for your uses, but you need to be mindful of the burn-in risk and your lighting conditions.
Looking forward to your thoughts on the PG32UCDM vs this monitor and other equivalents. I still need to replace my Predator X27 but the brightness is fantastic. Maybe I will have to wait for PHOLED?
Yeah it's difficult to compare the X27 with the OLEDs because their strengths are in very different areas. Bright scene HDR performance is definitely better on the X27. But the colour reproduction, responsiveness and HDR performance where darker shades dominate is much better on the QD-OLEDs. The differences between the QD-OLEDs themselves are far subtler and I'd generally recommend going with the one that's cheapest in your region. So far they all look pretty solid.
I was thinking of buying this monitor for the upcoming PS5 Pro that will be available later this year.
For now, I will continue to enjoy my very aging PS4 Pro playing 60Hz games
It will be quite the upgrade when the time comes!
You had done a fantastic review on the PG32UQX which I own. Is it worth replacing it with one of these QD OLEDs? The haloing is the main weakness of the PG32UQX but I am concerned that if I replace it I would miss its brightness and HDR performance. Also, I use my monitor for (office) work a lot so I am still not sure that getting an OLED makes sense. It would be great to get your view.
Yeah, both models have distinct strengths in HDR. I can't guarantee no 'burn-in' for heavy productivity usage, either. Though I use QD-OLEDs myself (AW3423DW for a few years, now swapped to this one) without issue, including for a fair bit of productivity. I make sure the screen turns off (via Windows power management) after 3 minutes of inactivity and I use a dark taskbar and desktop theme, but don't take additional precautions - aside from making sure it runs the scheduled cleaning cycles. I treat it largely like I would an LCD. I personally think it's worth the risk given the warranty coverage and obvious strengths in other areas.
Thank you! I really think your reviews are the best btw. People can make a fully informed decision to buy or not any particular display based only on your review because of the amount of detail and thoughtfulness you put into them, consistently covering and clearly demonstrating all possible considerations (rather than just transmitting overall impressions). Consumers need this level of detail before forking out such amounts of money for devices that are meant to be used on a daily basis and for a significant amount of time. Keep up the brilliant work!
Thanks - I appreciate the feedback and glad to see the detailed reviews are helpful!
Did you replace the PG32UQX?
@@DrakonR No, I use it for work as well as gaming, I would not be comfortable yet working on an OLED for hours on end as I do on the PG32UQX, I would be too scared of burn in. Also I own an LG C1 which I use as a second monitor so, for games that would look poorly on the PG32UQX because of blooming, I can always switch to the C1. So it would not make much sense at the moment for my use case, not to mention financially. And there is always the brightness of the PG32UQX which is fantastic, like bright daylight scenes in HDR feel like you are actually outdoors.
It's interesting that you mention that the VRR flickering isn't as bad as on WOLED. I saw a comparison between LG's new 32-inch WOLED and the Samsung G80SD (QD-OLED) and the VRR flickering was more noticeable on the QD-OLED.
When you say "a comparison" - in what context? If you mean based on a synthetic test (like the VRR flickering test or the way RTINGS tests this) then don't put too much weight into that. It's a very artificial testing scenario and VRR flickering can vary in intensity and how potentially noticeable it can be depending on the exact fluctuations occurring. I found it slightly less noticeable during normal gameplay in various scenes on this model compared to various WOLED models I've tested, but I wouldn't say it's a huge difference... It can also vary between individual models - so either way I can't claim QD-OLED > WOLED for VRR flickering.
In my experience talking to people who have tested or used various OLED monitors, VRR flickering is generally just something you're sensitive to and will notice (regardless of which OLED you're looking at) or you won't. Or you might notice it and not find it bothersome. The only model I've used which I'd say has any real edge in this respect for 'normal gaming' is the AW3423DW and that's because of some compensation performed by the G-SYNC module. But for some people even that level of VRR flickering is annoying.
@@PCMonitors th-cam.com/video/pTLDrVMgzzQ/w-d-xo.html
Interesting. As I said it depends on the exact model so I don't think it's right to bundle together "QD-OLED" and "WOLED" but rather consider the model itself. I also find cameras poor at reflecting what your eyes see in terms of flickering. I've observed very similar situations on Cyberpunk where the camera will detect very small gamma changes which aren't observed by eye. And even in the VRR flickering test it's picking up fluctuations at parts of the gradient or in ways your eyes just don't perceive flicker. I'd also stress that the monitors in that video you posted are set up very differently - the poster of the video states in a comment: "Samsung was in its sRGB clamp mode, I had to put the WOLED in FPS preset to raise gamma so it doesn't look so dark on camera. The FPS preset doesn't clamp to sRGB so its oversaturated." Nonetheless, I still got the impression they generally observed stronger VRR flicker on the QD-OLED he tested than on the WOLED (though I don't know how close his calibration was when he observed them and how much of this he was just modifying for the sake of the video).
Now I don't know if the Samsung G80SD is just particularly poor in this respect, but the pretty much constant low level flickering detected by the camera may not be observed at all by eye. It's usually the sudden and more significant brightness fluctuations ('obvious flicker') which is more troublesome and I tend to notice that more on WOLED models than on the AW3225QF. That certainly doesn't mean all QD-OLEDs vs. all WOLED models and depending on sensitivity to flickering and your room lighting, smaller and more frequent fluctuations could be an issue.
Either way, I will be more careful in future reviews to try not to generalise the VRR flicker levels of QD-OLED vs WOLED or perpetuate those kinds of thoughts. Going back to the AW3225QF, I doubt there will be many people who would find it fine for VRR flickering but find a 240Hz WOLED obnoxious in that respect. Nor the other way around.
Edit: Further thoughts shared here - forum.pcmonitors.info/topic/qd-oled-vs-woled-vrr-flickering/.
I watched the whole video. Very nice! Now I will be waiting for a PG32UCDM 1 hour review by you, many of us are aiming to buy it.
Appreciate the interest, but no plans to review the ASUS I'm afraid. Will be reviewing the MSI version (which is better value than the ASUS in the US) and don't intend to review 3 models using the same panel, especially when performance seems so similar in most respects.
@@PCMonitors Okay, a man can dream. I was thinking about the PG32UCDM since there is a graphite layer and fully passive heatsink - it could be better in HDR due to less aggressive ABL and by having some extra brightness in SDR mode.
Thank you for your hard work.
@@ALFGamingTV You're welcome! According to Monitors Unboxed and TFT Central testing, the ABL behaviour for HDR is very similar. But yes, as usual for ASUS OLEDs, with 'Uniform Brightness' enabled there's more stable brightness for SDR. The MSI is also passively cooled (graphene film), though they don't market it as aggressively as ASUS. 🙂
What type of machine was used to drive this monitor for the review?
Tks for the review, great detail and delivery!
Nothing too spectacular. Windows 11 PC with RTX 3090 GPU, i7 12700K, 32GB RAM.
Finally an honest review that talks about VRR flicker. Subscribed 🤙🏽🤙🏽
Not that it affects the performance in any way, but I forgot to mention this model is officially 'G-SYNC Compatible' certified by Nvidia - pcmonitors.info/wp-content/uploads/2024/03/AW3225QF-G-SYNC-Compatible-certified.png.
Is that only via DP? HDMI 2.1 isn't detecting G-SYNC in Nvidia Control Panel.
Too bad it's not G-SYNC Ultimate
I own this monitor. Insane review
Nice review and hello there battlefield player 😁
Thanks for the info
I'm happy with my 27" 1440p LG OLED now
Yea, it has some rough edges
But the subpixel layout isn't an issue for me, and I read documents a lot
I run VRR off for virtually everything - as the majority of games just don't fully support it properly
And I also watch movies/TV from quite far away so I don't think a curve would work for me
4K didn't appeal; I have no issues running games at 1080p - and still do for competitive games
I also don't use HDR at all
The big thing for OLEDs is running Windows in high contrast mode
And running a 16-235 -> 0-255 post-resize shader in your video player to force deep blacks on videos (the remap is sketched below) - this lets you watch something old like Star Trek Voyager AND have the uniforms etc. rendering in full deep black, instead of having that glowing black
I also make use of vivid mode for certain videos and TV shows like sci-fi that really benefit from the extra pop
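The 16-235 -> 0-255 expansion described in the comment above (done there via a video player shader) is just a linear remap of limited-range 'TV' levels to full range, so that code 16 lands on true black rather than a faint grey. A minimal Python sketch of the maths on 8-bit values (real players apply this per pixel on the GPU):

```python
def expand_limited_to_full(value):
    """Map a limited-range (16-235) 8-bit level to full range (0-255), clipping outside."""
    scaled = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(scaled)))

print(expand_limited_to_full(16))    # 0   -> true black instead of a glowing dark grey
print(expand_limited_to_full(125))   # 127 -> mid grey stays roughly mid grey
print(expand_limited_to_full(235))   # 255 -> full white
```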
Is this the best OLED for text clarity? I need an OLED that I can do productivity on. How is the MSI?
I don't know if it's the BEST
but I have absolutely no issues running it when, for like 50% of the time, I'm just doing text-based stuff
and that's with zero workarounds or effort
apart from using Windows high contrast mode, which is amazing on OLED and a must to do
I also have an AMOLED panel on my laptop and have zero issues with it either @@mjregan88
Very informative video, thank you!!!
Hi, I have this monitor and an i7 9700K, 2070 Super. I can't seem to get 240Hz. I've tried in Windows and Nvidia Control Panel but it only gives me the 60Hz option, and sometimes 120Hz. I know my hardware isn't the best, but whether I choose 1440 or 2160 res it won't give me the option. Any idea why please? Fantastic videos by the way dude
Are you using DP?
Hi, I'm not knowledgeable on modern tech but I'm assuming you mean a display port cable. No I'm using an hdmi 2.1 cable. I saw online that this is ok for 240 hz. Should I use a dp? I must say that the display does look good but I want to maximise the monitors capabilities. Thanks for the reply 👍
Ps should I disable vsync? It seems it limits you to 60 hz yet in the past whenever I disable it I nearly always get tearing so I leave it on, cheers
@moonticket3443 Your GPU doesn't support HDMI 2.1. You need to use DisplayPort.
Ok, I'll give it a go. Thanks so much for your help. Ps I recently tried 'Ori and the will of the wisp' on the new monitor (I got the monitor today), and it's absolutely stunning, even without the refresh rate I hoped for. Thanks again, awesome channel 👍
I am considering this or the ASUS one. Buuuuuut have not decided yet. Main concern is that I sometimes play 30fps games (old console games), and as far as I know 30fps shouldn't look so good on OLED versus LCD, because of OLED's fast response times.
Yes, OLEDs have very little perceived blur from the pixel response element to mask a natural 'juddering' you'd see from such frame rates. Whether this is actually bothersome or not is very subjective, though. Probably have to see for yourself, I wouldn't give up on the QD-OLED experience based on something you may not have an issue with.
@@PCMonitorsThanks, yea i think i would order from a place with a gooood return policy, haha.
Not a fan of the curve but otherwise it's nice. Can't decide between the 27" Alienware or the 32" MSI
I am disappointed you are missing my favorite part of your reviews for this one. I am wondering how the interpolation/scaling is on this monitor for non 4k resolutions. How does it handle 1080p content? Does it do nearest neighbor upscaling or is it bi-linear like monitors usually use?
You're missing it, the review isn't! It's even timestamped.
I have AW3423DW and was wondering if it's worth it to upgrade to this monitor. Does the monitor appear much smaller than AW3423DW? Is the picture quality better?
You should watch the review, at least the conclusion.
The curve seems way more aggressive on video than it is in reality
Yes, I agree and that's a good point. I actually recorded a piece saying the curve looks exaggerated on video - you can even see an odd 'pincushion' effect in the centre - but I edited that out (didn't mean to cut quite so much from that section).
Regarding the VRR flicker, have you tested whether there is any perceivable difference if connecting via DP 1.4 or HDMI 2.1? I have read that HDMI 2.1 can help but wanted to hear your experience. Thanks
I noticed it in much the same way with HDMI 2.1, though testing with that was quite brief. I will test this again more extensively when I get a chance.
I have now tested this extensively and can confirm VRR flickering was no different using HDMI 2.1 vs. DP 1.4. The monitor is only 'G-SYNC Compatible certified' via DP - which makes no difference to how it performs, but means the technology is only active by default via DP. It's possible somebody was using HDMI 2.1 without actually activating VRR and thought it was active, if they observed a significant difference in (or indeed no) VRR flickering.
@@PCMonitorsthanks for following up. Have you noticed if limiting the refresh rate to 120 helps with flicker? Might be an option if playing a single player game where you can't max out the refresh rate anyway.
@@lsik231l VRR flickering is reduced if fluctuations are reduced. So if limiting your refresh rate does that then great. But if your frame rate doesn't pass 120fps either way, it won't make a difference.
@PCMonitors great! Really appreciate you investing your time on this! I recently got the monitor and so far really do love it. I ended up using DP as I found using hdmi caused random artifacting that I read was a problem but only if refresh rate was set at 240hz. 120hz eliminated artifacting at least with my monitor.
How does this compare to the Samsung S90C tv?
I want to know as well
Recently bought an AW3423DWF, loving the HDR Peak 1000 mode, but in some games - like Battlefield 5, Cyberpunk in daylight, or other bright games - ABL makes it almost unusable. Enabling source tone mapping helps in some games (like Ghost of Tsushima) but the dimming is brutal. Is there a solution? Maybe a firmware update?
Do you always play in peak 1000 mode?
How do you find those scenes with the 'DisplayHDR True Black 400' style setting(s)? The ABL behaviour is there due to power limitations with the technology, it's not going to change with a firmware update. The 'DisplayHDR True Black 400' mode is more accurate with its PQ tracking and will show what the panel can technically do in terms of brightness in such scenes. If you still find things too dim that's truly the ABL behaviour you're observing, if not then it's because of the over-dimming of mid-tones with the 'HDR Peak 1000' setting.
45:50 this might sound like a bunch of copium, but actually we might benefit from abl more than we lose. sure, in a fully lit and sunny room, fullscreen 1000nits would be rad. but at night, this would result in a flashbang whenever you look at the sky or play in a snow covered landscape like this battlefield example. on my current monitor, benq ex2780q, i have BI+ enabled for this exact reason. 400 nits hurts my eyes when i use my pc at night for gaming and even more so for productivity
I don't personally find bright-dominant scenes overwhelming on Mini LED solutions which will raise brightness much more than on this model. Even there the whole scene is not at a high brightness, it's just certain sections of it. Because of the contrast you could also argue that smaller bright highlights (which QD-OLED presents with very high brightness) surrounded by darkness are more impactful or potentially uncomfortable to look at. Everyone has their own sensitivity to brightness though, I think multiple options with different brightness levels is the best approach.
I don't like the representation of HDR on models like the EX2780Q without local dimming (or with very ineffective local dimming) because the whole scene is flooded and everything is raised up more than it should be, not just the brightest elements. I don't think the experience is really comparable to a 'proper' HDR experience even if just considering viewing comfort.
@@PCMonitorsinteresting, i have no reference to actual hdr monitors, apart from my evo oled tv, so i can't comment on that. however, i ordered the aw3225qf as soon as i finished your review :) can't wait to finally have a look at content in 4k and hdr when on the computer *_*
I see the monitor is G-sync certified, how well would it perform with a AMD card and freesync? Considering one, but on the fence
It should give a very similar experience, both technologies use Adaptive-Sync and the certification is just marketing.
Got the 2k version, best monitor I’ve owned. My 27” LG OLED is for sale lol
What made you get that over 32 or the 27 MSI?
Hi, how long was delivery? I’m looking to get this monitor but the est delivery time is mid April, I live in the uk.
If that's their estimate, that's their estimate. It entirely depends on when you order and how their demand and supply are shaping up. Mine came slightly before the estimate, but I ordered it as soon as it was available.
I could not live with the monitor curve on a 16:9 screen... What is the purpose of the curve on a 32inch 16:9 screen ?
Because it's "cool" and the kids love it! But as I say in the review, it's pretty subtle. Draws you in a little bit, can aid immersion slightly. But personally agree the curves are best kept to ultrawides. On 16:9 they're either subtle and have limited effect (like this) but could still be annoying to some on the desktop... Or they can be too steep and annoying a lot of the time.
@@PCMonitors thanks for the answer... this is my first pass on the review... I'll see it a second time and there may be other questions :D
agree, (subtle) curves have their place, but not on a 32" (imo). I will say I am on a 42 C2 and I *definitely* miss the slight curve my old 34" ultrawide had...
I would say I'm 99% sure the curve on the 32-inch format was done by Dell/Alienware for marketing purposes only - to stand out from the crowd and have one more reason to sell it as a gamer monitor. It's a good thing you have a lot to choose from, like the chosen one, the PG32UCDM, but if you can get this for $800-900 compared to the $1300 ASUS, with a 3-year advanced warranty, I'd say the curve isn't that important 😂
Just took back the MSI URX - the quality was not great. Is this one better quality? I'm looking to buy new and I run a 4070 Ti.
So the 4K resolution on all these new 32-inch OLED monitors is not native 4K?? 4:40
Back up a bit and you'll see I was showing and talking about 2560 x 1440 (QHD) which is not the native resolution of the monitor. So what I'm saying is when the AW3225QF is running 2560 x 1440, it's softer than a monitor that has a native resolution of 2560 x 1440 and is the same size.
@@PCMonitorsSo if I was to buy this, could I set the in-game resolution to 1440p?
@@bledarrm7976Sure, it will run anything up to the monitor's highest resolution, so whatever the game supports up to that. There are differences in performance at different resolutions, though.
@@PCMonitorsI mean, a 1440p video does look better on 4K than on 1440p. That's for sure
1440p pixel density is a joke compared to 4K
On picture quality alone would you say this is better than the LG C3 42"?
That's a very broad question. I'd say it's better under HDR due to superior colour volume, it's a bit more vibrant under SDR due to gamut and the tighter pixel density is very nice.
@@PCMonitors thanks for the fast reply. I will look at picking this one up. Looks amazing.
I usually sit and game at 1.2m from my 65 inch Sony A95L qd-oled tv, which can reach 1500 nits on 1% window and 1300 nits on 10% window. I wonder how drastic the brightness reduction will look to my eyes if I switch to these 32" qd-oled monitors instead. I've always liked to play games on monitors more due to the closer distance (which allows me to see more details I feel) and much higher pixel density (which I hope can compensate for the blurriness of taa/dlss), but peak brightness figures seem very disappointing compared to bigger tvs.
The only time you notice it is when you're doing browsing or office stuff
and it's actually nice, because you don't burn your eyes out when some app/website that doesn't have dark mode lights up
DLAA and aniso set low or off looks best imo
I was actually asking the same question. I use LG g3
Tough to say for sure, but I suspect the closer viewing distance will help provide a suitable perceived brightness and 'pop' which won't disappoint compared to your TV.
@@PCMonitors thanks for the reply. Are the color gamut/volume on these monitors also worse compared to tvs? I wonder why. They have gen 3 qd oled panels while tvs only have the gen 2.
@@vmd1293The tuning seems to be slightly different, perhaps. DCI-P3 coverage is similar, ~99% for both. Adobe RGB and Rec. 2020 coverage is also slightly lower on this one compared to the PG49WCDM I looked at. Visually there's very little difference you'd notice, though. Different software and devices can measure a slightly different gamut, but all of the QD-OLEDs (TVs or monitors) are ~80 - 85% Rec. 2020, so within a fairly similar range really.
In your opinion what is the best monitor between 1000€-1500€ ? Both for console gaming and Pc
Very subjective question. But probably either this or an MSI alternative, depending on pricing in your region.
Is the Alienware better than LG???
Apples to oranges really, depends on your own preferences for screen surface and QD-OLED vs. WOLED characteristics. I'd recommend waiting for our review of the ASUS PG32UCDP, which is their version of that one using the same panel.
Is there something similar with a more normal aspect ratio/ full 4k resolution? Preferably not curved.
Similar or better anti-reflection coating.
Edit: Disregard that. Amazon UK is directing me to the wrong monitor model, which has a weird widescreen resolution. Looks like they don't have this one. Why is it so hard to find a decent OLED monitor in the UK!?
Hey, I am running my games on a 7900 XTX. Will I be able to utilize FreeSync while playing in HDR1000/TB400? Because I can't wrap my head around Dell offering a 4K HDR flagship without official FreeSync Premium Pro support.
Yes. 'FreeSync Premium Pro' is just a confusing AMD-specific certification which sometimes involves its own 'HDR pipeline' - this just makes a few tweaks to the HDR experience (not usually good ones). You do NOT need FreeSync Premium Pro or even any specific FreeSync certification to use HDR + VRR with an AMD GPU, so your experience will be similar to an Nvidia user here.
Any curve at this size is a no go for me
I'm considering this monitor, but I would like to clarify one thing: how does the ~250 nits figure relate to the brightness you'll get in SDR games and video? If 250 nits is the value for a white area, then will a light blue area (e.g. the sky in a game) put out less than 250 nits? Or will the display notice the content has lower luminosity and turn up the brightness so that the sky is at 250 nits? Put another way: if I normally game on a monitor adjusted to 500 nits white-rectangle-brightness (which is too bright for desktop, but good for daytime gaming), then if I switch to the AW3225QF, will my games look 50% dimmer?
Yes, it's the brightness of white and any deeper shade will have a lower brightness. That's the maximum brightness that any shade will appear on the monitor (it's the same way on any monitor, regardless of panel technology). Games won't necessarily look 50% dimmer, though, because perceived brightness involves a lot more than just measured white luminance. For example the screen surface and exceptional consistency plus exceptional contrast of this model works in its favour to enhance perceived brightness, in the right lighting.
@@PCMonitors Thanks for the clarification. I thought maybe ABL would make the difference between a white rectangle and the sky a bit smaller, but I guess those new OLEDs don't use ABL anymore. When I got the AOC Q27G3XMN (mostly due to your review), I realized I like ~70% brightness for daytime gaming, which is ~400 nits white-rectangle brightness. I tried setting 45% brightness to simulate a QD-OLED (I used a light meter app to estimate that 45% brightness puts out about 250 nits on white) and while I can see everything just fine, I find the image much duller and less lifelike. I am flabbergasted by this, because I've always pooh-poohed high brightness on displays for ergonomic reasons. Turns out what's too bright for desktop use can be too dim for gaming/video use!
do you prefer curve or flat for home office?
Per the review the reasonably subtle curve was not a particular issue for me either way, but it's subjective.
Is the VRR flickering very noticeable during normal gameplay?
As noted in the video, it's very subjective and depends on the scenario and how much your frame rate is fluctuating. I don't find it bothersome - I do on some VA models, for example, where it's stronger and I'd sometimes even want to disable VRR simply because of the flickering.
@@PCMonitors I see. Thank you.
Can we get link of that wallpaper?
If you ask nicely.
@@PCMonitors please :)
Here you go, enjoy! backiee.com/wallpaper/nature/54336
I wish i had the money to buy this. Damn.
Do you think the color on this monitor is more vibrant than the ASUS XG27AQDMG?? I'm curious to know
It has a wider gamut and superior HDR colour volume, so yes.
Should I get this as a console gamer?
Could still give a really nice experience. Basically as shown but limited to 120Hz maximum (and generally lower frame rates - depends on game). So a lot of excellent image quality characteristics to enjoy, still.
Love my AW3225QF.
Ahead of you on this one. I adore this monitor.
what about the vrr flicker? i get sometimes random flickers with gsync in full screen only and a good dp cable.
This is covered in the review.
Awesome review man
How does Rec. 2020 look on this screen?
It covers ~80% Rec. 2020 which is high for an OLED but not as high as some LCDs (with QD LED backlights). So the representation is reasonable but it can't display the most saturated shades.
@@PCMonitors Works enough for me, thanks.
Should I buy a Dell for $1158 out the door or the MSI for $1012 out the door!!! PLEASE COMMENT PEEP! I want to know what people think
Alienware for the better warranty
I've seen it does not support FreeSync Premium Pro. If I have AMD, should I look for a monitor with that feature, like the MSI?
Honestly - it makes no difference to the experience, it's just a certification used for marketing purposes. Actually sometimes being 'FreeSync Premium Pro' certified can be a negative as it sometimes causes the monitor to use a separate and usually worse-calibrated HDR pipeline.
Mine just came in. I dunno how to fix it from looking so dim. I thought OLEDs were known for their bright, vibrant colors
You might be used to bright monitors and may need to adjust to a more limited brightness level. Or try to. It's also worth working on room lighting if you can, perhaps that's too bright.
please review DELL U2427D
Won't be doing that I'm afraid as Dell seems to have given up on the UK PR side. And I don't intend to purchase it myself as I did with the AW3225QF.
@@PCMonitors ooo!
Can you speak to the Desktop option and how that works as it relates to PC settings and the Dell Display Manager app's color tab?
"Desktop" is just another preset of the monitor, briefly covered in the "Best Settings" video - th-cam.com/video/N8k03ZM_BvY/w-d-xo.html. Dell Display Manager isn't specific to this model and not covered in these videos. Though is briefly touched upon in the OSD video - th-cam.com/video/ZYfIYV4LqQI/w-d-xo.htmlsi=hxjHUHA95SX39SG_&t=1406.
@@PCMonitors Hmmmmm. I guess I'm looking for a deep-dive summary table of nits and color accuracy, like you did for Creator and Standard, as it relates to PC settings. For example, comparing your findings for Creator mode and HDR against leaving it on Standard and toggling display settings in Windows (brightness, HDR on/off, changing system ICC profiles).
On the Display Manager side, I would like to see if application mode and dynamic sync work with this model.
@chevonpetgrave4991 Right, you're looking for something reviewers don't realistically have time or inclination to cover. It's not practical or particularly useful to take measurements using every preset - they typically make a few (sometimes undesirable) changes to the image or are so similar to the standard or default setting that they aren't worth exploring. The presets I focus on are selected for good reason. The impact of the full native colour gamut and how it impacts the image, which is extremely important, is explored in the review. As are the characteristics of the only SDR presets which don't use that gamut (Creator Mode colour space settings). Display Manager is software which is available for a range of monitors and isn't a core aspect of the monitor itself or its performance.
Totally get that. I guess I was asking conversationally, not for a review video.
I'm testing the viability of these for a color-managed editing workflow that depends on high contrast ratios, high peak brightness and specific black levels - which this monitor checks off at a reasonable price.
Just wanted to pick your brain on what I would be getting myself into workflow-wise. From my research on another Dell monitor (UltraSharp series), I envision using Dell Display Manager to auto-sync ICC profiles with the system. Making sure the monitor's profile is synced with what the system is outputting is critical for working in DaVinci Resolve. I do it manually on my lame monitor now (first from my system settings, then the same on the display OSD). But BenQ and Dell have series that will match the counterpart automatically when you switch on either the OSD end or the system settings end. Additionally, based on the application you're in, you can even jump into a new color space and sync everything in one move just by opening that application.
I was going to go the UltraSharp route for price and features, but saw that this monitor does OLED and HDR. At first I thought, oh, that's a gaming monitor, it probably doesn't have the capability to work with Display Manager. But when I saw in the spec sheet that it is compatible, I started to wonder what is possible. That's where I'm coming from, for some context 😂
Yeah, that makes sense. Well it definitely works with Dell Display Manager and as far as I'm aware associated features should work. The ICC profile switching section of DDM was shown in the OSD video, it's definitely accessible.
How is its motion clarity, speed-wise, vs the ASUS 540Hz TN 1080p monitor?
Assuming you're actually running at 540fps, the ASUS should have an edge overall because of much lower perceived blur due to eye movement. Might be some transitions where the OLED would have an edge simply due to the visually instantaneous pixel responses vs. slightly below optimal for some transitions on the ASUS, though.
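To put a rough number on the "perceived blur due to eye movement" point: on a sample-and-hold display with visually instant pixel responses, eye-tracking blur is roughly the object's speed multiplied by how long each frame is held on screen. A quick illustrative sketch in Python (my own simplification, assuming full persistence and frame rates matching the refresh rate - not figures from the review):

```python
# Approximate eye-tracking (sample-and-hold) blur: speed x frame persistence.
def eye_tracking_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Rough perceived blur width in pixels for a full-persistence sample-and-hold display."""
    frame_persistence_s = 1.0 / refresh_hz
    return speed_px_per_s * frame_persistence_s

speed = 1920  # an object crossing the width of a 1080p screen each second
for hz in (240, 540):
    print(f"{hz}Hz: ~{eye_tracking_blur_px(speed, hz):.1f}px of blur")
# 240Hz: ~8.0px vs 540Hz: ~3.6px - hence the 540Hz panel's overall edge,
# even though the OLED wins on the pixel response side of the equation.
```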
@@PCMonitors any plans to review the dual mode 4k @ 240hz and 1080p @ 480hz oled lg monitor that releases next month ?
@@ployth9000 Probably not as it's an LG and they don't have a functional UK PR department. Can't really justify purchasing that one myself at the moment.
@@PCMonitors Ok, well what about the Samsung Odyssey Neo G9 57"? I have been waiting for either you, TFT Central or Monitors Unboxed to review it. I have been using it and enjoying it, but it would be nice to see more people reviewing it.
@@ployth9000 Same applies to Samsung I'm afraid, not easy to work with. And that's beyond the size limit of screen I'd generally consider reviewing.
Can the vrr flicker be fixed in firmware updates?
I wouldn't hold your breath for that. It's inherent to OLED panels and it seems no manufacturer (panel or monitor) has worked out how to fix it.
Hi. Will the FreeSync option on my computer with Ati Radeon 7900XT work on this monitor? Some reviews/specs say no and some say yes, so I am confused. Thank you.
@@TheNikolinho Yes, per the review VRR is supported including Adaptive-Sync. Meaning you can use FreeSync. The monitor isn't 'FreeSync certified' but that doesn't matter.
@@PCMonitors Thank you for that answer! The last thing - customers are complaining that the DisplayPort is obsolete - 1.4 instead of 2.1. My GPU does support 2.1. I'm ignorant, so I don't know if having DP 1.4 does something to the quality of the screen or picture, watching movies, or playing games? Thank you!
Makes no difference whatsoever to the quality of the image or really to the functionality. With the exception that DSC is required for the signal, which locks off access to some Nvidia features such as DSR and DLDSR. And alt-tabbing in and out of games can take longer.
@@PCMonitors Wow, you're so knowledgeable! Thank you.
Apparently this and the 27" model have a 10% off deal if bought with an accessory through dell. Does anyone know if this deal is available outside the US?
Don't believe it is, seems to be a current US only promotion.
I'm a bit lost - does that mean that the monitor won't be able to do 3840 x 2160 @240Hz with a DisplayPort cable?
Does what mean that? 3840 x 2160 @240Hz via DP is specifically shown at the beginning of the 'Refresh Rates & Scaling' section: 2:40
@@PCMonitors I'm sorry for not being clear enough. The sentence at 3:03, "if you use HDMI you get up to 240Hz", sounds to me as if you can't get 240Hz on DP. Sorry for that one, sir. Just want to make sure before I buy something :D
It's a long video, it's easy to miss or misinterpret bits so no worries.
Why is this monitor so much more costly compared to the Msi 321upx 32" oled?
It isn't in the UK and Europe. Not sure why they priced it as they did in the US, but it is what it is.
I wish all monitors had the option to be bought without the stand. Such a waste of money paying for something you will never use, and of course it either sits in a wardrobe or goes in the bin...
Wallpaper link?
Which wallpaper?
Thanks
Quite an effective anti-reflective coating but this purple glow is making me sad.
This Monitor or Asus ROG Swift OLED 32 UCDM?
Would depend on pricing in your region and your preference for the curve and aesthetics, really. They're both excellent performers.
@@PCMonitors in Germany 1499 Euro for Asus Rog
Does this monitor have BFI?
No, as covered in the review it doesn't.
@@PCMonitorsdamn what a shame
Hello guys, it seems that this monitor has some compatibility issues with a few Nvidia GPUs when connected via HDMI 2.1 - do you know about this?
It was fine with our RTX 3090, not sure what the issue would be (if there is one). Maybe something Dell needs to work out with Nvidia if so. But PC users can use DP 1.4 without issue anyway, so I don't see the problem.
Hi, if someone has a budget graphics card like an RTX 3050, say approx 60fps gaming with VRR disabled, would there be a lot of flicker?
It depends how stable the frame rate is, not specifically how low it is. The main problem you'd notice there would be if it frequently dips below and rises above the LFC boundary (so it's dipping below perhaps 55fps and rising above that).
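For anyone unfamiliar with the "LFC boundary" mentioned here: Low Framerate Compensation repeats frames when the frame rate falls below the monitor's minimum VRR rate, so the refresh rate jumps rather than tracking the frame rate smoothly. A simplified Python sketch of the idea (the ~55fps figure and 240Hz ceiling are purely illustrative, taken from this thread rather than any official spec):

```python
# Simplified sketch of Low Framerate Compensation (LFC). Below the VRR minimum,
# each frame is shown two (or more) times so the panel's refresh rate stays
# within its supported range.
def lfc_refresh_hz(frame_rate: float, vrr_min: float = 55.0, vrr_max: float = 240.0) -> float:
    """Return the refresh rate the panel would roughly run at for a given frame rate."""
    if frame_rate >= vrr_min:
        return min(frame_rate, vrr_max)   # VRR tracks the frame rate directly
    multiplier = 2
    while frame_rate * multiplier < vrr_min:
        multiplier += 1                   # repeat frames until back inside the range
    return frame_rate * multiplier

for fps in (58, 52, 30):
    print(f"{fps}fps -> panel refreshes at ~{lfc_refresh_hz(fps):.0f}Hz")
# 58fps -> 58Hz, 52fps -> 104Hz, 30fps -> 60Hz: frequent hops back and forth
# across the boundary mean large refresh-rate jumps, which is when flicker
# tends to be most noticeable.
```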
@@PCMonitors Thanks for replying, exciting times for pc monitors ahead
How does it look in movies?
Very much the same as in games, which are explored in great subjective detail here. That's under SDR at least - refer to the HDR section for discussion on 'Dolby Vision' vs 'HDR 10' as it applies to movies.
@@PCMonitors What about streaming movies that are very compressed, F2 Movies for instance? I can only find the Alienware 4K OLED monitor. No Samsung and LG.
The monitor doesn't "filter" the content in any special way. Unlike TVs monitors don't include 'denoise' filters etc... So it entirely depends on the content. You still benefit from the exceptional contrast and colour performance, even if you see some compression artifacts in places. The gamma tracking on the monitor is good and it masks these artifacts 'appropriately' with the '2.2' standard in mind. It also offers alternative gamma settings if you wish to mask them more.
@@PCMonitors Ok, noted. What is the difference between this Alienware 4K OLED as opposed to a 4K OLED TV for movies in general?
The TV can add a lot of processing and filtering (per my previous reply), the monitor shows things more as intended. But both monitors and TVs have various settings that can be configured to make things more or less "as intended".
Great video!
YES, you show Nvidia supported resolutions and color modes, the other channels DON'T, and it's unacceptable!
Mine will not connect via USB, tried everything
What are you trying to do via USB, specifically?