Even non-alienware monitors are great. I used a 32-inch 4k Dell monitor (not Alienware) and it was impressive. Affordable, almost no BLB, great calibration. Kind of makes Samsung look like shit in their monitor space.
@@gozutheDJ No "was" about it, they still make pretty fantastic monitors for the money. My past couple monitor purchases have ended up being Dells - the last purchase I even tried to find something else just to try a different brand, but the Dell was the best monitor for my requirements and price range lol. The next upgrade we'll be moving up in the world with nice ultrawide though, so it'll probably end up being an Alienware, even though I'll try to talk myself into an LG or a Samsung...
Could some of your cameramen make a video about the colour space naming themes and how they are measured? Nice always to get the stats, but I don't personally understand the gamut space. Thanks!
They should. But for a brief explanation a larger gamut is like having a larger pool of colors. You can get more vivid "fluorescent" colors at the extremes. That's what the triangle is showing. For example a printer would have a triangle with a lot in the grey zone cause it's hard to print fluorescent colors with only a few ink colors, but it's easy for a bright monitor to display those. And then a higher bit depth means you have more granularity around all the colors to get smooth gradients. Basically exponentially more shades of colors.
At 5:12, it looks like the video was put on the display in post and that this is edited. That's how amazing that screen looks even through a macbook air display.
For people that prefer to use speakers that may only use HDMI (like Sonos), the inclusion of an eARC port is clutch. I hope GPU's and monitors start to include those going forward. It was such a pain to get my Sonos setup working with my PC.
That would be nice, I know so many people use headphones or at most stereo speakers but if you want to hook up a surround system to computer your only option that isn't an ancient highly compressed 5.1 channel format is HDMI.
@@TheGarner0 I did, but I had to buy an HDMI splitter that would output eARC. It was like ~$80-$100 on Amazon, I think. I tried some cheaper options, but they didn’t work. The other downside was that my computer thinks I have a 2nd monitor since I have an HDMI cable plugged into my gpu, and sometimes windows open there by default and it takes a few seconds for me to remember that whole issue.
best part of all the new oled displays coming, the old ones getting cheaper and used oleds way cheaper, so snatching them for close or even half the original price is possible this year!
you want to buy a used oled because it's cheaper. i want to buy a used oled monitor because i want to see what other people's chrome bookmarks are. we are not the same
Having eArc, 4K, and Dolby Vision is huge. This will be the ultimate small space (studio, dorm, room etc) monitor that can do it all. eArc means it’s easy to just plug in a soundbar and have “hifi surround sound” that just works, a bonus if the soundbar has extra HDMI inputs. Also, w/ how accurate and affordable QD-OLED is, I foresee most of the Pro Display market dead. Just placed my order.
more monitors need earc. I know the Odyssey ARK had it, but it didn't work right and dolby atmos never worked while this one claims it does. Yet to find anyone who tested it, so if you do can you give a yes/no if atmos works?
the screen curve is perfect for me. I use an older 32 inch monitor as my secondary monitor right now and it does feel like the sides can be a bit too far away sometimes.
hey, I wanted to ask... have you ever had a 1440p ultrawide? I want to buy this Alienware 32" and I have the 34" ultrawide one, and don't want to regret the purchase when I decide to swap it for the 32" 4K model. How is it? thanks
@@dolan_plzDon't think I can offer much. I use a VA ultra-wide currently and haven't yet decided to buy the 4k qd-oled monitor. I probably will, but just not in any rush to do so.
One issue with HDR 1000 for desktop use is that it may suffer from ABL with larger bright windows being displayed. It's an issue with the AW3423DW(F), at least. If you bring a white/bright window to the foreground you can see ABL kick in, even if you're not displaying HDR content. HDR 400 True Black fixes the issue. HDR 1000 is good for games or full-screen video, but personally I just leave it in HDR 400 all the time to avoid blank screen intervals during mode switches. Also creator mode is an SDR profile, so you can't be in creator + HDR at the same time.
I don't know if that's the way to go but I personally just turn off HDR most of the time and only turn it on for HDR videos and games, that's honestly pretty quick to do with the on/off shortcut in windows 11
I’m generally not inclined to believe you guys when you say things like “it’s not as expensive as you might think” but in this case, I’m actually pretty surprised by the price. I expected AT LEAST $2,000 USD
As of 11/19 this is currently on sale with Dell for $800. I also had 10% off in my rewards account and $50 in dell offers on my AMEX. A steal at that price.
@@PanPrezesoThis is the correct take. 1440 gaming is the performance/value option for people who know the value of a dollar earned. 3440 x 1440 is absolutely the sweet spot for gaming today. People can enjoy their 40fps @ 4k on this thing. Lovely monitor, but feels like it's two years early in terms of where GPU perf/dollar is.
3:58 Probably you're not aware, but the original dw did receive a firmware update in Dec 2023. The GSync module is not an issue anymore for firmware updates.
TV brightness will hold for a much larger portion of the screen, 1000 nit at 1% screen on this monitor is actually pretty terrible. For pure HDR content TVs are still ahead by quite a bit, due to being able to maintain higher brightness over a larger portion of the screen. These monitors (yes even the 3rd Gen ones here) will dim quite noticeably when anything approaches full screen in a bright scene like snow.
@@EndoV2I kinda already have wires ran and cable management done. I used the display port cable for some reason connecting to my 4090. Do you think it’s worth it to swap out the display port with a HDMI 2.1 cable? What are the differences if both have to use dsc?
Jeez, when that arm reach in the video to tap the birdhouse I legitimately thought it was the camera man in the studio, and my mind was blown that it could interact with the video that way
Are these panels always glossy or can there be a matte version? What about the purple glow the qdoleds get in a dark scene when there is some ambient light in the room? Is this still there?
YOOOO I had a panic attack when the screen went black at 7:14 haha. I'm looking at getting the ASUS ROG Swift OLED PG32UCDM. As far as I can tell the only difference between the two is that Alienware is curved and ASUS is flat?
@@raspberrycrazyant no, that is not quite true. The panel yes, but the software, outputs, inputs and other neat extras like G-Sync and Dolby Vision, also including calibration and quality control before sending out, so not all Samsung panels are going to be equally as good. Let's wait for the real reviews.
Dell/Ailenware are just resellers of other's great work like everybody else. They don't make these panels. It's just the matter of how much money they're willing to put into their rebranded objects.
Alienware OLED with G-Sync Ultimate (AW3423DW) can now get user-installable firmware updates, just FYI since you implied that was still a limiting factor to FW update eligibility.
Thank you!!! I own that monitor and had to stop using it for work because the pulsing fan noise was driving me insane. I gave up on checking for firmware updates when I heard the GSync module meant they couldn't update it. This'll make a huge difference for me if it works. Hope this update works on older versions. I was a pretty early adopter after the LTT review.
I just updated mine due to this comment. I had the impression that the gsync module meant it was impossible to update it. It's great because hardware unboxed revisited the monitor a few time with the updated firmwares and found that it corrected some of the accuracy issues with the gamma curve (look, It has been a minute since I've seen the videos so pardon if it's wrong terminology) and also corrected the some of the standby issues I've been dealing with for a while
No, he was listing off the specs but I get what you're saying. You CAN, technically, run at 240hz with screen compression but it can look like poop. Although not REALLY 240hz over hdmi 2.1, it's still possible. I do wish they would have specified that though. It's a devils-details kind of thing.
Yeah, all right, this ticks all the boxes, that's the fastest I've pressed the 'order' button in a while. I've currently got a color-calibrated 27" 1440p monitor from mainline Dell, but it only does 60Hz and SDR, and I've kind of been waiting til there was a monitor available that was strictly better along every axis while remaining highly color accurate. So, nice of Samsung Display and Dell Alienware to launch this right when I needed it!
Sooooo… should I worry about OLED burn in or not? Honestly I will be using this more for productivity than gaming. Or should I just wait for the Dell U4025QW and avoid the OLED burn altogether?
Oled burn in has been anxiety inducing a decade ago but modern oleds are much more durable, especially this 3rd gen qdoled. If you’re still worried there’s a 3 year warranty if anything
Honestly, with my eyeballs, even if I had a 7090 Super mega Ti in 3 years, I'm still sticking with 1440. 4k is just completely irrelevant to me, personally. At least in relation to monitors at normal viewing distance for me.
4k still looks better even if you have to use upscaling, speaking as someone who uses a Pixio 1440p 27inch monitor and a Samsung Neo G7 4k 32inch monitor. I literally straight up have the experience and hardware necessary to prove you wrong. I don't even use dlss and it still looks better than my native 1440p monitor in terms of detail and clarity even with FSR on balanced mode. Complete bullshit that you "need" a 4090 to have a good time with 4k. Hate this myth. I use an RX 6800 XT.
@@BlackSmokeDMax Same. I legit cannot see any practical difference between 1440p and 2560p when I'm gaming on a desktop monitor. I'd rather go 1440 and grab all those extra 1% frames per second instead. Having my lowest 1% FPS as high as possible is priority after hitting 1440 IMO.
The one thing these screens miss is a built in eye saver/night light mode. The Windows one doesn't work with some games in fullscreen vs windowed mode.
@@El_Deen it’s been a game changer, no lie. I literally can’t go back to not having an oled 4k monitor. We’re going to upgrade our tv in the living room to be oled next but this monitor has been incredible.
@@TheDeviousLightI hate you. Mine arrived yesterday, only for me to find out today that it's a dead paperweight. It won't power on or nothing. So much for a "great monitor." It's a POS!
One thing I haven't seen yet on this monitor, is reviews describing working with text for 10 hours straight. Is this monitor limited to just gaming? Or can it be also used to work 10h / week day and 10h gamin on weekends? Or am I supposed to get two monitors?
Did you get it? This is the video I've been searching for! I cannot believe nobody has put up a productivity video with this monitor...I'm not a gamer but I got it for Editing and productivity so it needs to work with my M2 Mac studio and I can't find a video to help us out! I may return it for Dell 40" ULTRASHARP 5k but not sure...Alienware aw3225qf is still in the box...there also better not be any scratches on it if i open it up...if so Dell it is!
The most tempting one for me is the 27” 165 HTz version. QD-older for $360??? That’s actually affordable to someone who isn’t Plouff!!! Plus he already has a monitor
I'm a great fan of Alienware Monitors, but a little disappointed that they didn't upgraded the DP1.4 to DP2.1. The Monitor is very appealing however, especially because of eArc and DP3 99. Would like to see an in depth color accuracy, text clarity and overall settings test.
@@DrakonR no its not, theres no TV tuner, no remote, no smart apps (android tv etc...) and its a low latency 160Hz panel. It's just a big monitor. If you do programming, video editing, digital audio production, you'd appreciate a large 4K monitor. Its a bonus its great for gaming too ( I sit 1M away from it)...very immersive.
@@DrakonR no a tv generally doesn't run at 144hz most of the time unless you put it in game mode. It also includes tuner, smart controls, etc...and typically won't have display port either. I know what you're tying to do by conflating the 2, but they're different. Just stop mate
Can clarify how does the earc work on a monitor? Does that mean a soundbar can be connected to that other non-earc hdmi port to receive audio? For TV usually there’s a dedicated port that outputs earc audio signal.
They should make this monitor in 5K 16:10 resolution, and then add dual-mode so you can switch 1600p (1440p but 16:10) with higher framerate. It would basically combine the best of both monitors.
My days that sounds good, I'm just waiting for a good 16:10 gaming OLED monitor to come out, not spotted one yet, 16:9 is not my cup of tea for gaming on, not enough vertical space for FPS/3rd-Person games, I also like to use emulators for older 4:3 games and those look much better on a 16:10 display.
@@Wobble2007 Yeah I agree. 16:9 makes alot of sense for a TV, but for a monitor 16:10 is just better. And the 5K allows you to have dual mode to 1/4th resolution (nn interpolation) with higher frame rate. Would be perfect imho.
@@jp_8988 5K is a really good res for 16:10, of course I think 10,240x6400 would be the perfect 16:10 resolution, 5K is just over 10 megapixels, but 10K is over 64 megapixels and would give a really nice almost print quality 450 PPI, that is my ultimate 16:10 monitor resolution anyway lol, 16K is 134 megapixels by the way, 128 times 720p, man that would give unbelievable fidelity in a video game, you don't even need to drive it with a native resolution at that PPI level, you only need to give it a 2.5K to get a nice image, obviously the native resolution would be best, but that makes it future-proof.
11 หลายเดือนก่อน
Yep will be perfect if they release it when the 5090 comes out.
DisplayPort 2.0 technically *_does_* have a higher maximum bandwidth than HDMI 2.1, but unless your GPU can drive 4k HDR at more than 144 fps, you'll never need that extra bandwidth. They both support 4K 144+Hz variable refresh rates with full 10-bit 4:4:4 HDR color gamut, @@nickp3173
@@Mark-oy8pf I’m thinking that would be the play. GTA 6 will be the goal, and considering it’s supposed to run on modern consoles running equivalents of a 6700XT, I expect a 5080 or 5090 should….should be sufficient for the PC version to run. That being said, the PC edition will come out until 2026 or 2027, so it is completely possible that they may be targeting that version of the game for the 60 series….
Hey, would've been happy for you to inform us on hardware requirements to get 4k240 running. Even with a 4090 I am worried about the support of 4k240. Plouffe said "using HDMI 2.1a" but the spec doesn't support 4k240 (as stated on NVIDIAs page).
@@Nh_audios They are dp 1.4a, which supports this refresh rate and resolution, however the screen only supports dp 1.4 (non-a) as stated by the specs on the website.
Dont worry, you'll hear him talk about that monitor at least once a month through the different videos or posts he will be included in, until he changes monitor
just got this monitor yesterday and its absolutely amazing! what would you recommend for picking the most accuracte color option? (standard / FPS / MOBA / etc etc)
Video idea, guys: Comparing different panels to each other. Show WHY HDR is so much better in a way that SDR plebs will understand. There's so many videos on how cool a singular monitor is on it's own, but hardly any comparisons online showing the actual differences as apples-to-apples as it could be.
They can't show you when you are watching on a SDR screen, it can only be described in nits and experience. You have to go seek out side by side comparison yourself.
@@curtisbme I feel like there's ways to do that.Various black levels are easy enough, show them sbs in a black room, the SDR panels will stand out. LTT can also put the raw data comparisons out along with their attempt at visually showing SDR users what HDR looks like. The team is beyond creative, I have faith in them.
@@varunaX Not to us on SDR monitors, no. I'm pretty sure there's clever ways to film an HDR and SDR monitor side by side to show the differences with the right cameras and editing to SDR video.
1199 damn that was a lot less than I expected I was expected 2k range well that is still in USD so EURO would be 1300 maybe still I did tell myself for 1300 bucks I will consider upgrading my X34P finally
So my big question here is what sort of DSC profile is being used to achieve 4k@240Hz, because that is way beyond the uncompressed 48Gbps limit of HDMI 2.1 - even with RBv2 timings. I keep reading that Nvidia cards sacrifice a mess of features to use DSC, and the results are less than pretty. It's such a shame Nvidia didn't embrace DP 2.0 this generation - we might have seen a much better adoption rate of the port.
I would be interested how would this compare to the ASUS equivalent, and yes I mean the one that has only 4K 240 HZ, as they are releasing two versions of the 32 inch OLED (4k 240HZ only and 4k 240hz/1080 480hz).
Now if I could just get this exact panel but flat instead of curved that would be great. 32" monitor that doubles as a bedroom TV. That's my go to. Sadly I probably won't be able to afford this quality of a panel for another 5-10 years 😂
@@Davids6994 content consumption while laying in bed. Movies don't look great from 8 feet away on a 32" display, let alone a curved 32" display. (My "gaming monitor" = my bedroom TV)
I think my biggest holdup would be that the last time I tried a 32" 4K monitor, I thought that the resulting image at 100% was just a *TINY* bit too small in the desktop. I think I was only using 125% scaling in the end, but that little bit helped. To a degree, I'd rather just avoid scaling if I can, so I wonder if a 36" panel would get me to that sweet spot. Albeit, at that point, I'm getting closer to the 42" TV that I'm currently using!
@@GirlOnAQuest Depends on which. An LG CX is going to get you far, although way too big for me. My personal preferences are 27 or 32 for main, 24 or 27 for secondary, nothing bigger, nothing smaller.
@@GirlOnAQuest I was a bit hesitant as I was going from a dual 27" setup to... something. I tried out the Alienware widescreen, the Samsung ultra widescreen, and just a 32" 4K from ASUS. Ultrawide ended up being a no-go due to some games not supporting it, and just not liking the idea that I could be stuck with black bars on the side due to a game not supporting wide or ultra-wide resolutions. The ASUS monitor was just awful in comparison to the other two that it wasn't even a contender. I ended up picking up an LG 42" C2, and while it's not perfect, it works well enough. (It was also on sale, which was nice compared to its more expensive monitor variant, the ASUS PG42UQ.) I do kind of miss distinct monitors at times, but I also like having the larger monitor in the center. The one weird thing that I have to keep in mind is that I need to wait a second after turning the TV on or else my window positions and sizes will get messed up.
Exactly. It's so clear that the ltt team just has no idea what they are talking about when it comes to modern games or graphics tech. Channels like digital foundry are much better for that. Not using dlss at at least Quality Mode is just stupid nowadays, if you play at 4k. Native with TAA looks worse. The "blur" or whatever he's talking about is either a game issue or placebo. You will get a worse picture quality at native with TAA. And native without AA is just full of aliasing.
@@gavinderulo12 Yes they're actually clueless. At that framerate Frame Gen is literally free fluidity and DLSS results in a flat better image at Quality.
I am gonna stick with my Alienware AW3423DW Ultrawide 1440p monitor, for higher FPS, 4K isn't here yet. Also AW3423DW latest firmware are here now and can be update.
My only concern with this, that y'all didn't touch on, does it still have the weird text issue that other OLEDs have had where the sub pixel layout is incompatible with the windows text protocol? The only dealbreaker for me would be the weird pink and green halo on text.
@@Phil_529 Only maybe If You have black bars with a static image. 99,999% of the cases You have black bars with a video that shoulden't have any impact.
Its kind of crazy how you can convince yourself that screen looks so much better when Im still just watching it on my standard pc monitor. I mean it cant really look any better than my current screen is capable of displaying right? 😂
It already is supported, but it barely has any uses other than Netflix and pirated movies. My experience is with laptops though, so it may be different with a third party monitor.
This is exactly why 4k still to this day doesn't make much sense for gaming. Even with maxed-out PC you won't be able to hit those 120+ fps. 1440p is still the sweet spot. That is why I am so excited for the new QD-OLEDs that have been announced this past week, like the one Linus presented. Another huge thing for me is also the much improved text clarity. 2024 might be the year I'll finally switch to OLED.
@@ZabivakaPirate69 Lowering res makes the image worse and DLSS at performance is still a pretty big performance hit over 1440p Quality. And reducing settings once again gets you only so far and affects the image quality to such an extent that you wonder why you even bothered to go up to 4k.
@@ZabivakaPirate69 Changing resolution in monitor makes everything look terrible. For some reason they look terrible even if the resolution change is made so that 1 pixel becomes 2x2 grid. No, resolution change is not the answer.
0:41 I have acute psychosis and when I hallucinate I hallucinate all the time from getting drugged so from PCP so it's very hard for me so if you guys can work on like the text is very hard to read and then I start hallucinating so just try to work on that I'll try to work on so it's not so harsh on my eyes cuz it's very hard for me to look at like very like pixelated text
Aspect ratio is irrelevant. It's the width that determines whether a curve is useful. As the owner of a 32" curved (16:9) monitor, I can confirm that it is.
@@jasonhurdlow6607 Aspect Ratio is NOT irrelevant. It might not be the main factor, but it's not irrelevant at all. It is a combination of Size, Aspect Ratio, Viewing Distance, Radius of Curve, and Use Case that determines what is useful or not in a monitor. As a fellow owner of a 32" 16:9 curved monitor (Odyssey Neo G8), I can tell you that its usefulness is completely subjective. For gaming? Maybe, but other than that, I'd rather have an unskewed image for anything else, which is why I have a Flat 32"(INNOCN 32M2V) as my main and use the curved screen for gaming only. Even then, I still use the flat monitor for gaming most of the time. Just feels better to me. Besides, "The Best" is, and will always be, a matter of personal preference.
I just placed my order today, arriving Monday :) - I currently own the Alienware AW3423DW which I am planning on selling to cover part of this new one.
Coming from China, a country where advertisements are forbidden to claim “The best”, I really don’t like this title, especially when there are dozens of other monitors using exactly the same panel.
Yes, I returned my aw3423dwf and the difference from 1440p to 4k especially at 32in. makes any game and video look insanely detailed. You can see pores and wrinkles in skin from far away, grass looks sharp. Spiderman looked blurry on my aw3423dwf but the now it’s crazy at the qf’s PPI
What GPU were you using since you mentioned the usage of HDMI 2.1 at some point and then image generation with path tracing. Is there an RTX4090 that has HDMI 2.1?
Never had to think about it before, never purchased external speakers. I understand it comes with eARC that can work with most modern soundbars, but what if you want to use standard PC speakers?
I like my AW3223DWF... except the outer layer of the screen is very fragile and easily picks up scuffs. My OLED TV on the other hand is tough like gorilla Glass. I really like this monitor, but I can't imagine buying it if it's going to get scratched up just as easily.
now someone else should make a video saying “YOURE BOTH WRONG THIS is the best monitor” make it a trilogy
Yeah they are both wrong. The 27” 165 HTZ one for $360 is my favorite option
Bro its your favorite monitor@@ryanhamstra49
MacAddress with the Apple Pro Display XDR /jk
But its missing DISPLAYPORT 2.1
@@gohanpcgamernot needed, your not driving 500hz, and no graphics cards have 2.1 other than amd 7000 series GPU's anyway.
The Alienware monitor team is holding this company up. Absolutely been blowing the monitors out of the park.
Dell was known for awhile for making great monitors. happy to see they are returning to that standard.
Even non-alienware monitors are great. I used a 32-inch 4k Dell monitor (not Alienware) and it was impressive. Affordable, almost no BLB, great calibration. Kind of makes Samsung look like shit in their monitor space.
@@vedantdesai1 I think Dell owns alienware brand
@@randommango1337I am pretty sure he knows just meant that it wasn’t specifically from Alienware department
@@gozutheDJ No "was" about it, they still make pretty fantastic monitors for the money. My past couple monitor purchases have ended up being Dells - the last purchase I even tried to find something else just to try a different brand, but the Dell was the best monitor for my requirements and price range lol. The next upgrade we'll be moving up in the world with nice ultrawide though, so it'll probably end up being an Alienware, even though I'll try to talk myself into an LG or a Samsung...
Could some of your cameramen make a video about the colour space naming themes and how they are measured? Nice always to get the stats, but I don't personally understand the gamut space. Thanks!
Im in the same boat here. I was going to make the same comment but no need anymore since I found yours
there's a techquickie about this, just search "techquickie color gamut". it's the one luke host
I third this.
I was literally thinking this while watching. Ive always heard about the percentage gamut space but never had it explained
They should. But for a brief explanation a larger gamut is like having a larger pool of colors. You can get more vivid "fluorescent" colors at the extremes. That's what the triangle is showing. For example a printer would have a triangle with a lot in the grey zone cause it's hard to print fluorescent colors with only a few ink colors, but it's easy for a bright monitor to display those. And then a higher bit depth means you have more granularity around all the colors to get smooth gradients. Basically exponentially more shades of colors.
At 5:12, it looks like the video was put on the display in post and that this is edited. That's how amazing that screen looks even through a macbook air display.
Exactly this! This was the first thing I thought the moment he started the video with Andy in the forest. It looks like an optical illusion 😮
Was just about to comment this lol, that's really impressive
My mind couldn't make sense of it
@shortcircuit What video is that? I can't seem to find it!
For people that prefer to use speakers that may only use HDMI (like Sonos), the inclusion of an eARC port is clutch. I hope GPU's and monitors start to include those going forward. It was such a pain to get my Sonos setup working with my PC.
Did you ever get eARC to work through your pc? I just went with optical
That would be nice, I know so many people use headphones or at most stereo speakers but if you want to hook up a surround system to computer your only option that isn't an ancient highly compressed 5.1 channel format is HDMI.
@@TheGarner0 I did, but I had to buy an HDMI splitter that would output eARC. It was like ~$80-$100 on Amazon, I think. I tried some cheaper options, but they didn’t work.
The other downside was that my computer thinks I have a 2nd monitor since I have an HDMI cable plugged into my gpu, and sometimes windows open there by default and it takes a few seconds for me to remember that whole issue.
@@PatrickTheGreat I think you dan disable this monitor in windows. although... that might also remove its ability to transmit the audio.
@@TheGarner0 currrently using eARC. PC(RTX3080) > TV(LG C1) > Receiver(Denon AVR-X1700) > Speaker.Dolby , HDR, etc everything works exactly as it should.
best part of all the new oled displays coming, the old ones getting cheaper and used oleds way cheaper, so snatching them for close or even half the original price is possible this year!
Well oled is something I wouldnt buy used
used oleds? are you nuts or do you like deadpixl
He ist dead inside so those Pixels suit him well i guess 😂😂😂
Used OLED hahaha hahaha good joke.
you want to buy a used oled because it's cheaper. i want to buy a used oled monitor because i want to see what other people's chrome bookmarks are. we are not the same
5:00 why is no one talking about that hand? It looks exactly like someone in studio was pointing at the display. Looked soooo real.
Having eArc, 4K, and Dolby Vision is huge. This will be the ultimate small space (studio, dorm, room etc) monitor that can do it all. eArc means it’s easy to just plug in a soundbar and have “hifi surround sound” that just works, a bonus if the soundbar has extra HDMI inputs. Also, w/ how accurate and affordable QD-OLED is, I foresee most of the Pro Display market dead.
Just placed my order.
Im most excited about Dolby Vision! Needs to be on more monitors
more monitors need earc.
I know the Odyssey ARK had it, but it didn't work right and dolby atmos never worked while this one claims it does. Yet to find anyone who tested it, so if you do can you give a yes/no if atmos works?
How is the monitor?
Hi, what does the extra HDMI inputs on a soundbar do?
How has it been?
the screen curve is perfect for me. I use an older 32 inch monitor as my secondary monitor right now and it does feel like the sides can be a bit too far away sometimes.
he keeps talking and talking and talking......
less reviews guys !!!!!!!
@@lucasremit's not a review. It's an unboxing/first impressions. I would generally not recommend Ltt for any type of reviews.
hey, I wanted to ask... have you ever had a 1440p ultrawide? I want to buy this Alienware 32" and I have the 34" ultrawide one, and don't want to regret the purchase when I decide to swap it for the 32" 4K model. How is it? thanks
@@dolan_plzDon't think I can offer much. I use a VA ultra-wide currently and haven't yet decided to buy the 4k qd-oled monitor. I probably will, but just not in any rush to do so.
@@MCG31 oh ok, I though you already have it 🙂
Two things that’ll never change:
-Sun rising from the East
-Plouffe lusting over Alienware OLED monitors
Well, not according to flat earthers though. I'd rather pick the speed of light being 299 792 458 meters per second.
While its not the best thing in the world, its damn good for the price. I love mine
One issue with HDR 1000 for desktop use is that it may suffer from ABL with larger bright windows being displayed. It's an issue with the AW3423DW(F), at least. If you bring a white/bright window to the foreground you can see ABL kick in, even if you're not displaying HDR content. HDR 400 True Black fixes the issue. HDR 1000 is good for games or full-screen video, but personally I just leave it in HDR 400 all the time to avoid blank screen intervals during mode switches. Also creator mode is an SDR profile, so you can't be in creator + HDR at the same time.
But you lose that hdr punch on specular highlights during actually hdr content. And is the ABL as bad in the 16:9 ratio?
I don't know if that's the way to go but I personally just turn off HDR most of the time and only turn it on for HDR videos and games, that's honestly pretty quick to do with the on/off shortcut in windows 11
@@FlipperWolf jup. That's how I do it too.
I’m generally not inclined to believe you guys when you say things like “it’s not as expensive as you might think” but in this case, I’m actually pretty surprised by the price. I expected AT LEAST $2,000 USD
Yeah, same for me, the price tag is very reasonable for this quality of a monitor, which is not usual in LTT vids
Compared to the other 32" 4k monitors on the market $1200 is decent.
(Looking at you Samsung neo G-8s)
Dont worry the same panel at Asus will cost $2k
@@deregulationIC lol, they won't have as many buyers
@@TheFreckCo price is decent compared to market, 2023 monitors will go down big time in price.
As of 11/19 this is currently on sale with Dell for $800. I also had 10% off in my rewards account and $50 in dell offers on my AMEX. A steal at that price.
Just ordered my own. Perfect timing too, the sale started while it was in my cart. Glad I hesitated for a bit, I almost wasted 300-400$.
as a aw34dw owner the biggest draw for me is simply going 4k
Same. Trying to figure out where to sell a monitor lmao 😢
@@James__Smith lmao what
Honestly i dont mind it. UW 1440p at least doesnt require 4090.
Im super happy with 34 inch one
@@PanPrezesoThis is the correct take. 1440 gaming is the performance/value option for people who know the value of a dollar earned. 3440 x 1440 is absolutely the sweet spot for gaming today. People can enjoy their 40fps @ 4k on this thing. Lovely monitor, but feels like it's two years early in terms of where GPU perf/dollar is.
@@FrenziedManbeastlaughs in 4090
3:58 Probably you're not aware, but the original dw did receive a firmware update in Dec 2023. The GSync module is not an issue anymore for firmware updates.
Finally monitors ahead of TVs and the price is not outrageous despite the Alienware badge. Upcoming 4k TVs going to support 240Hz but at 1080p.
You're lucky it doesn't have an Apple on it.
@@krane15 good point
TV brightness will hold for a much larger portion of the screen, 1000 nit at 1% screen on this monitor is actually pretty terrible. For pure HDR content TVs are still ahead by quite a bit, due to being able to maintain higher brightness over a larger portion of the screen. These monitors (yes even the 3rd Gen ones here) will dim quite noticeably when anything approaches full screen in a bright scene like snow.
@@SamfisherSam true. I learned this myself in 2017 when I bought my first amoled phone and a 4k Lcd tv.
Price is not outrageous?! Lol bs! 😆
So for a 4090 would you recommend using the display 1.4 port or the HDMI 2.1 port?
Dp
@@ShalowRecord um why DP? HDMI 2.1 runs at 48gbps and DP 1.4 runs at 32gbps, both would need DSC so I would personally run HDMI 2.1 on this monitor
I have a 4090 with the Samsung G9 OLED and I had to use HDMI 2.1 to get 240hz
@@EndoV2I kinda already have wires ran and cable management done. I used the display port cable for some reason connecting to my 4090. Do you think it’s worth it to swap out the display port with a HDMI 2.1 cable? What are the differences if both have to use dsc?
Jeez, when that arm reach in the video to tap the birdhouse I legitimately thought it was the camera man in the studio, and my mind was blown that it could interact with the video that way
Lol - same. Then the hand moved the branches in the screen. Amazing screen. Still don't know if I will like the curve.
how do you even connect 4k 240hz HDR 1000 4:4:4. I've always struggled with cables and this amount of data.
The image is being compressed for transfer, making the data stream around 4 times smaller.
DSC won't even do that rate on DP 1.4 and the RTX 4090 doesn't have that connection.@@10Filip
Are these panels always glossy or can there be a matte version? What about the purple glow the qdoleds get in a dark scene when there is some ambient light in the room? Is this still there?
YOOOO I had a panic attack when the screen went black at 7:14 haha. I'm looking at getting the ASUS ROG Swift OLED PG32UCDM. As far as I can tell the only difference between the two is that Alienware is curved and ASUS is flat?
Could you imagine how good Alienware desktops could be if that team was half as dedicated as the monitor team?
Well Samsung made the panels, so dell just stuck their sticker on it to sell it
no one cares about pre-builts, not even the developers lmao
@@raspberrycrazyant no, that is not quite true. The panel yes, but the software, outputs, inputs and other neat extras like G-Sync and Dolby Vision, also including calibration and quality control before sending out, so not all Samsung panels are going to be equally as good. Let's wait for the real reviews.
Desktops have way lower margins than monitors most likely
Dell/Ailenware are just resellers of other's great work like everybody else. They don't make these panels. It's just the matter of how much money they're willing to put into their rebranded objects.
Is this one passively cooled? Does it have a fan like its predecessors?
Has a fan
Did I miss something. HDMI 2.1 and display port 1.4 both cannot drive 4k at 240hz?
DSC
Alienware OLED with G-Sync Ultimate (AW3423DW) can now get user-installable firmware updates, just FYI since you implied that was still a limiting factor to FW update eligibility.
Thank you!!! I own that monitor and had to stop using it for work because the pulsing fan noise was driving me insane. I gave up on checking for firmware updates when I heard the GSync module meant they couldn't update it. This'll make a huge difference for me if it works. Hope this update works on older versions. I was a pretty early adopter after the LTT review.
I just updated mine due to this comment. I had the impression that the gsync module meant it was impossible to update it. It's great because hardware unboxed revisited the monitor a few time with the updated firmwares and found that it corrected some of the accuracy issues with the gamma curve (look, It has been a minute since I've seen the videos so pardon if it's wrong terminology) and also corrected the some of the standby issues I've been dealing with for a while
Why did you say 4k 240hz through HDMI 2.1 @ 2:08 ?? Is that an error??
No, he was listing off the specs but I get what you're saying. You CAN, technically, run at 240hz with screen compression but it can look like poop. Although not REALLY 240hz over hdmi 2.1, it's still possible.
I do wish they would have specified that though. It's a devils-details kind of thing.
Yeah, all right, this ticks all the boxes, that's the fastest I've pressed the 'order' button in a while. I've currently got a color-calibrated 27" 1440p monitor from mainline Dell, but it only does 60Hz and SDR, and I've kind of been waiting til there was a monitor available that was strictly better along every axis while remaining highly color accurate. So, nice of Samsung Display and Dell Alienware to launch this right when I needed it!
Sooooo… should I worry about OLED burn in or not? Honestly I will be using this more for productivity than gaming. Or should I just wait for the Dell U4025QW and avoid the OLED burn altogether?
Doesn't matter Dell gives 3 year burn in warranty
Oled burn in has been anxiety inducing a decade ago but modern oleds are much more durable, especially this 3rd gen qdoled. If you’re still worried there’s a 3 year warranty if anything
Sure, if you have a 4090 you can go with 4K. I still think 1440p is the sweet spot.
Honestly, with my eyeballs, even if I had a 7090 Super mega Ti in 3 years, I'm still sticking with 1440. 4k is just completely irrelevant to me, personally. At least in relation to monitors at normal viewing distance for me.
4k still looks better even if you have to use upscaling, speaking as someone who uses a Pixio 1440p 27inch monitor and a Samsung Neo G7 4k 32inch monitor. I literally straight up have the experience and hardware necessary to prove you wrong. I don't even use dlss and it still looks better than my native 1440p monitor in terms of detail and clarity even with FSR on balanced mode.
Complete bullshit that you "need" a 4090 to have a good time with 4k. Hate this myth. I use an RX 6800 XT.
@@BlackSmokeDMax Same. I legit cannot see any practical difference between 1440p and 2560p when I'm gaming on a desktop monitor. I'd rather go 1440 and grab all those extra 1% frames per second instead. Having my lowest 1% FPS as high as possible is priority after hitting 1440 IMO.
Honestly!
"The sweet spot is the worse one." Nah.
The one thing these screens miss is a built in eye saver/night light mode. The Windows one doesn't work with some games in fullscreen vs windowed mode.
Just bought this monitor can’t wait for it to get delivered on Saturday
Is it good?
and? Do you enjoy it?
@@El_Deen it’s been a game changer, no lie. I literally can’t go back to not having an oled 4k monitor. We’re going to upgrade our tv in the living room to be oled next but this monitor has been incredible.
@@TheDeviousLight did you see the LG 32GS95UE ( 4K 240Hz / 480Hz OLED ) coming this month ?
@@TheDeviousLightI hate you. Mine arrived yesterday, only for me to find out today that it's a dead paperweight. It won't power on or nothing. So much for a "great monitor." It's a POS!
One thing I haven't seen yet on this monitor, is reviews describing working with text for 10 hours straight. Is this monitor limited to just gaming? Or can it be also used to work 10h / week day and 10h gamin on weekends? Or am I supposed to get two monitors?
Did you get it? This is the video I've been searching for! I cannot believe nobody has put up a productivity video with this monitor...I'm not a gamer but I got it for Editing and productivity so it needs to work with my M2 Mac studio and I can't find a video to help us out! I may return it for Dell 40" ULTRASHARP 5k but not sure...Alienware aw3225qf is still in the box...there also better not be any scratches on it if i open it up...if so Dell it is!
@@pattip8110 No. Took it for 2 week tests, but text fringing was too much for me to handle, Had to return it and went back to Gigabyte M32U.
The most tempting one for me is the 27” 165 HTz version. QD-older for $360??? That’s actually affordable to someone who isn’t Plouff!!! Plus he already has a monitor
What's the model number?
None are going for $360…
I think you are misinformed. No OLED will be going for that price for quite some time.
Just save up for longer
That one is "Fast IPS", not QD-OLED
Since the firmware updates fixed alot of issues now how does it compare to Lg C2 ? Is there a huge difference in image quality and colour ?
I'm a great fan of Alienware Monitors, but a little disappointed that they didn't upgraded the DP1.4 to DP2.1. The Monitor is very appealing however, especially because of eArc and DP3 99. Would like to see an in depth color accuracy, text clarity and overall settings test.
Same man what monitor are looking to get yourself. I feel like me and you need the same type of purchase and was wondering what your interested in.
In my opinion they also forgot about power delivery and video via USB-C. :(
A KVM switch would also have been nice.
Just use HDMI 2.1 if you want 240 Hz. I don’t see why people have gripes. All the Nvidia bugs will be fixed through software.
@@YTShade I'm thinking of the new Gigabyte. All the CES monitors seem to have the same specs, except gigabyte has DP 2.1.
@1:45 Do you think Alienware realizes that there is a DP 2.1 That support eARC/ARC too? A lot has changed in the 8 years since DP 1.4.
I was kinda waiting for 42/48 inch QD-OLED panels...but maybe this is what I've been waiting for all along.
I wanted 49 too, so old Oleds or mini leds for us, great ;/
Just get the lg c2 then
Funny enough I've been waiting for this to replace my 42" C2. It's just too big for my desk so 32" would be perfect.
11:34 if you watch movies on a 16:9, wouldn't black bars be on the top and bottom and not the sides ?
This looks insane, but im so used to my 43" monitor I really need something at least 38"+ these days
Those sizes are useless for any real work. Good for gaming otherwise terrible.
That's a TV.
@@DrakonR no its not, theres no TV tuner, no remote, no smart apps (android tv etc...) and its a low latency 160Hz panel. It's just a big monitor. If you do programming, video editing, digital audio production, you'd appreciate a large 4K monitor. Its a bonus its great for gaming too ( I sit 1M away from it)...very immersive.
@@jarsky ya, I can hook my PC up to my 85" and therefore it's a monitor too.
Get over it buds, it's a TV.
@@DrakonR no a tv generally doesn't run at 144hz most of the time unless you put it in game mode. It also includes tuner, smart controls, etc...and typically won't have display port either. I know what you're tying to do by conflating the 2, but they're different. Just stop mate
Can clarify how does the earc work on a monitor? Does that mean a soundbar can be connected to that other non-earc hdmi port to receive audio?
For TV usually there’s a dedicated port that outputs earc audio signal.
They should make this monitor in 5K 16:10 resolution, and then add dual-mode so you can switch 1600p (1440p but 16:10) with higher framerate.
It would basically combine the best of both monitors.
My days that sounds good, I'm just waiting for a good 16:10 gaming OLED monitor to come out, not spotted one yet, 16:9 is not my cup of tea for gaming on, not enough vertical space for FPS/3rd-Person games, I also like to use emulators for older 4:3 games and those look much better on a 16:10 display.
@@Wobble2007 Yeah I agree. 16:9 makes alot of sense for a TV, but for a monitor 16:10 is just better.
And the 5K allows you to have dual mode to 1/4th resolution (nn interpolation) with higher frame rate.
Would be perfect imho.
@@jp_8988 5K is a really good res for 16:10, of course I think 10,240x6400 would be the perfect 16:10 resolution, 5K is just over 10 megapixels, but 10K is over 64 megapixels and would give a really nice almost print quality 450 PPI, that is my ultimate 16:10 monitor resolution anyway lol, 16K is 134 megapixels by the way, 128 times 720p, man that would give unbelievable fidelity in a video game, you don't even need to drive it with a native resolution at that PPI level, you only need to give it a 2.5K to get a nice image, obviously the native resolution would be best, but that makes it future-proof.
Yep will be perfect if they release it when the 5090 comes out.
Wouldn't it be better to use a display port connection on this monitor? I thought DP gives better/smoother frames than hdmi. I'm confused.
The older HDMI 1.4 standard didn't have the bandwidth to entirely match DP throughput, but HDMI 2.1 resolves that disparity.
@@bartolomeothesatyr isn't dp still better?
DisplayPort 2.0 technically *_does_* have a higher maximum bandwidth than HDMI 2.1, but unless your GPU can drive 4k HDR at more than 144 fps, you'll never need that extra bandwidth. They both support 4K 144+Hz variable refresh rates with full 10-bit 4:4:4 HDR color gamut, @@nickp3173
Note to editor: 1:30 don't cut away from Plouffe, he was showing where the curving was going to show up in my peripheral vision.
Why does it take so long to get latest DP support in these monitors?
The Gigabyte version will have confirmed DP 2.1. But I wish all the brands offered DP 2.1 rather than old DP 1.4.
I want this in a 40”, 5120x2160 ultrawide 4k format. That would be…the perfect monitor.
100% ! shut up and take my money. i want that so bad.
maybe a 5090 could drive that decent lol
Just upscale through nvidea control panel. 4090 can do it with 60 fps usually. I was playing hunt showdown on a lg c2 at like 7k and looked insane
@@Omnitrio1 how can you upscale via the CP? Never did it
@@Mark-oy8pf I’m thinking that would be the play. GTA 6 will be the goal, and considering it’s supposed to run on modern consoles running equivalents of a 6700XT, I expect a 5080 or 5090 should….should be sufficient for the PC version to run.
That being said, the PC edition will come out until 2026 or 2027, so it is completely possible that they may be targeting that version of the game for the 60 series….
Hey, would've been happy for you to inform us on hardware requirements to get 4k240 running. Even with a 4090 I am worried about the support of 4k240. Plouffe said "using HDMI 2.1a" but the spec doesn't support 4k240 (as stated on NVIDIAs page).
Nvidias cards are dp1.4
@@Nh_audios They are dp 1.4a, which supports this refresh rate and resolution, however the screen only supports dp 1.4 (non-a) as stated by the specs on the website.
@@mltdnmatthe yeah I’m sure the splay is still amazing but I can’t believe Nvidia did that
@@mltdnmattheyou’re overthinking it. You can do 4k240 on all the inputs.
Plouffe should do a long term review on these monitors
Dont worry, you'll hear him talk about that monitor at least once a month through the different videos or posts he will be included in, until he changes monitor
just got this monitor yesterday and its absolutely amazing! what would you recommend for picking the most accuracte color option? (standard / FPS / MOBA / etc etc)
Video idea, guys:
Comparing different panels to each other. Show WHY HDR is so much better in a way that SDR plebs will understand. There's so many videos on how cool a singular monitor is on it's own, but hardly any comparisons online showing the actual differences as apples-to-apples as it could be.
i dont think theres a way to even view hdr comparisons on a non hdr display, which 95% of people watch these videos on. thatd be my guess
They can't show you when you are watching on a SDR screen, it can only be described in nits and experience. You have to go seek out side by side comparison yourself.
@@curtisbme I feel like there's ways to do that.Various black levels are easy enough, show them sbs in a black room, the SDR panels will stand out.
LTT can also put the raw data comparisons out along with their attempt at visually showing SDR users what HDR looks like. The team is beyond creative, I have faith in them.
@@varunaX Not to us on SDR monitors, no. I'm pretty sure there's clever ways to film an HDR and SDR monitor side by side to show the differences with the right cameras and editing to SDR video.
Should i get the aw34 or this? Using RTX 3080 now and looking to play AAA games with RT, and also Overwatch 2? Thanks for the input guys!
1199 damn that was a lot less than I expected I was expected 2k range well that is still in USD so EURO would be 1300 maybe still I did tell myself for 1300 bucks I will consider upgrading my X34P finally
the price would drop to 1100 in euros
@@-Xeno- Well they actually add it on the price, with luck it will be 1199 euros but that's never the case sadly.
@@-Xeno- No you only did change conversion, but here in EU we have 20% VAT over anything imported.
@@ZinoAmareIts 1050 -1100 Euros in Germany at the moment, did you buy it?
So my big question here is what sort of DSC profile is being used to achieve 4k@240Hz, because that is way beyond the uncompressed 48Gbps limit of HDMI 2.1 - even with RBv2 timings. I keep reading that Nvidia cards sacrifice a mess of features to use DSC, and the results are less than pretty. It's such a shame Nvidia didn't embrace DP 2.0 this generation - we might have seen a much better adoption rate of the port.
I am worried about that myself, wouldnt the compression ratio for DP 1.4 be close to 3: 1 for 4k 240HZ 10 bit?
two shortcircuits in 20 minutes? what
CES rush man😂
I would be interested how would this compare to the ASUS equivalent, and yes I mean the one that has only 4K 240 HZ, as they are releasing two versions of the 32 inch OLED (4k 240HZ only and 4k 240hz/1080 480hz).
Yes, please!
I will wait for a flat version, but I'm super happy to see these specs finally arriving. It's a monitor golden age.
the 27 inch is flat. I'd recommend that one
Is this gen 2 or gen 3 QD-OLED? Since gen 3 was shown in the LTT video but the new Odyssey G8 is supposed to be gen 2?
Its gen 3
Now if I could just get this exact panel but flat instead of curved that would be great. 32" monitor that doubles as a bedroom TV. That's my go to. Sadly I probably won't be able to afford this quality of a panel for another 5-10 years 😂
Almost every other manufacturer has announced a flat equivalent. Sadly, this is the only curved one. :(
The Panel is flexible so other Monitor Manufacturers like ASUS are releasing Flat Versions with this exact Panel
What's the problem with the curve?
@@Davids6994 content consumption while laying in bed. Movies don't look great from 8 feet away on a 32" display, let alone a curved 32" display. (My "gaming monitor" = my bedroom TV)
Can Displayport 1.4 or HDMI 2.1 really push 4k 240fps with HDR?
I think my biggest holdup would be that the last time I tried a 32" 4K monitor, I thought that the resulting image at 100% was just a *TINY* bit too small in the desktop. I think I was only using 125% scaling in the end, but that little bit helped. To a degree, I'd rather just avoid scaling if I can, so I wonder if a 36" panel would get me to that sweet spot. Albeit, at that point, I'm getting closer to the 42" TV that I'm currently using!
Using a TV as a monitor is a brutal no lol
@@GirlOnAQuest Depends on which. An LG CX is going to get you far, although way too big for me.
My personal preferences are 27 or 32 for main, 24 or 27 for secondary, nothing bigger, nothing smaller.
unfortunatly 36'' is quiet rare, and doen't exist in WOLED - QD Oled panel market I think
Trying going down in the resolution instead, then you'll won't have to use scaling and you'll get more readable text, as ironic as that is.
@@GirlOnAQuest I was a bit hesitant as I was going from a dual 27" setup to... something. I tried out the Alienware widescreen, the Samsung ultra widescreen, and just a 32" 4K from ASUS. Ultrawide ended up being a no-go due to some games not supporting it, and just not liking the idea that I could be stuck with black bars on the side due to a game not supporting wide or ultra-wide resolutions. The ASUS monitor was just awful in comparison to the other two that it wasn't even a contender.
I ended up picking up an LG 42" C2, and while it's not perfect, it works well enough. (It was also on sale, which was nice compared to its more expensive monitor variant, the ASUS PG42UQ.) I do kind of miss distinct monitors at times, but I also like having the larger monitor in the center. The one weird thing that I have to keep in mind is that I need to wait a second after turning the TV on or else my window positions and sizes will get messed up.
0:36 why are you comparing a gaming focused display to a gaming monitor?
I just bought a 27" 4k 144 Hz monitor for $400 the day before this video releases, because of course I would do that. Ugh.
I mean, it's pretty damn stupid to buy right before CES. So
Yeah I don’t know man why did you decide to buy a monitor right before CES if I may ask?
It'll make a great second monitor to put your Discord on. :D
do you run any game that can do 4k at 144 fps? if not then you are perfectly fine with that monitor
This monitor is $1.199 though. You saved yourself a lot of money.
I love this monitor but I need an answer, which is better Samsung odyssey Neo g8 or this monitor?
I like the 27 inch qhd version better. You definitely do not need to drive a 4k display if you need to use scaling anyways to make it usable.
thats what im thinking too. plus you dont need a maxed our GPU to run games on the 4k monitor
at 7:54 - 8:00, that's not a frame gen issue, that's an effect from the game itself.
Exactly. It's so clear that the ltt team just has no idea what they are talking about when it comes to modern games or graphics tech. Channels like digital foundry are much better for that.
Not using dlss at at least Quality Mode is just stupid nowadays, if you play at 4k. Native with TAA looks worse. The "blur" or whatever he's talking about is either a game issue or placebo. You will get a worse picture quality at native with TAA. And native without AA is just full of aliasing.
@@gavinderulo12 Yes they're actually clueless. At that framerate Frame Gen is literally free fluidity and DLSS results in a flat better image at Quality.
How hard is it to just do this but ultrawide 😭
Ikr
Is it glossy or matte ? No review mentionned it.
I am gonna stick with my Alienware AW3423DW Ultrawide 1440p monitor, for higher FPS, 4K isn't here yet.
Also AW3423DW latest firmware are here now and can be update.
My only concern with this, that y'all didn't touch on, does it still have the weird text issue that other OLEDs have had where the sub pixel layout is incompatible with the windows text protocol? The only dealbreaker for me would be the weird pink and green halo on text.
I need the ultrawide version of this. DP 2.x please!
11:36 Why would you worry about the black bars?
Can cause the panel to burn in faster from running harder. Less automatic brightness limiting going on since a large portion of the screen is off.
@@Phil_529 Only maybe If You have black bars with a static image. 99,999% of the cases You have black bars with a video that shoulden't have any impact.
What system is realistically pushing 4k at 240 FPS to take advantage of this thing
I'm a little confused how this can support 4K 240 HZ on HDMI 2.1 when I thought HDMI 2.1 topped out at 4K 120 or 144?
Compressed
OMG I JUST BOUGHT THE OTHER MONITOR CAUSE LINUS SAID THAT WAS THE BEST ONE!!!11!!11!!1!
now i have to buy this one cause this one is now the best monitor
I did the same now I going to send it back, I ordered this one already lol
Isn't LG coming out with OLED panels that are also 4k240hz but can also switch to 1080p 480hz?
Its kind of crazy how you can convince yourself that screen looks so much better when Im still just watching it on my standard pc monitor. I mean it cant really look any better than my current screen is capable of displaying right? 😂
Any idea when Dolby Vision support will be in Windows? Or do they want you to plug in your Shield and stream from that to take advantage?
It already is supported, but it barely has any uses other than Netflix and pirated movies. My experience is with laptops though, so it may be different with a third party monitor.
Dolby Vision has been supported since Windows 10. EA used to ship busted versions of it in their games.
This is exactly why 4k still to this day doesn't make much sense for gaming. Even with maxed-out PC you won't be able to hit those 120+ fps. 1440p is still the sweet spot. That is why I am so excited for the new QD-OLEDs that have been announced this past week, like the one Linus presented. Another huge thing for me is also the much improved text clarity. 2024 might be the year I'll finally switch to OLED.
Y'know that you can always just change your resolution, also you can reduce your settings in games.
@@ZabivakaPirate69 Lowering res makes the image worse and DLSS at performance is still a pretty big performance hit over 1440p Quality. And reducing settings once again gets you only so far and affects the image quality to such an extent that you wonder why you even bothered to go up to 4k.
@@ZabivakaPirate69 Changing resolution in monitor makes everything look terrible. For some reason they look terrible even if the resolution change is made so that 1 pixel becomes 2x2 grid. No, resolution change is not the answer.
exactly. anyone raving about 4k 240hz doesnt understand that your game has to render 4k at 240fps, which wont exist for years
@@ZabivakaPirate69 what's the point of having this very nice 4k monitor then?
Is the DP to USB c cable BI DIRECTIONAL??
Bought it this morning with chat discount $1079!
What’s a chat discount?
Code?
Sales rep had to make a quote and email a link to click
@@mz1929 so you just call them and tell them what? How do you get discount
What did you ask chat to get the discount? Could you provide order number pls.
0:41 I have acute psychosis and when I hallucinate I hallucinate all the time from getting drugged so from PCP so it's very hard for me so if you guys can work on like the text is very hard to read and then I start hallucinating so just try to work on that I'll try to work on so it's not so harsh on my eyes cuz it's very hard for me to look at like very like pixelated text
Curve on a 16:9 automatically disqualifies it from "The Best"
Aspect ratio is irrelevant. It's the width that determines whether a curve is useful. As the owner of a 32" curved (16:9) monitor, I can confirm that it is.
@@jasonhurdlow6607
Aspect Ratio is NOT irrelevant. It might not be the main factor, but it's not irrelevant at all.
It is a combination of Size, Aspect Ratio, Viewing Distance, Radius of Curve, and Use Case that determines what is useful or not in a monitor.
As a fellow owner of a 32" 16:9 curved monitor (Odyssey Neo G8), I can tell you that its usefulness is completely subjective. For gaming? Maybe, but other than that, I'd rather have an unskewed image for anything else, which is why I have a Flat 32"(INNOCN 32M2V) as my main and use the curved screen for gaming only. Even then, I still use the flat monitor for gaming most of the time. Just feels better to me.
Besides, "The Best" is, and will always be, a matter of personal preference.
0:37
could we see/hear the actual resolutoin please?
those are my initials
Hi, have you had an issues with ghosting? I've also heard the monitor makes some coil noise from the back.
This video was so frustrating to watch (for my wallet)
isn't rog coming out with a 4k QD OLED 240hz 27 inch monitor?
Need this in a ultrawide. True 4k by 2k. No argument here.
I just placed my order today, arriving Monday :) - I currently own the Alienware AW3423DW which I am planning on selling to cover part of this new one.
Why you decided for the QF? Is it better than the DW?
Coming from China, a country where advertisements are forbidden to claim “The best”, I really don’t like this title, especially when there are dozens of other monitors using exactly the same panel.
This isn't an advertisement, it's a personal opinion
what kind of pc do you need to run 4k 240hz? I don't game, I just want this for everyday tasks (i have a rx 6750 xt)
And now we just need to wait for a real review with real information...
Man ltt is trash...
my question is, is the higher PPI that much noticeably better that you would upgrade from an AW3423DW to this?
Yes, I returned my aw3423dwf and the difference from 1440p to 4k especially at 32in. makes any game and video look insanely detailed. You can see pores and wrinkles in skin from far away, grass looks sharp. Spiderman looked blurry on my aw3423dwf but the now it’s crazy at the qf’s PPI
What GPU were you using since you mentioned the usage of HDMI 2.1 at some point and then image generation with path tracing. Is there an RTX4090 that has HDMI 2.1?
Would this be worth it if I'm running a 3090? I was planning on getting the 27" 360hz monitor.
Depends, can the rest of your system run the games you play at 4k >240 Hz?
I thought the max bandwidth for HDMI is 4K 120?
It's doubled with DSC
Never had to think about it before, never purchased external speakers. I understand it comes with eARC that can work with most modern soundbars, but what if you want to use standard PC speakers?
It has to use DSC at 240hz with 4k. That might hinder the clarity.
Great review! I just bought two of these and can't wait to get em running!
I like my AW3223DWF... except the outer layer of the screen is very fragile and easily picks up scuffs. My OLED TV on the other hand is tough like gorilla Glass. I really like this monitor, but I can't imagine buying it if it's going to get scratched up just as easily.
It's bad, you should replace it for this one.
is it better than aw3423dw?