Whenever I compare HDR to SDR versions of movies I always prefer SDR. HDR is just too vivid to tolerate for the length of a movie. Shoot in HDR and you can fine tune your SDR output very nicely.
For movies HDR is a blessing … it looks so much better than SDR. You simply must go with quality equipment … many monitors have crappy pseudo-HDR that is useless, and the HDR on budget TVs often looks dodgy as well. It's also not supposed to look overly vivid; if it does, you should calibrate your device properly. It's not comparable to the vivid mode on TVs, which is usually unwatchable.
Hello, I have a question. I watched your video on why some movies, such as the prequel trilogy, can't be remastered. But what if they were transferred to 35mm film and then scanned digitally at a higher resolution?
Transferring from digital to 35mm won't add any new detail; it would just introduce film grain and other imperfections. Garbage in, garbage out. You can't increase the resolution of an image just by changing mediums.
Okay, maybe they won't get their resolution increased, but are you sure they would be ruined if they were transferred to film? Dune (2021) and The Batman (2022) were shot digitally and then transferred to 70mm film, and they look great!
@@alexcontrerasnegativened7777 Dune was shot for IMAX digitally on an ARRI ALEXA LF at 4.5K resolution. The Batman was also shot on an ARRI ALEXA LF at 4.5K and 6K resolution. Star Wars Episode II, by comparison, was shot at 2K resolution. You don't magically gain extra detail just by transferring digital to film; the detail has to be there in the first place to show up on film after the transfer.
I cannot stand 24fps video on a 60Hz monitor. 120, 144, and 240 are all multiples of 24, so they play it great. Of course you can set the monitor to 48Hz and 24p content will be much smoother, but then everything else will feel horrible.
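The multiples-of-24 point can be sketched in a few lines of Python: each source frame is shown for a whole number of refresh cycles, and when the refresh rate isn't an exact multiple of 24, those counts alternate unevenly (the familiar 3:2-style pulldown judder):

```python
def pulldown_cadence(fps: int, hz: int, frames: int = 8) -> list[int]:
    """How many refresh cycles each source frame occupies on a display
    running at `hz`, using ideal integer frame boundaries."""
    ticks = [(i * hz) // fps for i in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(pulldown_cadence(24, 60))   # [2, 3, 2, 3, ...] uneven -> judder
print(pulldown_cadence(24, 120))  # [5, 5, 5, 5, ...] even -> smooth
print(pulldown_cadence(24, 48))   # [2, 2, 2, 2, ...] even -> smooth
```

Any refresh rate that divides evenly by 24 gives a constant cadence, which is exactly why 120/144/240Hz panels handle film content so well.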
You still can't actually use either for true HDR work. I hope you don't sell your services to any real delivery streams like Netflix or any other pro network.
He explained this at the beginning of the video. Pro work is subjective. I color, QC, and deliver HDR work all the time on a monitor that doesn't support 1000 nits, all while making a good living. For jobs that absolutely require it, I'll rent suite time or a monitor that does support 1000 nits.
don't stop. your videos are so useful and unique.
Thanks for the detailed review!
I’m curious if you have had any problems with overheating with this monitor?
Nope. It's never overheated on me and the fan has never been obtrusive. I can't even hear it next to my PC
@@VideoTechExplained awesome, thank you!
I bought a UCR about 9 months ago and I absolutely love it. I use it only with a Blackmagic Design video output device, so I don't have any odd computer color profile issues. I find the display works better when used as a true video display. That's at least what I've found so far in my use.
I was also concerned about the local dimming zones, but frankly I feel this concern is way overblown in the industry. Yes, OLED is awesome and it's easy to love the per-pixel brightness, but in most content the blooming is rarely an issue. Even when it's there, I just don't consider it much of a showstopper. It's still HDR in the sense of hitting 1000 nits. The contrast may not be there, but it's not hard to look at scopes and a mostly black image and just know the blacks would likely look darker, with less glow, on an OLED display. It really doesn't take a lot of imagination to visualize around the blooming, in my opinion.
What the UCR gets right for HDR is Rec.2020 color and a sustained 1000 nits across 100% of the frame, without any burn-in or eventual dimming of the brightness to prevent burn-in. On OLED, if a bright scene is left on screen for a few minutes while chatting or examining details, it will eventually dim. For grading I prefer to have Rec.2020 color and actual nits of brightness for what I create. If I'm targeting HDR 1000, then I want to see 1000 nits. Not on 10% or 20% of the display, but 100% of the display. I personally feel OLED can throw us off more because of all the dimming and brightness limiting it does. The UCR allows me to grade much more consistent results, in my opinion.
Yes, the blooming is there at times, but I also have an Apple 14" MBP with an XDR display and a 13" M1 iPad Pro with an XDR display, and to be honest it's not like the blooming ever truly goes away. Smaller blooming is of course better, but it's still there on any of my Mini-LED displays. Even XDR. It's one of those things that on paper feels like it would be inferior, but in practice I just don't find a massive difference visually between 512 and 2048 local dimming zones. At least not enough to say the blooming is no longer there.
I'll also add that I feel some space movie scenes are graded too much with OLED in mind and push contrast along edges too hard; if they worked that contrast a little more, a lot of that extreme blooming could be avoided on any type of panel. I feel like blooming is a thing because content isn't graded to compensate for it. Grading on Mini-LED, to me, means great HDR content that can look great on both Mini-LED and OLED.
Good info. I'm considering this monitor. Thanks for your input. Would love to get your thoughts on the measured contrast ratio in SDR vs HDR. Does it get above 1000:1 in SDR at all? Also, a quick note on the following point: "If I'm targeting HDR 1000 then I want to see 1000 nits. Not on 10% or 20% of the display but 100% of the display." It's a misconception that HDR is all about full-screen 100% brightness. It only concerns specular highlights or bright areas, and the difference between those bright areas and the darkest parts. Look up Light Space's article on HDR; they point this out. If full-screen brightness were the essence of HDR, you would get a very flat, washed-out image at that high luminance of equal value across the screen. And every manufacturer's HDR brightness spec is talking about "peak" luminance values. The full-screen brightness spec is separate and always indicated as such with the wording "full screen brightness," which is always lower than peak brightness. But outside of that, would love to get your feedback on measured contrast ratio in SDR vs HDR.
@@chevonpetgrave4991 Sustained brightness is important. Yes, it should only be used for specular highlights, but sometimes those highlights can take up a larger portion of the screen. This is where OLED can struggle: the brightness drops considerably, toward 200 nits, when there are larger areas of coverage, like an explosion on screen or someone opening a door to let in a huge amount of light.
This is the problem I have with OLED and why it's not the end-all HDR solution. Not yet, at least. Plus OLED drops brightness over time to prevent burn-in. You could be grading a scene of a bright sky, or an interview in front of a window, where that bright region stays on screen for many minutes. The panel will lower the brightness eventually, which means the brightness you see after a minute or two will not be what you saw initially.
This is why I prefer a display that can hold a sustained 1000 nits of brightness. I know there is nothing funny going on. I can grade the same scene for over 30 minutes and it's still just as bright. I can grade an explosion or sudden burst of bright light and it will be as bright as the glint along the edge of a sword. It's a level of consistency just not possible on OLED. The only negative is the blooming, which I feel some incorrectly assume is a lack of consistency, but it's really just an artifact. It's annoying to see, but so far it has not had a negative impact on my ability to judge color or brightness.
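To make the window-size argument concrete, here is a toy model (the numbers are invented for illustration, not measured from any real panel) of peak brightness versus bright-window size for a hypothetical OLED with an aggressive brightness limiter, next to a panel that sustains 1000 nits at any window size:

```python
def oled_peak_nits(window_pct: float) -> float:
    """Toy ABL model: full peak up to a 10% window, then a linear
    roll-off down to 200 nits at a 100% (full-screen) window."""
    if window_pct <= 10:
        return 1000.0
    if window_pct >= 100:
        return 200.0
    return 1000.0 - (window_pct - 10) * (800.0 / 90.0)

def mini_led_peak_nits(window_pct: float) -> float:
    """Sustained-brightness panel: 1000 nits regardless of window size."""
    return 1000.0

for w in (2, 10, 50, 100):
    print(f"{w:>3}% window: OLED ~{oled_peak_nits(w):.0f} nits, "
          f"mini-LED {mini_led_peak_nits(w):.0f} nits")
```

The exact roll-off curve varies by panel and firmware; the point of the sketch is only that on a limiter-driven display the same pixel value can land at very different luminance depending on how much of the frame is bright.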
As for SDR contrast, I honestly don't know. I only use it for HDR. It's only hooked up to an AV output device, and I only use it as a video output when working with HDR. To me black levels are black levels, however, and when I have output SDR to the display it was still capable of using the local dimming zones to turn off the darker regions. That shouldn't change between SDR and HDR. What impacts contrast on Mini-LED is how many dimming zones can turn off the light to make those pixels true black. A normal LED panel has a single backlight, so the contrast suffers. This display looks good with SDR content as far as I can tell because it can still turn off the lights behind the pixels that should be dark. Beyond that I don't have much of a professional opinion on how it compares to something like OLED for SDR content.
I do, however, question the industry's need to grade at that level of contrast. Just how dark do we want people to see? How important are those details in the 0-to-1-nit range that we want people paying attention to them over the actual subject?
I love your videos. I hope you will start uploading again. I learn so much from these videos and I’ve always had a huge interest in video technology.
I'd definitely want at least a 120hz monitor. 2024 is also going to be a great year for affordable HDR monitors. CES is coming up next week. Hope you can give your opinions on future monitors
Any monitors from CES that you’re excited about..I’m thinking of waiting abit
Probably the ASUS ROG Swift PG32UCDM that's coming out this month. 4K 240Hz OLED, 1000 nits, Dolby Vision, glossy screen. @@Iamlordmanu
How many Hz is this monitor?
@HindsightWithHustle From what I could find, 60Hz unfortunately
@@HindsightWithHustle Not that relevant for content creation … most still have 60/75Hz, even up to £3,000.
Ah, he's back!
I found myself here after buying this monitor and I have the same problem you do where discolored blobs appear in areas of uniform gray, typically in the sidebars and backgrounds of windows. I notice it heavily in my text editor. I'm not a colorist, and solid color across the screen is obviously an uncommon scenario when grading video, but seeing the disparity in perceived color in these discolored blobs does give me pause when I think about how important color accuracy is for many professionals. The issue is pretty easy to perceive if you set the Dynamic Dimming to Fast. The problem goes away entirely if you set Dynamic Dimming to OFF, but of course you then lose a ton of contrast in dark areas. I've only had this monitor for a couple days now but so far everything you've said in your video has been absolutely spot on in my experience with it so far.
The YouTube thing you mentioned is a feature of YouTube when you're not in theater mode; it's not an issue with the display
KEEP MAKING VIDEOS. Witty and clever, great videos
Waiting for an updated video on the monitor before I pull the trigger and get it
You're amazing! Keep up the great work
I very rarely leave comments.
That was a great video.
Great info. Keep em coming. Thanks!!
thank you so much for the info!
Have you tried the m4 iPad Pro? It’s surprisingly good, tandem oled helps with the brightness. I have the mini led m2 version, which is also surprisingly good.
12:42 Yes, these problems are normal for that display. I use the PA32UCX-K, and it has the same "effects." I don't know why ASUS's local dimming algorithms work this way, but it's quite annoying. You should also be able to switch to the SDR monitor profile, and then that backlight behavior should disappear, as all the mini-LEDs will work synchronously.
16:09 I never managed to get the calibration software working properly. I always get a panel overheat warning during calibration. :)
The INNOCN 27M2V and 32M2V are a straight upgrade from this monitor for 25-30% less cost.
They have more or less the same specifications, except with 1152 dimming zones and a 160Hz refresh rate on the 27M2V (144Hz on the 32M2V).
It's not exactly an upgrade, but better in some ways. Price depends on where you live. It's over 1k in Canada and there's no warranty.
@@truedoh2831 I bought it for $600, it has a warranty, and yes, I am Canadian.
It is indeed an upgrade too. There are no specs where the PA32UCR beats it.
Please make 30 day later review so we can see your experience after using it
I used it during the production of this video and I had no issues with it!
Many content creators do not use the hardware outside the reviews … they only use it for the review or a YouTube build and then go back to their own hardware … hence follow-ups rarely happen.
I sold my PA32UCR recently to make the switch to a higher-end gaming OLED. Color accuracy and uniformity are mostly the same, and I strongly prefer the contrast and higher refresh rate. However, I will definitely miss the sustained 1000 nits at full window
Which gaming OLED? I'm trying to get a monitor for grading
ASUS ProArt does have OLED lines, but with low refresh rates
That ambient light on YouTube is on by default, so maybe it's not a panel issue
Do you use DaVinci Resolve? Do you have a DeckLink for an HDR connection to your monitor?
Being a gamer as well as someone who does video work, I landed on a Samsung Odyssey Neo G7 with 1196 backlight zones, 4K resolution, 165Hz, 10-bit color, "2000" nits peak brightness, and HDR support. It does what I need for both, though I can't seem to get DaVinci Resolve to recognize it as an HDR display
Thanks for making this video. Can we turn off the mini-LED local dimming to minimize the halo effect on this monitor or the one you own? It seems like a major detriment when doing color correction.
You can turn off the local dimming but then the shadows are incredibly washed out
For YouTube, did you have ambient mode on? That makes a color gradient on the sides to make it more "immersive"
Nope, I checked that. Ambient mode was off
Imagine in the future, like in 20 years, they figure out how to manufacture MicroLED displays well enough that we all have relatively affordable MicroLED monitors 🥲 Extremely high-brightness LEDs with OLED-like blacks. Maybe we can dream of a cool future
Thanks for the insightful video. I wanted to get the UCR but it sold out at B&H. I'm intrigued to find out that the display's peak brightness can go over 1000 nits, but I suspect the 1000-nit designation is for sustained brightness. Hopefully it comes back in stock.
It is more or less sustained, even if they don't say it. Since I mainly work in HDR 1000, I find the 1000 nits sustained just about perfect and I don't really need more. I use both of my Apple XDR displays (14" MBP and 13" iPad Pro) set to 1000-nit sustained HDR as well, and all three look gorgeous together. So happy to have 32" now, along with really good Rec.2020 color.
Did you consider the Philips 27E1N8900 (equivalent to the ProArt PA27DCE panel), which has 400 nits peak and is RGB OLED for $1000?
That's what I went with and I'm very satisfied with it. I don't have it calibrated, though; I'm not really sure how to, especially for HDR.
As far as peak brightness: more nits are always nice, but movies have looked wonderful for years in SDR. I think needing every shot in a video to go over 200 nits across the whole panel is probably overkill. I started thinking this way after watching a talk on HDR best practices that discussed the conservative use of peak brightness in Dune (2021).
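For context on where those luminance levels sit, the PQ (SMPTE ST 2084) inverse EOTF maps absolute nits to a 0-1 HDR signal value, and it shows that conservative highlight use still leaves real headroom: 1000 nits is only about 75% of the PQ code range, and ~200 nits sits near 58%. A minimal sketch of the standard formula:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance (0-10,000 nits) -> 0-1 PQ signal."""
    y = max(nits, 0.0) / 10000.0  # PQ is defined up to 10,000 nits
    p = y ** M1
    return ((C1 + C2 * p) / (1 + C3 * p)) ** M2

for nits in (100, 200, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ {pq_encode(nits):.3f}")
```

The curve allocates most of its code values to the low end, which is why a grade can live mostly below 200 nits and still read as HDR when the occasional highlight reaches up.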
OLEDs are definitely a great option for HDR grading. I just happened to lean towards Mini-LED in this case because I wanted something I could use without fear of encountering burn-in. I was also worried that the videos I graded on the OLED were brighter than I thought they were due to the panel limiting the brightness.
@@VideoTechExplained Ah, that's fair and a legitimate concern. As far as dimmed brightness, don't these monitors have a clipping mode where they don't tonemap (I could be totally wrong)?
Anyway thanks for the video, I appreciate the effort you put into these! You've been the most informative channel for HDR editing setups!!
Yes. The backlight blooming doesn’t affect grading decisions.
Can this monitor be used for single-cable connection to a laptop? It lists 80W power delivery, but also says the DisplayPort support is only v1.2, suggesting you need to use its HDMI 2.0 ports for HDR support.
I haven't tested single-cable but have no reason to think it wouldn't work. You can send display output over its Type C port.
I had no issues with getting HDR to work over displayport
@@VideoTechExplained good point, probably can tunnel HDMI that way or newer DP protocol. Thanks!
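A rough back-of-envelope check of the DP 1.2 question (assuming standard HBR2 link rates and a typical reduced-blanking pixel clock; real timings vary by implementation):

```python
# Does 4K60 at 10 bits per channel fit in DisplayPort 1.2?
# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s = 21.6 Gbit/s raw,
# ~17.28 Gbit/s payload after 8b/10b encoding.
PIXEL_CLOCK_HZ = 533e6      # ~CVT-R2 timing for 3840x2160@60 (approximate)
BPP_10BIT_RGB = 30          # 10 bits x 3 channels
DP12_PAYLOAD_GBPS = 21.6 * 0.8

needed_gbps = PIXEL_CLOCK_HZ * BPP_10BIT_RGB / 1e9
print(f"needed: {needed_gbps:.2f} Gbit/s, available: {DP12_PAYLOAD_GBPS:.2f} Gbit/s")
```

About 16 Gbit/s needed against ~17.28 Gbit/s available, so 4K60 10-bit RGB can just squeeze through DP 1.2 with reduced blanking, which is consistent with HDR working over this monitor's DisplayPort input.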
Would you make a video about the PA279CRV? It seems to be one of the cheapest AdobeRGB capable displays available.
I have two of them! They're the other two monitors on my desk in the background. I can't speak to the quality of its reproduction of AdobeRGB since I've almost exclusively used them in Rec.709 mode. But if you're looking for a color-accurate SDR monitor they seem to be the best bang for the buck
@@VideoTechExplained That's good to hear. How do you calibrate them?
32” might be good as my primary display, but my others are 27”. Did you research the PA27UCX? Is it essentially the same?
It looks like the specs are pretty much the same, yeah
@@VideoTechExplained It does look like the 27" model has Dolby Vision support.
Hello, when I use DaVinci to make Dolby Vision HDR videos on this monitor, should the monitor's color space be set to HDR PQ DCI, HDR PQ Rec2020, HDR HLG BT2100, or HDR HLG DCI? Which one is best? Thank you. I hope you can help me solve this problem, because the videos I uploaded to YouTube look overexposed, while the same files viewed locally look normal. Thank you
Brother, can you please do a detailed review of the ASUS ProArt Display OLED PA32DC professional monitor?
I would if I could afford one!
@@VideoTechExplained Thanks brother.
I also just bought the UCR and have been loving it, aside from an issue it had with the Blackmagic DeckLink 4K card I'm using in my PC to view an HDR image out of Resolve. I'm curious if anybody else had this issue where the monitor would receive a 1080p image just fine, but when switching the output to 4K it would randomly go black and show artifacts at the top of the frame. I ultimately fixed it by putting an HDMI splitter between the monitor and the DeckLink card, but it was annoying that the two didn't play nicely together at 4K.
I use a BMD video output card on a Mac and output UHD perfectly fine to my UCR.
Funny story. I used to have a Mac that I used an eGPU with. Since the new Apple Silicon Macs no longer support eGPUs, I use the enclosure for the BMD DeckLink card instead. It works perfectly, since the enclosure is just an x16 slot with a power supply. So I got to repurpose the eGPU enclosure and have true UHD HDR video output from both FCP and Resolve on my MBP.
I have this monitor and the 4K mini monitor, and it's a known issue with that card. It's ancient and they haven't updated it in years.
I too had to use the User mode for my HDR. I deleted the calibration app and reinstalled it with a private link; I'd suggest not deleting the app, because ASUS is reluctant to give you a download link. Remember, their HDR is not 100% accurate, so I always create an offset node in DaVinci Resolve to compensate for the difference between what I see and what is exported. My offset node is contrast at 0.800 and pivot at 0.651. I turn it on to grade and turn it off when I export HDR content.
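For anyone curious what a contrast/pivot adjustment like that is doing, here's a rough sketch. Resolve's internal contrast math is more involved and mode-dependent, so this simple linear form is only an illustration of the direction of the adjustment; the 0.800/0.651 values are the ones from the comment above, not recommended settings.

```python
def contrast_pivot(value, contrast=0.800, pivot=0.651):
    """Approximate a linear contrast adjustment around a pivot point.

    `value` is a normalized code value in [0, 1]. This is a simplified
    model, not DaVinci Resolve's exact internal math.
    """
    return (value - pivot) * contrast + pivot

# The pivot itself is unchanged; with contrast < 1.0, values above the
# pivot are pulled down and values below it are lifted, flattening the
# curve slightly.
print(contrast_pivot(0.651))  # 0.651 (pivot is the fixed point)
print(contrast_pivot(0.900))  # ~0.850 (highlights pulled toward pivot)
```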
As it stands, there is no standard for HDR apart from a 1000-nit minimum peak brightness for an acceptable experience in most scenarios. The closest thing to a standard is Dolby Vision's P3-based (not Rec2020) PQ-curve workflow, which is a color grading (not consumer) guideline.
You're adjusting the PQ transfer curve to look like what, exactly? What reference are you basing those contrast and pivot offsets on?
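For context on where brightness targets land on the PQ curve being discussed: the SMPTE ST 2084 inverse EOTF maps absolute nits to a normalized signal value, and it's easy to compute. This is the standard formula, not anything specific to this monitor or workflow.

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal in [0, 1]."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = nits / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

# Reference SDR white (100 nits) sits around half signal, and a
# 1000-nit highlight is only about 75% of the full PQ range.
print(round(pq_encode(100), 3))   # ~0.508
print(round(pq_encode(1000), 3))  # ~0.752
```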
On your video titled: "Why some films can never be remastered" what is the background song starting at minute 1:30??
What about stills?
Whenever I compare HDR to SDR versions of movies I always prefer SDR. HDR is just too vivid to tolerate for the length of a movie. Shoot in HDR and you can fine tune your SDR output very nicely.
For movies HDR is a blessing … it looks so much better than non-HDR. You simply must go with quality equipment … many monitors have crappy pseudo-HDR that is useless. The HDR of budget TVs often looks dodgy as well. It is also not supposed to look overly vivid; if it does, you need to calibrate your device properly. It's not comparable to the vivid mode on TVs, which is usually unwatchable.
Hello, I've got a question. I watched your video on why some movies, such as the prequel trilogy, can't be remastered, but what if they were transferred to 35mm film and then scanned digitally at a higher resolution?
Transferring them to film won't magically increase their resolution or detail.
Even if they’re transferred to 70mm or even 120mm?
Transferring from digital to 35mm won't add any new detail, it would just introduce film grain & other imperfections. Garbage in, garbage out. You can't increase the resolution of an image just by changing mediums
Okay, maybe they won't get their resolution increased, but are you sure they would be ruined if they were transferred to film? Dune (2021) and The Batman (2022) were shot digitally and then transferred to 70mm film, and they look great!
@@alexcontrerasnegativened7777 Dune was shot for IMAX digitally using an ARRI ALEXA LF at 4.5K resolution. The Batman was also shot on an ARRI ALEXA LF at 4.5K and 6K resolution. Star Wars Episode 2, by comparison, was shot at 2K resolution. You don't magically gain extra detail just by transferring digital to film; the detail has to be there in the first place to show up on film after the transfer.
👍
Brother, I made a video on YouTube about the mystery of Google, come check it out 😂😂
i just want to sleep good
Personally I'm not a fan of Flanders Scientific, as they are merely a reseller of another company's product at a huge markup.
I can't stand 24fps video on a 60Hz monitor. 120, 144, and 240 are all multiples of 24, so they play it smoothly. Of course you can set the monitor to 48Hz and 24p content will be much smoother, but then everything else will feel horrible.
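The uneven cadence behind that complaint is easy to compute: count how many refresh ticks fall inside each 1/24 s frame interval. On 60 Hz you get the alternating 3:2 pulldown pattern (judder); at an exact multiple of 24, every frame is held for the same number of ticks. A small sketch, assuming vsync and no frame interpolation:

```python
import math

def frame_hold_pattern(refresh_hz, fps, frames=4):
    """Number of refresh cycles each of the first `frames` source frames
    is displayed for, assuming vsync and no motion interpolation."""
    pattern = []
    for i in range(frames):
        start = i * refresh_hz / fps        # frame start, in refresh ticks
        end = (i + 1) * refresh_hz / fps    # frame end, in refresh ticks
        pattern.append(math.ceil(end) - math.ceil(start))
    return pattern

print(frame_hold_pattern(60, 24))   # [3, 2, 3, 2] -> uneven (3:2 pulldown)
print(frame_hold_pattern(120, 24))  # [5, 5, 5, 5] -> even cadence
print(frame_hold_pattern(48, 24))   # [2, 2, 2, 2] -> even cadence
```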
Still not full-frame 1000 nits; that's a peak figure. L
Enough with this monitor....
You still can't actually use either for true HDR work. I hope you don't sell your services to any real delivery platforms like Netflix or any other pro network.
What even is "true HDR" ?
By that definition, what monitor do you need at minimum for True SDR Work? xD
He explained this at the beginning of the video
Pro work is subjective. I color, QC, and deliver HDR work all the time on a monitor that doesn't support 1000 nits, all while making a good living. For jobs that absolutely require it, I'll rent suite time or a monitor that does support 1000 nits.