I got this working with Windows 11 on a 5700 XT. The gradient in Photoshop was much smoother, to the point of not being able to see any stepping. I'd also add that the workstation-card limitation used to apply only to desktop apps; 10-bit always worked in games. But as of 2023 even the desktop limitation has been removed.
My question is this. Does it take more VRAM to display 10-bit and more horsepower to render?
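For what it's worth, here's a rough back-of-envelope sketch of the VRAM side. The pixel formats named below are an assumption about how drivers typically pack the framebuffer, not something from the video:

```python
# Back-of-envelope framebuffer VRAM at 4K (3840x2160).
def framebuffer_mib(width, height, bits_per_pixel):
    """Size of one scan-out buffer in MiB."""
    return width * height * bits_per_pixel / 8 / (1024 ** 2)

# R8G8B8A8 (8-bit colour) and R10G10B10A2 (10-bit colour) are both
# 32 bits per pixel, so the display buffer itself costs the same:
print(round(framebuffer_mib(3840, 2160, 32), 1))  # -> 31.6 MiB either way

# Where extra cost can appear: apps that render intermediates at
# 16 bits per channel (R16G16B16A16, 64 bits/pixel) double that buffer:
print(round(framebuffer_mib(3840, 2160, 64), 1))  # -> 63.3 MiB
```

So the scan-out buffer itself shouldn't need more VRAM for 10-bit; any extra cost would come from apps choosing higher-precision intermediate render targets.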
I very much appreciate the frankness that 10 bit display is usually an imperceptible difference!
I'm receiving a PD3200Q tomorrow, but I can't find info on how to enable 10-bit on a GTX card. I know that in 2019 NVIDIA released a driver update allowing 10-bit on consumer cards, but I can't find instructions anywhere, and I'm not sure whether it works over HDMI or whether I need DisplayPort.
Update for anyone watching in 2021: he mentions Windows needs workstation cards for 10-bit. This is no longer true; both NVIDIA and AMD now support 10-bit on consumer-grade cards. Just Google which cards support 10-bit colour; for my card (AMD 5900 XT) I had to enable it manually via AMD Radeon Software, as it's disabled by default. Hope this helps someone.
Yep, it's been an option on my 5700 XT for several years now.
Thank you very much for your videos and the info on the website, really amazing stuff for someone wading into the printing world.
I'd like to add that I was able to enable 10-bit on my BenQ SW270C (confirmed with the 10-bit test ramp file in Photoshop) using a GTX 1070 graphics card, by following your steps plus: 1) ensuring graphics card drivers are up to date, and 2) under Photoshop Preferences -> Performance -> Advanced Settings, setting the drawing mode to Advanced (or Normal, just not Basic; my Photoshop was set to Basic by default).
Hope this helps someone!
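To illustrate why a ramp file like that shows the difference: a standalone sketch (not the actual Photoshop test file) comparing how many distinct levels an 8-bit vs 10-bit pipeline can give a smooth gradient:

```python
# Quantize a horizontal 0..1 gradient across 2560 pixels to each
# bit depth and count the distinct output levels.
WIDTH = 2560
gradient = [x / (WIDTH - 1) for x in range(WIDTH)]

def quantize(values, bits):
    levels = (1 << bits) - 1
    return [round(v * levels) for v in values]

levels_8 = len(set(quantize(gradient, 8)))    # 256 steps -> visible bands
levels_10 = len(set(quantize(gradient, 10)))  # 1024 steps -> far smoother
print(levels_8, levels_10)  # -> 256 1024
```

At 8 bits each grey level spans ~10 pixels on a 2560-wide ramp, which is wide enough to see as a band; at 10 bits each level spans ~2.5 pixels, which is why the stepping disappears.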
You don't need a workstation GPU? I have a 1070 too. Can we enable 10-bit? Is there a performance hit?
I am about to buy a monitor that can display 10-bit (is that the consumer "deep colour" or the professional 10-bit you talked about?). The problem I have is that my graphics card only has a DVI output, and I've read that DVI can't carry 10-bit (consumer or professional?). Would it make a difference if I bought one of those Quadro or ATI FirePro cards, so the monitor can actually display 10-bit (consumer or professional?)?
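On the DVI question, a rough link-budget sketch may help. Single-link DVI's TMDS format is fixed at 8 bits per channel (24 bits/pixel) with a 165 MHz pixel-clock ceiling, so 10 bpc isn't part of its signalling at all; DisplayPort does carry 10 bpc. The ~20% blanking overhead below is a rough CVT-style assumption, just for scale:

```python
# Rough data-rate sketch: what a QHD 10-bit signal needs vs what
# single-link DVI can carry at all.
def data_rate_gbps(width, height, hz, bits_per_pixel, blanking=1.2):
    # blanking=1.2 models ~20% overhead (rough CVT-style assumption)
    return width * height * hz * bits_per_pixel * blanking / 1e9

# 2560x1440 @ 60 Hz at 10 bits per channel (30 bits/pixel):
print(round(data_rate_gbps(2560, 1440, 60, 30), 1))  # -> ~8.0 Gbit/s

# Single-link DVI ceiling: 165 MHz pixel clock x 24 bits/pixel:
print(round(165e6 * 24 / 1e9, 2))  # -> 3.96 Gbit/s
```

So regardless of which GPU you buy, you'd want a DisplayPort connection to the monitor for 10-bit; a DVI-only card can't deliver it either way.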