Why No 12-Bit TVs & What's 12-Bit Color Anyway?

  • Published Nov 15, 2024

Comments • 412

  • @musiclistener211 · 4 years ago · +85

    I wanted to clear up a few things from your video:
    The EOTF (electro-optical transfer function) of the TV maps the source's brightness to what your TV can display, and since it is a 10-bit panel, there would still be 1,024 gradations of each color on the screen. Since the TV can't get as bright as the source material in some cases, the gradations are just more compressed (meaning more gradation information at every shade of color). If the TV is capable of displaying the brightness intended by the source material, say 4,000 nits, the EOTF would not have to re-map the pixel shades to lower brightnesses (the higher-brightness shades are re-mapped more severely than the lower-brightness ones), but the TV would still display 10 bits of color. If this is true, and I think I have a proper understanding of the topic, a 12-bit TV would have some benefit in reducing banding, even if the TV is not capable of, say, 4,000 nits. Banding is also greatly reduced by dithering, though: a 12-bit source played back on a 10-bit TV that can read the 12-bit source (Dolby Vision) and dither it before displaying the final 10-bit signal would produce a much cleaner image than a 10-bit source could, and one that is theoretically nearly indistinguishable (though still technically detectable) from a true 12-bit image (rough sketch below).
    Secondly, in your blue-to-white example, the "blue" would look almost completely black at low brightness and should look very bright and blue, not white, at the highest brightness your TV can produce. If all 3 subpixels are driven as hard as they can be at the same time, this produces the highest luminance your TV is capable of (in an RGB TV); only by blending the 3 subpixels together do we see white. Current OLEDs have an additional white subpixel, which adds to the overall brightness of the TV but not to the maximum color intensity of a single pixel.
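
    To make the dithering point concrete, here's a minimal sketch (Python, purely illustrative - not any TV's or Dolby's actual pipeline) of quantizing a 12-bit code value down to 10 bits with random dither, so the two dropped bits become fine noise instead of visible bands:

    ```python
    import random

    def dither_12_to_10(code12):
        """Quantize one 12-bit code value (0-4095) to 10 bits (0-1023),
        adding random (rectangular) dither before rounding."""
        noise = random.uniform(-0.5, 0.5)   # hides the 4-codes-per-step edges
        code10 = round(code12 / 4 + noise)  # dropping 2 bits divides by 4
        return min(max(code10, 0), 1023)    # clamp to the 10-bit range

    # On a smooth 12-bit ramp, plain truncation produces 4-code-wide bands;
    # dithering turns each band edge into noise that averages out to the
    # intended in-between shade.
    ramp10 = [dither_12_to_10(c) for c in range(4096)]
    ```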

    • @stopthefomo · 4 years ago · +28

      AWESOME insight, thank you for educating us. I'm pinning this! Gotta love my viewers, please continue contributing 😊

    • @truthseeker6649 · 4 years ago · +3

      Doesn't dithering add quantization noise? You'd need to remove that with noise-shaping filters. It's already done in audio, with side effects, which is a controversial subject.

    • @ezilka · 4 years ago

      @musiclistener211 Came for a dithering comment and got it. Somehow I was waiting for FOMO to talk about it, because you don't need a 12-bit panel TV to show an approximation of a 12-bit signal using dithering.

    • @C--A · 4 years ago · +3

      @@ezilka True, though a 12-bit panel will produce better picture quality than a 10-bit panel, specifically for 4K Blu-ray discs mastered in Dolby Vision at 12 bits.
      It's similar to the current market, where a 10-bit TV consisting of an 8-bit panel + FRC never has picture quality as good as a true 10-bit TV.

    • @optimalsettings · 4 years ago

      So does that mean sending 12-bit from a PC is visually better than sending 10-bit?

  • @KoseChris · 4 years ago · +10

    Great video, great explanation, and I totally agree we need 12-bit. Going from 6-bit to 10-bit is almost like going from 60 Hz to 144 Hz, while going from 10-bit to 12-bit is like going from 144 Hz to 240 Hz: almost unnoticeable, but it's there, and the extra "smoothness" adds a lot to the overall experience. So would 12 bits.

  • @johncolbron3136 · 4 years ago · +12

    Your videos are so good at cutting through the jargon and explaining everything more clearly than these TV manufacturers do. Keep up the good work!

  • @madpistol · 4 years ago · +7

    As someone who just bought a 55" CX, this definitely makes me feel better about my purchase. The TV can't display 12-bit color. So? I had a hard time believing there is a much better TV than this currently on the market. The colors, contrast, motion handling, gaming features, etc. are all sooooooo good. The TV looks amazing!

    • @stopthefomo · 4 years ago · +2

      Exactly. This is why the C9 and A9G last year continue to be relevant options this year and for the foreseeable future!

    • @madpistol · 4 years ago · +1

      Yokai Ninja99 I was going to post somewhere that I’ve been incredibly impressed with the dark scene performance of the CX... that is to say I don’t see any black crush. Everything looks very defined. The CX is a visual treat.

  • @timmturner · 4 years ago · +44

    I need 10K nits so I can tan at home, just put up an image of the sun and tan away.

    • @darkknightforU · 4 years ago · +5

      T Turner, Greta Thunberg is standing outside your door, knocking and screaming "How dare you!" 😂😂😂

    • @CaptainScorpio24 · 4 years ago

      🤣

    • @kjellrni · 4 years ago · +5

      I won't be happy until my TV can generate X-rays.

    • @CaptainScorpio24 · 4 years ago

      @@kjellrni 👉😉

    • @CaveyMoth · 4 years ago · +1

      This would be great for watching the movie 'Sunshine.'

  • @wizzdom100 · 4 years ago · +10

    Sony has had a concept 8K 85-inch 10,000-nit TV for a few years now; hopefully we get it in the next 7 years.

  • @coisasnatv · 2 years ago · +2

    TVs are limited to 10 bits?
    My 2012 Sony Bravia KDL-47W805A (KDL-47W802A in the US) has a 12-bit display; it works with both 4:4:4 RGB and YCbCr. It's the one I use to edit videos and photos, and it can display 12-bit colors just fine. The sad thing is that OLED and other technologies still use 10-bit displays.

    • @C--A · 2 years ago

      No, your 2012 Sony Bravia doesn't, lol. No current consumer TV or professional monitor has native 12-bit depth!
      That Sony Bravia you own from over a decade ago was a cheap, second/third-tier IPS LCD at the time, with poor black levels compared to VA LCDs, let alone OLEDs with true blacks!

    • @coisasnatv · 2 years ago · +2

      @@C--A I've worked in the industry since the late '80s, and yes, this 2012 Sony Bravia monitor has a native 12-bit display. Only a few of us had that information at the time, which is why I bought it and why some studios still use it.
      It really doesn't matter to me what you think about it. This just proves that people who are supposed to know these things actually don't know anything at all.

    • @akyhne · 18 hours ago

      @@coisasnatv Your display is not 12-bit or 10-bit. It's the electronics inside that work at a certain bit rate; the LEDs on the display don't care about bit rate.

    • @coisasnatv · 15 hours ago

      @@akyhne It has nothing to do with bit rate; the display can handle 12-bit color depth.

    • @akyhne · 15 hours ago

      @@coisasnatv The display, the thing you're looking at, doesn't work with a bit rate. It's completely analogue.

  • @mikef5659 · 2 years ago

    Are you a teacher/professor? If not, you should be, because that was probably the easiest-to-understand, best-spoken video I've ever seen. Very well done!

  • @LeMatt87n · 4 years ago · +45

    Next episode he’s not even going to be in the frame.

    • @stopthefomo · 4 years ago · +9

      ROFL, was thinking the same thing while editing - "what the hell??" In my defense, this video was recorded at 1 AM after a long day at work, and I was just happy I got the information correct - after re-shooting 5 times!

    • @sn1-w5f · 3 years ago

      😂 😂 😂

  • @abap-gaming · 4 years ago · +4

    From what I understand, HDMI 2.0 only has enough bandwidth for HDR in 4:2:0 chroma. As long as the CX can do HDR with 4:4:4 chroma, I'm good.

  • @stopthefomo · 4 years ago · +14

    Does this put a damper on your dreams of buying a 12-bit TV on Black Friday, or are you still rockin' an 8-bit CRT?

    • @knekker1 · 4 years ago · +1

      Apart from LG secretly downgrading the HDMI 2.1 bandwidth without telling anybody until questioned about it, I think it would be insightful if you addressed why it is such a big deal that LG CX TVs got their bandwidth reduced from 12-bit to 10-bit. Judging by the specs, it really shouldn't matter, given that the 10-bit panel on the LG CX can't display more than 10 bits of color anyway. So why the big fuss? Perhaps keeping the 12-bit bandwidth would allow overclocking the TV's frame rate beyond 120 Hz without having to compromise on uncompressed 4:4:4 chroma subsampling?

    • @tehama4321 · 4 years ago

      @@knekker1 You can't overclock the refresh rate beyond 120 Hz.
      You're correct that it shouldn't make a difference, since a 10-bit panel is still a 10-bit panel regardless.
      In his previous video he claimed that the C9 is "more future proof" than the CX, since it has the full 48 Gbps and can accept 12-bit.

    • @bravedwarf · 4 years ago

      @@knekker1 For me, the C10 is for my Xbox Series X this fall.

    • @bravedwarf · 4 years ago

      @@holgerbahr8961 Could you please elaborate on the missing DTS encoder?

    • @knekker1 · 4 years ago · +4

      @@tehama4321 I guess the big question then is: how is the C9 more future-proof with its full 48 Gbps bandwidth, if the panel is still just a 10-bit panel?

  • @PaceyPimp · 4 years ago · +1

    So should I never choose the 12-bit option on media devices, or will 12-bit interpolate or downscale better than 10-bit color?

  • @christophermartin1973 · 3 years ago

    I feel the need to necro this thread to point a couple of things out:
    1. Downsampling always produces better results than upsampling - so content should always be recorded at 12-bit or higher so it doesn't look terrible in the future. I want video that gets better over time, not worse.
    2. I want 12-bit color specifically because it requires a higher nit value, so the top-end TVs get way brighter and maybe even need some active cooling - and this will make the trickle-down effect of brighter 10-bit displays happen much quicker, because in order to meaningfully show off 12-bit capabilities on those big-box-store display walls, manufacturers have to raise the brightness to differentiate them side by side.
    Sometimes we want "the thing" not because it's useful or practical, but because of the implication of that thing existing: it needs to stand in contrast with what we already have in order to sell.

  • @pheotonia · 4 years ago · +8

    10K nits would probably not be any hotter than my LG Plasma from 2010!

    • @stopthefomo · 4 years ago · +3

      LOL I have literally felt my bedroom get hotter when watching on my 50" Kuro - had to keep the windows open!

    • @mastersmurfify · 4 years ago · +2

      @@stopthefomo Still running my Kuro now (I game on an LCD I plan to replace with a CX). I'm not even going to worry about this, because my Kuro still looks great for streaming Roku TV.

  • @fabiozagoo · 4 years ago · +1

    So why does the Oppo 203 upscale to 12-bit if the TV can't handle it? To buffer or reduce artifacts?

  • @johnnyhustle6976 · 2 years ago

    When people were limited to 480i standard-definition (SD) TV, they probably didn't think they needed high-definition (HD) television either. But when they saw how crisp, clear, and vibrant the HD picture quality looked, they knew they needed it. The same will be true of 12-bit color. HDR on those cheap 8-bit TVs, like the older TCL 4 Series TVs of 3 or 4 years ago, looked decent. However, when we saw what real HDR10 and Dolby Vision looked like on a 4K TV with a true 10-bit panel, those of us with sense knew how much of a game changer it was. And the same will happen when true 12-bit panels are available for retail purchase.

  • @kev3226 · 4 years ago · +1

    Thanks for explaining it. It really helps. Now I won't be missing out on 12bit color.

  • @adammcpherson9536 · 4 years ago · +1

    So well explained FOMO. Really enjoy your style of presenting information in a clear and entertaining way.

  • @HeloisGevit · 4 years ago · +2

    Thanks for this vid, very informative and shows that kicking up a fuss over the CX not having bandwidth for 12 bit is ridiculous.

  • @gamersplaygroundliquidm3th526 · 2 years ago

    Watching you right now in 12-bit on my 65-inch TV. It looks amazing since I found the settings I'd forgotten about, and yes, it really is in 12-bit mode, and it is very noticeable compared to 8- and 10-bit. So much so that when my wife came home, she started to say hello, saw the TV picture paused, and said, "You bought a new damn TV? Again?" I really had to show her a few videos to explain how and why it looks so much better.

  • @xephyrxero · 1 year ago · +1

    So now that Hisense's UXN is going to have 6,000 nits of brightness, can we resume our desire for 12-bit color?
    Although it's still not fully there until all these TVs start supporting 100% of the Rec. 2020 color space. I want that even more.

  • @DCMCOSPLAYdotCOM · 4 years ago · +1

    While nits may be one way of looking at it, take a deeper dive into color banding. This is where more color depth is needed, even in the 300-nit range. For example, take a color gradient between any two primaries, say red to blue. Even at 10-bit, that's 1,024 shades you can use between them, while a 4K screen has nearly four times that many pixels across. If each pixel cannot be assigned a different color, banding can occur. The problem is even worse when you zoom into that gradient: you still only have 1,024 shades but are trying to fill more pixels with them, so more banding occurs (rough numbers below). 12-bit is barely enough to satisfy the needs of 4K displays. This is especially an issue with gaming; movies, BT.2020, compression - that's a whole other can of worms when it comes to color. Games can produce a wider range of colors, and gradients are used in virtually everything. Even worse, look into chroma subsampling, which can make text unreadable at lower bit depths.
    We need 12-bit / 4:4:4 color for more reasons than just the Dolby Vision specification. The lack of it is a total dealbreaker for HTPC enthusiasts.
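
    Rough numbers for that, as a quick sketch (Python, illustrative arithmetic only): stretch a gradient across a 3840-pixel-wide screen and count how wide each flat band of a single shade becomes:

    ```python
    def band_width_px(screen_px, shades, gradient_fraction=1.0):
        """Width in pixels of each flat band when a gradient spanning
        `gradient_fraction` of the shade range fills `screen_px` pixels."""
        return screen_px / (shades * gradient_fraction)

    print(band_width_px(3840, 2**10))        # full 10-bit ramp: ~3.8 px bands
    print(band_width_px(3840, 2**10, 0.25))  # zoom into a quarter ramp: 15 px
    print(band_width_px(3840, 2**12, 0.25))  # same zoom at 12-bit: ~3.8 px
    ```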

  • @green_universe · 2 years ago

    I know what color bit depth is, but your in-depth explanation was colorful; gotta tip my hat to it.

  • @tomchan2559 · 4 years ago · +1

    So 12-bit TVs are what manufacturers should focus on in the near future. In terms of resolution, 4K is good enough for most households.

  • @lucee2261 · 4 years ago · +20

    Excellent, informative video - you should be a teacher. 👍

  • @chrisbullock6477 · 4 years ago · +8

    That was SHARP, by the way, with the yellow.

    • @stopthefomo · 4 years ago

      YES, I didn't want to say anything, as I felt bad kicking a dog when it's down. Sharp has really gone south since those heady days when they were literally at the cutting edge of TV technology - that yellow pixel caused quite a stir, but it made everything feel like the "magic hour."

    • @thomasvinelli · 4 years ago

      @@stopthefomo I thought Hisense bought Sharp.

    • @stopthefomo · 4 years ago · +3

      @@thomasvinelli Definitely not! Foxconn bought Sharp, and Sharp continues to develop and produce panels in Japan. Hisense only bought Sharp's TV factory in Mexico and the exclusive naming rights to sell "Sharp"-branded TVs in the U.S. So all Sharp-branded TVs in the U.S. are actually rebranded Hisense sets, but outside the U.S. Sharp continues to be independent of Hisense. Sharp was hoping to re-enter the U.S. and take back its naming rights from Hisense, but changed its mind after CES 2020, when retailers like Best Buy said, "No, nobody will buy your TVs no matter how good they are."

    • @rolandm9750 · 4 years ago

      @@stopthefomo I wonder how true it is that "nobody will buy the TVs," given that Hisense and TCL TVs were fairly unknown at one time but sell fairly well today, all based on positive reviews from people like yourself and various other personalities/publications. Hisense relegated the "Sharp" name to the low end in Canada/US, but if Sharp "came back" and the units got some stellar reviews, I think people would start buying them just the same. The only thing is, I think they'd have to buy out Hisense's remaining time (however long they contracted the rights for), which is probably not worth it. IIRC they tried to sue Hisense, but that didn't work, so they probably just said "forget it" for now.
      That said, Quattron was interesting tech, but when it was out, those TVs weren't considered amazing just because of the yellow. Similarly with Sony's original Triluminos, which used RGB backlight LEDs: the XBR8 was an amazing TV at the time, but in retrospect the RGB backlight doesn't seem to have been necessary, given that Sony dropped it and never looked back. These enhancements seem to come and go without any real loss, just like you don't really *need* a quantum-dot filter to make a great LCD TV either.

    • @NUCLEARARMAMENT · 4 years ago

      @@rolandm9750 I have a monitor with an RGB-LED backlight, the original HP DreamColor LP2480zx with 10-bit IPS, A-TW polarizer, and 97% DCI-P3 color gamut coverage.

  • @ReganMarcelis · 2 years ago · +1

    Hey brother, I am still on a Panasonic GT50 plasma, which is how I found you originally when I started checking into the different techs. This Panny plasma, at least in Windows, says it is now running at 12-bit depth, but my Radeon Pro WX 8200 says 10-bit? Either way, it definitely looks more amazing than the panel used to, if that's even possible. Is it actually possible that my Panny is running at the 12-bit depth Windows states, or even the 10-bit depth my Radeon says? I've had the TV a long time, and it feels like it looks the best it ever could; it's not just in my head. Do they make monitors in 38 to 42 inches (ultrawide would be a bonus) with 12- to 16-bit color, the higher the better? I am dying for this; I love the rich colors. I'm finally ready to get rid of my Panny GT50, as the power it draws is getting absurd these days, even though she is still nice to me, even for gaming. I also had the screen get a tad messed up: "somebody" cleaned it and must have hit a tiny scratch that went through the black pro coating, which I did NOT know was an actual coating, and it wiped off, so it looks like a blotch on the screen when the TV is off. When it's on, it is barely visible except at a very unusual angle, and even then only when a specific color is in that area, which rarely happens. Just a shame, but what is done is done, so it's time to upgrade. I will always miss plasma for certain reasons, especially the GT and ZT (even VT) series. I think you have or had a ZT65 or ZT60? I always wondered: the VT65 or VT60 up against the best Pioneer Kuro plasma, who would win? Or the last plasma ever released by Samsung, I forget the name/number - which would win? I would like an ultrawide panel in 38 to 42 inches with the brightest nits and the highest bit depth, quality-built all the way down to the stand; a refresh rate of 90 Hz or above would be awesome as well. Suggestions?

  • @chungexcy · 4 years ago · +14

    12-bit vs 10-bit is all about more shades and better gradation. They have the exact same range defined in HDR; both can go up to 10,000 nits.
    12-bit color can provide smoother gradation and keep the delta-E between two neighboring colors below the human perceptual threshold. 10-bit is pretty good; 12-bit is perfection.
    If you don't explain the Perceptual Quantizer (PQ) curve defined in the HDR standard, your audience will never get this point.
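
    For anyone curious, the PQ curve is a fixed formula (SMPTE ST 2084); here is a small sketch (Python, constants taken from the published standard) showing how a code value maps to nits, and why 12-bit only adds finer rungs on the same 0-10,000-nit ladder:

    ```python
    def pq_eotf_nits(code, bits=10):
        """SMPTE ST 2084 (PQ) EOTF: integer code value -> luminance in nits."""
        m1, m2 = 0.1593017578125, 78.84375
        c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
        ep = (code / (2**bits - 1)) ** (1 / m2)
        return 10000.0 * (max(ep - c1, 0.0) / (c2 - c3 * ep)) ** (1 / m1)

    # Half-scale codes in 10-bit and 12-bit land on nearly the same luminance
    # (~92 nits); 12-bit just offers four times as many steps along the way.
    print(pq_eotf_nits(512, bits=10), pq_eotf_nits(2048, bits=12))
    ```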

    • @noname1st139 · 2 years ago

      I've got a 12-bit option on my Fire Stick; would you have that on instead of 10-bit? I've got a Samsung QN90A 50. Thanks in advance 👍

  • @id104335409 · 4 years ago · +5

    Close your right eye to watch this video comfortably.

    • @stopthefomo · 4 years ago · +1

      I'll drift back into the frame next time :)

  • @AndySomething · 4 years ago

    Found your channel recently and I'm very impressed. This video was very informative, and I've come away having learned something I didn't know before 😅 All the best ❤️

  • @RushHourGameplay395 · 3 years ago

    Awesome information, couldn't be better. Thanks!

  • @zeonabdul2001 · 4 years ago · +5

    So I may be watching a 9-bit TV right now. How nice; let's just put the cart before the horse and move on to 16K.

    • @Toliman. · 4 years ago · +3

      @@holgerbahr8961 Pfft, worst year... as if. In 2013 and 2014, they made HDMI 2.0 4K TVs with "HDR support" knowing they couldn't get 4K HDR over HDMI 2.0 cables; it was limited to 1080p HDR.
      The UHD group also didn't ratify HDMI 2.0b until after manufacturers had started making inroads on HDMI 2.0, leaving two years of premium-priced 4K TVs that you could not play UHD discs on.

  • @Jimmy-ph8xn · 4 years ago · +1

    So even though the Xbox Series X says it supports up to 12-bit color, no display can handle it? Even Hisense Dual Cell probably can't really display 12 bits either?

  • @MERCERENiTY · 4 years ago

    Already liked and subscribed at the one-minute mark of the vid. The way you talk and what you say reeled me in that quick.

  • @JackRABBITslim27 · 4 years ago

    I would just like to say thank you for all you do. You literally answer all the questions for TV buyers of all types in a very logical and intelligent way. In my quest to set purchase requirements to buy the best TV in the year 2023, it's hard to track what these manufacturers are doing. Bit depth has been a recent interest of mine, and you pretty much settled that debate. I currently own a 2019 Samsung Q70R paired with a Panasonic UB820 to watch my movies. Future purchase requirements are hard to make. I believe the global pandemic will drive higher-bandwidth streaming content, making 8K more viable. However, HDMI 2.1 and 4K@120 are still the biggest upgrades for my needs. Thanks again, and stay safe. EDIT: Also, entertainment and sporting events with no spectators, combined with increased at-home viewership, might force content providers to invest in higher-bandwidth content. Just my 2 cents.

  • @H0DLTHED0R · 4 years ago · +1

    Been wondering this myself, especially with 8K TVs now here. Thanks for covering this.

  • @SwagKingColeLeprecun · 4 years ago

    "The pixel is lit" - I felt that.

  • @robertobuatti7226 · 4 years ago

    I'm a person who gets easily confused, but you explained the color system and peak brightness very well and made it easy to follow. I just purchased a 55" Samsung Q80T because I couldn't stand the blooming on the X950G, but I'm a little disappointed with the peak brightness on the Q80T: it's only 700 nits, and having spent $2,100 I should at least be getting 1,000 nits.

    • @jkairi4 · 4 years ago

      700 nits is not bright enough for you? There are compromises with all of these displays.

    • @robertobuatti7226 · 4 years ago

      @@jkairi4 For HDR it's not, because HDR needs 1,000 nits or more. But for SDR it's fine.

  • @killerkevin27 · 4 years ago · +1

    So I've been watching you and Vincent from HDTVTest for a year now, and I swear I just now realized that you two aren't the same person. Mind blown, but at the same time I feel like an idiot, lol. In other news, Best Buy just delivered my LG CX today. Just waiting for my stand to come tomorrow.

  • @cjnelson79 · 4 years ago

    I really enjoyed this. Thank you. Very enlightening. You do a great job of teaching these things.

  • @gaming-zombie1392 · 4 years ago

    Thanks for the info. I'm happy with 8-bit for now, and when we hit 12-bit it'll be amazing to see...

  • @chris30605 · 4 years ago · +7

    So this video makes your other video irrelevant. The LG CX not having the full HDMI 2.1 48 Gbps bandwidth won't affect anything but a 12-bit panel, if I'm correct. The new CX is not a 12-bit panel and probably won't get to 1,000 nits, so why worry about the missing 8 Gbps on HDMI 2.1? I own a C9, btw...

    • @knekker1 · 4 years ago · +1

      My exact question as well. I'm speculating that the downside of limiting the bandwidth could be missing out on the potential to overclock beyond the TV's 120 Hz refresh rate without having to compromise on 4:4:4 chroma subsampling.

    • @musiclistener211 · 4 years ago · +1

      Dithering of 12-bit content at 120 Hz cannot be done by the TV. Perhaps it can be done instead by the source device before sending the signal. If the source is capable of this, you won't really be losing anything at that point.

    • @jaredjones787 · 4 years ago · +3

      Holger Bahr, you are incorrect. HDMI 2.1 is not a mandatory list of specifications; it's a menu of specifications, and you do not need to achieve the full 48 Gbps. Look it up. Don't give information unless you know it is 100% accurate, please.

    • @chris30605 · 4 years ago

      Alfiandri Adin, OK, yeah, well, I do agree with that...

  • @JuanGarcia-lh1gv · 4 years ago

    Thanks for the information! I have an 8-bit display with only 180 nits of peak brightness, but to my surprise, after calibration it covers 100% sRGB and 100% P3! I calibrated it with my ColorMunki. 8-bit content looks great, but the contrast is only 3000:1, so it doesn't hold a candle to OLED or QLED.

  • @1VideoGameDude · 3 years ago

    Thank you for explaining this. I noticed this was filmed 9 months ago, and I'm wondering how true it still is today. Is 12-bit still irrelevant as long as our TVs don't reach 10,000 nits? I'm on the fence about splurging on the LG CX because of the 10-bit issue and wanted your input. Thank you again!

  • @BryantAvant · 4 years ago

    So, I was at CES 2018, and Sony had an 8K 10,000-nit TV. It was piercingly bright but didn't give off heat.

  • @dantefekete7617 · 3 years ago

    Thank you very much for this video. I do, however, have one question. I am a hobbyist filmmaker, and my friends and I get together to watch each other's films and stories. As I film in 12-bit 1080p, would a 4K TV downsample that information, or will I have issues viewing that footage appropriately? We are all coming from using actual film and editing for projection, so digital cinema is somewhat new to all of us.

  • @tehama4321 · 4 years ago · +10

    This video demonstrated just how unreasonable it is for anyone to get upset about any current TV not being able to accept a 12-bit color signal. It also shows that you were well aware there is no difference between the image displayed by a 10-bit panel that can accept a 12-bit color signal and one displayed by a 10-bit panel that can "only" accept a 10-bit signal. However, you chose to frame your previous video as if this year's CX panel was "dumbed down," and as if, for PC gamers, the C9 was more "future proof" for being able to accept a 12-bit signal, even though it couldn't display it either.
    Now that you've explained how far away we are from fully exploiting the 10-bit color space, and how far away we are from having 12-bit panels: rather than asking us whether we still want a 12-bit TV, how about answering when you're going to make a video explaining why you were the cause of FOMO, saying that the CX not accepting a 12-bit signal makes it any less capable in any way, shape, or form - including as a PC monitor - when 12-bit color can't be displayed on it (as you so eloquently explained in this video)?

    • @cryptohawado705 · 4 years ago · +5

      Lol, he makes a video as if people wanted a 12-bit display, when he was the one acting like LG took something from him by not supporting 12-bit color input on the new OLED.

    • @pharmd718 · 4 years ago · +5

      Exactly what I was thinking as I was watching this.

    • @Jigga0428 · 4 years ago · +2

      tehama1979 I was waiting for this myself - not a video about something no one has. We pointed out the issues with the last video: you created hype over something that wasn't an issue, which this video shows you knew. Yet your last video created hysteria and misinformation, when the whole point of this channel is to educate and provide accurate info that doesn't misinform consumers.

    • @Heliosvector · 4 years ago

      What's the issue?

  • @suliman9058 · 4 years ago

    I miss college, thanks for the lecture

  • @AzzaBro59 · 4 years ago · +1

    I feel the simple way of explaining it is that there are no 12-bit panel TVs, so the 10-bit panel only needs a 10-bit signal altogether.
    I feel smarter. But one question: how do they get a low-luminance white? But then I guess OLED has the white pixel.

  • @aboodinatorgaming1929 · 3 years ago

    I was searching for this after I heard that LG nerfed the C series from 48 Gbps on the C9 to 40 Gbps on the CX, and I was wondering if that even mattered. This cleared up the issue for me. Thanks.

  • @eeeeyuke · 4 years ago

    So why the minor uproar upon finding out about the HDMI 2.1 bandwidth cap on the LG CX? Even if it weren't capped, that TV isn't bright enough for 12-bit.

  • @matwtf · 4 years ago

    One thing no one is talking about is color resolution.
    On a 1080p TV, the black-and-white component of the image is 1080p, but the color is often a quarter of that. Graphics are full color resolution, 4:4:4; video, however, is usually 4:2:0 (full-resolution black and white, quarter-resolution color spread out). You don't notice, mind you, because your eyes also see black and white at a higher resolution than color.
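
    Concretely, a tiny sketch (Python, illustrative only) of the sample counts for a 1080p 4:2:0 frame:

    ```python
    # 4:2:0 plane sizes for a 1920x1080 frame: luma at full resolution,
    # each chroma plane halved in both directions (a quarter of the samples).
    w, h = 1920, 1080
    luma_samples = w * h                   # 2,073,600 Y samples
    chroma_samples = (w // 2) * (h // 2)   # 518,400 each for Cb and for Cr
    print(chroma_samples / luma_samples)   # -> 0.25, i.e. quarter resolution
    ```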

  • @SuperKrisNOR2 · 4 years ago · +1

    Great content, keep up the good work! Love your channel!

  • @MikeAningyao47 · 2 years ago

    Which is better: a 55-inch 8-bit 4K TV or a 55-inch 12-bit FHD TV?

  • @9yearoldepicgamersoldier129 · 4 years ago · +1

    What an excellent video. Thank you for explaining this; I feel much wiser now. But we definitely won't get there with OLED, so I can't wait for MicroLED.

  • @ledooni · 4 years ago · +1

    8K resolution in 12-bit at up to 10K nits will be great... in maybe 5-10 years. For the next few years, 10-bit on an OLED will still be the best experience in a dark room, or on a QLED in a bright room. The only thing I'm interested in over the next few years is whether Samsung can actually improve on OLED technology with their QD-OLED, with maybe slightly more brightness and color volume.

  • @Frencho9 · 4 years ago

    So 8-bit color at 200 nits of brightness and 10-bit color at 1,000 nits is ideal for TVs? Is it the same for laptop screens? Mine is an 8-bit, 300-nit, 99% sRGB, 17.3-inch panel.

  • @dneary · 2 years ago

    If you're talking about luminance in nits, does that mean the native colorspace of a TV/screen is La*b* or YCbCr? I thought that traditionally TVs were mostly RGB with different-colored LEDs, each of which had ~3-bit settings, but if you have YCbCr, wouldn't you still want 6-8 bits of Cr/Cb subsampling to differentiate colors? Luminance is only one axis of the color, so would 12 bits equal 6 bits of luma with 4:2:0 subsampling, or 4:4:4 RGB - is that right?

    • @xephyrxero · 1 year ago

      The bits he's talking about in the video are per channel, not the entire payload. And 4:2:0 refers to the number of samples kept, not bits: in each 2x2 cluster, the four pixels each get their own luma value, while all four share a single Cb and a single Cr sample. So with 8-bit YUV we're talking 48 bits for the whole cluster of 4 pixels, or 12 bits per pixel thanks to this compression (as opposed to 24 for RGB). At 10-bit 4:2:0 we'd have 15 bpp, and 12-bit color comes out to 18 bpp.
      So even in YUV 4:2:0, each channel still gets 10 or 12 bits, but we've cut the bandwidth required roughly in half. A hybrid with 10-bit chroma and 12-bit luma would be a neat idea, though (17 bpp).
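
      A little sketch (Python, just the bookkeeping) that reproduces those 12/15/18 bpp figures by counting samples per 2x2 pixel block:

      ```python
      def bits_per_pixel(bit_depth, subsampling):
          """Average bits per pixel: 4 luma samples per 2x2 block, plus however
          many chroma (Cb + Cr) samples the J:a:b scheme keeps for the block."""
          chroma = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}[subsampling]
          return bit_depth * (4 + chroma) / 4

      for depth in (8, 10, 12):
          print(depth, bits_per_pixel(depth, "4:4:4"),  # 24 / 30 / 36 bpp
                       bits_per_pixel(depth, "4:2:0"))  # 12 / 15 / 18 bpp
      ```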

  • @Ghostelmalo44 · 4 years ago · +3

    Amazing job you're doing, my friend. Thank you so much!! I'm learning a lot!!! Waiting for the right TV for the new XSX console!

    • @trumpameri1638 · 4 years ago · +1

      Yes, we're all waiting for the next-gen consoles, PS5 or Xbox, and the best possible TV solution.

    • @stopthefomo · 4 years ago · +3

      If you are OK with OLED, it really doesn't get better than the C9 or CX (if you need 48") for console TV gaming - it preserves all the color while having the best latency of any TV. Even when Vizio brings out its OLED, I can't imagine it will be priced lower than the C9. As much as I'm excited about what Samsung is doing with 8K QLED this year, I'm disappointed that all QLED TVs lower the picture quality and FALD control in order to achieve better latency. This means going from movie watching to game playing will degrade my color/black levels. The LG C9/CX maintains full picture quality whether or not you're in game mode.

    • @Ghostelmalo44 · 4 years ago

      @@stopthefomo What about burn-in problems with OLED?... Isn't it more convenient to get a MicroLED from LG? I heard those have amazing quality, better than QLED (I heard).

    • @AngelicRequiemX · 4 years ago

      @@Ghostelmalo44 The real question is: can you actually afford a MicroLED TV when it first releases? Probably not.
      In that case, buy an OLED instead. Burn-in is basically a non-issue with the latest generations of OLEDs unless you have 1) a defective panel, or 2) purposely abuse the TV like Rtings did for their one-year burn-in stress test.

    • @Ghostelmalo44 · 4 years ago

      @@AngelicRequiemX MicroLED has been out for a few years now. Don't know what you're talking about, m8. And they're cheaper than OLED.

  • @jakey1995abc · 4 years ago

    I have an old 1080p Toshiba TV from 2008-ish, and Nvidia Control Panel says it supports 8-bit and 12-bit color depth, but my new monitor only supports 8-bit... I've got a feeling either you are wrong or Nvidia Control Panel is broken?

  • @moireibhmacaoidh2017 · 4 years ago · +1

    Actually, there are some TVs that report 12-bit video, but I think you have to be using a computer with a GPU that supports it. I have a Samsung curved LED TV that reports 12-bit, using a 290X GPU.

  • @mjjjjslwbzxxfkkek · 4 years ago

    I absolutely love the way you indirectly call people dumb for being too scared to form their own opinion and trust their own expectations, basing their opinions entirely on what YouTubers tell them to believe instead - it's a crazy "minority" group... So before I get burned alive, I have to say it's a beautiful thing as well, and I believe every word you are saying. Love your work ☀️

  • @trendkillwill9806 · 4 years ago

    I really enjoy your videos; very informative. I'm a big nerd 🤓 so I like learning things like the nits needed to make HDR10 work, and having all the technical specs explained. I'm having a hard time figuring out which TV I want for next gen. It seems newer TVs are focusing on 8K instead of perfecting things like HDMI 2.1, HDR, VRR, and next-gen support. Thanks.

  • @IosifViorelMila · 4 years ago

    Hi, can you please make a video about the 2020 Samsung "The Frame" QLED vs. the standard QLED range? For example, how does The Frame 55LS03T compare with the Q60T/Q70T/Q80T? Looking forward to your opinion on this. Thanks!

  • @lateralus46n2777 · 4 years ago · +1

    Look out!! He's whipping out props....

    • @stopthefomo · 4 years ago

      You mean the prop that disappeared out of the left side of the frame? LOL

  • @stanlee5465 · 4 years ago

    I'm planning to harness the SUN with mirrors to power my 14 bit display panel!!!

  • @malash3972 · 4 years ago

    For my Xbox One X, should I use 10-bit or 8-bit? Also, would you recommend HDR on or off?

  • @catalin.siminiuc · 3 years ago

    1. If 8-bit has 256 shades, how is that represented at 100 nits? I thought 1 nit represented 1 step, and therefore there would only be 100 shades. Are there 0.5-nit steps, or how does it work? How many shades can 100 nits display, or how can we calculate the number of shades in relation to nits?
    2. Also, why do some manufacturers make 400+ nit panels with 10-bit and others with 8-bit+FRC? Why use FRC if true 10-bit is achievable? Is something different in the pixel layout/panel treatment? How are colors added without FRC?
    3. What is the relation, or the difference, between an 8-bit panel that displays 16 million colors and the "16-bit color" I used to select in Windows to display 65,536 colors?
    Off topic: how is bit/color depth represented in printing, since there is no luminance? What is the maximum that can be printed?

  • @De03314 · 4 years ago

    This is one of your best videos! Thanks!

  • @AzzaBro59 · 4 years ago

    Fomo, you talk about brightness changing shades. But what if you want a light blue at low luminance? OLED, for example, has pixel-level control of nits. I get what you're saying, but how, for example, do they make a white with low luminance?

  • @realistvision2547 · 4 years ago · +1

    Great video, buddy; very interesting and informative. How far away do you think 12-bit TVs are - 5 years?

    • @stopthefomo · 4 years ago · +2

      Possibly 3 years, or not until TVs can honestly get past 4,000 nits. From my interviews with the leading developers of TV technology (my videos with the CEOs of Nanosys and Rohinni), my takeaway is that we are already at 5,000 nits of backlighting technology, so it's just a matter of combining this with the next generation of quantum-dot color filters - it's this latter development that we are waiting for (Samsung is working very hard on this QD color filter tech, $11 billion and all that). Nanosys confirmed that we should be seeing working prototypes of TVs using QD color-conversion filters later this year. Assuming it takes another 24 months of fine-tuning the color processor to integrate 4,000 nits of mini-LED and QDCC into a Dolby Vision-qualified 12-bit display, we may have 12-bit "capable" TVs in 3 years. Remember, this first generation of 12-bit TVs will look no better than the best 10-bit TVs at the same brightness, because the 12-bit TV would have to tone-map away all the colors lost from the 10,000-nit content, which kind of defeats the purpose of 12-bit. It's like LG giving us the HDMI 2.1-capable C9 last year when the TV itself had no hope of ever processing content requiring 48 Gbps of bandwidth. First-generation 12-bit TVs will not do 12-bit content justice, just as first-generation 10-bit TVs were not good enough to deliver HDR performance without further R&D. This year, with the Samsung Q900TS/950TS, we may be even closer to what 10-bit quality offers, because it can hit 3,000 nits while being noticeably blacker than before (but obviously short of OLED). So in 3 years, when 10-bit TVs finally hit peak performance, 12-bit TVs must do a lot better to justify the 150% price premium (yes, more than double the price at launch).

    • @realistvision2547 · 4 years ago · +1

      @@stopthefomo Thank you for the in-depth response; I really appreciate it, and I agree with your detailed analysis. There are a lot of exciting and interesting technologies on the horizon, and they are going to have a significant impact on picture quality. Until we get 12-bit self-emissive displays, though, people are best off getting a high-quality 4K TV now and then waiting until TV technology develops. This information does confirm that OLED is getting closer to its sell-by date and that the technology is not the future, whereas LED TVs will comfortably be able to produce 12-bit panels. I find this very interesting, as Samsung has stopped LED production, and LG - well, we know what little effort they have put into their LED TVs.

  • @honklertheconkler155 · 4 years ago

    Thanks for explaining; big help.

  • @ArtVandelay-ImporterExporter · 4 years ago

    I just bought a 65" Q7 last week from Costco for $1,000, only to discover it doesn't support Dolby Vision. Being constrained by a lower budget, I am unable to gratify my home-theater wishes with the best TVs. I am happy with the Q70, but I'm wondering if I could get a better TV in my price range with DV, and whether that DV would give me a better picture. Should I return it and swap it? Any recommendations if so? Last question: I notice the movies on Disney Plus and Netflix that say Dolby Vision on other devices just say HDR on my Samsung. If I don't have DV, how am I able to see HDR, which I am seeing?

  • @9yearoldepicgamersoldier129 · 4 years ago · +1

    Now I'm interested: which is a bigger deal, OLED or 12-bit color? Because you will never have 12-bit color with OLED - not even full 10-bit color.

  • @IkariWarrior1701 · 4 years ago

    Great, clearly explained video! Really appreciate this kind of content!

  • @TheEgzi · 4 years ago · +2

    Ur TV vids are so good!

  • @icekuba · 1 year ago

    Do the Samsung QLEDs like the 65Q70A/B have 10-bit panels, or maybe 8-bit+FRC?

  • @biueprint · 4 years ago

    Hey man, I wanted to ask whether you know about any new 4K 144 Hz monitors coming out this year. I was going to buy the ROG Swift 27-inch 4K monitor, but I feel like that panel has been out a while and maybe something newer is on the horizon. Can you let me know what you think?
    Monitor = ASUS ROG Swift PG27UQ 27" Gaming Monitor, 4K UHD, 144 Hz, DP, HDMI, G-SYNC, HDR, Aura Sync, with Eye Care

  • @nedywest71 · 4 years ago

    Are you planning to review the TCL C81?

  • @Odank · 4 years ago

    I understand that a 12-bit signal source may provide "supersampling" when displayed on a 10-bit display, which may or may not make a noticeable difference. But more specifically: while the 40 Gbps bandwidth limit can't carry 4K 120 Hz 12-bit, it should be able to process a 4K 60 Hz 12-bit signal no problem; that is well within the 40 Gbps limit anyway. This is for the people freaking out over this, of course. If you show me a 12-bit source on a C9 in the future, and that same source output in 10-bit on a CX, and can tell me there is a visual difference, then maybe I would be worried. But considering both displays take that signal and spit out 10-bit anyway - I doubt you will.
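
    Back-of-the-envelope math bears this out; here's a sketch (Python) of the raw uncompressed rates, using the standard 4400x2250 CTA-861 total timing for a 3840x2160 signal and ignoring FRL line-coding overhead:

    ```python
    def raw_gbps(fps, bits_per_channel, h_total=4400, v_total=2250):
        """Uncompressed RGB / 4:4:4 data rate in Gbit/s, blanking included."""
        return h_total * v_total * fps * 3 * bits_per_channel / 1e9

    print(raw_gbps(120, 12))  # ~42.8 Gbit/s -> needs more than a 40G link
    print(raw_gbps(120, 10))  # ~35.6 Gbit/s -> fits within 40G
    print(raw_gbps(60, 12))   # ~21.4 Gbit/s -> easily fits within 40G
    ```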

  • @TheUAProdigy · 4 years ago

    Unrelated: if possible, can you do a video showing how to compensate for a TV's weaknesses? E.g., fixing black crush on the TCL 8 Series.

  • @doyouthinkitsdead · 4 years ago

    Dude, what's a FOMO? Also, great channel. I've learned a few things here recently while looking at what a TV needs for either the PS5 or Series X consoles.

    • @eclectice · 3 years ago · +1

      Fear Of Missing Out

  • @Ryand0523 · 4 years ago

    I was told that Dolby Vision was 12-bit color while HDR10 was 10-bit color - back when HDR was new to consumer TVs, at least. So while Dolby Vision may define 12-bit color, displays can only show up to about 10-bit color... hmm.

  • @moviesandgamestrailler4825 · 3 years ago

    Hey man, I don't understand: at the beginning of the video you said that in order to display 8-bit color a TV requires 100 nits, and to display 10-bit it requires at least 500 nits of peak brightness. Then at the end of the video you said that our TVs today, even high-end ones, can't display full 10-bit color. Does that mean high-end TVs like OLEDs and QLEDs display only 8-bit color 🤷‍♂️?

  • @nataflet · 4 years ago · +1

    For those who don't know, the PS5 can be used with any TV.
    The PS5 doesn't require a TV with HDMI 2.1.

    • @Dutchfalco · 4 years ago · +1

      @Moist Gnome Superior specs, yes. But if you're into that, go PC gaming and wait for the 3090 cards from Nvidia to annihilate the Series X. Superior exclusives? ROFL, let's not even start on the games you'll be missing out on!

  • @Keabrown79 · 4 years ago

    So how does 10-bit via 8-bit+FRC figure into all of this as far as color depth goes?

  • @PaulShare1 · 4 years ago

    Good video, learnt a great deal here. Well done.

  • @majorastorm · 4 years ago

    I was just looking this up for the Series X.

  • @Doofsta007 · 4 years ago

    I'm always quibbling about the luminance of my TV - but that's just me nits picking

  • @tyroneslothdrop9155 · 4 years ago

    Great video; I did not know that color gamut was determined by luminance. This makes the LG controversy look even more ridiculous. I don't want to pay a premium for useless features, but many people seem to feel differently.
    I am, however, less excited to upgrade my Panasonic plasma if color banding is still an issue with newer televisions.

  • @chriskaradimos9394 · 4 years ago

    Fantastic video, thank you; you cleared it up for me.

  • @Durkur_Owl · 4 years ago

    Wait a minute. After watching this, a new/old question arises: OLED has the "perfect blacks"/contrast, but its nit output is way lower than what LED or QLED tech offers. How does this all fit in with HDR dynamics (just in terms of 10-bit)?

  • @Marc28031984 · 4 years ago

    Four years ago there was the whole 8-bit vs. 10-bit discussion.
    There were even 8-bit panels that somehow could display a 10-bit gradient (8-bit + FRC).
    And now the whole thing starts again...
    Since Dolby Vision, there should be people yelling about 12-bit, because Dolby Vision is designed for 12-bit.

  • @chrisborland6787 · 4 years ago

    Love the detailed information. Thank you.

  • @TheCrucialQ · 4 years ago

    I must add that if you are truly interested in seeing some of the best HDR, Dolby Vision, and HDR10+ content, you should consider purchasing Spears & Munsil's UHD HDR Benchmark Blu-ray from Amazon.
    There is a montage at 10,000 nits in BT.2020 (in HDR10, Dolby Vision FEL/MEL, and HDR10+ versions). This montage is stunning and really tests your display's tone-mapping capabilities.
    Tests are included to help you get to the proper settings for your display and UHD Blu-ray player.

  • @sciroccomods · 3 years ago

    I have just set up my PS5 with my Philips LED TV, and if I press the + button on the TV remote it says 12-bit 4:2:2. Does that just mean that's what the PS5 is sending to the TV, or is it what the TV is actually doing? 🤔

  • @Koopinou. · 4 years ago

    Excellent work !!! Kiss from 🇫🇷

  • @DavidGillooly · 4 years ago · +1

    I am hoping to see the H9G by June 1... will I be disappointed?

    • @zeonabdul2001 · 4 years ago

      I too am waiting on it, but with all the fiascos the big brands are dishing out, I will be happy with the price I pay for it instead of paying over $1,000 to be shafted hard!

  • @darkmatter7274 · 4 years ago

    I get the impression you are saying that the brightness range has to be extended to 4,000 nits to allow more steps or increments of light level. Surely this is not the case? Couldn't they just create finer increments within the achievable brightness range, with only a remap of the wider map required? So if there were 1,000 increments in a wider brightness range, couldn't they just create 1,000 smaller increments in a smaller range, i.e. smaller fractions of brightness?

  • @itchyblanket5508 · 4 years ago

    Awesome video!! I feel like an expert now.