Bro your English is PERFECT, the kids complaining about your speech have hearing problems.
Thank you brother 🙏
Fax
The people that complain would happily listen to mumble rap on a daily basis lol
He speaks better than I! AND I can ONLY speak English!
You sound like a beautiful young man
Thank you so much for this, I was looking at this setting on my Series X and you explained everything perfectly. Answered all of my questions before I had a chance to ask. Subscribed. Thank you!
Thanks for the information. Very interesting. You got yourself a new subscriber. I’m a nerd and love this lol. I got to appreciate my new LG CX 65” and my series X even more 😍
This is an interesting topic right now, because all the available (and many of the upcoming) AV Receivers that support HDMI 2.1 have that Panasonic chipset mismatch issue.
It happens on Nvidia cards and Xbox, but not PS5. Now the question is whether Microsoft could simply offer a "workaround option" for now that allows us to limit the Series S and X output to 32Gbit/s like the PS5 currently does, thus enabling AVR users to enjoy 4K120HDR (albeit in 4:2:2)...🤔
🙄🙄🙄 Ridiculous, am I right.
I cannot believe the amount of shit we have to do just to get plasma screen black levels. Absolutely shocking.
You are right. I did the test: I played an 8bit game with the console set to 10 bit and noticed the colors were not natural, they were completely oversaturated. I went back to 8 bit and it was much better. And for the games that are in 10 bit you are also right, because the console automatically switches to 10 bit, so it is better to leave it at 8 bit. That way we get the best of both worlds.
Thank you
@@MyGadgetsWorld what if your monitor is 10 bit?
8bit 4:4:4 on every 60hz panel is the way to go. It's 17.82 Gbps, under the 18 Gbps HDMI 2.0 threshold.
The CX is limited to 40 Gbps. Putting it at 12 bit will lower the chroma subsampling to 4:2:0, while 10 bit will give you RGB 4:4:4, which is what you want. Now put your OLED into PC mode to take advantage of it.
Why do I need to put the TV into PC mode? I know Vincent said it too, but in Gears 5 it also shows RGB.. thanks
@@tomekkrr7806 Lower input lag.
@@ReLoadXxXxX no. Input lag is 13ms in game mode and is not lower in pc mode.
@@tomekkrr7806 The TV and your Xbox "have to speak the same language", when you put the TV (LG CX, C9, etc) in PC mode, it will recognize and process the 4.4.4 signal without conversion, if you don't, it will process all signals from the Xbox as 4.2.2
@@MrKenn1230 so only in PC mode is it 4:4:4? I tested some pictures and in console mode the edges are smoother, while in PC mode the edges are less smooth. More steps are visible. Very strange.
This really helped. I tried changing the override to HDMI but switched back to auto detect because the resolution dropped. For whatever reason I never on my own thought to just increase the resolution back up to 1080p. Then I made the mistake of changing the color bit depth, only to learn that it isn't necessary. So now, years later, I'm finally playing at 60hz, which wasn't an option before.
This was an awesome explanation! Thank you!
Thank you. Please visit my official website here:
mgwreviews.com
Nice vid but flawed opinion. The Xbox and TV do have auto settings, but unfortunately they will go with the lowest common denominator to ensure a smoother experience, when there is still more to the image that is not being represented, namely in the bit depth. Really, turn on 12 bit, disallow 4:2:2, set your TV to native color space instead of auto (similar effect as the 4:2:2 Xbox option) and you will see a clear difference between 12 bit and 10 bit. Film it, show people, it's clear as night and day. The only problem I have had with specifying higher settings on my Xbox is with Hulu, it gets a greenish tint, but that is removed by checking 4:2:2 color again. I'm doing this on a 55" H8F also, so your CX should be even better.
Lol
@@dillonweaver2307 lol at what exactly? I did this and it works on my LG C3. Every other answer left my picture looking like a yellow piss hell with non existent blacks.
@@Gorgutsforcongress you don't know how to set up the TV and the console together, is all. 12 bit panels don't exist in TVs yet that I know of, so there's no reason to add input lag when it has to be converted back down to 10bit again. It's still 10 bit if you're lucky. A lot of TVs are 8bit but can accept 10 bit input, so you might as well be on 8bit. Up to you, but those consoles are not very powerful and neither are TVs, so make them do the least possible. And 4:2:2 is only a setting you should need on a really crappy 4k TV.
@@Gorgutsforcongress what's your gamma at? What are your black levels at? What color settings are you using? What about white levels? If you don't like a warm picture you don't like accurate colors, but that's another thing, that's just preference. I try to set all mine up as color accurate as possible, no additional features turned on, and get the lowest possible latency I can. My LG CX I can run many ways, but it's mostly about whether it's an 8bit or 10bit game and whether it uses HGIG or not.
I calibrate just about everyone's display for them; they always call me when they get a new TV to adjust it for home use and give them options for light and dark room use. I do it by eye, but I've done it so long I usually don't need instruments to get a pretty accurate display and fix the other issues they come with, because they're usually set up for being under a bunch of lighting, and usually cooler color lighting. So the stupid modes they come on look like garbage unless you've only ever seen them that way. I get a lot of people that say their new TV makes them nauseous. Would you happen to know why that is?
Very helpful, THANK YOU
Welcome Jeremy!
Great video brother. So I have an LG G1, I should definitely change my output to 10bit, correct? Will everything look better?
This is wonderful. Thank you!
Great video! 🙏🏽
Awesome work my friend!! Thank you!!
We would really appreciate a similar video for PS5 as well.. with the LG CX and Samsung TVs..
People are really confused out there..
Hey Sumit, I have a couple of videos lined up, I will make sure to include this.
For xbox it is easy to just keep it at 8bit and let it do the math for us. I will see what I can do for Playstation as PS5 settings give you very limited control
PS5 is capped at 32 Gbps apparently, 4:2:2
Great video. On the Sony A90K, would you use the Reality Creation setting or just leave it on auto, and if not, how come? And what about the advanced contrast enhancer, would you use that?
Great video bro. Question? Can the LG Oled C1 support 12 Bit Xbox Series X?
I'd prefer to keep it at 8 or 10; no TV supports 12bit color at the moment.
@@MyGadgetsWorld thanks bro! I appreciate it 👍🏽
The c1 supports 12bit on my series s so I would think the x is no different
Wow smart.. thanks man! How about the color space? Standard or PC RGB?
I have an LG CX too, I’ve turned HDR10 off on my Xbox series x, should I leave it on 8bit? What should my display settings look like?
Leave it at 8bit and let xbox and tv do the math for you.
Why did you turn off HDR10?
Is there a good reason for turning hdr10 off ?
@@euanm10 with it enabled it looks kinda washed out, I’ve left it off and adjusted my picture settings and it looks far more vibrant.
Informative video. I appreciate that. 👍
I have a 1080p Asus monitor that can support up to 165hz, what color depth should I keep it at? I also have an Xbox Series X.
I always hear that people think 8 bit is the best but i don’t know for sure it’s just what I hear
What do you prefer for FPS games (competitive games)? Please explain.
1080p 120hz
Great video bro! 👏🏽
Thank you.
@@MyGadgetsWorld you're welcome!
I have an Element smart TV and it lets me use 120Hz with 12Bit, but I'm not actually sure if it's doing real 12Bit since I don't think it has a picture info menu like that one. It may just be outputting 10Bit or even 8Bit while the console says 12Bit.
Thanks! That's cool to know. My LG C2 outputs 12 bit at 120 hz. It changes it from RGB to YCBCR420 12b 4L6. 48gbps. I guess they enabled that within the last two years. I also have a CX I'd like to check. But it's good to know the max and what the TV is actually outputting. I'm about to test a few 48gbps HDMI 2.1 switches, so knowing how to access this is invaluable.
No it does not do 12 bit, it's a 10 bit panel.
@@pontusborgjonsson1246 I don't understand, because it's Dolby Vision and what I understood from that is that Dolby Vision is 12 bit.
@@RinzSach well, yes and no. It's mastered in 12 bit, but all TVs can do atm is 10, so whether it's 40 or 48 Gbps etc. does not matter.
That’s 4x6 which is only 24bit
Super low
Explained a lot so thanks for the video.
I don't remember if the older LG CX could give one the full 48 Gbps. I know the C3 can. For a couple of years, they went back to 40 Gbps. The Xbox can't output 48 Gbps, only 40 Gbps, and the PS5 can only output 32 Gbps. That's why true 4K@120Hz is impossible on the PS5 and really almost impossible on the XBSX, because it doesn't have enough power. There is one game released two years ago, a 2D scrolling game, that can maintain actual 4K@120Hz on the XBSX. Most games are between 1080p@120hz and 1800p@120hz. A 60hz panel can only really supply 4K@60Hz with 8bit. An HDMI 2.0 port/cable can only handle up to 18 Gbps (not 40 Gbps or 48 Gbps). The question is 4:2:2 at 12 bit (17.82 Gbps) or 4:4:4 at 8bit (17.82 Gbps)? That's the highest for an HDMI 2.0 cable or a 60Hz refresh rate panel, not a 120hz or an HDMI 2.1 port/cable. 12bit at 60Hz is only 24.06 Gbps. Any 2.1-port TV can do this.
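If you want to sanity-check those numbers, here is a rough sketch of the arithmetic (illustrative only, not an official calculator). It assumes the standard CTA 4K60 timing of 4400 x 2250 total pixels including blanking (a 594 MHz pixel clock) and HDMI 2.0's 8b/10b TMDS overhead, and it treats 4:2:2 as the fixed 12-bit container HDMI always uses for it, which is why 12-bit 4:2:2 and 8-bit 4:4:4 both land at 17.82 Gbps. Formats over 18 Gbps have to run on HDMI 2.1 FRL instead, whose 16b/18b overhead is where a figure like 24.06 Gbps for 12-bit 4K60 comes from.

```python
# Rough HDMI bandwidth sketch for 4K60 (illustrative, not an official tool).
# Assumes the standard CTA timing: 4400 x 2250 total pixels incl. blanking.
PIXELS_PER_SECOND_4K60 = 4400 * 2250 * 60   # 594 MHz pixel clock

def hdmi_gbps_4k60(bits_per_component, chroma, link="tmds"):
    """Approximate link data rate in Gbps for a 4K60 signal."""
    if chroma == "4:2:2":
        # HDMI carries 4:2:2 in a fixed 12-bit container, so 8/10/12-bit
        # 4:2:2 all cost the same as 8-bit 4:4:4.
        bpp = 24
    elif chroma in ("4:4:4", "RGB"):
        bpp = 3 * bits_per_component
    else:
        raise ValueError(chroma)
    overhead = 10 / 8 if link == "tmds" else 18 / 16  # HDMI 2.0 TMDS vs 2.1 FRL
    return PIXELS_PER_SECOND_4K60 * bpp * overhead / 1e9

print(hdmi_gbps_4k60(8,  "RGB"))           # ~17.82 -> fits HDMI 2.0 (18 Gbps)
print(hdmi_gbps_4k60(12, "4:2:2"))         # ~17.82 -> also fits HDMI 2.0
print(hdmi_gbps_4k60(10, "RGB"))           # ~22.3  -> needs HDMI 2.1
print(hdmi_gbps_4k60(12, "RGB", "frl"))    # ~24.06 -> needs HDMI 2.1
```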
I have an 8-bit + FRC "10-bit" monitor and I usually go with 10bit 422. I get less banding when gaming. I honestly can't tell a difference in other things except the color banding. Text and menus look the same.
Thanks for this info.
I have an 8bit + FRC myself, the Samsung Q70B QLED TV, and I unchecked my 422 box in settings and have the console at 8bits. You say that enabling 422 and going up to 10bits on my Series X would be better? Isn't all the SDR content, like games in SDR and YouTube videos, in 8 bits anyway, and wouldn't my Series X go up and down automatically without being forced to 10 bits always? Wouldn't 422 limit my bits to 10, so I could never achieve 12bits while it's enabled? I mean, 420 12 bits would always be better than 422 10 bits if there is content available for that range of bit depth, don't you think?
If you set it to 10-bit and then play an Xbox game that doesn't have HDR, which would be 8-bit, does it automatically switch to 8-bit colour?
No because you are forcing it to keep it at 10bit.
Keep it at 8bit
@@MyGadgetsWorld Xbox should just put in an auto option in that menu so we can just set it and forget it.
@@MyGadgetsWorld why do you advise keeping it at 8 bit? Normally 10 bit or 12 bit is better?
@@Mck179 because xbox can switch it automatically for you. No need to push it to 10 or 12. SDR content uses 8 bit
On the Series X I'm lost. To get the best image in SDR and HDR, do you advise setting 8 bit or 10 bit? Because if the console automatically switches to 10 bit, what is the point of putting it in 10 bit? I don't understand, thank you for enlightening me. Do I set 8 bit or 10 bit?
Hey, is there a way to fix the frames per second counter on the free sync menu when in 120 hz mode? Somehow it always displays 5.5 when the system is in 120hz mode
Thanks for the explanation and advice.... much appreciated mate ✊👊
Welcome. Have a great weekend and happy new year.!
@@MyGadgetsWorld heyyy, you too my brother ✌
Vizio M55Q8-H1 2020. Which color bit, 8 or 10? And enable or disable YCC 4:2:2? I play 4K HDR 60hz and 1080p non-HDR 60hz games.
How are you getting RGB? I keep getting ycb 4:2:0
I am using TV in PC mode
How do you get that pc mode?
@@widaflow change the hdmi input from game to pc
@@widaflow change HDMI input icon (icon, not text) to PC.
How do you show that FreeSync info? I pressed the green button or the two-dot button on my LG remote, but all it does is control my Xbox home screen like a controller? Mind you, I have AMD FreeSync off because it disables Dolby Vision.
You need to press the small green button at the bottom quickly several times until you see it. It's a geek thing :)
Great video thanks
Don't tick 4:2:2 and leave it at 8bit. When HDR is played it will automatically change to 10bit. When SDR is played it will stay locked to 8bit. Just leave it alone or you will wash out colours in SDR games.
@wesley 504 SDR=Standard Dynamic Range.
It does not on Windows, in HD color.
I get less color banding in games when I use 10bit 422. (2.0 monitor with 8-bit + FRC)
@@LouSanuz Not sure about that.
What would you recommend as next best tv to get if you can’t afford the LG OLED...I really want VRR though...
get the smaller oled or get the sony x900
Why does everybody want VRR when all it seems to do to my games is wash them out? I had to disable vrr while playing GTA because it looked so flat.
@@hugono3938 I got the LG A1 OLED, love it but it doesn’t have VRR
@@Gorgutsforcongress you must mean HDR washes out the color. You gotta adjust the settings but yeah it’s not quite as vibrant because it’s trying to simulate how your eyes really see color in real life
@@symboltherapper no the vrr definitely does something to the blacks in game. It raises them for some reason and to me all games just look a tad better and punchier when vrr is disabled.
To use a 4k TV at 60hz over HDMI 2.0 I have to choose 8 bit, right? And with HDR on, the color depth will change automatically to 10bit, is that correct? Do I have to check YCC 4:2:2?
8 bit changes to 10 bit automatically on hdr games
@@itzdougie9745 no, it does not on PC with Nvidia in the Windows 10 HD color menu. It stays 8 bit but it will be HDR.
@@lolerie I meant 8 bit changes to 10 bit on non hdr games , sorry
@@itzdougie9745 yes, that is true. Games that support HDR will switch you to 10 bit HDR from 8/10/12 bit SDR.
I saw Vincent say that allowing YCC 4:2:2 on HDMI 2.0 TVs actually outputs 12bit 422 even though the signal info displays 8bit 422. My question is, does this same process work on non-Dolby Vision TVs, specifically HDR through 4K60?
2 yrs later and I'm just about to put the hammer down on a Series X, and I found your vid and thought, dude. I'm currently on an Xbox One and a BenQ projector from 2015; it's putting out a crisp 1080p pic and I really love this platform a lot. Where is the Series X gonna disappoint? I subbed, I want to know what's going on since I'll watch it. I really need to know: are my settings wack or what? Have a good one. Later days. -951-
It’s 2022 and I was suddenly bugged by this in terms of which is the better option. I blame my PC (long story). So basically what I gathered is PC RGB is best for monitors, otherwise Standard is for TVs, and bit depth wise just leave it at 8-bit like you said because the console will work out when it needs to use anything higher. So far would you say I am on the money? I hope so! Lol
I've got a Samsung Q60R, should I set it to 10bit or 8bit?
Hi, can you tell me which is best, a 55 or 48 inch TV, for the Xbox Series X please?
65”
Lg c1
Thanks for your time
Do you advise putting the console at 8 bit or 10 bit?? I should mention that I have a 4k 10 bit OLED.
Yes, the Xbox has the ability to switch it up to 12 bit, so no worries at all. SDR content will use 8bit, so you don't want to force that to 10bit or 12bit.
@@MyGadgetsWorld OK, so no loss of image quality if I leave the console at 8bit?? But then I don't understand why it gives you the choice between 8 bit, 10 bit or 12 bit if the console does it automatically?
Not at all. Leave it at 8 bit and see how it works for you. I have been using it on 8bit since xbox one x.
OK, thanks
@@Mck179 The Xbox switches to 10bit (if the monitor supports it) when outputting HDR anyway, so it's the best of both worlds if left at 8 bit.
Thank you Mr Pichai
Thanks a lot!
Unfortunately, I have an LG OLED77CX9LA, and I can't get that information with the green button :(
Press the green button 8 times in a row quickly.
Vincent from HDTVTest says to leave Series X at 4K120 with 10 bit for a full 40GBPs bandwidth with the LG CX
@@driverdis3488 it worked, thanks a lot!
Your English is really good. You can never perfect something, you're good bro.
Mine works on 12 bit, will that be good for Warzone?
No lol
Where'd you get your TV for 4k UHD and 120 hz, and how much?
$1000 and over. I get them from Best Buy or Costco.
@@MyGadgetsWorld whats it called bro i need a link maybe lol
@@medievalshadows2836 LG CX?
Hello, on my 4k OLED with my Xbox Series X, do you advise setting the color depth to 8 bit, 10 bit, or 12 bit?
10
Depending on your monitor if it's HDR then 10 bit is all you need and can utilize on today's monitors
I have a Samsung Q80A 4k 120hz tv, seems to work at 12 bit
Thanks for the wonderful demonstration!
Thank you Curtis.
I set mine to 10 bit to upscale my games so my TV doesn't have to keep switching between 8 and 10 all the time.
What would you recommend for Sony 900e owners? I’m running 1080p 120hz, which in this mode, hdr isn’t supported. Please help! Thanks
I wish I had sony 900e or 900f. Sorry . I hope someone else can help in the comments.
Can the Xbox series x on the LG C9 do 12-bit @120hz ?
No, it can't support 12-bit regardless of panel.
The reason is the bandwidth of the Xbox. It has 40 Gbps. Max is 10bit 4:4:4.
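To make the step-down concrete, here is an illustration of the tradeoff (my own sketch, not Microsoft's actual logic; the Gbps figures are rounded approximations for standard CTA 4K120 timing over HDMI 2.1 FRL): a source with a fixed output cap effectively walks down its format list until something fits, which is why a ~40 Gbps cap tops out at 10-bit 4:4:4 and a ~32 Gbps cap tops out at 4:2:2.

```python
# Illustration only: picking the best 4K120 format that fits an output cap.
# The Gbps figures are rounded approximations for standard CTA 4K120 timing
# over HDMI 2.1 FRL; real devices may use slightly different timings.
FORMATS_4K120 = [
    ("12-bit RGB / 4:4:4", 48),
    ("10-bit RGB / 4:4:4", 40),
    ("10/12-bit 4:2:2",    32),
    ("12-bit 4:2:0",       24),
]

def best_format(cap_gbps):
    """Return the highest-quality format whose approximate rate fits the cap."""
    for name, rate in FORMATS_4K120:
        if rate <= cap_gbps:
            return name
    return "drop the resolution or refresh rate"

print(best_format(40))   # Xbox-style ~40 Gbps cap -> 10-bit RGB / 4:4:4
print(best_format(32))   # PS5-style  ~32 Gbps cap -> 10/12-bit 4:2:2
```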
What would you recommend for HDMI cables for simply the XBOX Series X connected to an LG 32 inch 32GP83B-B.AUS? I don’t have a PC. Thanks for the video and any advice you can give me would be great. I’m attempting to get the best settings for PUBG and yes like you said , I keep changing my settings non stop and experimenting lol. Thanks in advance .
Zeskit HDMI cables are best
@@MyGadgetsWorld thank you so much. There so many different variants of them though. Lol. Any one in particular? Price isn’t a concern. Just want the best possible option.
@@jimmyjameswu6474 Get the braided one any as long as it says certified Ultra High Speed HDMI cable you will be all set
@@MyGadgetsWorld again thank you so much. You’re a legend mate.
@@jimmyjameswu6474 really appreciate it my man. Glad to help.
I have a Samsung 50” 7 series RU7100 and I'm wondering what colour depth setting I should set my Series X to.
I noticed if I set it to 8 bit the game tile for Assassin's Creed Valhalla looks pixelated on the Xbox home screen. I've heard that it changes to 10 bit automatically when viewing content, I've heard setting it to 10 bit causes unnecessary HDMI bandwidth because it's converting it to 10 bit twice, and I've heard 12 bit can cause crushed blacks if the panel doesn't support it, so I'm completely lost on what to set it to.
Same dilemma, same TV and Xbox Series X too. Have you solved the problem yet?
Can you post about color space on Series X? Limited vs Full Range RGB, which you prefer and why. Thanks.
You shouldn't see much difference between Limited (Standard) or Full Range RGB (PC RGB) if all the settings are matched properly between your TV and your XBOX (and the calibrations (color, sharpness, etc) are set properly for each mode). Most specialists recommend using the PC input ON YOUR TV and setting the color space to the default Standard setting (remember to also set your TV's black level to LOW (not auto) ) for smooth transitions between video and gaming. I've seen both settings work but that is the general recommendation.
Limited
Nice video bro
On PC, HDR is horrid so I've been sticking to SDR. Just wondering if there is any benefit to 10 bit SDR or if everything should stay at 8 bit. Also, should I be going for 4:4:4 or should I stick with RGB? - thanks
There is no such thing as 10 bit on SDR 🤣
Great video! Thanks so much for your help! I'm trying to get my LG C1 to play (fps) with Ori and the Will of the Wisps. I'm changing it from 60hz to 120 in my settings, but when I look at my Game Optimizer it says 59 frames.....
nick
Is it happening in other games or only that specific one?
Check the settings for the game to see if it is in 120 fps mode or not. I just checked and it is a game which can be played at 60 and 120. Make sure it is set to 120 fps (performance in the game settings).
@@MyGadgetsWorld I'll get back to you
I have seen people recommending PC mode to get full 4:4:4 HDR color. What are your thoughts on this?
You get that either way. People just assume things sometimes without actual application. You can see it switches to RGB automatically. Keep it at standard and let the TV do the switching.
@@MyGadgetsWorld I wonder if the PS5 will do the same thing. Evil Boris recommended the PC mode on CX for full HDR color.
@@MyGadgetsWorld since changing my CX to PC mode while on the Xbox input I've noticed it pops up saying "instant game response" on every page I go to. It used to be only once when I first switched inputs to the Xbox. Small gripe, but if you don't think there's any benefit I'll change it back.
Try to revert back to console.
@@MyGadgetsWorld The AVS forums people say that display info is wrong. "That output info is misleading because this tv ALWAYS downsamples to 422 unless pc mode is enabled. There is no other way around. You want 444? Pc mode is the only way to go."
I've got 36 bits (12 bit), is that the best?
No
I have the LG OLED C7 TV and when I play my Xbox on 10 bit there's a lot of cropping on video, why? By cropping, the image takes a long time to resolve.
So regardless of what the TV is capable of, the bandwidth from the console is limited, you're saying? So if I wanted 12 bit 444 120hz 4k, it's just not possible because the Xbox only outputs 40 Gbps?
@MyGadgetsWorld I have a 2016 E6 OLED and was wondering what are the best chroma settings to use ie (4:2:2 or 4:2:0) for the PS5?
4:2:2 is always better if you can
@@MyGadgetsWorld wouldn't it be 8bit then and not 10bit due to HDMI 2.0 on my E6 not having enough bandwidth?
@@CosmicNutcase 8 bit. 10bit for 4:2:0 hdr
@@MyGadgetsWorld so switch the PS5 on my E6 to 4:2:0 to get 10bit colour yes?
Yes at 60hz.
What if you tested this on the C9? Would it change the results?
Almost same
thanks..liked and sub'd
Thank you so much.
I got a Samsung Q80B, 65 in... I put it on 12bit and it works fine..
I'm having issues with my Xbox Series console displaying Dolby Vision gaming on my TCL 5-Series TV.
There is a grey-like haze showing on my TV when HDR is enabled.
It only happens when the Dolby Vision box is checked.
I have 2 questions i hope someone can help with.
The first question is more a confirmation.
I bought the new LG NanoCell 85 series 65in. For my Xbox Series X, do I want to use 8bit, 10bit, or 12bit? 10bit would be best, correct?
My second question is, my new TV has "AMD FreeSync Premium". Should I turn this option on? If so, what's the best setting? High or wide?
Thanks everyone. Any advice is greatly appreciated.
Hello. You can set your bit depth to 8 or 10, either will work, I'd just look to see which one looks better. I use 10 bit since the Xbox has to go to 10 bit for HDR anyways. Make sure you uncheck the YCC 4.2.2 on your Series X. Also, you may want to skip AMD FreeSync and just use VRR; FreeSync on the LG can disable Dolby Vision in some instances without added benefit. It seems that reviewers are saying that VRR and FreeSync have similar performance (but VRR doesn't have the Dolby Vision loss).
@@MrKenn1230 Thanks man. This is the first bit of advice ive read with some explanations to help. Im going to jump on and check my settings again. Thanks again!
so 10bit is the best not 12bit?
I have the 2019 Samsung Series 6 4k. Can I use this setting, and how do I get Dolby Vision?
What ideal setting if using c8 for Xbox one x or ps5 or Xbox series x?
Hello. Help me please lol, I'm French, I don't understand English well.
I have an LG B7V (2017) that supports 10 bit: can I use 10 bit or do I need to use 8 bit? Is it better to use standard mode or PC RGB mode? And last question, do I disable or enable the YCC 422? Thank you so much.
Bonjour! Set your Xbox at 8bit or 10 bit (whichever looks best to you on your TV). Leave the Xbox on standard mode AND set the black level to LOW (NOT Auto) on the TV. DISABLE YCC 422 (It can degrade your picture with some inputs/signals).
@@MrKenn1230 thank you! :-)
@@MrKenn1230 HDMI 2.0 can't run 10bit with 4:4:4
@@MrKenn1230 8 bit without 4:2:2, or 10bit with 4:2:2 checked; your TV is HDMI 2.0, right?
@@MrKenn1230 why not auto?
So were you running Gears 5 in the "HDR configuration" at the end of the video? I ask because I play the same game on PC, not in HDR, but I have my Nvidia Control Panel set to 10 bit color (reference mode) since my monitor is capable of "8bit+FRC". Is this "PC-set" 10 bit color space only applicable for HDR? Is it applicable for SDR as well? I want to know conclusively, since I do not want to create any bandwidth/color conflicts between the game, Windows, and my monitor, in order to remove any banding/dithering. I do not use HDR on my PC since it does not look good on my gaming monitor (yes, my monitor is capable of HDR, but only outputs 400 nit peak brightness). Any applicable information would be appreciated.
I'm using a Series S with a 1440p 120 hz panel, what bit depth would you say is good?
I have an Insignia Fire TV, which bit depth and standard should I put it on?
How did you get the blue coloured borders around everything?
Accessibility > High Contrast
@@MyGadgetsWorld thanks boss
@@speedtab215 welcome. Have a great weekend
Thanks for everything. Can I put Color Space RGB (PC) on an LG C2? What happens if I do this?
What are best settings if using ps5 or Xbox one x or new Xbox if using LG OLED65C8pua tv?
Just theoretically, futuristically speaking, how many Gbps would be required for 4K 120 Hz RGB 12bit 4:4:4 if TVs are going to be 12 bit at some point? If I get it right lol.
At least 40gbps.
No, 48 is needed.
@@tomekkrr7806 and that's if you find, in the near future, a consumer source that will put out those specs (except for PCs)- The Series X and PS5 seemed to be capped at a 40Gbps output maximum at this point. Even with PCs, I'd be curious if you would see the difference.
48.11 Gbps
So, not possible with HDMI 2.1.
64.15 Gbps at 16bit.
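Those figures match the usual back-of-the-envelope math, assuming the standard CTA 4K120 timing (4400 x 2250 total pixels including blanking, a 1188 MHz pixel clock) and HDMI 2.1's 16b/18b FRL encoding overhead; a quick sketch:

```python
# Rough 4K120 RGB/4:4:4 bandwidth check over HDMI 2.1 FRL (illustrative only).
PIXELS_PER_SECOND_4K120 = 4400 * 2250 * 120  # standard CTA timing incl. blanking
FRL_OVERHEAD = 18 / 16                       # 16b/18b encoding on HDMI 2.1

def frl_gbps_4k120_444(bits_per_component):
    return PIXELS_PER_SECOND_4K120 * 3 * bits_per_component * FRL_OVERHEAD / 1e9

for bits in (8, 10, 12, 16):
    print(bits, frl_gbps_4k120_444(bits))
# 8  -> ~32.1 Gbps (roughly the PS5's cap)
# 10 -> ~40.1 Gbps (roughly the Xbox's cap)
# 12 -> ~48.1 Gbps (beyond the consoles, and right at HDMI 2.1's 48 Gbps link)
# 16 -> ~64.2 Gbps
```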
Quick question. Why is it that when I have my Series X on my LG CX set to 60 hz it says YCbCr, and when I have it set to 120hz it says RGB 8 or RGB 10, but when I have my PS5 on, the LG CX always says RGB 12?? Does this mean the PS5 is using better colors, thus giving a superior image? 🤔 Does this mean the Series X cannot do RGB 12?
How do I get that "freesync information" on screen with oled C9?
Yeah good question!
Green button with 2 white dots 3 times fast
@@Err0neus on c9 ? This only works on x versions
@@Err0neus thanks, that's great to know
@@Err0neus that is only on CX.
You know why RE village shows as 8bit on the series X/CX?
What happens to the color bit depth when watching Netflix? Does it stay on 8 bit or switch to 10 bit automatically?
I'm fine, but I wanted to know if setting 8 bit or 10 bit is the same thing?? Because I would like to leave it at 8 bit.
Some like 8 bit and some like 10bit. Everyone agrees either will work but what looks best depends on how your TV processes native content (10 bit) or upscaled content (8 bit). Choose which one you think looks best and you should be good to go!
10bit is the best option, best colors and deep blacks.
Amazing content, nice and unbiased. Thanks, get a like out of me.
How can I pull up the FreeSync info on the top right side?
I have an Xbox X and I have bought some 4k films from the movie page, but when I come to play them I get a message that my TV does not support the UltraHD format... I have changed every setting: 8 bit, 10 bit, 12 bit, gone to HDMI mode, you name it I have done it. I have watched many YouTube videos but still get the message... My PS5 plays 4k discs perfectly, so it's not the TV (an LG bought last year). Please can you give advice... Thank you!
I’m getting some random vertical pixelated lines on my picture. What could be the cause of this?
Me too on my C9... it's very subtle, but I could see it in Dirt 5 when black bars were shown; I could see vertical pixelated lines, never at the same spot. Super weird. I tried a lot of things such as going back to 60hz, removing VRR, etc, and the pixelated lines would still be visible in that specific situation. In normal gaming, I can't see anything. I know it's not burn-in either.
I think it has to do with a difference between the fps and the refresh rate of the TV.
Try switching your input label from XBOX to 🖥 PC
That should fix it
Tv model?
@@ELECTROFELO1 input label is on pc
The XSX dashboard font and settings icon is blurry when you select it. Is anyone else experiencing this on their LG CX?
Great video! Thank you!
Thanks for watching!
I have the Samsung 65-inch curved television, the 6-series. Should I keep it at 24-bit (8-bit), cuz I've got it at 36-bit (12-bit) right now. I'm on the Xbox Series S.
I wanna know too
My Samsung is able to sustain 12 bit and PC RGB, does that mean my wifi will be less powerful compared to when I had it at 8bit?
My C9 supports 12 bits, I'm also using it with PC RGB.
I just added that the C9 is a 12bit 48 Gbps panel too lol.
The C9 does accept a 12 bit video signal but it is only a 10 bit panel; it will downsample the signal to 10 bit to display it. If you are also using PC RGB with 4:4:4 chroma output and 120Hz, your Xbox or PS5 is reducing the chroma output to 4:2:2 or 4:2:0 (lowering picture quality) because they cap total output at 40 Gbps (Xbox) or 32 Gbps (PS5), not the C9's total max input of 48 Gbps. Assuming you are using the new consoles and not a high end PC rig, you may want to lower the bit depth on the console to 10 bits.
@@MrKenn1230 Maybe, it's kinda true tbh, but since the C9 has 48 Gbps it runs smoother and isn't at full capacity or overloading the system of the TV. If I change it to 10 bits it makes the image not pitch black but kinda grey. I have tested all the possibilities in the Series X/C9 configuration, and 12 bits certainly gives the best blacks you can get.
@@MrKenn1230 no, it is not 12 or 10 bit. It is WOLED, so it can as well be 10 * 4 = 40 bit. Also for HDR 12 bit is just better black and white gradation.
@@lolerie In my example I was speaking about the panel's native color depth. Most TVs today that support HDR are 10 bit panels, meaning they can reproduce 1024 shades of each of the primary colors (red, green, blue). There are other factors in HDMI video and audio bandwidth like you mentioned, but the ultimate limiting factor in the number of colors the TV can display is the native bit count of the panel.
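The shade counts in that explanation follow directly from the bit depth; a quick check:

```python
# Shades per color channel and total colors for a given panel bit depth.
for bits in (8, 10, 12):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades} shades/channel, {shades ** 3:,} total colors")
# 8-bit:  256 shades/channel,  ~16.8 million colors
# 10-bit: 1024 shades/channel, ~1.07 billion colors
# 12-bit: 4096 shades/channel, ~68.7 billion colors
```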
What about 1440p@120 12 bit gaming?
Yes, it does work.
How do you do this?
Can we get an update on this for the new LG C1 2021 model?
Passing through a new Denon AVR 4700 with a 12 bit Samsung Q7F. I'm getting blackouts when switching sources.