Keep in mind that when you change color temperature it will take a few minutes for your eyes to adjust to the new white point. So don't dismiss the warmer color temperature immediately.
What I like doing is watching for a little while at settings slightly more exaggerated than the recommended ones. Ex: after about 5 mins of watching TV at Warm 3, I switch to Warm 2/1 and the accuracy becomes completely satisfactory!
Nope. On my previous Samsung TV I had it on Warm for 2 years, and then I visited my neighbour, saw his beautiful NORMAL color temperature, and instantly had to revert the option. I NEVER got used to the piss filter. Also, why do you all keep comparing Warm to Cool and saying Cool looks too blueish? Why not compare to the Normal temperature setting? I don't like either Cool or Warm; Normal always looks the best and most realistic.
True, can't stand either the blue or yellow tint in the before and after. White should be white. If you take a picture of something, the TV and the thing should look the same.
@@adriantrinidad1296 "On" if you have a Samsung. Just use whatever setting makes the picture look smaller, because that means your TV isn't zooming in and cutting off the borders.
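For a sense of scale: overscan typically crops a few percent from each edge and rescales the rest to fill the screen. A quick sketch of how many source pixels a hypothetical 5% overscan throws away on a 1080p signal (the 5% crop factor is a common ballpark for illustration, not a spec):

```python
# Estimate how much of a 1080p image a 5% overscan discards.
# The 5% crop factor is an illustrative assumption, not a standard value.
width, height = 1920, 1080
crop = 0.05  # fraction of each dimension lost to overscan

visible_w = int(width * (1 - crop))   # 1824
visible_h = int(height * (1 - crop))  # 1026

total_pixels = width * height
visible_pixels = visible_w * visible_h
lost_fraction = 1 - visible_pixels / total_pixels

print(f"Visible region: {visible_w}x{visible_h}")
print(f"Source pixels discarded: {lost_fraction * 100:.2f}%")  # about 9.75%
```

In other words, even a modest-sounding crop silently discards close to a tenth of the picture and then softens the rest by rescaling it back up.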
Summary:
1. Don't crank up the sharpness setting (keep it neutral). 02:15
2. Turn off noise reduction.
3. Ensure zero overscan. 04:36
4. Turn off motion smoothing.
5. Color temperature: use Warm (target the D65 white point).
TL;DR: Filmmaker Mode.
5:11 I feel like this is due to your association of movies typically being 24 frames per second, while handheld cameras typically shoot at 60 fps (modern phones, anyway). If you get rid of that built-in association in your head, it's just smoother motion.
You are honestly a saving grace for TV buyers. I'm glad you're busting myths and educating people on the REAL technology, not pseudo marketing nonsense.
@@arkham_miami I personally don't put the sharpness all the way down; I put it at 10/100, since sometimes a little bit of sharpening helps crisp up the image.
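For context on what that slider actually does: TV "sharpness" is essentially unsharp masking, i.e. adding back a scaled copy of the difference between the image and a blurred version of it. A toy 1-D sketch (the box blur and the 0.2/2.0 amounts are made-up stand-ins for the TV's internals) showing why a high setting produces halo artifacts at edges while a low one barely overshoots:

```python
# A hard edge: dark on the left, bright on the right.
signal = [0, 0, 0, 255, 255, 255]

# Crude 3-tap box blur with edge padding (stand-in for the TV's low-pass).
padded = [signal[0]] + signal + [signal[-1]]
blurred = [(padded[i] + padded[i + 1] + padded[i + 2]) / 3
           for i in range(len(signal))]

# Unsharp mask: original plus a scaled high-frequency "detail" component.
detail = [s - b for s, b in zip(signal, blurred)]
gentle = [s + 0.2 * d for s, d in zip(signal, detail)]   # low sharpness
cranked = [s + 2.0 * d for s, d in zip(signal, detail)]  # high sharpness

print(cranked)  # [0.0, 0.0, -170.0, 425.0, 255.0, 255.0]
# Values outside 0..255 get clipped, which shows up on screen as the
# bright/dark outlines ("halos") around edges when sharpness is cranked.
```

The overshoot scales with the amount, which is why a small touch of sharpening can look fine while a high setting draws visible rings around everything.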
Well, it would be except that he is objectively wrong on a couple of counts. Motion smoothing can do nasty things with certain shots, but it's better than the "purist" view of watching what feels like stop motion. Similarly, film grain isn't "artistic", it's still just unnecessary noise. In some of these cases, he's not busting myths, he's spreading them.
@@bevanfindlay 24fps doesn't feel like stop motion. It's been filmed at that rate for years and the only reason why you feel its unbearable is because you think "more is better". Film grain may not be to your artistic tastes (and thats fine), but though it is 'noisier', the image is actually clearer because denoising generally makes an image less detailed.
@@normietwiceremoved No, that frame rate was chosen as being the bare minimum that usually looks passable for most scenes. Try playing a first person game at 24 frames and at 60 and tell me which you prefer and if you can tell the difference (spoiler: we can; vision alone can pick up changes to around 90 Hz under many circumstances, higher in certain unusual ones). Pro gamers can perform differently at as much as 200 fps, though some of that has to do with more than just vision (input lag etc). Noise is the same - even a basic denoise algorithm isn't going to affect detail much, and film grain is still just garbage optical noise - it's the "purists" who are ignoring visual science. It's people saying they prefer something broken over something better. The Hobbit movies were the first time I didn't feel like a movie was horrible and jittery (pity that they made some other mistakes that some blame on the frame rate when they're unrelated).
You mean like, my "HDMI 2.1 port is ready for your high bandwidth cable, master... Oh yes, I'm fully certified with ALL specifications... Plug it in!"?
Vincent, all I have to say is THANK YOU. I recently cut the cable cord and am currently going with Hulu + Live TV. When I did, I experienced all kinds of picture quality issues, mainly motion-related, on my Samsung QN90B, to the point where I was literally about to pull the trigger on another new TV. Your guidance has made a drastic improvement. I'm still seeing some minor imperfections, which I can live with, but thanks to you I won't be shelling out big bucks for a new TV.
OMG, this is by far one of the most enlightening videos ever. For the life of me, I could never understand why my UHD OLED TV wouldn't show a high-quality image at its full quality, or what was wrong. As you said, I found all those settings turned on in my JVC TV, switched them off, and immediately got the quality I had always been yearning for.
You have answered, with freeze frames and video, the questions I had for years. I bought a 4K TV a few years ago, and it never looked right. The LotR and Mission: Impossible comparisons were exactly what I had been seeing: too smooth, too glossy, like the movies were rendered instead of filmed. A massive thank you, good sir!
@@theswampus670 I hate that so many people have ended up making that mistake. I remember back when Blu-Ray and HDTVs were first becoming a big thing, and motion smoothing was coming out around the same time. I saw a lot of people saying things like "HD/Blu-Ray makes it look too real, like I'm watching behind the scenes footage or something." I remember I saw a comment once where someone said, "I saw them playing the new Die Hard movie on Blu-ray in the store, and it looked like I was watching Bruce Willis film the movie, instead of watching the movie. I prefer DVD, where it looks like a movie." I had to tell him that he was probably watching a tv with motion smoothing turned on, and that Blu-ray actually looks more filmic with the right settings. It is very frustrating that so many people don't notice it, or if they do, don't know it can be turned off. I'd say about half of my friends and family, when I go over to their house, I ask if we can turn the motion smoothing off, and they're like "What's that? I don't know how." and then I turn it off and they're like, "Oh wow, you got rid of the soap opera effect!"
I have an older DVD box set of the Star Wars 4, 5, 6 remasters. In the extras menu they have a whole thing for setting up your TV: a bunch of test screens and instructions to get just the right sharpness, contrast, brightness, color balance, etc. After using it, all my movies look a lot better than they did with the default settings.
As a hobby photographer I agree with the problems caused by too much sharpness and noise reduction. It’s exactly the same when you are editing photos. This also applies to white points and picture temperatures.
Sharpened images always look better, have more accurate lighting, better contrast and 10x more detail. I think 88% of the population are just visually impaired without even knowing it.
@@godzilla2k26 I live in a clean city, and the snow doesn't look bluish like it does with the Cool and Standard temperatures on the television. Same with clouds. Sunlight has a natural yellow tint.
Fully agree with all 5 points. But I have to admit it took me two decades to change the color temperature to Warm 50. Strangely, I always found the neutral/cooler temperatures more realistic. But once I got used to it, there was no way back.
Always felt like there was a yellow smear. It’s hard for my eyes to accept something like clouds or snow with a tinge of yellow. But this would explain why skin tones can sometimes go very pink or magenta. I will readjust my settings and try it out for a while like you recommended.
@@bigmoviefreak Skin is pretty pink in reality. Shifting some green and amber into the white balance makes white people look more pleasingly beige, but everyone else looks more green.
The human eye white-balances automatically. Ambient light (usually around 3200 K, i.e. yellowish) and the background behind your screen can shift your perception of colors. For this reason I would recommend watching movies in a dark room.
As a Canadian (who also aggressively uses f.lux), I must push back on the “warm” temp setting for LOTR. Above a certain latitude, the UV rays of the sun “glance off” the atmosphere and no longer reach the surface or most of the sky. This means that being outside on even the sunniest days brings no warmth from the sun, and that everything is much bluer. To your credit, I think LOTR got this wrong, but to my Canadian eyes the one on the left looks far more accurate.
Yes, they also made it warmer on film, so you're right, but so is he in a way. The truth is that's just how it was filmed; I hate the way directors tweak colors like that, there's no point. Also, Linus Tech Tips said not to blindly trust Filmmaker Mode, because it can vary from TV to TV; the only sure way is to calibrate it yourself via PC. If you think you can, look up the Datacolor SpyderX and use Calman; it really fixed my TV. Yeah, it came out warm, but I pushed it back: my LG was set to Warm 1, and I moved it to Medium.
I'm not even Canadian. I grew up in Ohio U.S. and the snow definitely strikes me as being white. Ffs what happened to the phrase "don't eat the yellow snow"?
@@josiahferrell5022 FOR THE LOVE OF GOD!!! The fucking movies are COLOR GRADED!!!! If you want to see the fucking snow in its natural color, look out the fucking window. The movies are color graded to the desires of the director and colorist. WTF?!!
Great tips in this video! This advice was generally true for old CRT TVs, and is especially true with today's digital LCD TVs. Try turning off some of those extra noise or blur filters to see whether you prefer the look of the picture without them, and turn down the picture adjustment settings a bit. Too much extra contrast, brightness, and color saturation is not a good thing; you lose a lot of subtle detail when they are fully turned up. And keep your color temp/tint/hue settings mostly in the middle so the image isn't unnaturally cool (green or pale blue) or overly warm (orange-red). You will know it is set right when your primary colors appear correct (REDs that are deep and bright without looking orangeish or maroonish, and GREENs/yellows and BLUEs that appear natural and vivid without looking muted or like other colors). Also keep the sharpness set low, or barely on if used at all, because too much creates exaggerated, noisy edges. Keeping the picture settings lower not only produces a more natural and detailed picture, it improves overall performance and extends the useful life of your TV.
This is one video where I totally agree with everything you've said. Do you have any idea why tv manufacturers continue to keep pushing these film-destroying options as default?
It's because some idiots at these companies think slapping on unneeded filters somehow gives better picture quality, when it actually hurts it. The average consumer is too dumb to realize.
The reason I've heard for most of these settings is that they enhance live sports viewing, which is what the majority of TVs are actually used for. Hence them being the default.
The filmmaker setting tends to not take into consideration room lighting. It's great if the room is completely black, but with windows the setting is too dark and sometimes too washed out.
💯 I watched a video where some guy said to set my brightness at 8... the picture looked like it was nighttime the entire movie, and on top of that I couldn't see what was going on whenever a scene was set at night.
@@rickycardenas5154 Set it for color accuracy, then just go through a few scenes and adjust the brightness until you're happy with it. I find the Star Wars prequels to be very dark in Dolby Vision mode, but when I tested a dark scene while paused, turning up the brightness didn't reveal any more detail; it just raised the shadows from black to grey.
I own a lesser-known brand of TV that has a faux filmmaker mode. At first I thought it was too dark because it wasn't true Filmmaker Mode, but your comments here now confirm that it's the same for the legit Filmmaker Mode too.
Absolutely useful information. I applied all the picture settings recommended by Vincent Teoh except motion smoothing. Just can't stand the judder. Great job, Vincent. Thank you.
Got a new Samsung TV today, found this video by accident, and changed the TV to Filmmaker Mode. I used the same scene from Lord of the Rings for comparison and was blown away by how much the image improved. Thank you very much!
This was no bs, straight to the point, informative stuff. Also anybody who prefers motion blur is wrong. Edit: to clarify I was referring to the setting, which is actually called "motion smoothing" on some TVs.
It pisses me off beyond comprehension that HD TVs come with that setting on. If I go to a friend's house and they have it on, I will always insist on changing it. I don't get how anyone is able to watch shit with that awful, awful f@cking setting on.
I liked it because it made things look kind of 3D to me. I think the first time I saw it was when I was watching a demo for one of the Transformers movies. Hyper-realism does have its place with some content. But the "some" is the key word here. If I was watching 12 Angry Men with motion smoothing on it would just be annoying as shit.
I'd never had a motion smoothing TV until I bought a Samsung on Black Friday. When I watched my first movie I thought, something feels really off. So I sat through the movie, then immediately messed with the settings to see if I could fix it. Sure enough I could, and I turned it off. If I were watching YouTube then maybe it would be fine, but it's a hard no when it comes to movies.
I’m guessing you mean motion smoothing? Motion blur is what filming at 24 fps naturally produces (the so called “cinematic” feel) Personally, I hate artificial motion smoothing, but if it’s content natively intended to be high framerate, I prefer that
Switched up my TV last night and wow. I mostly game on it, and at first it did feel yellowish, as some mentioned. However, once my eyes adjusted it was crazy good. So much more tone and depth. Made the space scenery really pop, with the blacks contrasting against the colors. Can't ever go back. Thanks for the video.
This is so misleading. Warm 2 (50 or whatever) is there to mimic film stock, which has a red/orange cast to it; just look at a film strip for yourself. The D65 target doesn't look anything like the image on the right.
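A note on what D65 actually is: it's a chromaticity target, not a tint laid on top of the image. In the sRGB system, D65 is by definition the white that maps to equal R=G=B, so a display hitting D65 renders full white as neutral, and any warmer white skews red above blue. A small sketch using the standard XYZ-to-linear-sRGB matrix (D50 here is just a stand-in for "a warmer white point"):

```python
def xy_to_linear_srgb(x, y):
    """Convert a CIE xy chromaticity (at luminance Y=1) to linear sRGB."""
    X, Y, Z = x / y, 1.0, (1 - x - y) / y
    # Standard XYZ -> linear sRGB matrix (D65-referenced).
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z
    return r, g, b

d65 = xy_to_linear_srgb(0.3127, 0.3290)  # ~ (1.0, 1.0, 1.0): neutral white
d50 = xy_to_linear_srgb(0.3457, 0.3585)  # warmer: red clearly above blue

print(d65)
print(d50)
```

So whether a calibrated picture "looks yellow" is really about what your eyes are adapted to, not about the target itself being tinted.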
On my LG OLED TV I agree with everything except colour temp. I've found warm1 to be the best compromise, warm2 is pretty good but warm3 looks like a urine filter over the image
Thank you for this, Vincent. I turned off all this garbage and I’m surprised how much more I’m enjoying my 900H. I thought I loved the artificial enhancements, especially the Live Color, but I realized it was making all reds, for example, look equally vibrant, thus severely limiting the range of color in a movie. It also hides some details with Live Color on. I’m honestly shocked, and I’m finally using the Custom mode and DolbyVision modes where possible.
I agree with most of this except for 2 things. 1: just because there's a way the director wanted me to see the movie doesn't mean I care what he wants; I can watch it however I like. This relates mostly to the colour temp. 2: smooth motion is actually more immersive to me once I got used to it; now I wouldn't want to go back. The argument for movies being in 24 fps is just as dumb as games targeting 30 fps instead of 60. The 24 fps came from technological limitation and compromise, and the arguments for it are just attempts to rationalize a personal preference, which stems only from what you're used to. Except for the artifacts, which can happen sometimes, that's true, but I can live with it; it's not the end of the world.
On the C1, if you completely disable the motion settings, most movies and TV series look like a blurry, juddering mess. You need it at least on its lowest setting, which is Cinema.
@@MILADINY0 It's the same with Game Optimizer mode. It's great as long as the game is 60 fps, but there are no motion settings in that mode, so as soon as you play a 30 fps game it's hard to even look at.
In the late 90s/early 00s, I was a subscriber to a magazine called Sound & Vision. I loved their articles about this. I've been turning off my NR, and in the 1080p days I only introduced minimal sharpness, with other things turned off or kept at minimal settings. I love this channel; Vincent is awesome, extremely funny, entertaining, and very informative.
I recently joined LG C1 family and this channel has been a boon of information for someone who hasn’t dealt with TVs since 90s. Thank you for your knowledge and your work into making these videos, Mr Vincent. Best of luck to you!
Very nice and humourous video. Recently got a Samsung S95C and have been scouring Reddit and YouTube for Filmmaker Mode settings that work for me. The problem is that Warm 2 (and even Warm 1) is like watching a movie wearing sunglasses: too dull, too dark, and I can't see detail. I reckon Standard or Cool is better for me.
I can agree with everything you said except one thing. Smoothness (soap opera effect) is a must, especially on bigger screens. I don't know how you can say turn it off with a straight face. Maybe some people's eyes are different from others'; for me, when I turn it off it gives me headaches and I feel like I'm watching a very cheap screen.
I have an older Sony Bravia TV. Reading the manual, I thought that setting all options to max would be the best choice, but I had problems with the image and it didn't look right. I decided to give it some time, as I was changing from an old tube TV, but I never got used to it. So one day I decided to disable sharpness, noise reduction, and motion smoothing, set the display area to Full Pixel, and set the color temperature to Neutral. The image improved a lot and it looked like a new TV! It seems I made the right choices without knowing!
7:22 As a Swiss who has skied all my life, cold is more realistic to my eye. 😂 Yellowish snow is something you learn quickly to avoid first years of your childhood. 😂😂
Seriously, thank you! I feel seen! Like, the idea that warm is "natural"? No, something's wrong with your eyes, dude; go to a doctor. Snow, even here in the middle of the US, is as white as white can be… Some of these settings are just too extra, and gross elitism.
@@goranm9430 @Satan-777 Hehe, thank you for your comment. I was also feeling a little alone on this one. But honestly, snow is complicated, because it really tricks electronic calibration sensors/algorithms (especially cameras) doing white balancing, which is why cameras usually have a "snow scene" mode.
You had great points in this video, but I have to disagree on SOE, I love how films and tv series look with it turned on. I also like the sharpness turned up half way and noise reduction turned on. I absolutely can not get used to Warm setting turned on I leave everything on cool setting.
When I watch movies on Blu-ray and 4K UHD disc, I can handle film grain, because that's part of the cinematic look the filmmakers are going for, which comes from shooting on film stock. What I can't handle is digital noise, where the image has something like mosquito noise; that's not part of the cinematic look, and it shows up on camera when the ISO is too high.
@@90lancaster That's why I prefer physical media, to get the best and highest picture quality, but sometimes digital noise is apparent in some modern films, which can look quite ugly because of the mosquito-noisy picture it brings.
@@BeginsWithTheEnd That's more to do with video compression. Sharpening won't bring out detail: if the image is muddy to begin with, sharpening can't recover anything more. Resolution is what really gives you detail, which is why you don't sharpen 1080p or 4K content on disc. Streaming platforms compress their movies and shows even at the higher resolutions.
I would argue that motion smoothing feels worse because of the nature of interpolation, not inherently because of the smoothness itself. Like you mentioned, the interpolation creates artificial frames that can lead to some really odd visuals. In my opinion, that's what gives it the cheap smartphone-camera feel. I think smoothness in and of itself (i.e. frame rate plus display refresh rate) has its own strengths, though it definitely isn't preferable to 24 frames in cinema. Video at 60 fps is very good for conveying objective information, for example.
You pose a valid point. Personally, I find it difficult to watch 24 fps content on a monitor (projection in a cinema is fine), so I use SVP on my PC to play everything back at 60/72 fps with frame interpolation. I definitely notice the artifacts, but to me they are the lesser evil compared to the headache-inducing stutter inherent to a frame rate as low as 24, especially in panning scenes. The ideal scenario is films like The Hobbit or Gemini Man that are actually filmed at a higher frame rate, so the benefits are there without the artifacting. One would assume it's possible to film at, say, 48 fps and then play back at 24 for those who prefer it, though I have no idea what extra cost this would add to production. I agree with all the other settings mentioned in this video, though; I prefer to keep the image as close to the original as possible. The noise reduction one is a massive pet peeve of mine, as it gives this dreamlike ghosting quality.
I can totally agree that artificial smoothing creates problems. However, I hate that cinema is still 24 fps. In fast fight scenes it's impossible to actually see the choreography (it's just blurred limbs), and when I see a stuttery or blurry camera pan I often get slightly dizzy. I think people only view smooth frame rates as "cheap" because they're so used to only cheaper TV productions using them. Doubling or quadrupling the frame rate would be a much more meaningful improvement to image quality (and require fewer resources) than bumping the resolution from 4K to 8K, imo. It just doesn't make any sense to me that a smoother frame rate would make a camera-filmed movie look less realistic or immersive; we don't see the real world in 24 fps either! I understand that for animated movies the timing of the animation matters more than the frame rate, and that a higher frame rate might make poor animation look less realistic (though I don't know of any PC game with this problem), but with camera-filmed movies, the only small downsides I can see are that a higher frame rate doesn't mask bad fight choreography and that CGI becomes a bit harder to make look realistic, because more frames need to be rendered. Both of those are problems we should expect the movie industry to overcome easily.
@@LRM12o8 The best solution, well known and successfully used in the animation industry for many decades (and recently, famously, in Spider-Man: Into the Spider-Verse), is hybrid frame rates. For live action we would need camera motion at 48 fps or more (for perfect panning without judder) and leave most other things at 24 fps (but not necessarily everything) to keep the dreamy look that allows the brain to fill in the gaps in a pleasing way. However, this is technically non-trivial to achieve with live action and requires costly post-processing work. I think Cameron even talked about this concept years ago when discussing Avatar 2.
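On why cheap interpolation artifacts look the way they do: the crudest possible interpolator just averages the two neighbouring frames instead of estimating motion, so a moving object turns into two half-brightness ghosts rather than one full-brightness object halfway along its path. A toy sketch of that failure mode (real TVs use motion-vector search, not plain blending, but they fall back to something like this when the motion search fails):

```python
# One row of pixels; a bright 3-pixel "object" moves left to right.
frame_a = [1.0 if 0 <= i < 3 else 0.0 for i in range(10)]  # object at 0..2
frame_b = [1.0 if 5 <= i < 8 else 0.0 for i in range(10)]  # object at 5..7

# What a correct in-between frame should look like: the object at roughly
# the halfway position, still at full brightness.
true_mid = [1.0 if 2 <= i < 5 else 0.0 for i in range(10)]

# Naive interpolation: blend the two frames instead of tracking the motion.
blend_mid = [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

print(blend_mid)  # [0.5, 0.5, 0.5, 0.0, 0.0, 0.5, 0.5, 0.5, 0.0, 0.0]
# Two ghost copies at half brightness, and nothing at the true position.
```

Motion-compensated interpolators avoid this for simple pans, but around occlusions and fast, complex motion they still produce smearing and warping of this general kind, which is where the "cheap" look comes from.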
X-Motion Clarity is what I use on my Sony. (Smooth 2, Clearness 1), the Black Frame insertion properly emulates the motion you see in a theatre without making the image look like a stuttery mess like it usually does on modern panels @24p. Sony does it right.
I use black frame insertion on my C1 as well. I dial in the de-judder and the motion smoothing separately, or use a preset. I use 8 and 8 on each, out of 10. A definite sweet spot for my eyes. I need to get some hands-on time with a newer Sony to compare.
I personally prefer smoother videos/movies; it is definitely a contentious topic. Newer movies are shot on digital cameras anyway, so the genuine-film argument falls flat. (Maybe for some old classics you should disable it, but for me 60 fps is the way to go.) Why would a 60 fps camera look cheaper compared to a 24 fps one?
Thanks for the tips. I turned on most settings when I got my TV because they sounded like they gave better picture quality, without knowing their actual effect. Not sure I agree with the warm setting though. But not everyone's eyes, TVs, or even films are the same. I will definitely be doing some tweaks to my settings to test how I like things.
Vincent, even when I disagree with your recommendations (rare, but we are all human, and motion smoothing is niiiice), I can’t help but be impressed by your logic - and your humor. Keep up the excellent work.
Thanks for the overscan tip; I can't believe both of my LG TVs had it on. I disabled the one on my Sony too. Also, I leave motion smoothing on, very low; otherwise I'll be a puking mess. Sorry Vincent and Tom. Thanks as always for your effort and time.
Film at 24p was implemented because of the technological limitations of the time, not because of the cinematic look you think it gives; it's only called cinematic because we got used to it. Early film was expensive and not very sensitive: you needed a lot of light entering the camera to get a good picture. You can mitigate both of those problems by running at a low frame rate.
Not sure if he covered this, but on my Samsung TV I have an option for shadow detail, which was set at 2. I turned it down to -4, which made dark scenes look great. Before, dark scenes were grainy and had whitish shadows.
I agree 100% when it comes to motion smoothing. I both play games and watch movies, and I cannot stand even slight motion smoothing while watching movies. Glad to see there is an option to turn it off. All companies should do proper calibration on their TVs instead of piling on gimmick features just for appeal; customers would eventually come to their senses.
@Swami What is 'proper' calibration? Proper calibration is whatever is most appealing to the viewer, regardless of what anybody else says. Smoothing on looks much better to me when watching UHD than turning it off; same with a good level of sharpness. Got mine professionally calibrated by one of our leading companies… it looked absolutely shit. Same with those "best settings". Locally they even stopped offering calibration services, as the viewer's individual taste usually conflicted massively with the "best" calibration. Best is what the viewer enjoys most.
@@GanymedeXD smoothing on feels like fast forward to me. Like you said, every individual has their taste. If you see RTINGS, they have pre-calibration settings review and post calibration settings review points.
I disagree, any game player worth a damn running a game at 60fps or 120+ fps would see 24 frames per second as an abomination. And the smoothness and buttery motion nature does not take one out of the game. You are just used to that trash gutter tier 24p in movies because it's how most things were shot, there is nothing special about it. All the arguments are nothing but cope. ps, if you like film grain, that is trash too, but it's on the creative side there.
A large number of people, especially males, are color blind to varying degrees. "Color Temperature" (and individual "hue" or "red/green") settings will thus always be a largely personal preference...no matter what the director wanted or the "purity" of one's aesthetics.
Most people don't realize it, but you're doing god's work. You're literally the only channel on YouTube that goes so in-depth and gives actually useful info.
Overscan and noise reduction do still have their places, but only on the analog input. I found them quite useful for improving picture quality with my retro consoles, before I got a proper analog-to-digital converter that did a better job than the TV.
@@johnrussell3961 Yeah, but my games are digital, and going from digital to analog to digital is pretty hard without degradation in quality, at least if you're a TV manufacturer trying to bring a 4K TV to market for less than 300 bucks.
@RAHelllord Your eyes are analog, as are your ears. We evolved in an analog world. We are not happy if digital media doesn't stimulate us the same way analog does. Few are going to suffer just because they are told digital is perfect; they will use their own eyes and ears to judge.
@@johnrussell3961 Yes, but a subpar conversion along the chain can still introduce noise, due to cheap components and bad algorithms. My RetroTINK 5X-Pro is capable of doing that conversion without a loss in quality before the signal gets displayed on the TV. Which was my point: it can have a purpose in dealing with noise introduced by cheap hardware.
This is a lie that the TV manufacturers have to keep going because of all the horrible things they started doing at the dawn of HDTVs. Flat panels were crap and expensive back then, so they had to cheat to make them seem decent. Everything was way too cool to make them seem brighter, and they turned up the contrast and saturation to make it pop. Unfortunately, they conditioned the public into believing that this is what looks good.
I don't know about the colours. As someone who lives in the middle of snowy mountains I find the colder image more realistic. The problem is that if the film maker has applied a weird colour filter on the image, whitening correctly the snow will crush the whites of the whole picture. It's tricky.
@@postboy2242 Snow actually does have a slightly blue hue, especially when you see light streaming through it from above. If you've ever built a snow cave on a sunny day, you'll know that it's certainly not clinically white.
Wow, this one video has saved me so much aggravation. I bought a Sony 65" X85L three days ago to replace a 2019 Sony 65" and was horrified with the picture quality. I had my old 2019 model dialled in perfectly but could not get the X85L picture to look anywhere close to as good as my old tv, 4k was fine but HD looked over saturated, skin tones looked too smooth and the whites had a blue tinge to them, added to this SD content looked appalling. It was so bad I had planned to return the tv. I followed these settings and was just amazed, the image quality is now fantastic, I'm honestly shocked at the difference. Thank you!!!
I really learned a lot from you, so I definitely subscribed! Not only do you impart your information clearly, but you are so funny! You make learning fun! The only downside I've found after the 2 videos I've seen so far is that I'm going to have to spend a LOT of time watching the rest!
The trick with a lot of these things is that you don't really ever get to do these side-by-side comparisons, and they often reveal the trade-offs you make. For example, the 'snow' example used to show the 'correct' setting for color temperature left me conflicted. Unlike what Vincent said, I felt the snow looked more realistic in the Cool setting, but skin looks more realistic in the Warm setting. Would I have noticed the slightly yellow snow in Warm had I not seen the difference side by side? Would my eyes adapt over time?
you're just used to watching things in cool, or the monitor you're watching on is too warm. Objectively, the picture colour is more accurate and eye strain is prevented when using the correct temperature settings.
I completely agree with you that snow (among other things, also metals and some shiny materials) does(/do) not look right on Warm settings. I don’t know if this is different in different places, i.e. at different latitudes; but snow up here close to the Arctic Circle definitely never looks orange-y.
@@SwedenTheHedgehog are we watching the same video? everything's too blue and sterile on "cool" settings. on warm, the snow is still bright white but the colours are deeper and more natural
@@oxstorm644 If you by "bright white" mean "sun bleached-looking yellow", then sure. There is nothing natural about the color of the snow nor the skin colors (at 7:52 for example), with the "Warm" setting.
@@SwedenTheHedgehog Except people forget this is a movie production, not 100% nature footage. Given the color grading of this movie, the skin tones and everything else look natural in context with the Warm setting.
I personally use motion smoothing on my LG OLED but only set to 1, the lowest setting. I'm one of these people who is very susceptible to motion, I find the rainbow effect on DLP projectors very annoying when none of my friends notice it at all, so watching a digitally shot movie at native 24p can sometimes give me a headache. "1917" is probably the worst offender, it felt like watching a slideshow. The judder with no smoothing is horrible, and so is smoothing set too high. I find smoothing on the lowest setting to be the lesser of two evils.
@@PotentChr0nic I have my sharpness at 20 on my TCL Roku TVs. I have the temperature setting on Warm, which I just recently changed from Normal. Warm 50 must be a default setting. Adjust it to suit your taste; that's all I can think of.
I've noticed that "motion smoothing" tends to highlight the CGI in movies, in a very bad way. It makes the CG look so out of place and almost "cheap" that it can be quite jarring. It appears like you can see the layering that is done or something.
While I agree with you that it makes CGI look bad, I personally absolutely love the smoothing effect. I'm a heavy PC gamer and started with frame interpolation long ago, so I can barely go to the cinema to watch a film anymore; I get a headache from the stuttering... I would love it if producers stopped with the 24 FPS and went up to at least double. I loved The Hobbit in HFR compared to the standard version!
Yup. There is nothing natural about 24fps. "Cinematic feel" is literally a holdover from 1929. Same thing with the D65 white point: it's warm as hell, set for old movie screens. Warm 50 is NOT true to life. Let me say it again: 1929...
Man, this video has been AN EDUCATION for me. Thank you so much!!! Have had a beautiful Samsung 4K TV for a few months now after having had a rather crappy HD Tv for 10 odd years. The upgrade has been great but I really struggled with the screen settings and could never get it quite right yet. This video is awesome and I love the FILMMAKER MODE. I'm never going back 😂 Thank you!!!!
Watching this with MEMC enabled and the Native colour space (instead of Auto) on my Samsung TV, Filmmaker mode seems too dull to me 😅 But Warm 2 does look natural, so yes, I changed it to Warm 2 and it looks much better.
Not necessarily accurate, but my settings on my QN90C 43" are based on Standard mode with these changes:
Brightness - 25
Contrast - 45
Sharpness - 0
Local Dimming - High
Contrast Enhancer - OFF
Colour temperature - Warm2
For HDR mode: do the same but keep Contrast and Brightness at 50.
It's much more "natural" than the out-of-the-box settings and still more vibrant than Filmmaker mode. Of course, I turned off overscan.
8:55 What if I told you that I have reset my LG CX Filmmaker profile to its defaults and it still has sharpening set to 10 in both HDR and SDR! Worse, in SDR the OLED light is set to 80, and in HDR, DTM is activated and peak brightness is set to High; that won't do justice to movies, at least in a dark room, as it will overbrighten the image. Kudos to people who can watch content without motion smoothing on OLED, because due to the nature of the technology, low-framerate movies look like crap when motion is involved, especially if you are sensitive to stutter. I'll take the soap opera effect over stutter any day, and no, it will not break the immersion for me; watching Gemini Man didn't either.
LG used to have Filmmaker Mode target 100 nits with the OLED light set much lower, but they changed it in a firmware update because people were complaining that it was "too dim", even though it was supposed to be the most accurate picture preset on the TV. It kinda defeats the point of FM mode. I do agree with you that low framerates on these TVs are unacceptable because of the sample-and-hold nature of modern displays. As much as I respect Vincent, I don't really agree with him on 24fps content being the "ideal" for cinema. Movies should have started filming at 48fps or higher a long time ago; the 24fps standard is archaic and was chosen for cost, not for artistic reasons. The funny thing is that old movies, when they were projected way back in the day using a single-blade shutter, used to look much smoother than after the switch to multi-blade shutters (done to minimize flicker), but "cinephiles" have convinced themselves that the blurry, smeared, "dreamy" look is what movies always looked like, when in reality that couldn't be further from the truth. In fact, it's blatant revisionist history. I still don't use motion smoothing on my TV, but I don't call people weird for using it and I can completely understand it. What needs to happen is that 24fps needs to die, but as long as the old guard is in place making these decisions, it won't happen, unfortunately.
@@vdentertaiment4088 Are you saying that switching to three-blade shutter makes the picture worse? What? It was introduced to combat the flicker - flashing each frame three times helps get over the flicker fusion threshold, the motion gets smoother.
@@RockinEnabled In the early years of cinema, 24fps movies were originally presented on projectors with single-blade shutters running at 24Hz. The image would have very visible flicker, but the motion was ultra smooth and clear, and there was no judder. Film projectors later adopted double/triple-bladed shutters to increase the refresh rate to 48/72 Hz. That cut the flicker down dramatically, but the motion became a blurry mess as a result, just like on modern sample-and-hold displays in your home. 24fps this way is just too blurry. That "blurry, dreamy" look Vincent mentioned was not an artistic decision at all; 24fps displayed on a true 24Hz display would have no motion blur at all. It would be very flickery, but it would not have much judder or motion blur. TV manufacturers should offer 24Hz black frame insertion to display 24fps content properly, but I suspect the flicker would give many people a headache. BFI is only offered at 60Hz or 120Hz.
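The shutter arithmetic here is easy to check for yourself. A quick sketch, treating ~60 Hz as a rough flicker-fusion threshold (that number is an approximation, not a hard physiological constant):

```python
# Effective flash rate of a film projector: frame rate x shutter blades.
# ~60 Hz is used as a rough flicker-fusion rule of thumb.
FLICKER_FUSION_HZ = 60

def flash_rate(fps, blades):
    return fps * blades

for blades in (1, 2, 3):
    rate = flash_rate(24, blades)
    verdict = "visible flicker" if rate < FLICKER_FUSION_HZ else "flicker mostly fused"
    print(f"{blades}-blade shutter: {rate} Hz -> {verdict}")
```

So a single-blade shutter flashes at 24 Hz (well below fusion, hence the flicker), while two and three blades give 48 and 72 Hz without adding any new motion samples.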
@@vdentertaiment4088 so, in the end, we get the judder, which is much better than flicker. I've been to a showing which was projected from film, and it was flickery to the point I had to just either leave or get used to it. I somewhat tried not to notice it. But it didn't look good. Next time I'll be at a showing from film, I'll pay attention to judder.
@@RockinEnabled I don't experience judder; as I said, it is stutter, which isn't the same thing. The TV itself, via the "Real Cinema" function, can display content without judder. Weirdly enough, you have to increase the de-judder function to get rid of the stuttering. To really understand what I'm talking about, I suggest you look at the video "Judder on TVs Explained (Motion 5/5) - Rtings". Without any motion smoothing I can't watch movies; I get strong stuttering, which isn't what you see in a movie theater, so it isn't supposed to look like that, and it's not what the creators intended. De-judder at 2 is the bare minimum I can set to get an OK experience without the soap opera effect, BUT stuttering still occurs; to get rid of all stutter I have to set it to 7, which introduces a lot of artifacting and soap opera effect. So as a compromise I keep my setting between those two. It's the setting I change most often on the TV; I try to get used to lower settings (and stick to 2), but I can't. I'm too sensitive to stuttering.
Thank you for saving me the trouble of making this video... the biggest shame is that with new TVs, you need to spend half an hour fixing all the damn settings back to a neutral viewing experience.... I might add that finding the mid-point in the colour temperature might be preferred by most people rather than going full warmth... good video.
I expected you to talk about the automatic contrast adjustment; it's the biggest factor on my U6K. It's unbelievable how much more natural the picture looks when it's turned off.
I like the cool version of the snow; to me, the warm setting has a yellowish tint and is dark. Something about brightness equals a perception of higher quality to me. I am not a videophile; I just judge by what my eye tells me and either like it or don't. To a certain extent it may come down to personal preference. It may not be the director's intent, but I only care about what looks best to me.
I've learned not to say anything when I'm at people's homes and notice a bad picture on their television. Most people think it comes from the factory with the settings set to optimal. On several occasions I think I may have offended people by suggesting they adjust their picture. To this day I still go to some people's homes and they have the picture set to zoom on an HD channel. Just this past weekend I visited my sister's home, where they have an enormous 80-inch 4K UHDTV. The first thing I noticed was how dark the picture was, and the strong blue tint. To me the picture was unwatchable, but I certainly didn't say that. I mentioned that maybe they should try the warm picture setting. Pretty sure that pissed off my brother-in-law. When I mentioned they should try streaming some 4K content from Netflix or get a 4K UHD Blu-ray player to really appreciate their new TV, I could feel a chill in the room. My brother stopped by my house and noticed I had the LOR 4K steelbook. He asked if he could borrow it and I asked if he had a 4K Blu-ray player. "Fine, I don't want to watch the stupid movie anyway" was his reply. I'm learning to just keep my mouth shut.
The one that gets me is "dynamic contrast". We were watching Daredevil on Netflix and so often couldn't see a thing that we just criticised the makers. Finally I went into my TV settings and stumbled across DC, and suddenly we could see everything. I ended up turning most things off because I realised they were just artificially trying to tidy stuff up. Meanwhile, one set of my parents is watching TV with blue-skinned people, and the other is watching video where white has a pink tint, and I can't fix them without them complaining. Drives me insane.
Tip 1 literally saved my HDR... I always thought my TV just had a bad implementation of HDR, but it was actually just my sharpness setting cranked up to maximum. I have a Samsung Q6FN, btw. Thanks for that tip!
I am not sure I agree with the colour temperature on your example from LOTR. In my opinion the snow with low colour temperature looks more realistic if it is portraying high mountain, freshly fallen snow. The warmer colour looks more like old snow, perhaps even tainted by city pollution. I am a Norwegian, by the way, so I have seen my share of different types of snow.
Interesting. On #5 Color Temperature, the snow you said looked more realistic in the video on the right actually looked yellow and fake to me, while the snow in the video on the left looked pristine and white, I wasn't seeing any blue hue there at all.
The video on the left looks bluish to my eyes. It's because your eyes are accustomed to the bluish look. And you must know that light from the sun looks a little bit yellow, so the video on the right is more accurate and more realistic.
I agree, I'd shoot for somewhere in the middle of those two, maybe slightly closer to the right screen because the faces looked more realistic, but the snow was yellow. So maybe Warm 30 or so is a good middle ground vs going all the way to 50.
All this stuff is great advice and a good resource to show people who don't get why I want to play around with their TV settings, except for motion smoothing. I know it's not perfect and can even destroy certain animated shows, but 24 FPS just makes me feel motion sick and gives me a headache.
Same here. In fast-paced scenes I also don't perceive the pictures as movement anymore, but as individual pictures, because they vary too much from one another at 24 fps.
It's not 'great advice'. The whole point of options/customisation is personal preference. If I want sharpening (essential for SD content), I'll sharpen. If I want smooth motion (which I do), then I'll not take advice from a 24fps diehard.
@@--legion I don't like 24fps, I much prefer smother motion even if it leaves visual artifacts, I didn't know where you got the "24fps diehard" from, even in my original message it says I get motion sick from it
@@KaelumKrispr I'm referring to the poster of this video and others like him. Those that refuse to accept 60fps is nearer human optics and therefore more natural, preferring instead to cling to the stuttering unnaturalness of 24fps. This 'advice' is defective.
5:49 love how the frame rate temporarily increases to 60 fps - to demonstrate what Vincent is talking about. I actually prefer non-movie videos at higher framerates - have you thought about changing for future videos?
As a long-time viewer of Mr. Teoh's videos, I've already enabled all these correct settings on my TV, and the result speaks for itself: the picture is jaw-dropping from every source I watch. Amazing. I say this every time: Vincent, you really are the best in the business! Wish you a very happy Christmas :)
I have a C1 and one of the reasons I prefer motion smoothing is to reduce judder. How would you suggest removing judder while also avoiding said motion smoothing issue?
02:15 For those with older Samsung smart TVs, the sharpness range is 0-100, not 0-20 like on the newer ones. In that case, neutral sharpness is apparently 20/100, not 0. If you have a newer one, it's 0.
02:48 Noise reduction seems to be called Digital Clean View on the older Samsung smart TVs.
04:51 Motion smoothing, aka Auto Motion Plus.
08:45 Warm 50 on LG, Warm 2 on Samsung.
I respectfully have to disagree on the motion smoothing and color temperature. What you said is valid but anything less than 60 fps especially on a big screen gives me a headache. For color temp I find the middle setting a good compromise. White looks more natural, cold or warm are too extreme on the yellow and blue spectrum.
Same for me. If it's 60hz without motion handling, I can see the judder, and it really throws me off. My current TV is 120hz with a 480 motion rate and I love it. If I turn off motion smoothing, then the judder just gets real bad and I can't stand it. I also never see the "soap opera" effect, maybe because I've gotten used to the higher motion?
In short: for digital sources, where signal degradation behaves very differently from analog degradation, software "enhancements" that meant well in the analog era have no positive effect. Also, color reproduction is a crazily deep topic which, apart from using calibration software, mostly goes by eye.
Motion smoothing is a double-edged sword to me: it makes slow-moving/nearly still images look better and fast-moving images look worse, so I normally run it at a small value. Same with the color: I just like the look of cooler colors than what is intended. My TV is for my viewing experience, and some settings I just prefer whether it is "supposed" to be like that or not. I've been playing games my whole life, so I think my eyes are just adjusted to higher frame rates. In fact, I used to think "man, why do soap operas look so good", haha. I'd bet the ratio of time you spend playing PC games vs watching film would be a decent indicator of whether you like the soap opera effect or not.
This is where I'm at. I think any argument that "24 fps looks better because it allows your brain to interpolate it" is laughable; it's coming up with an excuse. We like 24fps because that's what we are used to, and that's what cinema looks like. If we had all grown up with cinema at 60fps, would we look at 24fps and say "yes, this is better"? I really doubt it. I like 24 fps for the most part... but rapid action, and black scrolling in front of white, looks worse at the lower frame rates than at higher ones.
@@rattslayer Another problem with 24fps on TV is that TV screen is always lit. Movies are presented in short bursts of light interposed with darkness when the film roll moves to the next frame. Some TVs offer black frame insertion, but it's far from the real thing. Modern TVs aren't bright and fast enough to truly imitate cinema feel.
I agree with you guys, and I'd add:
-Whether or not a movie is supposed to look a certain way, it seems like filmmakers expect you to have the most expensive setup. Movies are sometimes too dark for me, to the point where they're not enjoyable anymore (yes, I don't have HDR or OLED or a dark room).
-It's ironic how TV makers crank the motion thing while game makers themselves almost always bias towards visuals instead of fps.
-As a gamer, I feel the soap opera effect at 60 fps in movies, but recently I watched something at 120 fps (on a phone's 120Hz display) and it looked fantastic! It was like I was there. No soap opera feeling. Is it just me?
@@fernosan I know you can see more than 60 fps, but it depends on the person. I have a monitor that does 144, and when I first bought it I tested it with GTAV to keep increasing the graphics incrementally to see what framerate I thought was best and I realized around 90 fps is the top for me. I can watch something at 85 vs 90 and tell which one is which, but above 90 it all looks the same. Similarly I think 4k is only marginally better than 1080, so I think 8k will be useless to me. These displays are getting to the point that every person's personal optical capabilities are starting to hold things back in some aspects.
I struggled with gaming on my Samsung tv default settings cuz it made my eyes hurt a lot. I then changed the settings to make it warmer, look natural, and also lighten the dark parts (default settings makes shadows look too dark and over sharpens the lighter areas). Some days later, it started to recognize my PC and unlock a ‘graphics picture mode’. That’s what I use automatically now. I love it
It's important to note that Sony's Reality Creation feature is separate from the Sharpness setting, and I swear it's worth leaving on Auto (it's not overly aggressive and it really, really helps upscaling), though I would never manually set it past 20. It uses an object-based sampling method to enhance detail, not just contrast-based. Otherwise, I'm completely on Vinny's side here. As an aside, I used to have TVs on my sales floor with such awful, awful motion smoothing that it would actually cause the film grain to smear across the screen when things were in motion, because interpolation was being applied to the grain. It was nuts. It took way longer than it should have to troubleshoot that issue, because I had initially pegged it as a noise/sharpness issue.
Reality Creation on auto is great for cleaning up Nintendo Switch - especially the 720p home screen. The LG OLED that I had before my A9G had a similar setting, but even on high I saw no difference.
Reality Creation is intelligent sharpening and it's awesome. I leave it on auto for most stuff, like PS4 games and such, but I do bump it up significantly when playing older PS2 games, as it hardly causes any sort of ringing artifacts. It feels like it's actually undoing some of the blurry upscaling and getting closer to the scaling looking like it was integer scaled. 50 is good for most PS2 games (and also some DVDs), but I go even up to 75 for other softer looking games. For Switch games and PS3 games it also works great, but I think Switch benefits even more due to some of the odd HD resolutions and dynamic resolution scaling used in games. Most games don't need extreme values, but I actually found Snake Pass looking soft enough that I could safely push Reality Creation to the max without causing ringing in gameplay elements (some does become visible on HUD), and even then the effect is still subtle but looks reasonably clean and sharp. A really great tool.
It can help try and salvage a shit situation like a 720p source or something, but in general if you're starting with a good clean image then it's not necessary and doing more harm than good.
Man, this guy is hilarious! His costume and everything, hahaha, too good! I love all the hard work you put in and genuinely enjoy your reviews, comparisons and guides!!!
Vincent, I noticed when watching Dolby Vision content that sharpness automatically jumps to 20. Since Dolby Vision is supposed to "dynamically adjust for each scene", I was wondering if this should also be set to 0 when viewing Dolby Vision? I have an LG C1. Thanks
My TCL 6 Series R646 Google TV is not allowing almost any of these settings to change. I see this consistently on streaming services such as Disney+.
I noticed this as well. I recommend turning it down to 0. From what I've learned, the reason the LG C1 commonly defaults to 10 or 20 sharpness (Even in game mode where all settings should be disabled/neutral) is because of picture settings like super resolution, noise reduction, etc. With these settings enabled the tv needs slight sharpness correction to try and make the image look cleaner. If you are going to turn all of these settings off like Vincent suggests, then sharpness should also be set to 0.
@@lilbabypimp3232 I second this. As a C1 and G1 owner, 0 really is the way to go. I've never gone higher than 10, and even at 10 I could personally spot some posterization/ghosting, as well as scenes seeming *fuzzy*. OLEDs (especially the C/G and Master Series, alongside the Panasonics) are amazingly sharp. BUT given that Sony's default sharpness is 50, it makes me question whether they just appear sharper; who knows... Anyway, TL;DR: I second this comment, and I have worked at Richer Sounds for nearly 17 years.
It's likely that your screen lacks a 24Hz mode, so you get noticeable stutter. I had the same issue with my old TV, but my new Hisense knows to switch to 24Hz when watching a movie, and it looks much better.
Vincent, I've been reading and watching your reviews for over 10 years, and want to thank you for the great content. Would love to know which colorimeter or other device for screen calibration you recommend for home use.
You saved me! My LG C2 was set to Just Scan: Auto, so as per your instructions I changed Just Scan to On and it zoomed back out how it should be. I thought having my aspect ratio set to Original would take care of the "zooming in" picture, but nope, I also needed to change Just Scan to On, and now the aspect ratio really is proper. Thank you so much! I thought I had it all figured out on my C2, but this is a nice surprise. Thanks again. I like the SOE a lot, so no convincing me to change that, even if it's a more accurate picture; I just like it.
Motion smoothing, or interpolation, always bothered me as an anime fan. I have a friend who always watched his stuff at 60fps and got so used to it that he couldn't watch anything without it anymore. But especially in anime, where scenes with low movement are animated at only 3-15fps, the program or TV has to invent so many new frames that artifacts, smearing, and the like appearing frequently was inevitable. Also, in animation, especially anime, directors and animators actually use smoothness creatively for certain effects, like emphasizing an impact. For example, for an action scene they want to stand out, or when they want a time-slowing effect, they increase the amount of detail and the number of frames drawn per second for additional impact or to create a smoother, eerie feeling. Very talented animators also use the timing of frames to add weight to motion or to convey a very swift action on screen. All those effects are completely lost with constant artificial 60fps; I feel like it's almost disrespectful to the artist. A bit like how you said noise reduction destroys graininess intended by the director, just even worse.
Animation also relies much more heavily on the brain filling in the gaps, especially 2D animation, so this topic is even more important in that context. There is a reason why no doll-like 3D Pixar/Disney character with tons of precise nuances in facial expression will ever feel as human and believable as a much simpler (technically) anime character in a well-told story. Pixar's characters slap your brain with a 100% final representation, so there is nothing to fill in. You literally see a big elastic doll with complex physically based shading, not a true human. Meanwhile, in anime, even absurdly big-eyed designs and poor art styles don't prevent your brain from accepting the character and reinterpreting it as a possible person. However, there are examples of some shots in anime where 60 FPS interpolation can achieve quite amazing results. It's usually horrible, but the fact that some motions can look beautiful with it means animators should embrace this possibility themselves and try to use it sometimes for high-budget titles, which would mean mastering at more than 24 FPS to have a wider range of possibilities.
I watch anime and playing videogames with motion interpolation on my panasonic plasma tv. Plasmas especially suffer if you don’t use it as you get a lot of judder and green ghosting with 24/25/30 material when it is off, and get buttery smooth “effective 300 Hz” when it is on. It is smart enough to not touch low framerate cel animation, but panning backgrounds and CG animations become glorious, smooth and sharp. IDK about other TVs, but in mine the interpolation is clever enough to improve anime without creating too many artifacts. If an anime is drawn with high enough framerate, also gets interpolated with adequate quality. In games, you have lag obviously, and it breaks down on unstable framerate, minimum effect if the game is already 60 FPS, though still noticeable on camera pans. So, in fighting, STG or any PvP games I turn it off.
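For anyone wondering where the smearing on low-framerate animation comes from: the crudest possible "interpolation" is just blending the two neighbouring frames, which produces ghost images instead of motion. Real TVs estimate motion vectors instead; this toy sketch is only an illustration of the failure mode, not any TV's actual algorithm:

```python
# Toy frame "interpolation" by per-pixel blending (1-D frames for simplicity).
# Real MEMC estimates motion vectors; plain blending creates ghost images.
def blend_frames(a, b, t=0.5):
    return [(1 - t) * pa + t * pb for pa, pb in zip(a, b)]

frame_a = [0, 255, 0, 0]   # bright object at position 1
frame_b = [0, 0, 255, 0]   # the object has moved to position 2
mid = blend_frames(frame_a, frame_b)
print(mid)  # two half-brightness copies: a "ghost", not a moved object
```

The in-between frame contains two faint copies of the object rather than one object halfway along its path, which is exactly the smearing people describe on interpolated cel animation.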
Motion smoothing is like a drug. Once you get used to it, you cannot stop using it. As an old-school movie watcher, I of course NEVER use this feature.
@@noop9k Yep, Panasonic and Sony are ahead on interpolation. I also have an old (nearly 10 years) 42-inch LCD TV in my living room and its interpolation is always on point, especially on streaming video when the compression kicks in. On the other hand, my newer (3-4 years old) LG LCD seems to struggle with interpolation... Newer hardware, worse performance... Better multimedia side, but who cares when the image is laggy and unbalanced...
I use a 43" Samsung UN43NU6900 as a general purpose computer monitor. It is set to 10 (0 to 20) and it looks good. If I set it to 0 like you suggest, the text gets very blurry and faint. If I set it to 20, I get the edge artifacts you're talking about. I don't think it should be set to 0 though.
Each TV is different. The suggested settings are for the most current and expensive models, I think. I have an old Panasonic plasma TV (a lot older than your TV) that was cheap, and on that TV the most correct sharpness setting is 6/10 (I've tested it).
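A bit of background on why cranked-up sharpness creates those edge halos: TV sharpening is essentially unsharp masking, which overshoots and undershoots at edges once the amount is too high. A minimal 1-D sketch (my own illustration, not any manufacturer's actual processing):

```python
# 1-D unsharp mask: sharpened = original + amount * (original - blurred).
def blur_121(signal):
    # simple [1, 2, 1]/4 blur, clamping indices at the borders
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        out.append(0.25 * left + 0.5 * signal[i] + 0.25 * right)
    return out

def sharpen(signal, amount=1.0):
    return [s + amount * (s - b) for s, b in zip(signal, blur_121(signal))]

edge = [0, 0, 0, 0, 10, 10, 10, 10]   # a clean edge
print(sharpen(edge))  # undershoot (-2.5) and overshoot (12.5): a visible halo
```

Flat areas pass through unchanged, but the edge gains a dark band on one side and a bright band on the other; that is the ringing the video warns about.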
8:40... actually, I like the cold temp better. It looks more realistic. The snow in the warm setting is yellowish and everyone looks like they have jaundice. To be perfectly honest I think the best setting would be somewhere in between these two as I don't actually like either. But the cold temp is better to my eyes.
What I hate most is the color temperature setting, especially combined with the habit most manufacturers have of not calibrating the colors and gamma of their TVs at the factory. You never know if the colors you're seeing are right unless you have a colorimeter and know how to properly measure them. And I also hate when some PQ settings are on by default, and even more when they cannot be fully disabled (Samsung?).
Samsung is awful about this, or was. There is no setting to adjust dynamic dimming. It's super aggressive: subtitles will turn indoor scenes black, dark scenes with a single window of light go black, and title screens on black look excessively dim. You have to go into the service menu to adjust the range to disable dynamic dimming.
Never been a fan of the warm colour temps. I know it's the recommended but for my eyes, i prefer the normal or standard temps. Then calibrate to that. Colour: 50. Colour Tone: standard. And Colour space: auto on my Samsung looks pretty good to me.
@Michael Weizenfeld Advocating something simply because it is a "standard" is a terrible argument. Everything in the history of the world was a standard until it wasn't anymore. The Earth being flat was the standard until it was round.
@@jacobmarley2417 If you don't have a monitor calibrator and don't know the theory behind it, then you're more likely the flat-earther, claiming something based on your feelings.
@@vincentjacobsmm I am not discussing preferences. There are standards for display devices under which they correctly convey the image that was intended by the author. And a person who has no idea what he is doing should not go into them.
This is an excellent presentation, thank you. One note, I set each of these OFF as you went through them on my LG, but at the end when I selected Filmmaker Mode, the Sharpness reset to 10%. (?). I went ahead and reset it to zero, otherwise all the other settings went to your suggestions. FYI, on my LG I believe the motion smoothing is called “TruMotion” which I verified as OFF, and the new Color option for Filmmaker mode is “Warm 2” which is a 50% color. Thanks for the tips, and I will check your other videos as I’m getting a new LG C1 soon.
I don't like artificial motion smoothing but I do hate that 24fps is still used in cinema, even when a movie is shot 100% digital. Pans and sideways motion look awfully juddery.
Here's the thing: having sharpness at a low level is beneficial on my ultrawide monitor and does not hurt image quality. Monitors behave differently from TVs, I've found, which makes sense given the very different use cases.
It's very difficult for me to stop using motion smoothing. Without it, I get stutter from every background in a movie, especially when the camera is travelling. And the "True Cinema" option on my LG often isn't available... I wonder how you can improve this without motion smoothing. If someone has an answer, I'll take it!!!
True cinema only helps with judder which looks like the panning is skipping back every few frames. Caused by showing some frames 3 times and others 2 times to fit 24fps into 60hz. True cinema just makes the screen run at 120hz so 24 can divide evenly. Unfortunately there is nothing you can do. 24fps just plain looks bad on panning shots and on fast action. Why they don’t film panning shots at 60fps still confuses me.
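The 3:2 cadence described above is easy to sketch. Assuming a 60 Hz panel:

```python
# 3:2 pulldown: map 24 film frames onto 60 display refreshes by holding
# frames alternately for 3 and 2 refresh cycles -> uneven on-screen durations.
def pulldown_3_2(frames):
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

one_second = pulldown_3_2(list(range(24)))
print(len(one_second))  # 60 refreshes for 24 source frames
# Frame 0 is held for 3/60 s (50 ms), frame 1 for 2/60 s (~33 ms), and so on.
# That alternating hold time is what you perceive as judder on pans.
```

Running the screen at 120 Hz (as "True Cinema" modes do) lets each frame be held for exactly five refreshes, removing the unevenness but not the underlying 24fps sample rate.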
@@silverwatchdog I heard that James Cameron might help us with Avatar 2 : HFR becomes VFR, Variable Frame Rate. Finally !! Hope that it will work for the majority of the viewers !!
I followed these settings and the difference was huge. The factory settings in my Samsung S92C were awful. Like having a Porsche limited to 80 km/h. Thank you so much.
Keep in mind that when you change color temperature it will take a few minutes for your eyes to adjust to the new white point. So don't dismiss the warmer color temperature immediately.
Dismisses warmer color anyways because it doesn’t feel true to what my eyes see in nature.
As a designer I always found the default settings to be too cool. Sadly, I’ve met too many designers who don’t get this concept
What I like doing is watching for a little while at settings slightly more exaggerated than the recommended ones. Ex: after about 5 mins of watching TV at Warm 3, I switch to Warm 2/1 and the accuracy becomes completely satisfactory!
Nope. On my previous Samsung TV I had it on Warm for 2 years, and then I visited my neighbour, saw his beautiful NORMAL color temperature, and instantly had to revert this option. I NEVER could get used to the piss filter. Also, why are you all always comparing Warm to Cool and saying Cool looks too blueish? Why not compare to the Normal temperature setting? I don't like either Cool or Warm; Normal is always the best and most realistic looking.
True, I can't stand either the blue or the yellow tint in the before and after. White should be white.
If you take a picture of something, the TV and the thing should look the same.
The fact that overscan is still on by default on many TVs is insane.
What's sadder is the number of people who still want it on after I show them the difference.
Same thing with interpolation. Still there, for whatever reason.
So should I leave fit-to-screen on "on" or "off"?
@@adriantrinidad1296 Yes
@@adriantrinidad1296 "On" if you have a Samsung. Just whatever setting that makes the picture look smaller, because that means your TV is not zooming in and killing the borders.
Summary:
1. Don't crank up the sharpness setting (keep it neutral). 02:15
2. Turn off Noise reduction
3. Ensure Zero overscan. 04:36
4. Turn off Motion Smoothing
5. Color temperature - use Warm (target the D65 white point)
TL;DR - Filmmaker Mode
Thanks
I don’t have filmmaker mode on my Samsung. There’s dynamic, standard, natural, and movie.
@@WalterMelons I don't know what model and year your TV is, but have you ever updated the TV's software?
@@thedutchfisherman7078 I intentionally have not ever connected it to the internet. Maybe that’s why lol.
thanks G
5:11 I feel like this is due to your association of movies typically being 24 frames per second, while handheld cameras (modern phones, anyway) typically shoot 60 fps. If you get rid of that built-in association in your head, then it is just smoother motion.
You are honestly a saving grace for TV buyers. I'm glad you're busting myths and educating people on the REAL technology, not pseudo marketing nonsense.
I turned off sharpness and now everything looks disgusting
Thanks a lot, Asian police man
@@arkham_miami I personally don't put the sharpness all the way down; I put it at 10/100. Sometimes a little bit of sharpening helps crisp up the image.
Well, it would be, except that he is objectively wrong on a couple of counts. Motion smoothing can do nasty things with certain shots, but it's better than the "purist" view of watching what feels like stop motion. Similarly, film grain isn't "artistic"; it's still just unnecessary noise.
In some of these cases, he's not busting myths, he's spreading them.
@@bevanfindlay 24fps doesn't feel like stop motion. Films have been shot at that rate for years, and the only reason you find it unbearable is that you think "more is better". Film grain may not be to your artistic taste (and that's fine), but though it is 'noisier', the image is actually clearer, because denoising generally makes an image less detailed.
@@normietwiceremoved No, that frame rate was chosen as being the bare minimum that usually looks passable for most scenes. Try playing a first person game at 24 frames and at 60 and tell me which you prefer and if you can tell the difference (spoiler: we can; vision alone can pick up changes to around 90 Hz under many circumstances, higher in certain unusual ones). Pro gamers can perform differently at as much as 200 fps, though some of that has to do with more than just vision (input lag etc).
Noise is the same - even a basic denoise algorithm isn't going to affect detail much, and film grain is still just garbage optical noise - it's the "purists" who are ignoring visual science. It's people saying they prefer something broken over something better.
The Hobbit movies were the first time I didn't feel like a movie was horrible and jittery (pity that they made some other mistakes that some blame on the frame rate when they're unrelated).
This uniform is for Vincent's "other job" 🤣🤣🤣
Lol legend
You mean like, my "HDMI 2.1 port is ready for your high bandwidth cable, master... Oh yes, I'm fully certified with ALL specifications... Plug it in!"?
Oh no...
LMFAO
Lol
Vincent, all I have to say is THANK YOU. I recently cut the cable cord and am currently going with Hulu + Live TV. When I did, I experienced all kinds of picture quality and motion-related issues on my Samsung QN90B, to the point where I was literally about to pull the trigger on another new TV. Your guidance has made a drastic improvement. I am still seeing some minor imperfections, which I can live with, but thanks to you I won't be shelling out big bucks for a new TV.
Omg, this is by far one of the most enlightening videos ever. I could never understand for the life of me why my UHD OLED TV wouldn't show the high-quality 8K image in that quality and what was wrong. As you said, I found all those settings on in my JVC TV, switched them off, and immediately got the quality I have always been yearning for.
Incredible when something like that all comes together at once after such a long time, isn't it?
You have answered, with freeze frame and video, the questions I had for years. I bought a 4K TV a few years ago, and it never looked right. The LotR and Mission: Impossible comparisons were exactly what I have been viewing: too smooth, too glossy, like the movies were rendered instead of filmed. A massive thank you, good sir!
Here I was thinking it was just Blu-ray remastering messing with LOTR, when the newer TVs were also to blame.
@@theswampus670 I hate that so many people have ended up making that mistake. I remember back when Blu-Ray and HDTVs were first becoming a big thing, and motion smoothing was coming out around the same time. I saw a lot of people saying things like "HD/Blu-Ray makes it look too real, like I'm watching behind the scenes footage or something." I remember I saw a comment once where someone said, "I saw them playing the new Die Hard movie on Blu-ray in the store, and it looked like I was watching Bruce Willis film the movie, instead of watching the movie. I prefer DVD, where it looks like a movie." I had to tell him that he was probably watching a tv with motion smoothing turned on, and that Blu-ray actually looks more filmic with the right settings.
It is very frustrating that so many people don't notice it, or if they do, don't know it can be turned off. I'd say about half of my friends and family, when I go over to their house, I ask if we can turn the motion smoothing off, and they're like "What's that? I don't know how." and then I turn it off and they're like, "Oh wow, you got rid of the soap opera effect!"
Haha yeah thank God, I thought I was the only one who noticed this
I have an older DVD box set of the Star Wars 4, 5, 6 remasters. The extras menu has a whole thing for setting up your TV: a bunch of test screens and instructions to get just the right sharpness, contrast, brightness, color balance, etc. After using it, all my movies look a lot better than they did with the default settings.
never use the default settings
Picture setup on my TVs has always felt like a huge rabbit hole for me, especially for gaming. This channel's been of great help for me.
As a hobby photographer I agree with the problems caused by too much sharpness and noise reduction. It’s exactly the same when you are editing photos. This also applies to white points and picture temperatures.
That's not true. Most RAW files are under-sharpened.
Sharpened images always look better, have more accurate lighting, better contrast and 10x more detail. I think 88% of the population are just visually impaired without even knowing it.
@@joshuakyle9494 Agreed. I'm looking at snow far whiter than what this video claims is even possible to see, and I live in a dirty city.
@@godzilla2k26 I live in a clean city, and the snow doesn't look bluish like the Cold and Standard temperatures on the television. Same with clouds. Sunlight has a natural yellow tint.
@@sonyx4500 Sunlight is natural white light. This is why people need to get out of the cities sometimes.
Fully agree with all five points. But I have to admit it took me two decades to change the color temperature to Warm 50. Strangely, I always found the neutral/colder temperatures more realistic. But once I got used to it, there was no way back.
Always felt like there was a yellow smear. It’s hard for my eyes to accept something like clouds or snow with a tinge of yellow. But this would explain why skin tones can sometimes go very pink or magenta. I will readjust my settings and try it out for a while like you recommended.
@@bigmoviefreak skin is pretty pink in reality. Shifting in some green and amber into the white balance makes white people look more pleasingly beige, but everyone else looks more green.
I did the same on my phone. Whenever I look at someone else's phone, I always forget how blue the white is.
I'm still like you and prefer Medium; one day I'll have to try Warm 2.
The human eye white-balances automatically. Ambient light (usually around 3200K, yellowish) and the background behind your screen can shift your perception of colors. For this reason I would recommend watching movies in a dark room.
As a Canadian (who also aggressively uses f.lux), I must push back on the “warm” temp setting for LOTR.
Above a certain latitude, the UV rays of the sun “glance off” the atmosphere and no longer reach the surface or most of the sky. This means that being outside on even the sunniest days brings no warmth from the sun, and that everything is much bluer. To your credit, I think LOTR got this wrong, but to my Canadian eyes the one on the left looks far more accurate.
Also a Canadian and I also was like “uuhh what this guy talking about, the one on the left looks much better to me”
Yes, they also made it warmer on film, so you're right, but so is he in a way. The truth is that's the way it's filmed. I hate the way directors tweak colors like that; there's no point. Also, Linus Tech Tips said not to fully trust Filmmaker Mode, because it still varies from TV to TV for each user; the only real way is to calibrate it yourself via PC. If you think you can, look up the Datacolor SpyderX and use Calman, which really fixes the TV. Yeah, it made it warm, but I pushed it back: my LG was set to Warm 1 but I pushed it to Medium...
A filmmaker dude made the Hateful Eight look warm and it looks like crap too.
I'm not even Canadian. I grew up in Ohio U.S. and the snow definitely strikes me as being white. Ffs what happened to the phrase "don't eat the yellow snow"?
@@josiahferrell5022 FOR THE LOVE OF GOD!!! The fucking movies are COLOR GRADED!!!! If you want to see the fucking snow in it's natural color look out the fucking window. The movies are color graded to the desire of the director and colorist. WTF?!!
Great tips in this video! This advice was generally true for old CRT TVs, and is especially true with today's digital LCD TVs. Try turning off some of those extra noise or blur filters to see whether you prefer the picture without them, and turn down the picture adjustment settings a bit. Too much extra contrast, brightness, and color saturation is not a good thing: you lose a lot of subtle detail when they are fully turned up. And keep your color temp/tint/hue settings mostly to the middle so the image isn't unnaturally cool/green/pale blue, or overly warm/orange-red. You will know it is set right when your primary colors appear correct (reds that are deep and bright without looking orangeish or maroonish, and greens, yellows, and blues that appear natural and vivid without looking muted or shifted toward other colors). Also keep the sharpness set low, or barely on if used at all, because too much creates exaggerated, noisy edges. Keeping the picture settings lower not only produces a more natural and detailed picture, it improves overall performance and extends the useful life of your TV.
1. Sharpness 00:15
2. Noise Reduction 02:23
3. Overscan 03:42
4: Motion Smoothing 04:45
5. Color Temperature: WARM 06:53
Thank you 👍
You'd be surprised how many people prefer motion smoothness. I for one prefer it.
Fuck warm. Neutral only (maybe slightly warm with manual calibration, if the TV's "neutral" is too cool). Not yellow, not blue. Just right.
Motion smoothing is weird, I hate it... Update: I'm starting to like it.
This is one video where I totally agree with everything you've said.
Do you have any idea why tv manufacturers continue to keep pushing these film-destroying options as default?
@@Moody_Blues_ TL;DR: "reasons" to buy this year's new model and scrap the one from last year, aka marketing.
It's because some idiots at the company think slapping on unneeded filters somehow means better picture quality, when they are actually hurting it. The average consumer is too dumb to realize.
The reason I've heard for most of these settings is that they enhance viewing of live sports, which is what the majority of TVs are actually used for. Hence it being the default.
@@cruxtymusic It absolutely does not enhance sports. It makes them worse, as it does with all content.
@@cruxtymusic The majority of TVs are definitely not used for just sport
The filmmaker setting tends to not take into consideration room lighting. It's great if the room is completely black, but with windows the setting is too dark and sometimes too washed out.
I agree.
💯 I watched a video where some guy said to set my brightness at 8... the picture looked like it was night time the entire movie, and I couldn't see what was going on if the film's setting was night on top of that.
@@rickycardenas5154 Set for color accuracy and then just go through a few scenes and adjust the brightness until you're happy with it. I find the Star Wars prequels to be very dark in Dolby Vision mode, but when I paused on a dark scene to test, turning up the brightness didn't reveal any more detail; it just raised the shadows from black to grey.
Some TVs have an additional ambient-light sensor that compensates for the room lighting by readjusting the color temperature.
I own a lesser-known brand of TV that has a faux Filmmaker Mode. At first I thought it was too dark because it wasn't true Filmmaker Mode, but your comments here now confirm that it is the same for the legit Filmmaker Mode too.
Absolutely useful information. I applied all the picture settings recommended by Vincent Teoh except motion smoothing. Just can't stand the judder. Great job, Vincent. Thank you.
Exactly the same for me
I agree
Got a new Samsung TV today, found this video by accident and changed the TV mode to Filmmaker Mode.
I used the same scene from Lord of the Rings for comparison. I was blown away by the fact how much the image has improved.
Thank you very much!
This was no bs, straight to the point, informative stuff.
Also anybody who prefers motion blur is wrong.
Edit: to clarify I was referring to the setting, which is actually called "motion smoothing" on some TVs.
It pisses me off beyond comprehension that HD TVs come with that setting on. If I go to a friend's house and they have it on, I will always insist on changing it. I don't get how anyone is able to watch anything with that awful, awful f@cking setting on.
I liked it because it made things look kind of 3D to me. I think the first time I saw it was when I was watching a demo for one of the Transformers movies. Hyper-realism does have its place with some content. But the "some" is the key word here. If I was watching 12 Angry Men with motion smoothing on it would just be annoying as shit.
I'd never had a motion-smoothing TV until I bought a Samsung on Black Friday. When I watched my first movie I thought, something feels really off. So I sat through the movie, then immediately messed with the settings to see if I could fix it. Sure enough I could, and I turned it off. If I were watching YouTube then maybe it would be good, but it's a hard no when it comes to movies.
I’m guessing you mean motion smoothing? Motion blur is what filming at 24 fps naturally produces (the so called “cinematic” feel)
Personally, I hate artificial motion smoothing, but if it’s content natively intended to be high framerate, I prefer that
Fuck motion blur. It makes me sick and is the first thing I always disable. My gf thought I was crazy until I showed her why
Switched up my TV settings last night and wow. I mostly game on it, and at first it did feel yellowish, as some mentioned. However, once my eyes adjusted it was crazy good. So much more tone and depth. Made the space scenery really pop, with the blacks contrasting against the color. Can't ever go back, thanks for the video.
Huge thanks for showing the specific setting names for the different manufacturers. Most other reviewers I've seen don't provide clarity with this.
This is so misleading. Warm 2 (50 or whatever) is there to mimic film stock, which has a red/orange cast to it; just look at a film strip yourself. The D65 target doesn't look at all like the image on the right.
On my LG OLED TV I agree with everything except colour temp. I've found warm1 to be the best compromise, warm2 is pretty good but warm3 looks like a urine filter over the image
Agreed I prefer Warm1 on my LG OLED as well.
Thanks, I'm about to purchase a C1; I'll keep this in mind.
I prefer cool
@@Csal92 Warm1 user as well. It's the best way to go.
@@majorastorm Cool is too bluish for my taste.
I've lived almost 40 years now, and this is the first time I've heard some of these settings explained. Thank you.
Thank you for this, Vincent. I turned off all this garbage and I’m surprised how much more I’m enjoying my 900H. I thought I loved the artificial enhancements, especially the Live Color, but I realized it was making all reds, for example, look equally vibrant, thus severely limiting the range of color in a movie. It also hides some details with Live Color on. I’m honestly shocked, and I’m finally using the Custom mode and DolbyVision modes where possible.
I agree with most of this except for 2 things.
1: just because there's a way the director wanted me to see the movie doesn't mean I care what he wants, I can watch it however I like. This relates mostly to the colour temp.
2: Smooth motion is actually more immersive to me; once I got used to it, I wouldn't want to go back. The argument for movies being in 24fps is just as dumb as games targeting 30fps instead of 60. 24fps came from technological limitation and compromise, and the arguments for it are just attempts to explain a personal preference that stems only from not being used to anything else. Except for the artifacts; they can happen sometimes, that's true, but I can live with it, it's not the end of the world.
On the C1, if you completely disable the motion settings, most movies and TV series look like a blurry, juddering mess. You need it at least on its lowest setting, which is Cinema.
Yeah, this. On the CX and C9 LGs too. Probably all LG OLEDs.
@@MILADINY0 It's the same with Game Optimizer mode. It's great as long as the game is 60fps, but there are no motion settings in that mode, so as soon as you play a 30fps game it's hard to even look at.
@@oddcabbage yep, good point. Unplayable - and the issue with switching to another mode in gaming is the input latency.
@@MILADINY0 they should have some motion settings available in game mode. My Samsung Q80 has it so it's not hard to implement
@@oddcabbage That’s why we need to get rid of 30fps games. It will always look and feel like dogshit.
In the late 90s early 00s, I was a subscriber to a magazine called Sound & Vision. I loved their articles about this. Been turning off my NR, and in the 1080p days only introduce minimal sharpness, amongst other things turned off and only at minimal settings. I love this channel, Vincent is awesome, extremely funny, entertaining, and very informative.
Easily the best AV guy, imo. He and CNET are all I look to for reviews.
I recently joined LG C1 family and this channel has been a boon of information for someone who hasn’t dealt with TVs since 90s.
Thank you for your knowledge and your work into making these videos, Mr Vincent. Best of luck to you!
Very nice and humourous video. Recently got a Samsung S95C and have been scouring Reddit and YouTube for Filmmaker Mode settings that work for me. The problem is that Warm 2 (and even Warm 1) is like watching a movie wearing your sunglasses. Too dull, too dark, and I can't see detail. I reckon Standard or Cool is better for me.
I can agree with everything you said except one thing.
Smoothness (soap opera effect) is a must, especially on bigger screens. I don't know how you can say turn it off with a straight face.
Maybe some people's eyes are different from others'; for me, when I turn it off it gives me headaches and I feel like I'm watching a very cheap screen.
Smoothness is a joke, not a must. It's the opposite: if you turn it on, everything turns into a cheap, amateurish piece of shit. No, thanks.
I have an older Sony Bravia TV. Reading the manual, I thought that setting all options to max would be the best option, but I had problems with the image and it didn't look right. I decided to give it some time until I got used to it, as I was switching from an old tube TV, but I never did. So one day I decided to disable Sharpness, Noise Reduction, and Motion Smoothing, set Display Area to Full Pixel, and set Color Temperature to Neutral, and the image improved a lot; it looked like a new TV! It seems I made the right choices without knowing!
Why, for the love of God, would you think setting everything to maximum would be the best option? What was the logic behind it?! Depressing, really...
You're the type to go either all in or all out.
7:22 As a Swiss who has skied all my life, cold is more realistic to my eye. 😂 Yellowish snow is something you learn to avoid in the first years of your childhood. 😂😂
Seriously, thank you! I feel seen!
I like the idea that warm is "natural"... like, no, something's wrong with your eyes, dude; go to a doctor. Snow, even here in the middle of the US, is as white as white can be...
Some of these settings are just too extra, and gross elitism.
Thank you! I was searching for a comment to confirm that I'm not crazy for thinking that snow should be white! I love my cool setting.
@@goranm9430 @Satan-777 Hehe. Thank you for your comment. I was also feeling a little alone on this one. But honestly, snow is complicated, because it really tricks electronic calibration sensors/algos (especially cameras) during white balancing, which is why we usually have a "snow scene" parameter.
You had great points in this video, but I have to disagree on SOE; I love how films and TV series look with it turned on. I also like the sharpness turned up halfway and noise reduction turned on. I absolutely cannot get used to the Warm setting; I leave everything on Cool.
When I watch movies on Blu-ray and 4K UHD disc, I can handle film grain, because that is part of the cinematic look the filmmakers are going for, which comes from shooting on film stock. What I can't handle is digital noise, where the image has something like mosquito noise, which is not part of the cinematic look and shows up on camera when the ISO is too high.
Sony has a somewhat capable, specific digital noise reduction feature: it looks for the blocky digital noise, not grain.
Yeah Netflix is especially bad for "visual snow" and you'd want some of those filters on when watching some streaming sites for certain.
@@90lancaster That's why I prefer physical media to get the best and highest picture quality but sometimes digital noise is apparent on some modern films which can look quite ugly because of the mosquito noisy picture it brings.
@@soylencer That's the thing: you don't want to scrub away any detail of the image, you want the mosquito digital noise removed, not the film grain.
@@BeginsWithTheEnd That's more to do with video compression. Sharpening the picture won't bring out detail, because the image is muddy-looking to begin with; you can't bring out more detail by sharpening it. Resolution is what really gives you detail, which is why you don't sharpen 1080p or 4K content on disc. Streaming platforms compress their movies and shows even at the higher resolutions.
I would argue that motion smoothing feels worse because of the nature of interpolation, and not inherently because of the smoothness itself. Like you mentioned, the interpolation creates artificial frames that can lead to some really odd visuals. In my opinion, that's what gives it the cheap smartphone-camera feel.
I think smoothness in and of itself (i.e., frame rate plus display refresh rate, I guess?) has its own strengths, though it definitely isn't preferable to 24 frames in cinema. Video at 60fps is very good for conveying objective information, for example.
You pose a valid point. Personally, I find it difficult to watch 24fps content on a monitor (projection in a cinema is fine), so I use SVP on my PC to play everything back at 60/72fps with frame interpolation. I definitely notice the artifacts, but to me they are the lesser evil compared to the headache-inducing stutter that is inherent to a frame rate as low as 24, especially in panning scenes.
The ideal scenario is films like The Hobbit or Gemini Man that are actually filmed at a higher frame rate, so the benefits are there without the artifacting. One would assume it is possible to film at, say, 48fps and then play back at 24 for those who prefer it, though I have no idea what extra cost this would add to production.
I agree with all the other settings mentioned in this video, though. I prefer to keep the image as close to the original as possible. The noise reduction one is a massive pet peeve of mine, as it gives this dreamlike ghosting quality.
I agree, but 24p looks awful to me. I have yet to find one person IRL who doesn't like smoothing.
I can totally agree that artificial smoothing creates problems.
However, I hate that cinema is still 24fps. In fast fight scenes it's impossible to actually see the choreography (it's just blurred limbs), and when I see a stuttery or blurry camera pan I often get slightly dizzy. I think people only view smooth frame rates as "cheap" because they're so used to only cheaper TV productions using them.
Doubling or quadrupling the frame rate would be a much more meaningful improvement to image quality (and require fewer resources) than bumping the resolution from 4K to 8K, imo.
It just doesn't make any sense to me that a smoother frame rate would make a camera-filmed movie look less realistic or immersive. We don't see the real world in just 24fps either!
I understand that for animated movies the timing of the animation matters more than the frame rate, and that a higher frame rate might make poor animation look less realistic (though I don't know of any PC game that has this problem, so...). But with camera-filmed movies, the only small problems I can see with a higher frame rate are that it doesn't mask bad fight choreography, and that CGI is a bit harder to make look realistic because more frames need to be rendered. Both of those are problems we should expect the movie industry to be able to easily overcome.
@@LRM12o8 The best solution, well known and successfully used in the animation industry for many decades (and recently, famously, in Spider-Man: Into the Spider-Verse), is hybrid frame rates. For live action we would need camera motion at 48fps or more (for perfect panning without judder) and leave most other things at 24fps (but not necessarily everything) to keep the dreamy look that allows the brain to fill in the gaps in a pleasing way. However, this is technically non-trivial to achieve with live action and requires costly post-processing work. I think Cameron even talked about this concept years ago when discussing Avatar 2.
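A rough sketch of the hybrid-frame-rate ("on twos") idea described above; the `element_frame` helper is invented purely for illustration and is not taken from any production pipeline:

```python
# In a 48 fps timeline, camera motion updates every frame, while an element
# animated "on twos" repeats each pose for two frames, effectively moving
# at 24 fps inside the smoother timeline.
def element_frame(timeline_frame, on_twos):
    """Return which pose an element shows at a given 48 fps timeline frame."""
    return timeline_frame // 2 * 2 if on_twos else timeline_frame

camera = [element_frame(f, on_twos=False) for f in range(8)]
hero = [element_frame(f, on_twos=True) for f in range(8)]
assert camera == [0, 1, 2, 3, 4, 5, 6, 7]  # smooth 48 fps pans
assert hero == [0, 0, 2, 2, 4, 4, 6, 6]    # held poses: 24 fps "dreamy" look
```

This mixed cadence is the trick Spider-Verse is known for: smooth camera moves combined with characters deliberately held on twos.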
@@JavierYunes Found one here. Motion smoothing is of the devil. I surreptitiously turn it off on extended family's tvs. No one has noticed.
X-Motion Clarity is what I use on my Sony. (Smooth 2, Clearness 1), the Black Frame insertion properly emulates the motion you see in a theatre without making the image look like a stuttery mess like it usually does on modern panels @24p. Sony does it right.
I use black frame insertion on my C1 as well. I dial in the de-judder and the motion smoothing separately, or use a preset.
I use 8 and 8 on each out of 10. Definite sweet spot for my eyes.
I'd need to get some hands-on time with a newer Sony to compare.
Smooth 2 is in fact already Motion Interpolation 😏
Smooth 2 already adds a little bit of soap opera effect, I prefer Smooth 1
@@anguineus_vir unfortunately, you can't use BFI with HDR because of the brightness reduction it causes.
I personally prefer smoother videos/movies; it is definitely a contentious topic. Newer movies are shot on digital cameras anyway, so the genuine-film argument falls flat. (Maybe for some old classics you should disable it, but for me 60fps is the way to go.) Why would a 60fps camera look cheaper than a 24fps one?
I'm Italian and you are the No. 1. An amazing example of a nerdy, light-hearted, brilliant, intelligent guy.
Thanks for the tips. I turned on most settings when I got my TV because they sounded like they gave better picture quality, without knowing their actual effect. Not sure I agree with the Warm setting, though. But not everyone's eyes, TVs, or even films are the same. I will definitely be doing some tweaks to my settings to test how I like things.
Vincent, even when I disagree with your recommendations (rare, but we are all human, and motion smoothing is niiiice), I can’t help but be impressed by your logic - and your humor. Keep up the excellent work.
Thanks for the overscan tip; I can't believe both of my LG TVs had it on. I disabled it on my Sony too. Also, I leave motion smoothing on, very low, otherwise I'll be a puking mess. Sorry Vincent and Tom. Thanks as always for your effort and time.
Film at 24p was adopted because of the technological limitations of the time, not because of the cinematic look you think it gives; it's just called cinematic because we got used to it. Early film was expensive and not very sensitive: you needed a lot of light entering the camera to get a good picture.
You can solve both of these problems by running at a low frame rate.
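The light-per-frame trade-off described above can be put into rough numbers. A back-of-envelope sketch, assuming the classic 180-degree rotary shutter (each frame exposed for half the frame interval); the helper name is invented for the example:

```python
# With a 180-degree shutter, exposure time per frame = 0.5 / fps, so
# halving the frame rate doubles the light gathered per frame (and also
# halves the film stock consumed) - the two problems early film had.
def exposure_seconds(fps, shutter_degrees=180.0):
    """Per-frame exposure time for a given frame rate and shutter angle."""
    return (shutter_degrees / 360.0) / fps

assert exposure_seconds(24) == 0.5 / 24  # ~21 ms of light per frame
assert exposure_seconds(60) == 0.5 / 60  # ~8 ms: needs much brighter lighting
```

So, under this assumption, a 60fps shoot gathers less than half the light per frame of a 24fps shoot, which is one reason high-frame-rate productions need stronger lighting.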
Not sure if he covered this, but on my Samsung TV I have an option for shadow detail, which was set at 2. I turned it down to -4, which made dark scenes look great. Before, dark scenes were grainy and had washed-out shadows.
I agree 100% when it comes to motion smoothing. I play both games and watch movies and cannot stand even slight motion smoothing while watching movies. Glad to see there is an option to turn it off. All companies should do proper calibration in TV instead of over-riding features just for appeal. But customers eventually would come to their senses.
@Swami What is ‘proper’ calibration? Proper calibration is what is most appealing for the viewer … regardless what anybody else says. Smoothing on looks much better to me when watching UHD than turning it off, same with a good level of sharpness. Got mine professionally calibrated from one of our leading companies … looked absolutely shit. Same with those best settings. Locally they even stopped doing calibration services as the viewer’s individual taste usually massively conflicted with best calibrations. Best is what the viewer enjoys most.
@@GanymedeXD smoothing on feels like fast forward to me. Like you said, every individual has their taste. If you see RTINGS, they have pre-calibration settings review and post calibration settings review points.
I'm the opposite, lol. I love high frame rates; makes it feel so much more real. Same with games.
@@swami5073 Feels the same to me, just more real.
I disagree. Any game player worth a damn, running a game at 60 fps or 120+ fps, would see 24 frames per second as an abomination, and the smooth, buttery motion does not take one out of the game. You are just used to that trash gutter-tier 24p in movies because it's how most things were shot; there is nothing special about it. All the arguments are nothing but cope.
PS: if you like film grain, that is trash too, but at least that one is on the creative side.
A large number of people, especially males, are color blind to varying degrees. "Color Temperature" (and individual "hue" or "red/green") settings will thus always be a largely personal preference...no matter what the director wanted or the "purity" of one's aesthetics.
Most people don't realize it, but you're doing god's work. You're literally the only channel on yt that goes so much in depth and gives actually useful info
Overscan and noise reduction do still have their places, but only on the analog inputs. I found them quite useful for improving picture quality from my retro consoles before I could get a proper analog-to-digital converter that gave better results than the TV.
We are analog. At some point digital has to become analog.....and that process often creates a load of crap.
@@johnrussell3961 Yeah, but my games are digital, and going from digital to analog to digital is pretty hard without a degradation in quality, at least if you're a TV manufacturer trying to bring a 4K TV to market for less than 300 bucks.
RAHelllord. Your eyes are analog, as are your ears. We evolved in an analog world.
We are not happy if digital media does not stimulate us the same way analog does.
Few are going to suffer just because they are told digital is perfect. They will use their own eyes and ears to judge that.
@@johnrussell3961 Yes, but a subpar conversion along the chain can still introduce noise due to cheap components and bad algorithms. My RetroTINK 5X Pro is capable of doing that conversion without a loss in quality before it gets displayed on the TV. Which was my point: it can have a purpose in dealing with noise introduced by cheap hardware.
RAHelllord. I think you will find most people complaining have very expensive LG OLEDs.
I have a C9.
Biggest crime is labelling D65 colour temp standard as a warm setting!
Watching this cold vs. warm, I swear that on my screen cold displays the snow as white, while warm displays snow with a slight pee tint in it?
This is where people get hung up. You can have a set calibrated to D65 and still have it appear "cooler" if that's what the director intends.
@@lemsdk17 LOL nice 1
@@lemsdk17 This is the problem I have with warmer temperatures. Cold always makes whites look WHITE, whereas warm, makes whites look slightly yellow.
This is a lie that the TV manufacturers have to keep going because of all the horrible things they started doing since the dawn of HDTVs. Flat panels were crap and expensive back then, so they had to cheat to make them seem decent. Everything was way too cool to make them seem brighter, and they turned up the contrast and saturation to make the picture pop. Unfortunately, they conditioned the public into believing that this is what looks good.
I don't know about the colours. As someone who lives in the middle of snowy mountains, I find the colder image more realistic. The problem is that if the filmmaker has applied a weird colour filter to the image, correctly whitening the snow will crush the whites of the whole picture. It's tricky.
i agree
I've snowmobiled enough backcountry, and I've only seen yellow snow when I pee.
How? It makes snow blue, you know snow isn't blue.
@@postboy2242 Snow actually does have a slightly blue hue, especially when you see light streaming through it from above.
If you've ever built a snow cave on a sunny day, you'll know that it's certainly not clinically white.
Warmer settings are measurably more accurate.
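To put a number on "warm": the D65 white point that film and TV mastering targets sits near 6500 K, while the typical "cool" default preset sits near 9300 K (roughly the old D93 Japanese broadcast white). A quick sketch using McCamy's CCT approximation; note this formula is only an approximation, and the D93 chromaticity coordinates used here are approximate too:

```python
# McCamy's approximation: correlated color temperature (CCT, in kelvin)
# from CIE 1931 xy chromaticity coordinates.
def mccamy_cct(x: float, y: float) -> float:
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 white point (the mastering standard) vs. a ~9300 K "cool" white
# roughly matching the D93 point many TVs ship close to by default.
d65 = mccamy_cct(0.3127, 0.3290)   # comes out near 6500 K
d93 = mccamy_cct(0.2831, 0.2971)   # comes out around 9200-9300 K
print(f"D65 ~ {d65:.0f} K, cool preset ~ {d93:.0f} K")
```

So the "Warm" preset is not adding yellow; it is removing the extra blue the factory preset adds on top of the 6500 K mastering white.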
Wow, this one video has saved me so much aggravation. I bought a Sony 65" X85L three days ago to replace a 2019 Sony 65" and was horrified with the picture quality. I had my old 2019 model dialled in perfectly but could not get the X85L picture to look anywhere close to as good as my old tv, 4k was fine but HD looked over saturated, skin tones looked too smooth and the whites had a blue tinge to them, added to this SD content looked appalling. It was so bad I had planned to return the tv. I followed these settings and was just amazed, the image quality is now fantastic, I'm honestly shocked at the difference. Thank you!!!
I really learned a lot from you, so I definitely subscribed! Not only do you impart your information clearly, but you are so funny! You make learning fun! The only downside I've found from watching the two videos I've seen so far is that I'm going to have to spend a LOT of time watching your videos!
The trick with a lot of these things is that you don't really ever get to do these side-by-side comparisons, and they often reveal the trade-offs you make. For example, the 'snow' example used to show the 'correct' setting for color temperature left me conflicted. Unlike what Vincent said, I felt the snow looked more realistic in the Cool setting. But skin looks more realistic in the Warm setting. Would I have noticed the slightly yellow snow in Warm had I not seen the difference side by side? Would my eyes adapt over time?
you're just used to watching things in cool, or the monitor you're watching on is too warm. Objectively, the picture colour is more accurate and eye strain is prevented when using the correct temperature settings.
I completely agree with you that snow (among other things, also metals and some shiny materials) does(/do) not look right on Warm settings.
I don’t know if this is different in different places, i.e. at different latitudes; but snow up here close to the Arctic Circle definitely never looks orange-y.
@@SwedenTheHedgehog Are we watching the same video? Everything's too blue and sterile on "cool" settings. On warm, the snow is still bright white but the colours are deeper and more natural.
@@oxstorm644 If by "bright white" you mean "sun-bleached-looking yellow", then sure. There is nothing natural about the color of the snow or the skin tones (at 7:52, for example) with the "Warm" setting.
@@SwedenTheHedgehog Except people forget this is a movie production, not 100% natural. Within this movie's color grading, the skin tones and everything else look natural in context with Warm.
I personally use motion smoothing on my LG OLED but only set to 1, the lowest setting. I'm one of these people who is very susceptible to motion, I find the rainbow effect on DLP projectors very annoying when none of my friends notice it at all, so watching a digitally shot movie at native 24p can sometimes give me a headache. "1917" is probably the worst offender, it felt like watching a slideshow. The judder with no smoothing is horrible, and so is smoothing set too high. I find smoothing on the lowest setting to be the lesser of two evils.
Same, I usually set it to 1 or 2 on my Sony. Can't stand judder.
Curious if you have tried "cinematic movement" before? Just got my LG C3 and like you, I found 1917 quite migraine inducing without any interpolation.
Yes, I totally agree with you... I recently changed all these settings and now it looks real and professional.
Thank you for the table of neutral sharpness levels. I always leave it at default in fear that lower is actually adding blur.
Usually default is about 20
Can anyone confirm if TCL's neutral is a 50 (like Sony) or a 0 on its Android TVs?
@@PotentChr0nic I have my sharpness at 20 on my TCL Roku TVs. I have the temperature setting on warm, which I just recently changed from normal. Warm 50 must be a default setting; adjust it to suit your taste, that's all I can think of.
I've noticed that "motion smoothing" tends to highlight the CGI in movies, in a very bad way. It makes the CG look so out of place and almost "cheap" that it can be quite jarring. It appears like you can see the layering that is done or something.
Yeah honestly the motion smoothing effect looks really unnatural. Even the ghosting issues that old lcd monitors had looks better ironically.
While I agree with you that it makes CGI look bad, I personally absolutely love the smoothing effect. Since I'm a heavy PC gamer and started with frame interpolation long ago, I can barely go to a theater to watch a film anymore; I get a headache from the stuttering... I would love it if producers stopped with 24 FPS and went up to at least double. Loved the HFR version of The Hobbit compared to the standard one!
100%
yup. There is nothing natural about 24fps.
"Cinematic feel". It's literally a holdover from 1929..
Same thing with d65 white point. It's warm as hell for old movie screens. Warm 50 is NOT true to life.
Let me say it again 1929...
@@paulcox2447 indeed, i guess life is a soap opera then 🤣
Man, this video has been AN EDUCATION for me. Thank you so much!!! Have had a beautiful Samsung 4K TV for a few months now after having had a rather crappy HD Tv for 10 odd years. The upgrade has been great but I really struggled with the screen settings and could never get it quite right yet. This video is awesome and I love the FILMMAKER MODE. I'm never going back 😂 Thank you!!!!
Exactly, this is super eye-opening.
Ya my tv settings were total crap and I never knew it lol. My eyes feel so much better now.
Good breakdown. Too bad I only use my tv for background noise with a cheap antenna that only works good for like 5 channels
Watching this with MEMC enable and Native colour space instead of auto colour space on my Samsung TV
Filmmaker mode seems too dull to me 😅 But Warm 2 does look natural, so yes, I changed it to Warm 2 and it looks much better.
Not necessarily accurate but my settings on my QN90C 43" are:
Base on Standard mode with these changes:
Brightness - 25
Contrast - 45
Sharpness - 0
Local Dimming - High
Contrast Enhancer - OFF
Colour temperature - Warm2
And for HDR mode:
Do the same but keep Contrast and Brightness on 50
And it's much more "natural" than the out-of-the-box settings while still more vibrant than Filmmaker Mode.
Ofc I turned off overscan
8:55 What if I told you that I have reset my LG CX Filmmaker profile to its defaults and it still has sharpening set to 10 in both HDR and SDR!
Worse, the default settings have the SDR OLED light set to 80, and in HDR, DTM is activated with peak brightness set to High. That won't do justice to movies, at least in a dark room; it will over-brighten the image.
Kudos to people who are able to watch content without motion smoothing on OLED, because due to the nature of the technology, low-frame-rate movies look like crap on an OLED when motion is involved, especially if you are sensitive to stutter. I will take the soap opera effect over stutter any day, and no, that does not break the immersion for me; watching Gemini Man didn't either.
LG used to have Filmmaker Mode target 100 nits with the OLED light set much lower, but they changed it in a firmware update because people were complaining that it was "too dim", even though it was supposed to be the most accurate picture preset on the TV. It kinda defeats the point of FM mode.
I do agree with you that low frame rates on these TVs are unacceptable because of the sample-and-hold nature of modern displays. As much as I respect Vincent, I don't really agree with him on 24 fps content being the "ideal" for cinema. Movies should have started filming at 48 fps or higher a long time ago; the 24 fps standard is archaic and was chosen for cost, not artistic reasons.
The funny thing is that old movies, back when they were projected with single-blade shutters, used to look much smoother than after the switch to multi-blade shutters (made to minimize flicker). But "cinephiles" have convinced themselves that the blurry, smeared, "dreamy" garbage is what movies always used to look like, when in reality that couldn't be further from the truth. It's blatant revisionist history.
I still don't use motion smoothing on my TV, but I don't call people weird for using it, and I can completely understand it.
What needs to happen is that 24 fps needs to die, but as long as the old guard is in place making these decisions, it won't happen, unfortunately.
@@vdentertaiment4088 Are you saying that switching to three-blade shutter makes the picture worse? What? It was introduced to combat the flicker - flashing each frame three times helps get over the flicker fusion threshold, the motion gets smoother.
@@RockinEnabled In the early years of cinema, 24 fps movies were presented on projectors with single-blade 24 Hz shutters. The image had very visible flicker, but the motion was ultra smooth and clear, and there was no judder.
Film projectors then started to use double/triple-blade shutters to raise the refresh rate to 48/72 Hz. That cut the flicker down dramatically, but the motion became a blurry mess as a result, just like on modern displays in your home. 24 Hz is just too blurry.
That "blurry, dreamy" look Vincent mentioned was not an artistic decision at all; 24 fps displayed on a proper single-flash 24 Hz display would have no motion blur. It would be very flickery, but it would not have much judder or motion blur.
TV manufacturers should offer 24 Hz black frame insertion to display 24 Hz content properly on modern displays, but I suspect the flicker would give many people a headache. BFI is only offered at 60 Hz or 120 Hz.
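For anyone wondering why 24 fps judders on a plain 60 Hz panel in the first place: 60 is not a multiple of 24, so 3:2 pulldown holds each film frame for alternately 3 and 2 display refreshes. A small sketch (assuming a simple sample-and-hold display with no dedicated 24 Hz mode):

```python
# 3:2 pulldown: showing 24 fps film on a 60 Hz display holds each film
# frame for alternately 3 and 2 refreshes (one refresh = 1/60 s), so
# on-screen frame durations alternate ~50 ms / ~33 ms instead of a
# uniform ~41.7 ms. That uneven cadence is the judder.
REFRESH = 1 / 60  # seconds per display refresh

def pulldown_durations(num_film_frames: int) -> list[float]:
    holds = [3 if i % 2 == 0 else 2 for i in range(num_film_frames)]
    return [h * REFRESH for h in holds]

durations = pulldown_durations(24)   # one second of film
print(sum(durations))                # still averages out to 1.0 s total
print(durations[0], durations[1])    # but 0.05 s vs ~0.033 s per frame
```

Displays with a proper 24p mode (e.g. a 120 Hz panel flashing each frame 5 times, which is what settings like LG's "Real Cinema" enable) keep the cadence even, removing this judder without any interpolation.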
@@vdentertaiment4088 so, in the end, we get the judder, which is much better than flicker. I've been to a showing which was projected from film, and it was flickery to the point I had to just either leave or get used to it. I somewhat tried not to notice it. But it didn't look good. Next time I'll be at a showing from film, I'll pay attention to judder.
@@RockinEnabled I don't experience judder; as I said, it is stutter, which isn't the same thing. The TV itself, via the "Real Cinema" function, can display content without judder.
Weirdly enough, you have to increase the de-judder function to get rid of the stuttering.
To really understand what I'm talking about, I suggest you take a look at the video "Judder on TVs Explained (Motion 5/5) - Rtings".
Without any motion smoothing I can't watch movies; I get strong stuttering, which isn't what you see in a movie theater, so it isn't supposed to look like that. That's not what the creators intended.
De-judder at 2 is the bare minimum I set to get an OK experience without the soap opera effect, BUT stuttering still occurs. To get rid of all stutter I have to set it to 7, which introduces a lot of artifacting and soap opera effect, so as a compromise I keep the setting between those two. It's the setting I change most often on the TV; I try to get used to lower settings (and stick to 2), but I can't. I'm too sensitive to stuttering.
Thank you for saving me the trouble of making this video... the biggest shame is that with new TVs, you need to spend half an hour fixing all the damn settings back to a neutral viewing experience.... I might add that finding the mid-point in the colour temperature might be preferred by most people rather than going full warmth... good video.
That's it exactly... neither cold nor warm. Totally neutral!
I expected you to talk about the adjust-contrast setting... it's the biggest factor on my U6K. It's unbelievable how much more natural the picture looks when it's turned off.
I like the cool version of the snow, to me the warm setting has a yellowish tint and is dark. Something about brightness = perception of higher quality to me. I am not a videophile. I just judge with what my eye tells me and either like it or dont. To a certain extent it may come down to personal preference. It may not be the director's intent, but I only care what looks best to me.
90 percent of people in this world love a cool colour temperature; only 10 percent love the old-cinema warm colour. But this is 2024, not 2004.
I've learned not to say anything when at people's homes when I notice a bad picture on their television. Most people think it comes from the factory with the settings set to optimal. On several occasions I think I may have offended people when I suggest they adjust their picture. To this day I still go to some peoples homes and they have the picture set to zoom on an HD channel. Just this past weekend I visited my sister's home where they have an enormous 80 inch 4k UHDTV. The first thing I noticed was how dark the picture was and the strong blue tint. To me the picture was unwatchable but I certainly didn't say that. I mentioned that maybe they should try the warm picture setting. Pretty sure that pissed off my brother in law. When I mentioned they should try streaming some 4k content from Netflix or get a 4k UHDTV bluray player to really appreciate their new tv, I could feel a chill in the room. My brother stopped by my house and noticed I had LOR 4k steelbook. He asked if he could borrow it and I asked if he had a 4k bluray player. "Fine, I don't want to watch the stupid movie anyway" was his reply. I'm learning to just keep my mouth shut.
Same here 😀
I just adjust my friends' and family's TV sets without asking when they don't notice. Nobody has complained yet.
Just turn the colour to 0 and have a nice Black and White colour
The one that gets me is the "dynamic contrast". We were watching Daredevil on Netflix and couldn't see a thing so often we just criticised the maker. Finally I went into my TV settings, and stumbled across DC - suddenly we could see everything. I ended up turning most things off because I realise they were just artificially trying to tidy stuff up.
Meanwhile, one set of my parents is watching TV with blue-skinned people, and the other is watching video where white has a pink tint, and I can't fix either without them complaining. Drives me insane.
Tip 1 literally saved my HDR... I always thought my TV just had a bad implementation of HDR, but it was actually just my sharpness setting cranked up to maximum. I have a Samsung Q6FN, btw. Thanks for that tip!
Your knowledge combined with your good sense of humor makes this channel so amazing to watch 😂
I am not sure I agree with the colour temperature on your example from LOTR. In my opinion the snow with low colour temperature looks more realistic if it is portraying high mountain, freshly fallen snow. The warmer colour looks more like old snow, perhaps even tainted by city pollution. I am a Norwegian, by the way, so I have seen my share of different types of snow.
I would describe it as watching with sunglasses on
Interesting. On #5 Color Temperature, the snow you said looked more realistic in the video on the right actually looked yellow and fake to me, while the snow in the video on the left looked pristine and white, I wasn't seeing any blue hue there at all.
The video on the left looks bluish to my eyes. It's because your eyes are accustomed to a bluish look. And light from the sun looks a little bit yellow, so the video on the right is more accurate and more realistic.
I agree. I'd shoot for somewhere in the middle of those two, maybe slightly closer to the right screen because the faces looked more realistic, but the snow was yellow. So maybe Warm 30 or so is a good middle ground vs. going all the way to 50.
For me too. To my eyes it looks much better than Warm 50.
The right picture isn't white.
True, this man obviously has never seen snow.
The camera exaggerates it. Cool hue looks horrendous IRL
All this stuff is great advice and a good resource to show people who don't get why I want to play around with their TV settings, except for motion smoothing. I know it's not perfect and can even destroy certain animated shows, but 24 FPS just makes me feel motion sick and gives me a headache.
Same here. In fast-paced scenes I also stop perceiving the pictures as movement and see them as individual pictures, because they vary too much from one another at 24 fps.
It's not 'great advice'. The whole point of options/customisation is personal preference. If I want sharpening (essential for SD content), I'll sharpen. If I want smooth motion (which I do), then I'll not take advice from a 24fps diehard.
@@--legion I don't like 24fps, I much prefer smother motion even if it leaves visual artifacts, I didn't know where you got the "24fps diehard" from, even in my original message it says I get motion sick from it
@@KaelumKrispr I'm referring to the poster of this video and others like him. Those that refuse to accept 60fps is nearer human optics and therefore more natural, preferring instead to cling to the stuttering unnaturalness of 24fps. This 'advice' is defective.
@@--legion Still, it's only one point (his personal preference, which he says is controversial) in a fine video.
5:49 Love how the frame rate temporarily increases to 60 fps to demonstrate what Vincent is talking about. I actually prefer non-movie videos at higher frame rates. Have you thought about changing for future videos?
As a long-time viewer of Mr. Teoh's videos, I've already enabled all these correct settings on my TV, and the result speaks for itself: the picture is jaw-dropping from every source I watch. Amazing.
I say this every time, Vincent you are really the best in the business! Wish you a very happy Christmas :)
I struggle with switching my color temp all the time. White looks so much more natural being colder. But I do like the way colors look being warmer.
I have a C1 and one of the reasons I prefer motion smoothing is to reduce judder. How would you suggest removing judder while also avoiding said motion smoothing issue?
Turn off tru motion and turn on real Cinema. It multiplies existing frames rather than inserting frames that never existed.
02:15 For those with older Samsung smart TVs, the sharpness range is 0-100, not 0-20 like the newer ones. In that case neutral sharpness is apparently 20/100, not 0. If you have a newer one, then it's 0.
02:48 Noise reduction seems to be called Digital Clean View on the older Samsung smart TVs.
04:51 Motion smoothing, aka Auto Motion Plus.
08:45 Warm 50 on LG; Warm 2 for Samsung.
I respectfully have to disagree on the motion smoothing and color temperature. What you said is valid but anything less than 60 fps especially on a big screen gives me a headache. For color temp I find the middle setting a good compromise. White looks more natural, cold or warm are too extreme on the yellow and blue spectrum.
I'm guessing you can't go to the movie theater then?
Same for me. If it's 60hz without motion handling, I can see the judder, and it really throws me off. My current TV is 120hz with a 480 motion rate and I love it. If I turn off motion smoothing, then the judder just gets real bad and I can't stand it. I also never see the "soap opera" effect, maybe because I've gotten used to the higher motion?
In short: for digital sources, where signal degradation behaves very differently from analog degradation, software "enhancements", while well-meaning on analog, have no positive effect. Also, color reproduction is a crazy, insanely deep topic which, apart from using calibration software, goes mostly by eye.
Motion smoothing is a double edged sword to me - it makes slow moving/nearly still images look better and fast moving images look worse, I normally run it on a small value. Same with the color - I just like the look of cooler colors than what is intended. My TV is for my viewing experience and some settings I just prefer whether it is "supposed" to be like that or not.
But I've been playing games my whole life, so I think my eyes are just adjusted to higher frame rates. In fact, I used to think "man, why do soap operas look so good", haha. I'd bet the ratio of time you spend playing PC games vs. watching film would be a decent indicator of whether you like the soap opera effect or not.
This is where I'm at. I think any argument of "24 fps looks better because it allows your brain to interpolate it" is laughable; it's just grasping for an excuse.
We like 24fps because that's what we are used to, and that's what cinema looks like.
If we all grew up with cinema at 60fps, would we look at 24fps and say "yes, this is better"? I really doubt it.
I like 24 fps for the most part... but rapid action, and black scrolling in front of white, looks worse at the lower frame rate than at higher ones.
@@rattslayer Another problem with 24fps on TV is that TV screen is always lit. Movies are presented in short bursts of light interposed with darkness when the film roll moves to the next frame. Some TVs offer black frame insertion, but it's far from the real thing. Modern TVs aren't bright and fast enough to truly imitate cinema feel.
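The sample-and-hold effect being described can be put in rough numbers: when your eye tracks a panning object, the image smears across your retina for as long as each frame is held on screen. A back-of-the-envelope sketch (this linear model is a simplification; real perception also depends on pixel response time and, with BFI, on the duty cycle of the black frames):

```python
# Sample-and-hold blur: perceived smear while eye-tracking a pan is
# roughly pan speed (pixels/second) times the frame hold time (seconds).
def perceived_smear(pan_speed_px_s: float, fps: float) -> float:
    return pan_speed_px_s / fps  # hold time = 1/fps on sample-and-hold

print(perceived_smear(1000, 24))   # ~41.7 px of smear at 24 fps
print(perceived_smear(1000, 120))  # ~8.3 px at 120 fps
```

This is why both higher frame rates and black frame insertion (which shortens the effective hold time) reduce the blur, and why a flashed cinema projector looks different from a TV holding each frame for its full duration.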
I agree with you guys and I add:
-Whether or not a movie is supposed to look a certain way, it seems like filmmakers expect you to have the most expensive setup. Movies are sometimes too dark for me, to the point where they're not enjoyable anymore (yes, I don't have HDR or OLED or a dark room).
-It's ironic how tv makers crank the motion thing while game makers themselves almost always bias towards visuals instead of fps.
-As a gamer, I feel the soap opera effect at 60 fps in movies, but only recently I watched something at 120 fps (phone's 120hz display) and it looked fantastic! It was like I was there. No soap opera feeling. Is it just me?
@@fernosan I know you can see more than 60 fps, but it depends on the person. I have a monitor that does 144, and when I first bought it I tested it with GTAV to keep increasing the graphics incrementally to see what framerate I thought was best and I realized around 90 fps is the top for me. I can watch something at 85 vs 90 and tell which one is which, but above 90 it all looks the same. Similarly I think 4k is only marginally better than 1080, so I think 8k will be useless to me. These displays are getting to the point that every person's personal optical capabilities are starting to hold things back in some aspects.
I struggled with gaming on my Samsung tv default settings cuz it made my eyes hurt a lot. I then changed the settings to make it warmer, look natural, and also lighten the dark parts (default settings makes shadows look too dark and over sharpens the lighter areas). Some days later, it started to recognize my PC and unlock a ‘graphics picture mode’. That’s what I use automatically now. I love it
It's important to note that Sony's Reality Creation feature is separate from the Sharpness setting, and I swear it's worth leaving on Auto (it's not overly aggressive, and it really, really helps upscaling). I thought I would never manually set it past 20. It uses an object-based sampling method to enhance detail, not just contrast. Otherwise, I'm completely on Vinny's side here.
As an aside, I used to have TVs on my sales floor that had such awful, awful motion smoothing that it would actually cause the film grain to smear across the screen when things were in motion, because it was applying interpolation to the grains. It was nuts. It took way longer than it should have to trouble-shoot that issue, because I had initially pegged it as a noise/sharpness issue.
Reality Creation on auto is great for cleaning up Nintendo Switch - especially the 720p home screen.
The LG OLED that I had before my A9G had a similar setting, but even on high I saw no difference.
Reality Creation is intelligent sharpening and it's awesome. I leave it on auto for most stuff, like PS4 games and such, but I do bump it up significantly when playing older PS2 games, as it hardly causes any sort of ringing artifacts. It feels like it's actually undoing some of the blurry upscaling and getting closer to the scaling looking like it was integer scaled. 50 is good for most PS2 games (and also some DVDs), but I go even up to 75 for other softer looking games.
For Switch games and PS3 games it also works great, but I think Switch benefits even more due to some of the odd HD resolutions and dynamic resolution scaling used in games. Most games don't need extreme values, but I actually found Snake Pass looking soft enough that I could safely push Reality Creation to the max without causing ringing in gameplay elements (some does become visible on HUD), and even then the effect is still subtle but looks reasonably clean and sharp.
A really great tool.
It can help try and salvage a shit situation like a 720p source or something, but in general if you're starting with a good clean image then it's not necessary and doing more harm than good.
I 100% agree! Reality Creation is a fantastic feature! 👍
Right?! Lots of reviewers say Sony has a winner in their new upscaling chip, yet they're the same ones that 'recommend' to turn it off lol.
Man this guy is hilarious! His costume and everything hahahah too good! I love all the hardwork you put in and genuinely enjoy your reviews, comparisons and guides!!!
Vincent, I noticed when watching Dolby Vision content that sharpness automatically jumps to 20. Since dolby vision is supposed to "dynamically adjust for each scene", I was wondering if this should also be set to 0 when viewing dolby vision? I have Lg C1, Thanks
Sometimes sharpness jumps to 50
On my TCL 6-Series R646 Google TV, it's not allowing me to change almost any of these settings. I see this consistently on streaming services such as Disney+.
I noticed this as well. I recommend turning it down to 0. From what I've learned, the reason the LG C1 commonly defaults to 10 or 20 sharpness (Even in game mode where all settings should be disabled/neutral) is because of picture settings like super resolution, noise reduction, etc. With these settings enabled the tv needs slight sharpness correction to try and make the image look cleaner. If you are going to turn all of these settings off like Vincent suggests, then sharpness should also be set to 0.
@@lilbabypimp3232 I second this. As a C1 and G1 owner, 0 really is the way to go. I've never gone higher than 10, and even at 10 I could personally spot some posterisation/ghosting, as well as scenes seeming *fuzzy*.
OLEDs (especially the C/G and Master Series, alongside the Panasonics) are amazingly sharp, BUT given that Sony's default sharpness is 50, it makes me question whether they just appear sharper. Who knows... anyway.
TL;DR: I second this comment, and I have worked at Richer Sounds for nearly 17 years.
I thought for the LG C3 OLED it's good to set motion smoothing to cinema motion, because otherwise 24 fps content on the 60 or 120 Hz panel stutters a little.
Me: "I don't know-- that warm looks *too* warm."
Also me: looks at the clock and realizes that my computer is in night mode.
I find it impossible to watch 24p content without a tiny bit of smoothing on OLED; it literally looks like a slideshow.
I feel the same on my 240 Hz LCD Bravia... 24 fps is unwatchable with all smoothing off...
It's likely that your screen is lacking a 24 Hz mode, so you get noticeable stutter. I had the same issue with my old TV, but my new Hisense knows to switch to 24 Hz when watching a movie, and it looks much better.
Vincent, I've been reading and watching your reviews for over 10 years, and want to thank you for the great content. Would love to know which colorimeter or other device for screen calibration you recommend for home use.
You saved me my LG C2 was set to Just Scan Auto so as per your instructions I changed Just Scan to On and it zoomed back out how it should be. I thought having my aspect ratio set to original would take care of "zooming in" picture but nope I also needed to change Just Scan to On and now the aspect ratio really is proper. Thank you so much! I thought I had it all figured out on my C2 but this is a nice surprise thanks again. I like SOE a lot so no convincing me to change that even if its a more accurate picture I just like it.
Motion smoothing, or interpolation, always bothered me as an anime fan. I have a friend who always watched his stuff at 60 fps and got so used to it that he couldn't watch anything without it anymore. But especially in anime, where low-movement scenes are animated at only 3-15 fps, the program or TV has to invent so many new frames that artifacts and smearing appear constantly; it's inevitable. Also, in animation, and especially anime, directors and animators actually use smoothness creatively for certain effects, like emphasizing impact. For an action scene they want to stand out, or when they want a time-slowing effect, they increase the amount of detail and the number of frames drawn per second for additional impact or to create a smoother, eerie feeling. Very talented animators also use the timing of frames to add weight to motion or to convey a very swift action on screen. All those effects are completely lost with constant artificial 60 fps; I feel like it's almost disrespectful to the artist. A bit like how you said noise reduction destroys graininess intended by the director, just even worse.
Animation also relies much more heavily on the brain filling in the gaps, especially 2D animation, so this topic is even more important in that context. There is a reason why no doll-like 3D Pixar/Disney character with tons of precise nuances in facial expression will ever feel as human and believable as a much simpler (technically) anime character in a well-told story. Pixar's characters slap your brain with a 100% final representation, so there is nothing to fill in. You literally see a big elastic doll with complex physically based shading, not a true human. Meanwhile, in anime, even an absurdly big-eyed design and a poor art style don't prevent your brain from accepting the character and reinterpreting it as a possible person.
However, there are examples of some shots in anime where 60fps interpolation can achieve quite amazing results. It's usually horrible, but the fact that some motions can look beautiful with it means animators should embrace this possibility themselves and try to use it sometimes for high-budget titles, which would mean mastering at more than 24fps to have a wider range of possibilities.
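For context on why naive interpolation smears cel animation, here is a minimal sketch of the simplest form of frame interpolation: a linear blend between two frames. Real TVs use motion-compensated interpolation, which is far more sophisticated, but a plain blend like this is exactly what produces ghosting when the motion estimator gives up on low-framerate animation.

```python
# Minimal sketch: linear blend between two frames (each a flat list of pixel values).
# This is NOT what a real TV does end-to-end; it's the fallback behaviour that
# causes the ghosting/smearing described in the comments above.

def blend(frame_a, frame_b, t):
    """Return the intermediate frame at position t in [0, 1] between two frames."""
    return [round(a * (1 - t) + b * t) for a, b in zip(frame_a, frame_b)]

frame1 = [0, 100, 200]
frame2 = [100, 100, 0]
print(blend(frame1, frame2, 0.5))  # → [50, 100, 100]
```

Note that the midpoint frame is a double exposure of both inputs rather than an object at an in-between position; that is the ghost.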
I watch anime and play video games with motion interpolation on my Panasonic plasma TV. Plasmas especially suffer if you don't use it: you get a lot of judder and green ghosting with 24/25/30fps material when it is off, and buttery-smooth "effective 300Hz" when it is on.
It is smart enough to not touch low framerate cel animation, but panning backgrounds and CG animations become glorious, smooth and sharp.
IDK about other TVs, but in mine the interpolation is clever enough to improve anime without creating too many artifacts.
If an anime is drawn at a high enough framerate, it also gets interpolated with adequate quality.
In games you obviously get lag, and it breaks down on unstable framerates; the effect is minimal if the game is already 60fps, though still noticeable on camera pans. So in fighting games, STGs or any PvP games I turn it off.
Motion Smoothing is like a drug. Once you get used to it, you cannot stop using it. As an old-school movie watcher, I of course NEVER use this feature.
Into the Spider-Verse also famously used different framerates throughout the movie to show, e.g., Miles' confidence.
@@noop9k Yep, Panasonic and Sony are ahead on interpolation. I also have an old (nearly 10-year-old) 42-inch LCD TV in my living room and the interpolation is always on point, especially on streamed video when the compression kicks in.
On the other hand, my newer (3-4 years old) LG LCD seems to struggle with interpolation...
Newer hardware, worse performance... Better on the multimedia side, but who cares when the image is laggy and unbalanced...
I use a 43" Samsung UN43NU6900 as a general-purpose computer monitor. It is set to 10 (on a 0-20 scale) and it looks good. If I set it to 0 as you suggest, the text gets very blurry and faint. If I set it to 20, I get the edge artifacts you're talking about. I don't think it should be set to 0, though.
Each TV is different. The suggested settings are for the most current and expensive models, I think. I have an old Panasonic plasma TV (a lot older than yours) that was cheap, and on that TV the most correct sharpness setting is 6/10 (I've tested it).
8:40... Actually, I like the cold temp better. It looks more realistic. The snow in the warm setting is yellowish and everyone looks like they have jaundice. To be perfectly honest, I think the best setting would be somewhere in between these two, as I don't actually like either. But the cold temp is better to my eyes.
For color temp I usually set it to neutral; it seems like a good balance between warm and cold.
What I hate most is the color temperature setting, especially combined with most manufacturers' habit of not calibrating the colors and gamma of their TVs at the factory. You never know if the colors you're seeing are right unless you have a colorimeter and know how to properly measure them. And I also hate when some PQ settings are on by default, and even more when they cannot be fully disabled (Samsung?).
Samsung is, or was, awful about this. There is no setting to adjust dynamic dimming. It's super aggressive: subtitles will turn indoor scenes black, dark scenes with a window of light go black, and title screens on black look excessively dim. You have to go into the service menu to adjust the range and disable dynamic dimming.
Never been a fan of the warm colour temps. I know it's the recommended setting, but to my eyes I prefer the normal or standard temps, then calibrate from there. Colour: 50, Colour Tone: Standard, and Colour Space: Auto on my Samsung looks pretty good to me.
D65 is the standard, no matter what you're a fan of.
@Michael Weizenfeld
Advocating something simply because it is a “standard” is a terrible argument.
Everything in the history of the world was a standard until it wasn't anymore.
The Earth being flat was the standard until it was round.
@@MichaelWeizenfeld No one said it wasn't a standard. You're discussing preference
@@jacobmarley2417 If you don't have a monitor calibrator and don't know the theory behind it, then you're more like a flat-earther, claiming something based on your feelings.
@@vincentjacobsmm I am not discussing preferences. There are standards for display devices under which they correctly convey the image that was intended by the author. And a person who has no idea what he is doing should not mess with them.
This is an excellent presentation, thank you. One note, I set each of these OFF as you went through them on my LG, but at the end when I selected Filmmaker Mode, the Sharpness reset to 10%. (?). I went ahead and reset it to zero, otherwise all the other settings went to your suggestions. FYI, on my LG I believe the motion smoothing is called “TruMotion” which I verified as OFF, and the new Color option for Filmmaker mode is “Warm 2” which is a 50% color. Thanks for the tips, and I will check your other videos as I’m getting a new LG C1 soon.
I'm wondering about the sharpness level as well.
Color temperature should be set to the D65 white point. Sometimes you can stray from it for a different look, but D65 is neutral and natural.
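Since D65 keeps coming up in this thread: it is the CIE standard daylight white point that video standards target. A short sketch below converts its standard chromaticity coordinates (x = 0.3127, y = 0.3290) into XYZ tristimulus values with luminance normalized to 1, which is the form calibration software typically works in.

```python
# D65 white point: CIE xy chromaticity -> XYZ tristimulus (Y normalized to 1).
# The xy values are the standard CIE D65 coordinates used by Rec.709/sRGB.

def xy_to_XYZ(x, y, Y=1.0):
    """Convert CIE xy chromaticity plus luminance Y to XYZ."""
    X = (x / y) * Y
    Z = ((1 - x - y) / y) * Y
    return X, Y, Z

d65_x, d65_y = 0.3127, 0.3290  # CIE standard illuminant D65
X, Y, Z = xy_to_XYZ(d65_x, d65_y)
print(round(X, 4), Y, round(Z, 4))  # close to the commonly quoted (0.9505, 1, 1.0891)
```

A colorimeter measures how far your TV's white sits from these target values; "warm" presets usually land closest to them, which is why calibrators recommend them despite the initial yellowish impression.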
I don't like artificial motion smoothing but I do hate that 24fps is still used in cinema, even when a movie is shot 100% digital. Pans and sideways motion look awfully juddery.
Here is the thing:
Having sharpness at a low level is beneficial on my ultrawide monitor and does not hurt image quality.
Monitors behave differently from TVs, I've found, which makes sense given the very different use cases.
It's very difficult for me to stop using motion smoothing. Without it, I get some stutter from every background in a movie, especially when the camera is travelling. And the "True Cinema" option on my LG often isn't available... I wonder how you can improve this without motion smoothing. If someone has an answer, I'll take it!!!
True Cinema only helps with judder, which looks like the panning skipping back every few frames. It's caused by showing some frames 3 times and others 2 times to fit 24fps into 60Hz. True Cinema just makes the screen run at 120Hz so 24 divides evenly. Unfortunately, there is nothing you can do: 24fps just plain looks bad on panning shots and fast action. Why they don't film panning shots at 60fps still confuses me.
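The 3:2 cadence described above can be sketched in a few lines: each film frame must be held for a whole number of panel refreshes, so on a 60Hz panel the hold times alternate between 2 and 3 refreshes (uneven frame durations = judder), while on a 120Hz panel every frame gets exactly 5 refreshes.

```python
# Sketch: how many refreshes each 24fps film frame is held for on a given panel.
# Uses floor on the cumulative refresh count so the cadence is deterministic.

def cadence(fps, refresh_hz, n_frames):
    """Return the number of refreshes each of the first n_frames is displayed for."""
    counts = []
    shown = 0
    for frame in range(1, n_frames + 1):
        total = int(frame * refresh_hz / fps)  # refreshes elapsed by end of this frame
        counts.append(total - shown)
        shown = total
    return counts

print(cadence(24, 60, 4))   # → [2, 3, 2, 3]  uneven hold times = pulldown judder
print(cadence(24, 120, 4))  # → [5, 5, 5, 5]  even hold times = smooth 24p playback
```

This is exactly why a 120Hz mode (or a native 24Hz mode) removes pulldown judder but cannot fix the underlying 24fps sample rate on fast pans.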
@@silverwatchdog I heard that James Cameron might help us with Avatar 2: HFR becomes VFR, Variable Frame Rate.
Finally!! I hope it will work for the majority of viewers!!
I followed these settings and the difference was huge. The factory settings in my Samsung S92C were awful. Like having a Porsche limited to 80 km/h. Thank you so much.