As you said early on in the vid, the biggest thing that bums me out as a gamer is the half-resolution downgrade still at 144Hz. Everyone claimed it was 4K 144Hz last year until it was tested countless times with various high-end GPUs and it still showed it was running at 1440p. I can deal with 2 ports, but please can we get true 4K 144Hz now that GPUs can push frame rates that high? You did a great job at CES, but this was one of the top stories coming into 2023; how was this not on your top 5 list to vet while you were at the show? Keep up the good work.
Caleb, in my world you rock. I don't appreciate the idea of having to buy a stereo setup to get all the 2.1 ports I need for a less expensive TV. This definitely affects what I will buy. Thank you Caleb for all that you do for us!
I probably use way more HDMI ports than average. I primarily plug stuff into a VIZIO 50" TV that we got a few years ago on sale. It has 3 HDMI ports, but I have extenders that I use, too. The extenders/splitters are probably quite cheap and probably won't handle 4K at a high refresh rate. I just use it for the extra ports, as those are significantly more valuable to me than picture quality. As of right now, I have a Nintendo Switch, Roku, Wii U, two Xbox 360 Slims, DVD player, screenless laptop, and another computer that varies sometimes. Even with all the switches and ports, I still have to swap the cables very frequently and it's always an inconvenience either way. I just wish the TVs had 20 extra HDMI ports. That's only HDMI, though, when I also use so many other ports and devices, too. This is all very crazy to me.
With the other TV brands not offering 4 HDMI 2.1 ports, it makes me wonder if there was something in the licensing of the MediaTek chip that financially discouraged those brands from offering four. A plausible scenario: MediaTek basing its licensing royalties on the number of HDMI 2.1 ports on the TV, charging more for TVs with four ports instead of two? Just a thought.
Wondered if I would need more than the 2 that my Sony A80J has but since i went to a receiver and speaker system the receiver can switch any extra 2.1 ports I need while using the eARC port so in the end it is not that big a deal.
It isn't if you have a receiver like that, or if you spend a WHOPPING $25 US dollars on a simple HDMI 2.1 8K/4K 120Hz switch, allowing 3 full ports' worth of usage. This is making a hen out of a feather. Of the people complaining, 95% have one console and a soundbar. What would be worse would be no 120Hz ports at all.
The day Vincent dropped that video I went straight to Best Buy and bought the G2. I'd been holding out for 4 full 2.1 ports and was willing to pay out the wazoo for the 2023 Sony. Nobody should be paying that much for a TV and be restricted in HDMI ports, and unless it's a home theater, most consumers aren't using a receiver in their bedroom.
$25 HDMI switch solves the problem. So you jumped the gun. The MediaTek SoC still offers a lot of performance enhancements; it wasn't just about HDMI ports.
@@loki76 for you maybe. I'm already using a splitter, but for one console and a DVD player. I'm not trying to run 2 consoles on a splitter and have to bounce back n forth between picture settings and all that extra shit. I got to look at the A95K and G2, and between no added 2.1 ports, the lack of certain gaming features and other minor differences, the change in picture quality wasn't enough for me to justify an extra thousand bucks lol
@@loki76 I never said my DVD player needs 4K/120. Only mistake here is you not understanding my comment. I'm not splitting 2 consoles on one 48Gbps port and having to constantly tweak picture settings every time like I would on the Sony, since I also use eARC for audio. Luckily for me the LG has 4 2.1s, so I can keep one HDMI input's Game Optimizer and Filmmaker Mode untouched and bounce back n forth between the Xbox and the 4K UHD player.
It doesn't really bother me for two reasons. I have a Denon receiver with three HDMI 2.1 ports. Also, I just purchased a 65" A80J OLED in 2022 which I'm very happy with. It's not like I plan to run out and buy a shiny new 2023 OLED. 🙂
I love my C2 as well! Was definitely a great purchase and it will last us a long time. It may be a while until 8k is mainstream but when that time comes hopefully the technology is worth it and accessible
@@Bdot888 8K seems pointless unless you're at 85 inches or greater. Even then you need to get to 100+ inches for the difference to really stand out. I don't think 8K is needed at all. 12-bit color, though, that would be noticeable on any display and definitely worth an upgrade.
@@Gichie79 yeah i dont think 8k will be popular for another 5-10 years if that. As of now there is little to no 8k content to enjoy. Only thing i can think of is that 8k upscalers will become better. But as of now, the price for 8k is not worth it and is hard to differentiate from 4k
This just means I bought the best tv! That will be good to use many years down the road if it doesn’t break (lg oled 65c9). Loving it, best tv ever. Software support is great, does everything that next gen gaming needs, perfect for me.
I plan to buy a TV this year and use it for 5+ years. I am okay with two HDMI 2.1 ports; however, for the sake of future-proofing my TV I will buy the Samsung S95C and get 4 HDMI just for the sake of it.
Caleb, Could they be trying to push the companies that want 4 ports to the Pentonic 2000? That one still says 4 HDMI 2.1 ports, and I'm assuming since they have to support 8K they will be full bandwidth.
That’s honestly what I’m thinking as well, but I can only see that SoC being used in very large TV sets right now (I would guess 75” being the minimum), if at all. We may see it in next year’s lineup.
I am strongly considering buying an A95K on discount this fall, but will eventually want more than the two HDMI 2.1 ports. Rather than buy a Samsung or LG TV (or wait *another* year for Sony/MediaTek to provide 4 ports), can I use an HDMI 2.1 switch/splitter? Is it really that simple? If so, can you recommend any that are fully standards compliant? I have an Xbox Series X and a PC, and would like some headroom for the future. Short of a product recommendation (although I'd gladly accept one), what are the standards a switch needs to replicate HDMI 2.1? Sorry if this is a noob question, but I want to be sure I'm buying the right switch (as HDMI 2.1 seems much more complicated than 2.0 and earlier). I see this question addressed frequently in other forums, but I've missed clear recommendations from reputable sources. Thanks!
Massively frustrating for those waiting for a 2023! I’m still thrilled with my G2, and will likely use it for YEARS (we still have our Panasonic g10 plasma we purchased in 2009). Glad I future-proofed!
Judging from what’s been happening ‘out on the street’ it sounds as though the HDMI 2.1 movement has turned into a ……. “dumpster fire.” The bigger issue (that I see) is the lack of 8k content in the first place. I’m running 4K and quite a lot of content is still not 4K.
Glad I got the Sony A95K 65" on holiday sale and didn't wait for anything. Really happy with my new TV and I hope to not need replacing it for a decade at least lol.
Thanks for the update. In my opinion, it won't be until the 2nd quarter of 2025 until we could possibly have all the tvs with HDMI 2.1 with all the features it can give. Next year, i plan on buying a new 75-inch Samsung, tho hopefully 8k if prices have gone down enough by then, but if not, it's probably just a high-end 4K TV. I love your videos, tho been watching them for Years!
Wow, thanks for this update!! So I really can't afford a high end receiver to connect my devices to utilize all the 2.1 features. Sounds like I'll have to choose which devices to connect to the limited 2.1 ports😤🤦🏽♂️ I feel I'll be paying more for a TV that offers less! 🤬 Not cool MediaTek
I’ve owned the LG Signature G7, Wallpaper 8, Wallpaper 9 and now the Gallery 2, and these all had full-bandwidth HDMI 2.1 with feature-rich support. Yes, they are all at the top end of LG’s offerings, but it’s surprising that after 5 years other manufacturers, even heavyweights like Sony, still lag in this department.
PC monitors have been lying about the HDMI 2.1 spec for a while now. It's why channels like Hardware Unboxed check a full range of features in every monitor review to see if they are REALLY a HDMI 2.1 monitor.
This is outrageous because it's not enough for my setup, which I think is the best for gaming: a PC with HDMI 2.1, a PS5 for exclusives, and a soundbar with eARC. It's 2023 and you can't get all the necessities in one premium TV: 4x full-bandwidth HDMI ports (only LG and Samsung), QD-OLED (only Samsung and Sony) and Dolby Vision (only LG and Sony). If only Sony would give us 4x HDMI ports (buying chips from LG or Samsung), a 77" QD-OLED panel from Samsung and Dolby Vision support...
Just bought the LG C2 in October 2022. It was a toss-up between it and the Sony A80K, and I decided on the LG instead, mainly because of the availability of 4 HDMI 2.1 ports on the LG. It's baffling that we talk about the "Sony premium" and yet they only have 2 HDMI 2.1 ports. I wouldn't be surprised if Sony TVs had 4 HDMI 2.1 ports and their sales actually ticked up.
Hi, with this HDMI 2.1 I have noticed that when watching Apple TV (latest version) through my Yamaha AV3080 into the Samsung QN95A 75", it loses signal when you finish a program and look for something else to watch. Have you ever come across this before, or do you know why it does it? I get around it by changing from the Apple TV to another port on the Yamaha and then back to the Apple TV again. Back in the days of SCART and composite you had nothing like this, and I feel that HDMI has not been doing what the manufacturers say it can do since, say, 1.4. With the bandwidth getting ever higher, surely you'd be better off with something like an Ethernet port rather than an HDMI port? I love your videos, would be interested in your thoughts, and wait to hear from you.
I've used video switching HDMI receivers since the early 2000s. Never needed to use more than a single input on my living room display. Number of ports doesn't seem like an issue at all to me. But I'm with you on the meaningless HDMI spec. They went the same stupid 'screw the consumer' path as USB. At this point, saying 'HDMI 2.1' is just as meaningless as 'USB 4'.
I always wondered why there are only 4 HDMI inputs on TVs. Is it so expensive to offer more, especially on high-priced gear? I use all 4 inputs and I am not even a gamer.
I was going to possibly get a Sony, coming from my 77” C1 OLED, but the 2.1 port limitation is beyond a deal breaker, so G3 it is. Why have all this updated tech if it’s all restricted by the bandwidth of your ports anyway? It’s complete ridiculousness. 😡
The saving grace of this situation is the consistently clear labeling of physical ports on TVs as “4K60” or “4K120”. You can look at the product photos and see what you’re getting, no need to worry about HDMI specs.
I just purchased a Sony A80K and am very happy with it; the picture quality is awesome at 4K and beyond when using my PS5, which is the reason I decided on it. I too am not happy with the games being played, but at least I have my 2 HDMI 2.1 ports.
HDMI realised that having all these qualifiers to make it 2.1 was really bad for their name, since so many companies had issues with it and could do some, but not all, features. Now we just have it as 4K = HDMI 2.1. HDMI also needs to tell us when their contract with all the TV makers ends. Many would love to have DisplayPort, but we'll never see it since HDMI has a secret contract with all TV brands. Nobody is giving away info on when this contract ends, though.
Having worked in the repair of consumer electronics since the 70's, all of these false specifications and the hype that goes with them mean nothing until you have a final production sample on the test bench, where you can confirm exactly what the true working specs are, not those on a web site or handed out at shows like CES. Manufacturers rely far too much on the promised specs of new panels or chips that never make it to the finished product. MediaTek wants to sell as many of those SoCs as possible, so they publish a spec that will make it seem like just what we wanted, and then when the truth comes out we all feel cheated. This is why I only buy new products after I have tested them, and never rely on the false promises that are handed around as "this is far better than last year's models". The truth is there if you look for it before buying.
I've already filled two with the two major console manufacturers. What's the advantage of connecting your soundbar/receiver via HDMI 2.1? Having trouble with audio sync?
To be honest, I wouldn't mind a TV with 2 x HDMI 2.1 ports IF eARC/ARC were on a separate HDMI port. It wouldn't be so restrictive, as I feel 2.1 is mainly for gamers. So at least you could have a PS5 and an Xbox plugged in and still use a soundbar or whatnot. Or is that not doable? Because 4 is great, but I don't think most users would need 4 x HDMI 2.1. At least free up both 2.1 ports for input devices while having another HDMI port for audio output.
Why does anyone need all four? Seriously. Caleb, you said it, a majority of consumers don’t need it. People are still struggling to get a next-Gen console, let alone two. Anyone with a high-end PC isn’t attaching it to a TV either, they’re going monitor. Don’t need 48Gb/s for blu-ray because films are done in 24fps. Someone please explain why this is a big deal. You don’t need four. Two is more than enough. It’s like supersizing your meal and then just having the rest of the drink and fries you didn’t finish just sit there.
I’m extremely mad at the HDMI Forum too. Their convoluted naming schemes mean you don’t even need to support the latest features to get included in the HDMI 2.1 umbrella.
They're a bunch of sellouts. TV manufacturers pay them to make these standards.
Absolutely. What THEY have done is even more unforgivable than what MediaTek just did! What were they even thinking there? Who is it that they are trying to look out for and protect there with that HORRIBLE decision? Certainly not the tech consuming public. What in the hell is wrong with these people? Did they think this was just going to quietly slip by and no one was going to catch this screw up on their part? Did they think no one was paying attention? They need to apologize and clean up THEIR total PR mess now, just as badly as MediaTek does. (Maybe the two of them should team up and go on a world-wide "Apology Tour" to try and undo some of the anger and frustration their POOR judgement has caused consumers? I know a certain TV company in South Korea that horribly messed up a certain QD-OLED roll-out/product launch recently that could probably stand to tag along on that one as well, now that I think about it) Wow, 2023, at least from a tech perspective, is off to an absolutely terrible start. Let's just hope it gets better from here on out.
They are really the ones to blame for this; It was the same with the HDR specification where HDR400 was a thing that was advertised, even though it provided basically a standard picture.
The Pentonic SoC has 4 full-bandwidth HDMI 2.1 4K 120Hz Dolby Vision ports; these are just rumours from LG fanboys and Samsung fanboys
@@RobertK1993 it doesn’t, though; MediaTek has confirmed this
It's crazy how my LG C9 from 4 years ago has 4 HDMI 2.1 ports while some premium TVs not even released yet won't.
Well, it's a 5 year old standard. LG just implemented it like Mediatek should have...
I was about to say, “I’m confused my c9 has all this”
The C9 was one of the best LG televisions! It had full bandwidth support for HDMI 2.1 and also allowed the user to modify practically all of the tv settings! And if you can find a brand new one anywhere, that still has a warranty included.. it is worth looking into!
How many products do you have that need 4k 120hz? List them. 4 2.1 ports is absolutely, obnoxiously, stupid.
@@CalvinTennessee if you have a soundbar that uses eARC, a PS5, XBox Series X, and an Apple TV, then you would want all 4 to support true 2.1. If you’re a gamer with any sort of speaker system, you need at least 3.
MediaTek has been getting too comfortable since they pretty much own the TV SoC market. They need competition to get them out of their comfort zone.
I agree... monopolies are not good for the customer
Don't they have competition though? I mean, aren't LG and Samsung by far the best sellers, and they both have four HDMI 2.1 ports on their TVs.
@@Tomiply LG and Samsung do not sell TV SoC. So if you are neither of these companies who make their own SoCs, you're pretty much stuck with MediaTek so no competition. Sucks for TCL, Sony, Hisense, Vizio and the others.
Well said 👍
LG has had four full HDMI 2.1 ports since 2019!
I'm not mad at you Caleb :). You tried to help us and you were misled. Not on you at all. Much respect for retracing your steps and providing solid evidence too.
I'm mad at him, but I'm easy to please, so Caleb if you would, ship me a 55x90k at your expense and all is forgiven.
What is the color fire screensaver in the background on the TV?
This was eye-opening. I sell TVs as a "vendor expert" in a blue-themed big box store, and that vendor has claimed for years that you "don't need more than two HDMI 2.1 ports"; it turns out it's just the SoC they have to use. Thanks for breaking this story.
Vincent broke the story.
Funny cause they'll sell you a bunch of things you don't need. In fact, if a seller ever told me that, I'd point out I don't need to buy most electronics since they're a luxury.
I surely don't need them, but some other people could. Professional videographers, maybe? I've got a PS5 (and that's one port), and I could maybe hook up my PC one day (and that's two).
There's a certain amount of truth behind the “only need 2 x 2.1 ports” line. I’d love to know the percentage of ports actually used on TVs and the number of devices requiring 2.1. I think it’s scummy that they are trying to call all ports 2.1, but with the knowledge now more widely available, it really is caveat emptor.
BAMBOOZLED!! It’s not a critical feature per se, but shocking that Mediatek can’t read the room. Playing 1080p Blu-ray is all it takes for 2.1 certification lol
That's really all it takes?! That is beyond absurd.
@@slantia5 no, he was being sarcastic
@@mastixencounter Bamboozled indeed lol
@@mastixencounter No, sadly I'm not kidding - if you look up the specs, "Full HD Blu-Ray Disc" capability is the baseline spec of 2.1 - See the article on TFTcentral "when hdmi 2.1 isn't hdmi 2.1 - fake hdmi 2.1" discussing this problem where a Xiaomi monitor can only display 1080p but is certified as HDMI 2.1
@@stopthefomo HDMI 2.0 doesn't exist anymore and the name has been scrapped, since 2.1 is fully backwards compatible. When it just says HDMI 2.1, that means HDMI 2.0. If it were a more truly HDMI 2.1 capable device, then they would have to list the specific HDMI 2.1a features that weren't possible with HDMI 2.0 previously.
Keep the pressure on them Caleb. This is just egregious of Mediatek and the way they kept misleading all of us. Great and informative video.
Glad you have found this out, it was pretty big news many months back when the standard was revised and tech press realized that HDMI 2.1 was going to be meaningless. I am pissed too
This is why I have been thinking over the years: why not go via DP? On certain issues it takes HDMI a while to catch up on resolution and refresh rate. (But it will probably only be in monitors. Yes, I lean toward PC gaming.)
I had a Sony 900h and felt totally ripped off by how they advertised 4k 120hz and it wasn’t there for over a year after purchase. Bought 2 LG C1 OLED sets 2021 Black Friday and still feel like to this day it was one of the best purchases I have ever made, this video even further confirms it. I even replaced a computer monitor with a 42” C2 OLED and love it, all 3 with 4 HDMI 2.1 full bandwidth ports 👍
I agree I love my C1
Sony's picture quality is better, but I guess that after a while you stop noticing.
Kind of reminds me of how Intel got comfortable not delivering things they'd promised for years because they had the computer market fairly cornered, and it wasn’t until Apple started making their own SoCs and AMD started to provide real competition that they started trying to innovate again in any way. The same thing needs to happen to MediaTek.
Do you think Mediatek has gotten fat and complacent like Intel did?
One of the big reasons I love my LG G2... 4 HDMI 2.1 ports. Plus it has the newest ATSC 3.0, so whenever companies start pushing higher resolution over the air it's ready for it 👍 2022 LG OLED TVs were all really nicely equipped.
Do you have 4 devices that need the full 2.1 bandwidth?
@@rijjhb9467 if he has only two of them, he‘s on the winning side…
@@rijjhb9467 I have a Sony X90K that has 2 HDMI 2.1 ports.
Now I want to connect:
1. Apple TV 4K (needs 2.1 for Atmos and Dolby Vision)
2. Soundbar (needs 2.1 for Atmos)
3. PC for 120fps gaming
4. PS5 for 120fps gaming
I would say it’s reasonable for someone to want this setup.
@@rijjhb9467 A: that's not even the point. They said they would and they went back on it without being open about it. B: The alternative is 2, and if you are using a soundbar or amp with eARC, that already takes up one, so now you're left with one. There are PLENTY of people that have both a PS5 and an Xbox Series X, or a console and an Apple TV, or want to hook up a console and a PC, and they now have to choose which machine they want higher FPS on due to there being only 1 port left available.
@@ShrkBiT i mean just unplug and replug the hdmi cable
I'd prefer 4 or more 2.1 hdmi ports on the TV. May not use it often but it's nice to have when you need it
Ports do get faulty sometimes
I would rather have a DP 2.1 port, but only monitors will have them.
Just get a splitter. You're never going to use all of the ports at once.
I may not need to have four full bandwidth 2.1 HDMI ports, but I don't appreciate feeling that I have been lied to by Mediatek. I remember when you and others first reported about the 2.1 ports and what you were led to believe. There are lots of features that us consumers may not need, but get excited anyway because we know how convenient it could be. They should be more honest with the reporters that they use to help promote their products.
I’m so glad I never have to worry about anything with my LG GX OLED! They are always ahead of everything as far as technology goes!
Shame that it still uses the old SoC with only 2x 2.1 ports
Your TV has one HDMI 2.1 port. RIP, that's used up if you connect a surround system via HDMI with eARC. You will need to buy an HDMI switcher to connect more 2.1 devices.
Apart from peak brightness lol
@@mrbrookeyoung I have my tv in a dark room. It’s never been an issue for me
My LG CX does it all across 4 HDMI 2.1 ports. Loving it.
Ya, so worth it when only a PS5, Series X and a PC would benefit from HDMI 2.1; so pointless to have 4 ports at 2.1 lol
@@SIMRIG412... and the new Apple TV...
... and the latest GPUs and laptops...
... and anything in the future which supports it because that's how time and tech works.
I am so happy that a channel like this educated me prior to purchase; I stayed away from MediaTek and went with the LG C2 55” for 1000.
$1K? In the US or UK? If US, where are you seeing those prices?
@@brokenbones8790 euro. Netherlands. Yay for black friday deals
How can BIG companies like Sony/Sammy/TCL/Hisense/LG - NOT - be able to design their own HDMI 2.1 chips???????? Someone inside please explain.
Why should they? It's dumb, less profit. Although I think Sony should; they're the only premium, high-quality brand in that group.
For a TV maker to suddenly turn into a design house just for the sake of designing TV SOCs doesn’t make financial sense when you can rely on existing design houses and fabs that have sufficient yields to enjoy economies of scale.
I love it how manufacturers keep pulling this BS and then wonder why people's upgrade cycle is significantly slower than it used to be. They go out of their way to piss their customers off, screw them over with poor customer service, screw them over with poor pricing, screw them over with barely upgraded products, then wonder why last year's iPhone 14 was apparently the worst-selling model Apple's ever seen! Here's a hint guys, it's not because of the recession. Guess my LG B6 is going to be buried with me...
Same. Still rocking my 43" Samsung from 2018. Still kicking for both movies and games.
I know I'm in the minority, but as someone that has a PS5, Series X and a sound bar based surround setup it is incredibly annoying to see just two 2.1 ports especially when one of them is the ARC too. If the ARC port wasn't one of the ports, it really would alleviate a lot of complaints and there's been a few TV's that have done it. It's just annoying.
Totally agree. Same problem here.
I bought a hdmi 2.1 switch on Amazon for my ps5 and Xbox series x. It works great. Costs 21 bucks and no issues at all
@@p3terlos same with my Sony Bravia, I can't use the 120Hz because of it 😢
@@TheAlexkokonas maybe I should look to that thx
3 true HDMI 2.1 would be Ok for me, but if I can get 4 with Samsung or LG, I will go with these brands. Prefer Samsung mini-led though.
@grumpygreyisgrumpy
I knew the extraterrestrials would agree with me! 👍👍
p.s. Hope you don't mind too much a little kidding about your pseudonym? 😉
This is stupid because it’s not an HDMI limitation, it’s in the TV's software; all the HDMI does is send data from one connector to another, and the cable is the pathway. So my thinking is these companies probably don’t have the CPU power in whatever chip they are using in these sets to handle more than two ports with 2.1. Samsung and LG are using pretty powerful chips in their sets now, and Sony has been too, so it’s frustrating they are not giving more than two either. But again, if soundbars and AV receivers can offer more, so can a TV.
In a bedroom nothing beats LG, and in a living room nothing beats a Sony. Samsung is meh and they always mess up after 4 years. My KS9000 was my fav but messed up after just 4 years of moderate or no use; I went to turn it on and it had lines… Samsung lost my love after that.
@@fabolousjada5070
Are you sure that nothing beats LG or Sony in the bedroom? 😊
@@fabolousjada5070 interesting, I have an old Samsung 1080p set, a big chunky boy from 2011, and it still works today. Maybe they don’t make them with the same quality anymore.
100% agreed. This is an unforgiveable, inexcusable screwup on the part of MediaTek. It doesn't even matter if you are the type of customer that needs four HDMI 2.1 ports on your TV or not. This is about integrity, business ethics and professionalism. There is NO excuse for what MediaTek has done here. None. Now that this is all out in the open and the damage has been done, where is the formal written statement of APOLOGY from MediaTek? Is it posted up on their web site yet? If not, WHY NOT? Where is THEIR video of the CEO of the company coming forward to explain what happened and take full responsibility for this absolute PR disaster - and BEG for the public's forgiveness for his company's incredibly poor execution and total lack of good judgement? We're waiting.
I don't need HDMI 2.1 for the rubbish Dolby Atmos 4K format; it's a scam. They use near-field mixes, so it won't make the sound any better. It's a scam by film studios who use it as a cash-milking cow.
dude calm down, you're acting like it's the end of the world. Go outside and live a little
@@Barbarapape I wasn't talking to you, I was referring to the OP who seemed like his life was going to end because of this
Calm down kid, it's a f*cking TV...
I have an eARC Nakamichi soundbar, gaming PC with 3080 Ti, PS5, and Apple TV 4K, plus a Switch. It is totally reasonable to desire all my HDMI ports to be 2.1
Thankfully I bought a C9 when it was the new hotness as it seemed future proof for the time being. That is still holding true to this day.
I think a reasonable compromise we should expect at this point is for TV manufacturers to reallocate the eARC port to one of the HDMI 4K 60Hz ports instead of having it occupy 1 of the only 2 available full-bandwidth 120Hz 2.1 ports. That would solve the issue many people have had, although it's not a fully ideal fix. TCL can do this, so I believe it shouldn't be a challenge for other brands.
Exactly. Soundbar users have basically only 1 HDMI 2.1 port left, that‘s annoying.
@@oli61 no really
That poses a problem for anyone using a receiver instead of a soundbar, since the receiver is full 120Hz 4K but still needs eARC to get the sound passed back when you plug your devices into it. You'd be limited to 60Hz, rendering the receiver useless as an HDMI 2.1 capable one.
So it's best to keep eARC on a 120Hz port and instead have everyone use an HDMI switch for $25 to create an extra port and hence have 3 full ports. It's just a matter of hitting a button when you choose to play the XSX or PS5 or PC. It's not like you will use all of them at once.
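For what it's worth, the port-budget arithmetic behind suggestions like this is easy to sketch. Here's a minimal, hypothetical Python example (the port counts, switch size and device names are illustrative only, not tied to any specific TV or switch):

def free_120hz_inputs(total_120hz_ports, earc_on_120hz, switch_extra_inputs=0):
    """How many full-bandwidth inputs remain for sources (PS5, Xbox, PC, ...)."""
    ports = total_120hz_ports - (1 if earc_on_120hz else 0)
    if switch_extra_inputs and ports > 0:
        # An external 2.1 switch turns one remaining port into several selectable inputs.
        ports = ports - 1 + switch_extra_inputs
    return ports

print(free_120hz_inputs(2, earc_on_120hz=True))                          # 1
print(free_120hz_inputs(2, earc_on_120hz=True, switch_extra_inputs=3))   # 3
print(free_120hz_inputs(4, earc_on_120hz=True))                          # 3

With two ports and eARC on one of them, a single full-bandwidth input is left for sources; a 3-input switch (or a four-port TV) gets you back to three.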
I think Vizio started doing this
@@Spectral2k1 🤦♂
FTC should investigate HDMI Forum, this misrepresentation on products being advertised is negatively impacting consumers.
Surprised Qualcomm or Google haven't gotten into this market to ensure some competition. I feel like Qualcomm could do this today?
Definitely not. The quality of Mediatek chips is way too good. Mediatek owns 90% of the smart TVs in the world.
@@andrewjohnson1486 the video is about MediaTek not fulfilling the promises of their chips. You tell me they are way too good? That 90% means they haven‘t had to face much competition…
@@andrewjohnson1486 Oh yes they could. Qualcomm owns a good portion of the smartphone market; they know how to make good stuff. If you have a flagship Android phone, then odds are it's powered by a Qualcomm SoC, as were even earlier iPhones. MediaTek also makes smartphone SoCs, but their market share is pretty much limited to cheap "burner" phones. That said, UNISOC is a more direct competitor to MediaTek, and they could also step up to the plate if they wanted.
all the marketing hype with 8K and they can’t even put 4 2.1 HDMI ports in their TVs 👎
it’s 120 hz at 4K, where’s the problem?
if HDMI 2.1 isn’t standard, they shouldn’t even talk about 8K
I think what we are seeing in general is a breakdown of consumer-facing standards. I mean, between this and USB-whatever, do we really know what we are getting? As a techie I'm more aware of what I need to look for, but the ordinary person has no idea that what they bought was really just branding.
Man LG C9 was truly cutting edge. Still glad I purchased one 2-3 years ago.
How about the buggy, dim VRR?
@@RobertK1993 Not sure exactly what you’re trying to say…
@@MerryBlind during VRR on the C9 it can get buggy and quite dim (literally about the only bad thing you could say imho).
@@SSNebula True, I was expecting the VRR to work better. The brightness can flicker noticeably if your framerate is unstable; it’s weird.
Can we start adding DisplayPort 2.0 to TVs at this point?
This is why all the TVs in my house are made by LG. I mean, even to this day they are still updating some of the older models like my CX OLED. And of course my best and newest model, reserved for my man cave, the LG G2 evo, is just supreme! My LG TVs have just been reliable, engaging and immersive. I just trust that LG will deliver on quality, longevity and features.
The CX isn't an old TV; they'd better still be updating it.
@@cgsenior2007 True it's not old. But the way LG has been pumping out TVs as of late one would think that the tech in the CX is far behind.
Facts. I've got the LG CX; to this day I keep it clean and keep changing content so there's no burn-in. Best TV I ever had. Plus the updates are sometimes good; I'm on an old update though, don't wanna mess up my viewing.
@@RunEmDownTony it’s not; they’re just cloning that TV, adding small-ass features and then charging extra. The LG CX is still the way to go.
1:48 I love how *Vincent* is so well-known in the TV tech media that you don't even need to give his last name or explain who he is! He is *literally* the Leonardo of TV tech analysis, lol.
but he did say Vincent's full name and channel. :)
@@MichelleAlexandria-EM lol. Yup.
We need journalists like you and Vincent to call this shady company out. 4 real HDMI 2.1 ports played a big role in my purchase decision last year. I would have bought a Sony instead of LG if Sony had 4.
I don’t understand the appeal of 144 Hz. I do get 120 Hz as it divides equally into 24, 30, and 60. I think 120 Hz should become the new gold standard. 144 Hz, not so much.
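The divisibility point is easy to check. A quick Python sketch (illustrative only; the frame rates are just the common film/broadcast/game rates, and note 144 does divide evenly by 24, just not by 30 or 60):

CONTENT_FPS = [24, 30, 60]   # common film / broadcast / game frame rates

def clean_multiples(panel_hz):
    # True when the panel can show each frame an exact whole number of times (no judder).
    return {fps: panel_hz % fps == 0 for fps in CONTENT_FPS}

print(120, clean_multiples(120))   # {24: True, 30: True, 60: True}
print(144, clean_multiples(144))   # {24: True, 30: False, 60: False}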
I didn't really pay too close attention to this, but I think I got lucky in my recent purchase of a 2022 Samsung QN95B, four hdmi 2.1 ports on the oneconnect box.
As a person that deals with TVs every day of my life: other than a super-high-end computer with something like a 4090, you can't even push 4K120 to the TV. Neither a PS5 nor an Xbox Series X does true 4K at 120. You will get 1080p 120, or maybe 1440p 120 if you're lucky, out of them, but there will be a sacrifice. Now you could argue to me that VRR is useful, and I would agree there, but there just really aren't very many true 4K120 devices out there to justify 4 2.1 ports if you're only looking for 4K120.
I could be convinced you need 4K120 for your Ultra HD Blu-ray player, but then how many movies out there use 120? I'd love to hear your comments.
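On the bandwidth side of this debate, a rough back-of-the-envelope Python sketch of why 4K120 needs a genuinely full-bandwidth port (active pixels only, ignoring blanking and link-encoding overhead, so real link requirements are somewhat higher; the 18/48 Gbps figures are the commonly cited HDMI 2.0/2.1 ceilings):

def gbps(width, height, hz, bits_per_channel):
    # Uncompressed RGB / 4:4:4 video data rate for the active pixels only.
    return width * height * hz * bits_per_channel * 3 / 1e9

print(f"4K60  10-bit: {gbps(3840, 2160, 60, 10):.1f} Gbps")    # ~14.9
print(f"4K120 10-bit: {gbps(3840, 2160, 120, 10):.1f} Gbps")   # ~29.9
# HDMI 2.0 tops out around 18 Gbps; full HDMI 2.1 FRL reaches 48 Gbps,
# and blanking plus encoding overhead push real 4K120 needs toward ~40 Gbps.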
This sounds like a good opportunity to review a few HDMI 2.1 splitters/switches ahead of the 2023 model drops.
That can even work properly?
Especially bypassing the security of media players or game consoles!!!!
I have yet to see an HDMI 2.1 splitter work. My screen always goes black about 5 min into a movie
Yeah it sure is a bummer. However, I'm not in the hunt for a TV this year as I have already purchased two last year. Hopefully next time when I do, it will have become a standard then.
I have a PC, PS5, and Series X, and a receiver in the eARC port. Sure, I can plug things into the receiver too, but just having all 4 of the ports on the TV be 2.1 means less guessing about which port is which; since they're all the same, it makes life easier and more convenient.
As a high-end (4090) PC owner I have no idea why would I want a console, let alone 2 of them.
@@RJ-vy4yd They’re not mutually exclusive right? You can enjoy gaming on PC and on Xbox. My gf solely prefers console gaming as we can lay on the couch.
I'm pissed too. The HDMI Forum lining their pockets from companies lobbying to muddy the rules just like USB did, and the sheer incompetence of MediaTek in not clarifying their "mistake" immediately, are making this space frustrating to follow. It's been 4 years, FOUR, that LG has had 4x 2.1 ports, which they even back-ported to the old X8 lineup. It's gone from inexcusable to embarrassing.
Thank you for this video. I really appreciate you holding these companies accountable. I bought a Sony A80K and I could use 4 full-bandwidth HDMI ports. My PS5 and Sonos soundbar take up those ports, and I make do with the other ports for my other devices, making sacrifices. I planned to update my living room TV to a 2023 version of the A95 this year, but I will skip it until all 4 ports are full 2.1.
The Sony A80K has two 48 Gbps ports, right? The other two are HDMI 2.0b at 4K/60.
@@MrLutijen That's correct.
Glad I didn't wait, the TV I got said 2 2.1 HDMI ports and that's what I got & all I need 🤷🏼♂️👍🏼✌🏻
Hey, thumbs up from me. I appreciate you letting us know what's going on. In today's market this kind of info could easily slip by the wayside, so it's nice to have channels like yours to inform us of brands who may be taking advantage of us.
I bought an 85-inch X95J last year, well, four of them; three showed up broken. I'm glad I did my research on this, as one of its primary functions is to act as a 4K HDR 120Hz gaming PC monitor. I did end up going the receiver route, so I do have 4 usable, fully functioning 2.1 ports, and I did spend something like $150 on a long optical HDMI cable (which works amazingly, BTW, do your research). However, I think the blame here rests fully on the HDMI standards people. What a lazy, lackluster job they did on this one. I'm an IT guy with 20 years of experience in technical details. I have the knowledge to figure this stuff out. Do you think anyone else in my house, friends group, or family does? No. Isn't that exactly why we have standards to begin with? So non-technical people can easily make sense of complex subject matter?
So glad I did the research and opted to get an LG G2 this year. I love how they’re always future proof.
Question, the best buy rep told me you can't swivel and adjust the g2's mount. That it's just flush and a different mount wouldn't be compatible with it. So I opted to order a c2. Was this info accurate?
@@NYC_Goody The G2 has standard vesa mounting options as well as the mounting spots for the flush mount that comes with it so you can definitely buy your own wall mount for a G2.
@@NYC_Goody i mounted it flush with a different mount a Sanus fixed rail mount system. So, yes any vesa mount does work.
Lol OLEDs aren't future proof and get burn-in. Just rip-offs lol
@SIM RIG 412 they're high maintenance but they're definitely the best picture you can get right now. They only burn in if abused.
Why are they needed?
Actually, HDMI feature support has always been flaky, but it's getting worse. I still remember the lip-sync feature was meant to be supported all the way back in HDMI 1.4. It was optional in the end and caused me awful delay issues which I couldn't even correct myself with my old TV over ARC.
Why hasn’t somebody gone after them for lying to consumers?
Hey!
That sucks, Caleb, not your fault. What do you think about 8K HDMI switches? I've been looking into them, as I probably won't upgrade my receiver for another couple of years, but was aiming to upgrade my TV this year.
The 8K VRROOM from HD Fury is the only one on the market that supports all HDMI 2.1 features. Awesome solution even for 4K TVs if you want additional 2.1 ports with no compromise.
Appreciate the HDTVTest shout-out! I love Vincent's videos.
My 2020 Vizio PQX has two "HDMI 2.1" ports, with eARC on one of the 4K60 ports. That arrangement suits my needs, but a 2023 TV should have four by now regardless.
I have the same T.V. and the setup works flawlessly. I hope the next iteration of the same brand adds more HDMI 2.1 ports because, eventually, I will want to upgrade since I game mostly on my T.V.
@@QuyetStorrm it is flawless now, but they made a mess of it a little over a year ago with firmware updates. The tv is pretty solid now, and happy with it for a few more years hopefully.
Great vid bro!
Other than people with multiple game consoles, how problematic is it for most to be limited to two 2.1 ports? I'm genuinely curious.
And current consoles don't really need HDMI 2.1 to be fully utilized. They don't have the horsepower to need it, and most game companies aren't pushing the hardware to its limits anyway. The only thing that needs 2.1 to be fully utilized would be the highest-end PC graphics cards, which is just 1 port, and for the few percent of TV buyers who think this affects them, get an HDMI switch and call it a day.
The Office Space reference and clip was perfect! As for the ports, I’m picking up an LG next month so I’ll have plenty.
Not to sound silly here, but I have an LG 42C2 with five self-built gaming PCs connected to it. It has 4 HDMI 2.1 ports. I put an HDMI switch on the fourth HDMI 2.1 port (rated at 8K@60Hz / 4K@120Hz) to add the fifth PC. I realize this is probably uncommon, but it works well.
Does an AVR solve the problem 100%? I mean, will the picture pass-through without extra delay and with no effect or filtering? Does it have to be a 2022 AVR or newer?
They were allowed to get away with this because of the HDMI deregulation and this will likely continue if not get worse now that they are allowed to claim they have 2.1 when they don’t fully have it. I feel ripped off as a consumer not being able to trust any of these HDMI claims.
That's why I chose LG for my TV. They offer all the gaming and media features, OLED, plus 4 HDMI 2.1 ports.
LG all day! 👌🏼
I'm sure Sony's 2nd gen QD-OLEDs will win most 2023 shootouts for pure picture quality. But factor in price, a greater range of available sizes, and not having to pick and choose which features work on which inputs? And I'm thinking LG's G3 OLEDs still look like a pretty darn compelling offering.
It all depends on what is most important to you. If saving money and having 4 2.1 ports matters most, then go with LG. If you're OK with the extra money and you want the most reference-accurate picture, go with Sony.
As you said early on in the vid, the biggest thing that bums me out as a gamer is the 1/2 resolution downgrade still in effect at 144Hz. Everyone claimed it was 4K 144Hz last year until it was tested countless times with various high-end GPUs and still showed it was running at 1440p. I can deal with 2 ports, but please can we get true 4K 144Hz now that GPUs can push over that FPS? You did a great job at CES, but this was one of the top stories coming into 2023; how was this not on your top 5 list to vet while you were at the show? Keep up the good work.
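(For anyone wondering why 4K 144Hz is such a squeeze even on a true 48 Gbps port, here's a rough sketch; it ignores blanking, FRL encoding leaves only roughly 42 Gbps of usable payload, and the video_gbps helper is just an illustrative name, not anything official:)

    # Rough uncompressed video data rate, ignoring blanking and link-encoding overhead
    def video_gbps(width, height, fps, bits_per_channel, channels=3):
        return width * height * fps * bits_per_channel * channels / 1e9

    print(f"4K/144  8-bit RGB ~ {video_gbps(3840, 2160, 144, 8):.1f} Gbps")   # ~28.7 Gbps
    print(f"4K/144 10-bit RGB ~ {video_gbps(3840, 2160, 144, 10):.1f} Gbps")  # ~35.8 Gbps
    # Add blanking and encoding overhead and 10-bit RGB sits right at the edge of a
    # 48 Gbps link, which is why sets often fall back to DSC, lower bit depth, or 1440p.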
Caleb, in my world you rock. I don't appreciate the idea of having to buy a stereo setup to get all the 2.1 ports I need for a less expensive TV. This definitely affects what I will buy. Thank you, Caleb, for all that you do for us!
Vincent was the one who revealed this story
What in the world do you need 4 2.1 inputs for? Certainly not for audio. eARC covers every single bandwidth. Do you know what you're talking about?
I thought those 2 dudes were the same person
Happy I got an LG OLED and will be sticking with them for the foreseeable future
I probably use way more HDMI ports than average. I primarily plug stuff into a VIZIO 50" TV that we got a few years ago on sale. It has 3 HDMI ports, but I have extenders that I use, too. The extenders/splitters are probably quite cheap and probably won't handle 4K at a high refresh rate. I just use it for the extra ports, as those are significantly more valuable to me than picture quality. As of right now, I have a Nintendo Switch, Roku, Wii U, two Xbox 360 Slims, DVD player, screenless laptop, and another computer that varies sometimes. Even with all the switches and ports, I still have to swap the cables very frequently and it's always an inconvenience either way. I just wish the TVs had 20 extra HDMI ports. That's only HDMI, though, when I also use so many other ports and devices, too. This is all very crazy to me.
What about HDMI bars that can hook up many devices and allow you to switch to whichever one you want? Are there any good ones?
This is one of the specific reasons I went with the Samsung qn900a two years ago. I needed at least 3 hdmi 2.1 ports and only the 8k models had it.
With the other TV brands not offering 4 HDMI 2.1 ports, it makes me wonder if there was something in the licensing of the MediaTek chip that financially discouraged those brands from offering four.
A plausible scenario: MediaTek possibly bases its licensing royalties on the number of HDMI 2.1 ports per TV, charging more for TVs with four ports instead of two.
Just a thought.
I wondered if I would need more than the 2 that my Sony A80J has, but since I went to a receiver and speaker system, the receiver can switch any extra 2.1 ports I need while using the eARC port, so in the end it is not that big a deal.
If you don't mind me asking. What receiver do you have?
@@freddyflores288 Onkyo Tx-NR6100
@@jamiedewberry6702 thank you!
It isn't, if you have a receiver like that, or if you spend a WHOPPING $25 US dollars on a simple HDMI 2.1 8K/4K 120Hz switch, allowing full usage of 3 ports.
This is making a hen out of a feather. People are complaining when 95% of them have one console and a soundbar.
What would be worse would be no 120Hz ports at all.
The day Vincent dropped that video I went straight to Best Buy and bought the G2. I'd been holding out for 4 2.1 ports and was willing to pay out the wazoo for the 2023 Sony. Nobody should be paying that much for a TV and be restricted in HDMI ports, and unless it's a home theater, most consumers aren't using a receiver in their bedroom.
A $25 HDMI switch solves the problem, so you jumped the gun. The MediaTek SoC still offers a lot of performance enhancements; it wasn't just about HDMI ports.
@@loki76 for you, maybe. I'm already using a splitter, but for one console and a DVD player. I'm not trying to run 2 consoles on a splitter and have to bounce back and forth between picture settings and all that extra shit. I got to look at the A95K and G2, and between no added 2.1 ports, the lack of certain gaming features, and other minor differences, the change in picture quality wasn't enough for me to justify an extra thousand bucks lol
@@walidhannaoui1802 Why the hell are you using it for a DVD player? It doesn't need 4K 120Hz. That is your own mistake.
@@loki76 I never said my DVD player needs 4K/120. The only mistake here is you not understanding my comment. I'm not splitting 2 consoles on one 48 Gbps port and having to constantly tweak picture settings every time like I would on the Sony, since I also use eARC for audio. Luckily for me, the LG has 4 2.1 ports, so I can keep one HDMI input's Game Optimizer and Filmmaker Mode untouched and bounce back and forth between the Xbox and the 4K UHD player.
It doesn't really bother me for two reasons. I have a Denon receiver with three HDMI 2.1 ports. Also, I just purchased a 65" A80J OLED in 2022 which I'm very happy with. It's not like I plan to run out and buy a shiny new 2023 OLED. 🙂
Just to be clear we can get 4 HDMI 2.1 as long as we buy LG or Samsung, correct?
Yup!
THANKS to @hdtvtest as always for the discovery
Vincent Teoh is a legend
Loving my LG C2. Once 8K becomes a common thing, I'll upgrade to whatever LG is offering.
I love my C2 as well! Was definitely a great purchase and it will last us a long time. It may be a while until 8k is mainstream but when that time comes hopefully the technology is worth it and accessible
@@Bdot888 8K seems pointless unless you're at 85 inches or greater. Even then you need to get to 100+ inches to have the difference really stand out. I don't think 8K is needed at all. 12-bit color, though, would be noticeable on any display and definitely worth an upgrade.
@@Gichie79 Yeah, I don't think 8K will be popular for another 5-10 years, if that. As of now there is little to no 8K content to enjoy. The only thing I can think of is that 8K upscalers will become better. But as of now, the price for 8K is not worth it and it's hard to differentiate from 4K.
It's already extremely hard to run games at 4K, as only 4090 does it, really. I can't see 8K being anything useful anytime soon.
@@RJ-vy4yd Yeah, we are still barely getting into 4K gaming territory. I don't think 8K will even be popular for gamers.
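(Quick sense of scale on the 8K point, same rough math as above, ignoring blanking and encoding overhead; the video_gbps helper is illustrative only:)

    # Rough uncompressed video data rate, ignoring blanking and link-encoding overhead
    def video_gbps(width, height, fps, bits_per_channel, channels=3):
        return width * height * fps * bits_per_channel * channels / 1e9

    print(f"8K/60 10-bit RGB ~ {video_gbps(7680, 4320, 60, 10):.0f} Gbps")  # ~60 Gbps
    # Even a full 48 Gbps HDMI 2.1 port can only carry that with Display Stream
    # Compression or chroma subsampling, which is part of why 8K still feels premature.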
This just means I bought the best TV! One that will be good to use many years down the road if it doesn't break (LG OLED 65C9). Loving it, best TV ever. Software support is great, it does everything that next-gen gaming needs, perfect for me.
I plan to buy a TV this year and use it for 5+ years. I am okay with two HDMI 2.1 ports; however, for the sake of future-proofing my TV I will buy the Samsung S95C and get 4 HDMI 2.1 ports just for the sake of it.
Makes me feel smart for buying my 65” LG CX 2 years ago that has FOUR HDMI 2.1 ports!!
Caleb, Could they be trying to push the companies that want 4 ports to the Pentonic 2000? That one still says 4 HDMI 2.1 ports, and I'm assuming since they have to support 8K they will be full bandwidth.
That’s honestly what I’m thinking as well, but I can only see that SoC being used in very large TV sets right now (I would guess 75” being the minimum), if at all. We may see it in next year’s lineup.
@@zgoaty9235 It may be why Sony delayed announcing, to check the market. Especially on the X and Z series.
Mediatek Pentonic will change that eventually
I am strongly considering buying an A95K on discount this fall, but will eventually want more than the two HDMI 2.1 ports.
Rather than buy a Samsung or LG TV (or wait *another* year for Sony/MediaTek to provide 4 ports), can I just use an HDMI 2.1 switch? Is it really that simple?
If so, can you recommend any that are fully standards-compliant? I have an Xbox Series X and a PC, and would like some headroom for the future.
Short of a product recommendation (although I'd gladly accept one), what standards does a switch need to support to fully pass HDMI 2.1?
Sorry if this is a noob question, but I want to be sure I'm buying the right switch (as HDMI 2.1 seems much more complicated than 2.0 and earlier). I see this question addressed frequently in other forums, but I've missed clear recommendations from reputable sources.
Thanks!
Massively frustrating for those waiting for a 2023 model! I'm still thrilled with my G2, and will likely use it for YEARS (we still have our Panasonic G10 plasma we purchased in 2009). Glad I future-proofed!
The G2 is a beast, great choice.
Future proofed or not I will be stuck with a Sony A90J for the forseeable. If it lasts...
@@awilderireland not a bad TV to be ‘stuck’ with 🤷🏼♂️
@@drewbacca8063 Darn Tootin. First time I bought a top spec TV. I feel it's wasted on me. 🙂
Are you guys not running everything through your receiver??
Judging from what's been happening "out on the street," it sounds as though the HDMI 2.1 movement has turned into a "dumpster fire." The bigger issue (that I see) is the lack of 8K content in the first place. I'm running 4K and quite a lot of content is still not 4K.
Glad I got the Sony A95K 65" on holiday sale and didn't wait for anything. Really happy with my new TV and I hope to not need replacing it for a decade at least lol.
Thanks for the update. In my opinion, it won't be until the 2nd quarter of 2025 that we could possibly have all TVs shipping with HDMI 2.1 and all the features it can give. Next year I plan on buying a new 75-inch Samsung, hopefully 8K if prices have come down enough by then, but if not, probably just a high-end 4K TV. I love your videos; I've been watching them for years!
Wow, thanks for this update!! So I really can't afford a high end receiver to connect my devices to utilize all the 2.1 features. Sounds like I'll have to choose which devices to connect to the limited 2.1 ports😤🤦🏽♂️ I feel I'll be paying more for a TV that offers less! 🤬 Not cool MediaTek
I've owned the LG Signature G7, Wallpaper 8, Wallpaper 9, and now the Gallery 2, and these all had full-bandwidth HDMI 2.1 with feature-rich support. Yes, they are all at the top end of LG's offerings, but it's surprising that 5 years on, other manufacturers, even heavyweights like Sony, still lag in this department.
PC monitors have been lying about the HDMI 2.1 spec for a while now. It's why channels like Hardware Unboxed check a full range of features in every monitor review to see if they are REALLY a HDMI 2.1 monitor.
This is outrageous because it's not enough for my setup, which I think is the best for gaming: a PC with HDMI 2.1, a PS5 for exclusives, and a soundbar with eARC. It's 2023 and you can't have all the necessities in one premium TV: 4x full-bandwidth HDMI ports (only LG and Samsung), QD-OLED (only Samsung and Sony), and Dolby Vision (only LG and Sony). If only Sony would give us 4x HDMI 2.1 ports (buying chips from LG or Samsung), a 77-inch QD-OLED panel from Samsung, and Dolby Vision support...
Just bought the LG C2 in October 2022; I weighed it up against the Sony A80K and decided on the LG instead, mainly because of the availability of 4 HDMI 2.1 ports on the LG. It's baffling that we talk about the "Sony premium" and yet they only have 2 HDMI 2.1 ports. I wouldn't be surprised if Sony's sales actually ticked up if their TVs had 4 HDMI 2.1 ports.
I have 4 HDMI 2.1 out of 4 on my TV AND 144hz 4k (while not official, it works) thank you very much
Hi,
With this HDMI 2.1 I have noticed that when watching the Apple TV (latest version) through my Yamaha AV3080 into the Samsung QN95A 75", it loses signal when you finish a program and start looking for something else to watch.
Have you ever come across this before, or do you know why it does it?
I get around it by changing from the Apple TV to another port on the Yamaha and then back to the Apple TV again to get the picture back.
Back in the days of SCART and composite you had nothing like this, and I feel that HDMI has not been doing what the manufacturers say it can do since, say, 1.4. With the bandwidth getting ever higher, surely you'd be better off with something like an Ethernet port rather than an HDMI port?
I love your videos, would be interested in your thoughts, and look forward to hearing from you.
I've used video switching HDMI receivers since the early 2000s. Never needed to use more than a single input on my living room display. Number of ports doesn't seem like an issue at all to me. But I'm with you on the meaningless HDMI spec. They went the same stupid 'screw the consumer' path as USB. At this point, saying 'HDMI 2.1' is just as meaningless as 'USB 4'.
I always wondered why there are only 4 HDMI inputs on TVs. Is it so expensive to offer more, especially on high priced gears? I use all 4 inputs and I am not even a gamer.
I was possibly going to get a Sony, coming from my 77" C1 OLED, but the 2.1 port limitation is beyond a deal breaker; G3 it is.
Why have all this updated tech if it's all restricted by the bandwidth of your ports anyway? It's complete ridiculousness.
😡
The saving grace of this situation is the consistently clear labeling of physical ports on TVs as “4K60” or “4K120”. You can look at the product photos and see what you’re getting, no need to worry about HDMI specs.
I just purchased a Sony A80K and am very happy with it; the picture quality is awesome at 4K and beyond when using my PS5, which is the reason I decided on it. I too am not happy with the games being played here, but at least I have my 2 HDMI 2.1 ports.
Hey, hello! Can I ask what content you want to consume at 4K 144Hz? I'm not counting PC gaming 😃😛
The HDMI Forum's decision to make HDMI 2.1 meaningless is really at the core of this. We all knew something like this was going to happen.
Yep. A standard without threshold requirements isn't a standard.
HDMI realised that having all these qualifiers to make something 2.1 was really bad for their name, since so many companies had issues with it and could do some, but not all, of the features. Now we just have it as 4K = HDMI 2.1.
HDMI also needs to tell us when their contract with all the TV makers ends. Many would love to have DisplayPort, but we'll never see it since HDMI has a secret contract with all the TV brands. Nobody is giving out info on when this contract ends, though.
Having worked in the repair of consumer electronics since the 70's, all of these false specifications and the hype that goes with them mean nothing until you have a final production sample on the test bench, where you can confirm exactly what the true working specs are, not those on a web site or handed out at shows like CES.
Manufacturers rely far too much on the promised specs of new panels or chips that never make it to the finished product.
MediaTek wants to sell as many of those SoCs as possible, so they publish a spec that makes it seem like just what we wanted; then, when the truth comes out, we all feel cheated.
This is why I only buy new products after I have tested them, and never rely on the false promises that get handed around as "this is far better than last year's models." The truth is there if you look for it before buying.
I've already filled two with the two major console manufacturers. What's the advantage of connecting your soundbar/receiver via HDMI 2.1? Having trouble with audio sync?
To be honest, I wouldn't mind a TV with 2x HDMI 2.1 ports IF eARC/ARC were on a separate HDMI port. It wouldn't be so restrictive, as I feel 2.1 is mainly for gamers. That way you can have a PS5 and an Xbox plugged in and still use a soundbar or whatnot. Or is that not doable? Because 4 is great, but I don't think most users need 4x HDMI 2.1; just free up both 2.1 ports for input devices while having another HDMI port for audio output.
While we're at it, is there any reason to not have gigabit LAN ports on televisions or any network equipment at this point?
Why does anyone need all four? Seriously. Caleb, you said it, a majority of consumers don't need it. People are still struggling to get a next-gen console, let alone two. Anyone with a high-end PC isn't attaching it to a TV either; they're going with a monitor. You don't need 48 Gbps for Blu-ray because films are done in 24fps. Someone please explain why this is a big deal. You don't need four. Two is more than enough.
It’s like supersizing your meal and then just having the rest of the drink and fries you didn’t finish just sit there.