1440p is more than enough to enjoy gaming to the fullest. We're going crazy with 4k and high refresh rates. You don't have to spend so much money for a simple hobby.
Most likely not. The Nvidia 50 series is the first GPU line to support DP 2.1b… that means old AMD GPUs with DP 2.1 cannot support DP 2.1b… So monitors that have "only" DP 2.1 do not support DP 2.1b…
I connected my 40 series GPU to an LG TV using 2 m / 6.5 ft Zeskit Maya HDMI 2.1 cables and have no issues: chroma 4:4:4, 4K 120 Hz HDR, VRR, G-Sync. If you are on PC, Xbox, or PlayStation and need something longer than 2 meters, you can buy real fiber HDMI 2.1 cables, very expensive, $50–$100 USD.
It'll be nice having the ability to run 4K @ 240 Hz with HDR10 and G-Sync/FreeSync. Assuming monitor manufacturers actually bother and don't take even longer than usual to get it out.
Apart from the slow Windows tab-out of games, which is an Nvidia issue and can be mitigated with borderless fullscreen, can anybody tell a difference with DSC on or off? I sure can't. Do a Hardware Unboxed DSC on vs. off video.
Would DisplayPort 2.1b, being simply an active cable spec, work on existing DisplayPort 2.1a equipment? Even just having a monitor on an arm, I find, necessitates going from a 2-meter cable to a 3-meter, minimum.
Tim did say that the Nvidia 50 series is the first GPU line to support 2.1b… so, for example, old AMD DP 2.1 GPUs do not support it. So most likely you also need a DP 2.1 monitor…
@@mercurio822 5090 and 5080 will have DP 2.1b with UHBR20 "The new Asus RTX 50 series cards all support DisplayPort 2.1 with the maximum UHBR20 80Gbps bandwidth."
Is it closed source like 2.1? That is all I care about. If it is, then no HDMI 2.2 on Linux for AMD, because the driver is open source, just like with 2.1.
What is the max resolution decode of the 5000 series gpus? 8k60? or 12k60? I don't care about concurrent, just the max decode. (I want to know how far I can push VR movie/doco creation)
We are still struggling with 4K frame rates even with top end cards.... how many years is it going to take before this becomes the bottleneck just like 8K displays?
MediaTek still can't manufacture a chip that can have 4 HDMI 2.1 ports, lol. Sony, Panasonic, and the rest of the Chinese TV brands have only 2 HDMI 2.1 ports. Only LG and Samsung have 4 ports, because they use custom chips. By the time MediaTek announces a new chip that supports 4x HDMI 2.1, LG and Samsung will be on HDMI 2.2 😂
So an 800G OSFP DAC cable at 2 m length costs ~220€; what is the HDMI Forum smoking that they can't even get close to that yet? Also, a big middle finger to the HDMI Forum for refusing to allow open source drivers to use even a closed-source HDMI 2.1 blob.
You can have HDMI 2.2 at 96 Gbps if your cable doesn't exceed 6 inches. 🙄 Just switch all of them to optical. I guess copper is going to the moon anyway, so optical will end up cheaper, and it isn't susceptible to interference like copper HDMI cables. In my setup, HDMI 2.1 is a flicker-fest beyond 6 ft with copper cables.
In 2029, when hardware is powerful enough to run 4K games natively without feeling sluggish or choppy, people will be happy to get their hands on these standards.
Oh great. MORE cables that are indistinguishable from one another. Will anyone actually follow "standards" or indicate which is what? I suspect simply more opaque marketing "nuance". Thanks...
So why are we still getting monitors with HDMI 2.0 and DP 1.4b? Trickle-down technology is dead in my view. While research advances and cutting-edge tech is developed, the gap to real-world products just gets larger and larger.
This video leaves out a HUGE important point that the HDMI Forum and manufacturers don't really want you to talk about, so that consumers will be confused and buy the wrong hardware: HDMI 2.2 introduces an OPTIONAL new bandwidth of 96 Gbps.
You will actually start seeing HDMI 2.2 GPUs and displays coming out as soon as the HDMI 2.2 spec is ratified during 2025. But they will all have HDMI48C (HDMI 2.2, 48 Gbps, Compression-Supported) or HDMI48U (HDMI 2.2, 48 Gbps, Uncompressed-Only) ports and won't be able to operate at 96 Gbps. Why can they call these HDMI ports HDMI 2.2 ports? Because the new 96 Gbps bandwidth is OPTIONAL. Sadly, DSC compression support is also OPTIONAL in the HDMI spec, so be sure to check for that specifically when choosing a device as well. The minimum required bandwidth for HDMI 2.2 will likely be the same as for HDMI 2.1: 4.95 Gbps. Any bandwidth above that is optional. This is, for example, how Sony is able to say the PlayStation 5 supports HDMI 2.1, when really it has an HDMI 32C port and doesn't support the full 48 Gbps optional bandwidth that is in the HDMI spec.
Sadly, DisplayPort 2.1 is only a little better. With DisplayPort 2.1, DSC compression support is REQUIRED, but any bandwidth above 6.48 Gbps is OPTIONAL. That's right, a display manufacturer can say their monitor supports DisplayPort 2.1 without even having to support UHBR bandwidth speeds.
VESA and HDMI Forum, please make your naming schemes friendlier. In the meantime, we should all use labels like HDMI48C and DP80 so that people actually know what these devices are capable of at first glance.
Damn bro maybe you should start your own tech channel with all that knowledge
Damn bro cooked🔥🙏
*Bingo*
Mostly one will encounter marketing "nuances".
We no longer are allowed to use words like 'lies" and "misrepresentation".
Thanks for the info
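For anyone wanting to sanity-check the tier claims in that comment, here is a rough back-of-envelope calculator. It is purely illustrative: the tier names follow the comment's proposed labels, the payload fractions are the approximate channel-coding overheads (16b/18b for HDMI 2.1+ FRL, 128b/132b for DP 2.0+), and blanking overhead is ignored, so real requirements run somewhat higher.

```python
def required_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Raw uncompressed pixel data rate in Gbps (30 bpp = 10-bit RGB).
    Blanking intervals are ignored, so real links need a bit more."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Approximate usable payload per link tier, in Gbps, after channel coding
PORTS = {
    "HDMI48 (FRL)":  48.0 * 16 / 18,    # ~42.7
    "HDMI96 (FRL)":  96.0 * 16 / 18,    # ~85.3
    "DP UHBR10":     40.0 * 128 / 132,  # ~38.8
    "DP80 / UHBR20": 80.0 * 128 / 132,  # ~77.6
}

mode = required_gbps(3840, 2160, 240)  # 4K 240 Hz, 10-bit RGB: ~59.7 Gbps
for name, payload in PORTS.items():
    verdict = "fits uncompressed" if mode <= payload else "needs DSC"
    print(f"{name}: {verdict} ({mode:.1f} vs {payload:.1f} Gbps usable)")
```

By this estimate, 4K 240 Hz at 10-bit only fits uncompressed on the optional top tiers (HDMI96, DP80), which is exactly why the port's actual bandwidth matters more than the version number on the box.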
IIRC they were/are taking money from manufacturers so they could make this change from previously strictly defined standards; they ain't gonna change it back.
Manufacturers want consumers to be confused so that it's easier for them to sell their stuff, especially older/cheaper stuff.
Spotted "16K at 60hz" in the fine print. That's insane.
I remember when HDMI 2.1 was announced. The Xbox One X was supposed to have it. That lesson was, don't hold your breath.
Still weird to me that the standard is so far behind that it holds back the displays we have. I'd really expect it to be treated more like PCIe, where the standard is well above what anyone realistically uses, giving room for improvement. Supporting 4K 240 Hz should have been announced back when 4K displays really started being a thing, not when 4K 240 Hz displays are already on the market with DSC.
Yeah, all the ultrawides with 200% compression. It's insane.
It’s almost like sending extremely high bandwidth signals down meter-plus cables is more difficult than sending them across centimeter-long traces in a PCB!
@stillmoms Yeah, I don't really buy that, as making the cables hasn't really gotten that much easier with time. They don't use the most state-of-the-art nodes for their onboard chips, and we've been able to send 100 Gbps over fiber optic for over a decade; coaxial has been doing the same with analog signals. How do you think they supply entire neighborhoods with cable and internet?
@@Skylancer727 The cheapest short-range 100GbE transceivers are like $100 each. Are you prepared to pay $200 plus the price of a fiber optic cable to connect your display to your computer? People in these comment sections already complain when non-fiber Thunderbolt cables are $80 for 2 m of length. Economies of scale help, but the fact is this stuff is difficult and therefore expensive.
@stillmoms well guess what? We need it for our displays. They don't need only one model per generation. My issue was that the standard is far behind where it should be. I will gladly pay more to actually use my monitor at its best.
NICE! Can't wait to buy a 7070 Ti + 2.2 cable and an 8K monitor that has 2.2 in 2030.
I can't wait to buy a 4060 + $2 cable and a 1080p monitor that can do 60 Hz in 2030...
And play Cyberpunk 2077 at 20 fps (native) with path tracing on.
@@johnswan4551 can you not afford a 4060?
HDMI 2.2 is announced
Me: yay 4k 240 without compression
Companies: enjoy 8k 120 (with compression)
My eyes: what changed?
Thanks for the breakdown, Tim. It'll be a while til we get to see HDMI 2.2 in action, but it's something to look forward to
It will be a while till we see HDMI96C and HDMI96U ports in devices. Sadly we will see HDMI 2.2 this year, but these devices won't support 96 Gbps and it will cause much confusion.
Thanks for making this video Tim!
Did you do a handshake @ HDMI booth?
Hahaha
As a consumer, I find this kind of versioning very confusing. Why 2.1b? Why 2.2? Why not just 3, or 5? If you're jumping from 4K to 8K, make it a big major version upgrade. If you're jumping from 8K to 12K, make it another major version upgrade.
You'd think with so many numbers available they'd be more generous.
@ Yeah, they'll have 64K or 128K video resolutions with 768 Gbps bandwidth and the version number will still be something like 2.2be.
Maybe because it's the same socket.
@@nartcaz At this point we should be using QSFP 400-gigabit Direct Attach Cables.
I'm tired
While HDMI is proprietary and requires licensing, I'm trying to use DP. This weird situation with Radeon 7000/HDMI on Linux is ridiculous. Intel bypassed it in its 1st-generation GPUs with a dedicated DP-to-HDMI converter, but it's a strange situation overall.
P.S. For those using old hardware with new monitors, DP is cool. I have a mini PC with an 8th-gen Intel CPU, which has DP, so I can use for example 165 Hz Full HD or 60 Hz 4K. Its HDMI 1.4b limitation is 4K 30 Hz. Even more: with a thin client based on an old Atom Z8350, I can use 144 Hz Full HD over DP.
Ridiculous, yes, but not strange. HDMI requires a license and will not license an open source implementation, so AMD, which chose to implement HDMI in software instead of in hardware, cannot use this in its open source driver per the license requirements from HDMI. Nvidia solves this by having HDMI as part of the closed-source firmware, and Intel with dedicated hardware on the card.
@Henrik_Holst With that approach from HDMI I'd rather use DP where possible. It's a pity there are so few DP implementations in TVs.
@@axescar yep, which is also why I also use DP with my AMD card. Luckily a dp->hdmi connector can be found for quite cheap (just make sure that it supports higher than 1080p).
RemindMe! 5 Years “HDMI 2.2 and DP 2.1b is a standard and in every new models”
You don't need to wait 5 years. There are already devices on the market that support DisplayPort 2.1b. And once the HDMI 2.2 spec is ratified, thousands of devices will automatically support HDMI 2.2 as well. I am not sure that every device will have HDMI 2.2 and DP 2.1 support even in five years. To this day there are still so many DP 1.4 devices coming to market, and DP 2.0 was released in 2019.
@@saji8k not the same thing, here is saji8k's comment about it:
This video leaves out a HUGE important point that the HDMI Forum and manufacturers don't really want you to talk about, so that consumers will be confused and buy the wrong hardware: HDMI 2.2 introduces an OPTIONAL new bandwidth of 96 Gbps.
You will actually start seeing HDMI 2.2 GPUs and displays coming out as soon as the HDMI 2.2 spec is ratified during 2025. But they will all have HDMI48C (HDMI 2.2, 48 Gbps, Compression-Supported) or HDMI48U (HDMI 2.2, 48 Gbps, Uncompressed-Only) ports and won't be able to operate at 96 Gbps. Why can they call these HDMI ports HDMI 2.2 ports? Because the new 96 Gbps bandwidth is OPTIONAL. Sadly, DSC compression support is also OPTIONAL in the HDMI spec, so be sure to check for that specifically when choosing a device as well. The minimum required bandwidth for HDMI 2.2 will likely be the same as for HDMI 2.1: 4.95 Gbps. Any bandwidth above that is optional. This is, for example, how Sony is able to say the PlayStation 5 supports HDMI 2.1, when really it has an HDMI 32C port and doesn't support the full 48 Gbps optional bandwidth that is in the HDMI spec.
Sadly, DisplayPort 2.1 is only a little better. With DisplayPort 2.1, DSC compression support is REQUIRED, but any bandwidth above 6.48 Gbps is OPTIONAL. That's right, a display manufacturer can say their monitor supports DisplayPort 2.1 without even having to support UHBR bandwidth speeds.
@@saji8k That's more due to the fake branding behind these names. DP 2.1 doesn't actually require the full bandwidth, meaning it can be about the same bandwidth as 1.4 yet still carry the new name. HDMI 2.2 is the same, with the standard only requiring 48 Gbps, which is already the maximum for 2.1. That means tons of products will come out this year with fake branding on the ports.
Reddit
Better make it 10.
So glad you got to look at everything at CES. Such valuable info and insights you have esp with the FSr, you just know what to look for
HDMI 2.1 was announced in November 2017, and then the LG C9 had it in early 2019 because LG builds their own in-house chips and doesn't rely on MediaTek. Since TVs for gaming are the only real use case for such bandwidth and don't use DP, that's where I'm expecting to see 2.2 first. Either the 2026 or 2027 lineup of LG TVs (hopefully with 240 Hz) and the 2027 lineup of Nvidia GPUs. I'll wait with any upgrades until then.
I would be interested in a well-presented, full-blown video about HDMI and DP versions, specifications, what is optional within each specification, and how to check for these things, given that manufacturers are trying to obscure these specs while HDMI and VESA are empowering them to do so. Things are getting out of hand.
Why is fibre optic not a standard display tech?
still have to be delicate with the cables, and even more delicate with the connectors.
clearly haven't hit the limits of copper either
I agree with you, but it probably gets down to catering for the lowest common denominator. Optical cables tend to be fragile but, more importantly, susceptible to dust interference. The fragility can be addressed by using thicker cable sheaths. I think the challenge would be designing a connector spec that is robust enough to deal with 8 year olds and many plug/unplug cycles.
Because it is hella expensive; optical transceivers go for $500–2000 a piece and you need at least two.
There are plenty of optical HDMI-cables on the market for longer distances. They are cheaper than active ones.
@@sorwis I think the question they're asking is, why aren't there optical interfaces. Copper to optical cables often add latency, and certainly add cost and complexity.
U putting in hard work on all this new tech for us & us real ones appreciate u 🔥🔥🔥🔥🔥
The lil jingle after all these years, gets me in the feels.
Good overview. Longer DP cables sound nice. HDMI 2.2... well, it's still proprietary.
7:10 "at its fastest" What an eyesore
beat me to it
Can't believe these guys are the ones designing our video standards.
@user-ui4fn6fj3p I wouldn't make that big of a deal out of it. I don't think the engineer making the technology is the same person who does the graphic design.
And here I am with my keen eye for grammar, but unemployed. While some peabrain is making a fat paycheck writing stuff like that.
Whoever designs the HDMI and DP specs is anti-consumer. These specs need to be clearly defined on bandwidth and features instead of having so many optional things. It's BS.
@2:20 you do a b-roll over a table with a couple of optical cables. Is the HDMI forum going to release a standard for AOC so that we consumers can FINALLY get certified optical cables at various lengths? Or do they plan to still require manufacturers to individually certify every cable at every length?
For example, it looks like they had a cable from Fiber Command, which is NOT certified because it works with cables that are not at fixed lengths. Meaning Fiber Command can't be "certified". Or I'm completely wrong and they can be but haven't bothered paying the cartel for a pretty sticker on their boxes.
Take a shot every time Tim says 2.1
Is DP 2.1b UHBR20 only? Given the DP80LL name, I really hope it is. They've gotta make it less confusing for consumers if they want us to buy high-quality cables.
HDMI forum can walk off a cliff.
Take a shot every time Tim says "so, yeah..."
Take a shot every time Tim is Australian
Which monitor to get for under 1k for 70% game development work and video editing, and 30% games/movies/etc
Currently thinking a 32 inch QD Oled
8:26 I bought an active 2.1 cable way before the market even had a DP 2.1-capable monitor. While I couldn't test that, the 100 m HDMI cable from the same company worked fine for transmitting a 1080p 60 Hz signal from my PC, so I would expect it to work for the 2.1 standard.
Going forward, I think there's no room for passive cables, only active, because copper can't handle that much bandwidth at demanding lengths.
The classification was pointless to me, but it serves a purpose: letting buyers buy with confidence.
I just want to know why these standards are being released so slowly. We know 96Gbps is possible, just like 48, 24, 10, 5, and 1 before it. Copper isn't evolving. What are we waiting for?
Can't wait to see large format 4K 240Hz/ 8K OLED TVs in 2028.
I am not buying a new big screen TV until they offer at least 480hz.
i literally tried to look this up earlier, yay for an easier answer
Feels like the manufacturers are just being cheap and not giving us a solid high-bandwidth solution. Like needing to implement DSC just because the link can't deliver.
I can't wait till they make it even more complicated yayyyy!
Is the 'OLED' on his top supposed to simulate burn-in? If so, do I get a cookie for noticing?
There are many of us who run our audio and video through hi-fi separates or receivers. I also use 5 and 7 m length HDMI cables.
I do not get the DisplayPort naming system, but at least DisplayPort is significantly better than HDMI in almost every way.
Not sure if I understand this right.
Will a DP 2.1 monitor (for example the FO32U2P) work with a DP 2.1b cable, or do monitors need a specific DP 2.1b port?
If the "old" 4K 240 Hz monitors are not compatible, that would be really sad.
Just the cable is new. 2.1 monitors are still compatible.
the revision just specifies active cables that would work in a DP 2.1a port
You need a new monitor with DP 2.1b connectors…
At least Nvidia's 50 series GPUs are the first to support DP 2.1b, so previous AMD GPUs can only support DP 2.1!
So there is a difference!
At least it's not USB 3.2 Gen 2×2 yet, right? Right...?
Can't we just name USB, HDMI and DisplayPort by their respective speeds?
Like, whatever... If a DP cable's speed is 80 Gb/s, then it can't really be anything but 2.1.
If an HDMI cable is capable of 96 Gb/s, then it most likely is 2.2.
I just need the speeds labelled, especially with USB, since that naming is even more of a clusterfuck than the _possibly_ misleading names of HDMI and DP 2.1.
If anything, such a simple thing is made confusing for consumers, presumably on purpose -though I may be wrong in thinking that, and the people in charge of naming are playing 4D chess to which I wasn't invited- which is just frustrating...
Exciting stuff!
Tim, did you see the Acer Predator X323QX 31.5" 5K 144Hz IPS display?
Are you planning on making a review/test on the Samsung Odyssey G80SD? I know it's not one of the newest monitors out there. But I think it would be really interesting to see, just because the Odyssey monitors were always quite good in comparison to other monitors with a similar panel.
Given that HDMI will overall be more valuable to most people, I'm hoping the new DP ports will at least match HDMI 2.1/2.2 in audio codecs and compatibility.
Does HDMI support active cables? If not, how did they solve the cable length problem?
Oh look, Tim has found the Handy Dandy Movie Input booth 😜
awesome video
I love this. We've had HDMI 2.1 and DP 2.1 or 2.0 out for years, yet neither is standard on anything except consoles. The current monitor selection and even GPUs don't offer either as standard. A lot of monitors are still using DP 1.4... HDMI 2.0... so yeah, I'm not excited for this.
If GPU manufacturers got involved in the cable standards stuff, whatever their focus area (streaming, gaming, or whichever category they're targeting), things would kinda be in sync.
DP 2.0's 80 gigabit/s bandwidth is maxed out by 5120x2160 at 250Hz with 10-bit color, if you don't want to use DSC.
VESA, give us more bandwidth PLEASE🙏🏻🙏🏻🙏🏻😢😢😢.
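A rough back-of-the-envelope check of that claim (just a sketch: it counts active pixels only and ignores blanking overhead, and it assumes the commonly cited figure of roughly 77.4 Gbps usable on a DP80/UHBR20 link after 128b/132b encoding, rather than the full 80 Gbps):

```python
def raw_video_rate_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    """Uncompressed data rate for the active pixels only (ignores blanking)."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

DP80_USABLE_GBPS = 77.4  # approx. usable rate of an 80 Gbps UHBR20 link after 128b/132b

rate = raw_video_rate_gbps(5120, 2160, 250, 10)
print(f"5120x2160 @ 250Hz 10-bit: {rate:.1f} Gbps raw")   # ~82.9 Gbps
print(f"Exceeds DP80 usable rate: {rate > DP80_USABLE_GBPS}")
```

Even before blanking overhead, the raw pixel rate already exceeds what a DP80 link can carry uncompressed, so that mode really does need DSC (or more bandwidth).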
That mouse cursor being rendered on the left border of the video during the sponsor segment was weirding me out, especially since mine is set to the same size :D
A bit off topic, but will there be a review of the PGO32UFS? I'm considering buying a dual-mode monitor, and this one seems the cheapest at $899: 4K 240Hz / 1080p 480Hz, WOLED. If someone is aware of it or has it, can you provide a bit more information? Is it worth it?
So are we back to using HDMI cables? I have been using DP for all my PC displays
You're doing it wrong. Since the 30 series, if the monitor supported it, HDMI was better. Now with the 50 series finally getting DisplayPort 2.1, DisplayPort is better, and when GPUs finally get HDMI 2.2, HDMI will be better again.
@Frozoken I just checked: over HDMI my Asus OLED is capping at 120Hz, but I can use 12-bit if I drop to 60Hz. I've got a 3090.
@pivorsc Then your monitor doesn't support 2.1.
I mean, Thunderbolt 5 can theoretically support 120 Gbps…
1440p is more than enough to enjoy gaming to the fullest. We're going crazy with 4k and high refresh rates. You don't have to spend so much money for a simple hobby.
Can a monitor with a DP 2.1a input benefit from DP 2.1b?
Most likely not. The Nvidia 50 series is the first GPU generation to support DP 2.1b… that means older AMD GPUs with DP 2.1 cannot support DP 2.1b…
So monitors that have ”only” DP 2.1 do not support DP 2.1b…
It won't be a standard until Nvidia and AMD support it.
New MST hubs when? DP 1.4 hubs shouldn't be the only option at this point.
I connected my 40 series GPU to an LG TV using 2m/6.5ft Zeskit Maya HDMI 2.1 cables and have no issues: 4:4:4 chroma, 4K 120Hz HDR, VRR, G-Sync. If you're on PC, Xbox, or PlayStation and need something longer than 2 meters, you can buy real fiber HDMI 2.1 cables, but they're very expensive ($50 to $100 USD).
It'll be nice having the ability to run 4K @ 240Hz with HDR10 and G-Sync/FreeSync.
Assuming monitor manufacturers actually bother and don't take even longer than usual to get it out.
Apart from the slow Windows tab-out of games, which is an Nvidia issue and can be mitigated with borderless fullscreen, can anybody tell a difference with DSC on or off? I sure can't. Do a Hardware Unboxed DSC-on vs. DSC-off video.
A problem on the Nvidia side: DSR/DLDSR does not work with DSC.
Would DisplayPort 2.1b, being simply an active cable spec, work on existing DisplayPort 2.1a equipment?
Even just having a monitor on an arm alone I find necessitates going from a 2 meter cable to a 3 meter, minimum.
Very possibly; they are just adding signal enhancement to the spec, same protocol.
Tim did say the Nvidia 50 series is the first GPU to support 2.1b… so, for example, older AMD DP 2.1 GPUs do not support it. So most likely you also need a DP 2.1 monitor…
Can a longer 2.1b cable go to a DP 1.4 monitor?
Of course
When will monitors use DP 2.1b?
Soo we need a 5090 now!
The 5090 won't even have the new standard. And Nvidia compresses everything anyway, so no point.
@mercurio822 The 5090 and 5080 will have DP 2.1b with UHBR20: "The new Asus RTX 50 series cards all support DisplayPort 2.1 with the maximum UHBR20 80Gbps bandwidth."
@mercurio822 The 5000 series has full DP 2.1. I think "no one" needs more than full DP 2.1 80Gbps.
@ramonzaions7522 I do. I want a 500Hz 4K OLED in 5 years.
Any news on HDMI people saying "just buy a windows machine" to AMD +Linux people?
Thunderbolt 5
Is it closed source like 2.1? That's all I care about. If it is, then there's no HDMI 2.2 on Linux for AMD, because the driver is open source, just like with 2.1.
Yes, of course
I wish they would release/sell TV's with DP 2.1b
Not gonna happen. They want to sell us expensive small monitors…
I've never seen the HDMI consortium do anything good for society.
I hope USB-C soon reaches the speeds necessary to become a standard for multimedia interfaces.
Did I say how much I hate the HDMI consortium?
Why did Tom say you were banned from CES? Was he trolling you? So confused.
What is the max resolution decode of the 5000 series gpus? 8k60? or 12k60? I don't care about concurrent, just the max decode. (I want to know how far I can push VR movie/doco creation)
It will depend on the codec but I don't think anyone's published the info online yet anyways
We are still struggling with 4K frame rates even with top-end cards... how many years is it going to take before this becomes the bottleneck, just like with 8K displays?
That's going to be a thick cable.
And the cables are also shorter. I have problems with 10m HDMI 2.1; I use optical cables, but many of them fail over time. And with HDMI 2.2???
Same deal
MediaTek still can't manufacture a chip with 4 HDMI 2.1 ports lol. Sony, Panasonic and the rest of the Chinese TV brands have only 2 HDMI 2.1 ports. Only LG and Samsung have 4, because they use custom chips. By the time MediaTek announces a new chip that supports 4x HDMI 2.1, LG and Samsung will be on HDMI 2.2 😂
armored core spotted!
Post a video of the best 4K-and-above monitors to buy in 2025 for content creation.
Good video
So an 800G OSFP DAC cable at 2m length costs ~220€. What are the HDMI Forum smoking that they can't even get close to that yet?
Also big middle finger at the HDMI Forum for refusing to allow open source drivers to use even a closed-source HDMI 2.1 blob.
What is that mess @ 2:57? Shame on whoever did that for a presentation PC at CES.
Having a Display Port cable that can handle 4K 240Hz without DSC and be longer than 1 meter is a win, as far as I'm concerned. Thumbs up.
Just name it HDMI 3. Come on.
Thank you.
Can't we just use fiber optics?
Is it true that monitors with microLED technology will be created? Supposedly even better than OLED.
Meanwhile you can buy a 700€+ 2025 laptop with HDMI 1.4...
Cheap laptop… cheap connectors!
@haukikannel A 740€ laptop with a 2800x1800 120Hz OLED screen is cheap... right.
You should decrease the food bandwidth, you got all swollen up.
Should have been a new connector, to avoid the SHITTON of fake-labeled cables.
After releasing all these new devices only to be replaced in 11 months with new ones with new ports🤣
Videos need to be in HDR.....
My 7900gre has DP 2.1..
Great, just bought an AVR and HDMI is soon obsolete again
> HDMI GAMING at it is fastest
Shocking #apostrophecatastrophe in the background there
thanks
HDMI still compresses content though. DP does not
You can have HDMI 2.2 at 96 Gbps if your cable doesn't exceed 6 inches. 🙄 Just switch all of them to optical. I guess copper is going to the moon anyway, so optical will end up cheaper, and it isn't susceptible to interference like copper HDMI cables. In my setup, HDMI 2.1 is a flicker-fest beyond 6' with copper cables.
I just upgraded to 48Gbit HDMI lol
Come 2029, when hardware is powerful enough to run 4K games natively without feeling sluggish or choppy, people will be happy to get their hands on these standards.
Wow, looking at the title, this is exactly what I want to know about.
Oh great. MORE cables that are indistinguishable from one another. Will anyone actually follow "standards" or indicate which is what? I suspect simply more opaque marketing "nuance".
Thanks...
Literally unwatchable with that German AI audio track. It cannot be disabled by the user (Android app). Please stop enabling it for videos.
So why are we still getting monitors with HDMI 2.0 and DP 1.4b…? Trickle-down technology is dead, in my view. While research advances and cutting-edge tech is developed, the gap between that and real-world products just gets larger and larger.