I don't really have a body for this. I wasn't aware this was a thing and still feel like it's BS, but maybe there is an actual explanation for the HDMI Forum's decision that I am missing.
HDMI has never been an open standard (to the best of my understanding anyway). You’ve always needed to be an adopter or a member of HDMI forum to get the latest (or future) specs. So it’s not like they’ve just rejected a new idea. The rejection is fully consistent with their entire history of keeping the latest versions on lockdown.
Standards organizations like the HDMI Forum look like a monolith from the outside (like "they should explain their thinking here"), but really they are loosely coupled amalgamations of hundreds of companies, all of whom are working hard to make sure that (a) their patents are (and remain) essential, and that (b) nothing mandatory in a new version of the standard threatens their business. Think of it more like the UN General Assembly than a unified group of participants. There likely isn't any unified thinking, other than that many Forum members are also participants in the patent licensing pool, so giving away something for which they collect royalties is just not a normal thought. Like… they're not gonna give something away without getting something in return.
I was a member of the HDMI Forum for a brief while. Standards bodies like this are a bit of a weird world where motivations are often quite opaque.
You want companies to stop supporting and using your shitty standard? Because that is how you get customers to stop using your standard and, by extension, your companies.
let's make USB and DisplayPort drivers open source. seriously!
Can we just do display port then?
displayport is starting to appear on some higher end tvs
Sounds good to me
Be a shame if it leaked on the internet.
Time to kill HDMI with USB4/Thunderbolt and bring those costs way down.
Normalize USB-c with screws and we have a deal.
Display Port would be better suited to do all of those things.
Usb-c supports DisplayPort! Long live open standards!
aren't USB-C ports that do display output just using DisplayPort standards to do it anyway?
Be a damn shame if someone leaked the driver.
Or even better, drop HDMI support in favor of displayport
I really hope we’ll see TVs with DisplayPort one day.
I think I’d like DisplayPort over a USB-C connector. It seems like this might be an easier sell too, since the general non-techy populace is already used to everything going to USB-C (thanks EU). Maybe one day we can actually just use the same cable for everything. I realize that not all USB-C cables are equal, but maybe if TVs used USB-C, we’d see more cables supporting power, data, and video.
Mildly spicy take: USB is an unrecoverable disaster and we need an entirely unrelated team to invent something entirely new to replace it because we’re never getting this sleeping bag back in the little bag it shipped in.
USB 1.1 was cool in 1996; it replaced PS/2, RS-232, Centronics parallel, several proprietary connectors, several use cases for SCSI, ADB, Apple’s DIN serial ports, and probably some stuff I’m missing. There was an A plug and a B plug, main problem was both weren’t very obvious which way up you were supposed to plug them. Speed was low but firewire existed for high speed connections.
USB 2.0 was cooler in 2000. The plugs and sockets were identical, the cable was similar but with better shielding, it was as fast or faster than FireWire 400. They did start introducing more plugs, like Mini-B and Micro-B, mainly for portable devices. There were also Mini-A and Micro-A, I’ve never personally seen them. That pretty much finished off external SCSI. Higher speed FireWire was still there if you needed faster than USB but USB 2.0 did basically everything. To indicate USB 2.0 devices and ports, they made the tongues black in contrast with USB 1.1’s white tongues. Didn’t really matter in practice; by the time people had devices that needed the speed, USB 2.0 ports were all machines had.
USB 3.0 took too long to arrive in 2008. The additional speed was sorely needed by then, FireWire was mostly an Apple thing, PCs had but often didn’t use it, so PCs mostly didn’t have anything faster than 480Mbit/s until Obama was sworn in. USB 3.0 is best thought of as a separate tech bolted on top of USB 2.0, they added 5 more wires, a ground wire and two pair of high speed data lines for 5Gbit/s full duplex. The original four wires are also in the cable for power and 480Mbit/s half-duplex. They managed to make the A plug and socket entirely forwards and backwards compatible, the 3B sockets are compatible with 2B plugs (same with micro) but 3B plugs are not compatible with 2B sockets (again, same with micro). Which means we’ve just added two more kinds of cable for people to keep track of! So a typical consumer now likely has a printer with a USB A-B cable, some bluetooth headset or mp3 player they’re still using that has a mini-B plug, an Android smart phone with a micro-B plug, an iPod Touch with a Lightning plug because Apple are special widdle boys and girls with special widdle needs, and now an external hard drive with a 3A to micro-3B plug, which just looking at it is obviously a hack job.
Computer manufacturers didn’t help. It’s still common for PCs to have 2.0 ports on them for low speed peripherals like mice, keyboards, printers, other sundry HIDs, to leave 3.0 ports open for high speed devices. To differentiate these to users, 3.0 ports are supposed to be blue. In my experience, about half of them are black. I own a Dell laptop made in 2014 with 1 2.0 and 2 3.0 ports, all are black. I own two Fractal Design cases, all of their front USB ports are black. Only ports on my Asrock motherboards are blue. I’ve had that laptop for nearly 12 years now, I STILL have to examine the pinout to tell which one is the USB 2.0 port. My Fractal cases aren’t that bad because they have no front 2.0, but I built a PC for my uncle that does have front 2.0 and 3.0 ports, and they’re all black.
USB 3.1 showed up in 2013, alongside the USB-C connector, and the train came entirely off the rails. USB 3.1 offers even higher 10Gbit/s duplex throughput, maybe on the same cable as 3.0. If the port supports it. How do you tell a 3.1 port from a 3.0 port? They’ll silk screen on a logo in -8 point font that’ll scratch off in a month, it is otherwise physically identical. Some motherboard manufacturers break with the standard in a good way and color 3.1 capable ports a slightly teal-ish blue. USB A-B cables can carry a USB 3.1 10Gbit/s signal. But, they also introduced the USB-C connector, which is its own thing.
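(If you want to stop squinting at tongue colors: on a Linux box you can at least see what link speed each connected device actually negotiated, which is the only semi-reliable tell I know of. A rough sketch reading the standard sysfs attributes; this is Linux-only, it assumes the usual /sys/bus/usb layout, and some devices won't expose every field.)

```python
#!/usr/bin/env python3
# Rough sketch: list connected USB devices and the speed each one actually
# negotiated, by reading standard Linux sysfs attributes.
# Linux-only; assumes /sys/bus/usb/devices exists and is readable.
from pathlib import Path

SYSFS_USB = Path("/sys/bus/usb/devices")

def read_attr(dev: Path, name: str) -> str:
    """Read one sysfs attribute, returning '?' if it's missing."""
    try:
        return (dev / name).read_text().strip()
    except OSError:
        return "?"

if not SYSFS_USB.is_dir():
    raise SystemExit("no /sys/bus/usb/devices here (not Linux?)")

for dev in sorted(SYSFS_USB.iterdir()):
    if ":" in dev.name:          # skip interface entries like "1-1:1.0"
        continue
    product = read_attr(dev, "product")
    speed = read_attr(dev, "speed")      # Mbit/s: 1.5, 12, 480, 5000, 10000, ...
    version = read_attr(dev, "version")  # bcdUSB as reported by the device
    print(f"{dev.name:8} USB {version:>5}  {speed:>6} Mbit/s  {product}")
```

Plug a known 10Gbit/s device into the mystery port and see what it actually links at; that tells you more than the silkscreen ever will.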
USB-C was supposed to be the answer to our prayers. It’s almost as small as a Micro-2B connector, it’s reversible like a Lightning port, it can carry a LOT of power for fast charging and even charging laptops, and it’s got not one, but two sets of tx/rx pins, so it can carry high speed USB data in full duplex AND a 4k60hz DisplayPort signal AND good old fashioned 480Mbit/s USB2.0 half-duplex for peripherals. In one wire. That was the dream, anyway.
Android smart phones moved over to USB-C, a lot of laptops went mostly or entirely USB-C, PCs added one or two…and that’s where we are to this day. Keyboards, mice, wireless dongles, HIDs, still all use USB-A plugs, there doesn’t seem to have been any move at all to migrate. Laptops are now permanently in dongle hell as bespoke ports like HDMI are disappearing, yet monitors and especially televisions are slow to adopt DP over USB-C.
Also, about half of the USB-C cables on the market are 4-wire USB 2.0 cables. There are no high-speed data pairs in them, just D+ and D− plus power. They're phone charging cables; they're sufficient for plugging a phone into a wall wart or a car charger, but they often don't carry laptop amounts of power and they don't carry high-speed data or video.
USB 3.2 turned up in 2017, added the ability to do two simultaneous 3.1 10Gbit/s connections in the same cable, a boon for external SSDs, retroactively renamed 3.0 and 3.1 to 3.2 Gen 1 and 3.2 Gen 2, with the new two-lane mode being 3.2 Gen 2x2, changed the case logos to match, pissed in the fireplace and started jabbering about Thunderbolt. Thunderbolt was an Intel thing to put PCIe lanes out over Mini DisplayPort cables, usually for the purpose of connecting external GPUs to laptops but also for general purpose high speed data transfer. Well, around this time they decided to transition to USB-C connectors for Thunderbolt.
Problem: they use a lightning bolt logo to denote a Thunderbolt port. Lightning bolts, or angled squiggly lines, have been used to mean "high speed," "power delivery," "Apple Lightning," and now "Thunderbolt."
"Power delivery," sometimes but not always denoted by a yellow or orange tongue, means that the port delivers power even with the device turned off… or something. And it has nothing to do with the fact that USB-C cables now have chips in them to negotiate with power bricks and devices for how much power can be delivered, and nobody marks the cables as such, so you just have to know what your cables can do. They're nearly impossible to shop for, and if you want to set up a personal system of "my low-speed cables are black, my high-speed cables are white, my high-power cables are red," fuck you, your Samsung will come with a white 2.0 cable and nobody makes a high-power red cable.
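(For what it's worth, the kernel does surface some of what those in-cable chips and the PD negotiation report. A best-effort sketch that dumps Type-C port, partner and cable info from the Linux typec sysfs class; which attributes actually exist depends on your kernel and port controller, so treat the names below as assumptions rather than a spec.)

```python
#!/usr/bin/env python3
# Best-effort dump of USB Type-C port, partner and cable info from sysfs.
# Assumes the Linux typec class at /sys/class/typec; the attributes that
# show up vary by kernel version and hardware, so missing ones are skipped.
from pathlib import Path

TYPEC = Path("/sys/class/typec")

def dump(entry: Path, attrs: list[str]) -> None:
    print(entry.name)
    for attr in attrs:
        path = entry / attr
        if path.is_file():
            print(f"  {attr}: {path.read_text().strip()}")

if not TYPEC.is_dir():
    raise SystemExit("no /sys/class/typec: kernel too old or no Type-C ports")

for entry in sorted(TYPEC.iterdir()):
    if "-cable" in entry.name:
        # Cable details only appear if the cable has an e-marker chip.
        dump(entry, ["type", "plug_type"])
    elif "-partner" in entry.name:
        dump(entry, ["accessory_mode", "supports_usb_power_delivery"])
    elif entry.name.startswith("port"):
        dump(entry, ["data_role", "power_role", "power_operation_mode",
                     "usb_power_delivery_revision"])
```

It still can't tell you anything about a cable with no e-marker (i.e. most of the cheap 2.0 ones), which is kind of the whole problem.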
USB4 is coming out now, it’s eaten Thunderbolt to gain its power, it’ll be able to do even higher speed links if you get yet another physically indistinguishable cable, and if you hold it upside down it’ll pressure wash your car, but only Gigabyte Aorus motherboards support that feature as of yet.
The "fistful of different cables to keep track of" problem is only getting worse as we head into the USB4 era, and the whole thing needs to be kicked in the head and replaced entirely.
The renaming, while they kept selling it in the older packaging for years, has been angering me since it happened.
Honestly it wouldn't be so much of a problem if things were actually labeled appropriately, with all the actual specs and supported features on the package, but it's more profitable to keep you guessing (and going for the higher-priced one just in case).
They do the same thing with Bluetooth audio USB transmitters: they're "high quality audio" and "PS5 compatible," but they don't tell me whether they support aptX or not.
Also the whole "buy a product clearly pictured with a USB type-A connector, receive the USB type-C variant, if you're lucky with an adapter included" thing.
I mentioned this in another thread but “DP Alt” (DP over USB-C) is not a default feature of the USB spec and is an optional extension that needs to be added via additional hardware and supported by the device. At that point you’re basically adding in DP with just a different port.
To that end, it's still the same underlying problem: TV manufacturers just aren't adding DP support, regardless of the connector.
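For anyone curious whether a given USB-C port or device actually advertises DP Alt Mode: on Linux the discovered alternate modes show up in sysfs with their SVID, and DisplayPort's is VESA's 0xFF01. A minimal sketch, assuming a kernel that exposes the typec class; the directory layout varies between kernels and hardware:

```python
#!/usr/bin/env python3
# Minimal sketch: look for the DisplayPort alternate mode (VESA SVID 0xFF01)
# among the alt modes the kernel discovered on Type-C ports and partners.
# Assumes the Linux typec sysfs class; layout differs between kernels.
from pathlib import Path

DP_SVID = 0xFF01  # VESA's SVID for DisplayPort Alt Mode

for svid_file in Path("/sys/class/typec").glob("*/*/svid"):
    try:
        svid = int(svid_file.read_text().strip(), 16)
    except ValueError:
        continue
    if svid == DP_SVID:
        mode_dir = svid_file.parent
        active_file = mode_dir / "active"
        active = active_file.read_text().strip() if active_file.is_file() else "unknown"
        print(f"{mode_dir.name}: DisplayPort Alt Mode advertised (active: {active})")
```

If nothing prints, the port is data/charge only and you're back to needing a dedicated DP or HDMI output, which is exactly the TV situation above.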
Isn’t usb-c able to carry Thunderbolt, which subsumed DisplayPort at some point? I thought Thunderbolt and DisplayPort were thus merged into whatever the usb standard was at the time.
DisplayPort over USB-C is totally a thing. With things like USB-PD, USB seems to be getting dangerously close to becoming the standard for everything. The cables are a wreck though, and are way too hard for a layperson to tell apart.
I’m a very technical person and I can’t tell them apart.
Is there a symbol?
It's pretty simple and straightforward, all you have to do is buy the cable and a professional cable tester to see what specs it's actually in compliance with
Don’t worry, I’m sure when USB 4 releases, they’ll retroactively change the names of USB 3.2 Gen 1 and USB 3.2 Gen 2 to “USB 4.3 Gen 0.01” and “USB 4.3 Gen 0.02” respectively. Then USB 4 will actually be named “USB 4.4 Gen 5” just because.
And none of the cables will be labeled, nor will they simultaneously support high power delivery and full data speed. We’ll need to wait for “USB 4.4 Gen 4” for that, which is when the old standard will get renamed to “USB 4.4 Gen 3.5” instead.
“USB4” (not USB 4.0) released in 2019 and “USB4 Version 2.0” (not USB 4.1) released in 2022.
These days a ~10€ gadget can tell you about the electricity going through a USB connection and what the cable is capable of. I don’t like the idea of basically requiring this to get that knowledge, but considering the limited space on the USB-C plugs I’m not sure anything is likely to improve about their labeling.
That's good enough for me. What are they called, and where can I get one?
Nope! That’s part of the fun sadly. At least if you’re technical you’ll know that not all type-c cables are the same.
My monitor (tv) supports usb c and I like it! The flexibility was nice during my single battle station move
Digital signage
Unaffordable to a consumer.
Why even buy 'em, tho? They’re basically shitty monitors with spyware for brains.
Well if you want anything over 35 inches a TV is your only option. They don’t really make monitors bigger than that.
Because we don’t all live in dorm rooms sitting at desks watching TV. Some of us need something besides a 32-in.
I have enough back problems to remember a time when a 32 inch television WAS a big screen. My family had a 35 inch Sony Trinitron that weighed as much as a motorcycle. You do not NEED a 50+ inch screen.
I’m 43. I remember those days as well. It was shit.
I’m 38, and I remember the last gasps of CRTs in the early 2000s more fondly than the colicky 10-year toddlerhood of digital flat panels that followed.
I mean, you also had pretty low quality TV and home media, anything beyond 32 inches wasn’t doing much good for you.
well it's not the only option, it's the only consumer-oriented option.
the corporate option is large format display/digital signage screens
If you actually give a fuck about image quality beyond size and brightness, digital signage also isn’t really an option. You won’t find many commercial oled displays, for example.
Best option for home entertainment, imo, is still a consumer TV that you just never connect to the internet and use a set top box with instead. Assuming they don't start using meshnet to spy on you anyways.
Up The (soldering) Irons!
Don’t give them any ideas
Too late, TVs have apparently been known to connect to open WiFi networks if they find any.
That's potentially already a thing, actually. TV and even appliance makers have been floating the idea of using meshnet tech to spy on users, regardless of whether they're connected to the web or not, for a while now.
of course, I'm not suggesting anyone should use them. I'm just saying they exist. The companies that make the good screens are all part of the HDMI Forum, which is effectively why DisplayPort won't be offered at these screen sizes.
basically no company is going to drop HDMI unless a new up-and-coming screen company either develops proprietary DisplayPort tech, or VESA itself spins up monitor production.
unironically the only company i can remotely see doing said action is apple.
It’s a ‘why not both’ thing.
My experience is that digital signage displays are still HDMI-only.
many are hdmi only, but there are several that have display port as well. I see a lot since I work in lease return/e-waste recycling.
Are there a lot of 65" PC monitors?
Alienware once sold a 55" 120Hz OLED monitor.
Edit: It originally cost $4000
about the same specs as my TV, but 10" less and 4x its price
It looks to be $1200 on Amazon. Decent for OLED.
I got my 42 inch lgc2 for $990 a few years ago.
Technically those menu boards at restaurants.
Just don’t connect it to the internet.
That’s why HDMI needs to die and display port needs to take over. The TV industry is too big for that to happen of course. They make a shit ton of money off of HDMI
AMD should remove the HDMI port from all of their GPUs as a nice F.U. to the HDMI forum. They shouldn’t be paying the licensing fees if they are not allowed to make full use of the hardware.
That’d be suicide.
There would be uproar, but like the audio jack on phones people would come around. All it would take is one big enough company to pull it off, and the rest would follow.
Apple could remove the audio jack from iPhones because 1. They’re Apple. They could remove the eyes from their customers and 9/10ths of them would stay loyal. and 2. Eliminating the headphone jack mostly locked people out of $20 or less earbuds that might have come free with a previous phone anyway. People grumbled, and carried on using the Bluetooth headphones a lot of them already owned.
AMD doesn't have the following that Apple does; they're the objectively worse but more affordable alternative to Nvidia. Eliminating the HDMI port would lock them out of the HTPC market entirely; anyone who wanted to connect a PC to a TV would find their products impossible to use without experience-ruining adapter dongles. We're talking about making machines that cost hundreds or thousands of dollars incompatible.
Just bought a new phone that has an audio jack. Some of us refuse to “come around”. They can fit a stylus and an audio jack in this thing. Why did they remove the audio jack again? Not enough room? Bullshit
The point isn’t whether it’s needed or not. It’s not about space or features. The point is that a major player made a design decision and bucked the system. And while there may still be some phones with audio jacks, the majority of mainstream phones don’t. That major player is still successful, and other companies followed suit.
Can we agree this is what should happen to HDMI? No?
tbh I looked at audio jack internals, and they do usually have double the footprint on a PCB compared to what you see from the outside, at least on low-end consumer devices.

That's not to say they couldn't put something more compact in a high-end device like a smartphone.
Okay, but I have a USB-C slot, speakers, a stylus, and an audio jack all on the bottom of my new phone. The "they needed the room" excuse is bullshit, as evidenced by this 2025 phone.
It can also use an sdcard. Greedy fucking corporations just wanting you to repurchase stuff you already have.
For now, but DP, and especially DP over USB-C, is gradually becoming more popular for computer hardware; someone paying 400 euros for a GPU doesn't mind paying 10 bucks extra for an adapter if they have an HDMI monitor. But most monitors nowadays come with DP anyway.
The problem is not with monitors, but rather with TVs, which are (almost?) not using DP at all
No, because electronics aren’t alive
🤦‍♂️
HDMI adapter memory dimm card! That’s $45.99 all day long.
but maybe there is an actual explanation for HDMI Forum’s decision that I am missing.
Licensing money.
Maybe we can get it when they come out with 2.2 or whatever, then
Which could be paid by Valve in this case, especially since no one is expecting the Steam Box thing to be cheap.
If the license holder isn’t willing to accept the money, it doesn’t matter if Valve is willing to pay it.
Everyone has a price. Maybe a superyacht or two from Gabe’s fleet?
It’s not about liquid money. It’s about “preventing piracy” by blocking anything that could allow people to use certain features via FOSS systems.
I was thinking more along the lines of, "let people pirate. Here's a megayacht."
The license holder is attaching additional terms and conditions that are incompatible with publicly disclosing the driver source code.
It still boggles my mind things can be licensed/copyrighted without being forced to disclose source code. The lack of transparency we’re okay with in society is absolutely unsustainable.
This wouldn't work at scale. If Valve paid to license the spec for the Linux kernel, it would have to pay for every person who downloaded the driver, which is far more than the number of people who will buy the Steam Cube.
Unless of course you’re suggesting that the kernel driver for the new spec become closed source.
OK. Fine. Then it’s going to be reverse engineered and everyone will use it anyways and they’ll get nothing.
It’s not, people will just convert DP to HDMI and call it a day
If it ever gets open sourced, anybody will just use it without paying.
Yes, this isn't new, but it's resurfacing thanks to the Steam Machine. Basically (from memory), part of your title is accurate: AMD did create a FOSS driver with HDMI 2.1 support that does not violate HDMI Forum requirements, but the HDMI Forum still vetoed it. I don't know if it would necessarily "disclose the specification" as the first part of your title suggests, but I didn't dig into the details enough to say for certain.
Basically a dick move by HDMI. Maybe Valve can push their weight on this, we’ll see.
I quoted the link. I do not have more insights.
I wish we'd just say fuck the HDMI group and switch to the open DisplayPort standard, but we're not in control of the TV manufacturers, since they're the ones who created the HDMI group.
Let’s start messing up these fuckers…
I’ve got a feeling this is specifically related to DRM in the HDMI spec that prevents video capture of encrypted content. Maybe I’m remembering something vaguely from about a decade ago about HDMI content encryption that is no longer relevant, but my hazy memory is that this was a core element of the HDMI spec that media corps wanted to prevent digital copying. Not that it really means anything at this point, the seas are full of high quality rips regardless, but maybe there is some dubious legal value in preventing an open source driver?
Who makes up the HDMI Forum?
The council.