But still, I like it. :-)
For some reason the bandwidth is too low?
So at 60 Hz there is interop, but for 75 Hz you need a thick (dual-link) DVI cable.
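The single-link vs dual-link cutoff is just arithmetic on the pixel clock. Here's a rough back-of-the-envelope sketch; the blanking figures are approximate CVT-RB values I'm assuming for illustration, not exact timings:

```python
# Rough check: does a given mode fit within single-link DVI's TMDS clock limit?
# h_blank/v_blank defaults are approximate CVT-RB blanking, not exact timings.

SINGLE_LINK_DVI_MHZ = 165.0  # max TMDS pixel clock for single-link DVI

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
    """Pixel clock = total pixels per frame (incl. blanking) * refresh rate."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75):
    clk = pixel_clock_mhz(1920, 1200, hz)
    link = "single-link OK" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual-link"
    print(f"1920x1200 @ {hz} Hz -> {clk:.1f} MHz ({link})")
```

With these rough numbers, 1920x1200 at 60 Hz lands around 154 MHz (under the 165 MHz single-link ceiling), while 75 Hz pushes past it, which is exactly why the higher refresh rate forces you onto a dual-link cable.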
And assuming they do, they all have HDMI anyway.
As for everything else most people would connect to their TV, they all have HDMI and precisely none have DisplayPort.
Given that, supporting DisplayPort is an unnecessary expenditure on bill of materials and labor for TV manufacturers.
Just throw one in, and if the ecosystem grows (like it did on PCs), then keep replacing HDMI ports with DP ports.
And outside of gaming consoles it is becoming increasingly rare to connect anything to the TV outside of the luxury segment (even built-in sound is often better than any external sound up to the price region where the lower luxury segment starts, so buying a slightly better TV without an external sound system often beats a cheaper TV plus external sound).
And many of the "killer features" of HDMI (like network over HDMI) are semi-dead.
And DP is royalty free, HDMI isn't. So gaming consoles probably would love going DP only.
So as far as I can tell the only reason physical HDMI interfaces are still everywhere is network effect of them always having been everywhere.
I.e. if there is a huge disruption HDMI might go from everywhere to dying off.
And USB-C is perfectly suited for such disruption (long term).
I mean everything from server centers to high speed interconnects of consumer PCs is currently moving to various forms of "PCIe internally". USB4 and NVMe just being two examples. So it might just be a matter of time until it reaches the console/TV space and then USB-C with USB4+ or Thunderbolt would be the way to go.
Thing is, DP → HDMI adapters all suck when you’re using them to send anything but a basic 1080p picture. They nearly all fail or struggle with uncompressed 4K. I tried several different cables and adapters and despite marketing claims, they all had trouble. The best was one that was externally powered via USB, but even it exhibited quirks.
I no longer have the Rift hooked up to that machine which freed its HDMI port up, but I too wish TVs and receivers had even just one DisplayPort.
My guess is that it's not a technical or even a user-experience issue. It's probably a money issue with the deals TV manufacturers make with the media industry.
i.e. Netflix won't allow their app on a device that can circumvent HDCP.
DVI connectors offered analog VGA for years. This meant that graphics card vendors could put one port on their card that did both, huzzah, and a passive adapter got you VGA out of DVI.
DVI is ahead of DisplayPort by 8 years. The DMCA is passed and HDCP becomes a Thing. Many card vendors do put DisplayPort on their cards, since it's the "Professional" standard for video, but that isn't until 2008 or so. DisplayPort would not be widely adopted until 2012-ish.
Fast forward, VGA dies. DisplayPort and dual-link DVI are there. Dual-link DVI is forwards-ish compatible with the upcoming HDMI displays for TVs, as pushed by the MPEG-LA and DVD makers. In 2009, less than 5% of devices shipped had DisplayPort on them. DisplayPort at this time also could not handle the two most popular color spaces in use: sRGB and Adobe RGB (1998).
Part of the issue was perception: DisplayPort was adopted early on by Apple, and consumer understanding of Mini DisplayPort was that it was an Apple standard rather than an open one, which further pushed the port out of the limelight.
This poses an interesting question, maybe some of the hobby-lawyers on HN would like to chime in and post their theories :o)
Let's assume they print that logo on their box and call it HDMI in their Windows drivers, but don't do so in their Linux drivers, while it's still a spec-compliant implementation. Would that pose a potential legal problem, and if so, why?
If the issue is that they have access to the official HDMI 2.1 spec, implemented it, but call it something else, which I could imagine they forbid in some contract, would things change if some random hacker with too much time on their hands reverse-engineered the protocol by sniffing it and implemented it for the AMD driver (again without calling it HDMI)?
Too bad the HDMI forum doesn't feature an email address on their home page, I'd have loved to tell them what I think of them.
So a different term would be needed (like "WLAN" or "BT" are used by non-members).
It's not just that, it's about playing nice so they can have access to the next version of the spec if one comes along.
> Video Out port: DVI-D signal in 640×480 px, 60 Hz. The port supports a well-known video standard that we can't name due to copyright limitations. The first letter is H, and the last one is I.
https://blog.flipper.net/introducing-video-game-module-power...
Of course, they don't do HDMI 2.1 or anything advanced like that, but I guess the reason for the name not appearing anywhere is the same as you're discussing here.
Not nice. Neither worms, nor HDCP forum.
Aren't we able to create and use open standards?
DisplayPort is not proprietary; it is a VESA standard.
> Lightning
If you owned any iPhone between 2012 and the present, you had to use Lightning, without choice. And for some inexplicable reason, Americans love iPhones.
In theory, sure. In practice you'll have to construct a financially sustainable organization that is able to motivate all interested parties to chip in and is also able to certify the implementation and at the same time also doesn't fall victim to internal corruption (e.g. high C-level compensation making it unsustainable). I think there are few-to-no precedents for that in the open source space in general, and even less when it comes to standards body organizations for maintaining a standard at that level of complexity.
In most domains proprietary specifications form the backbone of everything. A lot of governments refer to ISO standards, which by default are not open access.
Here's a list of just the best known ones [0]. There are literally hundreds of thousands of open standards for everything from communications to mechanical engineering, to packaging to chemical formulas....
They make the world go round.
No piece of technology you use today, especially the Internet, would function without open standards and standards bodies.
For some reason, bits of the digital tech industry, in particular media and entertainment, have a parochial disconnect from the rest of reality and forget that they stand on the shoulders of giants and operate with the assent of everyone else in the world, who give them the standards space within which to work.
[0] https://en.wikipedia.org/wiki/List_of_technical_standard_org...
The moment you want to control what people watch and how they watch it, you lose any hope of having an open standard.
Translation by obsolete sandwich-fed LLM:
In practice, zealously litigious organisations will assemble corporate lawyers in a room to compute profits and to define access constraints for consumers.
Standards documents being behind a paywall is not at all the same thing as something being proprietary or needing to be licensed. ISO charges for standards documents to pay their administrative costs, you can implement those standards without paying any extra money. And if you happen to have an alternative way of implementing the standard without reading its document, that is fine too. If you implement JPEG XL by studying its open source reference implementation, that is A-OK.
In the meantime, keep paying taxes, rent, subscriptions, and utility bills.
i'm not even sure it's capitalism that needs to get modified? maybe it's something about how private/public property works that is clearly off the mark and needs updates?
I'd argue that on the internet specifically, open-source implementations of the protocols are the backbone of everything, not closed proprietary specs; aren't most internet specifications open?
You can, but you need monitor/graphics-chip companies to use it. These are mostly the members/creators of the HDMI standard, and they are also probably best placed to create a standard that others will use.
Once you get a libre OS you can dump the contents of the bus, or fake whatever HDMI hardware is out there, to get pristine audio and video frames. Also, current Hollywood movies are very subpar compared to what we had in the '90s, so who cares.
My SO has an Amazon Prime account, and yet they want to show adverts in the middle of media she already paid for, which in theory should be displayed without ads. So you are paying them twice. Thus, I don't consider BitTorrent piracy when you legally paid for a service but the streamers can break the rules anytime.
The average power user won't be able to run SkyShowTime on Linux. The idea is locking everyone into Windows, OS X, or Linux with Secure Boot and a verifiable boot chain if they want to watch movies or TV shows.
Their first sentence is "Any Linux user trying to send the highest-resolution images to a display at the fastest frame rate is out of luck for the foreseeable future, at least when it comes to an HDMI connection", but that's plainly not true. Hardware with closed-source drivers, such as the standard Nvidia ones, does support those modes, because those drivers don't have this legal limitation. They even end with the possibility of a closed-source AMD driver, without bothering to ask whether anyone else has already done this.
https://hdmiforum.org/about/hdmi-forum-board-directors/
I see people from Apple, Panasonic, Sony, Nvidia, Samsung, etc.
Hardware companies. Maybe you have to buy your way in the club.
Almost anti-competitive that the HDMI 2.1 spec people won't allow an open implementation.
That they don't even allow open implementation should have been a red flag to all of us that HDMI 2.1 has not been subjected to sufficient review.
Have any of you sufficiently reviewed an actual implementation of this spec? Only with black-box testing because it's closed source?
https://www.amazon.com/AmazonBasics-Aluminum-USB-C-DisplayPo...
They also have an adapter: https://www.amazon.com/AmazonBasics-Aluminium-DisplayPort-Ad...
I don't understand. Fuck whatever committee said whatever crap. Open source is open source. Just make the damn driver and give the suits two middle fingers.
Also, "HDMI" itself is a trademark whose use is only allowed to Members (like "Wi-Fi"), so even if a non-Member did an open-source HDMI implementation against the will of the HDMI Forum, they would likely not be allowed to call it "HDMI" (much as "WLAN" is used by companies without Wi-Fi Alliance membership).
WiFi is pretty much there already
It's the HDMI Forum that's being idiots in this case, because the Content Mafia is afraid someone might use this to steal content. Anyway, back to torrenting stuff...
A cheap DP-HDMI dongle makes all this go away. As long as VESA doesn't behave the same way, anyway.
I normally run it at 120 Hz over DP, but it works fine over HDMI 1.1 at 60 Hz. My (5+ year old) TV runs at 1080p/120 Hz just fine too.
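The 60 Hz vs 120 Hz split here also falls straight out of the TMDS clock limits. A minimal sketch, using the standard CEA-861 1080p frame total of 2200 x 1125 pixels (including blanking) and the nominal per-version clock ceilings:

```python
# Which HDMI TMDS clock ceiling does each 1080p refresh rate need?
# 2200 x 1125 is the standard CEA-861 total frame size for 1080p.

HDMI_TMDS_LIMIT_MHZ = {"HDMI 1.0-1.2": 165.0, "HDMI 1.3-1.4": 340.0}

def clock_1080p(refresh_hz):
    """Pixel clock in MHz for 1080p at the given refresh rate."""
    return 2200 * 1125 * refresh_hz / 1e6

for hz in (60, 120):
    clk = clock_1080p(hz)
    fits = [ver for ver, lim in HDMI_TMDS_LIMIT_MHZ.items() if clk <= lim]
    print(f"1080p{hz}: {clk:.1f} MHz, fits: {fits}")
```

1080p60 needs 148.5 MHz, which squeaks under HDMI 1.1's 165 MHz limit; 1080p120 needs 297 MHz, so it only works on HDMI 1.3 or later (which is presumably what the TV's port actually is).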
Even DVI had HDCP though, so it too was flawed :-( https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content...