[HN Gopher] When HDMI 2.1 Isn't HDMI 2.1 - The Confusing World o...
       ___________________________________________________________________
        
       When HDMI 2.1 Isn't HDMI 2.1 - The Confusing World of the Standard
        
       Author : cbg0
       Score  : 129 points
       Date   : 2021-12-13 18:57 UTC (4 hours ago)
        
 (HTM) web link (tftcentral.co.uk)
 (TXT) w3m dump (tftcentral.co.uk)
        
       | theandrewbailey wrote:
       | It sounds like the marketing people who kept renaming USB 3.x Gen
       | FU got hired to mess up HDMI.
        
       | PedroBatista wrote:
       | I'm a "tech guy" and I'll be in the market soon for a screen.
       | 
        | Just the thought that I'll have to learn about this whole HDMI
        | disaster in order not to get burned gives me anxiety.
       | 
        | Also, I never quite liked HDMI when it came out, but from what
        | I'm reading they've really outdone themselves over the years.
        
       | nvarsj wrote:
       | This article doesn't mention the most infuriating aspect of HDMI
       | - it's not an open standard! It's a closed, proprietary standard
       | that requires licensing fees and prevents any open source drivers
       | from existing. This is why the Linux AMD open source drivers
        | don't support HDMI 2.1 - so you can't drive a display at
        | 4K@120Hz over HDMI on Linux with AMD.
        
       | mey wrote:
        | Quick summary: HDMI 2.0 no longer "exists", and the HDMI
        | 2.1-only features are now optional according to the certifying
        | body. Manufacturers are supposed to indicate which features
        | they support.
       | 
        | Whelp, I guess we should just stick to DisplayPort 1.4
        
         | jayflux wrote:
         | I see they went to the USB school of standardisation
        
           | Roritharr wrote:
           | I really wonder what is up with that. These standards become
           | increasingly frustratingly complex even for people who deal
           | with them daily.
        
             | colechristensen wrote:
             | It's just design by committee and long-standing efforts for
             | backwards compatibility. Also the people writing the
             | standards are far too familiar with them and thus a bit
             | lost when it comes to making practical decisions.
             | 
             | Whenever you make changes there will be compromises and
             | someone will have reason to be unhappy.
        
             | jjoonathan wrote:
             | Cable manufacturers probably realized that the secret to
             | profits lay in resisting commoditization, beat a path to
             | the table, and made it happen.
        
             | zokier wrote:
              | I don't think it's particularly odd that the
              | specifications are supersets of old versions; indeed that
              | feels pretty common in the standards world. IETF specs are
              | maybe the odd ones out, where you typically have to read
              | something like ten different RFCs to get a good picture of
              | a standard.
        
           | mavhc wrote:
           | USB 2 Full Speed = USB 1 speed
           | 
           | This has been going on for 20 years
        
           | errcorrectcode wrote:
           | LOL. Why don't you understand superduperspeed USB 3.4.6
           | 2x2x6? It's so eeeasy.
           | 
           | And, coming soon to an Amazon app near you:
           | 
           |  _(Pack of 3) USB4 Thunderbolt 4, 98 ft / 30m, 240W charging,
           | gold plated! $19.99! Free Shipping!_
           | 
            | It's not like anyone is checking that products are what
            | they say they are.
        
             | spicybright wrote:
             | You can't legally call your food a hamburger if it's 90%
             | meat glue and 10% cow.
             | 
             | Can we not do the same with cables?
             | 
             | Whoever is responsible for setting standards must be
             | getting some good kickbacks from all this...
        
               | colejohnson66 wrote:
               | The people writing the standards are also the ones
                | implementing them. That's the kickback. That's why every
                | USB 3.whatever device suddenly became a USB4 one.
        
         | IshKebab wrote:
          | Seems kind of reasonable. DisplayPort is exactly the same -
          | just because your display supports DisplayPort 1.4 doesn't
          | mean it's required to support 10-bit colour, VRR, etc.
        
           | tjoff wrote:
           | Well that makes sense. Your 1080p display shouldn't be forced
           | to accept an 8k signal just because the interface supports
           | it.
        
           | thaumasiotes wrote:
           | I replaced my old Dell XPS laptop with a newer Dell laptop.
           | The old one had been happily driving an external monitor
           | through a thunderbolt cable (USB-C on the laptop side;
           | DisplayPort on the monitor side.)
           | 
           | The new laptop still has a thunderbolt port proudly
           | advertised in the specs. But I'm not allowed to use it. It
           | won't send video data out that way. And when I called tech
           | support, the most they would offer me for my newly-purchased
           | laptop was "Don't use the thunderbolt port. Use the HDMI
           | port."
        
       | kup0 wrote:
       | Why does it feel like it is inevitable that
       | standardization/licensing organizations in tech will always
       | eventually turn into a user-hostile mess?
       | 
       | USB, HDMI, what can we screw up next?
       | 
       | Is it incompetence? Malice? I'd really like to see an in-depth
       | investigation of this phenomenon
        
         | rob_c wrote:
          | I'm leaning towards malice (through not caring much for
          | users) caused by big tech using this arena as a battleground.
         | 
         | I wish all cables were equal too, but c'est la vie
        
         | errcorrectcode wrote:
         | Greed, of course.
         | 
         | If it becomes too big of a problem, each cable and device will
         | be required to have a challenge-response Obscure Brand Inc.
         | proprietary U9 chip burned with a valid secret key and serial
         | number at the factory that must return a valid response for the
         | link to be enabled.
        
         | ASalazarMX wrote:
         | HDMI was never a pro-user protocol, it was made to encumber a
         | digital display signal with DRM.
        
           | ChuckNorris89 wrote:
            | This. HDMI was cooked up as a proprietary connector with
            | DRM by the big manufacturers in the DVD/Blu-ray, TV and
            | home-entertainment business and the big movie studios, to
            | enforce stronger content protection for their IP, at which
            | it failed miserably, as I can still torrent every Hollywood
            | blockbuster and every Netflix series.
           | 
           | IIRC, every manufacturer must pay a fee to the HDMI
           | consortium for every device with HDMI they sell.
           | 
            | DisplayPort, by contrast, is a more open standard,
            | requiring only a flat fee for the standard documentation
            | and membership instead of a fee per unit sold, IIRC.
        
             | babypuncher wrote:
             | DisplayPort and DVI both support HDCP. This wasn't the
             | purpose behind HDMI, though support for it was no doubt a
             | requirement for adoption. It was designed to be a single
             | cable for carrying video and audio between playback
             | devices, receivers, and displays.
             | 
             | For this purpose, it succeeded and did a much better job at
             | it than alternatives. HDMI still makes far more sense for
             | use in a home theater environment than DisplayPort thanks
             | to features like ARC.
        
               | Talanes wrote:
                | HDMI is great for a home theatre setup where there's an
                | obvious central unit, but the ecosystem has gotten worse
                | if your speakers don't take HDMI in, at least at the
                | very cheap end of the spectrum I buy on.
               | 
               | My current TV will only put out an inferior "headphone"
               | mix over the 3.5mm connection, and the SPDIF connection
               | is co-axial on the tv, but optical on the speaker. Having
               | to run a powered converter box just to get audio from my
               | tv to a speaker feels like such a step backwards.
        
               | spicybright wrote:
                | Is there a big difference between 3.5mm and something
                | digital?
                | 
                | I know 3.5mm is technically worse, but I've never been
                | able to actually notice the difference.
        
               | tjohns wrote:
               | I think the better question is why SDI video connections
               | aren't available on any consumer devices.
               | 
               | While HDMI is nice for consumers because it carries
               | audio/data, SDI cables are cheap (just a single coax
               | cable!) and easy to route (T-splitters are a thing!).
               | 
               | SDI does not support HDCP, however.
        
               | MomoXenosaga wrote:
               | On that note are there TVs with displayport?
               | 
                | I'm using my LG TV as a monitor for a PC and am forced
                | to use HDMI.
        
               | dwaite wrote:
               | IIRC, most panels interface via DisplayPort internally
               | these days.
        
               | beebeepka wrote:
                | Gigabyte has been selling a version of the LG CX 48
                | slightly changed to be a monitor. It has HDMI and DP.
               | 
               | Model name is AORUS FO48U.
        
           | toast0 wrote:
           | HDCP can run on DVI or DisplayPort too. HDMI is a smaller,
           | lower pin count connector than DVI, however.
        
             | mjevans wrote:
             | HDMI's initial version is electrically and pin-compatible
             | (passive adapter only) with DVI-D single link; assuming the
             | DVI port supports HDCP.
             | 
             | The parent post is correct in that the mandatory HDCP was a
             | major feature (for the involved cabal of organizations).
        
               | teh_klev wrote:
               | > The parent post is correct in that the mandatory HDCP
               | was a major feature
               | 
               | This is wrong. HDCP isn't mandatory to implement HDMI,
               | they are two separate technologies. I'm not defending
               | HDCP or DRM encumbered content but I wish folks would get
               | their facts straight.
        
             | [deleted]
        
           | teh_klev wrote:
           | Not quite true. The "DRM" mechanism you're most likely
           | referring to is HDCP which was designed separately by Intel
           | to provide copy protection over multiple device types
           | including DVI (the first implementation), DisplayPort and of
           | course HDMI.
           | 
            | It's not the HDMI interface that enforces copy protection;
            | it's the software, firmware and additional hardware on the
            | devices that do this. You can use HDMI perfectly fine
            | without the additional DRM crap.
        
         | noneeeed wrote:
          | I almost always err on the side of "never attribute to malice
          | that which can be adequately explained by incompetence".
          | However, the "standards" bodies' ability to repeatedly make a
          | complete pig's ear of every single interconnect system makes
          | me assume the opposite.
        
         | amelius wrote:
         | Don't forget MPEG.
        
       | jon-wood wrote:
       | Has anyone ever seen a device that actually uses Ethernet over
        | HDMI? The thought of being able to plug a single network cable
        | into the back of your display and then have anything plugged
        | into it get a wired connection is lovely, but as far as I can
        | tell absolutely nothing actually supports it, despite the ever
        | growing set of internet connected devices sitting underneath
        | people's TVs.
        
         | daveevad wrote:
         | I went down this rabbit hole the other night and found a German
         | Blu-ray receiver T+A K8[0] from 2012 that supports the HDMI
          | Ethernet Channel. I have not found, however, the other piece
          | of equipment, which I can only suspect _may be_ some sort of
          | HDMI IP injector.
         | 
         | [0](https://www.homecinemachoice.com/content/ta-k8-blu-ray-
         | recei...)
         | 
         | > Ethernet switch: distribution of an Ethernet uplink
         | connection to BluRayplayer, streaming client, TV monitor and up
         | to 3 source devices (via HEC),up to 2 more external devices via
         | LAN cable (e.g. playing console
         | 
         | from the manual
        
         | Uehreka wrote:
         | I tried to use this once in a theatre to connect a camera
         | watching the stage to a greenroom backstage. It worked
         | sometimes, but was super unreliable. Latency was often several
         | hundred milliseconds, and sometimes the image would just
         | straight up disappear. It may be that we had bad
         | HDMI<->Ethernet devices, but that's the thing: It's not a
         | "works or doesn't" kind of thing, it's a "varies with the
         | quality of all the devices in the chain" kind of thing.
        
         | tjohns wrote:
         | Ethernet Over HDMI is used by newer AV receivers to support
         | eARC (extended audio return channel). The older ARC spec would
         | work with any HDMI cable, but bandwidth limitations only
         | allowed compressed 5.1 surround sound. eARC uses the higher
         | bandwidth from Ethernet Over HDMI, allowing uncompressed 7.1
          | surround and Dolby Atmos streams.
         | 
         | (If you're not familiar with ARC/eARC, this lets the TV send
         | audio from its native inputs back to the AV receiver over an
         | HDMI cable. Without ARC, you need to plug _everything_ directly
         | into the AV receiver.)
        
           | josteink wrote:
            | eARC is neat in theory, but my experience with it has been
            | that it's too unreliable and unstable to actually use in
            | practice.
            | 
            | I even bought new cables to make sure there wouldn't be
            | issues, but eARC audio regularly drops out in ways other
            | sources (including regular ARC) don't. And when it fails
            | there are literally zero tools for diagnosing it either.
           | 
           | Maybe around the time of eARC2 we'll have something working
           | as well as Bluetooth does today. (Yes, that's me being
           | snarky)
        
             | ApolIllo wrote:
             | That's unfortunate. I've been hoping to simplify my HT
             | setup and eARC was something I wanted to target in an
             | upgrade
        
           | simplyaccont wrote:
            | Actually, if I understand correctly, eARC doesn't use HEC.
            | It just re-purposes the HEC wiring for something useful.
        
         | WorldMaker wrote:
         | My understanding is that Ethernet over HDMI is still used by
         | consumer devices, just no longer for the original dream of
         | switching wired internet given the modern ubiquity of WiFi.
          | More recent standards such as ARC [Audio Return Channel; used
         | for a number of surround sound setups] and CEC [Consumer
         | Electronics Control; used for passing remote/controller data
         | between devices] both piggy back on the Ethernet pins, and I
         | believe they entirely interfere with using the Ethernet pins as
         | Ethernet (though maybe only in the available bandwidth/speed?).
        
       | wyager wrote:
       | 2021 HDMI is a disaster. I'm using a Sony flagship TV, a PC, and
       | a popular Sony 7.1 receiver.
       | 
       | I had to update my graphics card to get 4k120 444 10bit and eARC.
       | 
        | Except eARC is totally broken - audio often doesn't work at all
       | without restarting PC/TV/receiver a few times. And then once it
       | "works" it will randomly cut out for 5-10 seconds at a time.
       | 
        | HDR on Windows is also totally broken. It was a nightmare to get
        | something that correctly rendered full 10bit HDR video (I ended
        | up having to use MPC-HC with madVR and a ton of tweaking). You
        | also have to _turn off_ Windows HDR support and use D3D exclusive
        | mode. After updating my TV to get DRR, the audio for this setup
       | stopped working.
       | 
       | Linux also has zero HDR support. Didn't have luck getting 5.1 or
       | 7.1 working either.
       | 
        | macOS can at least handle HDR on Apple displays - not sure if it
        | works on normal displays. Haven't tried surround sound.
        
         | MayeulC wrote:
          | Not to be offensive, but -- first world problems: where did
          | you find a new graphics card, for starters?
          | 
          | Now, on a more serious note: this is all bleeding edge. And
          | combining multiple recent developments together is a recipe
          | for corner cases and untested combinations.
          | 
          | That said, did you try Variable Refresh Rate with that? Blur
          | reduction technologies (backlight strobing) are also
          | interesting, but thankfully they require little software
          | interaction (for now).
        
         | plus wrote:
         | Were you able to get 4k120Hz 444 working on Linux? What GPU do
         | you have? I can only do 4k60 444 or 4k120 420 on my LG C1
         | connected to my Radeon RX 6900xt.
        
           | wyager wrote:
            | I don't remember, but I had to sidegrade from a 2070Ti to a
            | 3060Ti to get HDMI 2.1 and thus 4k120 444.
        
       | theshrike79 wrote:
       | Linus Tech Tips used a $expensive dedicated device to test a ton
       | of HDMI cables. Most of them were shit:
       | https://youtu.be/XFbJD6RE4EY
       | 
        | And what was most interesting is that price and quality often
        | didn't correlate at all.
        
       | errcorrectcode wrote:
       | The fundamental problem is a lack of supply chain integrity.
       | Customers can buy a million cables or laptop batteries directly
       | from (country that shall not be named), but they have no idea if
       | they're getting fakes or not.
       | 
       | The fix isn't "authorized" suppliers only, but requiring a
       | reputable someone in the supply chain to maintain evidence of
       | continually testing products advertising trademarked standards
       | for compliance. If it's too much work, then boohoo, sad day for
       | them, they don't get to be traded or sold in country X.
       | 
       | In all honesty, flooding a market with cheap, substandard
       | products claiming standards they don't comply with is dumping.
       | 
       | https://en.wikipedia.org/wiki/Dumping_(pricing_policy)
        
       | anonymousiam wrote:
       | I remember when USB2 came out and similar mischief ensued. All
       | the hardware manufacturers got together and pushed the standards
       | body to re-brand USB 1.1 hardware as USB 2.0 (full-speed vs.
       | high-speed). It allowed hardware retailers to empty their
       | shelves, while consumers thought they were getting the latest
       | technology.
       | 
       | https://arstechnica.com/uncategorized/2003/10/2927-2/
        
         | sixothree wrote:
         | Same thing exists for USB3. Every time a new version is
         | released, all cables and products suddenly support that
         | revision. They just don't have any new features.
         | 
            | Not to mention that I've _never_ had a cable identify what
            | it is capable of. Thus USB is a shitshow of crappiness.
        
           | michaelbuckbee wrote:
            | Is there some device that can do this? I did a cursory look
            | through Amazon and there are a lot of "signal testers"; is
            | that sufficient?
        
             | driscoll42 wrote:
              | There was a guy from Google going around reviewing and
              | testing all the cables:
             | https://arstechnica.com/gadgets/2015/11/google-engineer-
             | leav... https://en.wikipedia.org/wiki/Benson_Leung
             | https://usbccompliant.com/
        
               | jmiserez wrote:
                | Apparently he uses [1] an "Advanced Cable Tester v2"
                | from Totalphase for his tests, starting at $15,000.
                | Probably depends on what you need to test.
               | 
               | [1] https://www.reddit.com/r/UsbCHardware/comments/ny4y6z
               | /commen...
        
             | [deleted]
        
           | thescriptkiddie wrote:
            | I recently upgraded the NVMe SSD in my machine. The
            | motherboard only has a single NVMe-compatible M.2 port, so I
            | bought a USB 3.1 enclosure [0] to put the old drive in while
           | I cloned it to the new drive. The enclosure has a USB type-C
           | connector so I also had to use a USB 3.1 A-to-C adapter [1]
           | to connect it to my motherboard's [2] USB 3.1 type-A port.
           | Anyway something somewhere went wrong and it took over 5
           | hours to copy 750 GB instead of the expected 10 minutes.
           | Absolute shitshow.
           | 
           | [0] https://www.newegg.com/rosewill-
           | rhub-20001-enclosure/p/N82E1...
           | 
           | [1] https://www.amazon.com/dp/B07L92KPBB
           | 
           | [2] https://www.asus.com/us/Motherboards/Z97AUSB_31/
        
             | anonymousiam wrote:
             | I recently had nearly the opposite experience. I was
              | upgrading a Linux server to a new motherboard with an NVMe
              | SSD, from an old one with a SATA3 SSD attached. To see how
             | things would go, I imaged the old SATA3 SSD onto a
             | USB3/NVMe adapter
             | (https://www.amazon.com/gp/product/B07Z8Y85GL) and tried
             | booting the new system from USB. It actually came up
             | working, so next I figured I would need to remove the NVMe
             | SSD from the USB3 adapter and install it in the motherboard
             | slot, boot the system from a different USB drive, and then
             | once again image the NVMe SSD from the old SATA3 drive. (I
             | had read that the USB3/NVMe adapter accessed the SSD in a
             | way that was incompatible with the motherboard.) So I
             | installed the NVMe SSD in the new motherboard and powered
             | it up just for giggles. To my great surprise, it booted
             | normally and everything was fine! (Oh, and my SSD access
             | speeds went from 500MB/s on the old system to 2GB/s on the
             | new one.)
        
               | MayeulC wrote:
               | Why wouldn't it work? Bulk storage is bulk storage. As
               | for booting from that... Linux has all the required
               | drivers in the kernel, at worst (booting from radically
               | different hardware) select a fallback initramfs with more
               | drivers. If you did a bit-by-bit copy of your drive,
               | partitions should have come out unmodified at the other
               | end, including GUIDs and the EFI label on the EFI
               | partition (if using EFI), or the bootloader in the MBR if
               | using that.
               | 
                | Parent is talking about speed. M.2 is just the form
                | factor; the port can carry different things: SATA, PCIe
                | AHCI, or PCIe NVMe [1]. There was probably a slight
                | incompatibility and a fallback to some other mode there.
               | 
               | [1] https://en.wikipedia.org/wiki/M.2#Storage_interfaces
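                | 
                | As a rough illustration, a bit-by-bit copy is
                | nothing more than streaming the whole block
                | device across, which is what dd does. A minimal
                | Python sketch; the device names here are
                | assumptions:
                | 
                |   import shutil
                | 
                |   # Copy the old drive onto the new one in 4 MiB
                |   # chunks. Partitions, GUIDs and the EFI system
                |   # partition come across unchanged because every
                |   # byte is copied verbatim. Requires root.
                |   CHUNK = 4 * 1024 * 1024
                |   with open("/dev/sda", "rb") as src, \
                |        open("/dev/nvme0n1", "wb") as dst:
                |       shutil.copyfileobj(src, dst, CHUNK)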
        
               | doubled112 wrote:
               | I've definitely had problems with external storage on
               | Linux machines
               | 
               | > Why wouldn't it work? Bulk storage is bulk storage
               | 
               | You would think so, but anything using UAS is a complete
               | mess and you can't be sure it'll work. I can only assume
               | devices implemented the convenient parts of the spec and
               | fail randomly.
               | 
                | It happened often enough that the kernel flag for the
                | USB quirk to disable UAS was stickied on the Raspberry
                | Pi forums when USB boot was new.
               | 
               | https://forums.raspberrypi.com/viewtopic.php?t=245931
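                | 
                | For reference, that quirk is just a VID:PID pair
                | plus a "u" flag on the kernel command line. A
                | minimal Python sketch (assuming the usual Linux
                | sysfs layout) that prints the entry for whatever
                | is currently bound to the uas driver:
                | 
                |   import glob, os
                | 
                |   # "u" tells usb-storage to ignore UAS for that
                |   # device; add the printed string to the kernel
                |   # command line, e.g. /boot/cmdline.txt on a Pi.
                |   for intf in glob.glob(
                |           "/sys/bus/usb/drivers/uas/*:*"):
                |       dev = os.path.dirname(os.path.realpath(intf))
                |       try:
                |           vid = open(dev + "/idVendor").read().strip()
                |           pid = open(dev + "/idProduct").read().strip()
                |       except OSError:
                |           continue
                |       print(f"usb-storage.quirks={vid}:{pid}:u")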
        
             | zokier wrote:
             | From the review section of that enclosure:
             | 
             | > Cons: - Included USB-C to USB-C cable is only capable of
             | USB 2.0 speeds (40 MB/s as measured with crystaldiskmark)
             | 
             | yeah, that would explain it.
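              | 
              | The numbers roughly line up, too; the 10 Gbps figure
              | below is only a rough assumption for what a proper
              | USB 3.1 Gen 2 link manages in practice:
              | 
              |   size_gb = 750
              |   usb2_mb_s = 40      # measured above
              |   usb31_mb_s = 1000   # ~10 Gbps link, rough guess
              |   hours = size_gb * 1000 / usb2_mb_s / 3600
              |   minutes = size_gb * 1000 / usb31_mb_s / 60
              |   print(f"{hours:.1f} h vs {minutes:.1f} min")
              |   # -> 5.2 h vs 12.5 min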
        
         | Frenchgeek wrote:
          | I still had a USB 1.0 motherboard lying around not too long
          | ago...
        
       | b3lvedere wrote:
        | When I had a very old Samsung TV, my Nvidia GeForce video card
        | produced a nice image on the TV and Dolby AC3 sound on my even
        | older surround set, via a nice HDMI to optical-out converter in
        | between.
        | 
        | Now I have a not-so-old Philips TV and suddenly I can't get
        | Dolby AC3 sound anymore. Why? Because the GeForce communicates
        | with the TV and the TV responds that it only has stereo. The
        | surround set has no HDMI input or output, so it cannot
        | communicate with the GeForce.
       | 
       | I have tried everything from hacking drivers to changing EDID
       | with some weird devices. Nothing works. Stereo only. Very
       | frustrating.
       | 
        | I was recommended to replace my surround set or my TV. Both are
        | pretty expensive solutions to some weird HDMI communication
        | bug/standard.
       | 
        | So I bought a $20 USB sound device to get Dolby AC3 sound to my
        | surround set. All because I replaced my old TV, which couldn't
        | communicate with the GeForce about its speaker setup.
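        | 
        | For anyone in the same boat, the stereo-only limit comes from
        | what the TV advertises in its EDID. A rough sketch of reading
        | the advertised audio formats on Linux; the connector path is
        | an assumption and the format codes come from CEA-861:
        | 
        |   # If only 2-channel LPCM shows up here, the GPU will
        |   # refuse to bitstream AC3 to that display.
        |   FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "E-AC-3"}
        | 
        |   path = "/sys/class/drm/card0-HDMI-A-1/edid"  # assumption
        |   raw = open(path, "rb").read()
        |   for ext in range(128, len(raw), 128):
        |       blk = raw[ext:ext + 128]
        |       if blk[0] != 0x02:            # CEA-861 extension block
        |           continue
        |       dtd_start, pos = blk[2], 4
        |       while pos < dtd_start:        # data block collection
        |           tag, length = blk[pos] >> 5, blk[pos] & 0x1F
        |           if tag == 1:              # Audio Data Block
        |               for s in range(pos + 1, pos + 1 + length, 3):
        |                   fmt = (blk[s] >> 3) & 0x0F
        |                   ch = (blk[s] & 0x07) + 1
        |                   print(FORMATS.get(fmt, fmt), f"{ch}ch")
        |           pos += length + 1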
        
       | astraloverflow wrote:
       | As the saying goes, a camel is a horse designed by a committee
        
       | MomoXenosaga wrote:
        | Is it really confusing? The people who need the specs of HDMI
        | 2.1 (like gamers) will do their research.
        
       | floatingatoll wrote:
       | This is a stellar example of how catering to everyone results in
       | the destruction of a brand. "HDMI 2.1" will be with us for years
       | and it's essentially irrelevant now, and they aren't willing to
       | admit they were wrong, so their only hope is to release an HDMI
       | 2.2 that is just "all of the optional features of HDMI 2.1 are
       | now required", which will cause howls of outrage and confusion.
       | I'm guessing they are too captive to manufacturers to have the
       | courage to do that. Oh well.
        
         | IshKebab wrote:
         | It wasn't an oversight to make the features optional. They're
         | deliberately optional so device manufacturers aren't forced
         | into a ridiculous all or nothing situation.
        
           | rocqua wrote:
           | AND manufacturers not wanting to be stuck with old branding
           | on devices.
           | 
            | The standard could have made some things optional. But they
            | made everything optional.
        
           | floatingatoll wrote:
           | I think they would have been better off forcing manufacturers
           | into that situation, and that the feature list has grown so
           | large that it's no longer a sensible specification for
           | purchasing decisions, which will erode consumer trust.
        
           | nomel wrote:
           | I see it as a problem with the abstractions being on the
           | wrong layer.
        
             | IshKebab wrote:
             | It's kind of going in that direction with HDMI and
             | DisplayPort over USB 3. But.. not exactly because they're
             | not actually encoding video data into USB packets. It's too
             | difficult to have abstractions like that when you're
             | dealing with such insanely high data rates.
        
       | Strom wrote:
       | As with any tech, you can't trust marketing if you plan on
       | pushing it to the limit. You need to either test it yourself, or
       | in some rare cases there are reviewers who have already tested
       | it. Most "reviews" for tech are extremely superficial though and
       | certainly won't be testing HDMI inner workings.
       | 
       | For HDMI 2.1, there are a bunch of monitors being sold under that
       | banner that don't have the full 48 Gbps bandwidth. For example
       | the Gigabyte M28U is limited to half of that at 24 Gbps. [1]
       | Gigabyte certainly doesn't want you to know this. On their
        | specification page they just list it as HDMI 2.1. [2]
       | 
       | Similar nonsense was going on during the transition from HDMI 1.4
       | to 2.0. I really wanted HDMI 2.0 output on my laptop and held off
       | on a purchase until that was possible. I finally bought an ASUS
       | Zephyrus GX501 laptop. It has a Nvidia GTX 1080, which does
       | support HDMI 2.0 output. The marketing for this laptop also seems
        | to suggest that they're utilizing this potential, with claims
       | like _" You can also run a large ROG gaming monitor with NVIDIA
       | G-SYNC(tm) via DisplayPort(tm) over Type-C(tm) USB or use HDMI
       | 2.0 to connect 4K TVs at 60Hz."_ [3] The specification page
       | mentions the HDMI 2.0 port. [4] However in reality I found out
       | that this HDMI 2.0 port is limited to HDMI 1.4 bandwidth. It
       | supports HDMI 2.0 features like HDR, but not the bandwidth. 4K @
       | 60Hz is possible only with 4:2:0 chroma subsampling and you're
       | limited to 8 bits, so no HDR.
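        | 
        | The arithmetic backs this up. A rough back-of-the-envelope
        | sketch, using the standard CTA-861 4K timing (4400 x 2250
        | total pixels per frame) and the TMDS clock ceilings of 340
        | MHz (HDMI 1.4) and 600 MHz (HDMI 2.0); the same kind of math
        | is why 4K 120Hz 10-bit 4:4:4 won't fit uncompressed through a
        | 24 Gbps port either:
        | 
        |   TOTAL = 4400 * 2250   # CTA-861 4K total incl. blanking
        | 
        |   def tmds_mhz(hz, bits=8, cs="4:4:4"):
        |       # HDMI 1.4 tops out at a 340 MHz TMDS clock,
        |       # HDMI 2.0 at 600 MHz.
        |       clk = TOTAL * hz / 1e6   # pixel clock in MHz
        |       clk *= bits / 8          # deep color widens characters
        |       if cs == "4:2:0":
        |           clk /= 2             # 4:2:0 halves the TMDS clock
        |       return round(clk)
        | 
        |   print(tmds_mhz(60))              # 594: needs HDMI 2.0
        |   print(tmds_mhz(60, 10))          # 742: beyond 2.0 as well
        |   print(tmds_mhz(60, 8, "4:2:0"))  # 297: fits HDMI 1.4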
       | 
       | I'm not the only one who found this out either. There are plenty
       | of others on the ASUS forums. [5] Hard to say whether this was
       | intentional misleading by ASUS marketing, or whether engineering
       | messed up, or whether the feature ended up being cut due to time
       | constraints. In any case, they still haven't adjusted the old
       | marketing pages for this laptop that never shipped with HDMI 2.0
       | bandwidth.
       | 
       | Reviewers don't tend to check things like this either. For
        | example The Verge reviewed this laptop [6] and wrote: _" Asus has
       | a nice array of ports on the Zephyrus -- four traditional USB 3.0
       | ports, a Thunderbolt 3 USB Type-C port, HDMI, and a headphone
       | jack."_ They're just regurgitating the marketing material.
       | There's no depth to it, the claims aren't verified. So people end
       | up buying the product and then get confused why it isn't working.
       | 
       | --
       | 
       | [1]
       | https://www.rtings.com/monitor/discussions/q-D1CBeE2EiGMYgn/...
       | 
       | [2] https://www.gigabyte.com/Monitor/M28U/sp#sp
       | 
       | [3] https://rog.asus.com/laptops/rog-zephyrus/rog-zephyrus-
       | gx501...
       | 
       | [4] https://rog.asus.com/laptops/rog-zephyrus/rog-zephyrus-
       | gx501...
       | 
       | [5]
       | https://rog.asus.com/forum/showthread.php?96916-GX501-Zephyr...
       | 
       | [6] https://www.theverge.com/2017/8/25/16201656/asus-rog-
       | zephyru...
        
         | 0cVlTeIATBs wrote:
         | I have had similar concerns with reviewers skipping USB
          | compatibility on laptops. After I left comments on relevant
          | reviews with a quick rundown of the problem, they started
          | including it.
         | 
         | I might get at it again because I'd want my next laptop to
         | drive my 4k 120Hz display over HDMI.
        
       | alin23 wrote:
       | When the new MacBook Pro came out this year, everyone was puzzled
       | as to why the newly included HDMI port was only 2.0.
       | 
       | Well it turns out they lied. It's 2.1 after all! \s
       | 
       | Jokes aside, it's actually only 2.0 because internally they
       | transmit a DisplayPort video signal and convert it to HDMI using
       | an MCDP2900 chip[0], which is the same chip usually seen inside
       | USB-C hubs.
       | 
       | So the new MacBook basically got rid of the HDMI dongle by
       | integrating it inside the laptop.
       | 
       | This also breaks DDC/CI on that port and now I get a ton of
       | support emails for Lunar (https://lunar.fyi) because people can't
       | control their monitor brightness/volume and think that the app is
       | broken.
       | 
       | [0] https://www.kinet-ic.com/mcdp2900/
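        | 
        | "Breaks DDC/CI" means the brightness command - a tiny VCP
        | write over the cable's I2C lines - never makes it through
        | the converter chip. A rough Linux-side illustration with
        | ddcutil (an assumption; this is not how Lunar works on
        | macOS, just the same protocol):
        | 
        |   import subprocess
        | 
        |   # VCP code 0x10 is brightness. If the converter chip
        |   # drops the DDC/CI channel, getvcp fails and the only
        |   # fallback is software (gamma) dimming.
        |   try:
        |       out = subprocess.run(["ddcutil", "getvcp", "10"],
        |                            capture_output=True, text=True,
        |                            check=True)
        |       print("DDC works:", out.stdout.strip())
        |       subprocess.run(["ddcutil", "setvcp", "10", "60"],
        |                      check=True)
        |   except (subprocess.CalledProcessError, FileNotFoundError):
        |       print("No DDC on this connector; gamma dimming only")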
        
         | eatYourFood wrote:
         | How can I control screen brightness from my m1 max? Willing to
         | pay for a solution.
        
           | alin23 wrote:
           | Lunar can do that: https://lunar.fyi
           | 
           | It's free for manual brightness adjustments.
           | 
           | Just make sure to use one of the Thunderbolt ports of the
           | MacBook.
           | 
           | And if it still doesn't work, check if there's any monitor
           | setting that could block DDC by going through this FAQ:
           | https://lunar.fyi/faq#brightness-not-changing
        
             | MrSourz wrote:
             | Neat. I've been quite fond of Monitor Control:
             | https://github.com/MonitorControl/MonitorControl
        
               | ApolIllo wrote:
                | How do Lunar and Monitor Control get around the Tbolt
                | -> HDMI dongle baked into the logic board problem?
        
               | gsich wrote:
               | What problem? Is DDC not available?
        
               | pseudalopex wrote:
               | They said use one of the Thunderbolt ports.
        
               | alin23 wrote:
               | There is no way to get DDC working on that port. It's
               | probably locked in the chip firmware.
               | 
                | All we can do is provide software dimming using gamma
                | table alteration, or tell people to use the Thunderbolt
                | ports instead.
        
       | smithza wrote:
       | LTT did some manual testing of HDMI cables [0] in hopes of
       | answering the last question of this article, "how do consumers
       | know if a cable supports v2.1 features?"
       | 
       | Does anyone know of other tests or more comprehensive data sets?
       | 
       | [0] https://linustechtips.com/topic/1387053-i-spent-a-
       | thousand-d...
        
       ___________________________________________________________________
       (page generated 2021-12-13 23:00 UTC)