
Don't Buy a Monitor or TV Just for HDMI 2.1 -- Read the Fine Print or You Might Get Fooled (theverge.com)

An anonymous reader shares a report: Four years running, we've been jazzed by the potential of HDMI 2.1 -- the relatively new video connector standard that can provide variable refresh rates (VRR), auto low latency mode (ALLM), and of course, a giant pipe with 48Gbps of bandwidth (and Fixed Rate Link signaling) to deliver up to 10K resolution and up to a 120Hz refresh rate depending on your cable and compression. But today, I'm learning that not only are all of those features technically optional, but that the HDMI standards body owner actually encourages TV and monitor manufacturers that have none of those things -- zip, zilch, zero -- to effectively lie and call them "HDMI 2.1" anyhow. That's the word from TFTCentral, which confronted the HDMI Licensing Administrator with the news that Xiaomi was selling an "HDMI 2.1" monitor that supported no HDMI 2.1 features, and was told this was a perfectly reasonable state of affairs. It's infuriating.

It means countless people, some of whom we've encouraged in our reviews to seek out HDMI 2.1 products, may get fooled into fake futureproofing if they don't look at the fine print to see whether features like ALLM, VRR, or even high refresh rates are possible. Worse, they'll get fooled for no particularly good reason: there was a perfectly good version of HDMI without those features called HDMI 2.0, but the HDMI Licensing Administrator decided to kill off that brand when it introduced the new one. Very little of this is actually news, I'm seeing -- we technically should have known that HDMI 2.1's marquee features would be optional for a while now, and here at The Verge we've seen many a TV ship without full support. In one story about shopping for the best gaming TV for PS5 and Xbox Series X, we characterized it as "early growing pains."
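To put the bandwidth claims above in rough perspective, here is a minimal back-of-the-envelope sketch (not from the article): it estimates the uncompressed data rate of a few common video modes using approximate CTA-861 timing totals, and compares them against approximate effective throughputs for HDMI 2.0's TMDS signaling (8b/10b coding) versus HDMI 2.1's Fixed Rate Link (16b/18b coding). The point it illustrates is that a "2.1" port still running 2.0-era signaling cannot carry the 4K120 modes the label implies.

```python
# Rough sketch: uncompressed link rate = h_total * v_total * refresh * bits/pixel.
# Timing totals (blanking included) and effective-throughput figures are approximations.

def link_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s for a given timing."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

modes = {
    # name: (h_total, v_total, refresh Hz, bits per pixel for RGB / 4:4:4)
    "1080p60, 8-bit": (2200, 1125, 60, 24),
    "4K60, 8-bit":    (4400, 2250, 60, 24),
    "4K120, 10-bit":  (4400, 2250, 120, 30),
}

TMDS_EFFECTIVE_GBPS = 18.0 * 8 / 10   # HDMI 2.0: 18 Gbps raw, ~14.4 usable
FRL_EFFECTIVE_GBPS  = 48.0 * 16 / 18  # HDMI 2.1 FRL: 48 Gbps raw, ~42.7 usable

for name, (h, v, r, bpp) in modes.items():
    rate = link_rate_gbps(h, v, r, bpp)
    print(f"{name:>15}: {rate:5.1f} Gbps  "
          f"fits 2.0 TMDS: {rate <= TMDS_EFFECTIVE_GBPS}  "
          f"fits 2.1 FRL: {rate <= FRL_EFFECTIVE_GBPS}")
```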

  • Voilà, HDMI 2.1! Slap the sticker on that baby, it'll sell like hotcakes.
    • Interesting.
      Who is the HDMI Licensing Administrator?
      Why does this sack of shit advocate buyer beware
      instead of seller beware?
      Is the job of the HDMI Licensing Administrator a third-world glory hole?

    • by Immerman ( 2627577 ) on Tuesday December 14, 2021 @01:04PM (#62079509)

      Exactly - I don't see the problem. HDMI is an interconnect standard - it describes the signaling frequency, path tolerances, and encoding codecs needed to be compliant, NOT which features are actually supported by the TV.

      Consider:
      HDMI 1.0 supports resolutions up to (and slightly beyond) 1080p - that doesn't mean all HDMI compatible TVs are 1080p, and for many years vanishingly few were.
      HDMI 1.4 added support for an ethernet channel and stereoscopic 3D - features that most TVs still don't actually support, but a 1.4 TV should at least be able to ignore them successfully, while an older TV may get confused by the non-compliant signal.
      HDMI 2.0 added support for 4k@60Hz in 2013, and 2.0a added HDR in 2015, but the overwhelming majority of HDMI 2 compatible TVs still aren't 4k or HDR.

      Basically, if you care about TV features, look at the list of features of the TV. HDMI only specifies features of the communication channel.
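      To make the parent's distinction concrete, here is a tiny illustrative sketch (the class and feature names are made up for the example, not taken from HDMI.org): the spec version describes the link, while a separate feature list describes what the device actually does, which is why a bare version number tells you little.

      ```python
      # Illustrative only: "version" as the protocol baseline, features as a separate list.
      from dataclasses import dataclass, field

      @dataclass
      class HdmiPort:
          spec_version: str                             # e.g. "HDMI 2.1"
          features: set = field(default_factory=set)    # e.g. {"VRR", "ALLM", "FRL 48G"}

      def describe(port: HdmiPort) -> str:
          if not port.features:
              return (f"{port.spec_version} port with no optional features listed: "
                      "assume only baseline capability.")
          return f"{port.spec_version} port with: {', '.join(sorted(port.features))}"

      print(describe(HdmiPort("HDMI 2.1")))                             # the bare-label case
      print(describe(HdmiPort("HDMI 2.1", {"VRR", "ALLM", "FRL 48G"})))
      ```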

      • I was about to post something against the HDMI Licensing Administrator, but your post makes so much sense that you've changed my mind.

        So here is my new post:
        Fuck you all, TV manufacturers!

    • Yeah, it will even work with your Windows Vista Ready PC.
    • Voilà, HDMI 2.1! Slap the sticker on that baby, it'll sell like hotcakes.

      Then add another sticker that says "3D Compatible", and one that says "Certified for Windows Vista" and consumers will know they've got a guaranteed future-proof system they'll never have to worry about upgrading.

  • by Kokuyo ( 549451 ) on Tuesday December 14, 2021 @10:59AM (#62079027) Journal

    We avoided eye contact when they pissed on us with USB (and oh so many other things where we should have had the balls to make a stand).

    This is what we deserve for being weak.

    • by AmiMoJo ( 196126 )

      It goes back way before USB. SCSI had various iterations and to be "SCSI 2" or "SCSI 3" a device only had to support the protocol, even if it didn't actually support any of the new features.

      • Ah yes, LVD and HVD to make it more confusing.

      • USB was a huge improvement over SCSI. Trying to select the correct adapter and device was painful. Unfortunately, it seems we are moving in the opposite direction with USB 3.
    • So are you saying the slower serial and parallel ports were better, where with multiple devices you had to make sure your COM ports and interrupts didn't conflict? Perhaps you don't remember the many different SCSI standards, for which you never seemed to have the right driver.
      Given that today's LCD screens use a matrix display, perhaps you think taking the 1s and 0s that are in video RAM, converting them to an analog signal to be sent over a VGA cable, then back to the display, which takes that obviously analog signal and ba

      • by Kokuyo ( 549451 )

        I feel like nothing you've said relates in any way, shape or form to what I have said...

        • Perhaps you were not clear about what you meant; perhaps you should be more descriptive and less of an emotional, guttural reaction. It read like you hate the introduction of USB (over 20 years ago) and are not over the fact that it became the dominant way of connecting equipment to PCs.
          Before USB, we had serial, parallel, SCSI and VGA as the primary ports for connecting external equipment, which in general sucked compared to what USB became.

          • by kbg ( 241421 )

            I am not sure things are better now:
            USB Type A, USB Type B, USB Type C, USB Mini B, USB Micro B, USB 3.0 Type A, USB 3.0 Type B, USB 3.0 Micro B
            DVI, HDMI Type A, Mini-HDMI, Micro-HDMI, HDMI Type E, DisplayPort, Mini DisplayPort

            Things may be faster, but it's still a complete mess and it grows every day, because connectors are changed every other year so that cable companies can sell more cables.

            • by Kokuyo ( 549451 )

              These things are plug types... That's not the same thing as what bandwidth the cable actually supports.

              • Strike 3, you're out.
                Plug type? Oh, like USB 1.x through 3.x, which have vastly different bandwidths, don't all fit the default USB plug on every PC and charger I bought?
                ----
                Never mind, you got sucked into a silly argument and then I followed, and shouldn't have.
                Forget the strike, just a foul tip, you're still in the game.
                I need coffee.

        • I feel like nothing you've said relates in any way, shape or form to what I have said.

          Slashdot forums are a pissing contest wrt how hard you had it as a kid using old computer tech. If you don't understand a response then the other person has pissed further than you, and your id probably has too many digits, which means you never would have won anyway.

    • We avoided eye contact when they pissed on us with USB

      We were pissed on? More like had champagne poured over us. USB had its warts in the process but it was technically far better than everything it replaced and has led to an amazing world of interoperability that we simply couldn't imagine back in the 90s. So a device may not be able to charge full speed. A port may not be able to connect a display. A standard may be renamed, but ultimately fuck it, I'll happily take that tradeoff for all the benefit USB *has* brought.

      No one likes getting wet but at least cha

      • No, it could charge Full Speed. But Full Speed meant slow speed, not full speed. Full speed at the time was High Speed, not to be confused with later SuperSpeed, which is faster than both Full Speed or High Speed. But SuperSpeed can be faster than SuperSpeed too, and SuperSpeed+ is faster again, unless it isn't, because SuperSpeed can be faster than SuperSpeed+.

        They definitely stuffed up the nomenclature of USB, even if you ended up in a much better place than you started out before USB.

        • They definitely stuffed up the nomenclature of USB

          Only if you read marketing words that no one actually uses. I can't even remember EVER buying a SuperSpeed USB device, or a High Speed one, or a Full Speed one. I remember buying USB 1.1, USB 2.0, USB 3.0, and even to this day I probably can't tell you which marketing buzzword corresponds to which one without looking it up because, and this is the most important part: no one ever used that nomenclature. The only thing they really fucked up was the whole USB 3.1 Gen 1/2 renaming thing, but even then for the overwhelm

      • by jwhyche ( 6192 ) on Wednesday December 15, 2021 @12:45AM (#62081717) Homepage

        Hear, hear. Even in its worst incarnation USB was far superior to what we had: a different cable for each device, a different port type, using different IRQs. A fucking nightmare.

    • We avoided eye contact when they pissed on us with USB (and oh so many other things where we should have had the balls to make a stand).

      This is what we deserve for being weak.

      Mod parent up! Even on more pro things like OpenCL, where even for OpenCL 3.0 the baseline is OCL 1.2, and then EVERY SINGLE FEATURE OF OCL 2.x AND OCL 3 IS OPTIONAL FFS!

      Yes, we deserve this

      PS: Caps intended

  • by Crashmarik ( 635988 ) on Tuesday December 14, 2021 @11:04AM (#62079037)

    I can look at my parts bins and see at least 20 types of audio connector, about 8 types of video connector and God only knows how many power and data connectors. (25 pin D connector anyone?)

    Being able to accept a connection and use the signal is in itself a feature and there are people that will want it.

    • by AvitarX ( 172628 ) <me@brandywinehund r e d .org> on Tuesday December 14, 2021 @11:09AM (#62079051) Journal

      Because if they made those features required to label it 2.1, it would make shopping very easy.

      Care about what's in 2.1, buy 2.1, don't care, buy 2.0.

      I understand why they did it though, or are we to expect anything labeled 2.1 to be 10k?

    • by Viol8 ( 599362 ) on Tuesday December 14, 2021 @11:44AM (#62079183) Homepage

      So deliberately changing the connector for no good reason and hence potentially obsoleting old cables (and possibly connected hardware) and forcing people to buy new ones is a "feature" now is it? Right you are.

      • So deliberately changing the connector for no good reason and hence potentially obsoleting old cables (and possibly connected hardware) and forcing people to buy new ones is a "feature" now is it? Right you are.

        If you are talking about HDMI, the connector for 2.1 is the same as the connector of 1.0

        So if you are connecting an old computer/console/DVD player with 1.0 to your new TV with 2.1, the 1.0 cables will work fine.

        If you were talking about something else, please clarify.

    • I can look at my parts bins and see at least 20 types of audio connector, about 8 types of video connector and God only knows how many power and data connectors. (25 pin D connector anyone?)

      All of this has nothing to do with revving up a standard with the promise of new features and delivering something old. Imagine if USB2.0 made the "High Speed" part optional, and that your two USB2.0 devices completely unbeknownst to you were only operating at USB1.1 speed.

      The point here is that the *only* features that are new in this standard are optional, meaning that it is effectively no different from the previous one.

      • I can look at my parts bins and see at least 20 types of audio connector, about 8 types of video connector and God only knows how many power and data connectors. (25 pin D connector anyone?)

        I have to agree with you on one part. I have a plastic organizer box with dozens of adapters. You never know what you might need. One of the differences, though, is that most of those patch cables are still usable, with the exception of some really old stuff. That gets kept in an oddity drawer, never to go offsite.

        My various incarnations of USB cables include some unused now obsolete cables that will probably never be used. But maybe, so they don't get discarded. In addition, one of the new additions to

        • by Megane ( 129182 )

          I had the need for a RS232C 25 pin to the normal RS232

          But the 25-pin version IS the normal one. IBM forced the 9-pin version into existence all on its own. Also, I have a couple of 25-pin RS232 ribbon-cables with flipped sections and extra connectors to not need gender changers or null modems. And if you think that's bad, back in the late '00s I was making some very weird stacks of connectors going from SCSI to Firewire 800 so that I could read some ancient hard drives.

          • I had the need for a RS232C 25 pin to the normal RS232

            But the 25-pin version IS the normal one. IBM forced the 9-pin version into existence all on its own. Also, I have a couple of 25-pin RS232 ribbon-cables with flipped sections and extra connectors to not need gender changers or null modems. And if you think that's bad, back in the late '00s I was making some very weird stacks of connectors going from SCSI to Firewire 800 so that I could read some ancient hard drives.

            That one was a case that could just about be excused. The 25-pin was just kinda big for things like laptops. But it does show up on occasion.

            This is a complete aside - when I read your post, I thought of the quote from Narnia, the lion talking to the witch... you the lion, me the witch.

            "Do not cite the deep magic to me witch, I was there when it was written."

            Made my day. Now the SO is wondering why I am laughing so hard!

      • Imagine if USB2.0 made the "High Speed" part optional,

        It did.

        • It did not. All USB 2.0 host controllers and hubs must support high speed.

          It's only optional in devices which makes perfect sense since your mouse doesn't need to transfer 480Mbps.
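          For anyone curious what their own hardware actually negotiated, here is a minimal sketch (Linux only, assuming the usual /sys/bus/usb/devices sysfs layout) that prints each device's reported spec version next to the speed it connected at; it just reads sysfs files, no extra libraries.

          ```python
          # Minimal sketch: compare the USB version a device reports with its negotiated speed.
          from pathlib import Path

          def read(path: Path) -> str:
              try:
                  return path.read_text().strip()
              except OSError:
                  return "?"

          base = Path("/sys/bus/usb/devices")
          for dev in (sorted(base.glob("*")) if base.exists() else []):
              if ":" in dev.name:               # skip interface entries like "1-1:1.0"
                  continue
              version = read(dev / "version")   # bcdUSB the device reports, e.g. "2.00" or "3.20"
              speed   = read(dev / "speed")     # negotiated speed in Mbps: 1.5, 12, 480, 5000, ...
              product = read(dev / "product")
              print(f"{dev.name:>8}  USB {version:<5} @ {speed:>5} Mbps  {product}")
          ```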

      • by Chaset ( 552418 )

        High Speed in USB 2.0 compliant devices IS optional. The USB IF could have possibly made manufacturers follow better labeling guidelines, but there is nothing wrong with a USB spec version 2.0 compliant device that doesn't support High Speed, just like there were USB 1.0 and 1.1 devices that were Low Speed only.

        The numbers are just versions of the Spec. There were changes made in the spec other than additions of the higher speed modes. (for better interoperability with higher speed devices, clarificat

        • You're oversimplifying by using the word "device". Of course a device can transfer slowly; otherwise it would be a colossal waste of limited reservable bandwidth.

          All USB 2.0 host controllers and hubs MUST support High Speed. Hubs and hosts that didn't support it had to be advertised as USB 1.1. A USB 2.0 device could optionally connect at any speed. So if your computer or hub said USB 2.0 on it, you could be certain that it would support a High Speed camera or other high speed device connected to it.

          Not even remote

    • by jd ( 1658 )

      Because all HDMI connectors are interchangeable, what makes something a particular version of HDMI is the protocol. If the HDMI 2.1 protocol isn't supported, then it's not really HDMI 2.1. It may be faster than the basic HDMI 2.0, so call it HDMI 2.0+. People love + signs. (C++ even has two of them to make it doubly exciting.)

      To label an HDMI 2.0 product as HDMI 2.1 because 2.0 is being discontinued is, well, not great but I might be able to accept that as an excuse from a vendor. You can't sell something t

      • by bws111 ( 1216812 ) on Tuesday December 14, 2021 @02:13PM (#62079719)

        None of those three cases apply here. What applies here is that the 'reporter' is a moron. While he claims that the standards body 'encourages' you to claim 2.1 even if you don't support any features, what they actually say is that you CAN NOT use any version number by itself, you MUST clearly list the features of the version that your product supports. See https://hdmi.org/spec/hdmi2_1 [hdmi.org], under "Can I use HDMI 2.1 in my marketing". "You can only use version numbers when clearly associating the version number with a feature or function as defined in that version of the HDMI Specification. You cannot use version numbers by themselves to define your product or component capabilities or the functionality of the HDMI interface."

  • They can't just use HDMI 2.1 by itself or without any of the HDMI 2.1 features.

    From https://hdmi.org/spec/hdmi2_1 [hdmi.org]

    Can I use "HDMI 2.1" in my marketing

    You can only use version numbers when clearly associating the version number with a feature or function as defined in that version of the HDMI Specification. You cannot use version numbers by themselves to define your product or component capabilities or the functionality of the HDMI interface.

    And please note that NO use of version numbers is allowed in the labeling, packaging, or promotion of any cable product.

    But I agree it's still an ugly mess they've made. The A/B resolution notation is totally inadequate and lets manufacturers pull a fast one like Yamaha and their 24Gbps ports.

    • by MTEK ( 2826397 )

      They can't just use HDMI 2.1 by itself or without any of the HDMI 2.1 features.

      It's confusing. This is what TFT Central determined:

      We contacted HDMI.org who are the “HDMI Licensing Administrator” to ask some questions about this new standard, seek clarification on several questions we had and discuss the Xiaomi display we mentioned above. Here is what we were told:

      HDMI 2.0 no longer exists, and devices should not claim compliance to v2.0 as it is not referenced any more. The features of HDMI 2.0 are now a sub-set of 2.1. All the new capabilities and features associated with HDMI 2.1 are optional (this includes FRL, the higher bandwidths, VRR, ALLM and everything else). If a device claims compliance to 2.1 then they need to also state which features the device supports so there is no confusion.

      So according to what they have told us, this means that in theory all devices with an HDMI 2.x connection should now be labelled as HDMI 2.1, even though at the end of the day they may only offer the capabilities of the older HDMI 2.0 generation. With HDMI 2.0 certification now discontinued, these are basically 2.0 devices hiding under the replacement banner name of 2.1. They don’t have to offer any new capabilities whatsoever, but they are still supposed to be labelled as 2.1 it seems.

      • by grimr ( 88927 )

        How is TFT Central going from HDMI.org's (they're called the HDMI Forum now) reply:

        "If a device claims compliance to 2.1 then they need to also state which features the device supports so there is no confusion."

        to this:

        "They don’t have to offer any new capabilities whatsoever, but they are still supposed to be labelled as 2.1 it seems."

        They can't just label a device or port "HDMI 2.1", and none of the reputable manufacturers do. Check out the new AVRs: all the new ports are labeled "8K".

        • by suutar ( 1860506 )

          If they are basically 2.0, then a label of "HDMI 2.1" is in fact listing which new features they support - none.
          You note that reputable manufacturers do more, but we're talking about what a skeezy manufacturer can get away with that's technically meeting the letter of the rules.

          • by grimr ( 88927 )

            Yes, the shady manufacturers will just use "HDMI 2.1" because they don't care. "HDMI 2.1" by itself should just be replaced by "AVOID" in your mind.

            • by suutar ( 1860506 )

              why? If HDMI 2.0 is what I want, then that's what I'm getting, no?

              • by grimr ( 88927 )

                No. Nothing stops them from making an HDMI 1.4 device that can be listed as supporting "HDMI 2.1 eARC".

                Even if it was HDMI 2.0 port hardware, there's nothing that guarantees it would support 4K or HDR or Dolby Vision. Depending on the HDMI version really doesn't tell you much.

  • This is not really a big deal, since video transmission is already very mature.

    Of course, 4K @120 fps will be better than 4K @ 60 fps, but only marginally better; for most people, it will be unnoticeable. The change from 1080p to 4K was a slight update (at a typical viewing distance it is often unnoticeable, how many people watch a 78" at 5 feet? [carltonbale.com]), the change from 720p to 1080p more noticeable, and the previous from SD (576i) to 720p was a huge improvement. Even further, some people directly recommend to av [slashdot.org]
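    For what it's worth, the viewing-distance argument is easy to sanity-check; the sketch below is a back-of-the-envelope calculation assuming a 16:9 panel and the common ~1 arcminute acuity rule of thumb (both assumptions, not measurements).

    ```python
    # Distance beyond which a single pixel subtends less than ~1 arcminute.
    import math

    ARCMIN = math.radians(1 / 60)  # ~0.00029 rad

    def max_useful_distance_m(diagonal_inches, horizontal_pixels, aspect=16 / 9):
        """Distance beyond which individual pixels subtend < 1 arcminute."""
        width_m = diagonal_inches * 0.0254 * aspect / math.hypot(aspect, 1)
        pixel_pitch = width_m / horizontal_pixels
        return pixel_pitch / math.tan(ARCMIN)

    for diag in (55, 65, 78):
        d1080 = max_useful_distance_m(diag, 1920)
        d4k = max_useful_distance_m(diag, 3840)
        print(f'{diag}" 16:9 screen: past ~{d1080:.1f} m you cannot resolve 1080p pixels '
              f'(so 4K adds little); past ~{d4k:.1f} m even 4K pixels blur together')
    ```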

    • The 1080p (standard resolution), which is the same PPI as the long-time 1024x768 display resolution, has been good enough for decades now. Moving to 4K only offers a marginal improvement, at the cost of needing exponentially more powerful equipment.

      Getting a 4K or 8K+ display is like a dick-measuring competition between incels. Even if you win, you don't get anything.

      • I don't agree with that at all. I am never, ever going to go below 4k@60hz for a monitor again. I think it's crummy that something like a current Dell Precision laptop with a powerful GPU is even available with a 1080p display.
      • The 1080p (standard resolution), which is the same PPI as the long-time 1024x768 display resolution, has been good enough for decades now.

        I understand you're discussing video, but text rendering at 1080p and 4K is night and day. And that's just text.

        If you don't notice a big IQ improvement above 1080p, your eyes are at fault.
        I don't mean that sarcastically; I don't mean that as a put-down; I say it plainly.

    • This is not really a big deal, since video transmission is already very mature.

      Of course, 4K @120 fps will be better than 4K @ 60 fps, but only marginally better; for most people, it will be unnoticeable. The change from 1080p to 4K was a slight update (at a typical viewing distance it is often unnoticeable, how many people watch a 78" at 5 feet? [carltonbale.com]),

      Except that the industry cunningly withheld HDR, making it mostly only available in 4K.

      Try to find HDR content in 1080p and see what happens. (not much, if anything at all)

      That means that 4K will be required, not because of the 4K itself, but because of the HDR thing.

      • by nagora ( 177841 )

        This is not really a big deal, since video transmission is already very mature.

        Of course, 4K @120 fps will be better than 4K @ 60 fps, but only marginally better; for most people, it will be unnoticeable. The change from 1080p to 4K was a slight update (at a typical viewing distance it is often unnoticeable, how many people watch a 78" at 5 feet? [carltonbale.com]),

        Except that the industry cunningly withheld HDR, making it mostly only available in 4K.

        Try to find HDR content in 1080p and see what happens. (not much, if anything at all)

        That means that 4K will be required, not because of the 4K itself, but because of the HDR thing.

        Don't forget: they redefined 2K to call it 4K as well. 8K is the same - it's actually 4K. Magically overnight the resolution doubled because some industry body decided to say it had.

        These people are thieves and liars being supported by the sort of deluded fools that are buying shitty plastic records and dragging a diamond across the surface while calling the resulting squeals "hi-fi".

        • These people are thieves and liars being supported by the sort of deluded fools that are buying shitty plastic records and dragging a diamond across the surface while calling the resulting squeals "hi-fi".

          I've always viewed listening to LPs as similar to something like the Japanese tea ceremony: it's the deliberate action/ceremony you're buying into, not the quality of the output.

  • These version numbers are not technical specification labels. They are advertising and marketing buzzword spam. So they are meaningless!
  • by nagora ( 177841 ) on Tuesday December 14, 2021 @11:23AM (#62079107)

    Industry bodies are just cartels by a different name. Standards like HDMI, USB, and all the rest should have definitions which are determined by consumer groups. ISO and DIN and the rest are just a joke when it comes to this sort of shit because they're stuffed with industry representatives focused on marketing (and this applies to all those ANSI language documents as well, you know) and being able to sell what they have without additional effort, preferably while telling their customers that the standard is a great breakthrough.

    confronted the HDMI Licensing Administrator with the news that Xiaomi was selling an "HDMI 2.1" monitor that supported no HDMI 2.1 features, and was told this was a perfectly reasonable state of affairs

    Physical violence is always an option worth looking into in these cases.

    • Standards like HDMI, USB, and all the rest should have definitions which are determined by consumer groups.

      Your solution to the problem still involves the causes of the problems. The issues with HDMI and USB are not due to consumer vs corporate interests; they are 100% the result of design by committee. Consumer groups are just as likely to produce the same garbage when attempting to cram everything and the kitchen sink into a standard.

      • by nagora ( 177841 )

        Standards like HDMI, USB, and all the rest should have definitions which are determined by consumer groups.

        Your solution to the problem still involves the causes of the problems. The issues with HDMI and USB are not due to consumer vs corporate interests; they are 100% the result of design by committee. Consumer groups are just as likely to produce the same garbage when attempting to cram everything and the kitchen sink into a standard.

        I agree, but it's definitely the lesser of two evils to not have the judge also be the accused.

        • Not sure I fully agree with that either. Consumer groups are rarely a good judge of future technology trends. The result may actually hamper adoption of new technologies and development of standards.

          Remember we laughed at USB for the irrelevant pointless thing it was. Who needed that shit. Don't those stupid engineers realise all our devices plug into the PC just fine already!

          In the field of technology the customer is not always right and doesn't always have their own best interests at heart. Remember the f

          • by nagora ( 177841 )

            Not sure I fully agree with that either. Consumer groups are rarely a good judge of future technology trends

            Well, let's not get carried away here. We're talking about labelling something with "X+" when it is in fact just "X", not issuing edicts about allowable research. We're talking about marketing claims which can be objectively shown to be misleading.

            Also, when I'm referring to consumers I mean consumers of the standard in question, not just focus groups off the street. For programming languages, the consumers are programmers. For USB consumers are more like the standard definition of the word, but for the MS

  • by Thelasko ( 1196535 ) on Tuesday December 14, 2021 @11:28AM (#62079131) Journal
    With all of the different flavors of HDMI, DVI, Display Port, and USB 3, good luck connecting your devices to monitors. My box of adapters is growing rapidly.

    I know they all have benefits, but I kinda miss the old VGA adapter.
    • by Viol8 ( 599362 )

      Agree about VGA. One connector I don't miss however is SCART. Whoever designed that oversized wart connector with a thick cable that came out sideways instead of perpendicular deserves to spend an eternity in hell trying to sort out cabling runs behind a complex analogue video setup.

      • Agree about VGA. One connector I don't miss however is SCART. Whoever designed that oversized wart connector with a thick cable that came out sideways instead of perpendicular deserves to spend an eternity in hell trying to sort out cabling runs behind a complex analogue video setup.

        Wrong. Scart was a product of its era. TVs were not flat, coming sideways helped you push the TV towards the wall as much as possible.

        It is oversized because, at the price points for home TVs and the technology available in that era, using a smaller connector with better tolerances would have raised the price significantly.

        • by Viol8 ( 599362 )

          "Wrong. Scart was a product of its era."

          So was every other cable around then, and none of them came out sideways.

          "TVs were not flat, coming sideways helped you push the TV towards the wall as much as possible."

          You've contradicted yourself. CRT back ends stuck out way further than any cable could. It's only with flatscreen TVs that you'd care about that sort of thing, yet HDMI comes out straight.

          "using a smaller connector with better tolerances woud have risen the price significantly."

          Yet they managed it with (S)VG

    • With all of the different flavors of HDMI, DVI, Display Port, and USB 3, good luck connecting your devices to monitors. My box of adapters is growing rapidly.

      I know they all have benefits, but I kinda miss the old VGA adapter.

      If you miss the old VGA adapter, you did not live through the halcyon days of VGA vs SuperVGA on DOS and Win3.x.

      Also, you did not witness the VESA BIOS 1.3 mess of non-compliant cards...

      And remember that, in those days, inputting the wrong resolution + refresh rate combo could PHYSICALLY DAMAGE your monitor...

      Ah, those were the days...

      Or the std parallel vs EPP vs ECP vs Laplink Parallel ports

      Or the 100Mb 2-pair Cat 5 Ethernet vs 100Mb 4-pair Cat 3 Ethernet...

      Or the great HDD rating change from MiBytes to MBytes (see the quick sketch below)

      These s
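      As a quick aside on that HDD rating point, the decimal-vs-binary gap is easy to quantify; the sketch below uses illustrative drive sizes, not any particular product.

      ```python
      # Quick sketch of the decimal-vs-binary unit gap: a drive sold in SI units
      # (10^3 steps) looks smaller when the OS reports binary units (2^10 steps).
      SI = 1000
      BIN = 1024

      for label, size_bytes in [("500 GB", 500 * SI**3), ("1 TB", SI**4), ("4 TB", 4 * SI**4)]:
          gib = size_bytes / BIN**3
          tib = size_bytes / BIN**4
          print(f"{label:>6} on the box = {gib:8.1f} GiB ({tib:.2f} TiB) in the OS")
      ```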

      • by Megane ( 129182 )
        And I remember when XFree86 insisted on probing and divination with incense to determine custom mode lines, apparently with a list of only the most popular monitors, instead of letting you just pick from the VESA standard SVGA refresh rates. I needed my display to work at all much more than I needed an extra 24 pixels of screen width. The only thing worse than too many standards is throwing them all out the window and then guessing.
  • A lot of us thought it was weird that Apple didn't put HDMI 2.1 in their new laptops, but it sounds like maybe they could technically have claimed that they had, while supporting literally none of the features. I wonder if they just looked at the state of affairs and didn't want to participate? Usually they don't pass up an opportunity for good marketing, but the scrutiny here would have been a lot of bad press.

    But more to the point, why even have standards that are optional like this? The ONLY thing that they are guaranteed to do is cause confusion. Even if most of the people in the market are good actors, even one will upend the whole cart.

    • by AmiMoJo ( 196126 )

      It's because each version of the standard incorporates a bunch of different things. Some of them won't apply, like 8k makes no sense on a 4k TV. That TV might support variable refresh rate though, which is not supported on HDMI 2.0.

      USB has the same problem. USB 2.0 brought support for a lot of new devices, but a lot of them only used the old USB 1.1 speed signalling. You don't really need your audio mix controller to support 480mbps, that would just add cost for no benefit to something that updates 100 time

      • While that's all perfectly fair, I think the certification process is lacking, then.

        So having an HDMI 2.1 certified cable needs to mean that it can support the whole feature set. But you shouldn't be able to put HDMI 2.1 on your TV if it doesn't support the full spec, even the stuff that makes no sense. They should have some sub-category system so that they can say it meets the HDMI HD Monitor spec or something, which REQUIRES an HDMI 2.1 cable, but otherwise confirms a fixed list of features, and no end-po

        • by bws111 ( 1216812 )

          What you are claiming they 'should' do is exactly what they are doing. In spite of what this idiot reporter says, you can NOT just put 'HDMI 2.1" on your TV. If you are going to use the version number, then you MUST state exactly which features (using standardized names and acronyms) of the 2.1 spec your device supports.

          • Yeah, but I think seeing '2.1' on a device is too misleading a signal. You shouldn't get to put 'HDMI 2.1' anywhere on the device at all unless it meets the full specification. Any sub-specification should have its own name that doesn't have '2.1' in it.

            The reality is that there are those of us that will check, and a bunch of people that will get ripped off, and just because I don't think I'd fall into that trap doesn't mean I think other people should. It seems designed to be confusing—and maybe it i

          • by suutar ( 1860506 )

            So... how do you represent "2.0 with no new features" other than "2.1"? (Since 2.0 is no longer a thing)

      • by Chaset ( 552418 )

        I think this is part of the problem:
        >, but a lot of them only used the old USB 1.1 speed signalling.

        There are names for the speeds separate from the spec version. USB 1.1 defined "Low speed" and "Full speed" devices. Not all devices were required to support "Full speed" for the reason you state. There's no point for some devices. "Full speed" is defined in the 2.0 spec as well, so it's a bit misleading to say 'USB 1.1 speed". The USB IF defined those "High Speed" and "Super speed" logos to clarify t

  • by Jaegs ( 645749 ) on Tuesday December 14, 2021 @12:08PM (#62079257) Homepage Journal

    LTT invested in an HDMI tester and determined that a shockingly high number of HDMI cables—some from reputable vendors—were not compliant:

    https://linustechtips.com/topi... [linustechtips.com]

    They provided their test results in a convenient spreadsheet as well.

    Long story short: the failure rate of Monoprice cables is concerning; also, stick to cables 2m or less, since anything 3m and beyond tends to be more unreliable.

  • This is my age talking. My first PC had a black and white CRT monitor with a resolution of 640x480. But I bet a lot of people can't tell the difference between 4K @ 60Hz and 10K @ what was it? 120Hz. Not sure why HDMI makes their standard an empty box, but I am pretty sure that the people who need the real HDMI 2.1 stuff know what to look for. But yes... I probably had the same conversation with my dad about finally buying a color CRT monitor. Very hard to land a plane on a black aircraft carrier deck on a bla
  • "the HDMI standards body owner actually encourages TV and monitor manufacturers that have none of those things -- zip, zilch, zero -- to effectively lie and call them "HDMI 2.1" anyhow"

    • Because the licensing body also says "You can only use version numbers when clearly associating the version number with a feature or function as defined in that version of the HDMI Specification. You cannot use version numbers by themselves to define your product or component capabilities or the functionality of the HDMI interface."

      But if you leave that little detail out it is much easier to stir up faux outrage.

  • This was one of the more unnecessary pieces of advice I’ve gotten in a while.

  • by Anonymous Coward

    Xiaomi was selling an "HDMI 2.1" monitor that supported no HDMI 2.1 features

    If we can't trust a first-class brand such as Xiaomi, the world just doesn't make sense to me any more.

  • So just like they did with HD Ready, 1080p vs 1080i TVs and media centres.

  • USB-C and USB 3.0+ are the same story. I bought a USB-C smart phone assuming it could do all the fancy things USB-C and USB 3.0 advertise. Nope; they're optional. The display port (MHL), sound, networking, docking station, laptop charging, etc... *all optional*. You can say your product is USB 3.2, and it can run at only the minimum low-power and base USB 3.0 speed, and it's still perfectly legal branding.

    USB 3.0 supports 5 volt 150 mA power by default. Everything else is optional:
    - 900mA for "hi

  • If I plug it in, and it works, I'm happy. I don't really care if it meets some technical standard for HDMI 2.1. Sure, there's the ethical issue of truth in advertising. These days, when you order something on Amazon, you're lucky if the brand name displayed isn't a total fraud, much less compliance with a specific HDMI standard.

  • This doesn't really matter as there is almost zero content. It's hard enough trying to get 4K live TV.
  • If you are the kind of person who looks for specific, exotic, expensive features in the hardware you buy, then you probably also look at spec sheets to see whether those features are actually supported.
    How could a single user-friendly label summarize the sea of optional features that today's hardware can support? Or, if you think that those features shouldn't be optional, do you expect a technical standard to force manufacturers to equalize their offerings to the level of their most expensive products? This do
  • HDMI 2.1 is *not* a new connector standard. It uses the same connector and cabling that has existed for many a year. It is a software standard, not a hardware standard.

  • "Worse, they'll get fooled for no particularly good reason:"

    Wrong.

    Profit. Profit is the reason.

    "Remind me to give you a copy of the rules, you never know when they'll come in handy." [youtu.be] - Quark Solves the Problem of War With Economics (Star Trek: Deep Space Nine : Emissary, Part 1)


"Money is the root of all money." -- the moving finger

Working...