Television | Displays | Technology

Ask Slashdot: Why Don't HDR TVs Have sRGB Or AdobeRGB Ratings?

dryriver writes: As anyone who buys professional computer monitors knows, the dynamic range of the display device you are looking at can be expressed quite usefully in terms of percentage sRGB coverage and percentage AdobeRGB coverage. The higher the percentage for each, the better and wider the dynamic range of the screen panel you are getting. People who work with professional video and photographs typically aim for a display that has 100 percent sRGB coverage and at least 70 to 80 percent AdobeRGB coverage. Laptop review site Notebookcheck, for example, uses professional optical testing equipment to check whether the advertised sRGB and AdobeRGB percentages and brightness in nits for any laptop display panel hold up in real life.

This being the case, why do quote-unquote "High Dynamic Range" capable TVs -- which seem to be mostly 10 bits per channel to begin with -- not have an sRGB or AdobeRGB rating quoted anywhere in their technical specs? Why don't professional TV reviewers use optical testing equipment that's readily available to measure the real-world dynamic range of HDR or non-HDR TVs objectively, in hard numbers? Why do they simply say "the blacks on this TV were deep and pleasing, and the lighter tones were..." when this can be expressed better and more objectively in measured numbers or percentages? Do they think consumers are too unsophisticated to understand a simple number like "this OLED TV achieves a fairly average 66 percent AdobeRGB coverage?"
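For context, a coverage percentage like the ones Notebookcheck reports boils down to simple geometry: each gamut is a triangle in CIE 1931 xy chromaticity space, and coverage is the area of the panel's triangle that overlaps the reference triangle, divided by the reference's area. Here is a minimal Python sketch of that computation; the PANEL primaries below are invented for illustration, while real testing uses values measured with a probe:

    # Minimal sketch: percent gamut coverage from chromaticity coordinates.
    # Coverage = area(panel gamut clipped to reference gamut) / area(reference).

    SRGB  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B (CCW)
    ADOBE = [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)]
    PANEL = [(0.655, 0.330), (0.250, 0.660), (0.150, 0.060)]  # hypothetical panel

    def area(poly):
        # Shoelace formula for polygon area.
        n = len(poly)
        return abs(sum(poly[i][0] * poly[(i + 1) % n][1]
                       - poly[(i + 1) % n][0] * poly[i][1]
                       for i in range(n))) / 2.0

    def _cross(a, b, p):
        # > 0 if p is left of the directed edge a->b.
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    def _intersect(p, q, a, b):
        # Intersection of segment p-q with the infinite line through a-b.
        den = (p[0] - q[0]) * (a[1] - b[1]) - (p[1] - q[1]) * (a[0] - b[0])
        t = ((p[0] - a[0]) * (a[1] - b[1]) - (p[1] - a[1]) * (a[0] - b[0])) / den
        return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

    def intersection(subject, clipper):
        # Sutherland-Hodgman polygon clipping; clipper must be convex and CCW.
        out = subject
        for i in range(len(clipper)):
            a, b = clipper[i], clipper[(i + 1) % len(clipper)]
            inp, out = out, []
            if not inp:
                break
            s = inp[-1]
            for e in inp:
                if _cross(a, b, e) >= 0:        # e is inside this clip edge
                    if _cross(a, b, s) < 0:     # s->e enters: add entry point
                        out.append(_intersect(s, e, a, b))
                    out.append(e)
                elif _cross(a, b, s) >= 0:      # s->e leaves: add exit point
                    out.append(_intersect(s, e, a, b))
                s = e
        return out

    for name, ref in (("sRGB", SRGB), ("AdobeRGB", ADOBE)):
        cov = area(intersection(PANEL, ref)) / area(ref) * 100
        print(f"{name} coverage: {cov:.1f}%")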
  • Because... (Score:5, Informative)

    by msauve ( 701917 ) on Tuesday December 11, 2018 @09:07PM (#57789886)
    "why do not have an sRGB or AdobeRGB rating ... Why don't professional TV reviewers use optical testing equipment..."

    Because video is ultimately encoded as YCbCr, wide gamut is compared against Rec. 2020, and you're not looking at the right review sites [rtings.com].
    • Re: Because... (Score:3, Insightful)

      by Anonymous Coward

      This.
      Also, most people do not know, or care to know, the intricacies of yet another rating system... let alone a proprietary one.

      • It's also because no reviewer worth their salt would ever, ever write an objective review based on actual measurements when they can churn out endless subjective reviews. The entire high-end audio industry is built on reviewers' arbitrary opinions, devoid of any actual measurements or testing. That's reserved for third-party sites and blogs that point out how ridiculous most high-end audio reviews and claims are.
    • by Kjella ( 173770 )

      Because video is ultimately encoded as YCbCr, wide gamut is compared against Rec. 2020, and you're not looking at the right review sites

      Or against DCI P3, which is very similar [astramael.com] to AdobeRGB and much more relevant for video, since it's the digital cinema standard. Full coverage of Rec. 2020 is pretty much mission impossible, so the important part is which bits you miss: hopefully not much within DCI P3 or Pointer's gamut (which is approximately all the colors natural objects give off, so not neon signs etc.), while the photo standards aren't very important unless you use the TV as a monitor.

      • by msauve ( 701917 )
        I mentioned 2020 because it's the newer, wider gamut and it includes the full (well, 99.99%) P3 gamut. Most people don't have digital cinema projectors or a source for the associated content. 2020 is for consumer UHDTV, is what UHD content will be mastered for, and so is most applicable to this discussion.

        The OP is already confusing the needs of content creators with content delivery; I didn't see a need to throw a bunch more at them, like dynamic contrast extension (Dolby Vision, HDR10+) or pulldown.
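        A quick way to sanity-check that "99.99%" figure is to test whether each DCI-P3 primary lies inside the Rec. 2020 triangle in CIE 1931 xy space. A minimal sketch using the standard published chromaticity coordinates (not a real colorimetry library); running it shows the P3 red primary sits a hair outside the 2020 triangle, which is where the missing fraction of a percent comes from:

          REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B (CCW)
          DCI_P3  = {"R": (0.680, 0.320), "G": (0.265, 0.690), "B": (0.150, 0.060)}

          def inside(tri, p):
              # p is inside a CCW triangle iff it's left of (or on) every edge.
              return all(
                  (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0
                  for a, b in zip(tri, tri[1:] + tri[:1])
              )

          for name, p in DCI_P3.items():
              print(name, "inside Rec. 2020:", inside(REC2020, p))
          # -> R: False (just barely), G: True, B: True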
        • by Kjella ( 173770 )

          2020 is for consumer UHDTV, is what UHD content will be mastered for, and so is most applicable to this discussion.

          Standard-wise that's true, but it requires them to do a separate color grade for UHD compared to the cinema release. I was under the impression that many would reuse that: it'd be in a Rec. 2020 container, but the actual color would stay within DCI P3, as that's a strict subset and pretty wide already. Though my info on that may be outdated; I've not done any actual tests.

          • by Malc ( 1751 )

            Given that there are no real BT.2020/BT.2100 screens around, I imagine most people will colour grade on something that supports DCI-P3 D65. HDR10 metadata (SMPTE 2086, "Mastering Display Colour Volume") can convey the colour space of the reference screen used for grading, and presumably something capable of rendering true BT.2020 colours can take this into account. Does this mean they have to colour grade once or twice for cinema and TV?

        • by Malc ( 1751 )

          The story is really about HDR TVs, so perhaps it's worth being a bit more precise. BT.2020 is different: it refers to colour gamut, and it can be used with SDR and HDR. We should really be talking about BT.2100 for HDR, which is the BT.2020 colour space with PQ (the basis of HDR10, Dolby Vision, etc.) or HLG transfer characteristics.

          DCI-P3 D65 is effectively a subset of the BT.2020 colour space, and screen manufacturers are getting pretty close to 100% coverage. I suppose at some point they will try to achieve complete coverage.
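          Since PQ came up above: it's a fixed, documented curve (SMPTE ST 2084), so its EOTF can be sketched in a few lines. The constants are from the standard; the loop at the end just illustrates how heavily PQ weights its code values toward dark and mid tones rather than peak brightness:

            M1 = 2610 / 16384        # ST 2084 constants
            M2 = 2523 / 4096 * 128
            C1 = 3424 / 4096
            C2 = 2413 / 4096 * 32
            C3 = 2392 / 4096 * 32

            def pq_eotf(code: float) -> float:
                # PQ code value (0..1) -> luminance in cd/m^2 (nits), peak 10000.
                p = code ** (1 / M2)
                return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

            for v in (0.25, 0.5, 0.75, 1.0):
                print(f"code {v:.2f} -> {pq_eotf(v):8.1f} nits")
            # code 0.50 is only ~92 nits: most of the range encodes dark detail.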

    • Well, the biggest factor is that these televisions are sold to consumers. Consumers don't know about this stuff in general. This is not just a factor with HDR televisions, but with every product marketed to the general public. Even the fine print of consumer products will leave out vital details much of the time; e.g., I was trying to find a USB 3.0 thumb drive last year, but so many of those on the shelf just said "USB" or "high speed".

      Besides, if you care about such stuff you need to actually see the product

  • by Anonymous Coward

    Videos use different color gamuts. HD uses Rec. 709, while UHD uses Rec. 2020. The default for movie projection is DCI-P3. Professional monitors will list the coverage of those color spaces just like sRGB and AdobeRGB.
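    For the curious, the YCbCr encoding mentioned elsewhere in this thread is just a linear transform of the nonlinear R'G'B' signal, and only the luma coefficients differ between the HD and UHD standards. A minimal sketch, before any quantization or range scaling:

      COEFFS = {
          "BT.709":  (0.2126, 0.7152, 0.0722),   # HD luma coefficients
          "BT.2020": (0.2627, 0.6780, 0.0593),   # UHD luma coefficients
      }

      def rgb_to_ycbcr(r, g, b, standard="BT.709"):
          # r, g, b are nonlinear R'G'B' in 0..1.
          # Returns Y' in 0..1 and Cb, Cr in -0.5..0.5.
          kr, kg, kb = COEFFS[standard]
          y = kr * r + kg * g + kb * b
          cb = (b - y) / (2 * (1 - kb))
          cr = (r - y) / (2 * (1 - kr))
          return y, cb, cr

      print(rgb_to_ycbcr(1.0, 0.0, 0.0, "BT.709"))   # the same red...
      print(rgb_to_ycbcr(1.0, 0.0, 0.0, "BT.2020"))  # ...gets different code values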

  • TVs, not monitors (Score:5, Informative)

    by guruevi ( 827432 ) on Tuesday December 11, 2018 @09:08PM (#57789894)

    TVs actually have quite a different color space and are also a lot brighter than professional monitors, which makes AdobeRGB or sRGB ratings kind of moot, since nobody uses a TV at 35-40% of the brightness (to get down to 160 cd/m²).

    You really don't want an "HDR TV" as a monitor and vice versa, hence the ratings are pointless.

    • >You really don't want an "HDR TV" as a monitor and vice versa, hence the ratings are pointless.

      Why not?

      I ask as someone who has been fairly happy using a 1080p TV as a monitor for 10+ years, and am thinking it's about time to upgrade to a 4K TV.

      • by JBMcB ( 73720 )

        Why not?

        Because, oversimplified:

        1. A professional monitor is tuned assuming you are going to be sitting a couple of feet away looking at graphics, and
        2. A TV is tuned assuming you are sitting on the other side of the room watching movies

        Mostly, there are issues with brightness, contrast, white balance, motion processing and latency.

        • Would you care to elaborate? I have done a fair bit of research on the subject, and the only areas that seriously concern me are:

          brightness - TVs are generally far too bright to use as a monitor, and apparently LED backlit sets mostly use PWM for "dimming", with the resultant flickering potentially causing eye strain and other problems.
          and
          response time - it seems that most decent VA TV panels have a response time roughly double that of VA monitors, so you'll get some ghosting in high-speed games. There's

          • I should clarify, when I say "the only areas that seriously concern me are...", I mean the only areas that have aroused serious concern, not that those are the only areas that concern (interest) me.

          • Would you care to elaborate?

            A professional monitor is usually adjusted to a level you'd consider too dark, because the adjustment is trying to match what you would see from a physical print on paper instead of a glowing monitor...

            TVs are also often shipped set up to saturate colors heavily; you can sometimes tone that down, but not always in a consistent way, so again, if you made a print (or even looked at the image on another monitor) the colors could be quite different.

            Personally I do use a "real"

            • Shipped, yes - it's widely known among enthusiasts that the default settings on most TVs are designed for brightly lit showroom floors so that the demo units look good, and the first thing you should do upon getting one home is to switch to a more standard mode. And then adjust brightness, contrast, white balance, and other calibration settings if you actually care, which I think you absolutely should, for use as a monitor.

              The question is, once you get the brightness and colors calibrated for use as a monitor

            • It's, frankly, bizarre reading this stuff.

              Check out the "setup" menu on your TV - you can adjust the brightness, contrast, etc.

              I know - space age shit going down right there. But it's really true.

              • Even after you adjust those, for plain old TV sets it's not assured the colors will be in balance, as I said... and there's generally no way to apply an ICC correction profile. Turning down saturation is no guarantee either.

                Bizarre indeed reading responses from people who don't understand color management and think turning down the brightness and contrast is in any way "calibration".

                • I've been using UHD TVs as my main desktop for a couple of years now, so I have a little experience in this field. The colors aren't perfect but they're so far on this side of "good enough" that it doesn't matter. If I need color perfection I can just use the laptop's screen.

                  Michael

    • by Trailer Trash ( 60756 ) on Tuesday December 11, 2018 @09:47PM (#57790082) Homepage

      You really don't want an "HDR TV" as a monitor and vice versa, hence the ratings are pointless.

      Uh, yeah, I do. I've been using 4K TVs for a couple of years now as my main desktop monitor and I couldn't be happier.

      Right now I have a 48" Vizio and a 49" LG. Neither cost more than $500, both do 4:4:4 just fine, and both use a standard HDMI connection to my laptop. The laptop's a few years old so I only get 30 Hz, but I can handle that.

      I have 10 terminals and a couple of web browsers up. So I can program, multiple files at a time, plus consoles, database command line, dev web server log, documentation, and web output up all without anything being occluded.

      I started programming on a 256x192 screen with 28x24 characters. It's awesome to have this much real estate, and it has raised my productivity noticeably.

      • by msauve ( 701917 )
        "I started programming on a 256x192 screen with 28x24 characters."

        You were lucky. I had 6x 7-segment LEDs (KIM-1). But, that's not really right, because before that the "display" was larger, a 1x80 punch card.
        • by Bruce Perens ( 3872 ) <bruce@perens.com> on Wednesday December 12, 2018 @01:30AM (#57790786) Homepage Journal
          Someone I know took a 4-year degree in computer science without ever touching a terminal. Holey cards, line printers, and batch processing all of the way. Imagine all of that time and having no concept of interactive software.
          • by MrMr ( 219533 )
            My high school offered a class on programming, where you would mark your programs on cards with a soft pencil and have them sent by mail to the computer center for processing. Output was sent back a week later, in the next class.
            That must have been a 1 millibaud connection.
            • Luxury! I had to program my computer using steel and flint, trying to aim the spark at the correct set of rocks to set each bit!

            • I mentioned the 256x192 screen because I knew it would set off a small avalanche of oneupmanship. I have not been let down :) Actually, I appreciate both stories. It's great to see how far we've come.

          • Someone I know took a 4-year degree in computer science without ever touching a terminal. Holey cards, line printers, and batch processing all of the way. Imagine all of that time and having no concept of interactive software.

            My eldest brother completed a CS degree at SFU around 1977. I often saw him carrying around one of his assignments in the form of piles of punch cards held together with rubber bands. His experience would not have been completely without any concept of interactive software, however, since they did have teletypewriters. I recall playing a game of tic-tac-toe on one. Seemed quite amazing at the time.

        • 256*192 pixels is 32*24 characters. If you are going to falsely brag, at least do your research. Signed: A real Sinclair ZX Spectrum owner.
      • 80x24 white and pale green screens... stacks of them. Dixon Ticonderoga #2 keyboard/mouse combination, electric sharpener, Staedtler Mars backspace, desk brush for backspace and bagel crumbs.

        Next stop: do it yourself keypunch room, or submit to keypunch pool, wait overnight.

        This included embedded systems work on an 8080.

        My 4K TV has too much glare to be a good monitor, and it blocks my view out the window. My Happy Hacking keyboards each cost as much as the TV. Decent HD monitors are under $140.

      • One other thing, if I have wide database output or whatever, at standard font size I can get up to 635 columns and/or 134 rows on this screen. I also edit documents in full-page mode.

    • since nobody uses a TV at 35-40% of the brightness

      Color looks terrible at full brightness on most LED panels. Before I even start to calibrate color, I turn brightness way down. No more than 50% brightness on my current TV, but I don't know the exact number.

    • Okay, I got a lot of great answers as to things to watch out for when using a TV as a monitor, but is there anything specific to HDR that's a problem?

      Assume I'm only looking at 4K TVs that use standard monitor-style RGB/BGR/etc subpixel layouts, has a good latency and response time, and accurate color reproduction after calibration.

      Is there anything specific to HDR that's a potential problem?

      The only thing I can think of is brightness, as it seems most HDR sets have a much greater maximum brightness than SDR sets.

  • by Anonymous Coward on Tuesday December 11, 2018 @09:09PM (#57789910)

    For all intensive purposes that's an eggcorn I hadn't seen before, but it's a doggy dog world out there and I have zero taller ants for this sort of thing which some people are known to have a feel day with...

    • This deserves a Funny mod, not Troll, and if I had mod points I'd give it to you.

    • Re:Quote-on-quote (Score:5, Informative)

      by Bite The Pillow ( 3087109 ) on Tuesday December 11, 2018 @11:48PM (#57790496)

      dryriver, for the record, for a single word you want to include in "quotes" some people say "quote-unquote" meaning "begin quote then end immediately".

      It is best in a reading medium to use actual quotation marks, and elsewhere use it not at all, or sparingly if you hate people and want them to hate you equally.

      I'm thinking now of a movie with Katharine Hepburn and Mae West which likely does not exist, and should very much.

    • by Mal-2 ( 675116 )

      But what we all want to know is, did you still get your French benefits?

    • I'd give a nominal egg for a TV like that...

  • rtings.com?

  • by Octorian ( 14086 ) on Tuesday December 11, 2018 @09:11PM (#57789918) Homepage

    Ahem... Stop confusing terminology... AdobeRGB and sRGB coverage are the "color gamut" spec. While it's valid to ask why these aren't promoted in TV specs, "dynamic range" is a completely different spec item.

    • by Kjella ( 173770 )

      While it's valid to ask why these aren't promoted in TV specs, "dynamic range" is a completely different spec item.

      That's not really true: when you combine color space and brightness you get color volume [rtings.com], because most TVs can display fewer colors when the picture is really bright. A set can show extremely bright white sunlight, but vibrant reds, greens, and blues at that brightness are much harder. It's one of the things they've made good progress on in recent years; they've now "filled out" the box bounded by the gamut and the dynamic range.
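      To make the "box" idea concrete, here is an illustrative sketch with entirely made-up measurements: sample how much of the rated gamut area the panel can still reach at each luminance level, then integrate area over luminance. A bright panel whose colors wash out near peak scores a smaller volume than its gamut alone would suggest. Real test benches (e.g. rtings') use standardized color spaces for this; the numbers below are invented:

        # (nits, fraction of rated gamut area still reachable) -- invented data
        measurements = [
            (0,    0.00),
            (100,  1.00),
            (400,  0.95),
            (700,  0.70),   # primaries start washing out
            (1000, 0.40),   # near peak, mostly just bright white is left
        ]

        def color_volume(samples):
            # Trapezoidal integral of gamut area over luminance.
            return sum((l1 - l0) * (a0 + a1) / 2
                       for (l0, a0), (l1, a1) in zip(samples, samples[1:]))

        ideal = measurements[-1][0] * 1.0   # full gamut at every luminance level
        print(f"relative color volume: {color_volume(measurements) / ideal:.0%}")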

    • No, it's not. With HDR TVs we're not talking about performance, but rather marketing. The reality is that 10-bit HDR TVs have a defined colour gamut, Rec. 2020, and when purchasing an HDR TV you should definitely review its ability to cover Rec. 2020 properly.

  • by Anonymous Coward

    “...quote-on-quote...”

    Really? Does no-one here have an understanding of basic grammar or spelling, or are these articles perhaps written by some kind of primitive AI?

    Here in the normal, English-speaking parts of the world, the expression is “quote-unquote”.

    • “...quote-on-quote...”

      Really? Does no-one here have an understanding of basic grammar or spelling, or are these articles perhaps written by some kind of primitive AI?

      Here in the normal, English-speaking parts of the world, the expression is “quote-unquote”.

      And, it's only used in spoken language, not written language. When writing, you just use the damn quotes.

  • by mattyj ( 18900 ) on Tuesday December 11, 2018 @09:32PM (#57790014)

    Broadly speaking, TVs are for consumers, monitors are for creators. We consumers just wanna know if it looks good, and numbers won't necessarily tell us that. There's a high quotient of subjectivity there.

    It's like when you take your car to be smogged: you get that printout with all kinds of numbers on it. Do you care? No, you just zoom in on the pass/fail part. The DMV and the guy doing the test might care, but the numbers are irrelevant to your purposes.

    • by Mal-2 ( 675116 )

      I do care, and I look at the numbers, because if any one of them is getting close to the fail line, then chances are it's going to fail the next time around. I'd rather have two years to deal with it than have to do it in a hurry when it does eventually fail a test.

    • Broadly speaking, TVs are for consumers, monitors are for creators.

      And broadly speaking, the distinction is irrelevant; there are plenty of consumers who actually care about their TV's ability to accurately reproduce the Rec. 2020 colour space required by 10-bit HDR content.

  • Check out HDTVTest on youtube ( https://www.youtube.com/user/h... [youtube.com] ) or http://www.rtings.com/ [rtings.com]
  • No one cares (Score:4, Insightful)

    by Anonymous Coward on Tuesday December 11, 2018 @09:39PM (#57790044)

    Because 99.95% of consumers don't give a shit.

    • No, because 99.99% of consumers couldn't read a spec sheet to save themselves. Among these consumers is the person who posted the question, given that he's asking about sRGB and AdobeRGB, neither of which is relevant to the Rec. 2020 colour space that is specified for HDR content.

      Consumers do care about specs; it's why marketing departments advertise bagillion-to-1 contrast ratios, push "8K" pixels while lying about which dimension they're talking about, and try to outdo each other in how "colourful"

  • Also... (Score:4, Insightful)

    by mr.dreadful ( 758768 ) on Tuesday December 11, 2018 @10:10PM (#57790196)
    Lots of good answers already, but the simplest is that hardly anyone outside certain industries has any idea of color space. Trying to explain how monitor color and printer color work is like trying to explain physics to a toddler (generally speaking). It shouldn't be that hard, but somehow most people don't get the concept until their image comes back pink rather than red.
    • You're not creating content on a TV but rather consuming defined content. The only question is how accurately you reproduce the Rec. 2020 colour space. The exception is people who use TVs as monitors, but then they clearly don't care about quality anyway.

  • Say a brand makes perfect HDR displays that totally meet all standards, but for $5000~$10000 when sold to the consumer.
    Factory tested and consumer-grade quality.
    Do they get to claim the letters "H", "D" and "R" as a standard so everyone has to buy from them?
    Want to pay $10000 for an approved "TV" display from one company again?

    Another, less advanced nation "reverse engineers" anything emerging in TV display tech.
    With low labor costs, light regulation, and less product quality control, they can get 4K and 60
  • don't have a clue what you just said.

    Me? My whine is "why don't they give me 4 HDMI ports? I've got a cable box, game box, Roku, and DVD player. And I doubt I'm in the minority here. Yet you only offer 2 HDMI ports. What's This Feature?"
    • by Mal-2 ( 675116 )

      That's what these are for [amazon.com].

      • That's what these are for.

        I agree with parent. There is no excuse for TV vendors to skimp on HDMI ports. The product you linked to does not even support 4K properly: no 60 Hz, which is kind of a big deal for 4K going forward.

        • by Mal-2 ( 675116 )

          I agree that this should be standard functionality. I was just pointing out that there's a whole class of products devoted to solving this problem.

    • HDMI Ports (Score:5, Insightful)

      by crow ( 16139 ) on Wednesday December 12, 2018 @12:31AM (#57790644) Homepage Journal

      Yup. Most people will never really appreciate how well the TV does on color as long as it's good enough. But many of us will notice that it doesn't have enough connectivity.

      My big gripe is smart TVs that have a nice interface for selecting inputs, but don't have enough of them, so you end up needing an external switch, and you're back to a separate interface for selecting inputs. You can never have too many inputs, but they could be a lot less stingy. If you figure a cable box, a disc player, two gaming systems, and a streaming device, that's five inputs.

      Perhaps there's only one game console, but eventually you'll get a new one, and then you may want to play some of the old games. Perhaps the TV has a good streaming solution built-in. Still, you don't want to run out, and adding HDMI ports should be dirt cheap.

    • Your speakers are crap and most audio receivers have plenty of HDMI ports. Your TV only needs the one port to connect to it. Get yourself a nice sound system.

  • for the HDTV I use as a monitor.

    One was not to be found or even hinted at. Yet I found a picture setting: Apply picture settings - All sources.
    That should do the same, I think; the HDTV is broken and awaiting warranty replacement (that setting is greyed out).

    I have a Samsung monitor that will send a profile to my Samsung printer via USB; at least, that's the plan.
    Talking warranty replacement, I asked about that setting; nobody has a clue, though Google insists it will work.

    • Did you actually change any color settings on the TV before trying to apply those to all sources? You calibrate the TV color manually after taking it out of one of the presets and using "advanced" settings to tweak color balance while using reference graphics on your connected source. Once you get that, you can apply those tweaked settings to your other inputs (all sources).

      Don't rely on a preset color profile on a monitor either. There are factory variances to correct for.

      • Did you actually change any color settings on the TV before trying to apply those to all sources?

        No, and the reason for the warranty return. Most of the picture options are greyed out, no matter the mode.

        I had a picture I wanted to give someone, tweaked it on the monitor, and printed it. The printout was too dark to view.
        That was after I got it back from warranty repair; the monitor's going back for the second time. Samsung has been great about it.
        I have bunches of professional monitor test pictures; they are the best I can do without special equipment.

        • If every single device you own is "defective," have you started to think that maybe you just don't know how to do color calibration properly? Furthermore, a CMYK color space can't accurately be represented on an RGB screen and vice versa, so it will never match the screen exactly.

          Most screens come out of the box with their brightness set way too high. I would start there, and it would certainly help explain your dark picture.
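          Part of why prints never match the screen: without ICC profiles, the conversion from screen RGB to printer CMYK is just arithmetic, like the naive sketch below, and it knows nothing about the actual inks, paper, or panel. Real color-managed workflows replace this with per-device profiles; this toy version only illustrates the mismatch:

            def rgb_to_cmyk_naive(r, g, b):
                # r, g, b in 0..1. Device-naive conversion; real workflows use ICC.
                k = 1 - max(r, g, b)
                if k == 1.0:
                    return 0.0, 0.0, 0.0, 1.0   # pure black
                c = (1 - r - k) / (1 - k)
                m = (1 - g - k) / (1 - k)
                y = (1 - b - k) / (1 - k)
                return c, m, y, k

            # A saturated screen green has no exact ink equivalent, so this
            # mathematically "correct" conversion still prints duller than it looks.
            print(rgb_to_cmyk_naive(0.0, 1.0, 0.0))  # -> (1.0, 0.0, 1.0, 0.0)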

  • Quote what-now (Score:5, Interesting)

    by martinX ( 672498 ) on Tuesday December 11, 2018 @11:13PM (#57790402)

    Firstly, quote-on-quote is just wrong (points for hyphenation, though). It's "quote-unquote", and it's the spoken way of indicating that the phrase that follows would probably have quotes around it in written form, to indicate "so-called". So many people use air quotes in conversation that it's probably no surprise the author is unfamiliar with the correct spelling. This leads me to the next obvious thing, which is that the piece is written. The quotes are actually used in the text around the part that is "quote-unquoted", so there is no need for that phrase at all.

    Was this dictated to Siri or something?

  • by Anonymous Coward

    First off "dynamic range" is used totally incorrectly here. That's only useable for luma, brightness.

    The actual term to use here is color space. A color space is defined by its RGB primaries, the maximum red, green, and blue values it can produce. Any color in between those primaries can be recreated by combining them.

    Because sRGB and (most of) Adobe RGB fall within the P3 color space, if the panel is able to recreate the P3 gamut it can recreate the others. How well it does so is another matter, but usually

  • by Miamicanes ( 730264 ) on Wednesday December 12, 2018 @12:38AM (#57790670)

    Part of the problem has to do with the "white" LEDs used for backlighting the LCD -- the exact nature of their "white" varies slightly from batch to batch. Expensive TVs (usually) make a point of using "white" LEDs that are from the "best" batches/bins (consistency from LED to LED, color purity, etc). Cheap TVs use "white" LEDs from the lower batches/bins. The cheapest Black Friday TVs use whatever LEDs were left over after making the "main" manufacturing run.

    So... you might have a TV made by someone like Samsung with a model number like MHD4kQ62 that gets made with the best LEDs... then a model with similar (published) specs that uses the cheaper LEDs & has a model number like MHD4kQ62SXB that sells for $100 less at Best Buy, and an additional model that once again has similar published specs, but uses the cheapest/leftover LEDs, has a model number like MHD4kQ62SVW and sells for $27 less than the Best Buy version at Walmart (possibly with fewer HDMI inputs, just to further spite buyers and shave another 25 cents from the manufacturing cost).

    The point is, they don't talk about THOSE performance specs, because they don't WANT to talk about those performance specs. By not talking about them, they can let Walmart have a model that looks almost the same on paper, even if it's egregiously inferior if you saw it side by side with the most expensive variant.

    Another area where they often cut corners: the timing circuit that allows a TV to natively deal with 50 Hz and 60 Hz, instead of being locked to 50 Hz or 60 Hz native. It only saves a few cents because it's mostly just a few passive components omitted from the mainboard, but they do it anyway (especially with US models) because they know that 98% of US buyers won't notice the difference.

    • Part of the problem has to do with the "white" LEDs used for backlighting the LCD -- the exact nature of their "white" varies slightly from batch to batch.

      The problem is use of white LEDs in the first place.

      • It's entirely possible to get high-quality white light from LEDs that's objectively superior to blackbody light... it's just that by the time you GET it, it requires almost as much energy, and throws off nearly the same amount of total heat as halogen (except 100% of that heat is concentrated in tiny areas that need heatsinks to avoid melting themselves). You basically combine near-UV (purple, not blue) light with the right phosphors, then fortify it with superbright near-infrared LEDs to support

        • Interesting. I was wondering why they should bother with nice whites in the backlight, when it gets filtered by the RGB anyway -- why not use specific R/G/B LEDs instead? But it seems we can't make them specific enough.
          • I believe one future idea being explored is kind of a combination of OLED & LCD -- using broader-spectrum OLED that's less susceptible to burn-in, but more optimized towards proper red, green, and blue, then refined by passing it through a mask of red/green/blue filters.

            The big question is whether mass-market buyers can be convinced to spend more for increased long-term color fidelity. If it becomes seen as a "must-have" feature, it might add something like $100 to a mid-priced TV. If it were a niche feature

  • by WaffleMonster ( 969671 ) on Wednesday December 12, 2018 @01:02AM (#57790726)

    That's the first place I'd look for an explanation. When you actually run a display in HDR mode it drives backlighting to the max and sucks power like crazy. They have to trade flux for colors at least partially to work around the atrocious starting spectrum of backlighting. The only way to do that without eroding contrast is cranking up the volume.

    Personally I would much prefer color space not become a selling point unless the metric used explicitly considers power consumption. HDR isn't worth it.

  • by morethanapapercert ( 749527 ) on Wednesday December 12, 2018 @03:00AM (#57790936) Homepage
    1) Manufacturers don't want the consumer looking too closely at tech specs. Technically, most of the HDR offerings are pretty much the same, and what can be seen with an optical device might not make a noticeable difference to an average consumer. But having a tech review site say that Brand X came in below average for AdobeRGB would be a mortal wound to sales.

    2) I don't know what it's like with HDR sets, but for other panels I have the impression that there is only a very small handful of actual fabs making the raw panels. That would mean that the panels themselves are largely identical, so trying to compete on specs is a mug's game. Finished panels, whether TVs, monitors or digital signage, get sold on brand recognition and marketing schmooze.

    3) Consumers, on average, are far more sensitive to price per diagonal inch of screen than they are tech specs, warranty, privacy concerns etc. So that's where vendors focus their efforts.

    4) I'm willing to bet that in the professional space you *can* get proper tech specs for TVs and not just monitors. I don't remember the vendor in question, but I recall looking at published specs per panel from a video wall company that was touting HDR upgrades for the TV studio market, and they definitely quoted sRGB specs.

    5) When all is said and done, unless you have an audiophile-level addiction to video equipment, it doesn't really matter, now does it? The pros want and need calibrated color gamuts because they need to match printer colors, logos need to use the correct corporate branded color, they work on hi-def movies where any shortfall in the color gamut of the workflow WILL show up in the final analogue film, and so on. For your average user, what are they going to compare the image their living room set displays to? About the only place I can think of it mattering to a residential consumer is either multi-monitor displays (where as long as all panels match each other, you're usually good to go) or video walls for the wealthy, and that brings us back to the professional panel vendors. A videophile might want to bask in the knowledge that they spent an extra X grand to make sure they get the full sRGB gamut (I don't know of any TV that comes close to 100% of AdobeRGB), but that's a real niche market, and one that doesn't seem to attract the same level of "price no object as long as there is pseudo-science invoked" silliness as the audiophile segment. Sitting in a living room watching movies, can an average viewer discern even a 5% difference in color gamut if they don't have both panels playing the same thing at the same time?

  • When people watch stupid crap like "Celebrity Apprentice" and other "reality" shows, "dancing with the stars", and "america's got talent" (no, it doesn't), "fox and friends", "the bachelor", etc., why would you need color range and accuracy?

    I think orange is dead nuts in the middle of the range of any crappy TV...

  • Sigh. (Score:5, Insightful)

    by ledow ( 319597 ) on Wednesday December 12, 2018 @08:35AM (#57791600) Homepage

    "Do they think consumers are too unsophisticated to understand a simple number like "this OLED TV achieves a fairly average 66 percent AdobeRGB coverage?""

    Er.... yes?

    I had someone ask me what HD was. I've had people watching an SD channel on an HDTV, not realising that it's not "automatically" HD (the HD version of the same channel was higher up in the list). I've had people not understand that the remote control has to point at the TV (increasingly common again, as youth accustomed to Bluetooth are thrown back into an IR world).

    Hell, I had one 18-year-old at work bring me his aerial and say "I found this in my room, I dunno what it is, do you want it back?" He only connected this with his lack of TV reception when it was pointed out that an aerial is in fact still necessary to receive digital TV over the air (but he didn't care, because he streamed everything).

    I couldn't give a shit about sRGB etc. and I'd have to read up on what they even meant without this summary. You can be damn sure that most consumers don't even know what HDR is (they'll think it's something to do with HD!), let alone care about an arbitrary standard to do with it.

    "Can I see it from an angle?" will be a question asked a million times more than anything to do with colour calibrations, nits, or anything else.

    • I remember when 16:9 format TVs first came out. People watched 4:3 video stretched to the full screen width for years and never thought twice about it, even though they could just push a button on the remote control to display it as 4:3. They probably thought it was HD, too.

      I used to see ads for TV antennas that were specifically for color TV (usually on the same page as antennas that were cheaper and not indicated for color TV), as if the antenna had anything to do with the quality of the color your TV displayed.

  • Do they think consumers are too unsophisticated to understand a simple number like "this OLED TV achieves a fairly average 66 percent AdobeRGB coverage"

    They think that enough that they're unwilling to cut into their profit margin to perform the testing and spend the ink to put that on the boxes, yes. The vast majority of people walk into $store and buy the largest TV that looks better than the rest on the wall of TVs. Manufacturers spend their money wisely, which means they put all of their effort into the

  • You don't really know if it matches manufacturer specs unless you test it anyway. Some reviewers do test; look at https://www.soundandvision.com/content/vizio-pq65-f1-lcd-ultra-hdtv-review-test-bench as an example. The print magazine Home Theater, which merged with that site, used to post color graphs with measured vs. expected values too.
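    The number such test benches boil those graphs down to is usually a delta-E: the distance between what the probe measured and what the pattern should have been, in CIE Lab space. A minimal sketch with made-up readings; the commonly cited rule of thumb is that errors under about 3 are hard to spot in moving video:

      def delta_e76(lab1, lab2):
          # CIE76 delta-E: Euclidean distance between two Lab colors.
          return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

      target   = (50.0, 60.0, 30.0)   # Lab values the test pattern should hit
      measured = (51.2, 57.5, 31.0)   # hypothetical probe reading
      print(f"dE76 = {delta_e76(target, measured):.2f}")   # ~2.95: just passable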
