Movies Television

'Men In Black' Director Barry Sonnenfeld Calls 8K, Netflix HDR 'Stupid' (cepro.com) 279

CIStud writes: Barry Sonnenfeld, director of the "Men in Black" series, "Get Shorty" and most recently Netflix's "A Series of Unfortunate Events," says 8K is "only good for sports" and High Dynamic Range (HDR) is "stupid" and "a waste."

Sonnenfeld, speaking with actor Patrick Warburton at the CEDIA Expo last week in Denver, called for a "filmmaker mode" on all TVs that can turn off unwanted HDR. He says Netflix's insistence that everything be shot in HDR altered the cinematography of "A Series of Unfortunate Events" against his wishes.

Sonnenfeld said Netflix and other streaming services feel HDR makes them appear "next level" from a technology perspective, according to the article, then conceded that "HDR is the future... but it shouldn't be. It's great for watching sports, like hockey, but nothing else... "

He also said today's cinematographers are actually using older lenses and filters on digital cameras to make the footage look like it wasn't shot with a 4K or 8K camera. "The problem with 8K and even 4K is that all it is doing is bringing us closer to a video game aesthetic. It just looks more and more 'not real.' I can't watch any Marvel movies because none of the visual effects look real."

And both Sonnenfeld and Patrick Warburton believe that subscribers to streaming services should be able to watch first-run movies at home the same day the films are released in theaters.
This discussion has been archived. No new comments can be posted.

'Men In Black' Director Barry Sonnenfeld Calls 8K, Netflix HDR 'Stupid'

Comments Filter:
  • by saloomy ( 2817221 ) on Monday September 16, 2019 @02:39AM (#59198200)

    On an Apple TV it's in the Audio and Video settings. Is he talking about something else? Or is turning HDR off an Apple TV-only thing?

    • by WaffleMonster ( 969671 ) on Monday September 16, 2019 @05:28AM (#59198420)

      On an Apple TV it's in the Audio and Video settings. Is he talking about something else? Or is turning HDR off an Apple TV-only thing?

      You can't turn off HDR. If content is in HDR and you want to watch it, you have no choice but to watch it with HDR enabled.

      There are alternatives such as watching content in the wrong color space or tone mapping but none of those options produce good results.

      • As a creator he can make the HDR 8K look like anything lower though.

      • What about using an HDMI 1.4 cable?

      • Tone mapping can actually produce excellent results.

        Check out madVR, a Windows DirectShow renderer, for what is possible with dynamic HDR-to-SDR tone-mapping.

        madVR can dynamically tone-map HDR to SDR on a scene-by-scene or frame-by-frame basis (based on the actual nits measurements of those frames or scenes, so it's not even reliant on HDR metadata) to fit within the dynamic range of a given SDR display.

        https://landmatlutur.tk/photo/... [landmatlutur.tk]
        https://4.bp.blogspot.com/-UbA... [blogspot.com]
        https://3.bp.blogspot.com/-MFe... [blogspot.com]
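
        A minimal Python sketch of the kind of per-frame tone mapping described above. This is not madVR's actual algorithm -- just the general idea of measuring each frame's real peak brightness and compressing it into the SDR range; the Reinhard-style curve and the gamma-2.2 encode are illustrative choices, not anything madVR documents:

        import numpy as np

        # SMPTE ST 2084 (PQ) constants
        M1, M2 = 2610 / 16384, 2523 / 4096 * 128
        C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

        def pq_to_nits(code):
            """Decode normalized PQ code values (0..1) to absolute luminance in nits."""
            p = np.power(np.clip(code, 0.0, 1.0), 1.0 / M2)
            return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

        def tonemap_frame(pq_frame, sdr_white=100.0):
            """Map one HDR frame to 8-bit SDR using its own measured peak, no metadata needed."""
            nits = pq_to_nits(pq_frame)
            peak = max(float(nits.max()), sdr_white)        # measured from the frame itself
            l = nits / sdr_white                            # luminance relative to SDR white
            l_white = peak / sdr_white
            mapped = l * (1.0 + l / l_white ** 2) / (1.0 + l)     # extended Reinhard: peak -> 1.0
            sdr = np.power(np.clip(mapped, 0.0, 1.0), 1.0 / 2.2)  # simple gamma-2.2 encode
            return np.round(sdr * 255).astype(np.uint8)

        # A synthetic 2x2 "frame" of PQ code values, including a ~1000-nit highlight
        frame = np.array([[0.0, 0.25], [0.50, 0.75]])
        print(tonemap_frame(frame))

        Because the curve is driven by each frame's measured peak rather than static metadata, a dark scene isn't crushed just because the film contains one very bright highlight somewhere else -- which is the advantage the parent comment is pointing at.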

  • by lister king of smeg ( 2481612 ) on Monday September 16, 2019 @02:56AM (#59198214)

    In 1928 Joseph Schenck, president of United Artists and later chairman of 20th Century Fox, said that "talking doesn't belong in pictures." He also said that sound effects might be useful in some situations, but no one wanted to listen to large amounts of talking in films.

    • by DNS-and-BIND ( 461968 ) on Monday September 16, 2019 @03:19AM (#59198256) Homepage

      "We still feel that color is hard on the eyes for so long a picture."

      -- New York Times review of "Gone With the Wind", 1939

    • by BBF_BBF ( 812493 ) on Monday September 16, 2019 @04:51AM (#59198368)
      Yeah, just like how 3D is mainstream now. It *definitely* was not a fad. ;-)
      • I feel 3D failed not because of technology, but rather timing, or more to the point, saturation. 3D sets didn't come out much later than 1080p sets, and didn't offer enough to make it compelling to upgrade if you already had a 1080p set.

        I upgraded from a 720p Plasma set to a 1080p 3D passive one, and enjoyed what content was available. Had I already had a 1080p set, I never would have got one. The TV industry treats their product as being disposable like smartphones, and that simply isn't the case. Most hol

        • You're probably right about 3D not incentivising people to buy a new TV. My opinion is that 3D was not really given a chance to succeed. So much was written and spoken about all the "problems" with 3D from both directors and the audience that people went in expecting headaches and motion sickness. And directors refused to shoot movies in 3D properly, relying on crappy post-production extrusion to generate a weak 3D separation. Forget about 3D actually improving storytelling, very, very few 3D movies even us

        • Re: (Score:3, Interesting)

          by BorgDrone ( 64343 )

          I feel 3D failed not because of technology, but rather timing, or more to the point, saturation.

          3D failed because it was a flawed technology. The concept of 3D TV is great, but the implementation sucks. The glasses are an issue; they make it annoying to casually watch TV. The main problem, however, is that 3D images are pre-focussed. You're not actually looking at a 3D image but at two 2D images, and the true focus distance never changes. Your eyes can't focus on the parts of the image you want to look at; the only part that is in focus is the part that the director wanted you to focus on. This makes it rea

    • I'm pretty sure the functional difference between talking and not talking is way bigger than the difference between 4K and 8K.

      Hell, most people told Blu-ray to go fuck itself when DVDs were just fine (at half the price).

  • HDR is necessary (Score:2, Informative)

    by stikves ( 127823 )

    HDR is necessary, even if not essential, to modern displays.

    We used to get by with 256 levels of intensity. However, as soon as you go to an HD display (1280x720), that is no longer sufficient to fill the screen with a simple gradient. That is why we went back to dithering (or the natural "film grain effect" which does a similar thing).

    When we switched to even higher resolutions, we needed more and more bits to properly distinguish between pixels. So came 10-bit HDR (1024 levels of intensity). And then came HDR+ and
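
    A quick numerical illustration of the banding argument above, using a hypothetical 1280-pixel-wide ramp (the numbers are only meant to show the scale of the effect, not any particular display):

    import numpy as np

    width = 1280                                   # one row of an HD-wide frame
    ramp = np.linspace(0.0, 1.0, width)            # the ideal smooth gradient

    banded_8 = np.round(ramp * 255).astype(int)    # plain 8-bit quantization
    banded_10 = np.round(ramp * 1023).astype(int)  # the same ramp with 10 bits

    print("8-bit : %d distinct levels, widest flat band %d px"
          % (len(np.unique(banded_8)), np.bincount(banded_8).max()))
    print("10-bit: %d distinct levels, widest flat band %d px"
          % (len(np.unique(banded_10)), np.bincount(banded_10).max()))

    # Dithering keeps the 8-bit depth but adds ~1 LSB of noise before rounding,
    # trading the flat, eye-catching steps for fine grain (the "film grain effect"
    # mentioned above) at the cost of a slightly noisier image.
    dithered_8 = np.round(np.clip(ramp * 255 + np.random.uniform(-1, 1, width), 0, 255)).astype(int)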

    • by bickerdyke ( 670000 ) on Monday September 16, 2019 @03:06AM (#59198222)

      My current display is for coding. It does not have HDR or high refresh rates. And I can see the dithering on the desktop wallpaper. It looks smooth from a distance, but when observed close by it is easy to see per-pixel effects.

      Most likely that image came as a JPEG?

    • Eight bits was fine as long as you showed a limited dynamic range, but to encode really deep blacks and bright whites we needed 10 bits of luma, or else there'd be even fewer values in the middle. The move to 10-bit color was more about the move to Rec. 2020; that way you got more colors and better color fidelity at the same time.

    • Re: (Score:2, Informative)

      by Namarrgon ( 105036 )

      10-bit HDR (1024 levels of intensity)

      "10-bit" and "HDR" are two different things, though related.

      10 bits gives you 1024 levels between the black and white points, reducing banding as you say - but HDR actually moves the black and white points themselves. Whites in particular can be much brighter, from a (typical) 100-200 nits right up to 1000 nits and beyond. Of course, with a much bigger range of brightness, you'll need even more levels in between or you'll get banding again, so 12 bits or more may be needed, but this is separate to HDR and t

      • Actually, that's not even close to true either.

        The black level and brightness of your display are the only things that determine how dark black is and how bright white is. HDR has NOTHING to do with that, although HDR sets tend to crank the brightness a lot.

        What HDR actually does is use a different (and less linear) gamma curve, allowing enough brightness resolution in the midranges, while stretching the top and bottom out to cover a wider range.

        This gives you some idea, however is still wrong because it drea
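
        For what it's worth, the "different gamma curve" being described is the PQ transfer function (SMPTE ST 2084) that HDR10 and Dolby Vision are built on. A small sketch of how it allocates code values (the sample nit levels below are arbitrary):

        import numpy as np

        # SMPTE ST 2084 (PQ) constants
        M1, M2 = 2610 / 16384, 2523 / 4096 * 128
        C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

        def nits_to_pq(nits):
            """Inverse EOTF: absolute luminance (0..10000 nits) to a normalized PQ code value."""
            y = np.power(np.clip(nits / 10000.0, 0.0, 1.0), M1)
            return np.power((C1 + C2 * y) / (1.0 + C3 * y), M2)

        for nits in (1, 100, 1000, 10000):
            code = round(float(nits_to_pq(nits)) * 1023)
            print("%5d nits -> 10-bit code %4d" % (nits, code))

        # Roughly half of all PQ code values sit below ~100 nits (typical SDR white),
        # which is how the curve keeps shadow and midrange resolution while still
        # stretching all the way up to 10,000 nits.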

      • Whites in particular can be much brighter, from a (typical) 100-200 nits right up to 1000 nits and beyond.

        What difference does it make? So people now get to be blinded IRL by imaginary suns and nuclear explosions on TV. For a few seconds out of hours of film where there even exists such range? Is it worth insane power consumption? Can anyone even tell the difference if they are paying attention?

        Cue a mob of people who have never run a single properly calibrated A/B test telling everyone how great HDR is.

        I tend to agree with TFA that 8K resolutions are overkill for all but the hugest screens, but HDR offers clear picture improvements regardless of resolution or screen size.

        I find this whole thing as funny as shit.

        The people selling hardware are pushing resolution and

        • by AmiMoJo ( 196126 )

          This always happens with new tech.

          7.1 sound often needs to be re-balanced and compressed by home audio equipment so it doesn't go from inaudibly quiet to blowing your ear drums out all the time.

          3D was the same, stuff flying at you all the time and headache-inducing focus problems. It didn't have to be that way; a few films like Avatar used it sensibly and were just about watchable.

          HDR can be good when not abused. Unfortunately, it's often abused and needs an off switch or limiter.

    • by AmiMoJo ( 196126 )

      You might have a 6-bit panel. On a decent 8-bit panel the banding should be invisible.

      HDR and 10-bit are only needed on extremely high-contrast displays. Currently the only displays capable of that much contrast are either high-end OLED or LED with multiple backlight zones. Neither is very suitable for anything other than media consumption.

    • Banding effects are barely visible on a good quality monitor, but if you can see dithering, that means your display is a piece of crap using 6-bit color conversion.

      • by tepples ( 727027 )

        Then one purpose of HDR is to break TN's stranglehold on the low end and encourage manufacturers to produce affordable displays that are not "a piece of crap using 6-bit color conversion."

    • HDR is necessary, even if not essential, to modern displays.

      Nope, it's worthless. Very few TVs, even so-called "HDR" ones, are able to provide full coverage of BT.709, let alone BT.2020.

      We used to get by with 256 levels of intensity. However, as soon as you go to an HD display (1280x720), that is no longer sufficient to fill the screen with a simple gradient. That is why we went back to dithering (or the natural "film grain effect" which does a similar thing).

      When we switched to even higher resolutions, we needed more and more bits to properly distinguish between pixels. So came 10-bit HDR (1024 levels of intensity). And then came HDR+ and others with "dynamic" levels. (Still 10 bits, but the scales themselves can move).

      Nope, bit depth is independent of HDR. You can have 10-bit SDR content or 12-bit SDR or 14-bit... or whatever. Anything more than 10-bit is worthless.

      My current display is for coding. It does not have HDR or high refresh rates. And I can see the dithering on the desktop wallpaper. It looks smooth from a distance, but when observed close by it is easy to see per-pixel effects.

      Nope, banding is separate from HDR and can be addressed on SDR displays simply by adding more bits.

    • My current display is for coding. It does not have HDR or high refresh rates. And I can see the dithering on the desktop wallpaper.

      If you're seeing dithering on your wallpaper then it has nothing to do with HDR. Chances are either you have a poor wallpaper, your display drivers are acting up, or you have a cheap TN film panel which can only display 6 bits in the first place and thus requires dithering.

      JPEGs are 8-bit. Display inputs for non-HDR panels are 8-bit. You should see no dithering. If you have an HDR display or a wide-gamut display then you *may* see banding in smooth gradients if you look closely enough, but in general dithering h

  • 4K is kinda stupid too except for two things that come with it:
    1. HDR
    2. Extra bandwidth when streaming

    The resolution could be nice if movies were actually made in 4K, but they often are not, and it is a much smaller effect than both HDR and the extra bandwidth.

    • Comment removed based on user account deletion
      • by Pyramid ( 57001 )

        You should be aware that they play high-bandwidth content specifically designed to provide the "wow factor" and the sets are in color modes specifically designed to look good in the showroom. Most day-to-day content will never look that good at home.

  • He is not wrong (Score:5, Insightful)

    by gweihir ( 88907 ) on Monday September 16, 2019 @03:26AM (#59198260)

    The only reason these technologies exist is that the display industry is trying to resist becoming a commodity business with far lower profits, where people will not buy the "next great thing" but only replace broken equipment. Personally, I am on full-HD and I see absolutely no reason to buy anything more "advanced".

    • by Strider- ( 39683 )

      Bingo. I'm still using the 42" LCD TV I bought back in 2008. It still works fine.

      • by Kokuyo ( 549451 )

        Full HD plasma user here. Perhaps it's the degradation of the plasma screen, perhaps it was always like that and I never noticed, but the picture looks very grainy to me lately.

        I really want a beautiful picture, so I'm looking at OLED and HDR (which means as high a nit rating as possible).

        I could live with the Full HD resolution just fine; it's just that they don't build high-nit OLED displays in that resolution.

      • by thegarbz ( 1787294 ) on Monday September 16, 2019 @06:43AM (#59198530)

        I still ride a horse to work. It still works fine. I also post on slashdot by sending telegrams to one of those technological freaks who use that electricity black magic thing. As such expect it to take a while for me to reply.

        • I still ride a horse to work. It still works fine. I also post on slashdot by sending telegrams to one of those technological freaks who use that electricity black magic thing. As such expect it to take a while for me to reply.

          If you feel the director is wrong about his perspective on x, y or z then explain the benefit of the change and its importance. Avoid meaningless unfalsifiable statements.

        • by gweihir ( 88907 )

          You probably drink water? Or breathe air? You should stop these things, they have been done for millions of years!

        • by eepok ( 545733 )

          Except that it doesn't. Almost no one in America has a job that facilitates a horse-riding commute. The bicycle and then automobile were easier to maintain than the horse. And telegrams are much more onerous to send today. The telephone and then email covered that need and pretty much everyone benefited from the change. It was worth the expense.

          The 42" LCD TV (likely with full HD resolution) is probably still more than sufficient for most people. If it weren't for all the marketing behind 4k+, there wouldn'

    • The only reason these technologies exist is because displays don't look like real life yet

      There, FTFY. If someone wants to make a profit off technological improvement then more power to them. This case is nothing more than film makers rejecting colour as some novelty simply because they don't understand how to adapt to an ever-improving medium.

      What is being said now are the same angry rants that were heard when traditional film makers couldn't figure out how to set up their lighting for colour films now that they no longer had to separate background and foreground through the use of rim lighting

      • by Pyramid ( 57001 )

        So what? The push to make displays and content look more like real life is stupid. Why? "Real life" is what we deal with 95% of the time; very often, it sucks.

        The best directors understand this; they create movies that make no attempt to accurately mimic reality. They know hyper-realistic media sucks you right out of the moment, destroys suspension of disbelief.

        Yes, technology should improve to give us *something better*, but "more realistic" should not be the primary (or only) goal.

    • I think people are already there. Have you seen how cheap TVs are compared to what they used to be? I have two TVs for a household of four and it is plenty. One is a 1080p Sony from... 2008? The other is a more recent Samsung 4K TV. I can tell you that the latter is sharper with antenna-broadcast HD sports games. Other than that, yeah, you're source-limited, so DVDs look the same (although due to some upscaling or similar, DVD menus look terrible on the 4K if they tried to make lots of animations), Blu ra
    • In your mind maybe... But in reality screen technology is like most technology, there's always room for improvement and companies compete by trying to provide a better product at a better price.

      As for becoming commodity, anything outside of 8k is already commodity in display technology. Sure, 1080p displays are pretty damn cheap, but 4k ones are well within what can be considered affordable and so are the low-end HDR displays. Hell, most of what's sold is in the "commodity" section anyway so for most con
    • by AmiMoJo ( 196126 )

      One nice thing about 4k is that it makes YouTube look a lot better. Even if the video was only recorded in 1080p it can be scaled up to 1440p or 4k before upload, which seems to activate YouTube's high bit rate mode.

      Even if your display is only 1080p you can make YouTube play the 4k stream (assuming your system is powerful enough) and enjoy the better quality.
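
      For anyone who wants to try the upscale-before-upload trick described above, a rough sketch driving ffmpeg from Python (this assumes ffmpeg is installed; the file names and encoder settings are only examples, not anything the parent poster specified):

      import subprocess

      def upscale_for_upload(src="input_1080p.mp4", dst="output_2160p.mp4"):
          """Lanczos-upscale a 1080p file to UHD before uploading it."""
          subprocess.run([
              "ffmpeg", "-i", src,
              "-vf", "scale=3840:2160:flags=lanczos",   # upscale to 3840x2160
              "-c:v", "libx264", "-crf", "18", "-preset", "slow",
              "-c:a", "copy",                           # keep the original audio untouched
              dst,
          ], check=True)

      upscale_for_upload()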

  • by Misagon ( 1135 ) on Monday September 16, 2019 @03:29AM (#59198264)

    "High Dynamic Range" (HDR) is today used mostly as a buzzword, that implies one or more of:
    * A wider colour-space
    * A high dynamic range between low and high brightness. (contrast)
    * Larger number of bits per pixel
    ... than some norm to compare to.

    Both chemical and digital cinema already have a wider colour space, higher contrast and higher quality (e.g. bitrate) overall than what has typically been used for digital broadcasts, Blu-ray and streaming.
    Digital HDR in the home is first about catching up to cinema quality, and then some over-provisioning in standards for possible future hardware with even wider colour spaces.

    But that doesn't mean that the image has to look different when shot and encoded in an HDR format: that is entirely an artistic decision!
    All big movie productions employ "colour timing" anyway to get a consistent light, contrast and colour tone between individual shots.

    What I would guess Sonnenfeld is really complaining about is that Netflix would have forced him to over-saturate the colour in his movie as a way to market the "HDR" feature.

    • by dwywit ( 1109409 )

      Yup - take that Blu-ray of {insert film name here}: a Blu-ray disc has a maximum capacity of 25GB, or 50GB for dual-layer.

      A Digital Cinema Package hard drive for the same film could be double that, or more. I've seen DCPs over 180GB.

      They throw a *lot* of bits away when compressing to fit on to a Blu-Ray.
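
      The rough arithmetic behind that comparison, assuming a 2-hour feature and treating 1 GB loosely as 8,000 megabits (both assumptions are mine, not the parent's):

      def avg_mbps(size_gb, runtime_min):
          """Average video bitrate if the whole capacity is spread over the runtime."""
          return size_gb * 8000 / (runtime_min * 60)

      for name, size_gb in (("single-layer Blu-ray", 25),
                            ("dual-layer Blu-ray", 50),
                            ("large DCP", 180)):
          print("%20s: ~%3.0f Mbit/s average over a 2-hour film" % (name, avg_mbps(size_gb, 120)))

      That works out to very roughly 28, 56 and 200 Mbit/s respectively, which is why so many bits have to be thrown away to fit the disc.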

  • by Pinky's Brain ( 1158667 ) on Monday September 16, 2019 @03:32AM (#59198270)

    Most of us simply don't have enough experience with gunshot wounds, explosions, etc. in reality to have a frame of reference. It just has to look plausible and correspond to the unrealistic visual vocabulary we have built up watching previous movies.

    But those vocabularies can change over time... young people won't care and neither will more flexible old people, but it's strange and frightening to people like Mr. Warburton. It has nothing to do with what's real, though.

  • So I know my TV is not the most current or most expensive, but here are my opinions anyway

    As far as HDR goes, what a pain.
    Some 4K Blu-rays look okay. Some look oversaturated. Some look dull. I don't know whether my TV cannot handle some uses of the new format, or it is just hard to make an encoding that looks good across lots of devices. The standard should really have been backwards compatible, and devices should have the ability to turn it off (my TV doesn't seem to).
    The Netflix app on my TV has the same problem

  • Sigh. (Score:5, Insightful)

    by ledow ( 319597 ) on Monday September 16, 2019 @04:13AM (#59198322) Homepage

    If I can't see/hear it, I won't pay extra for it.

    And if you keep changing movies with the expectation that I can see/hear it, to the detriment of what happens when I can't (for example, pissing about with audio so that it sounds okay on a 14.2 Dolby-whatever setup, meaning I can't hear the damn actors on a standard stereo setup, adding in HDR stuff that makes it look poorer on a standard screen, etc.), then I won't even pay for it.

    How's this for an idea:

    - Shoot movie
    - Put it online
    - Provide a drop-down box of audio and video formats with various prices (Obviously, just provide a preset box that says "DVD quality", "Ultra quality" or whatever for the people who don't understand it)

    Then you'll see how much the audience actually values those things, how much they just want to watch the movie, what they want to actually watch the movie on, whether it's even worth the effort to HDR/Dolby the hell out of the movie at all for the people willing to pay.

    If I strain to hear or see a movie, I'm going to start buying fewer of them. I have good vision and hearing. If I can't see/hear it on a bog-standard TV, then I'm really not interested in the techy reason for that... you failed at cinematography.

    To be honest - this started being the case about 5-10 years ago. There are a number of movies I won't watch because the action / music is louder than the speech, which is actually critical to the plot. There are also a number of movies so dark that you can't see what's going on unless your TV is really good at blacks and you're in an entirely dark room.

    If I have to squint or strain to see what the hell's going on, when that's the primary purpose of making your movie to convey those images/sound, what makes you think you did a good job?

    • If I can't see/hear it, I won't pay extra for it.

      Cool, don't. I can and I do.

      But then why can't you? Is it because film producers don't understand how to actually make a good looking movie?

      Then you'll see how much the audience actually values those things

      The value equation changes with equipment. What I "value" now is entirely different from what I "value" when I buy a new TV.

      There are a number of movies I won't watch because the action / music is louder than the speech, which is actually critical to the plot.

      This one is actually your own fault. The standards for audio from both Dolby and THX required the end user hardware to apply dynamic range compression when wanted, not for the studios to ship a sub-standard product. To get Dolby or THX certification

    • by Algan ( 20532 )

      - Shoot movie
      - Put it online
      - Provide a drop-down box of audio and video formats with various prices

      That's what Vudu does. Options for rental/purchase in SD/HD/UHD at various price points.

  • Read the transcript (Score:5, Informative)

    by Jeremy Erwin ( 2054 ) on Monday September 16, 2019 @04:29AM (#59198342) Journal

    There's a transcript here [hdguru.com]

    Many of his objections to 8K and HDR are technical -- they aren't rooted in some nebulous vision of the past.

    for instance:

    What 4K is doing, and 8K will be even worse, is totally preventing costume designers from using certain clothing. Because of moire, you can no longer have certain stripes. You can’t have checks. You can’t have hounds tooth. Before we put any costume on any actor we test fabric now, which we never had to do. But so much of it is moireing now because of 4K, and 8K is going to make those stripes even wider before you can use them.

    or

    The first season looked extraordinary because we finished it on [Standard Dynamic Range]. Netflix, of course, for marketing reasons, wants to say everything is HDR, and we started to finish the second and third seasons in HDR and I said to Netflix, `This is horrible. You’ve got to come and see this.’

    And then the night before they were going to come, I realized that I was going to lose the battle and the difference wasn’t as much as I thought. Truthfully, if I didn’t control it, all of the new televisions were going to have HDR anyway and they were going to expand that image to HDR without my control.

    But here’s the thing. What HDR tried to do was to say, `I see Barry’s images and poor Barry, there’s no contrast. I am going to help him. I am going to expand the range.’

    I didn’t want high dynamic range. I wanted it flat and moody. But it does it anyway. It took these light bulbs that had exposure to them and brightened them. If you tried to darken those light bulbs they would just solarize. Lemony Snicket’s white collars would just glow like it was under ultraviolet light, and so we would add mattes just for his white collars.

    So -- if you want to argue with Sonnenfeld, argue with Sonnenfeld's words, not with what you believe a caricature of Sonnenfeld might have said.

    • None of that is inevitable; it's just an encoding. His problem is failing to control the encoding; that's either bad software, bad application or, much more likely, management interference. Taking away capability is not a good solution to any of them.

    • by gotan ( 60103 ) on Monday September 16, 2019 @06:20AM (#59198496) Homepage

      His main gripe seems to be that he can't control how a movie he makes will look when and where it is shown. E.g. the TV set will process the picture to alter the dynamic range, and the audience isn't even aware of that, or of what it should look like.

      It's like a photographer making an image with a nice blurry background to highlight the object in focus seeing his images displayed by some software that removes all that blurriness and makes it all crisply sharp.

      • It is exactly like that, which is the problem.

      • by grimr ( 88927 )

        TVs do this because of limitations in the HDR10 format. With HDR10, the min and max brightness and gamut parameters can only be set once, at the start, for the whole movie. The dynamic contrast and tone mapping on the TV try to compensate for this. On my TV I have turned the dynamic contrast and tone mapping options off. It makes some scenes a little dark but it's more accurate to what the director intended.

        Now Dolby Vision can specify the same metadata at any time, even on a frame by frame b
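
        A toy sketch of the difference being described. The field names below are illustrative stand-ins (loosely modelled on the real MaxCLL/MaxFALL and mastering-display values), not the actual HDR10 or Dolby Vision bitstream syntax:

        # HDR10: one static block for the whole title, so the TV has to guess per scene.
        hdr10_static = {
            "mastering_display_min_nits": 0.005,
            "mastering_display_max_nits": 1000,
            "max_cll": 950,    # brightest single pixel anywhere in the movie
            "max_fall": 400,   # brightest average frame anywhere in the movie
        }

        # Dynamic formats can update the hints scene by scene (or frame by frame).
        dynamic_per_scene = [
            {"scene": 1, "avg_nits": 30,  "peak_nits": 120},   # dim interior
            {"scene": 2, "avg_nits": 180, "peak_nits": 900},   # sunlit exterior
        ]

        def pick_tonemap_target(scene_meta, display_peak_nits=500):
            """A TV with per-scene hints can tone-map each scene against its own peak."""
            return min(scene_meta["peak_nits"], display_peak_nits)

        for s in dynamic_per_scene:
            print("scene %d: map highlights up to %d nits" % (s["scene"], pick_tonemap_target(s)))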

    • It's still a nebulous vision of the past regardless of how technical it may sound.
      The complaints about costumes really are no different from when we adopted colour for the first time.
      Moire always exists in digital recordings; the only thing 8K changes is that it allows a finer pattern before it appears ... oh, and it gives you the scope to fix moire in post without messing up the scene.

      These are very much the complaints of a director who has no idea how to work with the medium, much like directors who tried colour film for the first time

    • by AmiMoJo ( 196126 )

      This seems to be full of mistakes that make me wonder if he knows what he is talking about at all.

      Moire patterns appear when the frequency of the pattern is close to the Nyquist frequency of the sampling. So you get them on everything from standard definition to 8K, just with different patterns. Maybe there are some materials with patterns that interfere at 8K, but there are many more that interfere at 1080p and SD which can now be used.

      Also any half decent down-sampling algorithm will take that 8k video and mak
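
      A small numerical sketch of that point: a stripe pattern near the sampling limit aliases into a coarse fake pattern when naively decimated, while low-pass filtering first (what a decent scaler does) leaves almost nothing behind. The stripe frequency and the 8x factor are arbitrary choices for the demo:

      import numpy as np

      x = np.arange(8192)
      fabric = np.sin(2 * np.pi * 0.46875 * x)         # fine stripes, just under Nyquist

      naive = fabric[::8]                              # resample at 1/8 resolution, no filtering
      box = np.convolve(fabric, np.ones(8) / 8, mode="same")
      filtered = box[::8]                              # low-pass (8-px average) first, then resample

      def dominant(signal):
          """Return (cycles per output pixel, amplitude) of the strongest component."""
          spec = np.abs(np.fft.rfft(signal - signal.mean()))
          i = int(np.argmax(spec))
          return float(np.fft.rfftfreq(len(signal))[i]), float(2 * spec[i] / len(signal))

      print("naive   :", dominant(naive))     # ~0.25 cycles/px at amplitude ~1.0: bold fake stripes
      print("filtered:", dominant(filtered))  # same frequency, amplitude ~0.09: barely visible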

    • by Stele ( 9443 )

      for instance:

      What 4K is doing, and 8K will be even worse, is totally preventing costume designers from using certain clothing. Because of moire, you can no longer have certain stripes. You can’t have checks. You can’t have hounds tooth. Before we put any costume on any actor we test fabric now, which we never had to do. But so much of it is moireing now because of 4K, and 8K is going to make those stripes even wider before you can use them.

      At least we can look forward to a lot more early 1900s prison films!

  • by aepervius ( 535155 ) on Monday September 16, 2019 @04:34AM (#59198346)
    Due to the way the eye evolved, you are not going to look at 8K and really see a difference from 4K, unless you freeze the frame and come within inches of the TV/monitor. You have about 4.5 million cone cells looking at color stuff (the remaining ~90 million are rod cells, mostly responsible for colorless low-light vision and peripheral vision) versus about 33 million pixels in 8K, and you only really see precisely in the center of focus. There are compelling arguments for 4K. 8K is stupid and a marketing gimmick.

    They would be better off with a progressive resolution offering "8k" in the center of the tv and "HD" at the periphery...
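
    A back-of-the-envelope version of that argument in terms of angular resolution (the ~60 pixels-per-degree figure is the usual 1-arcminute estimate for 20/20 vision; the screen size and viewing distance are just example numbers):

    import math

    def pixels_per_degree(h_pixels, screen_width_m, distance_m):
        field_of_view = 2 * math.degrees(math.atan(screen_width_m / 2 / distance_m))
        return h_pixels / field_of_view

    screen_width = 1.43            # a 65-inch 16:9 panel is roughly 1.43 m wide
    distance = 2.5                 # a fairly typical couch distance, in metres
    for name, h in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
        print("%5s: %3.0f px/degree (20/20 vision resolves roughly 60)"
              % (name, pixels_per_degree(h, screen_width, distance)))

    At that distance 1080p already sits near the 20/20 limit and 8K is far past it; sit much closer, buy a much bigger screen, or have sharper-than-20/20 eyes and the numbers shift in 4K's favour, which is roughly the parent's point.
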
    • by religionofpeas ( 4511805 ) on Monday September 16, 2019 @05:04AM (#59198384)

      You overlook the fact that most of these cone cells are in the center of the retina, allowing very high resolution in a small area that you're looking at. Because the eye moves around and looks at different details on the screen, this means that the entire screen needs to be made in that high resolution.

      Similarly, the frame rate needs to be high enough to capture moving objects without motion blur, in case the viewer tracks that particular object with their eyes.

      • by AmiMoJo ( 196126 )

        8k also reduces aliasing due to the digital sampling of the image.

        Basically 8k is a true "retina" display, significantly better than the human eye to the point where with enough contrast, colour accuracy and high quality source material it's like looking out a window.

        • The eye does not work like a bitmap. The eye itself does a lot of processing. What you "see" is not a bitmap-like picture but a high-speed series of commands to construct the picture mentally with direction, color fill, etc. Due to pigment recovery times, the eye cannot maintain a bit-detailed image of an object moving with any real speed. Because of the "post production" enhancements the eye/brain does, and the influence of the mind (conscious and unconscious) on the "image", it's not at all necessary to
      • needs to be high enough to capture moving objects without motion blur

        Why? Your eye doesn't see a moving object in crisp detail, not even when you're tracking.

  • Resolutions like 8K and 16K are indeed mostly bullshit, I agree with him. It's the next "3D" - a way for monitor manufacturers to sell you a new TV when your existing one is just fine, thanks. They're desperate to make every TV obsolete within 5 years, and you shouldn't buy into that.
    When he's referring to HDR he's using a term meaningful to most consumer aficionados, but I prefer to think of it as the more generic higher gamut. HDR just happens to be one of the pit-stops down that road, but I disagree h

  • by Mal-2 ( 675116 ) on Monday September 16, 2019 @07:57AM (#59198662) Homepage Journal

    If you put some piece of media out there, you want to have some control over how it gets rendered on the far end. You don't want "smart" displays deciding they know better -- if you want a scene to be tinted a bit orange, you do not want the white balance being reset by the display because that orange tint means something. If you desaturate the whole image and soften it up to make it feel more dreamy, you don't want the display popping the color and sharpening the edges. If you know how to speak the language, you resent translators trying to interpose themselves, especially when you're intentionally subverting the norms they're trying to establish.

    • by AmiMoJo ( 196126 )

      He's arguing against giving the director control over the image. That's what Netflix HDR does: it allows the creator to more accurately specify how the image should look based on an agreed standard and then have the TV attempt to reproduce it as closely as it can.

      Cinema has had this for a while, stuff like THX for video. The projector is supposed to be calibrated to the THX standard, and some home TVs (notably from Panasonic) supported it.

      For some reason he seems to dislike it though. I suspect he doesn't r

  • Same could be said for making new films look old. I don't appreciate grain being added to digital movies either. It's fucking maddening.

  • To elaborate on "sports", it annoys the hell out of me when I'm watching the local news and everything behind the anchor desk is out of focus. That's not the HD I bought into. This has to be a major debate about the "art" of film-making because I'm sure my complaint is seen as dirty, unwashed ignorance by anyone with a first-year photography or film school class. But really, isn't HD sold as a "window" into the scene? What sort of "window" is it if 3/4s of the scene is out of focus? So I really wonder if k

    • No, they won't. What they'll watch is the portion of the image that is in focus, the main point of the scene. Just like their eyes work in real life. ALL film techniques, now and later, are expressionist in nature.
  • Hitting your download cap in one day is stupid, & 8K can do that.

  • I'm pretty sure that like all things "tech", we'll see things moving towards an 8K video standard, just because it can be done affordably.

    But I agree with Barry on this one. It's too soon to push this, other than services like Netflix just wanting to appear cutting edge. I know not everyone agrees with me, but I haven't even spent the money to upgrade any of our televisions at home to 4K. 1080p is simply good enough for our purposes. I mean, first of all? We're really not sports enthusiasts so the advan

  • the future of where my industry is going! Stop it! Stop it now! Before I have to improve my own techniques, please stop the thing that might make me have to do things differently!

    In fact, let's go back to puppets and minstrels... life was better when all the entertainment was live, in person, and only for a few brief moments per street corner. Everything looks so flat and like you totally aren't really there on those big flat movie screens... it's a shame we've gone so far!

  • Agree, 8K is stupid except for commercial operations, but you'd have to be blind not to be able to see why HDR is better, verging on necessary. Hint: any image of the sky, check out the banding on your low dynamic range monitor.
