'Men In Black' Director Barry Sonnenfeld Calls 8K, Netflix HDR 'Stupid' (cepro.com) 279
CIStud writes
Barry Sonnenfeld, director of the "Men in Black" series, "Get Shorty" and most recently Netflix's "Series of Unfortunate Events", says 8K is "only good for sports" and High Dynamic Range (HDR) is "stupid" and "a waste."
Sonnenfeld, speaking with actor Patrick Warburton at the CEDIA Expo last week in Denver, called for a "filmmaker mode" on all TVs that can turn off unwanted HDR. He says Netflix's insistence that everything be shot in HDR altered the cinematography on "Series of Unfortunate Events" in ways he disliked.
Sonnenfeld said Netflix and other streaming services feel HDR makes them appear "next level" from a technology perspective, according to the article, then conceded that "HDR is the future... but it shouldn't be. It's great for watching sports, like hockey, but nothing else... "
He also said today's cinematographers are actually using older lenses and filters on digital cameras to make them look like they weren't shot with a 4K or 8K camera. "The problem with 8K and even 4K is that all it is doing is bringing us closer to a video game aesthetic. It just looks more and more 'not real.' I can't watch any Marvel movies because none of the visual effects look real."
And both Sonnenfeld and Patrick Warburton believe that subscribers to streaming services should be able to watch first-run movies at home the same day the films are released in theaters.
You can turn off HDR (Score:3)
On an Apple TV it's in the Audio and Video settings. Is he talking about something else? Or is turning HDR off an Apple TV-only thing?
Re:You can turn off HDR (Score:4, Insightful)
On an Apple TV it's in the Audio and Video settings. Is he talking about something else? Or is turning HDR off an Apple TV-only thing?
You can't turn off HDR. If content is in HDR and you want to watch it you have no choice but to have HDR and have it enabled.
There are alternatives such as watching content in the wrong color space or tone mapping but none of those options produce good results.
Re: You can turn off HDR (Score:2)
As a creator he can make the HDR 8K look like anything lower though.
Re: (Score:2)
What about using an HDMI 1.4 cable?
Re: (Score:3)
Tone mapping can actually produce excellent results.
Check out madVR, a Windows DirectShow renderer, for what is possible with dynamic HDR to SDR tone-mapping.
madVR can dynamically tone-map HDR to SDR on a scene-by-scene or frame-by-frame basis (using the actual nits measurements of those frames or scenes, so it's not even reliant on HDR metadata) to fit within the dynamic range of a given SDR display.
https://landmatlutur.tk/photo/... [landmatlutur.tk]
https://4.bp.blogspot.com/-UbA... [blogspot.com]
https://3.bp.blogspot.com/-MFe... [blogspot.com]
Re: (Score:3)
Check out madVR, a Windows DirectShow renderer, for what is possible with dynamic HDR to SDR tone-mapping.
madVR can dynamically tone-map HDR to SDR on a scene-by-scene or frame-by-frame basis (using the actual nits measurements of those frames or scenes, so it's not even reliant on HDR metadata) to fit within the dynamic range of a given SDR display.
This is hilarious. I invite everyone to carefully examine each image.
A studio publishes BD SDR and 4K HDR versions of the exact same content.
Left is what the studio intended the SDR version to be.
Right was the result of converting the HDR to SDR.
Notice the two don't match. Not at all, not even close. If the studio intended for it to look like the image on the right... they could have done that, as evidenced by the fact such an image is possible to display on everyone's SDR monitor viewing it.
There are a lot of people who crave over-sharpening, over-saturation and brightness... they think it looks better... there are all kinds of dynamic saturation-maximizing features built into modern TVs designed to tweak displays so everything is always as vivid as can be gotten away with... there are even auto SDR to HDR conversion features completely unmoored from reality... OK, whatever floats your boat. I don't subscribe to that. I don't want that.
Until I see an HDR to SDR converter in a consumer product that always does a good job (and by good job I mean faithful, studio-quality conversion), my remarks stand. Simple tone mapping is insufficient, and everything else is a series of hard tradeoffs, like trying to draw a 2D map of the earth.
No, they couldn't.
You clearly don't understand how video standards and formats work, so I will enlighten you.
SDR is a standard and cannot be mastered to more than 100 nits peak.
HDR is a standard with a minimum of 1,000 nits, but there are also 4,000 and 10,000 nit standards as well.
But modern SDR displays such as TVs and computer monitors sold in the last 5 years or so can generally do around 300 nits peak.
With madVR you enter in the peak nits of your actual display and it tone-maps the 1000+ nit HDR down to app
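For what it's worth, the core idea is simple enough to sketch. The snippet below is only an illustration of per-scene tone mapping, not madVR's actual algorithm; the extended-Reinhard curve and the 300-nit display peak are assumptions made for the example.

import numpy as np

def tone_map_frame(frame_nits: np.ndarray, display_peak: float = 300.0) -> np.ndarray:
    """Map absolute scene luminance (nits) into the 0..display_peak range."""
    scene_peak = float(frame_nits.max())          # measured per frame/scene, not read from metadata
    if scene_peak <= display_peak:
        return frame_nits                         # already fits the display, pass through untouched
    L = frame_nits / display_peak                 # luminance in display-relative units
    L_white = scene_peak / display_peak           # the value that should land exactly at the display peak
    L_out = L * (1.0 + L / L_white ** 2) / (1.0 + L)   # extended Reinhard curve
    return np.clip(L_out, 0.0, 1.0) * display_peak

# Shadows are nearly untouched, mid-tones are compressed moderately, and a
# 10,000-nit highlight lands at the display's 300-nit ceiling:
print(tone_map_frame(np.array([5.0, 100.0, 1000.0, 10000.0])))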
Talking in film is fad too. (Score:5, Informative)
In 1928 Joseph Schenck, President of United Artists and later chairman of 20th Century Fox, said that "talking doesn't belong in pictures." He also said that sound effects might be useful in some situations, but no one wanted to listen to large amounts of talking in films.
Re:Talking in film is fad too. (Score:5, Interesting)
"We still feel that color is hard on the eyes for so long a picture."
-- New York Times review of "Gone With the Wind", 1939
Re:Talking in film is fad too. (Score:5, Funny)
Re: (Score:2)
I feel 3D failed not because of technology, but rather timing, or more to the point, saturation. The sets didn't come out so much later than 1080p sets, and didn't offer that much to make it compelling to upgrade if you already had a 1080p set.
I upgraded from a 720p Plasma set to a 1080p 3D passive one, and enjoyed what content was available. Had I already had a 1080p set, I never would have got one. The TV industry treats their product as being disposable like smartphones, and that simply isn't the case. Most hol
Re: (Score:2)
You're probably right about 3D not incentivising people to buy a new TV. My opinion is that 3D was not really given a chance to succeed. So much was written and spoken about all the "problems" with 3D from both directors and the audience that people went in expecting headaches and motion sickness. And directors refused to shoot movies in 3D properly, relying on crappy post-production extrusion to generate a weak 3D separation. Forget about 3D actually improving storytelling, very, very few 3D movies even us
Re: (Score:2)
That's why I went with passive; the set came with four pair, plus two Game pair (made split screen games look full screen for both players, pretty neat!) Extra pairs were dirt cheap as well. They were comfortable enough to wear over normal eyewear, too.
Re: (Score:3, Interesting)
I feel 3D failed not because of technology, but rather timing, or more to the point, saturation.
3D failed because it was a flawed technology. The concept of 3D TV is great, but the implementation sucks. The glasses are an issue; it makes it annoying to casually watch TV. The main problem, however, is that 3D images are pre-focussed. You're not actually looking at a 3D image but at two 2D images, and the true focus distance never changes. Your eyes can't focus on the parts of the image you want to look at; the only part that is in focus is the part that the director wanted you to focus on. This makes it rea
Every analogy needs hyperbole (Score:2)
I'm pretty sure the functional difference between talking and not talking is way bigger than the difference between 4K and 8K.
Hell, most people told Blu-ray to go fuck itself when DVDs were just fine (at half the price).
Re: Talking in film is fad too. (Score:2)
Explosions are mundane. CGI swarms are the new explosions...for the past 20 years. Time for a new go-to effect.
HDR is necessary (Score:2, Informative)
HDR is necessary, even if not essential to modern displays.
We used to get by with 256 levels of intensity. However, as soon as you go to an HD display (1280x720) that is no longer sufficient to fill the screen with a simple gradient. That is why we went back to dithering (or the natural "film grain" effect, which does a similar thing).
When we switched to even higher resolutions, we needed more and more bits to properly distinguish between pixels. So came 10-bit HDR (1024 levels of intensity). And then came HDR+ and
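To put a number on the gradient claim, here is a toy sketch (the 1280-pixel row and the uniform-noise dither are just illustrative assumptions) of why 256 levels band on a wide ramp and how dithering hides it.

import numpy as np

width = 1280                                    # one row of an HD display
gradient = np.linspace(0.0, 1.0, width)         # ideal smooth black-to-white ramp

# Straight 8-bit quantisation: only 256 levels, so each level repeats for about
# 5 adjacent pixels and the steps between bands become visible up close.
banded = np.round(gradient * 255) / 255
print("pixels per band:", width / 256)          # -> 5.0

# Dithering: add a little noise before quantising. Each individual pixel is
# slightly wrong, but the bands break up and the ramp looks smooth from a normal
# viewing distance (film grain does much the same thing for free).
rng = np.random.default_rng(0)
dithered = np.round((gradient + rng.uniform(-0.5, 0.5, width) / 255) * 255) / 255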
Re:HDR is necessary (Score:5, Insightful)
My current display is for coding. It does not have HDR or high refresh rates. And I can see the dithering on the desktop wallpaper. It looks smooth from a distance, but when observed close by it is easy to see per-pixel effects.
Most likely, that image came as a JPEG?
Re: HDR is necessary (Score:2)
Eight bits was fine as long as you showed a limited dynamic range, but to encode really deep blacks and bright whites we needed 10 bits of luma, or else there'd be even fewer values in the middle. That we moved to 10-bit color was more about the move to Rec. 2020; that way you got more colors and better color fidelity at the same time.
Re: HDR is necessary (Score:4, Informative)
The only reason 10-bit exists is because radiologists need it. Many panels are actually only 6-bit internally, and that is where ordinary people can see the effects you describe.
Re: (Score:3)
That is rather unlikely. You probably see some other effect.
Re: (Score:2, Informative)
10-bit HDR (1024 levels of intensity)
"10-bit" and "HDR" are two different things, though related.
10 bits gives you 1024 levels between the black and white points, reducing banding as you say - but HDR actually moves the black and white points themselves. Whites in particular can be much brighter, from a (typical) 100-200 nits right up to 1000 nits and beyond. Of course, with a much bigger range of brightness, you'll need even more levels in between or you'll get banding again, so 12 bits or more may be needed, but this is separate to HDR and t
Ummmm... no. (Score:2)
Actually, that's not even close to true either.
The black level and brightness of your display are the only things that determine how dark black is and how bright white is. HDR has NOTHING to do with that, although HDR sets tend to crank the brightness a lot.
What HDR actually does is use a different (and less linear) gamma curve, allowing enough brightness resolution in the midranges, while stretching the top and bottom out to cover a wider range.
This gives you some idea, however is still wrong because it drea
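For reference, the less linear curve being described is the SMPTE ST 2084 (PQ) transfer function. The constants below are the published ones; the loop is just a quick illustration of how the code values are spent.

def pq_eotf(code: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalised code value in 0..1 -> absolute nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = code ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y                            # PQ is defined up to 10,000 nits

# Roughly half the code values cover everything up to about 100 nits (the whole
# SDR range); the rest is reserved for highlights from ~100 up to 10,000 nits.
for code in (0.25, 0.5, 0.75, 1.0):
    print(f"{code:.2f} -> {pq_eotf(code):8.1f} nits")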
Re: (Score:3)
Whites in particular can be much brighter, from a (typical) 100-200 nits right up to 1000 nits and beyond.
What difference does it make? So people now get to be blinded IRL by imaginary suns and nuclear explosions on TV. For a few seconds out of hours of film where there even exists such range? Is it worth insane power consumption? Can anyone even tell the difference even if they are paying attention?
Cue a mob of people who have failed to run a single properly calibrated A/B test telling everyone how great HDR is.
I tend to agree with TFA that 8K resolutions are overkill for all but the hugest screens, but HDR offers clear picture improvements regardless of resolution or screen size.
I find this whole thing as funny as shit.
The people selling hardware are pushing resolution and
Re: (Score:2)
This always happens with new tech.
7.1 sound often needs to be re-balanced and compressed by home audio equipment so it doesn't go from inaudibly quiet to blowing your ear drums out all the time.
3D was the same: stuff flying at you all the time and headache-inducing focus problems. It didn't have to be that way; a few films like Avatar used it sensibly and were just about watchable.
HDR can be good when not abused. Unfortunately, it's often abused and needs an off switch or limiter.
Re: (Score:3)
You might have a 6 bit panel. On a decent 8 bit panel the gradients should be invisible.
HDR and 10 bit is only needed on extremely high contrast displays. Currently the only displays capable of that much contrast are either high end OLED or LED with multiple backlight zones. Neither is very suitable for anything other than media consumption.
Re: (Score:2)
GP was talking about his desktop computer display that they were using for development work.
Re: (Score:2)
Banding effects are barely visible on a good quality monitor, but if you can see dithering, that means your display is a piece of crap using 6-bit color conversion.
Re: (Score:2)
Then one purpose of HDR is to break TN's stranglehold on the low end and encourage manufacturers to produce affordable displays that are not "a piece of crap using 6-bit color conversion."
Re: (Score:2)
HDR is necessary, even if not essential to modern displays.
Nope, it's worthless. Very few TVs, even so-called "HDR" ones, are able to provide full coverage of BT.709, let alone BT.2020.
We used to get by with 256 levels of intensity. However, as soon as you go to an HD display (1280x720) that is no longer sufficient to fill the screen with a simple gradient. That is why we went back to dithering (or the natural "film grain" effect, which does a similar thing).
When we switched to even higher resolutions, we needed more and more bits to properly distinguish between pixels. So came 10-bit HDR (1024 levels of intensity). And then came HDR+ and others with "dynamic" levels. (Still 10 bits, but the scales themselves can move).
Nope, bit depth is independent of HDR. You can have 10-bit SDR content or 12-bit SDR or 14-bit... or whatever. Anything more than 10-bit is worthless.
My current display is for coding. It does not have HDR or high refresh rates. And I can see the dithering on the desktop wallpaper. It looks smooth from a distance, but when observed close by it is easy to see per-pixel effects.
Nope, HDR is separate from banding, which can be addressed on SDR displays simply by adding more bits.
Re: (Score:2)
My current display is for coding. It does not have HDR or high refresh rates. And I can see the dithering on the desktop wallpaper.
If you're seeing dithering on your wallpaper then it has nothing to do with HDR. Chances are either you have a poor wallpaper, your display drivers are acting up, or you have a cheap TN film panel which can only display 6 bits in the first place and thus requires dithering.
JPEGs are 8-bit. Display inputs for non-HDR panels are 8-bit. You should see no dithering. If you have an HDR display or a wide gamut display then you *may* see banding in smooth gradients if you look closely enough, but in general dithering h
8K is stupid (Score:2)
4K is kinda stupid too except for two things that come with it:
1. HDR
2. Extra bandwidth when streaming
The resolution itself could be nice if movies were actually made in 4K, but they often are not; either way it is a much smaller effect than both HDR and the extra bandwidth.
Re: (Score:2)
You should be aware that they play high-bandwidth content specifically designed to provide the "wow factor", and the sets are in color modes specifically designed to look good in the showroom. Most day-to-day content will never look that good at home.
He is not wrong (Score:5, Insightful)
The only reason these technologies exist is that the display industry is trying to resist its products becoming a commodity, and hence having far lower profits, as people stop buying the "next great thing" and only replace broken equipment. Personally, I am on full HD and I see absolutely no reason to buy anything more "advanced".
Re: (Score:3)
Bingo. I'm still using the 42" LCD TV I bought back in 2008. It still works fine.
Re: (Score:2)
Full HD plasma user here. Perhaps it's the degradation of the plasma screen, perhaps it was always like that and I never noticed, but the picture looks very grainy to me lately.
I really want a beautiful picture, so I'm looking at OLED and HDR (which means as high a nit rating as possible).
I could live with the Full HD resolution just fine; it's just that they don't build high-nit OLED displays in that resolution.
Re:He is not wrong (Score:5, Funny)
I still ride a horse to work. It still works fine. I also post on slashdot by sending telegrams to one of those technological freaks who use that electricity black magic thing. As such expect it to take a while for me to reply.
Re: (Score:2)
I still ride a horse to work. It still works fine. I also post on slashdot by sending telegrams to one of those technological freaks who use that electricity black magic thing. As such expect it to take a while for me to reply.
If you feel the director is wrong about his perspective on x, y or z then explain the benefit of the change and its importance. Avoid meaningless unfalsifiable statements.
Re: (Score:3)
You probably drink water? Or breathe air? You should stop these things, they have been done for millions of years!
Re: (Score:3)
Except that it doesn't. Almost no one in America has a job that facilitates a horse-riding commute. The bicycle and then automobile were easier to maintain than the horse. And telegrams are much more onerous to send today. The telephone and then email covered that need and pretty much everyone benefited from the change. It was worth the expense.
The 42" LCD TV (likely with full HD resolution) is probably still more than sufficient for most people. If it weren't for all the marketing behind 4k+, there wouldn'
Re: (Score:2)
The only reason these technologies exist is because displays don't look like real life yet
There, FTFY. If someone wants to make a profit from technological improvement then more power to them. This case here is nothing more than film makers rejecting colour as some novelty simply because they don't understand how to adapt to an ever-improving medium.
What is being said now is the same angry rants that were used when traditional film makers couldn't figure out how to set up their lighting for colour films now that they no longer had to separate background and foreground through the use of rim lighting
Re: (Score:2)
So what? The push to make displays and content look more like real life is stupid. Why? "Real life" is what we deal with 95% of the time; very often, it sucks.
The best directors understand this; they create movies that make no attempt to accurately mimic reality. They know hyper-realistic media sucks you right out of the moment, destroys suspension of disbelief.
Yes, technology should improve to give us *something better*, but "more realistic" should not be the primary (or only) goal.
Re: (Score:2)
As for becoming commodity, anything outside of 8k is already commodity in display technology. Sure, 1080p displays are pretty damn cheap, but 4k ones are well within what can be considered affordable and so are the low-end HDR displays. Hell, most of what's sold is in the "commodity" section anyway so for most con
Re: (Score:2)
One nice thing about 4k is that it makes YouTube look a lot better. Even if the video was only recorded in 1080p it can be scaled up to 1440p or 4k before upload, which seems to activate YouTube's high bit rate mode.
Even if your display is only 1080p you can make YouTube play the 4k stream (assuming your system is powerful enough) and enjoy the better quality.
What does he mean by "HDR"? (Score:5, Insightful)
"High Dynamic Range" (HDR) is today used mostly as a buzzword, that implies one or more of:
... than some norm to compare to.
* A wider colour-space
* A high dynamic range between low and high brightness. (contrast)
* Larger number of bits per pixel
Both chemical and digital cinema already have a wider colour space, higher contrast and higher quality (e.g. bitrate) overall than what has typically been used for digital broadcasts, Blu-ray and streaming.
Digital HDR in the home is first about catching up to cinema quality, and then some over-provisioning in standards for possible future hardware with even wider colour spaces.
But that doesn't mean that the image has to look different when shot and encoded in a HDR format: That is entirely an artistic decision!
All big movie productions employ "colour timing" anyway to get a consistent light, contrast and colour tone between individual shots.
What I would guess Sonnenfeld is really complaining about is that Netflix would have forced him to over-saturate the colour in his movie as a way to market the "HDR" feature.
Re: (Score:2)
Yup - that Blu-Ray of {insert film name here} - a Blu-Ray disc has a maximum storage of 25GB or 50GB for dual-layer.
A Digital Cinema Package hard drive for the same film could be double that, or more. I've seen DCPs over 180GB.
They throw a *lot* of bits away when compressing to fit on to a Blu-Ray.
Re: (Score:2)
You're asking the wrong questions. HDR is a response to the compromises necessary to fit high-quality sources onto lesser-quality distribution platforms. If we could all afford DLP or laser projectors, and digital cinema systems, we wouldn't need HDR at all.
An analogy might be the original Dolby audio systems - boost the high-end during production, then damp it during reproduction. It was about reducing an artifact caused by limits of the technology at the time, e.g. tape hiss.
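The boost-then-cut idea can be sketched as a fixed pre-emphasis/de-emphasis pair. Real Dolby NR is a level-dependent compander, so this is only the general principle, with made-up numbers.

import numpy as np

k = 3.0                                               # how much to boost the highs

def pre_emphasis(x):
    """Boost high frequencies before committing the signal to the noisy medium."""
    return np.append((1 + k) * x[0], (1 + k) * x[1:] - k * x[:-1])

def de_emphasis(y):
    """Exact inverse filter: cut the highs back down on playback."""
    out = np.zeros_like(y)
    for n in range(len(y)):
        out[n] = (y[n] + (k * out[n - 1] if n else 0.0)) / (1 + k)
    return out

rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 0.01 * np.arange(2000))   # low-frequency programme material
hiss = 0.05 * rng.standard_normal(signal.size)        # broadband "tape hiss" added by the medium

plain = signal + hiss                                  # recorded with no emphasis
encoded = de_emphasis(pre_emphasis(signal) + hiss)     # boosted, hissed, then cut back

print(np.std(plain - signal), np.std(encoded - signal))  # residual hiss drops to roughly 40%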
Even practical effects don't look real (Score:3)
Most of us simply don't have enough experience with gunshot wounds, explosions etc. in reality to have a frame of reference. It just has to look plausible and correspond to the unrealistic visual vocabulary we have built up watching previous movies.
But those vocabularies can change over time... young people won't care and neither will more flexible old people, but it's strange and frightening to people like Mr. Warburton. Has nothing to do with what's real though.
Re: (Score:3)
My mid range 55" LED TV from 2018 (Score:2)
So I know my TV is not the most current or most expensive, but here are my opinions anyway
As far as HDR goes, what a pain.
Some 4K Blu-rays look okay. Some look over-saturated. Some look dull. I don't know whether my TV cannot handle some uses of the new format, or it is just hard to make an encoding that looks good across lots of devices. The standard should really have been backwards compatible, and devices should have the ability to turn it off (my TV doesn't seem to).
The Netflix app on my TV has the same problem
Sigh. (Score:5, Insightful)
If I can't see/hear it, I won't pay extra for it.
And if you keep changing movies with the expectation that I can see/hear it, to the detriment of what happens when I can't (for example, pissing about with audio so that it sounds okay on a 14.2 Dolby-whatever setup, meaning I can't hear the damn actors on a standard stereo setup, adding in HDR stuff that makes it look poorer on a standard screen, etc.), then I won't even pay for it.
How's this for an idea:
- Shoot movie
- Put it online
- Provide a drop-down box of audio and video formats with various prices (Obviously, just provide a preset box that says "DVD quality", "Ultra quality" or whatever for the people who don't understand it)
Then you'll see how much the audience actually values those things, how much they just want to watch the movie, what they want to actually watch the movie on, whether it's even worth the effort to HDR/Dolby the hell out of the movie at all for the people willing to pay.
If I strain to hear or see a movie, I'm going to start buying fewer of them. I have good vision and hearing. If I can't see/hear it on a bog-standard TV, then I'm really not interested in the techy reason for that... you failed at cinematography.
To be honest - this started being the case about 5-10 years ago. There are a number of movies I won't watch because the action / music is louder than the speech, which is actually critical to the plot. There are also a number of movies so dark that you can't see what's going on unless your TV is really good at blacks and you're in an entirely dark room.
If I have to squint or strain to see what the hell's going on, when that's the primary purpose of making your movie to convey those images/sound, what makes you think you did a good job?
Re: (Score:2)
If I can't see/hear it, I won't pay extra for it.
Cool don't. I can and I do.
But then why can't you? Is it because film producers don't understand how to actually make a good looking movie?
Then you'll see how much the audience actually values those things
The value equation changes with equipment. What I "value" now is entirely different from what I "value" when I buy a new TV.
There are a number of movies I won't watch because the action / music is louder than the speech, which is actually critical to the plot.
This one is actually your own fault. The standards for audio from both Dolby and THX required the end user hardware to apply dynamic range compression when wanted, not for the studios to ship a sub-standard product. To get Dolby or THX certification
Re: (Score:2)
- Shoot movie
- Put it online
- Provide a drop-down box of audio and video formats with various prices
That's what Vudu does. Options for rental/purchase in SD/HD/UHD at various price points.
Read the transcript (Score:5, Informative)
There's a transcript here [hdguru.com]
Many of his objections to 8K and HDR are technical-- they aren't rooted in some nebulous vision of the past.
For instance:
What 4K is doing, and 8K will be even worse, is totally preventing costume designers from using certain clothing. Because of moire, you can no longer have certain stripes. You can’t have checks. You can’t have hounds tooth. Before we put any costume on any actor we test fabric now, which we never had to do. But so much of it is moireing now because of 4K, and 8K is going to make those stripes even wider before you can use them.
or
The first season looked extraordinary because we finished it on [Standard Dynamic Range]. Netflix, of course, for marketing reasons, wants to say everything is HDR, and we started to finish the second and third seasons in HDR and I said to Netflix, `This is horrible. You’ve got to come and see this.’
And then the night before they were going to come, I realized that I was going to lose the battle and the difference wasn’t as much as I thought. Truthfully, if I didn’t control it, all of the new televisions were going to have HDR anyway and they were going to expand that image to HDR without my control.
But here’s the thing. What HDR tried to do was to say, `I see Barry’s images and poor Barry, there’s no contrast. I am going to help him. I am going to expand the range.’
I didn’t want high dynamic range. I wanted it flat and moody. But it does it anyway. It took these light bulbs that had exposure to them and brightened them. If you tried to darken those light bulbs they would just solarize. Lemony Snicket’s white collars would just glow like it was under ultra violet light, and so we would add mattes just for his white collars.
so-- if you want to argue with Sonnenfeld, argue with Sonnenfeld's words, not with what you believe a caricature of Sonnenfeld might have said.
Re: Read the transcript (Score:2)
None of that is inevitable, it's just an encoding. His problem is failing to control the encoding; that's either bad software, a bad application, or much more likely management interference. Taking away capability is not a good solution to any of them.
Re:Read the transcript (Score:5, Insightful)
His main gripe seems to be that he can't control how a movie he makes will look when and where it is shown, e.g. the TV set will process the pictures to alter the dynamic range, and the audience isn't even aware of that, or of what it should look like.
It's like a photographer making an image with a nice blurry background to highlight the object in focus seeing his images displayed by some software that removes all that blurriness and makes it all crispy sharp.
Re: (Score:2)
It is exactly like that, which is the problem.
Re: (Score:3)
The reason TVs do this is because of limitations in the HDR10 format. With HDR10 it can only set the min and max brightness and gamut parameters at the beginning for the whole movie. The dynamic contrast and tone mapping on the TV try to compensate for this. On my TV I have turned that dynamic contrast and tone mapping options off. It makes some scenes a little dark but it's more accurate to what the director intended.
Now Dolby Vision can specify the same metadata at any time, even on a frame by frame b
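A toy illustration of why per-scene metadata matters. The numbers are hypothetical and the linear squeeze is deliberately cruder than anything HDR10 or Dolby Vision actually specifies; it just shows what a TV can do when it knows a scene's real peak instead of one number for the whole film.

DISPLAY_PEAK = 500.0                     # what this particular TV can show, in nits

def map_peak(pixel_nits: float, scene_peak: float) -> float:
    """Trivial linear compression of 0..scene_peak into 0..DISPLAY_PEAK."""
    if scene_peak <= DISPLAY_PEAK:
        return pixel_nits                # the scene already fits, pass it through
    return pixel_nits * DISPLAY_PEAK / scene_peak

static_maxcll = 4000.0                   # one value for the entire movie (HDR10-style)
dim_scene_peak = 200.0                   # what this particular scene actually contains

pixel = 180.0                            # a bright-ish pixel in the dim scene
print(map_peak(pixel, static_maxcll))    # 22.5  -> crushed by the global metadata
print(map_peak(pixel, dim_scene_peak))   # 180.0 -> untouched with per-scene metadata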
Re: (Score:2)
It's still a nebulous vision of the past regardless of how technical it may sound.
The complaints about costumes really are no different from when we adopted colour for the first time.
Moire always exists in digital recordings; the only thing that 8K allows is a finer pattern... oh, and giving you the scope to fix moire in post without messing up the scene.
These are very much the complaints of a director who has no idea how to work with the medium, much like directors who tried colour film for the first time
Re: (Score:2)
This seems to be full of mistakes that make me wonder if he knows what he is talking about at all.
Moire patterns appear when the frequency of the pattern is close to the Nyquist frequency of the sampling. So you get them on everything from standard definition to 8K, just with different patterns. Maybe there are some materials with patterns that interfere at 8K, but there are many more that interfere at 1080p and SD which can now be used.
Also any half decent down-sampling algorithm will take that 8k video and mak
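Both points can be seen in a toy 1D example; the stripe counts and the crude box-filter downscale are just illustrative.

import numpy as np

def stripes(x):
    # A fabric with very fine stripes: 1800 stripe pairs across the frame width,
    # far more than a 480-sample "sensor" can resolve (its Nyquist limit is 240).
    return 0.5 + 0.5 * np.sin(2 * np.pi * 1800 * x)

# Sampled directly at 480 samples, the unresolvable stripes alias into a bold
# false pattern -- that is the moire.
direct = stripes(np.arange(480) / 480)

# Sampled at 8x the density (where the stripes ARE resolved) and then averaged
# down in groups of 8, a crude low-pass filter, the detail blurs towards flat
# grey instead of aliasing into a visible pattern.
fine = stripes(np.arange(480 * 8) / (480 * 8))
downsampled = fine.reshape(480, 8).mean(axis=1)

print(round(direct.std(), 3), round(downsampled.std(), 3))   # ~0.35 vs ~0.03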
Re: (Score:2)
For instance:
What 4K is doing, and 8K will be even worse, is totally preventing costume designers from using certain clothing. Because of moire, you can no longer have certain stripes. You can’t have checks. You can’t have hounds tooth. Before we put any costume on any actor we test fabric now, which we never had to do. But so much of it is moireing now because of 4K, and 8K is going to make those stripes even wider before you can use them.
At least we can look forward to a lot more early 1900s prison films!
Your eyes are not that good (Score:3)
They would be better off with a progressive resolution offering "8K" in the center of the TV and "HD" at the periphery...
Re:Your eyes are not that good (Score:4, Insightful)
You overlook the fact that most of these cone cells are in the center of the retina, allowing very high resolution in a small area that you're looking at. Because the eye moves around and looks at different details on the screen, this means that the entire screen needs to be made in that high resolution.
Similarly, the frame rate needs to be high enough to capture moving objects without motion blur, in case the viewer tracks that particular object with their eyes.
Re: (Score:2)
8k also reduces aliasing due to the digital sampling of the image.
Basically 8k is a true "retina" display, significantly better than the human eye to the point where with enough contrast, colour accuracy and high quality source material it's like looking out a window.
Re: (Score:3)
Re: (Score:2)
Why? Your eye doesn't see a moving object in crisp detail, not even when you're tracking.
Resolution is marketing BS, but not HDR per se (Score:2)
Resolutions like 8K and 16K are indeed mostly bullshit, I agree with him. It's the next "3D" - a way for monitor manufacturers to sell you a new TV when your existing one is just fine, thanks. They're desperate to make every TV obsolete within 5 years, and you shouldn't buy into that.
When he's referring to HDR he's using a term meaningful to most consumer aficionados, but I prefer to think of it as the more generic higher gamut. HDR just happens to be one of the pit-stops down that road, but I disagree h
I kinda get what he means. (Score:3)
If you put some piece of media out there, you want to have some control over how it gets rendered on the far end. You don't want "smart" displays deciding they know better -- if you want a scene to be tinted a bit orange, you do not want the white balance being reset by the display because that orange tint means something. If you desaturate the whole image and soften it up to make it feel more dreamy, you don't want the display popping the color and sharpening the edges. If you know how to speak the language, you resent translators trying to interpose themselves, especially when you're intentionally subverting the norms they're trying to establish.
Re: (Score:2)
He's arguing against giving the director control over the image. That's what Netflix HDR does, it allows the creator to more accurately specify how the image should look based on an agreed standard and then have the TV attempt to reproduce it as closely as it can.
Cinema has had this for a while, stuff like THX for video. The projector is supposed to be calibrated to the THX standard, and some home TVs (notably from Panasonic) supported it.
For some reason he seems to dislike it though. I suspect he doesn't r
Can't turn off stupid old lenses and filt (Score:2)
Same could be said for making new films look old. I don't appreciate grain being added to digital movies either. It's fucking maddening.
Artsiness (Score:2)
To elaborate on "sports", it annoys the hell out of me when I'm watching the local news and everything behind the anchor desk is out of focus. That's not the HD I bought into. This has to be a major debate about the "art" of film-making because I'm sure my complaint is seen as dirty, unwashed ignorance by anyone with a first-year photography or film school class. But really, isn't HD sold as a "window" into the scene? What sort of "window" is it if 3/4s of the scene is out of focus? So I really wonder if k
Re: (Score:2)
hitting your download cap in 1 day is stupid & (Score:2)
hitting your download cap in 1 day is stupid & 8K can do that.
It's really just tech "because we can" .... (Score:2)
I'm pretty sure that like all things "tech", we'll see things moving towards an 8K video standard, just because it can be done affordably.
But I agree with Barry on this one. It's too soon to push this, other than services like Netflix just wanting to appear cutting edge. I know not everyone agrees with me, but I haven't even spent the money to upgrade any of our televisions at home to 4K. 1080p is simply good enough for our purposes. I mean, first of all? We're really not sports enthusiasts so the advan
I don't like... (Score:2)
the future of where my industry is going! Stop it! Stop it now! Before I have to improve my own techniques, please stop the thing that might make me have to do things differently!
In fact, let's go back to puppets and minstrels... life was better when all the entertainment was live, in person, and only for a few brief moments per street corner. Everything looks so flat and like you totally aren't really there on those big flat movie screens... it's a shame we've gone so far!
You'd have to be blind (Score:2)
Agree, 8K is stupid except for commercial operations, but you'd have to be blind not to be able to see why HDR is better, verging on necessary. Hint: any image of the sky, check out the banding on your low dynamic range monitor.
Re: (Score:2, Insightful)
This is the same tired old argument that the frame rate luddites have been using forever, and is simply nonsense.
QFT. Also, the visual effects in Marvel movies don’t look real because they aren’t. Personally I can’t watch most of the Marvel movies because of the rubbish writing...
Re:Well he's half right... (Score:5, Interesting)
You can see why the latter doesn't happen - you'd kill cinemas overnight.
Maybe that's not a bad thing, but when they are one of your biggest "customers" as a movie producer, it's a very silly thing to do.
To be honest, even in release-weeks, I've been in cinemas outside of working hours and they are pretty much dead over here. I don't get how they make money any more, even with the over-priced concessions.
Hell, my daughter and I once paid GBP5 each, got a private box, a meal brought to it, and watched a movie that was released that week. I honestly don't know how they paid the hourly wages of the staff that served us for how many people there were in the cinema at the time.
Re: (Score:3)
He's right about 8K though - nobody who wants 8K will actually be able to see it. The whole point of it is to create huge displays so you have better resolution in a park or auditorium. You're not going to ever see the benefit of 8K at home, ever. Your eyes just aren't good enough,
(and those who say they can see a difference, is that on your solid gold AV cables by any chance?)
HDR... I guess that does bring some benefits to modern displays that can show greater colour gamuts. Maybe he meant that the way directors suddenly think "
Re: (Score:2)
8k over 4k, not really. Though I suspect you might if sitting fairly close to an 80 or 100 inch TV, but it will be a long time before one of those is affordable. 8k would however go a long way towards resolving the screen door effect that exists even at 4k on VR headsets.
1080p to 4k, absolutely you can see it. Especially in video games or when watching anything that has sharp straight lines (usually computer generated imagery).
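A back-of-the-envelope check on the viewing-distance point, assuming the common 1-arcminute (about 60 pixels per degree) rule of thumb for 20/20 acuity and an 82-inch 16:9 panel; both figures are just assumptions for the estimate.

import math

def pixels_per_degree(diag_in: float, h_pixels: int, distance_in: float) -> float:
    """Angular pixel density of a 16:9 screen viewed straight on."""
    width_in = diag_in * 16 / math.hypot(16, 9)
    inches_per_degree = distance_in * math.tan(math.radians(1.0))
    return (h_pixels / width_in) * inches_per_degree

for dist in (48, 96):                    # four feet vs eight feet from the screen
    ppd_4k = pixels_per_degree(82, 3840, dist)
    ppd_8k = pixels_per_degree(82, 7680, dist)
    print(f"{dist} in: 4K = {ppd_4k:.0f} ppd, 8K = {ppd_8k:.0f} ppd")

# At 8 ft the 4K panel is already past ~60 ppd, so 8K adds nothing visible;
# at 4 ft the 4K panel drops below that threshold and 8K could start to matter.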
Re: (Score:2)
I'm not going to lie - going to 720p over 480p is a nice step up on quality, but personally I can barely notice any further increases in resolution. I'm fine with technology progressing and if 8K is the standard the next time I upgrade my TV then I have no problems upgrading, but I'd be perfectly happy with watching video in 720p for the rest of my days.
Re: (Score:3)
Technically I'd go for an 8k Monitor. I spotted an 8K TV at Best Buy a few weeks back. At 82" it's about double the size of my 43" 4K home monitor which is perfect for having lots of screen space for coding. With the extra monitors I have, I'm running about 6k overall. So I keep picturing my 43" in portrait mode with a second one next to it and I can see it. Price is still up there and no DisplayPort on the Samsung I saw but maybe soon...
[John]
Re: (Score:2)
This is the same tired old argument that the frame rate luddites have been using forever, and is simply nonsense
When radio switched from coconut shells to recorded horse trotting, people complained the horse sounds didn't sound like real horses anymore.
Re:How to make a movie look great (Score:4, Insightful)
Test the movie on lots of different 4K, 5K, 8K "displays" sold in the USA.
Do you think movies are made for the USA market only? Of course mostly the same display devices are sold everywhere else as well, especially now that the NTSC/PAL issue is mostly a historic thing, but still I don't see the point of adding "in the USA"...
Re:How to make a movie look great (Score:4, Interesting)
In fact 8k is only currently available in any meaningful sense in Japan AFAIK. They have been doing test broadcasts for a while now on satellite and over the internet. The plan is to have it all ready for the Olympics next year.
Re:How to make a movie look great (Score:4, Interesting)
"1. Find script, actors, location, the best sound experts, electricians, carpenters, costume designers as needed.
2. Make a really great movie.
3. Work out what the color should look like.
4. Set the really light and dark parts as wanted.
5. Select a modern way to send the movie to the display that totally keeps the correct color information.
6. The display shows the movie as expected within the limits of its hardware."
Feature films are generally made for cinema release, then subsequently a DVD/Blu-Ray/Streaming release. Consequently, they're optimised initially for a DLP projector*, and commercial films distributed by hard drive or internet are rendered in XYZ colour space to work on those projectors.
Post-cinema releases are a compromise on the initial vision of the production crew - direction, art design, lighting, etc. There's a lot you can do with the master copy to make it work on various end-user displays, but it will always be a compromise. Shooting a film to look its best in XYZ means there'll be some sacrifices in RGB/others.
There's a cueing monitor in the projection room where I work part-time. The difference between the "big screen" and the monitor is very obvious. Granted, it's just a cheap LCD rather than a reference monitor, but it still shows the difference between the two technologies.
* The new laser technology is going to look even better:
https://www.nec-display-solutions.com/p/dc/en/projection-room/cinema-projection.xhtml [nec-displa...utions.com]
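For anyone curious what "rendered in XYZ" implies downstream: converting to sRGB for a home display goes through the standard linear XYZ-to-sRGB matrix, and colours the XYZ master can carry but sRGB cannot come out of that matrix negative or above 1 and have to be clipped. The sample colour below is illustrative, not taken from a real DCP.

import numpy as np

# Standard linear XYZ (D65) -> linear sRGB matrix.
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_srgb(xyz: np.ndarray) -> np.ndarray:
    rgb = np.clip(XYZ_TO_SRGB @ xyz, 0.0, 1.0)        # out-of-gamut values get crushed here
    # sRGB transfer function (gamma encoding) for display
    return np.where(rgb <= 0.0031308, 12.92 * rgb, 1.055 * rgb ** (1 / 2.4) - 0.055)

# A very saturated green: the matrix wants negative red and blue, which sRGB
# cannot represent, so the clipped result is not the colour the master described.
print(xyz_to_srgb(np.array([0.20, 0.60, 0.10])))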
Re:Laser projection (Score:2)
My son and I watched Avengers: Endgame in IMAX 2D with laser projection. The stuff that was actually filmed looked great. The camera operators and lighting people really knew what they were doing. However, all the special effects looked cheesy and fake. Even the few seconds of fake fireworks near the end looked horribly fake. Why couldn't they have used real fireworks? On the massive Melbourne IMAX screen with ultra-crisp laser projection, that uncanny valley fakeness is shoved in your face harder tha
Re: Laser projection (Score:2)
Imagine how realistic games would look nowadays if they had stopped at 720p and all the graphics artists didn't have to spend years working on 4k textures.
Re: (Score:2)
????
The main problem with games is not the display resolution.
It is low texture resolution.
To stop all artifacting and maybe look like film, the textures need to be in a much, much higher resolution than the display.
However 4K textures at 720p would still not be enough to make it silky.
Re: (Score:2)
"They worked out VHS, dvd, blu ray, HD, 4K...
No, they didn't. It was compromise all the way down. VHS lost most of the "bits", whether from film or digital masters, and it looked crap. It also lost a lot of bits when the edges of widescreen movies were chopped off to fit a 4:3 aspect ratio (or they were subjected to pan-and-scan, with a similar loss of screen real estate). And then it lost most of the colour information because you just can't fit that sort of data into the type of MPEG frame that lived on
Re: (Score:2)
1. Find script,
Maybe the problem with movies isn't the technology.
Re: (Score:2)
A lot of sound engineers and producers swear by mixing on ordinary speakers, even in mono. (A) that's what a lot of punters have and (B) if it sounds good on them it often sounds fantastic on better ones.
It doesn't work if you only have crappy speakers, you need both types in the studio and a switch to flip between them.
I get the impression that pictures don't work that way...?
Yes it does, e.g. you can switch to black and white to get a better idea of the lighting.
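The black-and-white check is essentially just a luma conversion; here is a minimal sketch using the Rec. 709 coefficients (the sample values are illustrative).

import numpy as np

def to_luma(rgb: np.ndarray) -> np.ndarray:
    """Collapse RGB values (..., 3), in the 0..1 range, to Rec. 709 luma."""
    # Stripping the colour away makes it easier to judge the lighting itself:
    # contrast ratios, where key and fill fall, whether the subject separates
    # from the background at all.
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

# A saturated red and a saturated green look equally "loud" in colour, but the
# luma check shows the green carries far more perceived brightness:
print(to_luma(np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])))   # [0.2126, 0.7152]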
Re: (Score:3)
You master on studio monitors, but you also check it in mono from time to time (this will reveal inverted phase on an input channel REALLY fast), and hopefully check it with headphones, a decent home theater setup, a car, and overpromised/underdelivered cheap hardware. You're not seeking to have it sound the same on all of them, that's just not happening (or there would be no point in the better hardware). You're checking that it doesn't go hilariously wrong in one of those other environments, while you sti
Re: (Score:3)
You can't make a 4K into an 8K.
Why not? People have been releasing 2k content as 4k on BD for years and getting away with it. Hell people have been releasing VHS quality crap on DVD.. and DVD quality on BD for decades.
And I think HDR looks tons better.
Nope, HDR does not look "tons better". You are confused by lack of a comparable standard or display calibration issues.
(again, you can't add it later)
Nor can you take it away.
Re: (Score:2)
Why not? People have been releasing 2k content as 4k on BD for years and getting away with it.
Usually when they do 4K versions of movies and aren't just trying to get it out there as fast as possible or maximize volume, they actually either go back and scan the physical film used in the theatrical release in 4K (resulting in genuinely higher quality material due to the superiority of 35mm film over 1080p), or, in the case of fully digital productions, they go to the higher quality digital masters from which the original 1080p releases were simply downsampled.
Even in TV productions the internally used m
Re: (Score:2)
The "higher quality digital masters" are almost always 2K - because that's the output resolution of the preferred digital camera used in most productions. It's also the preferred master for 35mm film to digital master transfer. Special effects that are added later are in 720p - because that's the default for rendering them for every major Hollywood blockbuster due to the time it takes to render them. Later, they're up-scaled by a very advanced method, but it's still nowhere near 4K when completed.
Even the
Re: (Score:2)
A good rule of thumb when choosing any display is if you can see the pixels when viewing at a "comfortable" distance then you should consider a smaller display
When the Game Boy came out in 1989, its display was tiny, but I could still see its display's pixels at a typical viewing distance. Pixel density in the leading handheld video game system with buttons of each generation improved very little from then to 2015 when the New Nintendo 3DS XL came out. Most of the pixel count improvement (from 160x144 to 400x240) went to a physically larger screen (from 2.6" to 4.88"), not to more density.
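Running the numbers on the two screens quoted above bears that out: pixel density barely moved in those 26 years.

import math

def ppi(h_px: int, v_px: int, diag_in: float) -> float:
    """Pixel density from resolution and diagonal screen size."""
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(160, 144, 2.60), 1))    # original Game Boy (1989): ~82.8 PPI
print(round(ppi(400, 240, 4.88), 1))    # New Nintendo 3DS XL top screen: ~95.6 PPI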