


Why The Hobbit's 48fps Is a Good Thing 599
An anonymous reader writes "Last year, when we discussed news that The Hobbit would be filmed at 48 frames per second instead of the standard 24, many were skeptical that the format would take hold. Now that the film has been released, an article at Slate concedes that it's a bit awkward and takes a while to get used to, but argues that it ends up benefiting the film, and the entire industry as well. 'The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes. Yet it's also something that you've never seen before, and is, in its way, amazing. Taken all together, and without the prejudice of film-buffery, Jackson's experiment is not a flop. It's a strange, unsettling success. ... It does not mark the imposition from on high of a newer, better standard — one frame rate to rule them all (and in the darkness bind them). It's more like a shift away from standards altogether. With the digital projection systems now in place, filmmakers can choose the frame rate that makes most sense for them, from one project to the next.'"
Re:Why? (Score:0, Interesting)
I agree with you; I think The Hobbit looks like ass: WAY too much detail and no motion blur or warm moments for your eyes to pause on.
We've had the technology to shoot at whatever frame rate we wanted for the last 50 years; there is a reason we use 24. Jackson is an idiot.
60fps with motion blur may provide a solution (Score:5, Interesting)
HOWEVER, critics argue that the Hobbit feels less 'dream-like' and 'too real'. Even though I disagree with them to an extent, I recently played a game called Nitronic Rush [nitronic-rush.com] (a fast, free Wipeout clone with Tron-esque graphics; great fun, btw). I set it to 60fps, but the graphics are 'enhanced' by motion blur, which 60fps normally doesn't 'need'. We're talking at least a couple of frames' worth, and maybe up to 5 frames' worth, of artificial motion blur. However, I find this actually gets the best of both worlds. You get the smoother motion so that your eyes don't ache, and any fast panning looks convincing. But you also get the cinematic 'blurry' look that 24fps films provide (24fps film techniques employ motion blur naturally, or at least something similar to it).
I think 60fps with this kind of motion blur may have a big future ahead of it.
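For anyone curious what "a few frames' worth of artificial motion blur" might look like in practice, here's a minimal Python/NumPy sketch that blends the current frame with a short trail of previous ones. The frame count and decay weights are illustrative assumptions, not how Nitronic Rush actually implements it.

```python
import numpy as np

def blend_motion_blur(frames, decay=0.6):
    """Blend the newest frame with a short trail of previous frames.

    frames: recent frames, oldest first, each an HxWx3 float array.
    decay:  how much each older frame is dimmed relative to the next
            newer one (an illustrative value, not taken from any game).
    """
    # Newest frame gets weight 1.0; each step back in time is dimmed by `decay`.
    weights = np.array([decay ** (len(frames) - 1 - i) for i in range(len(frames))])
    weights /= weights.sum()
    blurred = np.zeros_like(frames[0], dtype=np.float64)
    for w, frame in zip(weights, frames):
        blurred += w * frame
    return blurred

# At 60fps you would keep, say, the last 3-5 rendered frames in a small
# history buffer and display blend_motion_blur(history) instead of the raw frame.
```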
Awkward... (Score:5, Interesting)
Actually, I was thinking more along the lines of the Hogfather (specifically from the movie).
While I enjoyed this first Hobbit movie, I found the Radagast scenes awkward (like an old family photo with too-large glasses and sisters with poofy bangs). Radagast and his bunny sled seemed too much like something right out of Discworld, which would be delightful except that combining Discworld and Middle Earth yields a very large impedance mismatch.
Re:60fps with motion blur may provide a solution (Score:5, Interesting)
Re:Why? (Score:4, Interesting)
As soon as technology allowed, I produced and projected video animations at 60fps to make pans more fluid. Were I producing a movie today, I would try to shoot at 60fps for the same reason: much more fluid motion on big screens.
If Jackson had chosen 60fps for The Hobbit, it would have been a much better choice, at least as far as home video is concerned.
With 48fps as the source, we are going to get stuck with a much lower-quality home video release, because there is no current format that allows at least 48fps at 1920x1080 resolution. So, to convert to 24fps, either every other frame will have to be dropped, or else some sort of digital interpolation will have to be done. Neither will give the quality that we have come to expect from current media: instead of 24 frames per second where scenes with little motion have very sharp frames, pretty much every frame will show some sort of motion.
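To make the two conversion options concrete, here's a rough sketch of both (dropping alternate frames versus averaging frame pairs as a stand-in for interpolation), assuming each frame is a NumPy array; it illustrates the trade-off, not anything an actual mastering pipeline uses.

```python
import numpy as np

def drop_to_24fps(frames_48):
    """Keep every other frame: each surviving frame stays sharp, but it still
    carries the shorter 48fps exposure, so motion looks more staccato."""
    return frames_48[::2]

def blend_to_24fps(frames_48):
    """Average consecutive frame pairs: approximates the longer exposure a
    24fps camera would have captured, at the cost of softening every frame."""
    pairs = zip(frames_48[::2], frames_48[1::2])
    return [(a.astype(np.float64) + b.astype(np.float64)) / 2.0 for a, b in pairs]
```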
Re:Why? (Score:5, Interesting)
> Most people can't see a difference in rates above 30fps and pretty much nobody can distinguish fps over 60 fps.
Bullshit. You most certainly CAN see a difference, particularly when there's high-resolution, high-contrast detail with fast movement across the screen. In fact, high-framerate video has its own "uncanny valley" problem (above a certain framerate, generally in the neighborhood of ~300fps, hyperfluid 2-dimensional video becomes disorienting and vertigo-inducing, because your brain can't reconcile the seemingly-lifelike motion with its lack of depth).
I don't buy that. (Score:5, Interesting)
Let's consider two scenarios here.
In the first case, the camera is not panning, but just filming the scene as it is, with the projector playing it back at the filmed rate. Thus viewing the projection is the same experience as looking at the scene in real life, to within the fidelity of the playback. Notably, there is no depth (or a poor simulation of depth with forced focus), but apart from that, higher fidelity should be more realistic. The viewer's eyes will be jumping around the big screen and blinking just like normal, so there is absolutely no reason to try to "simulate" that; you have the real-life effect already occurring. Same with motion blur; the eye will supply the same amount of blur that it does in real life, so there is no reason to simulate it, beyond compensating for too *low* a frame rate, which requires a longer integration time to avoid appearing choppy.
And yet it is exactly this sort of scene that was causing people to deride 48fps as being "soap opera like". They talked about how watching the Hobbits slowly walk down the hill towards them looked epic in 24fps, and looked like a documentary in 48fps. It destroyed the suspension of disbelief for them, and made them think they were looking at actors, not Hobbits. That has nothing to do with faking the limitations of human vision. It is completely psychological; whether that psychological effect is inherent in the medium or the result of prior conditioning is debatable, though.
The second scenario is where the camera is panning, and thus forcing visual motion on the user even though they didn't initiate it. This is identical to being smoothly flown around a scene, and how "realistic" it is will depend on whether that would actually happen in real life. In situations where it is realistic my argument above would apply; the eye will be looking around the moving scene just like it would be when looking out a train window.
On the other hand, in situations where panning is being used to simulate human motion, I would argue that 48fps could allow the filmmaker to have more realistic view changes if they want them. A low rate like 24fps forces the director to use slow, gradual pans, lest they create a choppy or blurry mess as a result of the limitations of the rate. However, as you pointed out, the eye doesn't work that way. It jumps around, taking time to settle and focus each time. If you tried to do that at 24fps the viewer would get lost, unable to follow the transitions. In large part this is because in real life they are controlling the transitions, so they know in advance where the view is changing to, but to a lesser extent this is due to the limitations of the frame rate. Faster frame rates will allow for more abrupt transitions that are still possible to follow.
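As a back-of-the-envelope illustration of why the frame rate constrains pan speed, the sketch below computes how far the image shifts between successive frames during a pan; the screen width and field-of-view figures are assumptions for the example, and the point is simply that doubling the frame rate halves the per-frame jump for the same pan.

```python
def pan_shift_per_frame(pan_deg_per_sec, fps,
                        screen_width_px=1920, horizontal_fov_deg=50.0):
    """Pixels the whole image moves between consecutive frames during a pan.

    Large per-frame shifts read as strobing/judder; the screen width and
    field of view here are illustrative assumptions, not cinema standards.
    """
    px_per_degree = screen_width_px / horizontal_fov_deg
    return pan_deg_per_sec * px_per_degree / fps

print(pan_shift_per_frame(10.0, 24))  # ~16 px jump per frame at 24fps
print(pan_shift_per_frame(10.0, 48))  # ~8 px jump per frame at 48fps
```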
Comment removed (Score:5, Interesting)
Re:Why? (Score:5, Interesting)
Most people can't see a difference in rates above 30fps and pretty much nobody can distinguish fps over 60 fps. There are plenty of people (especially gamers) who think they can but they are imagining it.
You really don't have a clue, but you've bought into some techno-babble explanation and have convinced yourself that you do. It's sad, really.
There's a point at which a flickering light source stops being perceived as flickering, and starts being perceived as continuously lit. That threshold is somewhere south of 60 fps for the vast majority of people, true, but that isn't the same as not being able to perceive more than 60 fps (much less 30!).
The reason film (@24 fps) and TV (@30 fps) look smooth, whereas video games (@30 fps, or even 60 fps+) don't, is because the human eye is fooled by (or possibly trained to be fooled by) motion blur. When a camera takes a picture, it doesn't actually capture a moment in time, it captures a span of time. The more something moves during that span, the more motion blur exists. This is due to the shutter system, which is required to keep the film from being exposed while it is out of position within the camera. (Note: motion blur is an effect that can be seen with the naked eye, even with no camera in the mix, but the speeds involved for that are *much* faster than required to see it on film.) Video games (short of ultra-high-end games coupled with extremely powerful graphics cards) don't produce motion blur. Instead, they work to produce more than 30 fps, which is the *minimum* required to feel 'smooth' in the absence of motion blur; frame rates faster than 30 fps (up to at least 75 fps for most people) have been shown in experiments to be distinguishably smoother.
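To put the "span of time" idea in numbers, here's a small sketch relating shutter angle, frame rate, and how far a moving object smears during one exposure; the 180-degree shutter is the traditional film convention, and the object speed is just an assumed figure for illustration.

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Seconds the shutter is open per frame. A 180-degree shutter (the
    traditional film convention) exposes for half of each frame interval."""
    return (shutter_angle_deg / 360.0) / fps

def blur_length_px(object_speed_px_per_sec, fps, shutter_angle_deg=180.0):
    """How many pixels an object smears across during a single exposure.
    More smear per frame is what reads as film-like motion blur."""
    return object_speed_px_per_sec * exposure_time(fps, shutter_angle_deg)

# An object crossing the frame at 960 px/sec:
print(blur_length_px(960, 24))  # 20.0 px of blur per frame at 24fps
print(blur_length_px(960, 48))  # 10.0 px per frame at 48fps, same 180-degree shutter
```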
Re:No, 24 is more realistic. Really. (Score:2, Interesting)
I am not sure that I can accept your argument. For example, let's change the definition of visual "reality" (the world as you perceive it) and consider what you see when you look about as a projected analog image of essentially INFINITE framerate (the gap between frames being, essentially, the distance between photons as they strike the retina).
Your brain nevertheless perceives this just as it was designed to do.
Your discussion wrt looking about and blinking, while interesting, nevertheless remains true whether what I'm looking at is a video (24/48/72 fps notwithstanding) or whether I'm just looking out my window (again, effectively infinite fps). In either case, the rods and cones of the retina still perceive the information in the same way. Since none of the discussed framerates exceeds that of "reality", I cannot understand your premise of "overloading" the eye/retina/brain with information.
What is different, though, is the way the brain interprets the information it gets. It's not that it's LESS realistic, it's that it's less movie-like than your brain has been trained to expect. You (and I) have watched 24fps video for our entire lives, so our brains EXPECT 24fps video when we watch movies, and thus consider its limitations and deficiencies part of the normal "I'm-watching-a-movie" experience. But now, at 48fps, when the brain goes into "I'm-watching-a-movie" mode, the higher frame-rate video is perceived as being "wrong" against the background of your visual cortex's movie-viewing history and training. This "wrongness" is certainly jarring to your perception, and I think THAT is the reason people are having such a hard time with it. Your brain doesn't like discordant experiences, and it has evolved to make them stand out and be memorable; though people may not be entirely, consciously aware of *why* the movie seemed "wrong" to them, they still leave the theatre with that perception (and thus relate that sense when asked about the experience).
I cannot help but wonder if people had the exact same reaction in the '20s and '30s, as 24fps started coming into the mainstream and the older movies with their herky-jerky, C. Chaplin-esque movements faded into the background. I somewhat expect that, in 20-30 years, people born today will look back at 24fps with the same sense of disdain with which they would eschew a modern-era movie shot at 14fps...
-AC
An example of better being worse... (Score:5, Interesting)
I've been doing computer animation for 35 years, as long as it has existed. Back in the early '80s, I worked on some early 60-field-per-second animation, and I have been a convert to high-frame-rate footage ever since. (The opening to the PBS show NOVA was perhaps the first 60fps animation ever done.) When we started doing broadcast graphics (show openings, things like that) for TV, we naturally did them at 60fps, and that looked right as it worked with the rest of video. Finally, though, we moved into advertising, and TV advertising was (and still is) typically 24fps. And it bothered me!
But then, something changed my mind completely. We were doing an ad for Snacky, a Japanese snack food company. There was the required silly animated spokespuppet, and we modeled it and made it perform. Part of doing animation is doing the lip-sync, and the company gave us the dialogue in English to animate to. We did this, although it didn't seem right -- expecting them to give us the Japanese soundtrack eventually.
But no, it got to a couple of days before delivery, and the character was still speaking English, and we asked the customer when he came to review the work. "This is only going to be shown in Japan, right?" "Oh, yes, yes!", "And you're going to dub it into Japanese, right?" "Of course! Yes!" "But the lip sync is to an English sound track, the lips are not going to match the dialogue!" "YES! JUST LIKE ALL GOOD ANIMATION!"
Because in that day, lip-sync that was correct in Japanese meant it was low-quality domestic animation, whereas if the lip-sync didn't match, it was high-quality American animation. Nobody can tell me that wrong lip-sync is in any way superior -- except that there were 150 million people in Japan who would see it that way instinctively and immediately.
So, I became a happy convert to 24 fps animation. I applaud Peter Jackson for his incredibly audacious experiment, and I hope he succeeds, but he has to fight the near-instinctive reaction from a lot of people who see 48 fps as video.
I think that part of the problem with The Hobbit at 48fps is that the screens are so terribly dark that you just can't appreciate the high frame rate. Your eye integrates dark scenes over a long period of time, and at 48 fps with the very very dark 3D screens, I believe that your eye smears the frames together. On Transformers III, I removed all the motion blur from the very dark scenes, because even at 24 fps they got smeary.
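Here's a toy sketch of the hypothesis above (that at low screen brightness the eye effectively integrates over more frames and smears them together); the window sizes and the luminance mapping are made-up numbers purely to illustrate the idea, not a vision-science model.

```python
import numpy as np

def perceived_frame(recent_frames, mean_luminance):
    """Toy model: darker shots get averaged over a longer window of frames.

    recent_frames:  list of HxW (or HxWx3) float arrays, most recent last.
    mean_luminance: average brightness of the shot on a 0..1 scale.
    The 1-frame (bright) to 4-frame (dark) window is an assumed range.
    """
    window = int(round(np.interp(mean_luminance, [0.0, 1.0], [4, 1])))
    window = min(window, len(recent_frames))
    return np.mean(recent_frames[-window:], axis=0)

# A dim 3D presentation (mean_luminance ~0.1) ends up averaging several 48fps
# frames together, which is one way the extra temporal detail could get lost.
```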
Re:Why? (Score:5, Interesting)
Have you seen the movie yet? Reserve judgment until you do. My wife went into the viewing not really comprehending what "HFR" meant. About 30 seconds into the movie she leaned over and whispered, "Is the entire movie going to be like this???" and later, "It looks like a video game."
I was pretty well mentally prepared for the frame rate difference, so I was able to enjoy it as a spectacle if nothing else. But it added nothing of value to the movie itself, and speaking honestly the movie did lose something in the transition. It ceased to feel like a movie. It felt like an extremely high definition live broadcast.
I would like someone to explain to me, where is the inherent benefit with 48 FPS? Sure it's nice for directors who would love to shoot faster-moving pans, but how exactly does it make things nicer for the viewer? Do people complain of headaches when they watch movies? Seriously, where is the improvement?
tubes... (Score:2, Interesting)
The thing about tubes: I generally agree, but there is a warm thing about tubes that *is* better. Digital sampling vs. analog circuitry is an aurally distinguishable difference. In digital sampling there is no trending, no inertia, to the samples. Tubes provide a continuous representation of the analog waveform, whereas digital apparatus (transistors, or god forbid, digital media 8-) provide snapshot sampling. The harmonics of each are distinct, since the tubes will represent the interstitial times skipped by a digital medium.
That said... have I rushed out and bought a tube set? No. Do I care about the difference? Not really. Do I think this is the same as the vinyl question? Sort of. Do I care enough? No.
One thing that gets lost on most people is that what they don't perceive may still be perceptible to others.
I think most "audiophiles" have been duped. Monster Cable selling "gold-plated HDMI cables to remove digital distortion" is complete and utter bullshit foisted on a fatuous public. On the other hand, I can and do hear a difference between continuously variable analog signals and digital signals in many settings. My ex was way more sensitive in the audio range. I do see the difference between motion blur and high frame rate, and he cannot. (I have better eyes; he had better ears.)
Distinctions that you personally don't perceive are not _necessarily_ imperceptible to others. People vary.
How much that variance matters compared to a technology is a completely subjective question.
But yes, while I agree that most of the things are completely in people's heads, there are differences.
Don't be too dismissive. There is _some_ baby in that bathwater.