Why The Hobbit's 48fps Is a Good Thing 599
An anonymous reader writes "Last year, when we discussed news that The Hobbit would be filmed at 48 frames per second, instead of the standard 24, many were skeptical that format would take hold. Now that the film has been released, an article at Slate concedes that it's a bit awkward and takes a while to get used to, but ends up being to the benefit of the film and the entire industry as well. 'The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes. Yet it's also something that you've never seen before, and is, in its way, amazing. Taken all together, and without the prejudice of film-buffery, Jackson's experiment is not a flop. It's a strange, unsettling success. ... It does not mark the imposition from on high of a newer, better standard — one frame rate to rule them all (and in the darkness bind them). It's more like a shift away from standards altogether. With the digital projection systems now in place, filmmakers can choose the frame rate that makes most sense for them, from one project to the next.'"
Why? (Score:2, Insightful)
sure it's outdated, but so is 48 fps.
broadcast TV has been 50 for years, with more recent forays with high def into 120Hz (no idea of the actual frame rate with digital, but I could imagine it's up there)
why are they doing this now? and why only 48fps?
Re: (Score:2, Informative)
why are they doing this now? and why only 48fps?
Follow the slashdot link in the summary, it was discussed extensively there, no need to derail yet another thread with it.
Re: (Score:3)
Re:Why? (Score:5, Insightful)
I was just wondering if anyone else would mention ShowScan, amid all the claims of "first time such a high frame rate film has been produced... blah blah blah..." when the claim really should be "finally, something almost as good as what was available 40+ years ago."
24fps has always bothered me whenever an object or person moves across the screen quickly. Even the small increase to 30fps is a significant improvement to my eyes. 72fps seems like a good goal, though I probably won't complain about 48.
I think those in the "24fps is magic" camp have a lot in common with the "vinyl is better" and "tubes are better" bunch. They either like their content distorted by their medium of choice or just like the idea of using archaic technology. There's certainly nothing wrong with either of those things, but the old ways are not "better" for everyone else.
Re:Why? (Score:5, Insightful)
24 fps from a high-speed shutter camera (usually digital these days) can be disturbing. 24 fps with a low-speed shutter (older analog cameras), where there is motion blur, is OK; the motion blur approaches what we see with the naked eye.
24 fps from a video game, which is a sequence of stills, typically without motion blur as it requires more CPU time, is awful.
Assuming Jackson used digital cameras, 48 fps should be an improvement.
Re: (Score:3)
What cameraman in his right mind would shoot faster than a 1/24s shutter speed and then display it at 24 FPS? This only happens when taking a live broadcast at a high frame rate and insta-converting to 24 FPS like they do (or did) with some awards shows. It would never happen in a movie.
Re:Why? (Score:5, Interesting)
Have you seen the movie yet? Reserve judgment until you do. My wife went into the viewing not really comprehending what "HFR" meant. About 30 seconds into the movie she leaned over and whispered, "Is the entire movie going to be like this???" and later, "It looks like a video game."
I was pretty well mentally prepared for the frame rate difference, so I was able to enjoy it as a spectacle if nothing else. But it added nothing of value to the movie itself, and speaking honestly the movie did lose something in the transition. It ceased to feel like a movie. It felt like an extremely high definition live broadcast.
I would like someone to explain to me, where is the inherent benefit with 48 FPS? Sure it's nice for directors who would love to shoot faster-moving pans, but how exactly does it make things nicer for the viewer? Do people complain of headaches when they watch movies? Seriously, where is the improvement?
Re: (Score:3)
Personally, I didn't even notice.
Re: (Score:3)
Hmmm... that could end up being a good thing eventually.
Re: (Score:2)
why do you think it lasts sooooo long .... ;-p
Re:Why? (Score:5, Informative)
Re:Why? (Score:4, Informative)
I guess that depends on how you define it. On interlaced broadcast (like all old TV), you get a half-frame every 1/50 of a second, where half-frame means either the even or the odd lines, alternating. However, in true interlaced broadcast (i.e. where the material was already recorded in that format, not transformed into it as when putting a movie to TV), it's not that you get the even and odd lines of the same image; each half-frame is recorded at its own time. So say you've got 50 half-frames per second: you'll get e.g. the even lines of the image at 0ms, then the odd lines at 20ms, then the even lines at 40ms, then the odd lines at 60ms, and so on. Only with converted material will the even and odd lines be from the same image.
You can see that quite nicely when capturing a true interlaced-recorded TV program on the computer, where two half-frames are combined into a frame. If there's fast movement in the scene, you'll get striped frames, because your "frame" is actually the combination of two images taken at different times, with the even-line and odd-line images separated by 20 ms (50Hz) or 16.7 ms (60Hz). Given that those images are recorded at different times, I'd say it makes sense to consider them different frames, recorded at half the vertical resolution with a displacement of one line every second frame.
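The striped-frame ("combing") effect described above can be sketched in a few lines of Python. This is a toy scene of my own invention, assuming a bright vertical bar that moves two pixels between the even-field and odd-field capture times:

```python
# Toy illustration of interlace "combing": a bar moves between the
# even-field and odd-field capture times, so weaving the two fields
# into one frame leaves a jagged, striped edge.

WIDTH = 8

def capture_row(bar_x):
    """One scan line: 1 where the bar is, 0 elsewhere."""
    return [1 if x == bar_x else 0 for x in range(WIDTH)]

def weave(even_field_x, odd_field_x, height=4):
    """Weave even lines (captured first) with odd lines captured 20 ms later."""
    frame = []
    for y in range(height):
        x = even_field_x if y % 2 == 0 else odd_field_x
        frame.append(capture_row(x))
    return frame

frame = weave(even_field_x=2, odd_field_x=4)
for row in frame:
    print(row)
# Even rows show the bar at x=2, odd rows at x=4: the "striped frames"
# you see when two temporally distinct fields get combined on a computer.
```

Deinterlacers exist precisely to hide this; treating the fields as separate half-resolution frames, as the comment suggests, avoids the problem entirely.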
Re:Why? (Score:4, Insightful)
Interlacing was a wonderful thing in the analog days. TV would have looked (literally) half as good without it. But those days are past: it is time to let interlacing die. It just gets in the way now and complicates things needlessly.
Re: (Score:3)
But the reason for using fields is so that the effective frame rate perceived is 60 frames per second, just at half the vertical resolution. Each field contains a different image than the field before it. So we're used to relatively fast frame rates. If people find 48 FPS distracting, the reason is probably either a psychological "uncanny valley" thing—because it feels almost like TV but not quite—or perhaps it just happens to be a magic speed that makes people uncomfortable for some reason.
Re: (Score:3)
I'm not aware of broadcasts in 50 FPS. AFAIK, they're being evaluated, but basically material is broadcast at 25 or 30 fps, depending on the standard used. These conform to the old PAL/NTSC/SECAM framerates. Interlaced formats, however, can be 50 or 60, but that's because each frame is essentially split into two frames of alternating horizontal lines, "fields".
720p60 is a common broadcast format, and a few European broadcasters do 720p50 (presumably to ease upscaling 25 FPS SD content). It seems 1080i50 is more popular over there, though, which is annoying; I despise interlacing and would much rather have seen 1080p30 and 1080p25 become the broadcast standards than their crappy interlaced counterparts.
Re: (Score:3)
Re:Why? (Score:4, Interesting)
As soon as technology allowed me, I produced and projected video animations at 60fps to make pans more fluid. Were I producing a movie today, I would try to shoot in 60fps for the same reason: much more fluid motion on big screens.
If Jackson had chosen 60fps for The Hobbit, it would have been a much better choice, at least as far as home video is concerned.
With a choice of 48fps as the source, we are going to get stuck with a much lower quality home video release, because there is no current format that allows at least 48fps at 1920x1080 resolution. So, to get to 24fps, either the original footage will have to be filmed at 24fps as well, or else some sort of digital conversion will have to be done. Neither will give the same quality we have come to expect from current media: instead of 24 frames per second where scenes with little motion have very sharp frames, pretty much every frame will show some sort of motion.
Re: (Score:3)
The frame interval (1/24 of a second) determines how much judder you get. The shutter speed determines the motion blur.
The Hobbit does not use interpolation; it is shot at 48fps. Interpolation is what 120Hz TVs use to guess the intermediate frames, and it's a worthless technology in my mind. I only want to see the original framerate, whatever that may be, and of course motion blur is removed on those interpolating TVs, against the director's wishes.
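For what it's worth, the intermediate-frame guessing can be sketched in its very crudest form, simple linear blending between two frames (real 120Hz sets do motion-compensated interpolation, which is far more involved; this toy version just shows why naive guessing produces artifacts):

```python
def interpolate(frame_a, frame_b, t=0.5):
    """Crudest possible interpolation: blend pixel values linearly.
    Real TVs estimate motion vectors instead; plain blending ghosts."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

# Two "frames" of a 4-pixel strip where a bright pixel moves one step right:
f0 = [255, 0, 0, 0]
f1 = [0, 255, 0, 0]
mid = interpolate(f0, f1)
print(mid)  # [127.5, 127.5, 0.0, 0.0] -- a ghosted double image, not motion
```

The correct in-between frame would have the pixel halfway along, which is exactly what motion estimation tries (and sometimes fails) to synthesize.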
It doesn't do any analysis. The sensor is always on. There are
Re: (Score:3)
It's also much more effective to down-convert from 60 FPS to 60i for broadcast than to up-convert from 24 or even 48 FPS to 30i
Re: (Score:3)
Clarification: That was 50i for PAL and 60i for NTSC. Because of the interlace, the frame rate was actually 25fps and 30fps respectively.
Re:Why? (Score:4, Funny)
A lot of the magic of film was 24fps.
Oh yes, like wagon wheels going backwards. I also pine for the days of scratches, dust spots and pubic hairs on the big screen. And nothing but nothing beats the exhilaration of watching the celluloid melt because the projector stalled.
This is about RMS. (Score:5, Funny)
Re:This is about RMS. (Score:5, Funny)
The movie has hairy, disgusting trolls.
I think it's clear you went to the midnight showing...
Re:This is about RMS. (Score:5, Funny)
So does Slashdot, but we try not to discriminate.
Re:Why? (Score:4, Funny)
I bought $50 monster cables from Best Buy. You mean they weren't worth all that?
Re:Why? (Score:4, Insightful)
More like an attempt to make movies impossible to watch at home in the same form, therefore making you buy theater tickets.
But seriously, most people can see the difference in rates above 30. See the complaints about the "Soap Opera Look" of 120Hz TV's.
Panning and dolly shots look terrible at 24fps if they go too fast. So. much. judder.
Even if the human eye can't distinguish > 60fps (it definitely can), the human retina is not v-synced with the television/screen. So you still need more temporal resolution than the eye can handle for it to appear smooth.
Re: (Score:3)
Personally, I'd like to see the end of the concept of framerate altogether - changes to the display image would be bundled into timestamped packets which could arrive at any infinitesimal interval. It'd impose a slight bandwidth overhead, but a pretty insignificant one if more than a few pixels were contained in each bundle.
You'd be detaching the uptake mechanism from the playback mechanism. Each device does the best it can with the hardware it has on-hand. The simpler the task it's presented with, likewise, the s
Re: (Score:3)
It's no more a problem with digital than film. They can both accept the same shutter speeds and aperture settings and even the same lenses. How can digital exhibit anything different?
Stuttering happens on pans that are too fast. It's because there aren't enough frames. But sometimes we want sweeping panoramic shots at a decent speed. In order to deliver on that, you MUST have a higher frame rate.
Re:Why? (Score:5, Interesting)
> Most people can't see a difference in rates above 30fps and pretty much nobody can distinguish fps over 60 fps.
Bullshit. You most certainly CAN see a difference, particularly when there's high-resolution, high-contrast detail with fast movement across the screen. In fact, high-framerate video has its own "uncanny valley" problem (above a certain framerate, generally in the neighborhood of ~300fps, hyperfluid 2-dimensional video becomes disorienting and vertigo-inducing, because your brain can't reconcile the seemingly-lifelike motion with its lack of depth).
Comment removed (Score:5, Interesting)
Re: (Score:3)
I don't know if you'd describe it as a flop, exactly, but it certainly lacked the massive uptake of DVDs/CDs over VHS/Cassettes. In my experience, most people didn't rush out and buy a Bluray player; they got one the next time they were going to upgrade anyway - with their console, or built into their TV, or occasionally replacing their standalone DVD player. I still know many people (including myself) who just use DVDs.
The high-res transition was very much an iterative update. People had too much invested
Re: (Score:3)
The reason why DVDs took over from VHS tapes (which hardly was an overnight thing either) had more to do with creature comforts of not having to rewind the tape, deal with tape stretching/media fall out, and other physical problems of the VHS medium. That the DVD discs took up much less space was also a huge bonus and none of those advantages applied to Bluray as a format. For the most part Bluray is simply DVD on steroids and seen as just that.
On the technical side, there are some decided advantages of t
Re: (Score:3)
you don't have to rewind dvds?
http://www.dvdrewinder.com/
Re: (Score:3)
Most people think the stereo that came with their car sounds good enough, because that's all they've had.
Most people think the earbuds that came with their MP3 player are good enough, because they've never tried anything different.
Most people think Bud Light is OK, because that's what everyone else seems to be drinking.
FFS: Most people ran their CRT monitors at 640x480 @ 60Hz, before Windows started defaulting to 800x600 (still at 60Hz), and they were OK with that because it was all they knew. And I know,
Re:Why? (Score:5, Interesting)
Most people can't see a difference in rates above 30fps and pretty much nobody can distinguish fps over 60 fps. There are plenty of people (especially gamers) who think they can but they are imagining it.
You really don't have a clue, but you've bought into some techno-babble explanation and have convinced yourself that you do. It's sad, really.
There's a point at which a flickering light source stops being perceived as flickering, and starts being perceived as continuously lit. That threshold is somewhere south of 60 fps for the vast majority of people, true, but that isn't the same as not being able to perceive more than 60 fps (much less 30!).
The reason film (@24 fps) and TV (@30 fps) look smooth, where video games (@30 fps, or even 60 fps+) don't, is that the human eye is fooled by (or possibly trained to be fooled by) motion blur. When a camera takes a picture, it doesn't actually capture a moment in time, it captures a span of time. The more something moves during that span, the more motion blur exists. This is due to the shutter system which is required to keep the film from being exposed when it is out of position within the camera. (Note: Motion blur is an effect that can be seen with the naked eye, even with no camera in the mix, but the speeds involved for that are *much* faster than required to see them on film.) Video games (short of the ultra-high end games coupled with extremely powerful graphics cards) don't produce motion blur. Instead, they work to produce more than 30 fps, which is the *minimum* required to feel 'smooth' in the absence of motion blur. Frame rates faster than 30 fps (up to at least 75 fps for most people) have been shown to be distinguishable as noticeably smoother in experiments.
Re:Why? (Score:5, Insightful)
there is a reason we use 24.
Like because it is cheaper and easier to make movies that way? If something looks fake when there is not enough blur, it is because movie makers have not bothered figuring out how to make their scenes look more realistic at a more realistic frame rate. It has nothing to do with the video technology. It is like complaining that color TV looks too realistic and that they should have stayed with black and white. These days people only watch black and white TV when they are feeling nostalgic. I applaud Jackson for blazing the trail to higher frame rates in movies.
Re: (Score:3)
I don't buy that. (Score:5, Interesting)
Let's consider two scenarios here.
In the first case, the camera is not panning, but just filming the scenario as it is, and projector playing it back at the filmed rate. Thus viewing the projection is the same experience as looking at the scene in real life, to within the fidelity of the playback. Notably, there is no depth (or a poor simulation of depth with forced focus), but apart from that higher fidelity should be more realistic. The viewer's eyes will be jumping around the big screen and blinking just like normal so there is absolutely no reason to try to "simulate" that; you have the real-life effect already occurring. Same with motion blur; the eye will supply the same amount of blur that it does in real life, so there is no reason to simulate it, beyond compensating for too *low* of a frame rate, which requires a longer integration time to avoid appearing choppy.
And yet it is exactly this sort of scene that was causing people to deride 48fps as being "soap opera like". They talked about how watching the Hobbits slowly walk down the hill towards them looked epic in 24fps, and looked like a documentary in 48fps. It destroyed the suspension of disbelief for them, and made them think they were looking at actors, not Hobbits. That has nothing to do with faking the limitations of human vision. It is completely psychological; whether that psychological effect is inherent in the medium or the result of prior conditioning is debatable, though.
The second scenario is where the camera is panning, and thus forcing visual motion on the user even though they didn't initiate it. This is identical to being smoothly flown around a scene, and how "realistic" it is will depend on whether that would actually happen in real life. In situations where it is realistic my argument above would apply; the eye will be looking around the moving scene just like it would be when looking out a train window.
On the other hand, in situations where panning is being used to simulate human motion, I would argue that 48fps could allow the filmmaker to have more realistic view changes if they want them. A low rate like 24fps forces the director to use slow, gradual pans, lest they create a choppy or blurry mess as a result of the limitations of the rate. However, as you pointed out, the eye doesn't work that way. It jumps around, taking time to settle and focus each time. If you tried to do that at 24fps the viewer would get lost, unable to follow the transitions. In large part this is because in real life they are controlling the transitions, so they know in advance where the view is changing to, but to a lesser extent this is due to the limitations of the frame rate. Faster frame rates will allow for more abrupt transitions that are still possible to follow.
Re: (Score:3)
Whilst I do generally like higher frame rates, the above is actually the trouble... It looks _too_ realistic. For a high fantasy movie like The Hobbit, sometimes putting a little 24Hz vaseline on the lens helps let your brain fill in the gaps with fantasy. At 48Hz the projector fills in the gaps with reality.
It sounds reasonable, but it isn't. There are fewer frames, but the filling-in your brain does is more or less equal to what just presenting it with more frames does. It doesn't synthesize detail the way it does with low spatial resolution imagery.
Consider the following: suppose you were given a low resolution drawing, you would not be able to draw the equivalent of the high resolution version of it. If you were given two subsequent (drawn) frames, you _would_ be able to fairly accurately draw the frame in betw
Re: (Score:3)
I agree with you, I think the Hobbit looks like ass, WAY too much detail and no motion blur or warm moments for your eyes to pause on.
I agree with you. I think real life looks like ass. WAY too much detail, and no motion blur. Though our eyes provide "motion blur" anyway on our practically infinite FPS world. Get a grip.
Re:Why? (Score:5, Informative)
The reason you think 720p looks better is because of frame rate. That's why ESPN and Fox Sports both use 720p for broadcast. In the US-ATSC system, 1080i is interlaced at 59.94 fields per second, or 29.97 frames per second. 720p is progressive scan at 59.94 FRAMES per second.
There is also a lesser quality version of 720p at 29.97, but broadcast 720p is 59.94 FRAMES per second. That's why it is better for fast-action sports, and looks much better than 1080i.
720p-60 (as it is called) uses the same amount of broadcast bandwidth as 1080i-30.
YIAABE (Yes, I Am A Broadcast Engineer). Posting anon since I am too lazy to log in.
By the way, if your cable or satellite provider is giving you ESPN in 1080, they are downgrading the original format, but that bigger number impresses the idiots who don't know any better.
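The bandwidth claim above checks out at the raw pixel level (a back-of-envelope sketch; actual broadcast bandwidth is determined by the MPEG-2 encode, but the raw rates are what make the two formats comparable):

```python
# Raw pixel throughput of the two main US-ATSC HD formats, ignoring
# compression. 1080i sends 60 fields/second, each with half the lines.
rate_720p60 = 1280 * 720 * 60         # 60 full progressive frames
rate_1080i30 = 1920 * 1080 * 60 // 2  # 60 half-height fields

print(f"720p60:  {rate_720p60:,} pixels/s")   # 55,296,000
print(f"1080i30: {rate_1080i30:,} pixels/s")  # 62,208,000
```

The two rates are within about 12% of each other, which is why the formats occupy roughly the same broadcast bandwidth while trading spatial resolution against temporal resolution.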
Re: (Score:3)
The reasons are simpler than that. At the time, VHS was still king. LCD TVs were expensive, small, and fickle novelties. CRT tele
Re:Why? (Score:5, Informative)
In the US, analog TV used to be broadcast in NTSC format, 320 x 240 pixels
Completely incorrect. Analog TV had no pixels at all, it had scan lines, and there were 525 of them. You only have pixels in digital media, not analog.
Where? (Score:2)
Where can I see the Hobbit in 48FPS?
Re:Where? (Score:5, Informative)
Never heard of google? http://www.48fpsmovies.com/48-fps-theater-list/ [48fpsmovies.com]
Re:Where? (Score:4, Funny)
Sure I have. [lmgtfy.com]
Re: (Score:2, Informative)
Where can I see the Hobbit in 48FPS?
"Yes."
What makes it... (Score:2, Insightful)
distracting? Since the film seems to be getting panned a lot, does this maybe have something to do with it?
Re:What makes it... (Score:4, Informative)
If you have a modern medium to high end HDTV, turn on frame interpolation [wikipedia.org] processing (by whatever silly trademarked name your TV has for it) and watch an HD movie (especially one with sweeping pans and action, etc). It's hard to quantify exactly why it's distracting, but is sometimes described as a "soap opera" effect.
It bugs me too, but it really is hard to objectively say why. I'd like to think it's about a subconscious feeling of "expansiveness" and uncertainty (since your brain has to interpolate instead of the TV, and maybe your brain interpolating engages you with the content differently, etc) that you want with a more "epic" movie experience.
But there is also a strong argument that it's mostly your brain adjusting to something it has not experienced in this setting, and you will get used to it if exposed enough. Sort of like getting a new pair of glasses with a different shape/refractive index...
Rather than shooting with more FPS (Score:5, Insightful)
Re: (Score:2)
Is this the part where I have to pretend that the whole Tom Bombadil segment wasn't the most poorly-written part in the books?
Re: (Score:3, Insightful)
Re:Rather than shooting with more FPS (Score:4, Insightful)
150 pages of Frodo walking through Mordor with almost nothing else happening didn't excite you?
Re:Rather than shooting with more FPS (Score:5, Funny)
Re: (Score:2)
Fog On The Barrowdowns is one of the most harrowing parts of the book, and the first major signal that this wasn't merely The Hobbit part 2.
Re: (Score:2)
Re:Rather than shooting with more FPS (Score:5, Funny)
But it yielded Tim Benzedrine [wikipedia.org]. That alone was worth it.
Along those lines, a quote in TFA made me wonder exactly which book Peter Jackson was basing the plot on?
An interminable sequence in Bilbo’s hutch culminates in a dorky, dwarven drinking song, performed alongside animated plates and spoons.
Bored of the Rings?
We Boggies are a hairy folk,
Who like to eat until we choke.
Loving all like friend and brother,
We hardly ever eat each other.
Gorging out from morn till noon,
But don't forget your plate and spoon.
Now, I would pay good money to see that film.
Re: (Score:3)
At the same time I love the movies (and just got the extended BR edition), I'm sad that they weren't as 'faithful' as they could have been to the books, and worse is that it was such a huge endeavor that it's not likely to be tried again for at least a very long time, if ever. I more than understand cutting out a part like Tom Bombadil (and changing and cutting various other things) for the sake of brevity, but that's not what they did - they cut it for the sake of stuff that wasn't in the books at all.
Bac
Re: (Score:3)
Actually, now that I think of it - I wonder if this will be offered on things like Netflix or AppleTV in the full resolution/refresh rate? Or if it will be crippled to avoid making the Blu-ray version look bad?
Re:Rather than shooting with more FPS (Score:5, Insightful)
And like many people that have commented it seems, I found that the Tom Bombadil thing was horrendous in the book, and cheered a little inside when it was skipped in the movie.
I honestly can't even slightly understand why some people have such a hardon for that part of the book. It was terrible. TERRIBLE!
One of the reasons people like the Tom Bombadil section is because of the character development.
Remember, the book was about little, ordinary people that can do great things, even while big, great people are doing great things all around them. The book was not about little people outshining big people, nor was it about great people overshadowing the efforts of little people. One complaint about the movie was that it was more about Aragorn and Legolas, with Gimli as the comic relief, than it was about the Hobbits.
As for the character development, the Tom Bombadil episode was one of the first things that said, "This is not a simple trip across the forest. This is a dangerous journey and you had better be ready." In the book the Ringwraiths drove them into the dark forest, and they almost got killed because they did not take the journey seriously enough. When they got to Bree, they tried to fall back into the easy ways of the Shire, only to be almost killed again by Ringwraiths because they weren't paying attention. Only this time, they "found" a guide to help them in their character development. By the time they dealt with Weathertop and finally made it to Rivendell, they were ready to start the journey to Mordor.
The Scouring of the Shire, another section left out by the movie, was the final step that the Hobbits had to take to realize that they were no longer children or ordinary people, but had become great people with large responsibilities. They no longer needed to rely on their guides or other races to take care of their own troubles. Their accomplishments did not belittle the other races; they finally became equals with them. And as equals, they were expected to take care of their own troubles. With great power comes great responsibility. (The words are from Spider-Man, but the theme is ancient.)
i always liked the soap opera look (Score:2)
for decades i could never figure out why they could do it on TV soap operas and some sitcoms but not on movies that cost a lot more money to make
i'll take a blu ray of an older movie over the grainy theater crap quality any day
Re: (Score:2, Insightful)
Blu-Ray of old movies are still 24P, they don't magically add more frames that don't exist, and interpolation generally looks like ass.
Some people want to see ass (Score:2)
Blu-Ray of old movies are still 24P, they don't magically add more frames that don't exist, and interpolation generally looks like ass.
But some people want to see ass, especially when voiced by Eddie Murphy [wikipedia.org]. If DreamWorks Animation wanted to rerender Shrek at twice the frame rate, could they?
As a lesson learned, actually. (Score:2)
Re:As a lesson learned, actually. (Score:4, Informative)
old Dr. Who at best
My desire to see The Hobbit just multiplied ten-fold.
Re: (Score:3, Insightful)
I really don't get why people are so attached to 24fps. Can you imagine this with computer games?
Re: (Score:3)
Re:As a lesson learned, actually. (Score:4)
I really don't get why people are so attached to 24fps. Can you imagine this with computer games?
Because 24fps in a movie has no relation whatsoever to 24fps in computer games. In a movie, 24fps is shot with cameras and you get motion blur (just as you would if you take photos at a shutter speed of 1/24th of a second). Your brain is an amazing thing, and happily interpolates the motion blur to give a sense of smoothness. What I'd like to know is whether 48fps looks "soap opera" simply because we've conditioned ourselves to equate high-fps video with the crap shows that always used it on TV, or whether there really is something magical about 24fps. I can't really see any inherent reason why 48fps should look bad per se, even if it probably doesn't add much.
I do know, however, that there is no way I want to go anywhere near The Hobbit. Forget the whole 24 vs 48fps thing -- Jackson sold out big time in making three stodgy films out of one tiny, light-hearted children's book, presumably for no other reason than to rake in the extra cash. He ought to be ashamed of himself.
Re: (Score:3)
how does it "look" different (Score:2)
not having seen the movie, nor being old enough to remember if something like it happened in the golden age of movies..
how does it look different in the theater at 48fps vs normal 24fps movies?
Re: (Score:3)
uncanny valley (Score:3)
uncanny valley - in short, not quite real gets a worse response than being obviously fake
http://en.wikipedia.org/wiki/Uncanny_valley [wikipedia.org]
Video games have been doing this for years (Score:4, Informative)
When playing a game, I can easily tell if it's running at 30 fps or 60 fps, and I *much* prefer the higher framerate, for obvious reasons. It'll definitely take a bit of getting used to when it comes to movies, but it is no doubt a good thing.
Re: (Score:3, Insightful)
Re: (Score:3)
Everyone tells me I'm crazy when I say that windowed games and videos play really sluggishly in Windows 7 compared to XP. The reason is that the new window manager uses 3D hardware to do compositing, and for some reason updates seem to be locked at 30 FPS (my guess is 50% of the monitor refresh rate). XP updated at full blast and provided much, much smoother games and video. I've noticed a rather huge loss in overall performance on my new Win7 machine, despite it being massively more powerful than
60fps with motion blur may provide a solution (Score:5, Interesting)
HOWEVER..., critics argue that the Hobbit feels less 'dream-like' and 'too real'. Even though I disagree with them to an extent, I recently played a game called Nitronic Rush [nitronic-rush.com] (fast free Wipeout clone, with tron-esque graphics, great fun btw). I set it to 60fps, but the graphics are 'enhanced' by motion blur, which 60fps normally doesn't 'need'. We're talking at least a couple of frames worth, and maybe up to 5 frames worth of artificial motion blur. However, I find this actually gets the best of both worlds. You get the smoother motion so that your eyes don't ache, and any fast panning looks convincing. But you also get the cinematic 'blurry' look that 24fps films provide (24fps film techniques employ motion blur naturally, or at least something similar to motion blur).
I think 60fps with this kind of motion blur may have a big future for it.
Re:60fps with motion blur may provide a solution (Score:5, Interesting)
Re: (Score:3)
Re:60fps with motion blur may provide a solution (Score:4, Informative)
It matters very little whether you capture rapid motion at 30fps or 1000fps - the motion still occurs at its natural speed; it's the amount of motion blur per frame that changes. The eye sends a continuous stream of signals to the brain and the brain "sees". Most people have difficulty registering details about an object that moves faster than 36 degrees per second. So for a roughly 180-degree field of view, anything that crosses your sight in under 5 seconds is blurred. That's not much frame rate needed, right there. I only give this example to demonstrate that high frame rate is not that important for action.
When you shoot film @ 24 fps, the photographic shutter does not stay open the whole 1/24 sec., because that would produce too much motion blur and also too much exposure at, say, f/2.8 for the film. Normally film is shot at shutter speeds of about 1/50 sec. This means that half of the 1/24 sec. of motion is NOT CAPTURED AT ALL. Film creates a stroboscopic effect, and when played back through a projector that displays 1/50 sec. worth of action for 1/24 sec., it looks eerie, artsy.
For the soap opera look, cheap TV shows are shot with cheap video cameras that effectively have no shutter: the sensor is exposed for the duration of the frame - 1/60 sec. for interlaced NTSC TV. The whole action is captured, with motion blur similar to film shot at a 1/60 sec. shutter. The playback is absolutely realistic - cheaply realistic.
So, there you have it: TV @ PAL/50i and film with a "normal" shutter speed of 1/48 sec. @ 24fps have a similar amount of motion blur. Film has a stroboscopic effect; TV does not.
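The shutter arithmetic in the post above can be checked with a couple of one-liners. A minimal sketch in Python (the function names and example figures are mine, chosen to mirror the numbers in the post, not any camera's actual spec):

```python
def capture_fraction(fps, shutter_s):
    """Fraction of each frame interval during which the shutter is open,
    i.e. how much of the motion is actually recorded."""
    return shutter_s * fps

def blur_angle(speed_deg_per_s, shutter_s):
    """Angular smear of a moving object during one exposure."""
    return speed_deg_per_s * shutter_s

# Film: 24 fps with a 1/50 s shutter -- less than half the motion is
# captured, which is the stroboscopic effect described above.
print(capture_fraction(24, 1 / 50))   # ~0.48
# Shutterless video: ~1/60 s exposure at 60 fields/s -- continuous capture.
print(capture_fraction(60, 1 / 60))   # ~1.0
# Smear for an object at the eye's 36 deg/s tracking limit, 1/50 s shutter.
print(blur_angle(36, 1 / 50))         # ~0.72 degrees per frame
```

So film and interlaced video end up with similar per-frame blur, but film leaves half of each frame interval unrecorded, which is the "eerie" gap the post describes.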
Obligatory xkcd... (Score:4, Informative)
Tired of Luddites calling higher FPS "soap opera" (Score:4, Insightful)
Most of the time you can't even tell the difference between frame rates, except when it emerges as artifacts at 24 fps.
24 fps movies are purposefully shot with more motion blur to hide the jerkiness. But nothing really gets around it when panning.
So 24fps primarily equals artifacts: Blurring, jerky motion, and juddering pans.
How nonsensical is it, and how resistant to change do you have to be, to worship these artifacts? They are no more beneficial than the ticks and pops were on vinyl. There is a certain nostalgia value to listening to something with ticks and pops sometimes, but it isn't something we put everywhere because we can't do without it.
So these change-resistant Luddites, in love with quite irritating artifacts, have taken to calling superior motion video - with less blur, less judder, and less jerking - "The Soap Opera Effect".
Do a freeze frame on a soap opera and on a good movie. You can still tell which is which when frozen. Soaps look like crap because they have crap production values: poor sets, poor lighting, poor cameras, shot without any flair.
Shoot 48fps (or 60 fps or 120 fps for that matter) with great sets, great lighting, great cameras and great flair and it will be amazing and have nothing in common with soap operas.
Re:Tired of Luddites calling higher FPS "soap oper (Score:4, Insightful)
To be fair, there is more to it than just "24fps has unwanted artefacts". Most people will probably remember their first viewing of Saving Private Ryan, because the film stock and shutter speed used gave it a very realistic, un-blurry, gritty look. One of the biggest reasons it has taken so long for digital cinema cameras to become popular is that the early ones were unable to replicate the effect of particular well-known film stocks and camera techniques.
Star Wars is another good example to look at. Part of the charm of the original movies is that they were a bit rough around the edges. The film had a fair bit of grain that made the Star Wars universe look a bit grubby and used, rather than sleek and clean like Star Trek. The later trilogy was crisp and clean, and ended up looking more like generic sci-fi than the Star Wars we loved.
48fps is still in its infancy and it will take some time for cinematographers and directors to figure out how to get the effect they want from it. In the end the result will be better than 24fps, but that doesn't automatically mean that the early examples will be particularly good. 3D was the same; everything looked terrible until Avatar finally figured out how to use it and still look like a movie and not give you a brain aneurysm.
Re:Tired of Luddites calling higher FPS "soap oper (Score:5, Informative)
Hi there. Technical director here. Just need to step in and clarify the relationship between frame rate and motion blur. I'm seeing a lot of posts calling for higher frame rates with more motion blur, as if the two were completely independent. They're actually closely linked. Let me explain:
Motion blur is the effect of an object moving in the frame while the shutter is open. In photography, the time the shutter is open is called the shutter speed, and is used along with ISO and aperture to control the overall exposure. If you know anything about photography, this is pretty basic stuff.
In the film world, the equivalent of shutter speed is what's known as shutter angle. This is because the shutter on a film camera is a spinning disk, of which a portion lets light through and a portion blocks it as it spins. The portion, measured in degrees, that lets the light in is the shutter angle. Typically, the shutter angle used in film is 180 degrees, meaning that during half of each 1/24-second frame interval the film is being exposed. In photographic shutter-speed terms, that would be the same as 1/48. Again, not too complicated.
Here's the catch, though: because your film stock is rolling by at 24 frames per second, each frame can only be exposed for 1/24 of a second or less. If you use a smaller shutter angle, or a faster frame rate, you get less motion blur. What this means is there's no practical (by the film industry's definition of practical) way of getting more motion blur than your frame rate and shutter angle allow. The faster you go, the crisper the action will be.
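The shutter-angle relationship described above reduces to one line of arithmetic. A small sketch in Python (the function name is mine; the figures are the ones from the post):

```python
def exposure_time(fps, shutter_angle_deg):
    """Per-frame exposure for a rotary film shutter: the fraction of each
    frame interval (shutter_angle / 360) during which light gets in."""
    return (shutter_angle_deg / 360.0) / fps

# The classic film look: 24 fps with a 180-degree shutter.
print(exposure_time(24, 180))   # ~0.0208 s, i.e. the 1/48 s mentioned above

# Same 180-degree shutter at 48 fps halves the exposure (and the blur).
print(exposure_time(48, 180))   # ~0.0104 s, i.e. 1/96 s
```

This is why you can't get "24fps-style" blur at 48 fps: even a fully open 360-degree shutter at 48 fps only gives you 1/48 s of exposure, the same as film's usual maximum.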
So at this point you're probably wondering: who cares about the amount of motion blur in a movie? The answer is: the audience. The industry has shot film at 24fps with a 180-degree shutter angle for so long that it's what everyone is used to. The last thing you want is to distract your audience from enjoying the movie because they know there's something different about the picture but can't figure out what.
Finally, I'd like to point out that the choice of frame rate, like many other subjective decisions made during a movie production, is made at the director's discretion. Peter Jackson is going out on a limb by shooting a movie at this frame rate, and doubtless he has his reasons for doing so (mostly due to it being shot in 3D, as I recall), but it's still his call. The industry talk I hear views it as an experiment, and everyone's curious as to how it will work (or won't). If audiences do get used to it and like it, expect to see more movies shot like this, and in enough time it will be the new standard.
Re: (Score:3)
Informative post but you forgot one _tiny_ detail about The Hobbit.
They are shooting everything DIGITAL.
Re:Tired of Luddites calling higher FPS "soap oper (Score:4, Informative)
Wow, sign me up. (Score:3)
> The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes.
That said, like 3D, you do get a choice, so no harm, no foul. We will be seeing the film in 2D at 24 FPS, because 3D gives my wife migraines and because of reports from New Zealand of motion-sickness-like symptoms among viewers there.
Parenthetically, I predict that the people who love 48 FPS will include a high percentage of people who can play first-person shooters for hours without motion sickness, and conversely, that the people who don't like it will be those who can't. Although I don't think anyone is collecting this metric, sadly.
I will see the film at the faster frame rate (and in 3D because I believe that's your only choice at 48 FPS) but I want to see it "normal" first.
I don't feel qualified to judge the technology, not having viewed it yet, but the most interesting criticisms to come out of advance showings so far are that the sets look more like sets, which disrupts one's ability to suspend disbelief, and that the depth of field tends to be very deep, with everything in focus, which makes things look weird (because the human eye doesn't see that way). Speed Racer did the same thing, intentionally. (Speaking of which, it appears that I'm the only one who liked Speed Racer.)
An example of better being worse... (Score:5, Interesting)
I've been doing computer animation for 35 years, as long as it has existed. Back in the early 80's, I worked on some early 60 field-per-second animation; and I was a convert to high-frame-rate footage since then. (The opening to the PBS show NOVA was perhaps the first 60fps animation ever done.) When we started doing broadcast graphics (show openings, things like that) for TV, we naturally did them at 60fps, and that looked right as it worked with the rest of video. Finally, though, we moved into advertising, and TV advertising was (and still is) typically 24fps. And it bothered me!
But then, something changed my mind completely. We were doing an ad for Snacky, a Japanese snack food company. There was the required silly animated spokespuppet, and we modeled it and made it perform. Part of doing animation is doing the lip-sync, and the company gave us the dialogue in English to animate to. We did this, although it didn't seem right -- expecting them to give us the Japanese soundtrack eventually.
But no, it got to a couple of days before delivery, and the character was still speaking English, and we asked the customer when he came to review the work. "This is only going to be shown in Japan, right?" "Oh, yes, yes!", "And you're going to dub it into Japanese, right?" "Of course! Yes!" "But the lip sync is to an English sound track, the lips are not going to match the dialogue!" "YES! JUST LIKE ALL GOOD ANIMATION!"
Because in that day, lip-sync that was correct in Japanese meant low-quality domestic animation, whereas lip-sync that didn't match meant high-quality American animation. Nobody can tell me that wrong lip-sync is in any way superior -- except that there were 150 million people in Japan who would see it that way instinctively and immediately.
So, I became a happy convert to 24 fps animation. I applaud Peter Jackson for his incredibly audacious experiment, and I hope he succeeds, but he has to fight the near-instinctive reaction from a lot of people who see 48 fps as video.
I think that part of the problem with The Hobbit at 48fps is that the screens are so terribly dark that you just can't appreciate the high frame rate. Your eye integrates dark scenes over a long period of time, and at 48 fps with the very very dark 3D screens, I believe that your eye smears the frames together. On Transformers III, I removed all the motion blur from the very dark scenes, because even at 24 fps they got smeary.
Went and saw it at 48fps (Score:3)
And two things I have to say:
1) If you get the least bit motion sick, don't go see it at the high frame rate in 3D. Normally I don't, even when seeing IMAX/OMNIMAX films, but with this one I did.
2) The 48 frames per second and 3D make certain parts of the film like watching a live stage production. The problem then becomes the post-production. There were a lot of scenes where you could tell the background was composited, and with so much CGI, some of it was like going back and watching CGI from 15 years ago.
That was one of the things I liked about the LOTR movies: especially by the third movie, the CGI had gotten so good that it was largely seamless. You didn't notice it; it was just part of the story. In this one I noticed it, and I often found myself cringing.
Edison advocated 46 fps a century ago (Score:3)
Re: (Score:2, Informative)
Because 1080p24 is the resounding standard for high def. 50fps is incompatible with that.
Re: (Score:3)
NTSC television to the bitter end was 29.97. Many high-def television broadcasts are actually at 23.976 fps, and most feature films shot on digital equipment are at this rate as well, because it converts to 29.97 with fewer artifacts.
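The 23.976-to-29.97 conversion mentioned above is done by 3:2 pulldown: alternate film frames are stretched across three and two interlaced fields. A rough sketch of the cadence (my own illustration, not any broadcast implementation):

```python
def pulldown_3_2(film_frames):
    """Expand film frames into interlaced fields using the 3:2 cadence:
    alternate frames are held for 3 fields and 2 fields, so every 4 film
    frames become 10 fields (5 video frames) -- exactly the 24:30 ratio."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven 3-2-3-2 repetition is what causes the judder artifacts the parent alludes to; 23.976 source maps to 29.97 with this same cadence, which is why digital features are often shot at that rate.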
Re: (Score:3)
Sometimes we do, but that's definitely a visible artifact. You can't just delete every other frame; you have to add blur and interpolation to get the same level of motion blur the 2x picture had -- the cameras can't shoot overlapping frames.
Another factor with shooting "fast" is that it halves your available light, so if you have an ISO 800-equivalent gain factor at 24 fps, it becomes ISO 400 at 48; so then either your f/stop (and thus depth of field) has to give, or your shutter angle (and thus motion blur) does.
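The light penalty described above is exactly one stop per doubling of frame rate, if the shutter angle is held constant. A small sketch using the poster's ISO 800 example (function names are mine):

```python
import math

def stops_lost(fps_old, fps_new):
    """Exposure lost (in stops) when raising the frame rate at a fixed
    shutter angle: each doubling of fps halves the per-frame exposure."""
    return math.log2(fps_new / fps_old)

def effective_iso(base_iso, fps_old, fps_new):
    """Equivalent sensitivity left after the exposure loss."""
    return base_iso / (fps_new / fps_old)

print(stops_lost(24, 48))           # 1.0 -> one full stop less light
print(effective_iso(800, 24, 48))   # 400.0 -> the ISO 800 -> 400 example
```

That one stop has to come from somewhere: a wider aperture (shallower depth of field), a wider shutter angle (more blur), or more gain (more noise), which is the trade-off the parent describes.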
Re: (Score:3)
A lot of speculation or vague answers to your post, but the real reason is simple - many theaters can't display it at 48 fps, and this way they can easily make 24 fps prints.
Awkward... (Score:5, Interesting)
Actually, I was thinking more along the lines of the Hogfather (specifically from the movie).
While I enjoyed this first Hobbit movie, I found the Radagast scenes awkward (like an old family photo with too-large glasses and sisters with poofy bangs). Radagast and his bunny sled seemed too much like something right out of Discworld, which would be delightful except that combining Discworld and Middle Earth yields a very large impedance mismatch.
Re: (Score:2)
60 fps would have been a good thing. 48 is just as dumb as 24. You'll still have to do a pulldown on most consumer displays with 48 fps. If you're reading this, Hollywood, update the DCI spec to support 60 fps!
48 is either dumber or smarter than 24, but not just as dumb, depending on whose idea it was. If it was the display makers' idea, then it's goddamned genius because all the videophiles are going to buy new displays all over again. My TV's panel has a native film mode so it doesn't have to do anything wacky to display a film-mode signal. But it doesn't have a 48hz mode... Not that I'll be buying another TV. Problem is, even if a firmware update would let my TV display that content, it's not going to be an al
Blurs the imperfections (Score:3)
Re:48 fps for everything! (Score:5, Insightful)
We really need to move beyond 24fps though. Take any single frame of that scene and just try to make out what is in the house. Is that a lamp? Or a table? Or wait, maybe it's a vase with a funny flower coming out of it. You can't tell. It's a blurry mess. All you can tell is that it was a sweep of the inside of a house. No detail. [...]
That of course assumes that viewing all the detail is important. In many cases "viewing all the detail" is not what you want. It can distract from the message that the writer and director are trying to convey. At times the blur in the background can help support the in-focus stuff in the foreground and the elements that are actually important to the story.
Re: (Score:3)
Re: (Score:3)
Yup, that's my problem. I'd love to see 48fps, but I've seen maybe a half dozen 3D films, which is about 5 more than my lifetime quota. No need to subject myself to that again.
I think that 48fps (or higher) *would* be a good thing for the motion picture industry, but 3D is not. If I have to have one to get the other, I'm going to have to pass.
To be fair, some people like 3D. That's fine for them, but glasses over glasses gives me a headache. No thanks.
High frame rates solve a problem (for me) in the ci
Re: (Score:3)
Yes. Most people can see the difference between 240 fps and speeds below that. Some people can perceive differences up to 360 fps. Note that these values are way above the frequencies at which we can detect flicker -- around 75 fps.