
Why The Hobbit's 48fps Is a Good Thing

An anonymous reader writes "Last year, when we discussed news that The Hobbit would be filmed at 48 frames per second instead of the standard 24, many were skeptical that the format would take hold. Now that the film has been released, an article at Slate concedes that it's a bit awkward and takes a while to get used to, but argues that it ends up benefiting the film and the entire industry as well. 'The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes. Yet it's also something that you've never seen before, and is, in its way, amazing. Taken all together, and without the prejudice of film-buffery, Jackson's experiment is not a flop. It's a strange, unsettling success. ... It does not mark the imposition from on high of a newer, better standard — one frame rate to rule them all (and in the darkness bind them). It's more like a shift away from standards altogether. With the digital projection systems now in place, filmmakers can choose the frame rate that makes most sense for them, from one project to the next.'"
  • Re:Why? (Score:2, Informative)

    by Anonymous Coward on Friday December 14, 2012 @01:26PM (#42288513)

why are they doing this now? and why only 48fps?

Follow the Slashdot link in the summary; this was discussed extensively there. No need to derail yet another thread with it.

  • Re:Where? (Score:5, Informative)

    by ArchieBunker ( 132337 ) on Friday December 14, 2012 @01:29PM (#42288559)

Never heard of Google? http://www.48fpsmovies.com/48-fps-theater-list/ [48fpsmovies.com]

  • Re:Why not 50Hz? (Score:2, Informative)

    by Anonymous Coward on Friday December 14, 2012 @01:29PM (#42288561)

    Because 1080p24 is the resounding standard for high def. 50fps is incompatible with that.

  • Re:Why? (Score:5, Informative)

    by fastest fascist ( 1086001 ) on Friday December 14, 2012 @01:32PM (#42288597)
I'm not aware of broadcasts in 50 FPS. AFAIK they're being evaluated, but basically material is broadcast at 25 or 30 fps, depending on the standard used; these conform to the old PAL/NTSC/SECAM frame rates. Interlaced formats, however, can run at 50 or 60, but that's because each frame is essentially split into two half-frames of alternating horizontal lines, called "fields".
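    A minimal sketch of that frame/field split in Python (the array sizes here are just illustrative):

        import numpy as np

        frame = np.arange(24).reshape(6, 4)  # a tiny 6-line "frame"
        top_field = frame[0::2, :]           # even lines form one field
        bottom_field = frame[1::2, :]        # odd lines form the other

        # 50 fields per second therefore carries only 25 full frames per second:
        field_rate = 50
        print(field_rate / 2)  # 25.0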
  • by wwalker ( 159341 ) on Friday December 14, 2012 @01:34PM (#42288635) Journal

    When playing a game, I can easily tell if it's running at 30 fps or 60 fps, and I *much* prefer the higher framerate, for obvious reasons. It'll definitely take a bit of getting used to when it comes to movies, but it is no doubt a good thing.

  • Re:Where? (Score:2, Informative)

    by girlintraining ( 1395911 ) on Friday December 14, 2012 @01:40PM (#42288721)

    Where can I see the Hobbit in 48FPS?

    "Yes."

  • Obligatory xkcd... (Score:4, Informative)

    by Dr. Manhattan ( 29720 ) <(moc.liamg) (ta) (171rorecros)> on Friday December 14, 2012 @01:43PM (#42288765) Homepage
    https://www.xkcd.com/732/ [xkcd.com]. Especially the alt-text.
  • Re:Why? (Score:1, Informative)

    by Anonymous Coward on Friday December 14, 2012 @02:00PM (#42288977)

    Screen refresh rates are something totally different from field rates or frame rates.

    In the US, analog TV used to be broadcast in NTSC format, 320 x 240 pixels, 29.97 frames per second (progressive rate). Analog TVs have long supported 480 lines, though. In interlaced mode, the frames are drawn on screen in parts called "fields": first the odd field is drawn (all the odd lines) and then the even field (all the even lines). Each field persists for two refresh cycles, alternating in turns. This makes it look like a 480-line picture.

    Even with digital, the frame rate is still mostly 29.97fps, with some 24fps made directly from digitized film. All the "i-modes" (480i, 720i and 1080i) are interlaced in the same way as it was with analog, which makes 1080i look worse than 720p (IMO, at least). Nothing changed that much.

    So, although TVs can refresh the screen 120 times per second, DVDs, BDs, and digital broadcast TV still use the same frame/field rates as ever. And I don't think that will change in the future.
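    Rough numbers for that last point, as a sketch (standard NTSC/ATSC rates; the panel refresh is illustrative):

        field_rate = 59.94            # interlaced fields per second (NTSC/ATSC)
        frame_rate = field_rate / 2   # 29.97 full frames per second
        panel_refresh = 120           # a modern TV's refresh rate, Hz

        # each source frame is simply shown for several refresh cycles:
        print(panel_refresh / frame_rate)  # ~4 refreshes per source frame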

  • Re:Why? (Score:4, Informative)

    by Anonymous Coward on Friday December 14, 2012 @02:09PM (#42289073)

    I guess that depends on how you define it. On interlaced broadcast (like all old TV), you get a half-frame every 1/50 of a second, where a half-frame means either the even or the odd lines, alternating. However, in true interlaced broadcast (i.e. where the material was already recorded in that format, not transformed into it as when putting a movie on TV), it's not that you get the even and odd lines of the same image; each half-frame is recorded at its own point in time. So say you've got 50 half-frames per second: you'll get e.g. the even lines of an image at 0 ms, then the odd lines at 20 ms, then even lines at 40 ms, then odd lines at 60 ms, and so on. Only with converted material will the even and odd lines be from the same image.

    You can see that quite nicely when capturing a true interlaced-recorded TV program on the computer, where two half-frames are combined into a frame. If there's fast movement in the scene, you'll get striped frames, because your "frame" is actually the combination of two images from different times, with the even-line and odd-line images separated by 20 ms (50Hz) or 16.7 ms (60Hz). Given that those images are recorded at different times, I'd say it makes sense to consider them different frames, recorded at half the vertical resolution with a displacement of one line every second frame.
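    A sketch of that capture-side "weave" and the resulting combing, in Python (the two images are fake stand-ins for a scene with motion):

        import numpy as np

        h, w = 6, 8
        image_at_0ms = np.zeros((h, w), dtype=int)   # scene at time 0
        image_at_20ms = np.ones((h, w), dtype=int)   # scene 20 ms later, object moved

        woven = np.empty((h, w), dtype=int)
        woven[0::2, :] = image_at_0ms[0::2, :]       # even lines from the first field
        woven[1::2, :] = image_at_20ms[1::2, :]      # odd lines from the second field

        print(woven)  # alternating 0/1 rows: the striped combing artifact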

  • Re:Why? (Score:5, Informative)

    by Anonymous Coward on Friday December 14, 2012 @02:29PM (#42289295)

    The reason you think 720p looks better is frame rate. That's why ESPN and Fox Sports both use 720p for broadcast. In the US ATSC system, 1080i is interlaced at 59.94 fields per second, or 29.97 frames per second. 720p is progressive scan at 59.94 FRAMES per second.

    There is also a lesser quality version of 720p at 29.97, but broadcast 720p is 59.94 FRAMES per second. That's why it is better for fast-action sports, and looks much better than 1080i.

    720p-60 (as it is called) uses the same amount of broadcast bandwidth as 1080i-30; the raw pixel-rate arithmetic is sketched at the end of this comment.

    YIAABE (Yes, I Am A Broadcast Engineer). Posting anon since I am too lazy to log in.

    By the way, if your cable or satellite provider is giving you ESPN in 1080, they are downgrading the original format, but that bigger number impresses the idiots who don't know any better.
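    For reference, the raw (uncompressed) pixel-rate arithmetic behind the bandwidth claim above:

        p720_rate = 1280 * 720 * 59.94    # 720p-60: ~55.2 million pixels/sec
        i1080_rate = 1920 * 1080 * 29.97  # 1080i-30: ~62.1 million pixels/sec
        print(p720_rate / 1e6, i1080_rate / 1e6)

    The two raw rates are within about 12% of each other, which is why they end up occupying comparable broadcast bandwidth once compressed.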

  • by Zordak ( 123132 ) on Friday December 14, 2012 @02:48PM (#42289481) Homepage Journal

    old Dr. Who at best

    My desire to see The Hobbit just multiplied ten-fold.

  • Re:What makes it... (Score:4, Informative)

    by Dahamma ( 304068 ) on Friday December 14, 2012 @03:06PM (#42289855)

    If you have a modern medium- to high-end HDTV, turn on frame interpolation [wikipedia.org] processing (by whatever silly trademarked name your TV has for it) and watch an HD movie (especially one with sweeping pans, action, etc.). It's hard to quantify exactly why it's distracting, but it is sometimes described as a "soap opera" effect.

    It bugs me too, but it really is hard to say objectively why. I'd like to think it's about a subconscious feeling of "expansiveness" and uncertainty that you want with a more "epic" movie experience (since your brain has to do the interpolating instead of the TV, and maybe that engages you with the content differently).

    But there is also a strong argument that it's mostly your brain adjusting to something it has not experienced in this setting, and you will get used to it if exposed enough. Sort of like getting a new pair of glasses with a different shape/refractive index...
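    The crudest possible version of what the TV's interpolation is doing, as a sketch (real interpolators do motion estimation; this just blends neighboring frames):

        import numpy as np

        def midpoint_frame(frame_a, frame_b):
            # naive interpolation: per-pixel average of two neighboring frames
            return (frame_a.astype(float) + frame_b.astype(float)) / 2.0

        a = np.zeros((4, 4))
        b = np.ones((4, 4))
        print(midpoint_frame(a, b))  # a synthetic frame halfway between the two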

  • by vmxeo ( 173325 ) on Friday December 14, 2012 @03:09PM (#42289925) Homepage Journal

    Hi there. Technical director here. Just need to step in and clarify the relationship between frame rate and motion blur. I'm seeing a lot of posts calling for higher frame rates with more motion blur, as if they were two completely independent things. They're actually closely linked. Let me explain:

    Motion blur is the smearing of a moving object across the frame while the shutter is open. In photography, the time the shutter is open is called the shutter speed, and is used along with ISO and aperture to control the overall exposure. If you know anything about photography, this is pretty basic stuff.

    In the film world, the equivalent of shutter speed is what's known as shutter angle. This is because the shutter in a film camera is a spinning disk, of which a portion lets light through and a portion blocks it as it spins. The portion, measured in degrees, that lets the light in is the shutter angle. Typically, the shutter angle used in film is 180 degrees, meaning the film is exposed for half of each 1/24-second frame. In photographic shutter-speed terms, that would be the same as 1/48 (there's a quick code sketch of this at the end of this comment). Again, not too complicated.

    Here's the catch, though: because your film stock is rolling by at 24 frames per second, each frame can be exposed for at most 1/24 of a second. If you use a smaller shutter angle, or a faster frame rate, you get less motion blur. What this means is that there's no practical (by the film industry's definition of practical) way of getting more motion blur than your frame rate and shutter angle allow. The faster you go, the crisper the action will be.

    So at this point you're probably wondering: who cares about the amount of motion blur in a movie? The answer is: the audience. The industry has shot film at 24fps with a 180-degree shutter angle for so long that that's what everyone is used to. The last thing you want is to distract your audience from enjoying the movie because they know there's something different about the picture quality but can't figure out what.

    Finally, I'd like to point out that this choice of frame rate, like many other subjective decisions made during a movie production, is made at the director's discretion. Peter Jackson is going out on a limb by shooting a movie at this frame rate, and doubtless he has his reasons for doing so (mostly due to it being shot in 3D, as I recall), but it's still his call. The industry talk I hear views it as an experiment, and everyone's curious as to how it will work (or won't). If audiences do get used to it and like it, expect to see more movies shot like this, and given enough time it will become the new standard.
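    The shutter-angle-to-shutter-speed conversion mentioned above, as a quick sketch (the function name is mine):

        def exposure_time(frame_rate, shutter_angle):
            # time the shutter is open per frame = (angle / 360) * frame duration
            return (shutter_angle / 360.0) * (1.0 / frame_rate)

        print(exposure_time(24, 180))  # 1/48 sec, the traditional film look
        print(exposure_time(48, 180))  # 1/96 sec: faster rate, crisper motion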

  • Re:Why? (Score:5, Informative)

    by mcgrew ( 92797 ) * on Friday December 14, 2012 @03:52PM (#42291069) Homepage Journal

    In the US, analog TV used to be broadcast in NTSC format, 320 x 240 pixels

    Completely incorrect. Analog TV had no pixels at all, it had scan lines, and there were 525 of them. You only have pixels in digital media, not analog.

  • by DreadPiratePizz ( 803402 ) on Friday December 14, 2012 @05:19PM (#42293439)
    That's irrelevant. The CMOS sensor on the RED works the same way as a frame of film - it is activated and exposed to light in exactly the same way, for a fraction of a second depending on the desired shutter speed. The only difference is that the shutter is electronic (turning the sensor on and off) vs mechanical, and even new digital cameras like the Sony F65 have mechanical shutters exactly like film cameras.
  • by stanjo74 ( 922718 ) on Friday December 14, 2012 @09:12PM (#42297423)
    You got the science wrong. This has very little to do with FPS and everything to do with the stroboscopic effect film camera shutters introduce. Bear with me here for a moment while I explain.

    It matters very little whether you capture rapid motion at 30fps or at 1000fps; the motion still occurs at its natural speed, and it's only the amount of motion blur per frame that changes. The eye sends a continuous stream of signals to the brain, and the brain "sees". Most people have difficulty registering details of an object that moves faster than about 36 degrees per second, so across a roughly 180-degree field of view, anything that crosses your sight in under 5 seconds is blurred. That's not much frame rate, right there. I only give this example to demonstrate that a high frame rate is not that important for action.

    When you shoot film @ 24 fps, the photographic shutter does not stay open the whole 1/24 sec., because that would be too much motion blur and also too much exposure for the film at, say, f/2.8. Normally film is shot at shutter speeds of about 1/50 sec. This means that half of the 1/24 sec. of motion is NOT CAPTURED AT ALL (arithmetic at the end of this comment). Film creates a stroboscopic effect, and when played back through a projector that displays 1/50 sec. worth of action for 1/24 sec., it looks eerie, artsy.

    For the soap-opera look: cheap TV shows are shot with cheap video cameras that don't use short shutter times. The shutter is open for the whole duration of the field, 1/60 sec. for interlaced NTSC TV. The whole action is captured, with motion blur similar to film shot at a 1/60 sec. shutter. The playback is absolutely realistic, cheaply realistic.

    So, there you have it: TV @ PAL/50i and film with a "normal" 1/48 sec. shutter speed @ 24fps have a similar amount of motion blur. Film has a stroboscopic effect; TV does not.
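    The duty-cycle arithmetic behind the "half the motion is not captured" point above:

        frame_duration = 1 / 24   # each film frame covers ~41.7 ms of real time
        exposure = 1 / 50         # but the shutter is open for only ~20 ms of it
        print(exposure / frame_duration)  # ~0.48: the other ~52% is never recorded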
