
Why The Hobbit's 48fps Is a Good Thing

Posted by Soulskill
from the in-places-deep-where-dark-things-sleep dept.
An anonymous reader writes "Last year, when we discussed news that The Hobbit would be filmed at 48 frames per second, instead of the standard 24, many were skeptical that format would take hold. Now that the film has been released, an article at Slate concedes that it's a bit awkward and takes a while to get used to, but ends up being to the benefit of the film and the entire industry as well. 'The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes. Yet it's also something that you've never seen before, and is, in its way, amazing. Taken all together, and without the prejudice of film-buffery, Jackson's experiment is not a flop. It's a strange, unsettling success. ... It does not mark the imposition from on high of a newer, better standard — one frame rate to rule them all (and in the darkness bind them). It's more like a shift away from standards altogether. With the digital projection systems now in place, filmmakers can choose the frame rate that makes most sense for them, from one project to the next.'"
This discussion has been archived. No new comments can be posted.


  • Why? (Score:2, Insightful)

    by davydagger (2566757)
    A lot of the magic of film was 24fps.

    sure it's outdated, but so is 48 fps.

    broadcast TV has been 50 for years, with more recent forays with high def into 120Hz (no idea of the actual frame rate with digital, but I could imagine it's up there)

    why are they doing this now? and why only 48fps?
    • Re: (Score:2, Informative)

      by Anonymous Coward

      why are they doing this now? and why only 48fps?

      Follow the slashdot link in the summary, it was discussed extensively there, no need to derail yet another thread with it.

    • why do you think it lasted sooooo long .... ;-p

    • Re:Why? (Score:5, Informative)

      by fastest fascist (1086001) on Friday December 14, 2012 @12:32PM (#42288597)
      I'm not aware of broadcasts in 50 FPS. AFAIK, they're being evaluated, but basically material is broadcast at 25 or 30 fps, depending on the standard used. These conform to the old PAL/NTSC/SECAM framerates. Interlaced formats, however, can be 50 or 60, but that's because each frame is essentially split into two frames of alternating horizontal lines, "fields".
      • Re:Why? (Score:4, Informative)

        by Anonymous Coward on Friday December 14, 2012 @01:09PM (#42289073)

        I guess that depends on how you define it. On interlaced broadcast (like all old TV), you get a half-frame 50 times per second, where half-frame means either the even or the odd lines, alternating. However, in true interlaced broadcast (i.e. where the material was already recorded in that format, not transformed into it as when putting a movie to TV), it's not that you get the even and odd lines of the same image; each half-frame is recorded at its own time. So say you've got 50 half-frames per second: you'll get e.g. the even lines of the image at 0ms, then the odd lines at 20ms, then the even lines at 40ms, then the odd lines at 60ms, and so on. Only with converted material will the even and odd lines be from the same image.

        You can see that quite nicely when capturing a true interlaced-recorded TV program on the computer, where two half-frames are combined into a frame. If there's fast movement in the scene, you'll get striped frames, because your "frame" is actually the combination of two images at different times, with the even- and odd-line images separated by 20 ms (50Hz) or 16.7 ms (60Hz). Given that those images are recorded at different times, I'd say it makes sense to consider them different frames, recorded at half the vertical resolution with a displacement of one line every second frame.
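        The field timing described above can be sketched numerically (a toy illustration only, assuming a 50 Hz interlaced signal and a hypothetical `field_schedule` helper):

```python
# Toy sketch of interlaced field timing at 50 fields per second (50 Hz).
# Each field carries only the even or the odd scan lines and is captured
# at its own instant, 20 ms apart -- as the comment above describes.

FIELD_RATE_HZ = 50
FIELD_INTERVAL_MS = 1000 / FIELD_RATE_HZ  # 20 ms between successive fields

def field_schedule(n_fields):
    """Return (time_ms, parity) for the first n_fields half-frames."""
    schedule = []
    for i in range(n_fields):
        parity = "even" if i % 2 == 0 else "odd"
        schedule.append((i * FIELD_INTERVAL_MS, parity))
    return schedule

print(field_schedule(4))
# Even and odd lines come from different instants, which is why naively
# weaving two fields of a fast-moving scene produces striped "combing".
```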

      • by dgatwood (11270)

        But the reason for using fields is so that the effective frame rate perceived is 60 frames per second, just at half the vertical resolution. Each field contains a different image than the field before it. So we're used to relatively fast frame rates. If people find 48 FPS distracting, the reason is probably either a psychological "uncanny valley" thing—because it feels almost like TV but not quite—or perhaps it just happens to be a magic speed that makes people uncomfortable for some reason.

      • I'm not aware of broadcasts in 50 FPS. AFAIK, they're being evaluated, but basically material is broadcast at 25 or 30 fps, depending on the standard used. These conform to the old PAL/NTSC/SECAM framerates. Interlaced formats, however, can be 50 or 60, but that's because each frame is essentially split into two frames of alternating horizontal lines, "fields".

        720p60 is a common broadcast format and a few European broadcasters do 720p50 (presumably to ease upscaling 25 FPS SD content). It seems 1080i50 is more popular over there though (annoyingly, I despise interlacing and would much rather have seen 1080p30 and 1080p25 become the broadcast standards rather than their crappy interlaced counterparts).

    • by gagol (583737)
      I designed and produced many corporate events in my life, most of them involving video animations on screens as large as 60 feet wide. As soon as technology allowed me, I produced and projected video animations in 60fps to make pans more fluid. Were I producing a movie today, I would try to shoot in 60fps for the same reason: much more fluid motion on big screens.
      • Re:Why? (Score:4, Interesting)

        by nabsltd (1313397) on Friday December 14, 2012 @01:52PM (#42289529)

        As soon as technology allowed me, I produced and projected video animations in 60fps to make pans more fluid. Would I be producing a movie today, I would try to shoot in 60fps for the same reason, much more fluid motion on big screens.

        If Jackson had chosen 60fps for The Hobbit, it would have been a much better choice, at least as far as home video is concerned.

        With a choice of 48fps as the source, we are going to get stuck with a much lower quality home video release, because there is no current format that allows at least 48fps and 1920x1080 resolution. So, to convert to 24fps, either the original footage will have to be filmed at 24fps, or else some sort of digital interpolation will have to be done. Neither will give the same quality that we have come to expect from current media, as instead of 24 frames per second where scenes with little motion have very sharp frames, pretty much every frame will show some sort of motion.
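        The down-conversion the parent describes can be illustrated with a minimal sketch (hypothetical frame lists, not any real tool): dropping every other frame of a 48fps source yields 24fps, but each surviving frame still carries the shorter-exposure look of the 48fps capture.

```python
# Minimal sketch: converting a 48 fps sequence to 24 fps by dropping
# every other frame. The frame numbers here are purely illustrative.

def decimate_48_to_24(frames):
    """Keep every second frame: 48 fps -> 24 fps."""
    return frames[::2]

source = list(range(48))          # one second of 48 fps footage
home_video = decimate_48_to_24(source)
print(len(home_video))            # 24 frames remain for that second
```

Decimation is clean arithmetically, but as the parent notes, each kept frame was still exposed for a 48fps-length interval, so the result does not look like natively shot 24fps film.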

      • It's also much more effective to down-convert from 60 fps to 60i for broadcast than to up-convert from 24 or even 48 fps to 30i

    • broadcast TV has been 50 for years

      Clarification: That was 50i for PAL and 60i for NTSC. Because of the interlace the FPS was actually 25fps and 30fps respectively.

    • Re:Why? (Score:4, Funny)

      by Tough Love (215404) on Friday December 14, 2012 @04:45PM (#42294101)

      A lot of the magic of film was 24fps.

      Oh yes, like wagon wheels going backwards. I also pine for the days of scratches, dust spots and pubic hairs on the big screen. And nothing but nothing beats the exhilaration of watching the celluloid melt because the projector stalled.

  • by Hatta (162192)

    Where can I see the Hobbit in 48FPS?

  • What makes it... (Score:2, Insightful)

    by Anonymous Coward

    distracting? Since the film seems to be getting panned a lot, does this maybe have something to do with it?

    • Re:What makes it... (Score:4, Informative)

      by Dahamma (304068) on Friday December 14, 2012 @02:06PM (#42289855)

      If you have a modern medium to high end HDTV, turn on frame interpolation [wikipedia.org] processing (by whatever silly trademarked name your TV has for it) and watch an HD movie (especially one with sweeping pans and action, etc). It's hard to quantify exactly why it's distracting, but is sometimes described as a "soap opera" effect.

      It bugs me too, but it really is hard to objectively say why. I'd like to think it's about a subconscious feeling of "expansiveness" and uncertainty (since your brain has to interpolate instead of the TV, and maybe your brain interpolating engages you with the content differently, etc) that you want with a more "epic" movie experience.

      But there is also a strong argument that it's mostly your brain adjusting to something it has not experienced in this setting, and you will get used to it if exposed enough. Sort of like getting a new pair of glasses with a different shape/refractive index...

  • by Andy Prough (2730467) on Friday December 14, 2012 @12:25PM (#42288485)
    maybe Jackson should just try actually shooting the whole story this time. Hey Merry - where'd you get that cool magic blade that killed the Witch King? "Errr.... well err ummm. See there were these barrows, but we had to cut that from the story, but - hey, Liv Tyler is hot, right??"
    • by Anonymous Coward

      Is this the part where I have to pretend that the whole Tom Bombadil segment wasn't the most poorly-written part in the books?

    • by gfxguy (98788)

      At the same time I love the movies (and just got the extended BR edition), I'm sad that they weren't as 'faithful' as they could have been to the books, and worse is that it was such a huge endeavor that it's not likely to be tried again for at least a very long time, if ever. I more than understand cutting out a part like Tom Bombadil (and changing and cutting various other things) for the sake of brevity, but that's not what they did - they cut it for the sake of stuff that wasn't in the books at all.


  • for decades i could never figure out why they could do it on TV soap operas and some sit coms but not on movies that cost a lot more money to make

    i'll take a blu ray of an older movie over the grainy theater crap quality any day

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Blu-rays of old movies are still 24p; they don't magically add frames that don't exist, and interpolation generally looks like ass.

  • From the reviews I've seen so far, no one seems to enjoy the 48fps. Even mainstream reviewers have referred to it as 1970s video smooth, "old Dr. Who at best." (Paraphrasing from the CNN review this morning). Maybe The Hobbit is the sacrificial movie which needed to be made and receive this kind of backlash, in order to never have such an awful-looking "feature" used in film again.
    • by Zordak (123132) on Friday December 14, 2012 @01:48PM (#42289481) Homepage Journal

      old Dr. Who at best

      My desire to see The Hobbit just multiplied ten-fold.

  • Not having seen the movie, and not being old enough to remember if this happened in the golden age of movies:
    how does it look different in the theater at 48fps vs. normal 24fps movies?

    • by tooyoung (853621)
      It is difficult to describe without seeing it. You become much more conscious that you are looking at actors wearing costumes standing on a set. I've had a similar sense with some movies in blu-ray, although this is different. Now, 48fps is really cool for scenes with perspective motion, as you feel tricked into thinking you are part of the scene.
  • by wwalker (159341) on Friday December 14, 2012 @12:34PM (#42288635) Journal

    When playing a game, I can easily tell if it's running at 30 fps or 60 fps, and I *much* prefer the higher framerate, for obvious reasons. It'll definitely take a bit of getting used to when it comes to movies, but it is no doubt a good thing.

    • Re: (Score:3, Insightful)

      by BergZ (1680594)
      Agreed. The kvetching over the transition from 24 fps to 48 fps reminds me of the transition from incandescent bulbs to compact fluorescent or the transition from records to CDs. It strikes me as nostalgia for a (mostly) inferior product.
    • by Waccoon (1186667)

      Everyone tells me I'm crazy when I say that windowed games and videos play really sluggish in Windows 7 compared to XP. The reason is that the new window manager uses 3D hardware to do compositing, and for some reason updates seem to be locked in at 30 FPS (my guess is 50% of the monitor refresh rate). XP updated at full blast and provided much, much smoother games and video. I've noticed a rather huge loss in overall performance on my new Win7 machine, despite it being massively more powerful than my old XP machine.

  • by Twinbee (767046) on Friday December 14, 2012 @12:38PM (#42288691) Homepage
    I'm all for video and motion being at 48fps, and maybe even 100fps+ for super smoothness, which will also help cure motion blur (without the use of black flickery interspersed sub-frames). Heck, why stop there: 240 or 300fps will help for compatibility, and allow us even smoother motion.

    HOWEVER..., critics argue that the Hobbit feels less 'dream-like' and 'too real'. Even though I disagree with them to an extent, I recently played a game called Nitronic Rush [nitronic-rush.com] (fast free Wipeout clone, with tron-esque graphics, great fun btw). I set it to 60fps, but the graphics are 'enhanced' by motion blur, which 60fps normally doesn't 'need'. We're talking at least a couple of frames worth, and maybe up to 5 frames worth of artificial motion blur. However, I find this actually gets the best of both worlds. You get the smoother motion so that your eyes don't ache, and any fast panning looks convincing. But you also get the cinematic 'blurry' look that 24fps films provide (24fps film techniques employ motion blur naturally, or at least something similar to motion blur).

    I think 60fps with this kind of motion blur may have a big future for it.
    • by avandesande (143899) on Friday December 14, 2012 @12:56PM (#42288923) Journal
      I wonder if there is something more to this than just people being used to old technology. Perhaps there is something like a visual 'uncanny valley'?
    • by Tumbleweed (3706)
      The ridiculously high-bandwidth 'ultimate' solution would be to record the original in 120fps, which would be able to downgrade evenly to 24 or 30fps, depending on the format you want to output to.
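      The parent's point is a divisibility argument, easy to verify (a trivial sketch, nothing more):

```python
# 120 fps divides evenly by both target rates, so a 120 fps master
# could be decimated cleanly to 24 fps or 30 fps with no judder
# from uneven pulldown.
assert 120 % 24 == 0 and 120 % 30 == 0
print(120 // 24, 120 // 30)  # keep every 5th frame for 24, every 4th for 30
```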
    • by stanjo74 (922718) on Friday December 14, 2012 @08:12PM (#42297423)
      You got the science wrong. This has very little to do with FPS and everything to do with the stroboscopic effect film camera shutters introduce. Bear with me here for a moment while I explain.

      It matters very little whether you capture rapid motion at 30fps or 1000fps - the motion still occurs at its natural speed; it's the amount of motion blur per frame that changes. The eye sends a continuous stream of signals to the brain and the brain "sees". Most people have difficulty registering details about an object that moves faster than 36 degrees per second. So for roughly 180 degrees field of view, anything that crosses your sight in under 5 seconds is blurred. That's not much frame rate, right there. I only give this example to demonstrate that high frame rate is not that important for action.

      When you shoot film @ 24 fps, the photographic shutter does not stay open the whole 1/24 sec., because that would be too much motion blur and also too much exposure at, say, F2.8 for the film. Normally film is shot at shutter speeds of about 1/50 sec. This means that half of the 1/24 sec. of motion is NOT CAPTURED AT ALL. Film creates a stroboscopic effect, and when played back through a projector that displays 1/50 sec. worth of action for 1/24 sec., it looks eerie, artsy.

      For the soap opera look, cheap TV shows are shot with cheap video cameras which do not have light shutters. Shutter is open for the duration of the frame - 1/60 for interlaced NTSC TV. The whole action is captured with motion blur similar to film (film at 1/60 sec. shutter). The playback is absolutely realistic, cheaply realistic.

      So, there you have it: TV @ PAL/50i and film with "normal" shutter speed 1/48 sec. @ 24fps have similar amount of motion blur. Film has a stroboscopic effect, TV does not.
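      The stroboscopic gap described above is a two-line calculation (a toy sketch using the comment's own 24 fps / 1/50 s numbers; `exposure_fraction` is a made-up helper name):

```python
# Toy arithmetic for the stroboscopic effect described above: at 24 fps
# with a 1/50 s shutter, what fraction of each frame interval is actually
# exposed, and what fraction of the motion is never captured at all?

def exposure_fraction(fps, shutter_s):
    """Fraction of each frame interval during which the shutter is open."""
    frame_interval = 1.0 / fps
    return shutter_s / frame_interval

captured = exposure_fraction(24, 1.0 / 50)   # 0.48 of each interval
missing = 1.0 - captured                     # 0.52 never captured
print(round(captured, 2), round(missing, 2))
```

So slightly more than half of the action between frames simply never reaches the film, which is the gap the eye perceives as the "film look".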

  • Obligatory xkcd... (Score:4, Informative)

    by Dr. Manhattan (29720) <sorceror171@gmai ... om minus painter> on Friday December 14, 2012 @12:43PM (#42288765) Homepage
    https://www.xkcd.com/732/ [xkcd.com]. Especially the alt-text.
  • by guidryp (702488) on Friday December 14, 2012 @01:01PM (#42288995)

    Most of the time you can't even tell the difference between frame rates, except when it emerges as artifacts at 24 fps.

    24 fps movies are purposefully shot with more motion blur to hide the jerkiness. But nothing really gets around it when panning.

    So 24fps primarily equals artifacts: Blurring, jerky motion, and juddering pans.

    How nonsensical is it, and how resistant to change do you have to be, to worship these artifacts? They are no more beneficial than ticks and pops were on vinyl. There is a certain nostalgia value to listening to something with ticks and pops sometimes, but it isn't something we put everywhere because we can't do without it.

    So these change-resistant Luddites, in love with quite irritating artifacts, have taken to calling superior motion video with less blur, less judder and less jerking "The Soap Opera Effect".

    Do a freeze frame on a soap opera and a good movie. You can still tell which is which when frozen. Soaps look like crap because they have crap production values: poor sets, poor lighting, poor cameras, shot without any flair.

    Shoot 48fps (or 60 fps or 120 fps for that matter) with great sets, great lighting, great cameras and great flair and it will be amazing and have nothing in common with soap operas.

    • To be fair there is more to it than just "24fps has unwanted artefacts". Most people will probably remember when they first saw Saving Private Ryan because the film stock and shutter speed used gave it a very realistic, un-blurry and gritty look. One of the biggest reasons it has taken so long for digital cinema cameras to become popular is that the early ones were unable to replicate the effect of using particular well known film stocks and camera techniques.

      Star Wars is another good example to look at. Part of the charm of the original movies is that they were a bit rough around the edges. The film had a fair bit of grain that made the Star Wars universe look a bit grubby and used, rather than sleek and clean like Star Trek. The later trilogy was crisp and clean, and ended up looking more like generic sci-fi than the Star Wars we loved.

      48fps is still in its infancy and it will take some time for cinematographers and directors to figure out how to get the effect they want from it. In the end the result will be better than 24fps, but that doesn't automatically mean that the early examples will be particularly good. 3D was the same; everything looked terrible until Avatar finally figured out how to use it and still look like a movie and not give you a brain aneurysm.

    • by vmxeo (173325) on Friday December 14, 2012 @02:09PM (#42289925) Homepage Journal

      Hi there. Technical director here. Just need to step in and clarify the relationship between frame rate and motion blur. I'm seeing a lot of posts that are calling for higher frame rates with more motion blur, as if they are two completely independent things. They're actually closely linked. Let me explain:

      Motion blur is the effect of an object moving within the frame while the shutter is open. In photography, the time the shutter is open is called the shutter speed, and is used along with ISO and aperture to control the overall exposure. If you know anything about photography, this is pretty basic stuff.

      In the film world, the equivalent of shutter speed is what's known as shutter angle. This is because the shutter for a film camera is a spinning disk, of which a portion lets light through and a portion blocks it as it spins. The portion, measured in degrees, that lets the light in is the shutter angle. Typically, the shutter angle used in film is 180 degrees, meaning that during half of that 1/24-second frame interval, the film is being exposed. In photographic shutter speed terms, that would be the same as 1/48. Again, not too complicated.

      Here's the catch though: because your film stock is rolling by at 24 frames per second, each frame can only be exposed for 1/24 of a second or less. If you use a smaller shutter angle, or faster frame rate, you get less motion blur. What this means is there's no practical (the film industry definition of practical) way of getting more motion blur than your frame rate and shutter angle allows. The faster you go, the crisper the action will be.

      So at this point you're probably wondering who cares about the amount of motion blur in a movie? The answer is: the audience. The industry has shot film at 24fps with a 180-degree shutter angle for so long that that's what everyone is used to. The last thing you want is to distract your audience away from enjoying the movie because they know there's something different about the picture quality but can't figure out what.

      Finally, I'd like to point out that this choice of frame rate, like many other subjective decisions made during a movie production, is made at the director's discretion. Peter Jackson is going out on a limb by shooting a movie at this frame rate, and doubtless he has his reasons for doing so (mostly due to it being shot in 3D, as I recall), but it's still his call. The industry talk I hear views it as an experiment, and everyone's curious as to how it will work (or won't). If audiences do get used to it and like it, expect to see more movies shot like this, and in enough time it will be the new standard.
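      The shutter-angle arithmetic in the parent can be written out directly (a small sketch; the 180° / 24 fps figures are the conventional ones the parent cites, and `shutter_speed` is an illustrative helper, not a real API):

```python
# Shutter angle -> equivalent photographic shutter speed, as described
# in the parent comment: exposure time = (angle / 360) * (1 / fps).

def shutter_speed(fps, angle_deg):
    """Exposure time in seconds for a rotary shutter."""
    return (angle_deg / 360.0) / fps

# Conventional film: 24 fps with a 180-degree shutter -> 1/48 s
print(shutter_speed(24, 180))

# The same 180-degree shutter at 48 fps halves the blur: 1/96 s,
# which is why higher frame rates inherently capture crisper motion.
print(shutter_speed(48, 180))
```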

      • Informative post but you forgot one _tiny_ detail about The Hobbit.

        They are shooting everything DIGITAL.

        --
        Classmates.com are a bunch of scammers.

  • by roc97007 (608802) on Friday December 14, 2012 @01:47PM (#42289457) Journal

    > The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes.

    ...because Lord knows, that's what I always look for in a movie. A presentation that is weird and distracting as hell.

    That said, like 3D, you do get a choice, so no harm, no foul. We will be seeing the film in 2D, 24 FPS, because 3D gives my wife migraines and because of reports in New Zealand of motion sickness-like symptoms among viewers there.

    Parenthetically, I predict that the people who love 48 FPS will include a high percentage of people who can play first-person shooters for hours without motion sickness, and conversely, the people who don't like it will be those who can't. Though I don't think anyone is collecting this metric, sadly.

    I will see the film at the faster frame rate (and in 3D because I believe that's your only choice at 48 FPS) but I want to see it "normal" first.

    I don't feel qualified to judge the technology, not having viewed it yet, but the most interesting criticisms that have come out of advance showings so far are that the sets look more like sets, which disrupts one's ability to suspend disbelief, and that the depth of field tends to be very deep, with everything in focus, which makes things look weird (because the human eye doesn't see that way). Speed Racer did the same thing, intentionally. (Speaking of which, it appears that I'm the only one who liked Speed Racer.)

    ...which brings me to my point. This doesn't make the film any less artistic. In actual fact, Jackson's use of the technology is an artistic choice. It may not be a choice that everyone likes, and it may disrupt what we've come to believe are common artistic choices (directing audience attention through depth of field, motion blur to indicate movement, softer focus for effect), but that doesn't make it any less artistic. Now, whether it's a *commercial* success remains to be seen. I strongly suspect that the 48 FPS showings will be crowded because it's a new thing. Whether people will flock to the next film in that format remains to be seen.

  • by Thagg (9904) <thadbeier@gmail.com> on Friday December 14, 2012 @03:50PM (#42292741) Journal

    I've been doing computer animation for 35 years, as long as it has existed. Back in the early 80's, I worked on some early 60 field-per-second animation, and I have been a convert to high-frame-rate footage ever since. (The opening to the PBS show NOVA was perhaps the first 60fps animation ever done.) When we started doing broadcast graphics (show openings, things like that) for TV, we naturally did them at 60fps, and that looked right as it worked with the rest of video. Finally, though, we moved into advertising, and TV advertising was (and still is) typically 24fps. And it bothered me!

    But then, something changed my mind completely. We were doing an ad for Snacky, a Japanese snack food company. There was the required silly animated spokespuppet, and we modeled it and made it perform. Part of doing animation is doing the lip-sync, and the company gave us the dialogue in English to animate to. We did this, although it didn't seem right -- expecting them to give us the Japanese soundtrack eventually.

    But no, it got to a couple of days before delivery, and the character was still speaking English, and we asked the customer when he came to review the work. "This is only going to be shown in Japan, right?" "Oh, yes, yes!", "And you're going to dub it into Japanese, right?" "Of course! Yes!" "But the lip sync is to an English sound track, the lips are not going to match the dialogue!" "YES! JUST LIKE ALL GOOD ANIMATION!"

    Because in that day, lip-sync that was correct in Japanese meant it was low-quality domestic animation, whereas if the lip-sync didn't match, it was high-quality American animation. Nobody can tell me that wrong lip-sync is in any way superior -- except that there were 150 million people in Japan who would see it that way instinctively and immediately.

    So, I became a happy convert to 24 fps animation. I applaud Peter Jackson for his incredibly audacious experiment, and I hope he succeeds, but he has to fight the near-instinctive reaction from a lot of people who see 48 fps as video.

    I think that part of the problem with The Hobbit at 48fps is that the screens are so terribly dark that you just can't appreciate the high frame rate. Your eye integrates dark scenes over a long period of time, and at 48 fps with the very very dark 3D screens, I believe that your eye smears the frames together. On Transformers III, I removed all the motion blur from the very dark scenes, because even at 24 fps they got smeary.

  • by ducomputergeek (595742) on Friday December 14, 2012 @04:30PM (#42293739)

    And two things I have to say:

    1) If you get the least bit motion sick, don't go see it at the high frame rate in 3D. Normally I don't, even when seeing IMAX/OMNIMAX, but this film I did.

    2) The 48 frames per second and 3D make certain parts of the film like watching a live stage production. The problem then becomes the post-production. There were a lot of scenes where you could tell the background was composited, and with so much CGI, some of it was like going back and watching CGI from 15 years ago.

    That was one of the things I liked about the LOTR movies: especially by the third movie, the CGI had gotten so good that it was largely seamless. You didn't notice it; it was just part of the story. In this one I noticed it and often found myself cringing.

  • by peter303 (12292) on Friday December 14, 2012 @05:01PM (#42294457)
    At that time film was very expensive. Producers then preferred a minimal frame rate to save cost. Some Nickelodeons ran at 10 fps. The guy in Hugo used 16 fps. Since Edison was one of the inventors of motion pictures, he may have wanted to sell more film stock.
