
Hobbit Film Underwhelms At 48 Frames Per Second

bonch writes "Warner Bros. aired ten minutes of footage from The Hobbit at CinemaCon, and reactions have been mixed. The problem? Peter Jackson is filming the movie at 48 frames per second, twice the industry-standard 24 frames per second, lending the film a '70s-era BBC video look. However, if the negative response from film bloggers and theater owners is any indication, the way most people will see the movie is in standard 24fps."


  • Re:Can You Show Me (Score:3, Informative)

    by Anonymous Coward on Friday April 27, 2012 @06:07PM (#39827183)

    It looks like a soap opera.

  • by Surt ( 22457 ) on Friday April 27, 2012 @06:10PM (#39827231) Homepage Journal

    I don't have links handy, but they aren't terribly hard to find. Most of the population (more than 90%) can tell the difference between 24 and 48. Most (over 50%) can tell the difference on any 10fps jump (e.g. 60fps to 70fps) up to 80fps, IIRC. Beyond that it starts to dwindle, but there's still a substantial chunk (20ish%) that can tell a 10fps difference at 120fps. By 240fps you reach the point where basically no one can tell the difference between that and anything faster, no matter how much faster (e.g. 240 vs 480fps benefits basically no one).

  • by cpu6502 ( 1960974 ) on Friday April 27, 2012 @06:11PM (#39827245)

    Oops, you posted twice. :-) BBC video is 25 frames per second... so I don't understand the comparison.

    And HDTV is up to 60 frames per second; aren't people used to seeing a rapid frame rate by now? I guess people are just weird.

  • by Fahrvergnuugen ( 700293 ) on Friday April 27, 2012 @06:19PM (#39827343) Homepage

    Because the shutter is fixed, the exposure time of each frame is directly related to the frame rate. Lower frame rate = longer exposure = more motion blur in each frame. Higher frame rate = shorter exposure = less motion blur in each frame. You need more light to shoot at a higher frame rate to keep the same aperture setting.

    So, if they do project this at 24 frames per second (by throwing away half the frames in post), the remaining frames will not have the motion blur a native 24fps shoot would have, and it will actually look worse. This could probably be fixed in post, but that would be a pretty big hack for such a large production.
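
    To put rough numbers on that, here's a minimal sketch (plain Python; the 180° shutter angle is just an assumed typical value, not anything confirmed about this production):

    ```python
    # Per-frame exposure for a given frame rate and shutter angle.
    # 360 degrees = shutter open for the whole frame, 180 degrees = half the frame.
    def exposure_time(fps, shutter_angle_deg=180.0):
        """Exposure per frame, in seconds."""
        return (shutter_angle_deg / 360.0) / fps

    for fps in (24, 48):
        print(f"{fps} fps @ 180 deg shutter -> 1/{1 / exposure_time(fps):.0f} s per frame")
    # 24 fps -> 1/48 s of blur per frame; 48 fps -> 1/96 s of blur per frame.
    # Dropping every other frame of a 180-degree 48 fps shoot therefore leaves
    # 24 fps material with only half the blur a native 24 fps shoot would have.
    ```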

  • by Sycraft-fu ( 314770 ) on Friday April 27, 2012 @06:33PM (#39827493)

    The more fast motion/pans you have, the more noticeable framerate is. If I shoot someone sitting and talking there isn't a ton of difference between the 60fps source and 30fps final product (the AVCHD cameras I use shoot at 60fps progressive). You can see it, but it isn't something that jumps out at you. However if I shoot someone running, the difference is extremely noticeable.

  • by Fahrvergnuugen ( 700293 ) on Friday April 27, 2012 @06:35PM (#39827515) Homepage

    This wasn't shot on film. The exposure time in digital has nothing to do with the frame rate.

    I didn't realize it was shot digitally, but your statement isn't completely true. If you shoot at 48fps, then the slowest possible shutter speed you can have is 1/48th of a second in digital. Digital does give you the chance to have a faster shutter speed, though.

    Here's the kicker, though: with film you have to double it. So 24fps would give you a 1/48th shutter speed (half open, half closed), meaning the motion blur for 48fps digital vs 24fps film should be the same. Which explains why they picked 48fps - it afforded them the option to do either 48fps, slow motion, or 24fps in post without giving anything up (except disk space).
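
    A quick back-of-the-envelope check on that (assuming a 180° shutter for the 24fps film case and an electronic shutter left open for essentially the whole frame in the 48fps digital case):

    ```python
    # 24 fps film, 180-degree mechanical shutter: open for half of each 1/24 s frame.
    film_24_exposure = (180 / 360) * (1 / 24)   # = 1/48 s

    # 48 fps digital, shutter open for (almost) the entire frame interval.
    digital_48_exposure = 1 / 48                # = 1/48 s

    assert abs(film_24_exposure - digital_48_exposure) < 1e-12
    # Per-frame blur matches, so discarding every other digital frame in post
    # yields 24 fps footage with the same blur a native 24 fps film shoot would have.
    ```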

  • Re:Uh (Score:3, Informative)

    by medv4380 ( 1604309 ) on Friday April 27, 2012 @06:45PM (#39827627)
    Do you know the difference between the Frames in a Video game and the Frames in Film?

    The 24fps frames in film are all fluid frames. Games use still frames, since they have no natural motion blur, and when you increase the frame rate you start to approximate the blur by overlapping frames. This is the same audiophile/videophile BS, because your eye sees frames at a far lower rate but uses the blur to interpret motion. Reducing the blur in each frame by increasing the frame rate will screw with the motion of the image.

    Try reading a book [google.com] on the subject. The entire reason they went to 48fps was to try to reduce eye strain during 3D movies. They seem to have forgotten that a 72Hz refresh rate with a 24fps frame rate will do the same thing. Frame by frame, the 48fps will look better when it's still; however, the 24fps will look more natural to your eye when it's playing.

  • Re:Is it "too real"? (Score:5, Informative)

    by Nationless ( 2123580 ) on Friday April 27, 2012 @07:07PM (#39827889)

    They still add motion blur to almost every major 3D AAA game title out there you know.

    I dealt with the issue of motion blur a lot when working on 3D animated films... The problem was that un-blurred 3D animation can look a hell of a lot like claymation at times, due to the lack of blur in that workflow. The motion blur issue with games doesn't really have an equivalent, but to most people a game looks subconsciously better with it enabled, for reasons they can't explain. It will be interesting to see whether or not a 48fps cinema standard will affect the need for motion blur in games too!

  • Re:Is it "too real"? (Score:4, Informative)

    by similar_name ( 1164087 ) on Friday April 27, 2012 @07:16PM (#39827981)
    What I find interesting is that when film at 24fps is converted for NTSC at 30fps, six frames are added every second. It's more complicated than just duplicating every fourth frame, but it doesn't add any additional information either. 1 frame is added every second for PAL.

    On a side note, NTSC and PAL are what they are because TV was originally interlaced and ran at the frequency of the mains electricity. So in the U.S., TV used to run at 60 interlaced fields per second, producing 30 full frames, because the electricity is 60Hz. Countries that ran on 50Hz got 25fps.
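
    The "more complicated than duplicating every fourth frame" part is the classic 2:3 pulldown: each group of four film frames is spread across ten interlaced fields (five video frames). A rough sketch of the field pattern (ignoring the drop-frame 29.97 detail for simplicity):

    ```python
    # 2:3 pulldown: 4 film frames (A, B, C, D) become 10 NTSC fields,
    # i.e. 5 interlaced video frames, so 24 film fps -> 30 video fps.
    PULLDOWN = [2, 3, 2, 3]  # fields contributed by each film frame, repeating

    def pulldown_fields(film_frames):
        fields = []
        for frame, count in zip(film_frames, PULLDOWN * (len(film_frames) // 4 + 1)):
            fields.extend([frame] * count)
        return fields

    print(pulldown_fields("ABCD"))
    # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
    # Paired into video frames: (A,A) (B,B) (B,C) (C,D) (D,D) - two of the five
    # video frames mix fields from different film frames.
    ```
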
  • Re:Is it "too real"? (Score:5, Informative)

    by Anonymous Coward on Friday April 27, 2012 @07:31PM (#39828121)

    Why would you slow down the shutter speed during pans? That makes them even more blurry.

    Yes, that's exactly the point, and it was common practice for something like 70 years, so it's not a crazy avant-garde thing only a few people did. The basic idea is that blur masks judder, and since judder is worse than blur, people like it when you slow the shutter speed during pans. It's only since people stopped doing this, fairly recently, that everyone has suddenly started complaining about seeing judder everywhere.

    I'm not saying there aren't advantages to be had from 48fps, far from it. But 24fps judder suddenly got a lot worse rather recently, which is making it seem more necessary than it really is.

  • Re:Hybrid system (Score:4, Informative)

    by hack slash ( 1064002 ) on Friday April 27, 2012 @07:35PM (#39828167)
    What? Chop 'n' change frame rates throughout the film? Are you nuts?
  • Re:Is it "too real"? (Score:4, Informative)

    by hack slash ( 1064002 ) on Friday April 27, 2012 @07:43PM (#39828237)
    No, 1 frame is not added every second for film-to-PAL conversion; they simply play the footage back at 25fps and speed up the audio to match the new frame rate (which, yes, does affect the audio pitch).
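
    For a sense of how big that speedup is, a quick calculation (assuming straight 24fps-to-25fps playback with no pitch correction):

    ```python
    import math

    # 24 fps film played back at 25 fps for PAL: everything runs slightly fast.
    speedup = 25 / 24                                 # ~1.0417
    pitch_shift_semitones = 12 * math.log2(speedup)

    print(f"speedup: {(speedup - 1) * 100:.1f}%")                 # ~4.2%
    print(f"pitch shift: {pitch_shift_semitones:.2f} semitones")  # ~0.71
    # A 2-hour film loses about 4.8 minutes of runtime; modern transfers often
    # pitch-correct the audio, older ones just left it sharp.
    ```
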
  • by wonkey_monkey ( 2592601 ) on Friday April 27, 2012 @07:52PM (#39828319) Homepage

    Actually the "motion blur gap" is only half the width you think it is - in the olden days the shutter would be closed for half the time to allow the film to spool on, so each frame is a snapshot of 1/48th of a second.

    Shooting at 48fps, I would expect them to aim for a 1/96s shutter speed. I've worked on motion graphics at 50fps, and 100% motion blur still looks bad at the higher frame rate - 50% looks perfect.

    Increasing motion blur to 1/16s on a 48fps shoot would be a complete mess.

  • by Surt ( 22457 ) on Friday April 27, 2012 @08:19PM (#39828543) Homepage Journal

    24fps is actually the LOWER threshold - the level below which most people no longer perceive smooth motion.

  • Re:Uh (Score:5, Informative)

    by DreadPiratePizz ( 803402 ) on Friday April 27, 2012 @08:26PM (#39828605)
    And your opinion can be safely ignored. Did you know that in conventional 24fps film projectors, the shutter displays each frame twice? Do you know why? Because 24Hz would produce flicker! Old films which ran at 16fps flickered, because when projected they were being displayed at 32Hz. The concept of refresh rate certainly applies even to conventional cinema. You could construct a projector to display every frame five times, for 120Hz (which is what those new TVs do), or you could display each one once and have everybody's eyes explode.

    Just because it is shot at 24fps does not mean it has to be displayed at 24Hz.
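
    The flash-rate arithmetic is just frame rate times the number of flashes per frame; a tiny sketch with the common examples mentioned in this thread:

    ```python
    # Perceived flicker rate = frame rate x times each frame is flashed.
    # Multi-bladed shutters exist to push a 24 fps image above the
    # flicker-fusion threshold without adding any new frames.
    def flash_rate(fps, flashes_per_frame):
        return fps * flashes_per_frame

    print(flash_rate(16, 2))   # 32 Hz  - silent-era speed, still visibly flickery
    print(flash_rate(24, 2))   # 48 Hz  - standard double-bladed shutter
    print(flash_rate(24, 3))   # 72 Hz  - triple-bladed shutter
    print(flash_rate(24, 5))   # 120 Hz - what 120 Hz TVs do with 24 fps material
    print(flash_rate(48, 2))   # 96 Hz  - a 48 fps source flashed twice
    ```
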
  • Re:Is it "too real"? (Score:5, Informative)

    by lgw ( 121541 ) on Friday April 27, 2012 @08:35PM (#39828671) Journal

    How well do you tolerate the infinite fps you get when you look away from your computer screen?

    A lot of people get motion sickness from TV/monitors that are "too real". Keep the framerate or resolution down enough, and the brain knows it's just video, but HD at 60FPS looks too much like real vision, moving in this odd way decoupled from how your head moves.

    The "infinite" FPS causes a different group of people to become sick when riding in a boat (not all seasickness, but some), or an a car, because the gorizon is again moving unrelated to how your head is moving.

    I get sick playing any FPS with "head bob" turned on. 2 minutes and I'm out. Fortunately, almost all games let you turn that shit off.

  • Re:Is it "too real"? (Score:5, Informative)

    by BetterSense ( 1398915 ) on Friday April 27, 2012 @09:23PM (#39828943)
    I'm sure it's due partly to the use of faster film stocks. All the cool kids are using Kodak Vision 500T, which is insanely fast by historical standards. In black and white, Kodak no longer makes Plus-X (64-speed-ish) stock and only offers Double-X (200-ish). Slowing the shutter down with these fast films requires either a smaller aperture, possibly smaller than the cinematographer wants, or the use of an ND filter.

    At 24fps, a wide, judder-reducing shutter angle gets you a shutter speed of around 1/50th of a second. If you want anything less than deep focus, you need to use an aperture of around f/5.6. In sunlight, this would require a film speed of ISO 6. So yeah, I'm sure Vision 500T has a lot to do with it.
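
    That ISO 6 figure checks out against the sunny-16 rule; a rough sanity check (treating sunny 16 as "f/16 at 1/ISO seconds in full sun"):

    ```python
    import math

    # Find the ISO that makes f/5.6 at 1/50 s a correct exposure in full sun.
    f_number = 5.6
    shutter = 1 / 50          # seconds (roughly a 172-degree shutter at 24 fps)

    stops_wider = 2 * math.log2(16 / f_number)       # ~3 stops from f/16 to f/5.6
    required_iso = (1 / shutter) / (2 ** stops_wider)

    print(f"{required_iso:.1f}")   # ~6.1 - hence "ISO 6"
    # With Vision 500T loaded you'd be over 6 stops overexposed, so heavy ND
    # or a much narrower shutter angle is the only way out.
    ```
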
  • Re:Is it "too real"? (Score:5, Informative)

    by Y-Crate ( 540566 ) on Friday April 27, 2012 @10:23PM (#39829313)

    Yes, I am incapable of editing my own comment prior to posting. That should have been:

    I've noticed that too. I can never figure out why daytime soap operas look so much different than prime-time shows. Is it the framerate that does it? I was beginning to think that the crappy dialogue and crappy plot were becoming visible.

    It's the frame rate + lighting.

    Shows like Community or 30 Rock are what's known as single-camera. They are lit and shot as though they're feature films. This takes time. I love watching people visit a set for the first time and witness the hours it can take to perfect the lighting for a single shot of a single scene, which may be built from multiple shots and end up as a few seconds of screen time in the finished product. But it looks cinematic. You get shadows and a true sense of depth. Frankly, it just looks more interesting than the alternative.

    On the other hand, shows like The Big Bang Theory or Whitney are multi-cam. Multiple cameras run simultaneously and capture the entire scene at once. Consequently, the sets are lit so they can be shot from a whole bunch of angles without moving lights. Everything looks very flat and very stage-y. Even real-world props often fall into a strange uncanny valley.

    Check out any episode of 30 Rock and then one of the live episodes if you want to see a great comparison between single and multi-cam.

  • Re:Change (Score:5, Informative)

    by RobbieThe1st ( 1977364 ) on Friday April 27, 2012 @10:35PM (#39829365)

    I call "not understanding technology" on the post above: Most screens these days only will update at 60hz, especially larger ones. Even 1280x1024 screens will only do maby 85hz.
    Unless you're using a CRT or a Nvidia 3D Vision compatible monitor, you're not getting more frames than that /displayed to you/.
    Which means the frames are simply dropped, and thus won!t look any better. You'd be better off enabling vsync, so you've got a constant maximum fps, at whatever rate your monitor is set to, and not wasting frame rendering time.
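
    A toy illustration of the point (just counting, not tied to any particular GPU API):

    ```python
    # Frames rendered beyond the display's refresh rate are never shown;
    # with vsync the renderer waits instead of wasting the work.
    def frames_displayed(rendered_fps, refresh_hz, seconds=1):
        return min(rendered_fps, refresh_hz) * seconds

    print(frames_displayed(200, 60))   # 60 - the other 140 rendered frames are discarded
    print(frames_displayed(48, 60))    # 48 - below refresh, everything gets shown
    ```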

  • Re:Is it "too real"? (Score:5, Informative)

    by aarku ( 151823 ) on Saturday April 28, 2012 @12:02AM (#39829749) Journal
    Han didn't shoot first. Han just shot. Greedo died. Get it right!
  • Re:Is it "too real"? (Score:4, Informative)

    by Purity Of Essence ( 1007601 ) on Saturday April 28, 2012 @01:18AM (#39830031)

    >But 48fps is simply smoother, and just as they are able to fake up 3D on films that were never shot that way, they will be able to digitally fake up the extra frames between every 24fps frame and re-release all those old films in Astounding 48 FPS, New and Improved, Digitally Remastered, For a Limited Time Only....

    Yeah, my TV did that (interpolated 24fps into 120fps) until I turned off its "motion enhancement". I hated the effect. Somehow the picture seemed artificial and less clear, even though the action was arguably smoother. Motion interpolation is much better understood and easier to implement than faking 3D, but it still produces bad results.

    Motion interpolation generally only works well for a very small subset of common visual imagery. Complex motion confuses it, often obliterating the original motion, which makes things look subtly unreal, dreamlike, or otherwise confusing to the viewer. Discrete sampling and reconstruction filters, which are guaranteed to be sub-optimal, intensify the problem. When the video source is a DVD or some other video that's been wrung through the motion estimation process at least once already, it can only get worse. Garbage in, garbage out, Chinese whispers, turd polish, and all that rhythm.
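
    For a feel of why interpolation looks "off", here is a deliberately naive sketch that doubles a frame rate by blending neighbouring frames. Real TV interpolators do block-based motion estimation rather than blending, but even those have to guess, and complex motion is where the guesses break down; the crude version below just makes the failure mode (ghosting) obvious:

    ```python
    import numpy as np

    def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
        """A synthetic frame a fraction t of the way from frame_a to frame_b (naive blend)."""
        return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

    def double_frame_rate(frames):
        """Turn a 24 fps sequence into 48 fps by inserting blended in-betweens."""
        out = []
        for a, b in zip(frames, frames[1:]):
            out.append(a)
            out.append(interpolate(a, b, 0.5))
        out.append(frames[-1])
        return out
    ```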
