
Hobbit Film Underwhelms At 48 Frames Per Second 607

bonch writes "Warner Bros. aired ten minutes of footage from The Hobbit at CinemaCon, and reactions have been mixed. The problem? Peter Jackson is filming the movie at 48 frames per second, twice the industry standard 24 frames per second, lending the film a '70s era BBC-video look.' However, if the negative response from film bloggers and theater owners is any indication, the way most people will see the movie is in standard 24fps."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday April 27, 2012 @06:00PM (#39827109)

    What proportion of the population can actually tell the difference between 24fps and 48fps? Have there been any peer-reviewed studies to find out?

  • Can You Show Me (Score:5, Interesting)

    by Anonymous Coward on Friday April 27, 2012 @06:02PM (#39827123)

    Could you show me what this "70s era BBC-video look" is? Despite having seen lots of 70s-era BBC video, I can't tell what you're talking about from that description.

  • by Surt ( 22457 ) on Friday April 27, 2012 @06:06PM (#39827171) Homepage Journal

    I'm one of the lucky few with sensitive eyes. Watching movies at 24 fps is jarring. I can't wait until they move up to 60 or 120.

  • Modern 120Hz+ HDTVs (Score:1, Interesting)

    by SpryGuy ( 206254 ) on Friday April 27, 2012 @06:13PM (#39827275)

    Anyone who has watched a movie on a modern 120Hz+ HDTV knows exactly what they're talking about.

    Suddenly "film" looks like "video", and it "just doesn't look right". To the point of being annoying.

    And it's so clear, that sometimes you can see make-up lines on necks, and other signs of "fakery" used in productions, that totally take you out of the moment and spoil the suspension of disbelief.

    When I got my new HDTV, I had to spend an hour or two playing with the settings to "detune" the image so it wasn't so damn clear and sharp and, for lack of a better word, "shiny". It took a while to get the colors looking okay, to get the sense of motion/motion-blur right, etc.

    It's still not perfect, but at least it's not visually jarring and annoying.

    I have to wonder if, when the movie is distributed, there will be guidelines for configuring the digital projectors to optimize the movie experience for viewers not used to the "new" look...

  • Psychological? (Score:4, Interesting)

    by Kylon99 ( 2430624 ) on Friday April 27, 2012 @06:13PM (#39827279)

    "THE HOBBIT, frankly, did not look cinematic."

    Is it because we are conditioned to read a low frame rate as "movie"? I once saw a first-person shooter running at 60 fps, didn't realize right away that it was a game rather than a movie, and my brain's first, immediate response was "wtf is this?!" Different frame rates seem to signal different kinds of "experience": a game, a TV broadcast, etc. (Even, say, the 60fps black-and-white footage from a while back... was it 60fps?) So I think I understand the feeling, even though I tell myself I prefer 48 frames per second. Because then I watch the action in some other movie, say Gladiator, at 24 fps, and I see just how badly the action is represented.

    I really *do* want to see more motion/information on the screen and I'm willing to put myself through reconditioning to do so.
    But I'm not sure everyone else will, or even understands it this way.

    Has anyone else noticed this effect?

  • Re:Is it "too real"? (Score:5, Interesting)

    by Anonymous Coward on Friday April 27, 2012 @06:14PM (#39827291)

    Is this another version of the same issues people complained about when seeing their favorite newscaster (or "other" things) in HD?

    Do we need some "masking" of the mundane reality of scenes (e.g., things "looking like sets") to sufficiently suspend disbelief?

    A lot of the complaints may actually stem from lighting issues. In general, movies are dimmer than TV. Lots of mundane "set"-type things are hidden in the shadows, and brightening everything up will reveal them even at 24fps. The lighting may need to be adjusted differently for 48fps (possibly planned for post-production and just hasn't happened yet), or maybe the lighting is intentionally too bright to counteract the dimming effect of 3D. Either way, people may be reacting to a lot more than just 48fps, so don't just assume they're all Luddites.

    Also, the need for 48fps wouldn't be nearly as bad if the camera operators of the world hadn't all simultaneously forgotten how to slow down the shutter speed during pans. Seriously, there's judder all over the movie theatres today, and while it existed thirty years ago, it wasn't nearly as frequent or as bad as today.

  • by lattyware ( 934246 ) <gareth@lattyware.co.uk> on Friday April 27, 2012 @06:20PM (#39827353) Homepage Journal
    If everyone just does it, then in a few months everyone will have forgotten this insane thing and be used to it.
  • by ajegwu ( 1142365 ) on Friday April 27, 2012 @06:21PM (#39827365)
    When my old TV finally gave up the magic smoke, I replaced it with a modern 240Hz LCD panel. The first show we watched on it was Lost. Everyone immediately said it looked fake. It was compared to a low budget History Channel documentary instead of a high budget network show. Within a week or two no one I lived with seemed to notice the difference any more. It was just different, therefore something for most people to complain about, until it became the new normal.
  • by SpryGuy ( 206254 ) on Friday April 27, 2012 @06:33PM (#39827491)

    While true, that is all utterly and completely irrelevant to what I posted. The reality is that the higher refresh-rates and "processing" going on in modern HDTVs makes "film" look like "video", regardless of the source. If you haven't seen the effect I'm talking about, you should make an effort.

  • by rAiNsT0rm ( 877553 ) on Friday April 27, 2012 @06:35PM (#39827523) Homepage

    I have tried time after time to get used to it, but I can't. The overly smooth look pulls me out of what I'm watching and makes it look fake, to the point that it doesn't seem natural. There is something off about it, but I don't know what; real life doesn't have that look, so I think some other factor is at play that makes people (myself included) react this way.

  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Friday April 27, 2012 @06:37PM (#39827535)
    Comment removed based on user account deletion
  • Re:Is it "too real"? (Score:4, Interesting)

    by cpu6502 ( 1960974 ) on Friday April 27, 2012 @06:39PM (#39827561)

    >>>BBC 70's shows that use video, but by the time it gets over here in the colonies, it's not 48 frames per sec, but 25. I have no way of knowing what the TV stations played it at.

    BBC video is 25 frames per second. Interlaced.
    So basically it's just like U.S. video (30fps) but slightly slower.

  • Re:Can You Show Me (Score:4, Interesting)

    by Tragek ( 772040 ) on Friday April 27, 2012 @06:50PM (#39827705) Journal

    My father's Sony drives me nuts with its 120Hz interpolation. I can attest to the soap-opera effect; it makes everything look very strange. Mission Impossible was positively ODD.

      I was always curious whether it was an effect of the high frame rate or of the interpolation algorithms. Worryingly, this story seems to indicate it's the frame rate, not the algorithms.

  • Re:Is it "too real"? (Score:5, Interesting)

    by Anonymous Coward on Friday April 27, 2012 @06:57PM (#39827773)

    The effect is known as the "soap opera effect", because soap operas are shot on video, in interlaced format. Interlaced video gives a temporal resolution of 50 or 60 images per second, compared to 24 images per second for film. Because we're used to seeing interlaced video on TV while movies are always non-interlaced with lower temporal resolution, it's irritating when a movie has fluid motion. You can experience this effect if your TV has an option to interpolate frames: turning that feature on makes movies look like soap operas, and turning it off makes them look like "cinema" again.

  • by Forever Wondering ( 2506940 ) on Friday April 27, 2012 @07:04PM (#39827849)
    All modern/ordinary film is shot in the camera at 24 fps but projected with the shutter opening 48 times per second. Each frame is double-shuttered in the projector, and has been for years.

    ---

    This is a bit like TV, which has a frame rate of 30 (29.97) but a field rate of 60 (59.94) because it's interlaced. It prevents jerky motion because the eye believes it's getting a frame rate higher than the true frame rate (i.e. it perceives the field rate as the frame rate). When film is put on a DVD it has to undergo a telecine process to raise the field/frame rate.
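    The telecine step mentioned above can be sketched roughly like this (a toy Python illustration of the classic 2:3 pulldown cadence, not any real encoder's code):

```python
def telecine_pulldown(frames):
    """Map 24 fps film frames onto 60 Hz interlaced fields using the
    classic 2:3 pulldown cadence: frames alternately contribute two
    or three fields, so 24 film frames become exactly 60 fields."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3
        fields.extend([frame] * copies)
    return fields

# one second of film -> one second of NTSC fields
fields = telecine_pulldown(list(range(24)))
```

    Real telecine interleaves odd/even scan lines rather than repeating whole frames, but the 2-3-2-3 counting is the heart of it.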

    Some people I know [with better eyes than mine] can see flicker in 24/48 film content. They actually prefer video because of the higher frame rate.

  • Hybrid system (Score:3, Interesting)

    by Spy Handler ( 822350 ) on Friday April 27, 2012 @07:17PM (#39827999) Homepage Journal

    a solution might be to show the movie at 48 fps but keep most of the source 24 fps... ramping up to 48 fps during scenes that require it (such as camera panning)

    So basically what you'd do is shoot everything in 48 fps, but for most scenes take out every other frame, and just show the remaining frames twice. Then it would look like a regular 24 fps movie.

    For scenes with lots of motion, DON'T take out every other frame, show the full 48 fps.
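    The scheme described here is simple to sketch (a hypothetical Python illustration; plain numbers stand in for real image frames):

```python
def hybrid_48(frames, high_motion):
    """frames: one scene captured at 48 fps.
    High-motion scenes keep all 48 frames; calm scenes drop every
    other frame and show each survivor twice, which looks like a
    regular 24 fps movie while the projector stays locked at 48 fps."""
    if high_motion:
        return list(frames)
    doubled = []
    for f in frames[::2]:       # keep every other frame...
        doubled.extend([f, f])  # ...and project it twice
    return doubled
```

    Either way the projector sees a constant 48 frames per second, so no mid-reel mode switching would be needed.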

  • Re:Is it "too real"? (Score:5, Interesting)

    by muon-catalyzed ( 2483394 ) on Friday April 27, 2012 @07:19PM (#39828013)
    Let me tell you a little Hollywood secret: they HATE this. If the Rubicon of 24fps & 2D is crossed, the film industry and all their flicks will be stamped as outdated '70s-era films, the way mono recordings were once the stereo era kicked in. The BBC jab is actually lifted from their own point of resistance; they fear the obsolescence of their own stuff. The elitist nature of going 3D, the higher frame rates and the associated production costs, the elaborate post, the new thinking behind 3D production, the ditched old-school principles: all of that is mind-boggling for the establishment. For that simple reason, PJ's innovative and groundbreaking 3D movie 'The Hobbit' is doomed by the wrath of the industry.
  • Change (Score:5, Interesting)

    by thegarbz ( 1787294 ) on Friday April 27, 2012 @07:29PM (#39828089)

    Seriously, what could be wrong with 48 fps? That it didn't flicker enough?

    The problem isn't that it is fundamentally better, it's that it is a change from what people expect. Every time I see a high-fps recording of something, the motion looks like it's going too fast. I fully expect the video and sound to drop out of sync, but it never does. The results look fantastic and smooth, as they should, but it takes my brain, conditioned by years of 24fps shit, a while to adapt to the new look.

    Any change from the norm is likely to attract serious criticism, whether good or bad.

  • Re:Just whiners (Score:4, Interesting)

    by Jah-Wren Ryel ( 80510 ) on Friday April 27, 2012 @07:40PM (#39828215)

    Man, reading this reminds me of those audiophile douchebags that insist that records sound 'warmer' and go into all of these nonsensical explanations about sound texture and other dumb shit when in reality, it's mostly all in their head and they're talking out of their ass.

    If it really does remind you of that, then you aren't paying attention. Grain is a characteristic of film that a good cinematographer uses, just as he uses things like exposure, focus, lens-flare and depth of field. Digitally removing grain from a movie where the cinematographer made artistic decisions regarding the grain is the equivalent of amping up the saturation, blowing out the contrast or even chopping off the edges of the picture - it is destructive to the artist's intent. Grain is part of the creation not part of the playback, unlike the "warmth" that vacuum tubes add to music (and which can be simulated with the right digital filters).

    Grain is such a basic part of modern cinematography that a fair number of movies shot on digital have had artificial grain added in post.

    I don't think film is going anywhere and digital most certainly is not going anywhere.

    Film is on life support already. [creativecow.net] By 2013 all US theaters will be digital. Over 90% of all primetime tv is already shot on digital.

  • Quake (Score:1, Interesting)

    by Impy the Impiuos Imp ( 442658 ) on Friday April 27, 2012 @07:57PM (#39828359) Journal

    Way back when, just before 3D cards took over the world, I fired up old Quake on my more modern machine and ran the software renderer.

    I got some godlike fps, but more importantly, the 320x200 image, though blocky as hell, was smooth, baby, smoooooth. It felt like looking through a window at a weird blocky world.

    For some reason, no 3D card game has ever done this, though they all tend to push the limits until they're back scraping 30 fps again.

    I might try it on CoH or something, turn down options until fps is way back up.

  • Re:Hybrid system (Score:5, Interesting)

    by Spy Handler ( 822350 ) on Friday April 27, 2012 @07:58PM (#39828369) Homepage Journal

    why not? Should be easy enough. By default set everything to 24 fps, and just select some scenes and flag beginning and end frames for doubling up to 48 fps.

    Pirates who upload DVD-ripped movies eyeball each scene and manually adjust the bitrate all the time (the better ones do, anyway). They do this to fit the movie into a fixed 700 MB file. Basically you allocate more bitrate to scenes with motion, and less bitrate to mostly still scenes. Software can do this automatically, but humans with an artistic touch can do a much better job.

    And that's just *one* guy doing this for a whole movie in one evening. Should be nothing to a studio.
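    The manual process described above amounts to something like this (a hypothetical Python sketch; the motion scores would come from eyeballing each scene):

```python
def allocate_bitrate(motion_scores, total_bits):
    """Split a fixed bit budget (e.g. a 700 MB target) across scenes
    in proportion to how much motion each one contains: busy scenes
    get more bits, still scenes get fewer, and the total is constant."""
    total_motion = sum(motion_scores)
    return [total_bits * m / total_motion for m in motion_scores]
```

    Two-pass encoders do essentially this with a measured complexity score in place of human judgment.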

  • Re:Can You Show Me (Score:5, Interesting)

    by Tanman ( 90298 ) on Friday April 27, 2012 @08:00PM (#39828385)

    A better word for "120hz interpolation" is "morphing" -- when these televisions do their thing, what they are really doing is morphing between frames. You have a 24 fps movie and want it at 120Hz? Then the new in-betweens will be averages of the previous and upcoming frames until you hit the next real frame.

    It is very, very different from filming at a higher frame rate. The best I can tell you is to film yourself smiling. Then, take the first frame (straight faced) and the last frame (smiling), then use a program to morph from straight-faced to smiling. You will see just how creepy it is.
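    The "averaging" being described can be sketched as a plain cross-fade (a toy Python illustration; real TVs use motion-vector estimation, which is why their artifacts look different):

```python
def crossfade_interpolate(frames, factor):
    """Insert (factor - 1) blended frames between each real pair by
    linear cross-fade. Frames here are plain numbers standing in for
    pixel values; note the blends contain no new detail, only averages."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, factor):
            t = k / factor
            out.append((1 - t) * a + t * b)  # weighted average of neighbors
    out.append(frames[-1])
    return out
```

    A 24 fps input with factor 5 yields a 120 Hz stream, but every fifth frame is the only one carrying real information.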

  • Re:Is it "too real"? (Score:4, Interesting)

    by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Friday April 27, 2012 @08:28PM (#39828623) Homepage Journal

    It depends on the conversion system. The cheaper ones just speed everything up. The more complex ones create whole new frames through linear interpolation (in-betweening). But neither adds any new information; you are correct about that.

    This is why, back when HDTV was first mooted, I was suggesting that they use the lowest resolution and framerate for which the existing standards were factors. It would mean that existing sets would be able to display actual pixels in actual frames, whether they were NTSC or PAL, resulting in cleaner images and cleaner sound. It would also have simplified manufacture (since switching between HDTV, PAL and NTSC would have been purely a matter of altering integer step sizes for horizontal, vertical and framerate, which is trivial compared to the algorithms multi-standard televisions are forced to use in practice).

    48 FPS for a movie should not have caused any problems - since the complaints have to do with contrast, the cameras used may have had dynamic range issues when the higher frame rate was selected. Lowering the speed won't help if that is true. It might just have been viewers with a preference for a crappy product, though - it's not like Slashdot is unaware of such folk, we bitch about them often enough.

    I wouldn't have used 48 for filming, though. Digital storage on the movie-making side is cheap. 48 for the theatres is fine, but it makes it hard to convert to TV. A frame rate of 240 for filming can be converted to conventional film, 48 frame film, 30 frame NTSC and 60 frame HDTV without any interpolation or time compression/stretching. HDR on high-speed digital cameras is usually done either using four colour filters or via 3CCD. In the first case, you can do up to 333 fps, which is above what I'm saying would be required to make a "play unmodified anywhere" movie.
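    The arithmetic behind picking 240 as a master rate is just divisibility (a small Python check using the target formats named in this comment):

```python
CAPTURE = 240  # proposed master filming rate

def decimation_steps(targets):
    """For each delivery format, how many captured frames map onto one
    output frame; an integer ratio means frames can simply be kept or
    skipped, with no interpolation or time compression/stretching."""
    steps = {}
    for name, rate in targets.items():
        assert CAPTURE % rate == 0, f"{name} would need interpolation"
        steps[name] = CAPTURE // rate
    return steps

steps = decimation_steps({"film": 24, "double-shuttered film": 48,
                          "NTSC": 30, "HDTV": 60})
```

    (29.97/59.94 NTSC timing would still need a slight speed tweak, as it does for any integer capture rate.)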

    Harsh lighting is another complaint about the movie - easily fixed. Astronomical photos, in particular, have all kinds of non-linear contrast stretching applied to make the image easy to see. The algorithms are readily-available and widely-used.

    After that, people should stop whining about movies being actually better. You'd think they were expecting entertainment or something.

  • Re:Is it "too real"? (Score:5, Interesting)

    by Elrond, Duke of URL ( 2657 ) <JetpackJohn@gmail.com> on Friday April 27, 2012 @08:53PM (#39828783) Homepage

    This has always bugged me a lot. For most games, I personally think it looks better with motion blur turned off. You almost always get that option with games on a PC, but rarely can it be changed with console games.

    On consoles, I think one of the reasons it is used so frequently is to help mask low or dipping frame rates. The 3D on consoles seems to be designed such that games can enable motion blur without hurting the rest of the 3D rendering performance. Most PC video cards, however, seem to take a hit when it is enabled. But perhaps that is no longer true with newer cards? Or maybe it is only noticeable on a PC because the resolution is much higher?

    I've read that most console games only render internally at a size close to 800x600 and then scale to "HD" sizes... which I suppose makes sense when you consider how many years old the PS3 and XBox360 3D tech really is.

  • by Anonymous Coward on Friday April 27, 2012 @11:59PM (#39829741)

    Actually it's 15fps. 24fps was used so that audio running along side the film wouldn't have gaps. Learned about that in animation school.

  • Re:Is it "too real"? (Score:2, Interesting)

    by Anonymous Coward on Saturday April 28, 2012 @03:07AM (#39830349)

    A lot of people get motion sickness from TV/monitors that are "too real".

    I suspect that this problem is not nearly as widespread as it appears to be. Most people would probably acclimatize very quickly if they were exposed to the high frame rates all the time.

    Compare the situation at the very start of the movie era, when audiences fled in panic from a movie of an approaching train. They were unable to distinguish it from reality! That really doesn't happen so much any more.

  • Re:Is it "too real"? (Score:5, Interesting)

    by rtb61 ( 674572 ) on Saturday April 28, 2012 @03:15AM (#39830379) Homepage

    Supermarket bargain bins are still full of DVDs, more than you can ever hope to watch. That's the reality: there is already far more content out there than you could consume doing nothing else, full time, for ten lifetimes.

    Copyright was really all about burying old content so that you would pay top dollar for new content. The producers of new content got greedy and decided to dump the old content they had buried; a case of this year's executives hunting this year's bonus, and bugger tomorrow. Worst of all, most of the new content is pretty crappy and can't compete with the old content, beyond of course the tasteless Cheetos crowd (the bored "I've watched it already and who cares about story, give me un-reality TV" types).

    The really funny thing about all this, the truly hilarious reality: big-screen, high-definition 3D at high frame rates is no good for "fake" content or make-believe. The only thing it is really good for, and that people will truly enjoy, is the scenery channel. Just moving images of nature: great locations with beautiful sunsets and sunrises, calming noon-day tropical lagoons and beaches. Forget windows; filtered, conditioned air (maybe with aromas to match the view) and full-wall-sized high-resolution displays showing like-you're-there scenery in motion.

  • Re:Is it "too real"? (Score:4, Interesting)

    by mug funky ( 910186 ) on Saturday April 28, 2012 @10:43AM (#39831713)

    The way films are shot these days, absolutely. Back in the day, flicker was something that was considered while shooting, and as such the camera operator tried not to pan too fast. Also, the cameras were so huge that handheld was not something that was done unless Schwarzenegger was shooting his own films.

    On a slow enough pan, at the resolution of a regular release print, you won't see the difference between 24 and 48 fps. Bear in mind that projector shutters are twin-blade things that open twice for each frame, giving a 48Hz flicker for 24 frames, so the "flashiness" won't give them away.
