Hobbit Film Underwhelms At 48 Frames Per Second 607
bonch writes "Warner Bros. aired ten minutes of footage from The Hobbit at CinemaCon, and reactions have been mixed. The problem? Peter Jackson is filming the movie at 48 frames per second, twice the industry standard 24 frames per second, lending the film a '70s era BBC-video look.' However, if the negative response from film bloggers and theater owners is any indication, the way most people will see the movie is in standard 24fps."
Is it "too real"? (Score:5, Insightful)
Is this another version of the same issues people complained about when seeing their favorite newscaster (or "other" things) in HD?
Do we need some "masking" of the mundane reality of scenes (e.g., things "looking like sets") to sufficiently suspend disbelief?
Re:Is it "too real"? (Score:4, Insightful)
Every time I hear someone bitch about higher-FPS video, I'm seriously annoyed. I've had to deal with the damn 24 FPS jerky and/or blurry bullshit for too long; people need to just adjust.
Re:Is it "too real"? (Score:5, Insightful)
Me too.
Seriously, what could be wrong with 48 fps? That it didn't flicker enough?
I read this story a few days ago and actually went searching for some samples but couldn't find any at that time, other than some silly animated combat scenes.
What I did find was a bunch of bloggers who have never produced anything in their lives except whiny bitching, without a single valid criticism that didn't amount to jealousy and NIH.
Re:Is it "too real"? (Score:5, Interesting)
Re:Is it "too real"? (Score:5, Insightful)
Well 3D still doesn't work properly, and probably nothing will fix that while projecting on a flat screen.
But 48fps is simply smoother, and just as they are able to fake up 3D on films that were never shot that way, they will be able to digitally fake up the extra frames between every 24fps frame and re-release all those old films in Astounding 48 FPS, New and Improved, Digitally Remastered, For a Limited Time Only....
It's a whole new industry, and they can sell us all copies of the disks we already bought once.
The wrath of the industry is usually tempered by box office figures.
Re:Is it "too real"? (Score:5, Funny)
Re:Is it "too real"? (Score:5, Informative)
Re:Is it "too real"? (Score:4, Informative)
>But 48fps is simply smoother, and just as they are able to fake up 3D on films that were never shot that way, they will be able to digitally fake up the extra frames between every 24fps frame and re-release all those old films in Astounding 48 FPS, New and Improved, Digitally Remastered, For a Limited Time Only....
Yeah, my TV did that (interpolated 24fps into 120fps) until I turned off "motion enhancement". I hated the effect. Somehow the picture seemed artificial and less clear even though the action was arguably smoother. Motion interpolation is much better understood and easier to implement than faking 3D, but it still produces bad results.
Motion interpolation generally only works well for a very small subset of common visual imagery. Complex motion confuses it, often obliterating the original motion, which makes things look subtly unreal, dreamlike, or otherwise confusing to the viewer. Discrete sampling and reconstruction filters, which are guaranteed to be sub-optimal, intensify the problem. When the video source is a DVD or some other video that's been wrung through the motion estimation process at least once already, it can only get worse. Garbage in, garbage out, Chinese whispers, turd polish, and all that rhythm.
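For the curious, the core of most motion estimators is plain block matching; here's a rough Python sketch of the idea (a toy, assuming 8-bit grayscale numpy frames, not any shipping algorithm):

    import numpy as np

    def best_motion_vector(prev, curr, y, x, block=16, search=8):
        # Exhaustive block matching: find where the block at (y, x) in the
        # current frame most plausibly came from in the previous frame by
        # minimizing the sum of absolute differences (SAD).
        target = curr[y:y+block, x:x+block].astype(np.int32)
        best_sad, best_mv = None, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                yy, xx = y + dy, x + dx
                if (yy < 0 or xx < 0 or
                        yy + block > prev.shape[0] or xx + block > prev.shape[1]):
                    continue
                cand = prev[yy:yy+block, xx:xx+block].astype(np.int32)
                sad = int(np.abs(target - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_mv = sad, (dy, dx)
        return best_mv

Complex motion produces many near-tied SAD minima, and whichever one the chip picks decides where the interpolated pixels land; guess wrong and you get exactly that subtly-unreal, dreamlike smearing.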
Re:Is it "too real"? (Score:4, Interesting)
The way films are shot these days, absolutely. Back in the day, flicker was something that was considered while shooting, and as such the camera operator tried not to pan too fast. Also, the cameras were so huge that handheld was not something that was done, unless Schwarzenegger was shooting his own films.
On a slow enough pan, at the resolution of a regular release print, you won't see the difference between 24 and 48 fps. Bear in mind that projector shutters are twin-blade things that open twice for each frame, giving 48 flashes per second for 24 frames, so the "flashiness" won't give them away.
Re:Is it "too real"? (Score:4, Insightful)
Re:Is it "too real"? (Score:5, Insightful)
I'll tell you a little Hollywood secret: they HATE this. If the Rubicon of 24fps & 2D is crossed, the film industry and all their flicks will be stamped as outdated, '70s-era films.
Really? I always thought Hollywood was jamming 3D down our throats. If 48fps takes hold and 3D starts being worthwhile, then the MPAA can just sell us all their old crap again in new "remastered" editions. The Citizen Kane Blu-ray collector's edition runs for $70!
Re:Is it "too real"? (Score:5, Interesting)
Supermarket bargain bins are still full of DVDs, more than you can ever hope to watch. That's the reality: there is already far more content out there than you could consume doing nothing else full-time for ten lifetimes.
Copyright was really all about burying old content so that you would pay top dollar for new content. The producers of new content got greedy and decided to dump the old content they had buried: a case of this year's executives hunting this year's bonus, and bugger tomorrow. Worst of all, most of the new content is pretty crappy and can't compete with the old content, except of course among the tasteless Cheetos crowd (the boring "I've watched it already and who cares about story, give me un-reality TV" types).
The really funny thing about all this, the truly hilarious reality: big screens, high-definition 3D, and high frame rates are not good for "fake" content or make-believe. The only thing they're really good for, and that people will truly enjoy, is the scenery channel. Just moving images of nature: great locations with beautiful sunsets and sunrises, calming noonday tropical lagoons and beaches. Forget windows; filtered, conditioned air (maybe with aromas to match the view) and full wall-sized video displays with high-resolution, like-you're-there scenery in motion.
Change (Score:5, Interesting)
Seriously, what could be wrong with 48 fps? That it didn't flicker enough?
The problem isn't that it is fundamentally better; it's that it is a change from what people expect. Every time I see a high-fps recording of something, the motion looks like it's going too fast. I fully expect the video and sound to drop out of sync, but it never does. The results look fantastic and smooth, as they should, but it takes my brain, conditioned by years of 24fps shit, a while to adapt to the new look.
Any change from the norm is likely to attract serious criticism, whether good or bad.
Re:Change (Score:4, Insightful)
But is it still the norm? Gamers are used to watching and participating in scenes at much higher FPS rates... for those of us who were born after 1980, this is better... TV looks flickery and annoying. ... our cellphones have higher resolution than that; why is it only being upgraded now, and by so little? The latter question inspired a wonderful XKCD (just so I'm not accused of plagiarism) :P
We had the same issue with HD
Re: (Score:3)
Yes, it still is. Gamers don't have screens that cover 6 average basements, and more importantly, I know most gamers turn off full-screen motion blur. This makes games a VERY different experience from the movies, which will naturally motion-blur every frame. Gamers tend to seek extreme frame rates and let their eyes do the blurring. In this regard cinema is very different.
A game played at 24 fps is by common standards completely unplayable. Yet a movie at 24fps is a bit jerky, and only really a bit jerky if you'
Re:Change (Score:4, Insightful)
The problem with motion blurring, or any sort of blurring, is that it makes my eyes hurt when I try to focus on something that cannot be in focus.
In real life, when you are looking at something moving, the object you are looking at becomes sharp; at worst, the background becomes motion-blurred. If you look at the background, the background becomes sharp and the object becomes blurry. So whatever I look at is sharp, unless the object is moving really fast or I'm having problems with my eyes.
As technology improves, they should strive to have more stuff sharp. As you said, let our eyes do the blurring. Only in a few cases should the director blur stuff for effect.
Re:Change (Score:5, Informative)
I call "not understanding technology" on the post above: most screens these days will only update at 60Hz, especially larger ones. Even 1280x1024 screens will only do maybe 85Hz.
Unless you're using a CRT or an Nvidia 3D Vision compatible monitor, you're not getting more frames than that /displayed to you/.
Which means the extra frames are simply dropped, and thus won't look any better. You'd be better off enabling vsync, so you've got a constant maximum fps at whatever rate your monitor is set to, and you're not wasting frame rendering time.
Re: (Score:3)
I've always used screens rated 85Hz or better, and I quite prefer the 100+Hz models when possible.
One of the reasons I stayed away from LCD so long was because I loved my high refresh rate CRTs so much. My present LCD does 85Hz only at 1280x1024 or lower sadly.
Re:Change (Score:5, Funny)
People are used to high frame rates. It's not like 3D, where it actually makes some people feel sick.
As for the GP stating "Every time I see a high fps recording of something the motion looks like it's going too fast," I don't see that at all. It just looks normal; it doesn't look faster at all. It's just smooth and realistic.
The physical universe has a pretty good framerate -- about 1.9*10^43 fps, according to Planck -- and it's in 3D too! I've never heard a sober person complain about either of these two things.
Hybrid system (Score:3, Interesting)
A solution might be to show the movie at 48 fps but keep most of the source 24 fps, ramping up to 48 fps during scenes that require it (such as camera pans).
So basically what you'd do is shoot everything in 48 fps, but for most scenes take out every other frame, and just show the remaining frames twice. Then it would look like a regular 24 fps movie.
For scenes with lots of motion, DON'T take out every other frame, show the full 48 fps.
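Something like this, in toy Python (the frame list and per-scene motion flags are made up for illustration):

    def hybridize(frames_48, high_motion):
        # frames_48: source frames captured at 48fps
        # high_motion[i]: True if frame i belongs to a fast-motion scene
        out = []
        for i, frame in enumerate(frames_48):
            if high_motion[i]:
                out.append(frame)   # fast scenes: keep true 48fps
            elif i % 2 == 0:
                out.append(frame)   # slow scenes: drop the odd frames and
                out.append(frame)   # show each even frame twice, like a
                                    # twin-blade 24fps projector
        return out                  # constant 48 display slots per second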
Re:Hybrid system (Score:4, Informative)
Re:Hybrid system (Score:5, Interesting)
Why not? Should be easy enough. By default, set everything to 24 fps, then just select some scenes and flag the beginning and end frames for doubling up to 48 fps.
Pirates that upload DVD-ripped movies eyeball each scene and manually adjust the bitrate all the time (the better ones do, anyway). They do this to fit the movie onto a fixed 700 MB size. Basically, you allocate more bitrate to scenes with motion and less bitrate to mostly-still scenes. Software can do this automatically, but humans with an artistic touch can do a much better job.
And that's just *one* guy doing this for a whole movie in one evening. Should be nothing to a studio.
Re: (Score:3)
... You've never seen side-by-side comparisons of 24fps versus 48fps, have you? It's a bad idea because the frame rate dramatically affects how the film is perceived. It'd be like finding a compromise for those who resisted colour film and colour TV by simply switching colour on or off depending on whether the scene in question really made optimal use of it. It'd be totally jarring and terrible.
To a trained eye, it's really jarring when TV serials nowadays do all their interior shots using RED One or simila
Re: (Score:3)
Re: (Score:3)
Re:Is it "too real"? (Score:5, Insightful)
Re:Is it "too real"? (Score:5, Funny)
No, he's frame-capped!
ba-dum-dum! The next show's at eleven!
Re:Is it "too real"? (Score:5, Funny)
Re:Is it "too real"? (Score:4, Funny)
This is exactly why I am unable to leave the basement. The frame rates "outside" literally make my brain hurt.
Well, I like the great resolution they have "outside", but the graphics for people and critters aren't very realistic. I saw something they called a "squirrel" and it didn't have any tentacles!
Re:Is it "too real"? (Score:5, Funny)
Strobe lighting, obviously.
Re:You moron... (Score:5, Funny)
You could of been nicer about that, you know.
Re:You moron... (Score:5, Funny)
"Then" and "than" are basically the same, for all intensive purposes.
Re: (Score:3, Insightful)
I'm sure that one day, you will accomplish something of value in your life, and you will no longer need to use trivialities to build your sense of self-worth.
Re:Is it "too real"? (Score:5, Funny)
I think this is evolution's way of saying "Don't have children, dudes."
I'm in that category for other reasons. (Autoimmune. Besides I'd rather build a robot with my own AI)
So you think he shouldn't reproduce just because he's unable to watch certain types of television? WTF? That's one of the lamest criteria for deciding whether to reproduce. Hell, I bet some people would say that's a sign he should reproduce like crazy and create a bunch of kids who are physiologically forced to go outside and play.
Re:Is it "too real"? (Score:5, Informative)
How well do you tolerate the infinite fps you get when you look away from your computer screen?
A lot of people get motion sickness from TVs/monitors that are "too real". Keep the framerate or resolution down enough, and the brain knows it's just video; but HD at 60FPS looks too much like real vision, moving in this odd way decoupled from how your head moves.
The "infinite" FPS causes a different group of people to become sick when riding in a boat (not all seasickness, but some) or in a car, because the horizon is again moving unrelated to how your head is moving.
I get sick playing any FPS with "head bob" turned on. 2 minutes and I'm out. Fortunately, almost all games let you turn that shit off.
Re: (Score:3)
I can't play an FPS without proper head-bob ... makes me feel like I'm floating around, very unreal and annoying.
Re: (Score:3)
Never met anyone like that. Most people looked at my Trinitron screen, realized their eyes were not getting tired, and came to the conclusion that a more expensive screen does make a difference.
Re:Is it "too real"? (Score:5, Informative)
They still add motion blur to almost every major 3D AAA game title out there, you know.
I dealt with the issue of motion blur a lot when working on 3D animated films... The problem was that non-blurred 3D animation looks a hell of a lot like claymation at times, due to the lack of blur produced in that workflow. The motion blur issue with games doesn't really have an equal, but to most people it looks subconsciously better with it enabled, for reasons they can't explain. It will be interesting to see whether or not a 48 fps cinema standard will affect the need for motion blur in games too!
Re:Is it "too real"? (Score:5, Interesting)
This has always bugged me a lot. For most games, I personally think it looks better with motion blur turned off. You almost always get that option with games on a PC, but rarely can it be changed with console games.
On consoles, I think one of the reasons it is used so frequently is to help mask low or dipping frame rates. The 3D on consoles seems to be designed such that games can enable motion blur without hurting the rest of the 3D rendering performance. Most PC video cards, however, seem to take a hit when it is enabled. But perhaps that is no longer true with newer cards? Or maybe it is only noticeable on a PC because the resolution is much higher?
I've read that most console games only render internally at a size close to 800x600 and then scale to "HD" sizes... which I suppose makes sense when you consider how many years old the PS3 and XBox360 3D tech really is.
Re: (Score:3)
Kind of like the last Star Trek film, where they made computer generated tracking shots of space ships look like they were filmed through really grubby lenses. That was genius, IMHO. Computer generated imagery has got so detailed that nothing impresses us now, but somehow adding the illusion that a camera was involved makes the shot feel more real.
Re:Is it "too real"? (Score:5, Interesting)
Is this another version of the same issues people complained about when seeing their favorite newscaster (or "other" things) in HD?
Do we need some "masking" of the mundane reality of scenes (e.g., things "looking like sets") to sufficiently suspend disbelief?
A lot of the complaints may actually stem from lighting issues. In general, movies are dimmer than TV. Lots of mundane "set"-type things are hidden in the shadows, and brightening everything up will reveal them even at 24fps. The lighting may need to be adjusted differently for 48fps (possibly planned for post-production and just hasn't happened yet), or maybe the lighting is intentionally too bright to counteract the dimming effect of 3D. Either way, people may be reacting to a lot more than just 48fps, so don't just assume they're all Luddites.
Also, the need for 48fps wouldn't be nearly as bad if the camera operators of the world hadn't all simultaneously forgotten how to slow down the shutter speed during pans. Seriously, there's judder all over the movie theatres today, and while it existed thirty years ago, it wasn't nearly as frequent or as bad as today.
Re:Is it "too real"? (Score:5, Informative)
Why would you slow down the shutter speed during pans? That makes them even more blurry.
Yes, that's exactly the point, and it was common practice for something like 70 years, so it's not a crazy avant-garde thing only a few people did. The basic idea is that blur masks judder, and since judder is worse than blur, people like it when you slow the shutter speed during pans. It's only when people stopped doing this fairly recently that suddenly everyone's complaining about seeing judder everywhere.
I'm not saying there aren't advantages to be had from 48fps, far from it. But 24fps judder suddenly got a lot worse rather recently, which is making it seem more necessary than it really is.
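Rough numbers (purely illustrative) for why a slow shutter hides judder during a pan:

    FPS = 24
    PAN = 40                        # pixels the image moves per frame (made up)

    for shutter in (0.5, 1.0):      # 180-degree vs. ~360-degree shutter
        exposure = shutter / FPS    # seconds each frame is exposed
        blur = PAN * shutter        # pixels smeared within one frame
        gap = PAN - blur            # unsmeared jump between frames = judder
        print("shutter 1/%.0fs: blur %.0fpx, judder gap %.0fpx"
              % (1 / exposure, blur, gap))

At a 180-degree shutter, half the pan distance is left as a visible jump each frame; at a wide-open shutter the blur covers the whole gap and the pan just smears.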
Re:Is it "too real"? (Score:5, Informative)
At 24 FPS, a wide, judder-reducing shutter angle gets you a shutter speed of about 1/50th of a second. If you want anything less than deep focus, you need to use an aperture of around f/5.6. In sunlight, this would require a film speed of ISO 6. So yeah, I'm sure Vision 500T has a lot to do with it.
Re:Is it "too real"? (Score:4, Interesting)
>>>BBC 70's shows that use video, but by the time it gets over here in the colonies, it's not 48 frames per sec, but 25. I have no way of knowing what the TV stations played it at.
BBC video is 25 frames per second. Interlaced.
So basically it's just like U.S. video (30fps) but slightly slower.
Movement between one field and the next (Score:3)
Re:Is it "too real"? (Score:4, Informative)
On a side note, NTSC and PAL are what they are because TV was originally interlaced and ran at the frequency of the local electricity supply. So in the U.S., TV used to run at 60 interlaced fields per second, producing 30 full frames, because electricity is 60 Hz. Countries that ran on 50 Hz got 25 fps.
Re:Is it "too real"? (Score:4, Informative)
Re:Is it "too real"? (Score:4, Interesting)
It depends on the conversion system. The cheaper ones just speed everything up. The more complex ones create whole new frames through linear interpolation (in-betweening), but neither adds any new information; you are correct.
This is why, back when HDTV was first mooted, I was suggesting that they use the lowest resolution and framerate for which the existing standards were factors. It would mean that existing sets would be able to display actual pixels in actual frames, whether they were NTSC or PAL, resulting in cleaner images and cleaner sound. It would also have simplified manufacture (since switching between HDTV, PAL and NTSC would have been purely a matter of altering integer step sizes for horizontal, vertical and framerate, which is trivial compared to the algorithms multi-standard televisions are forced to use in practice).
48 FPS for a movie should not have caused any problems - since the complaints have to do with contrast, the cameras used may have had dynamic range issues when the higher frame rate was selected. Lowering the speed won't help if that is true. It might just have been viewers with a preference for a crappy product, though - it's not like Slashdot is unaware of such folk, we bitch about them often enough.
I wouldn't have used 48 for filming, though. Digital storage on the movie-making side is cheap. 48 for the theatres is fine, but it makes it hard to convert to TV. A frame rate of 240 for filming can be converted to conventional film, 48 frame film, 30 frame NTSC and 60 frame HDTV without any interpolation or time compression/stretching. HDR on high-speed digital cameras is usually done either using four colour filters or via 3CCD. In the first case, you can do up to 333 fps, which is above what I'm saying would be required to make a "play unmodified anywhere" movie.
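The arithmetic is easy to check (Python, just to show the decimation factors):

    from math import lcm  # Python 3.9+

    targets = [24, 30, 48, 60]
    master = lcm(*targets)                         # -> 240
    print(master, [master // t for t in targets])  # -> 240 [10, 8, 5, 4]
    # each target rate is reached by keeping one frame in N,
    # with no interpolation or time compression/stretching needed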
Harsh lighting is another complaint about the movie - easily fixed. Astronomical photos, in particular, have all kinds of non-linear contrast stretching applied to make the image easy to see. The algorithms are readily-available and widely-used.
After that, people should stop whining about movies being actually better. You'd think they were expecting entertainment or something.
Re:Is it "too real"? (Score:5, Interesting)
The effect is known as the "soap opera effect", because soap operas are shot on video, in interlaced format. Interlaced video gives a time resolution of 50 or 60 images per second, compared to 24 images per second for film. Because we're used to seeing interlaced video on TV and movies are always non-interlaced with lower time resolution, it's irritating when a movie has fluid motion. You can experience this effect if your TV has an option to interpolate frames. Turning that feature off makes movies look more like "cinema", and turning it on makes movies look like soap operas.
Re: (Score:3)
Re: (Score:3)
You know they make /true/ 120Hz LCD monitors? Just look up Nvidia 3D Vision. The monitors are just standard TN LCDs that have the input electronics to handle 120fps (and require DVI-D to do it).
Looks nice even if all you're doing is moving the mouse around quickly; the mouse jumps fewer pixels per frame.
Re:Crappy Soap Operas (Score:3)
Don't forget the lightning production schedule, which needs to churn out 5(?) episodes a week. Urban legend has it that they get at most two takes of anything, and that in some long-forgotten show a character said "As I look into your thighs... I mean, your eyes..." and they didn't have time to fix it.
Re:Is it "too real"? (Score:5, Informative)
Yes, I am incapable of editing my own comment prior to posting. That should have been:
I've noticed that too. I can never figure out why daytime soap operas look so much different than prime-time shows. Is it the framerate that does it? I was beginning to think that the crappy dialogue and crappy plot were becoming visible.
It's the frame rate + lighting.
Shows like Community or 30 Rock are what's known as single camera. They are lit and shot as though they're feature films. This takes time. I love watching people visit a set for the first time and witness the hours it can take to perfect the lighting for a single shot of a single scene which may be built from multiple shots and end up as a few seconds of screen time in the finished product. But it looks cinematic. You get shadows and a true sense of depth. Frankly, it just looks more interesting than the alternative.
On the other hand, shows like The Big Bang Theory or Whitney are multi-cam. Multiple cameras run simultaneously and capture the entire scene at once. Consequently, the sets are lit so they can be shot from a whole bunch of angles without moving lights. Everything looks very flat and very stage-y. Even real-world props often fall into a strange uncanny valley.
Check out any episode of 30 Rock and then one of the live episodes if you want to see a great comparison between single and multi-cam.
Can You Show Me (Score:5, Interesting)
Could you show me what this "70s era BBC-video look" is? Despite having seen lots of '70s-era BBC video, I'm unable to understand what you're talking about based on the description.
Re: (Score:3, Informative)
It looks like a soap opera.
Re: (Score:3)
"Looks like a soap opera" to me means the weird overly contrasty look you get when some of the stupid autocontrast/edge "enhancement" features are turned on on modern TVs.
Re:Can You Show Me (Score:4, Interesting)
My father's Sony drives me nuts with its 120Hz interpolation. I can attest to the soap-opera effect; it makes everything look very strange. Mission Impossible was positively ODD.
I was always curious whether it was an effect of the high frame rate or of the interpolation algorithms. Worryingly, this story seems to indicate it's the frame rate, not the algorithms.
Re:Can You Show Me (Score:5, Interesting)
A better word for "120Hz interpolation" is "morphing" -- when these televisions do their thing, what they are really doing is morphing between frames. You have a 24 fps movie and want it at 120Hz? Then the new in-betweens will be averages of the previous and upcoming frames until you hit the next real frame.
It is very, very different from filming at a higher frame rate. The best demonstration I can give is this: film yourself smiling. Then take the first frame (straight-faced) and the last frame (smiling), and use a program to morph from straight-faced to smiling. You will see just how creepy it is.
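In the simplest case the morph is just a cross-fade. A toy numpy version of one in-between frame (real sets layer motion compensation on top, but the averaging is the heart of it):

    import numpy as np

    def in_between(frame_a, frame_b, t):
        # Cross-fade: every interpolated pixel is a weighted average of
        # the two real frames -- a blend, not a new sample of the scene.
        a = frame_a.astype(np.float32)
        b = frame_b.astype(np.float32)
        return ((1.0 - t) * a + t * b).astype(frame_a.dtype)

    # 24 fps -> 120Hz needs four in-betweens per pair of real frames:
    # [in_between(f0, f1, t) for t in (0.2, 0.4, 0.6, 0.8)]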
Habit (Score:5, Insightful)
The only reason people don't like it is because they are used to film looking another way. It has nothing to do with what is actually happening on screen, or some magical quality that allows 24fps to transport you to another place.
If all films changed to this, in three years no one would have an issue with it. In 10 years, people would say that older movies looked too "fake."
It's all what you are acclimated to.
Re:Habit (Score:5, Funny)
No, "Hobbit".
Re:Habit (Score:5, Insightful)
I think this is the case. I remember the transition to HDTV. When shows started airing in HD, I remember everything looking unnaturally crisp. It looked fake compared to the "real" 480i I was used to. By the time most shows went HD that effect went away for me, and the SD stuff started looking fake and crappy. I have roughly the same reaction watching SD shows now as I did watching the handful of B&W shows that were still airing when I was a kid. Yeah, it still works, but it definitely feels inferior and old fashioned.
My guess is 48fps movies will be about the same, unless they induce epileptic seizures or something...
please, please make 48fps available (Score:5, Interesting)
I'm one of the lucky few with sensitive eyes. Watching movies at 24 fps is jarring. I can't wait till they move up to 60 or 120.
If movies had originally filmed at 48 FPS (Score:5, Insightful)
Just whiners (Score:5, Insightful)
People have decided that 24fps is "cinematic" since that's what movies have been for so long and so they expect it and hate on things that aren't. They need to STFU and just take some time to appreciate a more real format.
We have cameras at work that shoot 60fps and I just -love- it. It is so silky smooth. When you first see it, it almost seems like something is wrong. Then you realize what is missing is the stutter of 30 (or 24) fps. Things are fluid, much more like they really are. Motion looks great.
We need that in movies. Spatial resolution is getting really good these days, we need better temporal resolution. Get that framerate up there and things will start to look much more real.
People have just come to associate the stuttery crap that is 24fps as being "cinematic". They need to tie a can on it and get over it.
Re:Just whiners (Score:5, Insightful)
I went to a very early digital cinema festival years ago, and in the round-table discussions all these people were focussing on how "sterile" digital looked, and moaning about how that "film look" was going to die a horrible ugly death, and the world as they knew it was ending. Everybody else was thrilled to death about how the image was actually sharp and consistent, you couldn't see the ugly film grain, colors were sharper, there was no crap stuck to every frame or spinning along down one side, you didn't have frames jumping all over the screen (60ft screen avg vertical jitter is +- 8 inches per frame!), etc etc etc.
Guess what? Digital won, end of story.
The "film purists" will always find something to complain about, while the rest of the world moves on.
Re:Just whiners (Score:4, Interesting)
Man, reading this reminds me of those audiophile douchebags that insist that records sound 'warmer' and go into all of these nonsensical explanations about sound texture and other dumb shit when in reality, it's mostly all in their head and they're talking out of their ass.
If it really does remind you of that, then you aren't paying attention. Grain is a characteristic of film that a good cinematographer uses, just as he uses things like exposure, focus, lens-flare and depth of field. Digitally removing grain from a movie where the cinematographer made artistic decisions regarding the grain is the equivalent of amping up the saturation, blowing out the contrast or even chopping off the edges of the picture - it is destructive to the artist's intent. Grain is part of the creation not part of the playback, unlike the "warmth" that vacuum tubes add to music (and which can be simulated with the right digital filters).
Grain is such a basic part of modern cinematography that a fair number of movies shot on digital have had artificial grain added in post.
I don't think film is going anywhere and digital most certainly is not going anywhere.
Film is on life support already. [creativecow.net] By 2013 all US theaters will be digital. Over 90% of all primetime tv is already shot on digital.
Comment removed (Score:5, Interesting)
Re:Just whiners (Score:5, Funny)
I want it to look like a fantasy and that is what 24 fps makes it look like.
Fair enough. For a nominal fee, your local movie theater will set your 3D glasses to black out every other frame, so you can enjoy 48FPS Hobbits at 24FPS.
So? (Score:5, Funny)
lending the film a '70s era BBC-video look
Well, it's a story about olden-times in England, isn't it?
Psychological? (Score:4, Interesting)
"THE HOBBIT, frankly, did not look cinematic."
Is it because we are conditioned to read low frame rates as "movie"? I remember seeing an FPS one time at 60 fps, not realizing right away that it was supposed to be an FPS game and not a movie, and the first and immediate response my brain gave me was, "wtf is this?!" It seems different frame rates make me think it's a different "experience" of sorts: a game, a TV broadcast, etc. (Even, say, the 60fps black and white from back a while ago... was it 60fps?) So I think I understand the feeling, even though I tell myself that I prefer the 48 frames per second. Because I then see the action in some other movies, say Gladiator, at 24 fps, and I see just how badly the action is represented.
I really *do* want to see more motion/information on the screen and I'm willing to put myself through reconditioning to do so.
But I'm not sure everyone else will, or even understands it this way.
Has anyone else noticed this effect?
In film, frame rate = exposure time (Score:5, Informative)
Because the shutter is fixed, the exposure time of each frame is directly related to the frame rate. Lower frame rate = longer exposure = more motion blur in each frame. Higher frame rate = shorter exposure = less motion blur in each frame. You need more light to shoot at a higher frame rate if you want to keep the same aperture setting.
So, if they do project this at 24 frames per second (by throwing away half the frames in post), the frames will not have the necessary motion blur and it will actually look worse because half the frames are missing. This could also probably be fixed in post, but that would be a pretty big hack for such a large production.
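For reference, the standard relationship between frame rate, shutter and exposure, as a sketch (the 180-degree default is the film convention discussed in the replies below):

    def exposure_seconds(fps, shutter_angle=180.0):
        # Rotary shutter: each frame is exposed for the fraction of the
        # frame interval during which the shutter is open.
        return (shutter_angle / 360.0) / fps

    print(exposure_seconds(24))   # 1/48 s: classic film-look motion blur
    print(exposure_seconds(48))   # 1/96 s: half the blur in each frame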
Re: (Score:3)
This wasn't shot on film. The exposure time in digital has nothing to do with the frame rate.
Re:In film, frame rate = exposure time (Score:5, Informative)
I didn't realize it was shot digitally, but your statement isn't completely true. If you shoot something at 48FPS, then the slowest possible shutter speed you can have is 1/48th of a second in digital. Digital does give you the chance to have a faster shutter speed, though.
Here's the kicker, though: in film you have to double it. So 24fps would give you a 1/48th shutter speed (half open, half closed), meaning the motion blur for 48fps digital vs. 24fps film should be the same, which explains why they picked 48fps: it afforded them the option to do either 48fps, slow motion, or 24fps in post without giving anything up (except disk space).
Re:In film, frame rate = exposure time (Score:5, Insightful)
In theory, the shutter speed (e.g. exposure time) could be faster than the frame rate, but the same holds true in film cameras as well by adjusting the shutter angle. Most films shoot with a shutter angle of 180 degrees. (think of the shutter as a circle, half of it is open and half of it isn't) If you decrease the shutter angle, you get less motion blur and a shorter exposure time. This was used to great effect in the D-Day storming of the beach scene at the beginning of "Saving Private Ryan."
Unless you know of some way to warp time, the exposure length will never, ever be longer than the frame interval, in film or digital!
It'll take a little getting used to, that's all (Score:5, Interesting)
Re:It'll take a little getting used to, that's all (Score:5, Insightful)
What you're talking about is a very different issue. With that, they're taking video at another framerate (perhaps 30 or 60) and "upscaling" it to 120/240Hz. There is a chip in there that looks at two frames, figures out what changed, and makes up frames to shove in between. It not only looked fake, it genuinely was fake. It really isn't any different from taking 480p and trying to upscale it to 1080p; you're just doing it in the time dimension instead of x/y.
Seeing video that was actually sourced at a higher framerate displayed at that higher framerate usually doesn't generate the "fake" look you're talking about. That having been said, I have no idea what's causing issues with the Hobbit film.
Re: (Score:3)
Same as 120/240Hz HDTVs, I can't stand it (Score:3, Interesting)
I have tried time after time to get used to it, but I can't. The overly smooth look pulls me out of what I'm watching and makes it look fake, to the point that it doesn't seem natural. There is something off about it, but I don't know what it is; real life doesn't have that look, so I think there is some other factor at play here that makes people (myself included) react this way.
Try it, it's fantastic ! (Score:3)
In France, at the Futuroscope, there has been an experimental 2D projection at 48fps since 1988. I enjoyed it for its brightness and flicker-free movement. I remember thinking that every movie theater should be like this. The sensation of realism is way better than with 3D at 24fps. Can't wait to see 3D at 48fps.
24 fps -- 48 fps shutter projection speed (Score:5, Interesting)
This is a bit like TV, which has a frame rate of 30 (29.97) but a field rate of 60 (59.94) because it's interlaced. It prevents jerky motion because the eye believes it's getting a frame rate higher than the true frame rate (i.e. it perceives the field rate to be the frame rate). When film is put on a DVD, it has to undergo a telecine process to raise the field/frame rate.
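The 3:2 cadence at the heart of that telecine step is simple; a sketch (ignoring top/bottom field interleaving):

    def pulldown_32(film_frames):
        # Film frames A,B,C,D -> video fields A,A,A,B,B,C,C,C,D,D
        fields = []
        for i, frame in enumerate(film_frames):
            fields.extend([frame] * (3 if i % 2 == 0 else 2))
        return fields   # 4 film frames -> 10 fields, so 24fps -> 60 fields/s

    print(pulldown_32(list("ABCD")))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']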
Some people I know [with better eyes than mine] can see flicker in 24/48 film content. They actually prefer video because of the higher frame rate.
Good news for profits (Score:3, Funny)
Re:Can people actually tell the difference? (Score:5, Informative)
I don't have links handy, but they aren't terribly hard to find. Most of the population (more than 90%) can tell the difference between 24 and 48. Most (over 50%) can tell the difference on any 10fps jump (e.g. 60fps to 70fps) up to 80fps, IIRC. Beyond that it starts to dwindle, but there's still a substantial chunk (20ish%) that can tell a 10fps difference at 120fps. By 240fps you reach the point where basically no one can tell the difference between that and anything faster, no matter how much faster (e.g. 240 vs 480 fps benefits basically no one).
Also depends on the material, to an extent (Score:5, Informative)
The more fast motion/pans you have, the more noticeable framerate is. If I shoot someone sitting and talking there isn't a ton of difference between the 60fps source and 30fps final product (the AVCHD cameras I use shoot at 60fps progressive). You can see it, but it isn't something that jumps out at you. However if I shoot someone running, the difference is extremely noticeable.
Re: (Score:3)
Well, I think this comes down to the question of motion blur vs. frame rate. It has been shown that humans can perceive frames that are on screen for only an extremely short amount of time, but not that the fluidity matters. For example, if you record a plane passing by at 24 fps, you can miss it (the distance between frames is such that you don't see it); on the other hand, if you recorded the scene at >>24 fps, say 1000 fps, and then slowed it down to 24 fps, people would notice the plane but it
Re: (Score:3, Funny)
Re:Can people actually tell the difference? (Score:5, Informative)
24fps is actually the LOWER threshold. The level below which most people no longer perceive smooth motion.
Re: (Score:3, Interesting)
Actually it's 15fps. 24fps was used so that audio running alongside the film wouldn't have gaps. Learned about that in animation school.
Re:Can people actually tell the difference? (Score:4, Informative)
Oops, you posted twice. :-) BBC video is 25 frames per second..... so I don't understand the comparison.
And HDTV is up to 60 frames per second; aren't people used to seeing a rapid frame rate by now? I guess people are just weird.
Re: (Score:3)
I'll give you the benefit of the doubt since it appears as though you are asking a valid question, but I do have to say I'm tired of hearing the argument implied in this question pop up in every discussion of framerates, whether in film or games.
First and foremost, everyone should visit this link: http://boallen.com/fps-compare.html [boallen.com] Put simply, the human mind, and eyes, can perceive far more than 24, 30, or even 60 frames per second. Not consciously well enough that we can point out which image is operating
Re: (Score:3, Informative)
The 24 fps frames in film are all fluid frames. Games use still frames, since they have no natural motion blur, and when you increase the frame rate you start to improve the blur by overlapping frames. This is the same audiophile/videophile BS, because your eye sees frames at a far lower frame rate but utilizes the blur to interpret motion. Reducing the blur in the frame by increasing the frame rate will screw with the moti
Re:Uh (Score:5, Informative)
Just because it is shot at 24fps does not mean it has to be displayed at 24Hz.
Re:Can people actually tell the difference? (Score:4, Funny)
Is that what happened to the original trilogy for LotR too?
Re:Modern 120Hz+ HDTVs (Score:5, Insightful)
Stop thinking of "movies" and "TV shows" as being separate entities. It's all basically the same (actors on fake sets), and the only distinction that exists is all in your mind.
In fact, a lot of 2000s-era movies don't even use film anymore..... they're using HD videocams. Same thing TV productions use.
Re: (Score:3, Interesting)
While true, that is all utterly and completely irrelevant to what I posted. The reality is that the higher refresh-rates and "processing" going on in modern HDTVs makes "film" look like "video", regardless of the source. If you haven't seen the effect I'm talking about, you should make an effort.
Re: (Score:3)
On my TV I turned all of the processing off. The "sharpness" is turned down to 0 and ditto any other filtering. Same on my Bluray player. The video is already near-perfection and doesn't need that other crap which was initially included to "clean up" the older DVD and VHS signals.
Re: (Score:3, Funny)
I totally understand. When I got my HDTV I found the wide aspect ratio to be completely annoying. So I taped black construction paper to the left and right side of the screen, and while it isn't perfect, it is a lot less visually jarring.
Re: (Score:3)
640fps should be enough for anybody.
Re:48fps with more motion blur? (Score:4, Informative)
Actually, the "motion blur gap" is only half the width you think it is: in the olden days the shutter would be closed half the time to allow the film to spool on, so each frame is a snapshot of 1/48th of a second.
Shooting at 48fps, I would expect them to aim for a 1/96s shutter speed. I've worked on motion graphics at 50fps, and 100% motion blur still looks bad at the higher frame rate - 50% looks perfect.
Increasing motion blur to 1/16s on a 48fps shoot would be a complete mess.
Re: (Score:3)
Also interesting that you say a 50% shutter looks good with 50fps footage, and that 100% looks overly blurry (and presumably not cinematic in a good way). Have you tried 25% or even a near-0% shutter (if that even exists) to see what that would look like? If
Re: (Score:3)
The 3:2 pulldown gives you 3 refreshes of one frame, then 2 refreshes of the next, then 3 of the one after that. That variation in timing is what is annoying. With 120 Hz, it does NOT do 6 refreshes then 4 refreshes and such; it just does the obvious 5 refreshes each time. Now motion at least looks consistent. If it detects that the source material is already goofed up with the 3:2 pulldown and corrects it to 5:5, that's a plus.
Motion interpolation can then play hell with that, turning your beautif