Television Displays Technology

TV Manufacturers Unite To Tackle the Scourge of Motion Smoothing (theverge.com)

An anonymous reader quotes a report from The Verge: The UHD Alliance, a collection of companies who work together to define display standards, has announced Filmmaker Mode, a new TV setting that's designed to show films as they were originally mastered, with as little post-processing as possible. Although the mode will affect multiple settings like frame rate, aspect ratio, overscanning, and noise reduction, its most important element is that it turns off motion smoothing, which creates that horrible "soap opera effect" that makes even the most expensive films look cheap. LG, Vizio, and Panasonic have all expressed an interest in including the new mode in their TVs.

Of course, it's always been possible to turn off this setting (we've got a guide on how to do so right here) but TV manufacturers have an annoying habit of referring to the same setting by different names, confusing the process. LG calls it "TruMotion," Vizio calls it "Smooth Motion Effect," and Panasonic calls it "Intelligent Frame Creation," for example. The difference with Filmmaker Mode is that it will have the same name across every TV manufacturer, and the UHD Alliance also says that it wants the setting to be enabled automatically when cinematic content is detected, or otherwise easily accessible via a button on the TV remote.
Over a dozen high-profile directors have expressed support for the new mode, including Christopher Nolan, Martin Scorsese, James Cameron, and JJ Abrams.
  • by skogs ( 628589 ) on Wednesday August 28, 2019 @08:55PM (#59135304) Journal

    Smoothing - They know it is bad for gaming too right? And sports....and pretty much everything. In fact it never should exist. It never should be enabled by default.

    • by xjerky ( 128399 )

      I use it on PS4 games that are frame-locked to 30 FPS, to make them look like 60 FPS.

    • Re:Cinematic Only? (Score:5, Interesting)

      by Excelcia ( 906188 ) <slashdot@excelcia.ca> on Thursday August 29, 2019 @12:29AM (#59135726) Homepage Journal

      Whatever you want to do for gaming, fill your boots. But motion smoothing is amazing. The idea that the human eye can't see the jerkiness in a 24/27 FPS "film" is hogwash. The reason it has a "soap opera effect" is an unfortunate artifact of the human brain's associative memory. Smooth, fast frame rates filmed direct to video are associated in our brains with poor-production-quality shows, because shows like that pioneered direct-to-video filming. "Cinematic" feels higher quality because that jerkiness is associated in our brains with high-production-quality movies, which happen to be filmed at historically dismal frame rates.

      Motion smoothing is a godsend. It interpolates between frames and makes motion seem smooth and natural. Not like a soap opera. Like real life. People who keep that association between motion smoothing and soap operas A) watched too many soap operas, and B) need to just keep watching motion smoothing until those associations are remade in their brains.
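
      As a rough illustration of what "interpolating between frames" means, here's a minimal Python sketch (hypothetical helper names, nothing like a real TV's motion estimator) that doubles a frame rate by averaging neighbouring frames. Real interpolators estimate per-block motion vectors and warp pixels along them; naive blending like this just cross-fades and ghosts, but it shows where the synthetic frames come from.

        import numpy as np

        def blend_midframe(frame_a, frame_b):
            # Naive "interpolation": pixel-wise average of two frames.
            # Real motion interpolation estimates motion vectors and warps
            # pixels along them rather than simply cross-fading.
            mid = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
            return mid.astype(np.uint8)

        def double_frame_rate(frames):
            # Turn a 24 fps clip into ~48 fps by inserting one synthetic
            # frame between every pair of source frames.
            out = []
            for a, b in zip(frames, frames[1:]):
                out.extend([a, blend_midframe(a, b)])
            out.append(frames[-1])
            return out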

      • >"motion smoothing is amazing. Motion smoothing is a godsend. People who keep that association between motion smoothing and soap operas A) watched too many soap operas, and B) need to just keep watching motion smoothing until those associations are remade in their brains."

        It is amazing TO YOU. It is a Godsend TO YOU. To many people, including me, it is a severe distraction to the point of making everything almost unwatchable. And perhaps it is from a LIFETIME (and probably double your lifetime) of wat

        • by Jerry Atrick ( 2461566 ) on Thursday August 29, 2019 @05:53AM (#59136194)

          Motion smoothing doesn't look like real-life vision. It doesn't look like true 50/60Hz or higher source material. It sits right in the uncanny valley and just looks wrong to most of us. You can't take something filmed with built-in low-fps motion blur and magically work out what it would have looked like with only half the blur in each doubled frame; it will always be wrong.

          The only time it's an improvement is on short frame exposure times where most people think the video looks wrong anyway!

      • We think alike on this part. Unfortunately, having a different opinion or a different reaction about what and how we perceive leads others to think that there must be something wrong; see the response above: "You are mentally ill."

        Movies once did not have sound. That is not like the real world. It prevents me becoming fully submerged in a story. I see that as a shortcoming.
        Movies once were in black and white. That is not like the real world. It prevents me becoming fully submerged in a story. I see that as a shor

      • Agreed. Panning in 24fps movies takes a lot of care not to become a jerky mess. There are scores of tutorials dedicated to it, like: https://www.premiumbeat.com/bl... [premiumbeat.com]

        Perhaps interestingly, there is actually a project that does motion interpolation on PCs comparable to what TVs do:
        https://www.svp-team.com/wiki/... [svp-team.com]

      • by Ecuador ( 740021 )

        You have no idea what you are talking about. Film shot at 24fps is not the same as a video game jerkily rendering at low frame rates. Films have motion blur. If we had a magical algorithm that removed motion blur, figured out the exact in-between frames, and added them to produce a high-frame-rate video, then the result would be more "real-life" as you put it.
        The cheap motion smoothing circuits in current TVs do nothing like that and actually introduce weird artefacts just because they use relatively s

    • ...and you know what the best fix is? Stop filming movies at 24 fps and we can finally get beyond this idiocy. I propose 240 fps as the new standard for everything.

      • I agree ... but that doesn't help with all the movies that were made more than a couple of years ago.

      • by MrL0G1C ( 867445 )

        I don't like your standard because it doesn't include resolution, so I propose a different standard with 200fps (should be enough) and 8k resolution for all films, tv, game streaming etc.

        • 200 isn't evenly divisible by 24, 30, or 60, so that will be a lot rougher for legacy content.

          • by MrL0G1C ( 867445 )

            Well I guess we'll have to use smoothing then! ;-) Shame our standards leave out colour depth; 16 bits per colour should do it.

      • It's called art. Artists choose how to present their art. You don't go to a museum and complain the portraits aren't high enough resolution, do you?

        Leave movie making decisions to the professionals, please.

      • You won't like this for most movies as much as you think you will.

        I think it was the film critic Roger Ebert who talked about the best movies being like a dream, or feeling like you are in a dream. You don't want it to be too real (or hyper-real), because then your brain too easily rejects the parts of the movie that aren't real (which is the whole thing). Being in a dream-like state allows your brain to get past the fact that movies aren't real. Gaming might be a different story, or even sports o
    • >"Smoothing - They know it is bad for gaming too right? And sports....and pretty much everything. In fact it never should exist. It never should be enabled by default."

      I don't mind it existing. Some people like it. I just wish:

      1) Don't EVER force it on without a way to turn it off, persistently, and across all inputs and situations.

      2) Don't combine turning it on or off with any other setting or processing. Give us control.

      3) Give control over HOW MUCH motion interpolation. On/off is nice, but strengt

  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday August 28, 2019 @08:55PM (#59135310) Journal
    "TV Manufacturers Unite To Tackle the Scourge of Motion Smoothing"

    That they created in the first place. It's not some law of nature that TVs should have appalling postprocessing buried behind cryptic menus; TV manufacturers have dedicated their finest incompetence to creating this state of affairs. It's not a surprise that they would try to sell a move toward incrementally less self-inflicted lousiness as an achievement; but that doesn't mean we have to take them seriously.
    • by Solandri ( 704621 ) on Wednesday August 28, 2019 @09:15PM (#59135356)
      It was originally made to deal with film movies, which were shot at 24 fps. When you try to map 24 fps onto a 60 Hz screen refresh rate, you find out it doesn't divide evenly. 60 / 24 = 2.5. To make it work, some of the frames have to be shown for two refreshes, some for three. But this causes a subtle jerkiness in smooth panning shots and motion, called judder.

      To eliminate judder, TV manufacturers came up with motion interpolation to artificially generate in-between frames. That smooths out the motion and eliminates judder. But then some people are unhappy that the motion now looks smoother than in the original. (I've noticed the same technology being used on early silent films. Those were usually shot at 16 fps, so were "unacceptably" jerky. In the old days they simply played those at 24 fps, which resulted in things like Charlie Chaplin films being sped up. But most modern documentaries I'm seeing which use those old film clips (like WWI documentaries) are using motion smoothing.)

      The long-term solution is for TVs to standardize on 120 Hz. 24 fps does divide evenly into that, so you can display film movies on that without motion smoothing and without judder. But until then, I'm not sure why film directors think they should be the ones who get to decide whether we have to live with judder or motion smoothing. The viewer should really be the one deciding which artifact they prefer in their flawed reproduction of a film movie.
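
      To make the arithmetic concrete, here's a small Python sketch (a hypothetical helper, not anything a TV actually runs) that prints how many panel refreshes each source frame gets held for:

        import math

        def pulldown_pattern(source_fps, display_hz, n_frames=6):
            # How many display refreshes each source frame is held for.
            pattern, shown = [], 0
            for i in range(1, n_frames + 1):
                target = math.floor(i * display_hz / source_fps)
                pattern.append(target - shown)
                shown = target
            return pattern

        print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven cadence, judder
        print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] -> even, no judder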
      • by Wookie Monster ( 605020 ) on Wednesday August 28, 2019 @10:30PM (#59135526)
        Why can't the TV adapt to display 24 fps? Is 60 Hz built into the panel itself, and TV manufacturers cannot change this even for an integrated unit? Don't the big companies control the production of the TV and the display? This problem has never made sense to me.
        • by JBMcB ( 73720 )

          Why can't the TV adapt to display 24 fps? Is 60 Hz built into the panel itself, and TV manufacturers cannot change this even for an integrated unit?

          From what I've read, the LCD panel itself is designed for a certain refresh rate. You can go faster, but if you try to drive it slower it doesn't respond predictably.

          • I'm not sure about the rest of the world, but in the US, quite a few new TVs ARE, in fact, capable of natively displaying both 50fps AND 60fps video (via HDMI), as long as your source ignores the EDID and just blindly spews video at the desired framerate at the TV.

            Put another way... if you're running Kodi on a Raspberry Pi and you force 1080i50, 1080p50, or 720p50 over HDMI (without involving HDCP), it'll work just fine, even though the TV will swear it can't do it if asked (via EDID). On the other hand, if

        • by SeaFox ( 739806 )

          Why can't the TV adapt to display 24 fps? Is 60 Hz built into the panel itself, and TV manufacturers cannot change this even for an integrated unit? Don't the big companies control the production of the TV and the display? This problem has never made sense to me.

          Mine can. I have an old monitor/HDTV combination display and its normal refresh rate is 60 Hz, but it actually has a 24 Hz mode that can be activated by devices connected to the second (edge) HDMI port. I run my Blu-ray player and an Amazon Fire Stick on an HDMI switch on that port, and if I pull up the monitor's stats display when playing 24 fps content it does actually state it is running at 24 Hz.

        • Why can't the TV adapt to display 24 fps? Is 60 Hz built into the panel itself, and TV manufacturers cannot change this even for an integrated unit? Don't the big companies control the production of the TV and the display? This problem has never made sense to me.

          I can tell you from personal experience that the issue is due to lighting. I used to work in digital video and we accidentally displayed a 60Hz camera feed with 50Hz fluorescent lighting at a sales event. It is incredibly painful. So painful that I was up 24 hours fixing the code for the encoder and the decoder. You can't tell, but fluorescent lights flicker at whatever frequency they are supplied AC at. In the US this is 60Hz and in Europe it is 50Hz. You put 60Hz video on with 50Hz lighting and it l

        • > Why can't the TV adapt to display 24 fps?

          1. Because NTSC is 29.97 Hz, PAL is 25 Hz.
          2. Because 24 fps looks like shit (compared to 120 fps)
          3. Because it ISN'T solving the _original_ problem: Crappy low frame rates, when film should be recorded at 120+ Hz.

      • The long-term solution is for TVs to standardize on 120 Hz. 24 fps does divide evenly into that, so you can display film movies on that without motion smoothing and without judder. But until then, I'm not sure why film directors think they should be the ones who get to decide whether we have to live with judder or motion smoothing. The viewer should really be the one deciding which artifact they prefer in their flawed reproduction of a film movie.

        I don't understand why this would be a problem. You have 24 Hz content, the player selects the 24 Hz display mode, and the output on screen is 24 Hz. Why would 24 Hz content ever be output as 60 Hz?

        Only crappy services and STBs that don't properly switch display modes have these issues. Aside from broadcast media where most modern TVs will automatically detect and compensate anyway there is no reason for this problem to exist in the first place.

        • by Miamicanes ( 730264 ) on Thursday August 29, 2019 @02:19AM (#59135894)

          The big monkey wrench is HDCP. Most older implementations (and a lot of newer ones) treat mode changes as a mandatory re-authentication trigger. So, if a broadcaster had a 1080i60 show, then switched to 720p60 for a commercial before going back to 1080i60 for the show, most TVs would barf for several seconds while reauthenticating to make HDCP happy.

          IMHO, the industry REALLY needs to hit pause on ATSC 3.0, and make a slight amendment to roll it out as ATSC 3.1 from day one, with official support for 2160p120 as a video mode (but still within the 19.2 Mbps bitrate constraint). That would mostly solve the problem of mode-switching:

          24fps Hollywood films could be digitized to 2160p24, then broadcast as 2160p120 with each frame simply repeated 5 times.

          2160p30 prime-time TV shows could be encoded to repeat each frame four times.

          1080p60 TV shows could be resized to 3840x2160, then smoothed and compressed down to the original detail level, with each frame repeated twice. Done properly, the 2160p120 encoding would have more or less the same bitrate as the 1080p60 encoding, since both would ultimately have the same amount of actual detail.

          Ditto, for 720p120... scale to 3840x2160, then smooth and compress it down to fit within the bitrate constraints. As an added bonus, future CODECs could take advantage of their knowledge of the video's content to apply different smoothing & compression strategies to live video vs overlaid text... the underlying live video might have an effective resolution of 1280x720, but the overlaid text could still be crisp 3840x2160 (especially if the text were on an opaque background that lined up with the underlying macroblocks and had an effective framerate of 15-30fps... the live video would change from frame to frame, but the overlaid text's macroblocks would just keep pointing to portions of earlier frames).

          Under this scheme, the majority of content would be neither "real" 120fps nor 3840x2160, but it would solve a lot of otherwise intractable problems that the industry is going to have to deal with sooner or later anyway, if only due to the conflicting demands of TV sports (where framerate matters more than resolution) and primetime TV shows & Hollywood films (where resolution matters more than framerate... at least, for now).
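
          For what it's worth, the repeat counts above fall straight out of integer division; a quick sketch of the numbers (illustrative only, not part of any actual spec):

            # Repeat counts for carrying the source rates mentioned above
            # inside a fixed 2160p120 container: each divides 120 evenly.
            CONTAINER_FPS = 120
            for src in (24, 30, 60):
                assert CONTAINER_FPS % src == 0          # all divide 120 evenly
                print(f"{src:>3} fps source -> hold each frame {CONTAINER_FPS // src}x")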

      • by AmiMoJo ( 196126 )

        Motion smoothing really came to the fore because it helped deal with the limitations of LCDs.

        LCDs have relatively slow and variable switching speeds (it takes longer to go from black to white than from grey to grey), compensated for by things like overdrive and backlight flickering. Particularly on medium and low end sets, some motion smoothing massively improves motion clarity and makes the TV much more pleasant and relaxing to watch.

        OLED and plasma before it are a different matter. Switching speeds are ve

      • They most certainly didn't do interpolation. Obviously it was done either by the TV company or before the film was burned to DVD.

      • it doesn't divide evenly. 60 / 24 = 2.5

        Actually, it's not the only reason why the TV manufacturers started implementing motion smoothing. Something like 10 years ago, some manufacturers built 120Hz screens and tried marketing them on the notion that 120Hz is greater than 60Hz, therefore better. And I believe some attempted to go even higher. Now that standard cable programming is effectively 60Hz after deinterlacing, they came up with batshit crazy motion smoothing effects that affected even things like spo

      • I've noticed the same technology being used on early silent films. Those were usually shot at 16 fps, so were "unacceptably" jerky. In the old days they simply played those at 24 fps, which resulted in things like Charlie Chaplin films being sped up.

        There wasn't a standard frame rate in early silent films.

        Projection speeds varied with the audience, mood and timing. If the program for the night was slim, you put on the brakes and bought yourself some time. When the audience became restive, you stepped up the pace. Your pianist or orchestra was expected to adapt.

    • by Dutch Gun ( 899105 ) on Wednesday August 28, 2019 @09:36PM (#59135414)

      This is another version of audio's "loudness / compression" wars. Modern music is compressed all to hell, because to the casual listener, it sounds subjectively "better". It's also easier to listen in noisy or sub-optimal environments. Purists, on the other hand, prefer a high dynamic range that takes advantage of their high end audio systems and quiet listening environment.

      I suspect it's the same with video smoothing. My guess is that the average watcher finds it a better experience on average, while the purists abhor anything that interferes with the original presentation. I don't think TV manufacturers would have turned this option on by default if most people didn't find it more pleasing to the eye, though.

      To be honest, I don't really have a problem with the status quo, as anyone who cares enough to research this topic will find and change the setting on their TV. Unlike with music mastering, we at least still have a choice. I suppose more uniform labeling won't hurt though.

    • "TV Manufacturers Unite To Tackle the Scourge of Motion Smoothing"
      That they created in the first place. It's not some law of nature that TVs should have appalling postprocessing buried behind cryptic menus; TV manufacturers have dedicated their finest incompetence to creating this state of affairs. It's not a surprise that they would try to sell a move toward incrementally less self-inflicted lousiness as an achievement; but that doesn't mean we have to take them seriously.

      Ummm.... this "fix" means they get to sell you a whole new TV. It's win-win! (for them)

      Oh, wait. You thought they have your interests at heart?

      (LOL!)

    • by Yaztromo ( 655250 ) on Thursday August 29, 2019 @02:28AM (#59135912) Homepage Journal

      TV manufacturers have dedicated their finest incompetence to creating this state of affairs.

      It's not incompetence -- it's marketing. TVs are still often bought by people going into a store and seeing the display for a few minutes, and displays with really good motion smoothing look better when you see one beside a TV that doesn't do motion smoothing. It's the same as TVs in the previous generation that had demo modes that cranked up the brightness and saturation (or the generation before that, where the salespeople would crank them up manually for TVs they really wanted to sell).

      The thing is, what looks good for a few minutes on the show floor often isn't what looks good for long-term viewing in your living room. This is much akin to the old Cola Wars "Taste Tests", where you'd get a few mL of each cola to drink in a testing environment. With so little liquid, people nearly always picked whichever one was sweeter -- but for sitting on your porch on a hot day and drinking down a whole bottle, somewhat less sweet is universally considered better. This is what doomed "New Coke" back in the 80's -- taste tests showed people liked the new formulation better, but in the real world people liked the old recipe better. But the market research was based on the taste test, and not real world consumption.

      So if you make a TV without motion smoothing, you are likely at a disadvantage on the showroom floor when set up next to one that does. For short-term viewing of just a few minutes, the motion-smoothed TVs will look like the image is more realistic -- it's only when you find yourself trying to watch something for an hour or more that you realize it makes everything look bad.

      As a somewhat related aside, it has always made me laugh whenever a television advertisement for a TV tries to show how much better its image quality is. The image of the advertisement on your TV can only ever be as good as what your TV can already output. Advertisers know this, and usually use some tricks to saturate the colours to make your TV's image of their TV's image pop (the old Sharp Quattron [youtube.com] ads with George Takei come to mind here).

      Motion smoothing is all about a showroom arms race between the TV manufacturers. And they do it because for most of the populace, it works.

      Yaz

  • by HalAtWork ( 926717 ) on Wednesday August 28, 2019 @08:59PM (#59135316)

    I would use this for games.

    As for video, motion smoothing all the way. I wish it were built into streaming video players and Kodi, because they can grab the next frame and have fewer artifacts.

    On a big screen I don't like stuttering motion. On a small 14" screen with telecine and interlacing and other issues it wasn't evident, but we're not there anymore.

    • by AmiMoJo ( 196126 )

      It's all about comfort. My Panasonic plasma has THX calibrated modes. The most accurate is the "dark room" setting, but unless you have the lights off it's not very comfortable to watch. So I use the bright room setting or just the normal mode that I tweaked to my personal preference most of the time.

      Still looks great and it's a decent compromise between comfort and accuracy. Same with motion smoothing, I find I get annoyed by not being able to see detail in 24 fps motion so a bit of smoothing is appreciate

  • by Solandri ( 704621 ) on Wednesday August 28, 2019 @09:00PM (#59135318)
    Every company gives it a different name to make clear that their algorithm is different from everyone else's, so no one can sue them for software patent infringement. Same thing happened in physical space with IPS, PLS, and AHVA monitors. They all do the same thing, but slightly differently so as not to stomp on each others' patents. They're required by law to state the exact technology in the product specs. But the marketers, reviewers, and sales reps do us the favor of referring to all of them as IPS. (IPS = LG, PLS = Samsung, AHVA = AUO)
  • by ArchieBunker ( 132337 ) on Wednesday August 28, 2019 @09:12PM (#59135346)

    I'm not investing in a gimmicky surround sound setup to fix volume issues. I have two ears and am already watching something on a 2D display. Anything more is just a gimmick. Ohhh the sound panned across the speakers, big deal. First we were told by everyone (including Slashdot posters) that 44KHz 16bit sampling was more than adequate for human hearing. Now why does Bluray support 192KHz 24bit audio if it's indistinguishable? Seems like a lot of overkill for not even a marginal improvement.

    Funny how no one complained about the sound of VHS movies. You could hear everything just fine without sustaining ear damage. Finally I said fuck it and stuck a studio compressor before my two channel amplifier. What a relief to be able to hear dialogue and not be deafened by someone dropping a penny on the floor. VLC has a great compression feature as well, and no, I'm not talking about normalization; that is totally different.
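
    For anyone wondering what "a compressor before the amplifier" actually does to the signal, here's a deliberately crude Python sketch (hypothetical function, per-sample with no attack/release smoothing; real compressors work on a smoothed loudness envelope, so this is illustrative only):

      import numpy as np

      def simple_compressor(samples, threshold_db=-20.0, ratio=4.0):
          # Anything above the threshold is pulled down by the ratio, so
          # explosions come closer to dialogue level. Crude: works sample
          # by sample instead of on a smoothed loudness envelope.
          samples = np.asarray(samples, dtype=np.float64)
          level_db = 20 * np.log10(np.abs(samples) + 1e-12)
          over = np.maximum(level_db - threshold_db, 0.0)
          gain_db = -over * (1.0 - 1.0 / ratio)
          return samples * 10 ** (gain_db / 20.0)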

    • I think you really just want a center channel.
    • Some people have better hearing than other people. The people with the very best hearing, trained to detect imperfections, can under optimum conditions detect the difference between 44 kHz / 16 bits and better encoding.

      Some people can hear higher frequencies than the 22 kHz Nyquist limit of 44 kHz / 16 bits. 50 years ago, I could.

      A higher sample rate simplifies the math behind audio signal processing.
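
      That limit is simply half the sample rate; a quick sketch of the numbers (the "44 kHz" quoted above is really 44.1 kHz):

        # Nyquist limit: the highest frequency a given sample rate can
        # represent is half that rate.
        for fs in (44_100, 48_000, 96_000, 192_000):
            print(f"{fs / 1000:g} kHz sampling -> content up to {fs / 2000:g} kHz")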

      • by lgw ( 121541 )

        Some people have better hearing than other people. The people with the very best hearing, trained to detect imperfections, can under optimum conditions detect the difference between 44 kHz / 16 bits and better encoding.

        Ahh, audiophiles. Defrauded out of tens of thousands of dollars because the salesman convinced them they were superheroes. The best scam of them all was HDCD, where they'd put both "normal CD audio" and "HD" on the same disc so you could hear the difference. Of course, the "normal CD audio" was remastered to add extra noise, so of course the "HD" sounded better.

      • Some people have better hearing than other people. The people with the very best hearing, trained to detect imperfections, can under optimum conditions detect the difference between 44 kHz / 16 bits and better encoding.

        Nope.

      • by Viol8 ( 599362 )

        "Some people can hear higher frequencies than the 22 kHz Nyquist limit of 44 kHz / 16 bits. 50 years ago, I could."

        No, you just thought you could because you weren't actually hearing a perfect sine wave but probably something with a fundamental of 22+ kHz but lots of subharmonics. Listen to any signal generator and as you turn up the frequency the sine wave will vanish long before the square, saw or triangle appear to.

      • > Some people can hear higher frequencies than the 22 kHz Nyquist limit of 44 kHz / 16 bits. 50 years ago, I could.

        This one is tricky. More people can sense 24-28 kHz sound through sympathetic vibrations in their ear bones than can hear it with their eardrums (for which 20 kHz is usually plenty).

        Some people do hear a lower harmonic and some can be shown to have perception in a lab test but don't experience it as sound.

        It's probably one of the factors that makes live music better. I'm glad we have better

    • I have two ears

      Two ears capable of distinguishing the direction of sound in 3D space.
      Surround does add to the experience. For me personally, only a little. For others it adds more.

      First we were told by everyone (including Slashdot posters) that 44KHz 16bit sampling was more than adequate for human hearing.

      It is.

      Now why does Bluray support 192KHz 24bit audio if it's indistinguishable? Seems like a lot of overkill for not even a marginal improvement

      Marketing. People with Monster cables are willing to pay for it.

      Dynamic range has a purpose. The real world has dynamic range. Whispers are quiet and explosions are loud. Many people want this in a movie too.
      There are of course times when this is not appropriate, and adding a compressor then seems reasonable.
      And if you have problems hearin

      • by Viol8 ( 599362 )

        "Two ears capable of distinguishing the direction of sound in 3D space."

        Not always and not as well as we think. We often turn our heads subconsciously slightly to figure out whether a sound is in front of or behind us, so the brain gets extra information from the direction of pan.

    • First we were told by everyone (including Slashdot posters) that 44KHz 16bit sampling was more than adequate for human hearing.

      It is.

      (although it's not ideal for the signal reconstruction filter in the playback devices... 48kHz allows a bit more margin even though it's technically not "necessary")

      Now why does Bluray support 192KHz 24bit audio if it's indistinguishable?

      Because audiophools are willing to pay extra for it.

    • by AmiMoJo ( 196126 )

      The main "benefit" of 24 bit audio is that it allows for greater dynamic range. For some use that may not be a good thing - personally I find i annoying when I have to turn up the volume for the dialogue and then suddenly there's a loud bang and I'm scrambling for the remote.

      Some of the better soundbars offer a dialogue enhancement mode that actually works, but really it would be nice if the Blu-ray came with a special soundtrack that was suitable for home use and had clear dialogue. That's why VHS sou
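
      The usual back-of-the-envelope figure is roughly 6 dB of dynamic range per bit; a quick worked example (illustrative arithmetic only):

        import math

        # Theoretical dynamic range of linear PCM: about 6.02 dB per bit.
        for bits in (16, 24):
            dr = 20 * math.log10(2 ** bits)
            print(f"{bits}-bit PCM -> roughly {dr:.0f} dB of dynamic range")
        # 16-bit ~ 96 dB, 24-bit ~ 144 dB; the extra headroom matters far
        # more in mixing/mastering than it does in a living room.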

    • Paraphrasing your comment:

      I have a problem, there exists a solution, but I think the solution to my problem is a gimmick because I'm an angry ranter.
      I have only 2 ears so I only hear in 2D and have no spatial awareness because I don't know how biology or physics works.
      I don't understand why the Nyquist theorem works for my ears, nor do I understand why it would be beneficial not to resample audio that is mastered at higher rates for very specific reasons, which I won't ever bother to look up. Therefore it must al

      • Everyone complains about the sound mix of movies when watching them at home. Movies are mixed so badly that when dialogue is at a reasonable level the action scenes cause physical discomfort because they are 25dB louder. VHS did have perfect audio. Did anyone need subtitles or thousands of dollars in amplifiers and speakers to get good results? No. You could watch anything without touching the volume control once.

    • Kodi has great audio compression as well. Go into the Audio settings and put Volume amplification at -15 dB to -20 dB and it will sound great!

      Now if only podcast players would get compression; a lot of people don't know how to produce audio, and I'd like to do something on my end about it without having to apply a filter in Audacity myself every time.

    • by thomst ( 1640045 )

      ArchieBunker snarled:

      I'm not investing in a gimmicky surround sound setup to fix volume issues. I have two ears and am already watching something on a 2D display. Anything more is just a gimmick. Ohhh the sound panned across the speakers, big deal.

      Surround sound and volume are two different issues. Conflating them is stupid.

      I have a surround sound system that actually delivers completely convincing sound from all directions. It uses good-quality bookshelf speakers (Boston Acoustics and Polk Audio) with 8-inch woofers for the side and rear channels, towers with 18-inch woofers for the mains, a Klipsch center-channel speaker and a WAY overpowered 12-inch subwoofer for the frequencies below 120 Hz that I have to throttle down almost

  • by SvnLyrBrto ( 62138 ) on Wednesday August 28, 2019 @09:42PM (#59135430)

    There's nothing special or magical about 24fps. In fact, in any other context it would be and is considered pathetically inadequate. Even NTSC's 30fps (Yeah, I know... 29.97.) would have been laughed at all the way back in Quake 3's era.

    24fps, far from being some gold-standard of visual quality, was merely a cost-savings measure. Film used to be stupidly expensive back in ye olden days. And somebody figured out that 24fps was the bare minimum for a movie to fool the human eye and not look like absolute garbage. The only reason higher frame rates look "bad" or "strange" to us is that we're used to 24fps. If the movie industry were to switch to 48fps or better across-the-board, it'd look odd for a couple of years at most. Then we'd all be used to it. And we'd all recognize 24fps as the trash that it is.

    • by Waffle Iron ( 339739 ) on Wednesday August 28, 2019 @10:26PM (#59135516)

      Exactly. This whole thing just shows how the human mind is totally susceptible to bias created by early conditioning.

      If the Hollywood film makers had for some reason decided to market "premium" 48 fps movies starting in the 1940s (which was certainly technically doable), then today's consumers would flip their positions and say that smoothed TV content looks "deluxe", and 24 fps content looks "cheap and jerky" like a B movie.

      • There IS one definite technical downside to high-framerate theatrical content... as framerate increases, dialogue timing becomes absolutely CRITICAL. At 24fps, the lips are mostly a blur, so you can get away with fairly sloppy timing. At 120fps, your dialogue soundtrack's timing has to be absolutely SPOT-ON, or it's going to look like the lipsync of dubbed Japanese anime. And dubbing into other languages in general is a lot harder to pull off convincingly, because viewers will INSTANTLY notice that what the

        • by thomst ( 1640045 )

          Miamicanes pointed out:

          There IS one definite technical downside to high-framerate theatrical content... as framerate increases, dialogue timing becomes absolutely CRITICAL. At 24fps, the lips are mostly a blur, so you can get away with fairly sloppy timing. At 120fps, your dialogue soundtrack's timing has to be absolutely SPOT-ON, or it's going to look like the lipsync of dubbed Japanese anime. And dubbing into other languages in general is a lot harder to pull off convincingly, because viewers will INSTANTLY notice that what they're hearing doesn't align with what they're seeing.

          Put another way, with even SLIGHTLY sloppy timing, EVERYTHING starts to look like a sloppy dub, even when it really IS the actual actor speaking.

          That problem is almost always caused by poor ADR (Automated Dialogue Replacement).

          What's called "wild sound" (i.e. - the audio signal recorded with the original, unedited footage) is frequently of unacceptable quality - even in a carefully-controlled studio setting - due to any number of causes, including incidental sounds (air conditioning, for instance) and poor microphone placement (which is often unavoidable, because you do NOT want microphones visible in the frame) that results i

    • by xjerky ( 128399 )

      Right. It's so strange that people are sticklers for pixel resolution, but not temporal resolution.

    • Savings (Score:5, Insightful)

      by JBMcB ( 73720 ) on Wednesday August 28, 2019 @11:04PM (#59135594)

      There's a tradeoff. Yes, 24fps doesn't look nearly as good as 30 or 60fps. Higher frame-rates benefit from being brighter and looking sharper as well. And there's the problem. I saw The Hobbit in high frame rate, and almost everything looked fake. Because almost everything was fake. The makeup. The CGI sets. The fake rocks. It was all fake and looked fake.

      24fps smears out the detail so you don't see enough detail to notice everything is on a movie set. Ever see Darth Vader's costume up close in real life? The control thing on his chest looks like it's from a Halloween costume a ten year old made himself. But in the movie it looks awesome. Because you can't see it that well.

      So, before we get used to high frame rates, TV and film producers are going to have to get used to sinking serious money into making more realistic looking sets.

      • 24fps smears out the detail

        When you're panning or moving quickly. I saw the normal version of The Hobbit too, hate to say but it still looked fake.

      • Let your eyes adjust. Because everything looks so different, you're hyper-aware of a lot more of the visuals. You're automatically analyzing everything more.

        At first I turned it on because it was gimmicky and I just wanted to see how each different show or movie would look with it on. But after living with it for a few days or a week the sensation got normalized and I don't have that feeling anymore.

        Instead now when I look at something that doesn't have it on, it looks off, jerky, and stuttery.

        Try the exper

      • The Hobbit CGI characters were the least believable. They just about need to start from scratch on VFX at higher frame rates, because none of it is believable.

        I'm fairly certain that the CGI needed more motion blur to look realistic.

    • Just film in 3D. It's obviously better fidelity because you have twice the content. Better fidelity makes it automatically a better movie right?

      Or maybe - just maybe - movie makers choose 24 fps for artistic reasons and you should leave movie making to the professionals and stop trying to pigeon hole the content so you have a reason to have a 240hz TV.

    • by sootman ( 158191 )

      > There's nothing special or magical about 24fps...
      > Even NTSC's 30fps... would have been laughed at all the way back in Quake 3's era.

      *sigh*

      There's more to it than you know. When you FILM something at 24 or 30fps, you are taking ACTUAL PHOTOGRAPHS of the real world, and if something is moving, each frame will capture a bit of blur, which looks realistic when played back at 24 or 30fps.

      When rendering individual frames for a game, THERE IS NO MOTION BLUR. (This is being worked on but it's not in wide u
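
      A rough way to see how much blur each filmed frame carries is the 180-degree shutter rule of thumb (exposure is about half the frame interval); a quick sketch of the numbers, purely illustrative:

        # 180-degree shutter rule of thumb: per-frame exposure is about half
        # the frame interval, which is where film's motion blur comes from.
        for fps in (24, 30, 60, 120):
            exposure_ms = 1000 / fps / 2
            print(f"{fps:>3} fps -> ~{exposure_ms:.1f} ms of blur per frame")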

    • 24fps film is a medium, just like, say, oil or acrylic in painting. Applying motion smoothing to content that was originally shot at 24fps is akin to insisting on seeing a classical or modern oil painting only after applying some kind of filter that makes the medium less blurry. In other words, this is almost always bad, because the people who shot and edited the film certainly intended it to be seen at 24fps, not the other way around.

    • by k6mfw ( 1182893 )

      What about filming a subject I am interested in watching? I never was interested in Star Wars, Marvel comic heroes, Pirates of the Caribbean sequels, etc. Also, being an old guy, I have seen all the old movies (I stopped watching TCM because they rerun the same stuff each year). But that's me.

      Your question about 24fps and NTSC, and computer monitors at 59.94 Hz... all because olden days film was expensive. Fascinating history though.

  • by jaa101 ( 627731 ) on Thursday August 29, 2019 @12:33AM (#59135730)

    This is not a complaint about frame interpolation; it's a complaint against making motion pictures look realistic. How can you tell? Because people complaining about the "soap opera effect" caused by up-sampling to 60fps or 120fps also complain about The Hobbit at 48fps. There's no up-sampling in the Hobbit; it was filmed at 48fps.

    Yes, there are artefacts when TVs do frame interpolation, but I find them way less objectionable than what happens when you do slow pans with 24fps content. This is especially true when you have 3:2 pull-down on a 60fps display but, even on a 120fps TV, 5:5 pull down looks horrible for slow pans compared to just about any frame interpolation. Good TVs already allow you to turn off frame interpolation. People who care can buy those. People who don't care watch their TVs in "showroom" mode anyway.

    If the studios are so against interpolation artefacts, just release 60fps or 120fps versions of movies. The effort to do a good job at this has to be way less than that required to create 3D versions or colourised versions. And the result should be so much better than what a TV can do live, however much smart AI is stuffed in there.

    Want to make TVs more realistic? Start by killing off the ridiculous "showroom" and "dynamic" modes that turn every show into a psychedelic cartoon. And so much of the new HDR content is way over-done too. Less is more!

    Back to the original question, it seems there's a group of cinema purists who think that quality content can only ever be in 24fps, that the characteristic stutteriness that this frame rate causes will forever be associated with the best artistic works.

    • by lgw ( 121541 )

      This is not a complaint about frame interpolation; it's a complaint against making motion pictures look realistic. How can you tell? Because people complaining about the "soap opera effect" caused by up-sampling to 60fps or 120fps also complain about The Hobbit at 48fps. There's no up-sampling in the Hobbit; it was filmed at 48fps.

      Right, it's not the up-sampling that's the problem, it's the too-high frame rate, however arrived at. Looks like garbage, for film. Can be fine for a nature documentary, or sports, or something else that's realistic instead of artistic.

      Want to make TVs more realistic? Start by killing off the ridiculous "showroom" and "dynamic" modes that turn every show into a psychedelic cartoon.

      Showroom mode looks better on the showroom floor, where most people still make buying decisions. It's never going away. It would be great if it weren't the default, but that would require the salesmen setting up the displays to be smart enough to change the default, so don't

    • by AmiMoJo ( 196126 )

      I wish Hollywood would figure out how to film action scenes. Often they are just a whirling mess where you can't see anything, with rapid cuts that make the action impossible to follow.

      Compare to 80s movies coming out of Hong Kong where they used editing to actually help the viewer follow the action. Like when someone throws a punch, they might cut at the moment of contact for dramatic effect but the new shot will actually rewind time a bit so you have a couple of frames to see the punch coming in again bef

    • There's no up-sampling in the Hobbit; it was filmed at 48fps.

      And soap operas are actually filmed at 60 fps (fields per second in this case)

  • by ruddk ( 5153113 ) on Thursday August 29, 2019 @01:52AM (#59135834)

    24 FPS is awful, I don’t get why anyone likes to look at that.

    • >"24 FPS is awful, I donâ(TM)t get why anyone likes to look at that."

      Because not everyone is the same as you. I can't stand high-frame-rate video for movies and TV. But conversely, I love [well-done] 3D movies/TV while many others do not.

      And then there is the majority, who can't tell or don't care. Probably way more than half the people out there can't tell any difference between quality 720P and 1080P resolution on a large screen at normal viewing distances. Perhaps 99% can't tell any differenc

  • by Ozan ( 176854 ) on Thursday August 29, 2019 @02:26AM (#59135908) Homepage
    Film the movies at 60fps and there is no need to upsample them in the first place. Action films would massively benefit from it, also sports broadcasts. Of course this would mean going all digital, which means Christopher Nolan would not be on board.
  • by peppepz ( 1311345 ) on Thursday August 29, 2019 @02:30AM (#59135916)
    It's time we move away from 24fps and enjoy the realistic picture that modern TVs allow. It's not right to hold progress back because of the fetish for outdated technology of a club of fanatics.
    (xkcd 598) [xkcd.com]
  • by evanh ( 627108 )

    The interpolation feature is only there because of the pathetic 24 FPS films that are way out of touch now.

    BTW, it's not a new feature they're adding. There has always been a low-latency, minimally processed "game" mode on these TVs anyway.

  • All TV manufacturers have to do to support this "mode" is not enable DNR, edge enhancement and other image processing effects in the first place. Or at least ask during setup. Is that really hard?

    I don't even understand why it is enabled by default. If you're a sports fan, then DNR totally screws up grass and detail. If you're a movie fan, then it turns skin into plastic and strips out detail. If you're a gamer then it introduces frame latency and other visual glitches. Who is it meant for?

  • I looked at a few youtube videos on "motion smoothing" and I can't easily tell the difference. My TV at home is 1080p but not new enough to be a smart TV, I don't know that it has this feature at all. Is this really a big deal to everyone else?
