Open Source Entertainment Technology

Netflix Finds x265 20% More Efficient Than VP9 (streamingmedia.com) 178

Reader StreamingEagle writes (edited): Netflix conducted a large-scale study comparing x264, x265 and libvpx (Google-owned VP9) under real-world conditions, and found that x265 encodes used 35.4% to 53.3% fewer bits than x264, and 21.8% fewer bits than libvpx, when measured with Netflix's advanced VMAF assessment tool. This was the first large-scale study to use real-world encoder implementations and a large sample size of high-quality, professional content. A Netflix spokesperson explained why they did the test in the first place: "We wanted to understand the current state of the x265 and libvpx codec implementations when used to generate non-realtime encodes optimized for OTT use case. It was important to see how the codecs performed when testing on a diverse set of premium content from our catalog. This test can help us find areas of improvement for the different codecs."
This discussion has been archived. No new comments can be posted.


  • by John Smith ( 4340437 ) on Tuesday September 06, 2016 @10:23AM (#52834557)
    I just want websites to use the HTML5 video player as opposed to Flash. x265 is not very important except for 4K content and mobile phones. It will, though, eventually become the standard.
    • Re:Mostly... (Score:5, Insightful)

      by Black.Shuck ( 704538 ) on Tuesday September 06, 2016 @10:29AM (#52834613)

      x265 is not very important except for 4K content and mobile phones

      I think a 20% overall reduction in bandwidth for an in-demand and still-growing medium is *very* significant no matter what the resolution or device is.

      • Re:Mostly... (Score:5, Informative)

        by geek ( 5680 ) on Tuesday September 06, 2016 @10:44AM (#52834693)

        x265 is not very important except for 4K content and mobile phones

        I think a 20% overall reduction in bandwidth for an in-demand and still-growing medium is *very* significant no matter what the resolution or device is.

        The real number is 34-53%. The 20% is just over VP9. The 34-53% is over their current streaming codec.

        • by Luthair ( 847766 ) on Tuesday September 06, 2016 @10:46AM (#52834715)
          Clearly though, the image produced by mpeg2 is much superior to h265, much like records sound better than CDs.
          • Not if you use the same bit rate.

            • by Rockoon ( 1252108 ) on Tuesday September 06, 2016 @10:57AM (#52834795)
              mpeg2 uses gold bits, while h265 uses copper and aluminum.
              • by Matt.Battey ( 1741550 ) on Tuesday September 06, 2016 @11:42AM (#52835171)

                Yes but Capacitance Electronic Discs have a much warmer picture, especially when paired with new old-stock cathode ray tube amplifiers.

                • by doom ( 14564 )

                  But for best results, you need to run a green magic marker around the rim.

                  Seriously, if you want to compare to the vinyl wars, the real take-away is that human beings like whatever they like, and it's possible to do something that's technically "better" that nevertheless has a negative subjective impact... (myself, I couldn't care less if the "warm" sound I think I hear on vinyl is just surface-noise effects-- if I like noisy better than clean, give me noisy).

                  One hopes that they've got this covered in th

                  • I'm surprised the various players don't have a "vinyl" mode; all you'd need to do is add some hiss and the occasional pop.

          • They are comparing the bit rate of the streams at comparable image quality. They're not using MPEG2 now; that would be even less efficient than their current H.264 encoding.

            One problem is that few computers and streaming boxes have H.265 hardware decoding support. That means that the decoding has to be done in software. (The latest generation of video cards DOES include H.265 decode support.) At best that means higher power consumption; at worst it means degraded playback because the CPU isn't up to the tas

      • I have noticed that x265 requires much more CPU for encoding AND decoding than x264. For example, my slightly aged laptop will not handle playing my 1080p x265 streams.

        • by jandrese ( 485 ) <kensama@vt.edu> on Tuesday September 06, 2016 @11:28AM (#52835039) Homepage Journal
          The intention is that the decoding is handed off to the GPU. Doing x265 on the CPU is inefficient, and completely impractical on devices like phones and tablets.
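          A quick way to see where that decode cost lands on a given box is to time a pure-software decode against one where ffmpeg is allowed to grab a hardware accelerator. A minimal sketch, assuming an ffmpeg binary on the PATH; "sample_hevc.mkv" is a hypothetical test clip, and since ffmpeg quietly falls back to software when no accelerator is available, near-identical timings suggest the decode stayed on the CPU:

            import subprocess
            import time

            def decode_seconds(path, use_hwaccel):
                # Decode to a null sink so nothing is written to disk.
                cmd = ["ffmpeg", "-v", "error"]
                if use_hwaccel:
                    # Let ffmpeg pick any available accelerator (VAAPI,
                    # DXVA2, VideoToolbox, ...); it falls back to software
                    # decoding if none works.
                    cmd += ["-hwaccel", "auto"]
                cmd += ["-i", path, "-f", "null", "-"]
                start = time.monotonic()
                subprocess.run(cmd, check=True)
                return time.monotonic() - start

            clip = "sample_hevc.mkv"  # hypothetical test clip
            print("software decode: %.1fs" % decode_seconds(clip, False))
            print("hwaccel decode:  %.1fs" % decode_seconds(clip, True))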
        • Comment removed (Score:5, Informative)

          by account_deleted ( 4530225 ) on Tuesday September 06, 2016 @11:50AM (#52835275)
          Comment removed based on user account deletion
        • by Bengie ( 1121981 )
          Of course you did. Nearly all CPUs and GPUs support accelerating x264, but not x265.
          • by jedidiah ( 1196 )

            > Of course you did. Nearly all CPUs and GPUs support accelerating x264, but not x265.

            Even without GPU decoding, most CPUs have gotten powerful enough to decode h264 without any special help.

            h264 just got OLD, much like mpeg2 did.

            Now we have the new shiny shiny that's almost guaranteed to choke general-purpose processors just like its predecessors used to.

          • The latest generation of desktop and laptop CPUs does, including NVidia GTX 10x0, ATI RX4x0, and the integrated GPUs in new Intel and AMD CPUs. But there is a lot of installed base that lacks H.265 hardware decoding. Over on the mobile side, the Snapdragon 820 has it but I doubt the previous generation does.
      • Elsewhere, I saw something about x265 not being so good (comparatively) at higher bandwidth anyway. So the question is also whether this was a low-bandwidth or high-bandwidth test.
    • The only thing preventing HEVC/H.265 from being supported natively in browsers is the patent license terms. The developers of x265 have made a proposal to fix this situation. See http://x265.org/proposal-accel... [x265.org]
      • by donaldm ( 919619 )

        The only thing preventing HEVC/H.265 from being supported natively in browsers is the patent license terms. The developers of x265 have made a proposal to fix this situation. See http://x265.org/proposal-accel... [x265.org]

        The Edge browser does support H.265 but, surprise surprise, it does not support many open formats. If you go to this site [html5test.com] and compare Chrome, Firefox, Edge and your preferred browser, paying particular attention to video and audio support, you can see this.

        BTW, that link was an interesting read, although I don't think patents are really going to stop the home user from using the particular codecs. It is surprisingly easy using tools like Handbrake (it really does hammer your PC though)

        • by Kjella ( 173770 )

          It is surprisingly easy using tools like Handbrake (it really does hammer your PC though) to convert from one codec to another as well as converting 8bit to 10bit or even 12bit. The main reason to convert H264 to H265 is the fact that you can get a reduction in file size from 55% to 65%

          transcoding from one lossy codec to another -> *facepalm 1*
          8-bit source -> 10/12-bit destination -> *facepalm 2*
          55-65% and in some cases much better than that -> *facepalm 3*

          Sometimes, even Picard and Riker are not enough...

          • If you care about storage space more than you care about quality, encoding to h265 is fine. Please use the cleanest source, though, unless you're a dumbass and you don't care about generational loss.

            10-bit encoding of 8-bit sources is retarded. But it's a thing they do, especially the anime kiddos. It effectively acts as a variable denoise filter, which is useful when they're ripping from Crunchyroll, OTA/cable/satellite, or even BluRay. 99% of the time the content gets quantized back the same (or
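            For reference, this is roughly what that kind of re-encode looks like when driven from Python; a minimal sketch, assuming an ffmpeg build with libx265 enabled (Handbrake drives the same encoder underneath), and the file names here are hypothetical:

              import subprocess

              subprocess.run([
                  "ffmpeg", "-i", "source_h264.mkv",  # hypothetical 8-bit H.264 source
                  "-c:v", "libx265",                  # the x265 encoder
                  "-pix_fmt", "yuv420p10le",          # force a 10-bit pixel format
                  "-crf", "22",                       # constant-quality mode
                  "-preset", "slow",                  # slower preset = better compression
                  "-c:a", "copy",                     # pass the audio through untouched
                  "out_hevc_10bit.mkv",
              ], check=True)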

        • I didn't know about this, so I researched and found this page [github.com] that explains "10-Bit H.264".

          I offer it here so that someone can complain that I am karma whoring.
          • Freudian slip on Line #71 [github.com] ?

            And what does this all mean for my precious fagsubs?

            Shouldn't that be: fan subs ? :-)

            • by Falos ( 2905315 )
              Those communities get a lot of mileage out of "fag". It can still be used for gays, but is such an exceedingly prevalent catchall it even becomes affectionate or respectful in some contexts. Not most. Usually it's negative, serving as the same generic disapproval appendage Matt & Trey decided the town of South Park employs.

              You can issue reprimands about how it harms gay culture or whatever, but it only furthers their bemused fascination. Today, it's a casual sardonic remark on their own fringe cultur
    • HTML5 - Video (Score:5, Informative)

      by DrYak ( 748999 ) on Tuesday September 06, 2016 @12:15PM (#52835513) Homepage

      Speaking of which (HTML5 Video and Netflix):

      The IETF has a working group to produce a new-generation video codec, "NetVC [wikipedia.org]" (designed for easy, wide adoption, like the IETF's previous efforts such as Opus for audio).

      The main candidate is from a group called "AOMedia [wikipedia.org]" (the Alliance for Open Media), working on AV1 (AOMedia Video 1).

      The association includes:
      - Google (of YouTube fame): they are using their current development as a base for AV1 (what would have become VP10 if there wasn't this whole NetVC story).
      - Xiph (of Vorbis and Opus fame, with contributions toward FLAC, Speex, etc.): they are developing a very interesting project called Daala, and they ended up contributing the innovations made for Daala into AV1.
      - Cisco: they contributed what they had developed for their Thor codec into AV1 as well.

      Netflix has also joined AOMedia and is investing resources into it.
      Same with several browser makers (including Mozilla).

      With all the people involved:
      - you know there's some interesting performance coming (given the brains involved here, given past successes like Opus, and given the promising results of research projects like Daala).
      - given that 2 top content providers like Google (Youtube) and Netflix are on board, there's a high chance of seeing deployment of the new codec.
      - given that browser makers like Google (chrome), Mozilla, and Microsoft (Edge) are on board, there's a high chance of seeing browser support for the new codec.
      - given that hardware chip makers like ARM, Intel, AMD, Nvidia, etc. are on board, there's going to be hardware decoding support.

      (Adobe is on board too, so browser support is guaranteed for the Widevine DRM plug-in required by Netflix' licensors. Not that it matters that much, because that part of HTML5 Video is already defined and deployed everywhere, except maybe with Firefox on Linux which is a bit delayed)

      But you know that this looks promising,
      and maybe same time next year we'll be reading summaries along the lines of "Netflix and Google find AV1 20% more efficient than HEVC/H.265" "And also cheaper, royalty-free and widely supported"

      • Adobe is on board too, so browser support is guaranteed for the Widevine DRM plug-in required by Netflix' licensors.

        Google owns Widevine, not Adobe. Did you mean Primetime?

    • > x265 is not very important except for 4K content and mobile phones.

      This is exactly the mindset that leads to waste, inefficiency and bloat. If you can save even 5% on bandwidth, which is a limited resource, WHY NOT DO IT? And this is way more than 5%.

      • by Bengie ( 1121981 )
        Bandwidth is rarely limited, just more expensive than better encoding. Spend an extra $5k once in electricity encoding with a better codec, save $50k/month in bandwidth. But I do agree with your statement: why not save if you can?
        • The problem is it's the mindset and everyone seems to have it these days. "Oh, bandwidth isn't limited" "processors are more powerful" etc. You get a program built with 10 pieces of separate code that everyone failed to optimize and suddenly it's sucking 3x the bandwidth and 5x the processing power that it would have if everyone involved actually thought about optimization.

          Where I work we go back and refactor a lot of our code and queries when we have opportunities to do so and in some cases it's horrify

          • by Bengie ( 1121981 )
            I've had queries that took 3 minutes and got them down to under 10 seconds, for a nightly job that no one cared about. Until some large dataset came along and made that 3 minute run go for 24+ hours before a DBA killed it. At which point I tossed my version in and it ran in less than a minute.

            In my experience, good code not only runs much faster, but scales linearly, while bad code scales exponentially. All of those scrubs and their "micro benchmarks" showing that their code is fast enough, then you throw
          • The problem is it's the mindset and everyone seems to have it these days. "Oh, bandwidth isn't limited" "processors are more powerful" etc. You get a program built with 10 pieces of separate code that everyone failed to optimize and suddenly it's sucking 3x the bandwidth and 5x the processing power that it would have if everyone involved actually thought about optimization.

            Where I work we go back and refactor a lot of our code and queries when we have opportunities to do so and in some cases it's horrifying the waste that's been put up with. A SQL query that "works" but takes 30 seconds gets rewritten properly and now takes .3 seconds. Back when that was the only app on that server it ran faster so it was "no big deal" but now it saves hours of batch processing time per day. And this is EVERYWHERE. People wonder why their octa-core phones run barely faster than phones from 3 years ago? Shitloads of wasteful coding in those app design tools is my guess.

            That SQL query that "works" was effective for the time. If the (then) future hadn't gone the way that it did, that 30 seconds may have continued to be completely acceptable. Judging code on CURRENT use case misses the point of skipping premature optimization (and using that time saved on something else) when it's not needed.

            That said, there often comes a time when it's required. That's when you do the optimization. Which sounds like it's exactly what you've done.

            Working as intended.

            • by Bengie ( 1121981 )

              If the (then) future hadn't gone the way that it did

              If there is anything I have learned, it's that if something can go wrong, it WILL go wrong. Don't let it go wrong in the first place. Spend the extra 1 second and think about your code before you vomit your crap code into production.

              It isn't "premature optimization" until it makes it some degree harder to understand than the non-optimized version. Most of what people call "premature optimization" is really just writing correct code. Should I use String.EndsWith or String.Contains to find if a string has a ce

      • This is exactly the mindset that leads to waste, inefficiency and bloat. If you can save even 5% on bandwidth, which is a limited resource, WHY NOT DO IT? And this is way more than 5%.

        Because h.265 requires hardware decoding to be even moderately watchable and the vast majority of devices that people currently own are not capable of decoding the latest h.265 videos. Netflix could force people to upgrade which would cause them to buy new decoding hardware and the waste you mention would shift from wasted b

      • by jedidiah ( 1196 )

        Converting is not free.

        Neither is replacing all of the decoders out there that can't decode the new format yet. Some of these decoders may be embedded in devices that consumers have no intention of replacing.

        Dumping a real standard on a whim is a bit of a bitch.

    • by Greyfox ( 87712 )
      I was looking last month and couldn't find an encoding standard that all browsers support. Google wants you to drink their kool aid, and seems to only support VP8 and 9 and ogg audio formats. IIRC, Firefox and IE both support h.264 and AAC. None of the browsers seem to support UDP streaming at all.

      If I want to do zero-latency live streaming to just a few endpoints outside the browser, I found that mpeg2video and aac streaming over UDP encodes and transmits with the least amount of lag. Near as I can tell,
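      For anyone wanting to try the same pipeline, here's a minimal sketch of it, assuming an ffmpeg binary on the PATH; the input file and the RFC 5737 receiver address are hypothetical:

        import subprocess

        subprocess.run([
            "ffmpeg", "-re",                     # read input at its native frame rate
            "-i", "input.mp4",                   # hypothetical source file
            "-c:v", "mpeg2video", "-b:v", "4M",  # cheap-to-encode video at a fixed bitrate
            "-c:a", "aac", "-b:a", "128k",
            "-f", "mpegts",                      # MPEG transport stream container
            "udp://192.0.2.10:1234",             # hypothetical receiver address
        ], check=True)

      On the receiving side, something like ffplay udp://0.0.0.0:1234 should pick the stream up.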

  • I thought VP10 was supposed to be the real competitor to HEVC.

    • Re:What about VP10? (Score:4, Informative)

      by Merk42 ( 1906718 ) on Tuesday September 06, 2016 @10:49AM (#52834739)
      I don't quite understand the situation, but I'm guessing it won't exist [wikipedia.org]?

      However, Google decided to incorporate VP10 into AOMedia Video 1 (AV1). The AV1 codec will use elements of VP10 as well as the experimental formats Daala (Xiph/Mozilla) and Thor (Cisco).[72][73] Accordingly, Google has stated that they will not deploy VP10 internally or officially release it, making VP9 the last of the VPx-based codecs to be released by Google.[74]

      • Re:What about VP10? (Score:4, Informative)

        by NotInHere ( 3654617 ) on Tuesday September 06, 2016 @11:20AM (#52834973)

        Well, that AV1 codec will have a much higher chance of success as more industry partners are involved, and it will get better hardware decoding support (both from the vendors, and from the codec itself being designed with hardware decoding in mind). Netflix is part of AOM as well. The only bigger company of significance which is _not_ a member of AOM is Apple, who, surprise surprise, sits at the MPEG table and makes bucks from their H.265 patents.

        • by DrYak ( 748999 )

          The only bigger company of significance which is _not_ a member of AOM is Apple, who, surprise surprise, sits at the MPEG table and makes bucks from their H.265 patents.

          Which also explains why they insist on using AAC for everything audio,
          despite the tremendous success and performance of Opus - the previous similar success story of company collaboration (Xiph and Skype) to provide a standard to the IETF.

          (Opus is currently used by Skype, WhatsApp, etc. - basically, if it's on the web today, it probably uses Opus as an audio codec.)

          (I really wish the IETF and AOMedia similar success with their NetVC initiative/AV1 codec.)

      • Correct. It's the foundation for the new format, but it will be enhanced further using Daala, Thor, and Opus.

        Considering members of the new Alliance for Open Media consist of just about all browser manufacturers (Microsoft, Google, Mozilla), chip manufacturers (Intel, AMD, ARM, Nvidia), and major content providers (Netflix, Amazon, Google), it's going to be widely supported/adopted quickly.

        h.265 might have an advantage right now with the current iteration, VP9, but if the industry has put their weight behind this free Ope

  • by Solandri ( 704621 ) on Tuesday September 06, 2016 @11:34AM (#52835111)
    All these codecs allow you to choose the bitrate, so efficiency is meaningless without a common basis for comparison. In this case it turns out they mean efficiency at the same video quality. But video quality is a completely subjective thing - how can you compare it in a reproducible manner? So I dug into how Netflix is deterministically measuring something subjective. That in itself is a pretty fascinating read [netflix.com].

    tl;dr - they took subjective test results from showing video samples to people, then used machine learning to develop an algorithm which produced similar results.
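    The same metric can be run at home: ffmpeg ships a libvmaf filter that scores an encode against its reference. A minimal sketch, assuming an ffmpeg build with --enable-libvmaf and two hypothetical files that already match in resolution and frame rate:

      import subprocess

      result = subprocess.run(
          ["ffmpeg", "-i", "encode.mp4",   # distorted encode (hypothetical)
           "-i", "reference.mp4",          # pristine reference (hypothetical)
           "-lavfi", "libvmaf",            # score the first input against the second
           "-f", "null", "-"],
          capture_output=True, text=True, check=True)

      # ffmpeg prints the aggregate score to stderr, e.g. "VMAF score: 93.4".
      for line in result.stderr.splitlines():
          if "VMAF score" in line:
              print(line.strip())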
    • "how can you compare it in a reproducible manner"

      If you bother to look it up, you'll see exactly how this is done.

      It's widely used and there are numerous automated benchmarking systems.

      • by doom ( 14564 )

        If you bother to look it up ...

        And if you'd bothered to finish reading the post, you would see that he did look it up, and gave us a link, and a one-sentence summary.

        Accusing someone of being lazy when you can't read four sentences may be a new low in internet history.

  • "The 20% is just over VP9."

    So, basically, VP9 offers little to no advantage over 264, while even 265 is somewhat limited.

    That strikes me as somewhat sad, considering all the verbiage wasted by Google on the licensing issue.

    • by Bengie ( 1121981 )
      Umm, no. If you click on the link and look at the first graph, it clearly shows VP9 is about 17% better than 264 at 360p, 32% better at 720p, and 43% better at 1080p. VP9 would be a win over 264, but 265 is an additional 20% more.
  • Mathematics (Score:4, Interesting)

    by quenda ( 644621 ) on Tuesday September 06, 2016 @02:29PM (#52836629)

    A small nitpick, but sad to see a common but serious maths error in a technical article.

    20% fewer bits is not equivalent to 20% more efficient, but 25% more.
    Efficiency would be the reciprocal of the bitrate. A ratio of 4:5 becomes 5:4 when looked at the other way around.

    If you were to halve the bitrate, it would be twice as efficient, not 50% more.
    Or to put it in simple money terms: it's like two items priced at $100, where one gets a 20% discount to $80; the other is now 25% more expensive by comparison.
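    The arithmetic above, checked numerically; a minimal sketch in plain Python, using only the percentages quoted in this comment:

      def efficiency_gain(fewer_bits_pct):
          # "X% fewer bits" -> bitrate ratio, then take the reciprocal.
          bitrate_ratio = 1 - fewer_bits_pct / 100   # 20% fewer -> 0.80
          return (1 / bitrate_ratio - 1) * 100       # percent more efficient

      print(efficiency_gain(20))   # 25.0  -> 20% fewer bits = 25% more efficient
      print(efficiency_gain(50))   # 100.0 -> half the bits = twice as efficient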

  • VP9 is free, as in beer. There's something to be said for that.
    Or, do you want to keep sticking your head in nooses?

  • The title is misleading.

    We sampled 5000 12-second clips from our catalog, covering a wide range of genres and signal characteristics. With 3 codecs, 2 configurations, 3 resolutions (480p, 720p and 1080p) and 8 quality levels per configuration-resolution pair

    and then

    x265 and libvpx demonstrate superior compression performance compared to x264, with bitrate savings reaching up to 50% especially at the higher resolutions. x265 outperforms libvpx for almost all resolutions and quality metrics, but the performance gap narrows (or even reverses) at 1080p.

    So the highest resolution they tested was 1080p and performance between the 2 codecs was very close with libvpx beating out x265 in some cases. As far as bandwidth goes, saving at 1080p and above is more valuable than saving at 480p. Practically everything we watch at home is streamed 1080p. I don't see that x265 is the winner here. And where are the 4k tests?
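    For a sense of scale, multiplying out the test matrix quoted above gives the number of individual encodes in the study; a trivial sketch using only the figures from the summary:

      clips          = 5000   # 12-second samples from the catalog
      codecs         = 3      # x264, x265, libvpx
      configurations = 2
      resolutions    = 3      # 480p, 720p and 1080p
      quality_levels = 8      # per configuration-resolution pair

      print(clips * codecs * configurations * resolutions * quality_levels)  # 720000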

  • Hey, that's great Netflix. Nice to see progress on the horizon in video encoding tech. Now would you please add an option to buffer the start of shows so they don't look like pixelated crap for the first 30 seconds or more on my HDTV? Maybe a checkbox somewhere? Even my wife notices, and she's not usually picky about these things. Thanks.
