
The Trouble With 4K TV

An anonymous reader sends this quote from an article about the difficulties in bringing so-called '4K resolution' video — 3840x2160 — to consumers. "Though 4K resolutions represent the next step in high-definition video, standards for the format have yet to emerge and no one’s really figured out how to distribute video, with its massive file footprint, efficiently and cost effectively. How exactly does one distribute files that can run to hundreds of gigabytes? ... Given that uncompressed 4K footage has a bit-rate of about 600MB/s, and even the fastest solid-state drives operate at only about 500MB/s, compression isn’t merely likely, it’s necessary. ... Kotsaftis says manufacturers will probably begin shipping and promoting larger TVs. 'In coming years, 50-inch or 55-inch screens will have become the sort of standard that 40-inch TVs are now. To exploit 4K, you need a larger form factor. You’re just not going to notice enough of a difference on smaller screens.' The same quality/convenience argument leads him to believe that physical media for 4K content will struggle to gain traction among consumers. '4K implies going back to physical media. Even over the Internet, it’s going to require massive files and, given the choice, most people would happily settle for a 720p or 1080p file anyway.'"
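For anyone checking the summary's arithmetic: the ~600MB/s figure falls out directly if you assume 8-bit 4:4:4 (RGB) samples at cinema's 24 frames per second (those assumptions are mine; the article doesn't spell them out). A minimal sketch in Python:

    # Back-of-the-envelope check of the ~600MB/s figure quoted above.
    # Assumes 8 bits per sample, 3 samples per pixel (uncompressed RGB/4:4:4)
    # and 24 fps; other bit depths and frame rates scale linearly.

    WIDTH, HEIGHT = 3840, 2160
    BYTES_PER_PIXEL = 3   # 8-bit R, G and B
    FPS = 24

    bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
    print(f"{bytes_per_second / 1e6:.0f} MB/s")                          # ~597 MB/s
    print(f"{bytes_per_second * 7200 / 1e9:.0f} GB for a 2-hour film")   # ~4300 GB

Uncompressed, a two-hour film runs to terabytes; the "hundreds of gigabytes" the article worries about already assumes heavy compression.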
  • by Joe_Dragon ( 2206452 ) on Wednesday January 09, 2013 @05:20PM (#42538713)

    cable and sat don't have the bandwidth for it and that's on the broadcast side.

    Maybe 1-2 channels, but most cable systems are loaded with SD channels and old MPEG-2 HD boxes.

    Sat has moved to all-MPEG-4 HD but still has lots of SD boxes out there as well.

    • Yep - even if the set-top box customers are given is natively MPEG-4 AVC, the backend is frequently still MPEG-2 passed through a realtime transcoder. 4K resolution is going to be a big deal for theaters and exhibition halls of various stripes. At the smaller scale, the tech isn't ready for the home yet by a long shot - AVC's successor HEVC is still in the drafting stages, never mind successful deployment - and the improvements won't have the impact that the DVD --> Blu-ray jump did for most customers.
      • 4K ?

        4K Q 2 !

        • Yep, I just realized 4K indicates 3840x2160, because apparently 2160p just doesn't sound cool. Blech.
          • by sg_oneill ( 159032 ) on Wednesday January 09, 2013 @10:43PM (#42542159)

            It's the term used on the production side, and is pretty much the standard for pro video at the high end. And it's notorious for being really hard and expensive to work with, because it's simply taxing for the cameras to output and even more taxing to work through in post-production. Something like a RED camera will sell itself on "4K", but unless you're filming for cinema it's hardly needed.

            With one exception, however: when dealing with chromakeying and the like, the higher resolution provides more information to key out backgrounds without the telltale green halos around the characters. So even for TV work, 4K cameras are ideal for special-effects material, just to give the post-production work more to work with.

            Which is, of course, why those newfangled DSLR cameras might look seriously fantastic for normal footage but are simply the wrong choice for special-effects work: whilst the effect of compressing it all down might be acceptable for most video, it removes too much detail for chromakeying without a lot of (expensive) extra work in post-production. That said, for DIY amateur stuff it's not time that's the problem but gear, so people can spend more time getting the keying right and work with looser tolerances.

            • by AmiMoJo ( 196126 ) *

              One of the big problems with 4K, and even more so with 8K, is that the camera operator can't focus it by eye any more. With that much resolution, being even slightly off will be noticeable, so autofocus is the only option.

      • "4K resolution is going to be a big deal for theaters and exhibition halls of various stripes"

        That's what I'm thinking. I suspect we won't see consumer 4K devices for another 20 years.

        But it's good we get a spec now and start hammering out the bugs, so that by the time the rest of the computing world can handle the bandwidth, we're ready.
    • I fail to see any point in your ramblings. 4K / 8K are clearly future techs intended to be delivered over equally future networks. Your argument is invalid. Get with the programme.

    • They would have to go to SDV [wikipedia.org]. It would be the only way. It would definitely be feasible, but might limit the number of different channels any one house could have on at the same time.
  • by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Wednesday January 09, 2013 @05:22PM (#42538731) Journal

    ... with going back to physical media.

    I prefer it, in fact.... it's far easier to account for than bits stored on a disk drive I can't possibly see without an electron microscope.

    The biggest grievance I have with 4k is that the devices are too bloody costly.

    • Another nice thing about physical media is that you can lend it to your friends, pass it on to your children, etc... without running into digital rights management restrictions.

      Although it may not matter - if nobody has a physical DVD player anymore in thirty years, passing on my DVD collection to the kids or offering to lend it to friends is meaningless.
  • by XPeter ( 1429763 ) on Wednesday January 09, 2013 @05:24PM (#42538765) Homepage

    The same thing happened when the first 1080P screens came out. The market will adapt, there's no problem here.

    • by mug funky ( 910186 ) on Wednesday January 09, 2013 @05:34PM (#42538967)

      the problem is that HD is still more than is needed, and a fair amount of programming is still made for SD (and most still broadcast in SD).

      broadcast facilities dragged their feet with HD adoption - the single factor that made all facilities HD capable was the move away from hardware to software and masters-on-HDD.

      so no... the market didn't adapt in 2005, it didn't adapt in 2010, it hasn't adapted now and it will be a long time before anything other than big budget movies or events like the olympics will get the 4k treatment.

      also consider the optimum viewing distance of 2.5 screen heights. if Jobs were still here, he'd stop at 2k and call it "retina television". unless you're doing it well wrong, you're not going to get much benefit. even the jump from SD to HD was marginal - most of the gains were in compression quality (a macroblock is harder to see when it's 1/4 the size, and in h.264 it's impossible to see as it's filtered out by design).

      but i suppose 4k will be interesting for perving on individual audience members at sporting events...

      • Re: (Score:3, Insightful)

        consider the optimum viewing distance of 2.5 screen heights

        I keep seeing that pop up, and I don't know who came up with it, but they're wrong. If you have to turn your head to see the action going on in the corners, you're too close. If you don't have to turn your head, you're not too close. That point is closer than 2.5 screen heights.

        if Jobs were still here, he'd stop at 2k and call it "retina television".

        Doubtful, considering the iPad's 2048x1536 display is only a 10" screen.

        even the jump from SD to HD was marginal

        Holy shit, and this is how you know that you have no idea what you're talking about. The difference of SD to HD was more significant by far than the change from black and white to color. It's huge! Do you have a 10" tv that you're watching from 7 ft away when making this comparison or something?

        • by hawguy ( 1600213 ) on Wednesday January 09, 2013 @06:57PM (#42540127)

          even the jump from SD to HD was marginal

          Holy shit, and this is how you know that you have no idea what you're talking about. The difference of SD to HD was more significant by far than the change from black and white to color. It's huge! Do you have a 10" tv that you're watching from 7 ft away when making this comparison or something?

          Are you old enough to remember the B&W TV days? I think you're underestimating the scale of the switch from B&W to color. I still remember when my parents got a color TV (we had a B&W set far longer than most people) and the difference was amazing and quite apparent to everyone. It didn't take a side by side comparison to see the difference between B&W and color, and you could see the difference no matter the size of the screen or how close you were.

          On my current 37" LCD (capable of 720p, 1080i), I notice only a minimal difference between SD DVDs (480i) and HD Blu-rays. The difference is so minimal that I stopped paying the extra dollar or two for Blu-ray discs from Netflix because I couldn't really tell the difference. Perhaps if I had a bigger 1080p-capable set I might notice more of a difference, but at my normal viewing distance (10-12 feet) the difference is quite minimal on my current set. I don't think I'd notice any difference at all between 720p and 4K without a much larger TV, or sitting much closer to the TV.

          This chart doesn't go up to 4K, but suggests that you'd have to sit closer than 10 feet away from a 100" screen to take advantage of even 1440p:

          http://www.engadget.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/ [engadget.com]
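          The rule of thumb behind charts like that one is easy to reproduce. A minimal sketch, assuming the common one-arcminute-per-pixel criterion for 20/20 vision and a 16:9 panel (the criterion is a simplification, not any broadcast standard):

              import math

              def max_distance_ft(diagonal_in, horiz_pixels, aspect=(16, 9)):
                  """Farthest viewing distance (feet) at which one pixel still
                  subtends one arcminute for a 20/20 viewer."""
                  w, h = aspect
                  width_in = diagonal_in * w / math.hypot(w, h)   # panel width from diagonal
                  pixel_pitch = width_in / horiz_pixels
                  return pixel_pitch / math.tan(math.radians(1 / 60)) / 12

              for label, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
                  print(f'37" {label}: pixels blend beyond ~{max_distance_ft(37, px):.1f} ft')

          That gives roughly 7 ft for 720p, 5 ft for 1080p and 2.5 ft for 4K on a 37" set, so at a 10-12 foot couch all three have long since blended together, which squares with the parent's experience.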

          • by EdZ ( 755139 )

            On my current 37" LCD (capable of 720p, 1080i), I notice only a minimal difference between SD DVDs (480i) and HD Blu-rays

            In descending order of probability:
            1) You are WAY too far away from your TV. The THX standard is for the display to fill 40° of your field of view diagonally, or a viewing distance of about 1.2x the screen diagonal. A little under 4 feet would be the recommended viewing distance.
            2) Your TV is still set to the factory/showroom default profile, and will look like shit whatever you feed it. Use a calibration disc (e.g. Spears & Munsil) or the basic THX calibration included on many BDs (e.g. Disney/Pixar films).

            • That, or he's legally blind....which after reading the other comments about not being able to read text at 1080, I'd say is probably the case.
      • by aXis100 ( 690904 ) on Wednesday January 09, 2013 @07:21PM (#42540389)

        also consider the optimum viewing distance of 2.5 screen heights. if Jobs were still here, he'd stop at 2k and call it "retina television". unless you're doing it well wrong, you're not going to get much benefit. even the jump from SD to HD was marginal - most of the gains were in compression quality (a macroblock is harder to see when it's 1/4 the size, and in h.264 it's impossible to see as it's filtered out by design).

        I thought the jump from SD to HD was great....for a while. Lately most of the free to air TV channels in Australia have been going terribly downhill with overly compressed or badly up-scaled video sources. It's rare now to get HD content that looks like HD - the best thing I've watched lately was a 3 year old doco I had saved on hard drive.

        Which raises the question: why go to 4K when we can't even get 1080p right consistently?

    • O'rly? I'm perfectly happy with a 38-inch TV, as my living room couldn't house a larger screen without it being too invasive. Why would I want 4K if it wouldn't create a noticeable quality improvement unless I had a huge screen?

      Sooner or later, the physical size of consumers' living rooms will determine the upper limit for how high-res a screen can usefully be.
      • I have a 40 inch screen and it's very hard to see the difference between anamorphic DVDs upconverted to 1080P and Blu-ray. The digital surround sound improvement is actually more noticeable, even in 5.1.
    • by wilson_c ( 322811 ) on Wednesday January 09, 2013 @08:10PM (#42540857)

      I'm not sure that's true. 1080p had always been the goal of HD, even with the original HD spec developed in Japan in the 80s. No matter what, everyone knew we were going to get there and understood the advantages over NTSC and PAL. Consumers and content creators could see the improvements brought by HD. Most of the people who cared about 1080p just waited until prices dropped and skipped 720p and 1080i. That all occurred as part of the big HD uptake over the past 5 years.

      The problems with 4k are twofold. First, it isn't part of the existing HD spec. It is a new standard that doesn't have the imprimatur of governments and cable companies designating it as a target to be achieved. Second, it is a move driven entirely by the consumer electronics industry. There isn't demand from users and there is certainly no interest on the production side.

      I work in post production, and the data hassles of 3D have been enough to keep our company (and many others) away from it. The substantially larger file sizes associated with 4K are even worse. For a production company like ours, we'd have to move to a petabyte SAN just to manage an amount of 4K footage equivalent to what we handle now in HD. Transcoding times would go through the roof, bandwidth would be heavily taxed, and even the hardware requirements for decoding a single compressed stream (to say nothing of editors handling MANY simultaneous streams) for playback would be much higher.

      And for what? The only quality improvement would be in resolution (as opposed to the jump to HD, which brought a massive change to color handling over NTSC). Networks don't want to pay higher budgets for something that won't make them any more competitive. Satellite providers, who already compress the shit out of their HD signals, don't have spare bandwidth for fatter streams. Cable companies, who are basically in the data business now, don't want to waste their bandwidth on it; even with SDV it would add a lot to their overhead. Game consoles are still struggling to make the most out of HD, so they are nowhere near ready to handle that many additional pixels. You might have videophiles willing to spend a ton of money on ultra-premium gear, but even they would be limited to specialty playback hardware that would have to download massive files, because 4K exceeds the storage capacity of any commercially available Blu-ray media.

      TV manufacturers are pushing this because the great upgrade is over and 3D has failed to excite consumers. They need something to try and convince consumers to replace a perfectly functional, nearly new 1080p TV. So they're going to run with 4K in 2013.

  • And don't forget.. (Score:5, Informative)

    by Striikerr ( 798526 ) on Wednesday January 09, 2013 @05:25PM (#42538791)

    .. the cable companies would compress the signal as they presently do with "HDTV" to the point that it looks like crap. They have ruined the HDTV quality with this compression and I can only imagine how much they would ruin 4k TV content. The best HDTV experience I have ever had was pulling HDTV signals from the Over The Air broadcasts. The first time I saw this (after spending so much time watching compressed HDTV on Comcast) I couldn't believe how great it looked. If you don't believe me, give it a try. The OTA HDTV signals blow Comcast's HDTV signals out of the water with crispness and detail.
    Hopefully the means of getting this type of signal improves dramatically, so that heavy compression is not needed and we can watch 4K the way it was meant to be seen.

    • by bmo ( 77928 )

      .. the cable companies would compress the signal as they presently do with "HDTV" to the point that it looks like crap

      Pretty much this.

      There is so much artifacting going on with Verizon "hdtv" that I may as well be watching SDTV DVD rips from The Pirate Bay. And no, nothing is wrong with the signal; the signal itself is fine, without dropouts. It's just crap.

      --
      BMO

  • I still have a one-DVD-out-at-a-time plan along with Netflix streaming, because it's better than $5 iTunes rentals for recent stuff (and I can rip DVDs for anything I want to keep), so staying with discs for a while longer is no big deal to me. It is a shame we can't get the infrastructure's bandwidth up at a better pace, though.

    • Re: (Score:2, Offtopic)

      by mug funky ( 910186 )

      you rip rentals? that's pretty scummy, dude.

      • I still have a one-DVD-out-at-a-time plan along with Netflix streaming, because it's better than $5 iTunes rentals for recent stuff (and I can rip DVDs for anything I want to keep),...

        you rip rentals? that's pretty scummy, dude.

        Well... Technically... As long as he has an active Netflix by-mail account, he's simply being efficient and saving them postage.

      • What's your point?

      • Re: (Score:2, Offtopic)

        by MightyYar ( 622222 )

        I've got him beat - I just download them.

  • Put movies on cartridges. By the time 4K is ready to become a standard, it will make more sense to use solid-state storage than optical. They should focus on making flash memory faster and distribute films on jump drives. Kingston has a 1TB thumb drive in the lab now.

    • Comment removed based on user account deletion
      • by Guspaz ( 556486 )

        Thunderbolt goes a bit higher, so a RAID array over Thunderbolt might do over 600MB/s...

        There's a lot of professional video gear that supports Thunderbolt, probably because a lot of professional video gear targets the Mac. That's actually kind of annoying, and the third-party Windows drivers for HFS+ have spotty support. I couldn't get them working with the CF cards recorded on a Ki Pro Mini, for example; I had to hunt down somebody on-site with a MacBook to read the damned things.

  • I can live with upscaled DVDs / Blu-rays. It'll be worth it having something that badass, and it would mostly function as a second monitor if I could afford one.
  • by thesupraman ( 179040 ) on Wednesday January 09, 2013 @05:31PM (#42538917)

    Most of the bandwidth/media claims are rubbish. 4K has (approximately) four times the pixels of standard full HD, so at most a given format will grow by four times. HOWEVER, most lossy compression methods (for example AVC/MPEG-4) scale better than linearly with pixel count on real footage, as detail becomes more repeated at higher resolutions, so a more likely estimate for such formats is two times, which is not crazy (Blu-ray, for example, can already deliver that for many movies if needed). Newer compression methods are coming online that can deliver close to double the compression at equivalent quality, meaning we end up back at normal HD data sizes.

    Is it needed? That's a whole different story; with the size of living rooms and the available, comfortable wall space for a screen, it is pretty marginal. But trying to use raw uncompressed bitrates as a scare tactic is rubbish.

    Their raw figures are not even right, as they seem to assume 4:4:4 12-bit storage, which would be rather rare in real life; 4:2:2 10-bit would be MUCH more common, and most workflows would actually use compressed storage (as they do now for HD).
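    To put rough numbers on that last point (my arithmetic, not the parent's; real hardware pads samples to byte boundaries, which only pushes the figures up):

        # Uncompressed 4K data rates under the two sampling schemes above.
        PIXELS = 3840 * 2160
        FPS = 24

        def mb_per_s(bits_per_pixel):
            return PIXELS * FPS * bits_per_pixel / 8 / 1e6

        # 4:4:4 12-bit: 3 samples per pixel at 12 bits each
        print(f"4:4:4 12-bit: {mb_per_s(3 * 12):.0f} MB/s")   # ~896 MB/s
        # 4:2:2 10-bit: Y every pixel, Cb/Cr at half horizontal resolution,
        # i.e. 2 samples per pixel on average, at 10 bits each
        print(f"4:2:2 10-bit: {mb_per_s(2 * 10):.0f} MB/s")   # ~498 MB/s

    The article's ~600MB/s sits between the two; it matches 8-bit 4:4:4 at 24 fps, the assumption worked through under the summary.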

  • The real problem is that the resolution is exactly double that of 1920x1080, so scaling up or down will work very well and people won't be able to tell the difference between this and 1080p. You know, because all the 720 TVs are actually 1366x768, which means images have to be smeared to shit, making 1080 TV look so much better (even with OTA 720 shows). And yes, I'm claiming an industry-wide effort to make 1080 appear visibly better than 720. Or perhaps the 1080 sets will start to be 1152 to make 4K look better than regular HD even with 1080 content.
    • by vlm ( 69642 )

      Or perhaps the 1080 sets will start to be 1152 to make 4K look better than regular HD even with 1080 content.

      I'd like to be able to buy the 1600x1200 monitors I bought for many years before 1080 HDTV became popular and forced a lower resolution for PC users.

    • This means scaling up or down will work very well and people won't be able to tell the difference between this and 1080p.

      I don't think pixels work the way you think they do.

      That aside, you may not realise that most 1080p TVs, by default, scale up the image by about 5%, and yet that somehow doesn't look "smeared to shit." For any natural image source the difference is marginal at worst, and probably likewise for any digitally originated images, since broadcasters deliberately keep them soft for a variety of reasons.

      And yes, I'm claiming industry-wide effort to make 1080 appear visibly better than 720.

      You do know that it's also actually better than 720p too, right? The clue is in the numbers.

  • by jd659 ( 2730387 ) on Wednesday January 09, 2013 @05:33PM (#42538955)
    “Even over the Internet, it’s going to require massive files” While this is true, the speed of the Internet connection makes a huge difference. Unfortunately for the US population, the market is divided among a couple of companies and the slow speeds are offered at bank-robbery prices (e.g. 25/3Mbps for $50). Many countries in Europe get a faster and cheaper connection (e.g. 75/50Mbps for $10) and that changes how people watch TV. With TVs that can play MPEGs directly off some network connected HDD and a laptop that can download any torrents to that HDD, the experience of watching a show is often:
    1. Find a torrent on a laptop and click on it to start downloading.
    2. Wait a couple of minutes.
    3. Navigate TV to the specific file on HDD and start watching.

    It is amazing how much the experience changes for the better with faster connection speeds and more reasonable laws on downloading/uploading the content.
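    To make the speed point concrete, some back-of-the-envelope numbers (the file sizes are illustrative guesses of mine, not any standard):

        # Download time in hours for a file of a given size at a given link speed.
        def hours(file_gb, mbps):
            # 8 bits per byte; ignores protocol overhead and swarm health
            return file_gb * 8e9 / (mbps * 1e6) / 3600

        for size_gb, label in [(8, "1080p (~8 GB)"), (60, "4K (~60 GB)")]:
            for mbps in (25, 75):
                print(f"{label} at {mbps} Mbps: {hours(size_gb, mbps):.1f} h")

    An 8 GB 1080p file arrives in about 43 minutes at 25 Mbps and 14 minutes at 75 Mbps, so "wait a couple of minutes, then start watching" is plausible on the faster pipe. A 60 GB 4K file pushes the same workflow out to multiple hours.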
    • by v1 ( 525388 )

      Not sure if you're trolling or just uninformed, so maybe I'm feeding.

      Torrent files don't download sequentially, from start to finish. Clients that follow the spec will randomly select a piece to download, possibly influenced by the availability or speed of the peers holding pieces. The larger the file gets, the better the odds that you're missing an early piece. For a movie that has 1000 pieces, the odds of having ALL of the first 250 pieces at any point before you hit the 95%-downloaded mark are astronomically low.
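      For what it's worth, that claim is easy to quantify under the simplifying assumption that pieces arrive in uniformly random order (the rarest-first selection real clients use is no kinder to the early pieces):

          from math import comb

          PIECES, EARLY = 1000, 250   # a 1000-piece movie; "early" = its first quarter
          DOWNLOADED = 950            # the 95%-complete point

          # All 250 early pieces are present at 95% iff the 50 still-missing
          # pieces all fall among the other 750: a plain hypergeometric count.
          missing = PIECES - DOWNLOADED
          p = comb(PIECES - EARLY, missing) / comb(PIECES, missing)
          print(f"P(first 25% complete at the 95% mark) = 1 in {1 / p:,.0f}")   # ~1 in 2.7 million

      So "astronomically low" is fair: with spec-compliant piece selection you essentially never hold a contiguous first quarter of the file until the download is nearly finished.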

      • by snadrus ( 930168 )
        Download torrents sequentially with qBittorrent. At 5%, if the time remaining is less than the show length, VLC will play it through to the end. Download speeds don't appear to differ between qBittorrent's sequential and regular modes. Linux may be needed, because Windows file locking may mess this up.
  • As failure rates of electronics decrease, sets last longer and longer. This seems like just another sales ploy to force us all to buy new TV sets. 3D hasn't proven widely popular, so maybe the proles will buy into needing even higher definition!
  • In coming years, 50-inch or 55-inch screens will have become the sort of standard that 40-inch TVs are now. To exploit 4K, you need a larger form factor. You're just not going to notice enough of a difference on smaller screens.' The same quality/convenience argument leads him to believe that physical media for 4K content will struggle to gain traction among consumers.

    I don't get how this person has the foresight to note all these things but totally glosses over the fact that for many (I may even say most) living rooms, any TV above 46"-47" is simply too large. I will never have a 4K TV in my living room because there is simply nowhere to put a TV that is 55"... it doesn't matter if the manufacturer is selling it, heck it doesn't matter if it is FREE, I have nowhere to put the damn thing. 46" is already way bigger than needed.

  • by Anonymous Coward on Wednesday January 09, 2013 @05:36PM (#42539011)

    4K is so 2007. I have seen complete 8K broadcast chains (all the equipment needed to acquire, store, transmit, compress, scale, play back and display) for years, as shown at the International Broadcasting Convention.

    http://en.wikipedia.org/wiki/Super_Hi-Vision_television

    Before anyone comes up with "but the eyes cannot resolve that kind of detail": YOU ARE WRONG!
    8K is not even a little comparable to HDTV.

    I have also seen 4K being displayed, often scanned from 35mm prints; it doesn't have much impact beyond 2K. But that may be because it was not captured on a digital camera, and the grain (the effective resolution) of 35mm is worse than pixels at 4K. The 8K footage I've seen was captured on an 8K digital camera.

    Also, 300 fps video is freaking amazing (this was a demo from the BBC): your eyes can track fast-moving objects and therefore focus on them razor-sharp, like when you track a moving object in the real world. Finally we could actually watch Hollywood action sequences, which at 24 fps are just a blurry mess of motion blur, or a vomit-inducing slideshow.

  • by An dochasac ( 591582 ) on Wednesday January 09, 2013 @05:37PM (#42539019)
    Never underestimate the bandwidth of a station wagon full of Betamax tapes. Analog of course.
  • by White Flame ( 1074973 ) on Wednesday January 09, 2013 @05:41PM (#42539083)

    If they do crank these out, 4K computer monitors should come down in price. I don't care what happens to the TV market as long as that happens.

    • Indeed. Better monitors are really the primary advantage I see for typical consumers, with the only other one being large-screen applications, such as projectors and the like.

      Otherwise, if we're talking about TVs, 1080p is already sufficient, since I did the math a while back when I was deciding whether to wait for 4K or not, and at typical viewing distances with the sizes of HDTVs we have today, the individual pixels in a 1080p TV are already far smaller than a person with 20/20 vision is able to discern.

  • I have absolutely no issue with physical media. Sure, streaming is convenient. But I can tell you that physical media saved me from absolute boredom during severe snowstorms, when my only power source was an extension cord, an inverter, and my laptop. For flying, physical media (whether thumb drive or DVD) is a necessity. And for driving, I do not want to be bound by a physical internet connection to enjoy a TV show/movie that I have purchased. I still get DVDs by mail from Netflix.

  • by kelemvor4 ( 1980226 ) on Wednesday January 09, 2013 @05:43PM (#42539101)
    The MPAA must be downright giddy about it. It's the first technical detail I've heard in years that could actually hinder piracy.
    • by Twinbee ( 767046 )
      Wouldn't the pirates just downscale the content to 1080p? - I doubt the viewers/pirates would mind much.
  • by SomeKDEUser ( 1243392 ) on Wednesday January 09, 2013 @05:44PM (#42539127)

    The important point is that at last, there'll be computer screens with non-stupid resolutions again! They took my 1920x1200 away, and though I would prefer 3840x2400, I can live with 3840x2160.

    At least resolutions are going up again.

  • by guidryp ( 702488 ) on Wednesday January 09, 2013 @05:44PM (#42539137)

    Remember when Blu-ray came out and a number of people claimed they couldn't see much difference?

    Well this time it will actually be true for almost everyone.

    Most people don't even have their TVs close enough to visually discern 1080p.

    This kind of TV resolution is irrelevant in a normal home setup.

    • Precisely. When I was deciding whether or not to wait on 4K a few months back, I did the math and realized that unless you have a ridiculously large TV or are sitting ridiculously close, 1080p is already well past the point where it can be considered a "retina display" (to borrow Apple's term for any display where the pixels are indistinguishable from one another at typical viewing distances to a person with 20/20 vision). 4K provides no additional benefits in those contexts, which is how most people will use it.

  • by Moskit ( 32486 ) on Wednesday January 09, 2013 @05:47PM (#42539189)

    Pity that submitter/editor did not research further into the topic.

    There are already standards (JPEG 2000 DCI) that allow compressing a 4K stream from about 5 Gbit/s to 250 Mbit/s, which is much more manageable. There is at least one commercial vendor (intoPIX) that makes such hardware de/compressors.

    If you want to stretch your imagination - start thinking about 3D movies in 4K, which is quite an obvious step. This is 12 Gbit/s uncompressed, but 500 Mbit/s in normal transmission.

    Oh, by the way - 8K is already being worked on. And 8K 3D (48Gbit/s uncompressed)...

  • If 2K is good enough for theaters (and it is), who is it that wants 4K for their living room?
    • I want it on my monitor. Who watches TV anymore anyway?

      OK, not a very good point, but basically, I can absolutely use more pixels. YouTube videos will still be crappy, but when I write, code or draw, I'll see much more :). The fact that those pixels require such large files matters much less than the fact that economies of scale will yield much larger and better screens.

      I am still bitter about the whole full-HD scam -- for a scam it was: monitor resolutions actually went down.

    • Re:Who Wants This? (Score:5, Insightful)

      by W2IRT ( 679526 ) <pjd@panix.com> on Wednesday January 09, 2013 @06:36PM (#42539929) Homepage

      In whose mind is 2K good enough for theatres? Speaking as a former motion picture projectionist who ran 35mm and 70mm film for almost 20 years, I can tell you the "quality" you get in a 2K auditorium is significantly inferior to what was delivered by a 35mm print, albeit with no jitter or weave. 4K cinematic presentations are actually quite good, even on a 40- or 50-foot screen, but I steadfastly refuse to see anything in a theatre that's shown in the 2K format. What's worse, most exhibitors run their 2K machines with the 3D lenses in place even when they're not showing 3D, cutting the available light in half. So what the vast majority of patrons experience in a movie theatre today is a dark, washed-out image with lower overall quality than they were seeing just 5 years ago. The only winners here are the companies who don't have to ship 12,000 feet of film (for a 2-hour movie), which weighs about 40-50 pounds per print, to 2000 screens -- and pay to ship it back again at the end of the run. The exhibitors also win, because they got the 2K machines for free from those companies and they don't have to employ skilled projectionists to run them either.

      So yeah, I'll take 4K home presentation once the price comes down to the level that mere mortals can afford. I have a 53" Aquos screen now that's OK at 9' viewing distance but a 65" class screen at 4K and using HFR would rock my world once content becomes available.

      My bet is that flat panel manufacturers are quickly realizing that 3D in the home is a dud and they'll concentrate their efforts into amping up 4K in the coming years, even though content will be quite minimal for a very long time. Since you'll never see anything more than 1080i or 720p from OTA broadcast (6 MHz channel size ain't changing any time soon), it'll only be a selling point for movies or DVDs of TV series. I don't know about everyone else, but 95% of what I watch is broadcast TV dramas, comedies and sports. I don't see the studios converting to shoot and edit to 4k in the foreseeable future, either.

  • The only people that are going to care about uncompressed size are those that make movies and movie theaters (I'm assuming theaters use uncompressed files, but I honestly don't know). And a movie-maker or theater won't have any problem with it; a simple drive array is just fine to cope with the bandwidth demands.

    Just as few (if any) consumers ever get their hands on uncompressed 1080p, so it will be with 4K.

    Unless I did the math wrong, it's only four times the size... hardly an insurmountable problem.

  • Watching TV just ain't right since they did away with interesting programs. I really don't give a rat's ass about resolution since movie channels repeat everything I've seen and channels like History and Discovery no longer show history or real science/engineering programs. That's my Gripe Of The Month.
  • Ecole Polytechnique Federale de Lausanne (EPFL) did a study to evaluate the subjective video quality of HEVC at resolutions higher than HDTV. The study was done with three videos at resolutions of 3840×1744 at 24 fps, 3840×2048 at 30 fps, and 3840×2160 at 30 fps. The five-second video sequences showed people on a street, traffic, and a scene from the open source computer-animated movie Sintel. The video sequences were encoded at five different bitrates using the HM-6.1.1 HEVC encoder and the JM-18.3 H.264/MPEG-4 AVC encoder. The subjective bit-rate reductions were determined based on subjective assessment using mean opinion score values. The study compared HEVC MP with H.264/MPEG-4 AVC HP and showed that for HEVC MP the average bitrate reduction based on PSNR was 44.4%, while the average bitrate reduction based on subjective video quality was 66.5%.

    High Efficiency Video Coding [wikipedia.org]

  • Do four times the pixels need four times the bandwidth? I would think larger blocks of solid colors, simple gradients, etc. would compress at a much higher ratio than smaller ones. Or do the new standards still encode the same size of pixel blocks as the old ones?

    As for digital artifacts, I find that applying a very light noise filter (artificial 'film grain') conceals obvious banding, blockiness, etc., improving perceived (but not actual) quality.
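    That noise trick is essentially dithering, and it is easy to demonstrate on a synthetic gradient. A toy sketch (numpy assumed; the quantiser level count and grain amplitude are arbitrary choices of mine):

        import numpy as np

        levels = 32                            # coarse quantiser to force banding
        ramp = np.linspace(0.0, 1.0, 1920)     # one scanline of a smooth gradient

        # Quantise directly: long flat runs, i.e. visible bands.
        banded = np.round(ramp * (levels - 1)) / (levels - 1)

        # Add about half a quantisation step of grain first, then quantise.
        rng = np.random.default_rng(0)
        noisy = np.clip(ramp + rng.normal(0.0, 0.5 / levels, ramp.shape), 0, 1)
        dithered = np.round(noisy * (levels - 1)) / (levels - 1)

        def longest_flat_run(x):
            best = cur = 1
            for a, b in zip(x[:-1], x[1:]):
                cur = cur + 1 if a == b else 1
                best = max(best, cur)
            return best

        print("longest flat run, banded:  ", longest_flat_run(banded))    # bands ~60 px wide
        print("longest flat run, dithered:", longest_flat_run(dithered))  # a few px at most

    The grain adds no real detail; it trades a visible staircase for noise the eye tolerates, which is exactly the perceived-versus-actual distinction the parent draws.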

  • So a repeat of the argument against HDTV. It took the media companies 10 years to catch up, and the same will happen again. Early adopters get what they deserve.

  • I remember seeing articles about holographic storage media with 500 GB potential http://www.crn.com/news/storage/217200230/ge-unveils-500-gb-holographic-disc-storage-technology.htm [crn.com] . Don't know if it will ever come around, but it would be a possible physical-media option (assuming the read speeds were fast enough).

  • by petes_PoV ( 912422 ) on Wednesday January 09, 2013 @07:21PM (#42540393)
    The basic problem with Ultra-HD is that nobody can see it. You'd have to sit so close to the screen to appreciate the difference (from "normal" HD) that your eyes couldn't take in the whole screen. Add to that that the data stream would be so highly compressed, to fit the available bandwidth, that the only difference would be the resolution of the artifacts. What you have is the video equivalent of audio bandwidth extending into the hundreds of kHz: great for any dogs listening, or eagles watching your TV, but utterly pointless for humans, unless their motivation is so immature that they feel the need to have something impractically better than the guy next door's, no matter the cost or usefulness.
  • by Dastardly ( 4204 ) on Wednesday January 09, 2013 @08:34PM (#42541097)

    The more interesting step to me would be 1920x2160 panels for 1080p passive 3D. Right now passive 3D polarizes alternate lines, so at 1080p it is more like 1920x540 per eye, which is probably perceived by the brain as something like 1920x700. If no one makes a 1920x2160 panel, I presume it could be done with a 4K panel.

  • by djbckr ( 673156 ) on Wednesday January 09, 2013 @10:48PM (#42542205)
    I think I'd rather see a higher frame rate. When I was watching "The Hobbit" I really enjoyed the HFR but I was thinking to myself that the rate needs to be even higher still. No less than 60, I would say...
