The Trouble With 4K TV

An anonymous reader sends this quote from an article about the difficulties in bringing so-called '4K resolution' video — 3840x2160 — to consumers. "Though 4K resolutions represent the next step in high-definition video, standards for the format have yet to emerge and no one’s really figured out how to distribute video, with its massive file footprint, efficiently and cost effectively. How exactly does one distribute files that can run to hundreds of gigabytes? ... Given that uncompressed 4K footage has a bit-rate of about 600MB/s, and even the fastest solid-state drives operate at only about 500MB/s, compression isn’t merely likely, it’s necessary. ... Kotsaftis says manufacturers will probably begin shipping and promoting larger TVs. 'In coming years, 50-inch or 55-inch screens will have become the sort of standard that 40-inch TVs are now. To exploit 4K, you need a larger form factor. You’re just not going to notice enough of a difference on smaller screens.' The same quality/convenience argument leads him to believe that physical media for 4K content will struggle to gain traction among consumers. '4K implies going back to physical media. Even over the Internet, it’s going to require massive files and, given the choice, most people would happily settle for a 720p or 1080p file anyway.'"
  • And don't forget.. (Score:5, Informative)

    by Striikerr ( 798526 ) on Wednesday January 09, 2013 @06:25PM (#42538791)

    .. the cable companies would compress the signal as they presently do with "HDTV", to the point that it looks like crap. They have already ruined HDTV quality with this compression, and I can only imagine how much they would ruin 4K content. The best HDTV experience I have ever had was pulling HDTV signals from over-the-air broadcasts. The first time I saw this (after spending so much time watching compressed HDTV on Comcast) I couldn't believe how great it looked. If you don't believe me, give it a try. The OTA HDTV signals blow Comcast's out of the water in crispness and detail.
    Hopefully the means of delivering this kind of signal improves dramatically, so that heavy compression isn't needed and we can watch 4K the way it was meant to be seen.

  • by neokushan ( 932374 ) on Wednesday January 09, 2013 @06:48PM (#42539221)

    I'm calling bullshit. The summary talks about uncompressed video, glossing over the fact that even 1080p uncompressed requires a bit rate of about 190 MB/s (at 30 fps, 24-bit colour) - faster than most HDDs can handle. Storage space required? Roughly 670 GB per HOUR, or a solid TB for a 90-minute film. Do you need a massive SSD to play 1080p files? Do you even need an SSD to edit and encode them? No, you don't.

    Source: []

    Compression has always been essential, even for DVD. Uncompressed SD video would fill a dual-layer Blu-ray (50 GB) in about 30 minutes. Yes, we'll need better codecs to handle 4K, and yes, a lot of work needs to be done, but the size of uncompressed video isn't, and never has been, the issue. At worst, it'll take a slightly newer disc format (I don't see why a four-layer Blu-ray disc, which exists today, couldn't do the job) and better internet connections for streaming.

    Devices capable of outputting 4K will be released THIS YEAR - just look at Tegra 4, or nVidia's SHIELD, which has been demonstrated live driving a 4K TV (admittedly with what was probably upscaled content, but the technology is clearly there today).

    The real issue is content: almost nothing is actually available in 4K right now. The transportation and storage methods are not the problem.
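    The arithmetic behind the bit rates quoted in this comment can be sketched quickly. A minimal back-of-the-envelope calculation (assuming 24-bit colour, i.e. 3 bytes per pixel; the frame rates are illustrative):

    ```python
    # Sketch: uncompressed video data rates, as discussed above.
    # Assumes 24-bit colour (3 bytes/pixel); frame rates are illustrative.

    def uncompressed_mb_per_s(width, height, fps, bytes_per_px=3):
        """Raw data rate in megabytes per second (1 MB = 10^6 bytes)."""
        return width * height * bytes_per_px * fps / 1e6

    for name, w, h, fps in [("1080p", 1920, 1080, 30), ("4K", 3840, 2160, 24)]:
        rate = uncompressed_mb_per_s(w, h, fps)
        per_hour_gb = rate * 3600 / 1e3
        print(f"{name}: {rate:.0f} MB/s, {per_hour_gb:.0f} GB/hour")
    # → 1080p: 187 MB/s, 672 GB/hour
    # → 4K: 597 MB/s, 2150 GB/hour
    ```

    This matches the comment's figures: roughly 190 MB/s and two-thirds of a terabyte per hour for raw 1080p, and more than three times that for 4K.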

  • Artifacts suck (Score:2, Informative)

    by Da_Biz ( 267075 ) on Wednesday January 09, 2013 @07:23PM (#42539721)

    I used to work for a small Texas company you've probably never heard of (Enron). Their Broadband Services division had, under their employ, some of the brightest minds in networking and video distribution for that era (unfortunately, the same cannot be said of their executive management).

    In any event, here's why I like 4K--and related plans by a premier provider of 4K camera equipment (Red) to utilize their new Odemax acquisition to distribute it. (I also suspect that the ideas being employed by Red and Odemax are things that can be replicated in whole or substantial part by others.)

    1) I really hate macroblocking: it looks ugly. Depending on the quality of the TV and Blu-ray player, the macroblocking (and other artifacting) is better or worse--but it's still there even on well-produced Blu-ray discs. Even on relatively small screens (e.g., 42"), it looks like there could be some really nice gains in artifact reduction.

    2) One of the more novel ideas the architects at Enron Broadband employed was building their own content distribution network, complete with Enron-provided fiber connectivity and servers placed at ISPs. These "head-end" servers at ISPs would reduce the host's peering costs and improve performance.

    3) At least in metro areas where Odemax has a presence, the ISP's backbone capabilities will be less of an issue--and I suspect their WAN capacity is substantially more capable. 20 Mbit/s within an ISP's own network seems much more feasible (be it xDSL, DOCSIS, etc.).

  • by Anonymous Coward on Wednesday January 09, 2013 @07:32PM (#42539853)

    Verizon FIOS TV is actually the ONLY (that I know of anyway) TV provider that does NOT re-compress HDTV signals! They broadcast EXACTLY what the TV channel sends them. Comcast, Time Warner (and probably the other major cable companies) DO re-compress the signal, squeezing 3 channels into the space normally allocated for 2, in order to deliver more channels over their antiquated copper networks. This is one of the main reasons I went with Verizon FIOS in the first place. Personally, I find the premium channels like HBO to be absolutely PRISTINE...AMC, on the other hand, looks like a bad Netflix stream! Don't blame Verizon, blame the channels themselves!

  • by RoboRay ( 735839 ) on Wednesday January 09, 2013 @08:10PM (#42540255)
    Show me an interlaced digital display. Seriously. Show me one.
  • by PhotoJim ( 813785 ) <jim AT photojim DOT ca> on Wednesday January 09, 2013 @11:22PM (#42541979) Homepage

    The display may not technically be interlaced, but the content certainly can be.

  • by sg_oneill ( 159032 ) on Wednesday January 09, 2013 @11:43PM (#42542159)

    It's the term used on the production side, and is pretty much the standard for pro video at the high end. And it's notorious for being really hard and expensive to work with, because it's taxing for the cameras to output and even more taxing to work through in post-production. A camera like a RED will sell itself on "4K", but unless you're filming for cinema it's hardly needed.

    With one exception, however: when doing chromakeying and the like, the higher resolution provides more information for keying out backgrounds without the telltale green halos around the characters. So even for TV work, 4K cameras are ideal for special-effects shots, simply because they give post-production more to work with.

    Which is, of course, why those newfangled DSLR cameras might look seriously fantastic for normal footage but are simply the wrong choice for special-effects work: while the heavy compression might be acceptable for most video, it removes too much detail for chromakeying without a lot of (expensive) extra work in post-production. That said, for DIY amateur work it's not time that's the problem but gear, so people can spend more time getting the keying right and work with looser tolerances.

  • by Jeremy Erwin ( 2054 ) on Wednesday January 09, 2013 @11:47PM (#42542193) Journal

    OTA gives 19 Mb/s to each channel, usually split among three stations. So 20 Mb/s for a single 4K channel sounds about right.

    OTA uses MPEG-2, and (in my experience) a good, decent 1080i signal needs 12-15 Mb/s before things start to look blurry. I have an OTA EyeTV setup with a 1080p monitor, and I can quite easily see the bit rates.
    Ironically, I've taken to streaming my PBS shows: although my downstream bandwidth is a mere fraction of a full ATSC stream, the image quality is far superior to what's possible on a 9 Mb/s "720p" subchannel.
    For Blu-ray, 25 Mb/s AVC is very common, and that doesn't include the lossless audio. It's in another league entirely, provided the grain hasn't been scrubbed out of existence. (Pan's Labyrinth isn't realistic, though it is shiny.)

    More efficient coding standards may help cut the bandwidth requirements quite severely, but at a certain point, you'll have to decide that you just don't care about image quality. And that sort of attitude won't sell 4k screens.
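    The scale of the problem this comment points at can be made concrete: dividing the raw 4K bit rate (computed later in the thread as about 4.78 Gb/s at 24 fps, 24-bit colour) by the delivery bit rates quoted above gives the compression ratio a codec would have to achieve. A rough sketch, using the thread's own figures:

    ```python
    # Sketch: how aggressively raw 4K must be compressed to fit the
    # delivery bit rates quoted in the thread (24-bit colour, 24 fps).

    uncompressed_bps = 3840 * 2160 * 24 * 24   # ~4.78 Gb/s raw 4K

    for label, target_mbps in [("Blu-ray AVC", 25), ("OTA 1080i MPEG-2", 15)]:
        ratio = uncompressed_bps / (target_mbps * 1e6)
        print(f"{label} ({target_mbps} Mb/s): ~{ratio:.0f}:1 compression needed")
    # → Blu-ray AVC (25 Mb/s): ~191:1 compression needed
    ```

    Roughly 200:1 compression just to hit Blu-ray bit rates illustrates the point: the codec, not the raw numbers, decides whether the result still looks like 4K.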

  • by Lachlan Hunt ( 1021263 ) on Thursday January 10, 2013 @06:58AM (#42544087) Homepage

    The article said 600 MB/s, not "Mbps". There is a difference. The former is megabytes per second, the latter is megabits per second. And, yes, it does.

    3840*2160 = 8294400 px/frame

    Colour depth is 24 bits per pixel

    And either 24, 25 or 30 frames per second, depending on whether it's the native film rate or adjusted for PAL or NTSC. In any case, the calculation is:

    3840*2160 px/frame * 24 bit/px * 24 frame/s = 4,777,574,400 bits/s = 4.78 Gb/s or around 597.2 MB/s

    It's obviously more for the higher frame rates.

"I'm not afraid of dying, I just don't want to be there when it happens." -- Woody Allen