4K Ultra HD Likely To Repeat the Failure of 3D Television 559

New submitter tvf_trp writes "Fox Sports VP Jerry Steinberg has just announced that the broadcaster is not looking to implement 4K broadcasting (which offers four times the resolution of today's HD), stating that 4K Ultra HD is a 'monumental task with not a lot of return.' Digital and broadcasting specialists have raised concerns about the future of 4K technology, drawing parallels with 3D's trajectory, which despite its initial hype has failed to establish a significant market share due to high prices and a lack of 3D content. While offering some advantages over 3D (no need for specs, considerable improvement in video quality, etc), 4K's prospects will remain precarious until it can get broadcasters and movie makers on board."
This discussion has been archived. No new comments can be posted.


  • by JDeane ( 1402533 ) on Thursday October 24, 2013 @08:33AM (#45222509) Journal

    But I don't want to pay 4K.

    • by jerpyro ( 926071 ) on Thursday October 24, 2013 @08:40AM (#45222577)

      I would love 4k too but I don't want to use it for a TV, I want to use it for a computer monitor (How many IDEs can you fit in 4k?). I keep looking at this particular TV and thinking about how much space I'd have to clear off on my desk to use it with my laptop:
      http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/dp/B00DOPGO2G [amazon.com]

      Much cheaper than a lot of the 4k monitors out there, but is the image quality good enough to not make your eyes bleed?

      • by JDeane ( 1402533 )

        Reading the first really long review, it seems to work best as a 1080p monitor for PCs.

        http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/product-reviews/B00DOPGO2G/ref=dp_top_cm_cr_acr_txt?ie=UTF8&showViewpoints=1 [amazon.com]

      • by Anonymous Coward on Thursday October 24, 2013 @09:39AM (#45223361)

        I bought their original 50inch model in May of this year to use as a monitor. I paid $1099 at the time, with Amazon Prime shipping.

        There were a few little annoyances immediately that I had to work out, and the Seiki support people were great. Got new firmware to fix a few things.

        The only functional issue I have left is that it won't auto-wake from the HDMI on my video card (it's actually the video card, not the monitor), so I have to hit the button.

        Overall I'm happy with it; here are a few quick comments:
        The screen is a little glossy for my taste, but not horrible (personal preference).
        The colors are a little oversaturated; I should probably do a color calibration on it.
        The monitor is a little too big; I actually have to turn my head and pick up my mouse more than I'd like for stuff on the far edge. I've been telling people a 42" would be about perfect, so the 39" looks nice, especially for the price.
        On a couple of games I've thought I've seen a little ghosting, but nothing horrible. At 4K the HDMI is only 30Hz, but the actual screen refresh is still normal.

        I originally said I would try it for 60days and worst case scenario it would become just another TV. That time expired in July and I'm still using it.

        Hope this helps.

      • by scamper_22 ( 1073470 ) on Thursday October 24, 2013 @10:06AM (#45223735)

        Maybe I'm just a simpleton, but I recently went out to get a new monitor.

        I ended up getting a 1080p 23 inch LED TV instead and just plug in my PC via HDMI.

        Now, like I said, I'm a simpleton, and I'm sure other people can make use of much higher resolutions or other characteristic that my simple eyes and brain cannot process.

        But for me, I sat there staring at the monitors and then the TVs. Then I looked at the price; they're about the same, and it just made sense to get the TV. It comes with built-in sound and a remote control (good for volume control too).

      • Seiki 4K (Score:5, Informative)

        by neoshroom ( 324937 ) on Thursday October 24, 2013 @01:12PM (#45226435)
        Most of the people who replied didn't answer your question, and most of those who did gave the wrong answer. A number of people said that the Seiki will only run at 1080p with a computer attached, which is just flat wrong.

        The 4K Seiki will run at full resolution in both the 39-inch and 50-inch models. The limiting factor on the Seikis is the connector, which is standard HDMI. A standard HDMI cable can't push 4K at more than 30Hz, which is a very low refresh rate for monitors these days. Indeed, the Seiki panel itself supports 120Hz, but because it only comes with a jack that allows 30Hz, you're stuck at 30Hz.

        In the next year, hopefully other companies or Seiki itself will come out with displays with HDMI 2.0 or Thunderbolt ports at similar price points. This will allow higher refresh rates, prevent screen tearing in 3D work and gaming, and improve fast-motion scenes.
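        The 30Hz ceiling described above falls out of simple bandwidth arithmetic; here is a rough sketch (assuming 24-bit color, ignoring blanking intervals, and taking ~8.16 Gbps as HDMI 1.4's usable data rate after 8b/10b encoding — all of those are simplifying assumptions):

```python
# Rough check of why first-generation HDMI caps 4K at 30 Hz.
# Assumes 24-bit color and ignores blanking, so numbers are approximate.

HDMI_1_4_DATA_RATE = 8.16e9  # bits/s usable after 8b/10b encoding (assumption)

def uncompressed_bitrate(width, height, bits_per_pixel, fps):
    """Raw video bandwidth in bits per second."""
    return width * height * bits_per_pixel * fps

rate_4k60 = uncompressed_bitrate(3840, 2160, 24, 60)  # ~11.9 Gbps
rate_4k30 = uncompressed_bitrate(3840, 2160, 24, 30)  # ~6.0 Gbps

print(rate_4k60 > HDMI_1_4_DATA_RATE)  # True: 4K@60 doesn't fit
print(rate_4k30 < HDMI_1_4_DATA_RATE)  # True: 4K@30 does
```

        The same arithmetic shows why an HDMI 2.0-class link (roughly double the data rate) is what makes 4K@60 feasible.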
    • by canadiannomad ( 1745008 ) on Thursday October 24, 2013 @08:41AM (#45222591) Homepage

      Let me put it this way:
      Linus Torvalds Advocates For 2560x1600 Standard Laptop Displays [slashdot.org]

      The fact that laptops stagnated ten years ago (and even regressed, in many cases) at around half that in both directions is just sad.

    • by AmiMoJo ( 196126 ) * on Thursday October 24, 2013 @08:42AM (#45222605) Homepage Journal

      It will be like HD and 3D. In a few years it will become standard on mid range and even cheap TVs.

      The key difference with 3D is not the cost of the TVs, it's the cost of the broadcast equipment and cameras. 3D was actually quite a cheap upgrade from HD, and most of the same equipment and software could be used with a few modifications. 4K is another ball game though.

      Even worse, there is 8K on the horizon as well, which will require yet more brand-new equipment. NHK, the Japanese national broadcaster that invented 8K, has stated that they will not support 4K at all and are instead looking at going directly to 8K around 2020 (in time for the Olympics). I have a feeling they may not be alone in wanting to wait, but of course TV manufacturers all want to push 4K as a reason for the consumer to upgrade or pay a premium.

      • 8K sounds like an opportunity for 3D 4K .....
        Someone just has to have the balls to make a decision.

      • 3D TV doesn't actually exist. Every production model on the planet is half-assed fake 3D that a good portion of the population can't even actually perceive 3D from.

        It's not 3D, it's lame-ass stereoscopy.

        3D TV requires my perspective to change when I move MY head, not just when the camera moves.

      • by skids ( 119237 ) on Thursday October 24, 2013 @09:02AM (#45222867) Homepage

        It will be like HD and 3D. In a few years it will become standard on mid range and even cheap TVs.

        ...and People On The Internet(TM) will still be complaining that it's all "hype" and will never make it in the market, even though they own one.

        • by jedidiah ( 1196 ) on Thursday October 24, 2013 @09:16AM (#45223015) Homepage

          Just because it's being force fed to you, it doesn't mean you are actually using it.

          I own a Smart TV but I have a Roku attached to it. If my next TV also has "smart tv" features, they will be completely transparent to me. It's like a PC that has a force bundled copy of Windows on it.

          Will never see it. Will never use it.

          The real question here is "where's the content?".

  • by Major Ralph ( 2711189 ) on Thursday October 24, 2013 @08:33AM (#45222513)
    I can understand why 4k televisions may not take off, but 4k monitors will definitely be a big deal. Just look at how AMD and NVIDIA are gearing up their GPUs to support it.
    • I'll start to be interested in 4K when there are cheap devices, displays and content worthy of driving a 4K display.

      Until home consoles are rendering 4K@60 frames per second comfortably across all games, or super-mega-ultra-duper-bluray becomes mainstream, I doubt your average joe will really care. Current generation consoles can't even do 1080p at decent framerates across all games. Though blu-ray is pretty nice.

  • by Jody Bruchon ( 3404363 ) on Thursday October 24, 2013 @08:34AM (#45222519)
    Existing 1080p quality can't be discerned as better by someone sitting 10 feet away on a couch looking at a 42" TV. Going past 1080p has no value whatsoever unless you're talking about insanely huge screens or impractically close viewing.
    • Re: (Score:2, Funny)

      by Anonymous Coward

      1080p ought to be good enough for anybody.

    • by Luthair ( 847766 )
      10 feet is well beyond the recommended viewing range for a 42" TV. THX, for example, recommends a 4-6 foot viewing distance for a 40" TV.
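      Figures like THX's come from recommending that the screen fill a certain viewing angle; a quick back-of-the-envelope sketch (the 36° angle and 16:9 aspect ratio here are assumptions, not quoted from THX documentation):

```python
import math

def thx_distance_ft(diagonal_in, view_angle_deg=36, aspect=16 / 9):
    """Viewing distance (feet) at which a 16:9 screen fills view_angle_deg."""
    # Screen width from the diagonal and aspect ratio.
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    # Distance at which that width subtends the target viewing angle.
    distance_in = (width / 2) / math.tan(math.radians(view_angle_deg / 2))
    return distance_in / 12

print(round(thx_distance_ft(40), 1))  # 4.5 -- consistent with "4-6 feet" for a 40" TV
```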
      • by Guppy06 ( 410832 )

        Yeah, but does it say that on the TV's box, or anywhere in the documentation?

        Parent is going by typical usage, not recommended usage.

    • Existing 1080p quality can't be discerned as better by someone sitting 10 feet away on a couch looking at a 42" TV. Going past 1080p has no value whatsoever unless you're talking about insanely huge screens or impractically close viewing.

      A: Existing 1080p quality can't be discerned as better by someone sitting 10 feet away on a couch looking at a 42" TV
      B: Going past 1080p has no value whatsoever unless you're talking about insanely huge screens or impractically close viewing

      You're implying:
      C: 42" is insanely huge.

      My answer is:
      C is demonstrably false, as I'm about two feet away from the screen I'm using at this very moment.
      D is demonstrably false, as many sane people buy larger screens.

      I suggest you rethink your position replacing distance and size by field of vision. Your previous statement would turn into "a field of vision over n degrees is useless". To which I'd answer "Anything less than my entire FoV is not enough."

      • Existing 1080p quality can't be discerned as better by someone sitting 10 feet away on a couch looking at a 42" TV. Going past 1080p has no value whatsoever unless you're talking about insanely huge screens or impractically close viewing.

        A: Existing 1080p quality can't be discerned as better by someone sitting 10 feet away on a couch looking at a 42" TV
        B: Going past 1080p has no value whatsoever unless you're talking about insanely huge screens or impractically close viewing

        You're implying:
        C: 42" is insanely huge.

        My answer is:
        C is demonstrably false, as I'm about two feet away from the screen I'm using at this very moment.
        D is demonstrably false, as many sane people buy larger screens.

        I suggest you rethink your position replacing distance and size by field of vision. Your previous statement would turn into "a field of vision over n degrees is useless". To which I'd answer "Anything less than my entire FoV is not enough."

        I am lost, what was D again?

      • I meant
        C: 42" is insanely huge.

        C is demonstrably false, as I'm about two feet away from the screen I'm using at this very moment.
        D is demonstrably false, as many sane people buy larger screens.

      • You're implying:

        C: 42" is insanely huge.

        My answer is:
        C is demonstrably false, as I'm about two feet away from the screen I'm using at this very moment.
        D is demonstrably false, as many sane people buy larger screens.

        I suggest you rethink your position replacing distance and size by field of vision. Your previous statement would turn into "a field of vision over n degrees is useless". To which I'd answer "Anything less than my entire FoV is not enough."

        I've never understood these people who never get close to their monitor to see more detail.
        For me it is the most natural thing to want to do instead of "zooming."
        Just cause I can zoom doesn't mean that sometimes I won't want to actually get closer and look.

        • There is such a thing as the "Resting Point of Vergence" which is the shortest distance at which people's eyes can focus effortlessly and indefinitely. The average is 45" looking straight ahead and 35" looking on a 30 degrees down-angle. Sitting closer to your TV/monitor than your RPV will cause eye fatigue over time. In my case, that distance is around 30" looking straight ahead. For some people, it can be as short as 15". But the average is 45".

          So for most people, sitting close enough to their monitor(s)

      • Re: (Score:3, Funny)

        by BitZtream ( 692029 )

        You're 2 feet away from a 42" display?

        Are you stupid? Does your neck hurt yet? Are you tired of having to lean over to get a good head-on look at the third of the screen on either side of the middle, or do you just ignore two-thirds of your screen?

        Sitting 2 feet away from a 42" display makes you a moron unqualified to continue this conversation.

        • by Immerman ( 2627577 ) on Thursday October 24, 2013 @10:06AM (#45223743)

          I take it you've never played a first-person game on a 40" screen. Granted, I'm usually a little closer to 3' away (arm's length is the recommended distance to sit from a monitor). Filling a larger portion of your FOV is a great way to boost immersiveness. And yes, I do have to move my eyes a lot to see the full detail in the corner of the screen, but I have to do that out in the real world too.

          Works great for office work as well (though that 4K resolution would be a huge bonus), in which case I'm generally only looking at a portion of the screen at a time, but can switch between tasks/monitor different things simply by moving my eyes, almost like working on a physical desk. And it's a big boost over multiple monitors in that you can size windows to whatever size and aspect ratio makes sense for the tasks at hand.

      • by mwvdlee ( 775178 )

        I don't think that's what GP was implying at all.
        In fact quite the opposite.

        A. 1080p on 42" at 10 feet away is more than most people can discern.
        B. More than 1080p on 42" at 10 feet away has no value; it may have value on a screen far bigger than 42" at 10 feet, or on a 42" far closer than 10 feet.

        So he seems to imply;
        C: 42" is less than 'insanely huge'.

        Also, your assertion of D may or may not be correct, since D is undefined.

    • Why are bigger screens insane and sitting closer impractical?
      • by Guppy06 ( 410832 )

        Not going to speak to the sanity of screen size, but with respect to sitting closer: most people would prefer watching television in a living room rather than a closet.

    • by AHuxley ( 892839 )
      As people have hinted, go to a shop and look. 4K is really what 1080p 'should' have been, finally arriving years later. This will be great with new or cleaned-up digital media.
    • by JaredOfEuropa ( 526365 ) on Thursday October 24, 2013 @09:59AM (#45223633) Journal
      4k may not make much sense on a 42" TV, but on 55" the difference is clearly visible. And screens are getting bigger all the time, with sizes around 65" being common and even a few screens of over 100" hitting the market.

      Also the comparison to 3D is flawed. 3D requires 3D content, but viewing stuff on a 4k screen carries a benefit even for content not in that resolution. Compare an ordinary blu-ray on a HD screen and a 4K one (both 55" or over); you'll see a marked difference in quality thanks to the upscaler. The same way DVDs look way better on my upscaling HD screen than they do on a lower res one of the same size.
  • Hnnnnnggggg (Score:5, Insightful)

    by L4t3r4lu5 ( 1216702 ) on Thursday October 24, 2013 @08:36AM (#45222531)
    To make full use of that resolution ("Retina" quality, i.e. indistinguishable pixels) at a viewing distance of 10ft you'd need a 150" screen. That's about 11ft wide by 6ft tall.
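    The 150" figure can be sanity-checked with the usual "retina" criterion (roughly one pixel per arcminute of visual angle); here's a rough sketch, assuming a 16:9 panel and 3840 horizontal pixels:

```python
import math

# "Retina" criterion: pixels stop being distinguishable once each pixel
# subtends less than ~1 arcminute of visual angle at the viewing distance.
ARCMINUTE = math.radians(1 / 60)

def retina_diagonal(distance_in, h_pixels=3840, aspect=16 / 9):
    """Largest 16:9 diagonal (inches) that stays 'retina' at distance_in."""
    pixel_pitch = distance_in * math.tan(ARCMINUTE)  # max invisible pixel size
    width = pixel_pitch * h_pixels
    return width * math.hypot(aspect, 1) / aspect    # width -> diagonal

# At a 10 ft (120") couch distance, 4K remains "retina" up to roughly:
print(round(retina_diagonal(120)))  # 154 -- i.e. about the 150" claimed above
```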
    • To make full use of that resolution ("Retina" quality, i.e. indistinguishable pixels) at a viewing distance of 10ft you'd need a 150" screen. That's about 11ft wide by 6ft tall.

      Still much smaller than an average wall.

    • Re:Hnnnnnggggg (Score:5, Insightful)

      by MozeeToby ( 1163751 ) on Thursday October 24, 2013 @08:59AM (#45222827)

      The ability to see individual pixels is not the limit of perceptible improvement, though. Even on 'retina' displays there is visible aliasing on diagonal lines. Think about it like this: a 12nm chip fab produces individual elements at 12nm, but places them with much, much better than 12nm accuracy.

    • Re: (Score:3, Insightful)

      by jon3k ( 691256 )
      Where are you getting your numbers from?

      Using this: http://isthisretina.com/ [isthisretina.com]

      I got: a 70" 4K display becomes "retina" at a viewing distance of 55 inches.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      This, like the "you can only see so many colors" argument, is misleading.

      You absolutely can tell the difference between 4k and 1080p at average viewing sizes and distances - but not because you can pick out the individual pixels.
      Lower pixel density creates visual artifacts. Aliasing, uneven gradients, pixel pop (Where small elements or points of light like stars get lost between large pixels), etc.
      If you see a 4k and a 1080p display side by side the difference is shocking.

      There is absolutely a place for 4k TVs.

    • Re:Hnnnnnggggg (Score:4, Informative)

      by jddj ( 1085169 ) on Thursday October 24, 2013 @09:49AM (#45223503) Journal

      I've seen 4K on a not-yet-released 20-inch Panasonic tablet [engadget.com] - it's jaw-dropping. You might not be making "full use", but...oh, my it's beautiful. This from a guy who doesn't care much for TV or video.

      OK, you're asking "why a 20" tablet? WTF?" - one vertical market for this is radiologists, who definitely need all the resolution they can get, high dynamic range, and a big screen. Saw it at a medical convention.

  • Cable companies have a hard enough time providing enough bandwidth for more than a couple HD channels; where are they going to find the bandwidth for 4K Ultra HD? Does Blu-ray even have the ability to take advantage of this technology? How about gaming platforms? What, exactly, would let someone be able to justify their investment?
    • Supposedly the PS4 and Xbox One will support 4K displays, but I imagine it will be done by either upscaling or sacrificing various graphical effects.
      • Yes, but 4K content can be rendered in a video game given the right hardware/software. 4K video requires that the content be recorded and sent at that resolution. Content providers like cable channels aren't even producing all of their content in 1080p yet, much less 4K.
    • Cable companies have a hard enough time providing enough bandwidth for more than a couple HD channels, where are they going to find the bandwidth for 4K Ultra HD?

      Exactly, I get 720p from TV, and in places I can see where they're compressing it down and it looks blocky.

      In the abstract, this might be good. But from a practical purpose, my cable company isn't delivering 1080p to me now, there's no way they'd give me 4K.

      This makes sense for movie theaters, but for consumers I think this is a complete dead end.

      • Exactly - if content providers aren't even willing to send enough bitrate through the pipe to deliver a satisfactory experience by today's HD standards, who on Earth would imagine they'd do justice to 16x the bandwidth requirement just a few years from now? Some broadcasts are still MPEG-2; some others are MPEG-2 but get passed through a last-leg AVC transcoder to save bandwidth; and while AVC's enjoying healthy adoption, there's no way to expect most companies will pay the hefty fees to adopt HEVC equipment.
      • by Mitsoid ( 837831 )

        Not to mention... if you want to stream a 4K show over Hulu or Netflix, you'd hit your AT&T or cable provider's MONTHLY bandwidth cap in ~5 minutes (variable; depends on compression, cap, buffering, throughput, etc.).

        Reminds me of LTE. Verizon Wireless is quick to point out you can download from them at over 50Mb/sec... they won't tell you that after 60 seconds your phone bill is now $600.
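        The ~5-minute figure only works out if you assume an essentially uncompressed stream; a rough sketch (the 250 GB cap, 24-bit color, 30 fps, and 25 Mbps compressed rate below are all illustrative assumptions):

```python
# How fast a 4K stream would burn through a monthly data cap.
# Assumptions (hypothetical): 250 GB cap, 24-bit color, 30 fps.

CAP_BITS = 250e9 * 8                  # 250 GB cap expressed in bits
RAW_4K_BPS = 3840 * 2160 * 24 * 30    # ~6.0 Gbps, uncompressed

minutes_to_cap = CAP_BITS / RAW_4K_BPS / 60
print(round(minutes_to_cap, 1))  # 5.6 -- minutes, uncompressed

# A realistically compressed 4K stream (~25 Mbps assumed) lasts far longer:
print(round(CAP_BITS / 25e6 / 3600, 1))  # 22.2 -- hours
```

        So the "5 minutes" quip is about raw pixel rates; with real-world compression the cap still bites, just over hours rather than minutes.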

    • by Guppy06 ( 410832 )

      Cable companies have a hard enough time providing enough bandwidth for more than a couple HD channels, where are they going to find the bandwidth for 4K Ultra HD?

      They can start by charging for analog channels commensurately with the bandwidth they use, rather than giving away the analog stuff they modulate in-house for "free" while charging "extra" for digital content they've nothing to do but encrypt.

    • by AHuxley ( 892839 )
      I think some are hoping a good codec plus 'time' via local storage will outpace the bandwidth limits of rotting telco copper and HFC until optical is ready.
      http://www.red.com/store/products/redray-player [red.com]
  • Fix HD First (Score:5, Insightful)

    by Rob Riggs ( 6418 ) on Thursday October 24, 2013 @08:37AM (#45222543) Homepage Journal
    Why the heck would I want UHD when most HD content is so compressed that the artifacts are easily discernible from across the room? At least that is my experience with every HD medium I have seen: OTA, cable, satellite, and to a much lesser degree Blu-ray.
    • Re:Fix HD First (Score:5, Insightful)

      by MightyYar ( 622222 ) on Thursday October 24, 2013 @08:48AM (#45222679)

      I came here to post this. I'm in the minority, but to my eye it is more pleasant to watch the old grainy picture than it is to watch compressed high resolution video. In particular, my eye gets drawn to grass. Every time I watch a game played on grass (baseball, football, the other football, etc), the digital compression just hijacks my eyes. I can learn to ignore it over time, like watching a movie with subtitles, but it still is not my preference.

      • Re:Fix HD First (Score:4, Interesting)

        by dinfinity ( 2300094 ) on Thursday October 24, 2013 @10:01AM (#45223677)

        If it's an option, use ffdshow. Add noise.

        Best way to turn almost all compression artifacts into regular noise. Your brain is great at perceiving that as being higher quality imagery.
        Using post resize noise or post resize sharpening (MPC-HC or MPC-BE sharpen complex 2) also works great to turn 720p content into '1080p'.

    • Re:Fix HD First (Score:4, Informative)

      by Russ1642 ( 1087959 ) on Thursday October 24, 2013 @08:48AM (#45222681)

      I agree. Compression is the primary issue here. Make the resolution 10k and it'll still look like crap because of the heavy compression. But if you're claiming to see compression artifacts on a blu-ray disc I think you need your eyes checked. Those usually don't use anywhere near the compression of cable TV.

    • Re: (Score:3, Insightful)

      Why the heck would I want UHD when most HD content is so compressed that the artifacts are easily discernible from across the room? At least that is my experience with every HD medium I have seen: OTA, cable, satellite, and to a much lesser degree Blu-ray.

      You have a point, but you lost credibility when you included OTA in that list. OTA is uncompressed 18.2mbit MPEG. There is no point in compressing an OTA broadcast because the bandwidth is functionally unlimited, and I don't even think that the ATSC standard supports compression beyond normal MPEG2. When you see artifacts on an OTA broadcast it is most emphatically *not* from compression, it's usually from interference or a badly tuned/aligned antenna.

      With a proper antenna setup, an OTA HD broadcast looks p

      • Re:Fix HD First (Score:5, Informative)

        by Trip Ericson ( 864747 ) on Thursday October 24, 2013 @09:23AM (#45223115) Homepage

        MPEG-2 is compressed by definition; an uncompressed HD picture is something like 1 Gbps. Confetti, for example, looks awful no matter what the source, because it's hard to compress.

        The only reason MPEG-4 isn't supported in ATSC is because it didn't exist when the standard was written! MPEG-4 is actually now in ATSC, but is not a required part, so no receivers support it and no broadcasters use it except in rare corner cases.

        And it's only 18.2 Mbps if there are no other services on the OTA channel; some stations in smaller markets now cram 3 HD services into the 19.393 Mbps channel, which is an average of about 6 Mbps per video channel when you take into account audio and overhead. Most other stations run at least one SD channel in addition to the HD channel, many run more than one. Others are doing Mobile DTV which eats into the bandwidth available. The bitrate of a single HD feed averaged across all OTA stations in the US and Canada is something in the neighborhood of 13 Mbps in MPEG-2.

        Obligatory disclaimer: I used to work for a broadcast TV company heading up our broadcast TV engineering projects. I now work for the FCC on over-the-air digital TV matters. In my spare time, I run digital TV website RabbitEars.Info.
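        The per-service arithmetic above is easy to reproduce; a rough sketch, taking the 19.393 Mbps channel rate from the comment and assuming ~0.5 Mbps of audio and table overhead per service (that overhead figure is an assumption):

```python
# Splitting one ATSC channel among multiple HD services.
# Assumptions: 19.393 Mbps channel, ~0.5 Mbps audio/overhead per service.

ATSC_MBPS = 19.393
OVERHEAD_PER_SERVICE = 0.5  # audio, PSIP tables, etc. (rough assumption)

def video_mbps_per_service(num_services):
    """Average video bitrate left for each service on one channel."""
    return ATSC_MBPS / num_services - OVERHEAD_PER_SERVICE

print(round(video_mbps_per_service(1), 1))  # 18.9 -- a lone HD feed
print(round(video_mbps_per_service(3), 1))  # 6.0  -- three HD feeds crammed in
```

        Which matches the comment's point: three HD services on one channel leaves each with roughly a third of the bitrate a lone feed would get.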

      • by crow ( 16139 )

        OTA broadcast is, as you say, MPEG. Or, more precisely, MPEG-2. To say it is uncompressed is completely false. It may not be over-compressed, but you still see artifacts in scenes that the compression can't handle well, particularly scenes with rain or fire--anything chaotic where there are massive changes between frames.

      • Re: (Score:3, Informative)

        by lobosrul ( 1001813 )
        There are so many facts wrong in your post that I sincerely hope you don't work in the technical field of broadcasting. OTA uses MPEG-2 (same codec as on DVDs), which is a lossy compression technique. ABC, NBC, and CBS stations all take an MPEG-4 feed from their network and re-encode it to MPEG-2; FOX stations get MPEG-2 video that they then "splice" in their network bug and local commercials and promos. Getting a lousy picture on digital TV from a poor or unaligned antenna is a lie that salesmen use. If th
  • There's a simple reason for this ... people don't care, and don't have the money to replace their TVs just because something new and shiny comes along.

    I'd need to replace my amp, my DVD player (or whatever it would be called), my TV and who knows what else. All to get me a marginally better display?

    No thanks.

    I'm interested in 4K for my computer monitor, but the ever-changing standards around TV make it a nuisance.

    I know plenty of people who bought "HDTV" early in the game, only to find out when HD became

    • What consumers want is a stable technology, not to be on a constant upgrade treadmill

      There are different kinds of consumers. What you say is probably true of the consumers buying their sets at Walmart and Target, and it's probably true of me as well (at least to a degree). But I know plenty of people who are always on the bleeding edge. This 4k stuff is blatantly targeted at those consumers, and it may or may not trickle down to the rest of us... sometimes these high end things succeed (hi-fi VHS, HDTV) and sometimes they fail (videodisk, DVD audio), but the high-end, bleeding edge folks ge

  • by ShooterNeo ( 555040 ) on Thursday October 24, 2013 @08:40AM (#45222571)

    As it is right now, the only true 1080p content is high bitrate blu-ray disks, and PC games. There is nothing else.

    None of the currently released consoles can render 1920x1080 at 60 fps: they use a lower frame rate (30 fps) and a lower rendering resolution (not even 720p internally for most games). The next gen can maybe do it, but I suspect that some games will use lower frame rates or internal resolutions so that they can put more detail into other things.

    Broadcast channels, satellite channels, and HD cable channels are all generally full of lower-bitrate tradeoffs. You need about 30-50 Mbps to do 1080p without compromises or visible encoding errors.

    Maybe in another 10 years, when the technology is actually fully utilizing the 1080p displays we already have, will an upgrade make sense.

    Note that this is for video content. For your computer or tablet PC, higher resolutions are useful, and shipping tablets are already at higher resolutions.

  • I still don't get 1080p over digital cable, I don't see 4k coming anytime soon. The only 1080p content I have that gets displayed on my TV is blu-ray, video games, and digital downloads.
  • Is also a movie studio and is already selling 4k content.
  • by PortHaven ( 242123 ) on Thursday October 24, 2013 @08:42AM (#45222613) Homepage

    I really do. It wasn't much added cost. And yes, the problem is content. Movie makers should do a year of 3D, and actually sell the 3D at the same price. People would buy more 3D TVs.

  • There is way too much current content that is still not transmitted in 1080p. Buying a new (expensive) TV just to display most shows in standard resolution makes no sense at all. Yes, I know live broadcasts are usually in high def, but one can only watch so much sports on TV. To be fair, I think it is actually a legacy problem. There is so much good legacy content recorded in standard definition that it is tough for new content to compete, at least from a percentage perspective. Best excuse for a good
  • Actual 1080p isn't even here yet for a lot of media. Most games and TV stations still only use 720p, and there are quite a few movies in that mode as well. It's no surprise that no major content provider is considering 4K at this point.

  • The timing for 4K is just too soon and wrong. Firstly, we're in a delicate financial situation around the world and the biggest consumer nation is on the edge of collapse. It seems like only a few days ago we went to digital TV. People are STILL getting rid of the CRT TVs. And the marketers are trying to sell us 4K TVs??! I'm sorry but no. Just no.

  • by argStyopa ( 232550 ) on Thursday October 24, 2013 @08:56AM (#45222791) Journal

    3D TV's failure was most certainly not a 'lack of content', and if it's perceived that way by the media mavens, then the same mistakes will be repeated.

    3d failed because:
    - technologically not-ready-for-prime-time; wearing uncomfortable specs etc wasn't popular in theaters the FIRST go around with 3d.
    - people recognized it for what it was: a money-grab by hardware producers trying to re-milk the public that had already been forced to go out and buy all-new digital tvs.

  • Just give it up. Broadcast TV standards don't change overnight, and 4k is going to take huge effort, to provide a small improvement.

    You're talking about making all those receivers people just went out and bought completely useless. The government would have to PAY to replace them, just like they did with digital converter boxes a few years ago.

    And don't tell me about satellite/cable companies! They lag BEHIND broadcasters, they do not take the LEAD... And internet service looks to be even more bandwidth constrained.

  • and just repost every complaint about going to 1080p from 10 years ago? Just replace 1080 with 4k,
    or flat screen with 4k.

    People are going to want 4k because it's stunning.

    If I had time I would look at the history of the loud complainers and see if they were the people saying no one would do HD or pay for a flat screen.

  • by trongey ( 21550 ) on Thursday October 24, 2013 @09:51AM (#45223541) Homepage

    Wake me when they announce 640K.
    That should be enough for anybody.

  • 4K is stunning (Score:5, Informative)

    by peter303 ( 12292 ) on Thursday October 24, 2013 @09:55AM (#45223579)
    When I visit the local Sony store and see the 4K (with true 4K content) side-by-side with their best regular HDTVs, the improvement is quite stunning. They get pretty close to the "appearing like a real window rather than just a TV" threshold.
  • by ThomasBHardy ( 827616 ) on Thursday October 24, 2013 @03:00PM (#45227883)
    I look at this article and I see the other article today at http://yro.slashdot.org/story/13/10/23/2213237/top-us-lobbyist-wants-broadband-data-caps [slashdot.org] for broadband data caps and clearly these two things are opposed initiatives, both designed to make more money by treating the public as money pinatas.
  • by a4r6 ( 978521 ) on Friday October 25, 2013 @09:20AM (#45233805)
    The point is not to have 4 times as many things on the screen as a 1080p monitor; it is to have a 2:1 pixel ratio (like all the Apple Retina displays) or somewhere in between. Web content, thanks partly to Apple pushing high-DPI displays, is now often tuned for this, showing you twice as much detail in the same space while keeping the dimensions it would have on a normal-DPI display.

    Read what AnandTech had to say about testing a 4K monitor, and about how nice it is to look at fonts that aren't just anti-aliased, but hardly have aliasing to begin with, thanks to the DPI.

    I run a 1440p monitor, as it was the most pixels I could reasonably afford, (4K is just too much $) and I scale everything up so it's roughly 1080p sized. I love it for the clarity and sharpness, not for the number of things I can cram on the screen. (Although I do run my font just a little small in my text editor/ide)

    There are of course downsides besides the price. Most of the 1440p monitors have poor input latency, meaning your mouse might feel a tiny bit laggy or put you at a slight disadvantage if you're a gamer, compared to lower latency 1080p monitors. That's totally ignoring whether your video card can render smoothly at that resolution. With 4K I'm not sure but I suspect it's the same or worse.
