Television

Why You Shouldn't Buy a UHD 4K TV This Year 271

Lucas123 writes "While it's tempting to upgrade your flatscreen to the latest technology, industry analysts say UHD TVs are still no bargain, with top brand names selling 65-in models for $5,000 or more. And, even though 4K TVs offer four times the resolution of today's 1080p HDTVs, there are no standards today for how many frames per second should be used in broadcasting media. Additionally, while there's plenty of content being produced for UHDs, little has been made available."
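
As a quick sanity check of the "four times the resolution" line (simple arithmetic, not a figure from the article), the factor comes from doubling both dimensions:

```python
# UHD ("4K") doubles both the horizontal and vertical pixel counts of 1080p,
# so the total pixel count goes up by a factor of four.
hd_pixels  = 1920 * 1080   # 2,073,600
uhd_pixels = 3840 * 2160   # 8,294,400
print(uhd_pixels / hd_pixels)  # 4.0
```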
  • Your argument is invalid. See subject.
    • Also Linux friendly (Score:5, Informative)

      by SuperKendall ( 25149 ) on Wednesday November 27, 2013 @04:46PM (#45542401)

      If you look at the TV on Amazon [amazon.com] (not an affiliate link), one of the top-rated comments is a really helpful set of instructions for getting it to work well under Linux.

      I have to admit I am strongly tempted to get the monitor for programming, and there are some indications it might be good for photo work after calibration. But I would really love to see one in person first.

    • by Dachannien ( 617929 ) on Wednesday November 27, 2013 @04:46PM (#45542403)

      You forgot to factor in the cost of the microscope you'll need to see any additional detail at 4k on a 39" screen.

      • I can easily see pixellation on the 30" 2560x1600 monitor I'm sitting at. Please step aside and make way for progress.
        • To play devil's advocate here, how far away from that monitor are you sitting? How far away from your TV do you sit?
          • by h4rr4r ( 612664 )

            To be a reasonable person here, if you are using that TV as a monitor you would sit the same distance.

            Even if you are using it as a TV, my couch is not bolted to my living room floor, and I doubt yours is either.

              Even if you are using it as a TV, my couch is not bolted to my living room floor, and I doubt yours is either.

              Mine is bolted to the ceiling! It's because I am that awesome.

          • I'm not the guy you want to ask that question; I have a Linux PVR connected to my TV, so I do use it as a computer sometimes.

            I am disappointed the new Playstation 4 and XBox One won't support 4k gaming though. A 4-way head-to-head game (remember Goldeneye?) would be so cool on that. I wonder if any PC games would allow me to run two instances and 'network' them.

            • I am disappointed the new Playstation 4 and XBox One won't support 4k gaming though

              From what I read, it seems to be questionable whether or not they'll be able to support 1080p gaming (well).
              Framerate > pixel count anyhow.

          • by F.Ultra ( 1673484 ) on Wednesday November 27, 2013 @05:32PM (#45543001)
            Who buys a larger TV just so that they can sit further back in the room? I bought my 64" to get a bigger screen, not to sit far far away.
        • Re: (Score:3, Insightful)

          by Anonymous Coward

          I can easily see pixellation on the 30" 2560x1600 monitor I'm sitting at. Please step aside and make way for progress.

          Wait a few years... screens will get better, and your eyes will get worse. Soon, you'll have nothing to worry about.

      • 39" is a fairly modest TV; but a big monitor. Like 'dominates your desk' big. I suspect that the bigger question would be whether you find yourself comfortably able to use real estate that is that far out of the center of your field of view (and, unlike dual or triple monitor setups, is all fixed in the same plane, rather than in two or more individually rotated chunks).
    • It's also a crap TV, but hey, who are we to disagree?

      This is like saying that a Vizio 67" 720p screen exists. It doesn't mean people with common sense should just give up their money and abandon all logic. In the non-TLDR form: it's a 30Hz (read: half of the supposed maximum for the human eye, which has been debunked) 4K display. Even the worst of TVs can handle a proper 60Hz at all resolutions.

      Hell, many graphics cards can output 4k at 30 fps. 60 is a different story.

      • by pla ( 258480 )
        it's a 30Hz (read: half of the supposed maximum for the human eye, which has been debunked) 4K display. Even the worst of TVs can handle a proper 60Hz at all resolutions.

        True, but only by a technicality.

        Refresh rates haven't mattered nearly so much since the bad ol' days of having an electron beam scan a screen so it updates in pulses of brightness. At 30Hz, a CRT causes massive headaches. At 60Hz, most people could feel the eye strain after a while. But with a display that doesn't flicker, none of that applies.
        • by Kjella ( 173770 )

          If you actually had a true full-4k feed at 60+hz (and did I miss the announcement of a mainstream optical media format that holds more than a terabyte?), you could - marginally - detect the difference in a rapidly moving scene.

          Get some 60p content and turn it into 30p; I think you'll notice the difference. The "filmatic" 24p is *very* noticeable. But yeah, for writing code, no problem.

        • by jandrese ( 485 )
          For watching movies or writing code, that 30Hz refresh will be perfectly fine. Fire up an FPS on there and you'll be feeling the choppiness. The difference between 30fps and 60fps is absolutely noticeable in fullscreen games.
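
          For a rough sense of the numbers (my arithmetic, not the poster's), the per-frame time budget roughly halves going from 30fps to 60fps:

          ```python
          # Frame-time budget at common refresh rates (pure arithmetic).
          for fps in (24, 30, 60, 120):
              print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
          ```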
  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday November 27, 2013 @04:42PM (#45542339) Homepage

    there are no standards today for how many frames per second should be used in broadcasting media.

    Rec. 2020 [wikipedia.org], a standard used by UHD, specifically gives framerates of 120p, 60p, 59.94p, 50p, 30p, 29.97p, 25p, 24p, and 23.976p.
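
    (Side note, not from the spec text itself: the fractional rates in that list are just the familiar NTSC-legacy 1000/1001 variants of 24, 30 and 60.)

    ```python
    # The "odd" Rec. 2020 rates are the usual NTSC-legacy 1000/1001 variants.
    for base in (24, 30, 60):
        print(f"{base} * 1000/1001 = {base * 1000 / 1001:.3f}p")
    ```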

    • It's not that content can't be properly produced or formatted. It's getting it to you that's the problem.

    • Is rec. short for approved standard or recommendation?

      • Is rec. short for approved standard or recommendation?

        When speaking of video standards, Rec. 2020 is short for ITU-R Recommendation BT.2020. Not sure who started that abbreviation, but it's stuck.

    • Great, now I can finally watch the HFR version of the Hobb... oh, wait... [wikipedia.org]

    • Re:Err, what? (Score:4, Informative)

      by ahabswhale ( 1189519 ) on Wednesday November 27, 2013 @10:13PM (#45545453)

      HDMI 2.0 went official only a couple of months ago and none of the sets on the market today support it, so you're limited to 24p. In short, the TV is obsolete the moment you buy it. It's the dumbest purchase you could possibly make right now (in regards to a TV). I would also add that if you have an AV receiver, then you will need to upgrade that to a new model that has HDMI 2.0 as well, and they don't exist either.
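
      A rough bandwidth check (my arithmetic, assuming 8-bit RGB and standard CTA-861 timing with blanking; none of these figures are from the post) shows why 4K@60 needs HDMI 2.0 while 4K@30 can squeeze through an HDMI 1.4-class link:

      ```python
      # Link data rate for UHD video, including blanking intervals.
      def data_rate_gbps(h_total, v_total, fps, bits_per_pixel=24):
          return h_total * v_total * fps * bits_per_pixel / 1e9

      HDMI_1_4 = 8.16   # Gbit/s effective (340 MHz TMDS clock, 8b/10b coding)
      HDMI_2_0 = 14.4   # Gbit/s effective (600 MHz TMDS clock)

      for fps in (30, 60):
          need = data_rate_gbps(4400, 2250, fps)  # 4K CTA-861 total raster
          print(f"4K@{fps}: {need:5.2f} Gbit/s  "
                f"fits HDMI 1.4: {need <= HDMI_1_4}  fits HDMI 2.0: {need <= HDMI_2_0}")
      ```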

    • by Dahamma ( 304068 )

      That is NOT a broadcast standard, it's just an ITU recommendation.

      ATSC (the current US standard) included a bunch of horrible choices basically made because some US companies had certain tech and wanted to use theirs over others - for example, using 8VSB over the superior OFDM. But that's just one example of the ridiculous politics that play into the real "standards" vs. the "recommendation" that you quoted...

  • Early Adopters (Score:5, Insightful)

    by almitydave ( 2452422 ) on Wednesday November 27, 2013 @04:42PM (#45542343)

    But we need the deep-pocketed early-adopting suckers to offset R&D costs as much as possible so the prices come down for us average Joes when the content is actually widely available!

    • by Todd Palin ( 1402501 ) on Wednesday November 27, 2013 @04:44PM (#45542377)
      Maybe they can trade in their 3-D TVs.
      • I'll have you know that I watched several hours of the Olympics (and nothing else ever) in 3D last year, thank you very much.

        Also, my lawn. Get off it.

        • I'll have you know that I watched several hours of the Olympics (and nothing else ever) in 3D last year, thank you very much.

          I'm guessing a split between beach volleyball and ladies tennis.

      • by h4rr4r ( 612664 )

        I will be getting one of those this year if only so I can rewatch the Dr Who 50th in 3D. I intended to buy a bigger TV anyway and adding in 3D is pretty cheap.

  • by Anonymous Coward on Wednesday November 27, 2013 @04:42PM (#45542351)

    I don't need an analyst to tell me not to spend $5000 on a TV. That's common sense. Duh.

  • by mwvdlee ( 775178 ) on Wednesday November 27, 2013 @04:44PM (#45542371) Homepage

    Why You Shouldn't Buy a UHD 4K TV This Year

    Because there is very little content for it.

    • by Pope ( 17780 )

      Because there is very little content for it.

      And there's no point in such a high resolution standard for the home user at this point anyway.

      On top of that, the very name of the standard is misleading, which puts me against it regardless.
      1080p = 1920x1080 pixels. Easy to understand.
      4K = 3840×2160 pixels. Why not just call it 2160p so we have something easy to compare to?

      • by Qzukk ( 229616 )

        Why didn't they call it 4X since it's 4X the pixels?

        Because then they couldn't steal the thunder from the real 4K 4320p standard that was being worked on, which everyone would have just sat out a TV generation to wait for.

        TV Tokyo is promising 4320p (now called "8K" just to keep one step ahead) broadcast for the 2020 Olympics. Are you sure you want to buy that "4K" TV now?

        • Lots of companies were showing off 4K screens at CES, and they look good. Sharp was the only company with an 8K screen, and it was jaw dropping.

      • On top of that, the very name of the standard is misleading, which puts me against it regardless.
        1080p = 1920x1080 pixels. Easy to understand.
        4K = 3840×2160 pixels. Why not just call it 2160p so we have something easy to compare to?

        There is a whole other industry out there that measures by a completely different method than we're used to, but as usual, tech doesn't define trends. Smart salesmen do.

        But let's think like consumers for five minutes, and accept a bit of tongue-in-cheek realism.
        It's not as easy as you think to recall "2160p" when you want to ask to see one at the store so you can... BUY it! ;-)

        * how many non-zero numbers do I have to recall, and in what order did they go, again?
        * what do I do if I see the 720 and 1080 stickers, but want to ask if they carry the 2160 ones?

      • by wonkey_monkey ( 2592601 ) on Wednesday November 27, 2013 @06:31PM (#45543547) Homepage

        Why not just call it 2160p so we have something easy to compare to?

        1080p = ten-eighty-pee = 4 syllables
        2160p = twenty-one-sixty-pee = 7 syllables

        That's why.

        • Replying to fix mod. This is a fairly spot-on assessment. If 4K weren't 2 syllables, I imagine they would be using something else as well.
    • I for one, cannot wait to see the new edits George Lucas has planned for the UHD version of "A New Hope". But that probably won't be for another few years.

    • While content would be nice, it's not 100% necessary. My 64" 1080 plasma shows SD content much better than a 64" 480 plasma ever would have.
  • by onyxruby ( 118189 ) <onyxruby&comcast,net> on Wednesday November 27, 2013 @04:45PM (#45542387)

    Follow the porn industry; they have an unblemished track record going back decades of being at the bleeding edge of technology. From VHS to DVD to any number of other technologies, porn was there first at any notable level. The rule of thumb for buying new technology without paying an arm and a leg is porn adoption + 4 years. That gets you past the bleeding-edge costs and the differing standards, and the price typically settles down.

    • That reminds me of the time porn thought it was a good idea to have movies where you could cut to different camera angles in the same scene. An intriguing idea for titillating video, but for mainstream content, directors tend to want to control the camera angles the audience sees. Not every tech the adult industry backs works out.
    • Follow the porn industry, they have an unblemished track record going back decades

      I seem to remember the porn industry backed HD-DVD rather than Blu-ray in the earlier days of that format war.

    • by Trogre ( 513942 )

      That *might* possibly maybe have been true 30 years ago, but it certainly isn't now.

  • OLED (Score:3, Insightful)

    by Travco ( 1872216 ) on Wednesday November 27, 2013 @04:45PM (#45542389)
    OLED is the tops for image. The "depth" of the black pixels makes the OLED image SO superior to anything else, it beats pixel count no end.
    • Another nice thing about OLED is that the dark areas take no power. If I'm getting a 4K TV it will be a BIG one. But I don't want normal TV viewing to be that huge or power hungry; I want to light up just a 1080p area in the center, thus having a smaller TV inside my big TV.
    • I agree. I am far more interested in OLED than I am 4K. 4K is nice to have, but a large screen OLED would be a must have.

      85" 4K OLED FTW!!

  • by JDG1980 ( 2438906 ) on Wednesday November 27, 2013 @04:45PM (#45542391)

    The average viewer would probably notice little difference on a 4K TV even if corresponding content were readily available (which, at this time, it is not). But I'm still hoping for the success of 4K, because it will make a big difference on monitors. Higher production volumes means cheaper panels. Currently, to get a 4K monitor (based on a 32" IGZO panel) that supports 60 Hz, you need to shell out $3500; but once the 4K monitors based on cheaper 39" VA panels hit the market, this should drop to $1000 or less. Seiki can sell TVs with those panels for $500, but the big drawback is that these only support 30 Hz due to limitations of the input controller.

    • by hawguy ( 1600213 )

      The average viewer would probably notice little difference on a 4K TV even if corresponding content were readily available (which, at this time, it is not).

      You'd have to qualify that with screen size. The average viewer sitting 10 feet away from his 40" TV wouldn't notice a difference with 4K content, but give him a 70" or 80" screen, and he will.

      Before I had an HD TV, I had a 30" CRT - if it had been 1080p capable, I wouldn't have noticed much (if any) difference between that and 480p. It wasn't until I upgraded to a 37" 720p LCD TV, and later to a 55" 1080p TV, that I could take advantage of the higher resolutions. 4K is the same - users will need much bigger screens before they see the difference.

    • by Kjella ( 173770 )

      I would say 39" is a bit too much for the average desk though; 30-32" is just right. But as you say, the only 4K monitors in that size are all based on the same IGZO panel (variants from Sharp, Asus, Dell), so they seriously need a competitor. It's silly that you can buy a ca. 55" 4K TV from LG, Toshiba or Samsung for the same money. I guess it won't really hit the mass market until Apple is ready, like with the Retina tablets and laptops, where everyone else followed. Hurry up ;)

    • Exactly correct. From typical TV viewing distances (and even from about half of the widely-considered-to-be-too-close THX recommended viewing range), a person with "perfect" vision will be incapable of distinguishing individual pixels once you're at the pixel density provided by 1080p (i.e. any 1080p TV at a typical viewing distance is a "retina" display), making an increase in pixel density pretty much worthless in that context.

      Since it won't improve the level of detail on the screen, an increase in pixel count doesn't buy the viewer anything at those distances.
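
      A back-of-the-envelope version of that claim (assuming the common ~1 arcminute visual-acuity rule of thumb; these numbers are mine, not the poster's):

      ```python
      import math

      def retina_distance_inches(diagonal_in, h_px, v_px, acuity_arcmin=1.0):
          """Distance beyond which one pixel subtends less than `acuity_arcmin`
          of visual angle, i.e. individual pixels stop being distinguishable."""
          aspect = h_px / v_px
          width = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width, inches
          pitch = width / h_px                                  # pixel pitch, inches
          return pitch / math.tan(math.radians(acuity_arcmin / 60))

      # 55" 1080p vs. 55" UHD, in feet
      print(retina_distance_inches(55, 1920, 1080) / 12)  # ~7.2 ft
      print(retina_distance_inches(55, 3840, 2160) / 12)  # ~3.6 ft
      ```

      By that estimate a 55" 1080p set is already "retina" beyond roughly 7 feet, and the extra 4K pixels only become resolvable inside about half that distance.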

  • by Iskender ( 1040286 ) on Wednesday November 27, 2013 @04:45PM (#45542393)

    "While it's tempting to upgrade your flatscreen to the latest technology,

    I don't have a TV, and don't watch TV/movies other than through my faux-HD monitor.

    I understand not everyone is like me, and that's OK. But in my circle of friends, it's really common to not have a TV and not care. Is this the experience of others, too?

    Also, this whole 4K thing reeks of "we tried to sell 3D, failed, now trying desperately with the next thing..." But please reply if you're really into 4K, too...

    • Why they ever upgraded from photographs to moving pictures I'll never know. I had all the detail I needed. Don't even get me started on color TVs.
  • And it's $3k. That might sound like a lot, but 1080p televisions of the same size seem to go for about $2k from most vendors anyhow...

  • by Russ1642 ( 1087959 ) on Wednesday November 27, 2013 @04:47PM (#45542423)

    Please see last year's posts on why you shouldn't buy a 3D TV.

    • They'll just be ignored by people citing that the anti-HD arguments were wrong. There is no rationalizing with mania.
  • by timeOday ( 582209 ) on Wednesday November 27, 2013 @04:47PM (#45542427)
    1) Remove SD card from your digital camera.
    2) Insert in SD slot on TV.
    3) Enjoy.
  • Based on the amount of input lag present in "1080p" TVs, I can only imagine how bad the input lag is on "4K" TVs. (2 seconds or higher?)

    Of course, this isn't an inherent property of high-resolution panels. It's caused by idiots in management that "insist" that these TVs have worthless image filtering algorithms that distort the picture and lag the image.
    • I thought the lag happened only when 720 is up-scaled to 1080. If a 1080 TV is getting 1080 input then the lag goes away (or becomes negligible). Isn't that so?
  • by bob_super ( 3391281 ) on Wednesday November 27, 2013 @04:51PM (#45542493)

    My living room is too quiet to put an H265 decoder in it.

    • by h4rr4r ( 612664 )

      Please explain?
      You think solid state parts working make noise?

      • I was unnecessarily hyperbolic. It's the H265 compression which requires decent fans to keep the solid-state parts from releasing magic smoke.

        The decompression is a lot easier, but by the time you do it for 4k, I don't know that you can do it fanless yet.

        • by h4rr4r ( 612664 )

          Well, you can always make the box bigger.
          Just make the heat sink a few hundred pounds of copper.

  • When was being an early adopter ever a good idea?

  • From a philosophical point of view, 4K is pure aristocracy. 1080p was about perfecting the traditional TV, and was somewhat justified, but 4K is just "we gave you more pixels, because we can". The good side is of course that selling people another round of screens is good for the economy and employment. Personally, I'm perfectly happy with a high-quality 480p image; going higher than that is fun, but does not bring significant enhancement to my enjoyment.

    Also, the higher resolutions make me desire a higher framerate.

  • It really is 4K by how the naming convention goes today. Today it goes horizontal then vertical, like 1080p = 1920x1080 pixels. Really easy to understand. What they are calling 4K is 3840×2160 pixels. So fuck the advertisers that are trying to sell stuff that most people will assume is something else.
  • After a number of years in the desolate wasteland that is 1080p, we are finally at a convergence of the television and monitor markets with 4K televisions. Based on the ability of Seiki to sell a 4K 39" panel for less than $500, it's likely that 2014 will usher in a series of relatively inexpensive monitors delivering this resolution. Similar 1080p panels are selling for $300, and since the manufacturing isn't significantly more difficult, it's likely that in 12-18 months that pricepoint will be reached for 4K panels as well.

  • As mentioned already, there is hardly any content for a 4K TV. Nobody broadcasts 4K, and there is no 4K cable provider either.

    While there are 4K movie theaters, and some productions are really shot and finished in 4K, most are not. And the current model of the most professional and widely used motion picture camera, the Arri Alexa, is not 4K.

    From their FAQ [arri.com]:

    Will there be a 4K ALEXA?

    [...] Given that 4K digital workflows are still in their infancy, and that for the foreseeable future most productions will finish in 2K or HD, ALEXA is the perfect choice for theatrical features as well as television productions. Furthermore, the ascendance of 3D has resulted in a doubling of image data volumes which further complicates the effective storage, processing and movement of such data. So, for the foreseeable future, ALEXA is ideally suited for 2K or HD workflows in 2D and 3D.

  • by neminem ( 561346 ) <<neminem> <at> <gmail.com>> on Wednesday November 27, 2013 @06:01PM (#45543259) Homepage

    Because you don't need one. This year or ever.

  • by Shaman ( 1148 ) <shaman AT kos DOT net> on Wednesday November 27, 2013 @06:23PM (#45543467) Homepage

    That I can't even tell the difference between 720p and 1080p. Once you get into the high colour and high resolution systems, the eye starts to lose the ability to see any flaws at all. Maybe there is a difference for uses other than recorded video, however.
