Television Japan Sci-Fi

The World's First 8K TV Channel Launches With '2001: A Space Odyssey' (bbc.co.uk) 146

AmiMoJo writes: Japanese broadcaster NHK is launching the world's first 8K TV channel with a special edition of 2001: A Space Odyssey. NHK asked Warner Bros. to scan the original negatives at 8K specially for the channel.

8K offers 16 times the resolution of full HD (1080p), 120 frames per second progressive scan, and 24 channels of sound. NHK is hoping to broadcast the 2020 Tokyo Olympics on the channel.

17 other channels also began broadcasting 4K programming today, according to Japan Times, even though, as Engadget points out, "almost no one has an 8K display, and most of the people who do need a special receiver and antenna just to pick up the signal... Also, HDMI 2.1 hasn't been implemented in any of these displays yet, so just getting the signal from box to TV requires plugging in four HDMI cables."

NHK's channel will broadcast for 12 hours a day, reports the BBC, adding that Samsung already sells an 8K TV for $15,000, and that LG has announced one too, while Engadget reports that Sharp sells one for $6,600.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Be warned, higher resolution is not necessarily better. I had the pleasure of seeing 2001 at a film festival in a good theatre with a nice screen, a good sound system and a good quality print of the film, circa 1990s. Using film in a theatre some things were more clear than on a TV screen at home, even with a large projection screen of the day. For example painted backgrounds were far more obvious. However the theatre environment, screen, sound, sort of compensated and overall it was a net win for the legac
    • by Anonymous Coward

      Uh, you're saying the content was of poor quality and that upping the displayed resolution revealed that. That's not a problem with the resolution.

      But it is a valid gripe that content for 8k doesn't really exist yet.

    • Well, I assume the content would theoretically match the resolution just fine once they are in mainstream release in 2020, but if you want the resolution to match the TV now, you'll have to buy something else. I saw a whole bunch of movie clips on 8K a while ago. Lots of horror movies. They all gave me the creeps in high resolution (as horror movies should, lol) but maybe I will get used to it when it is available down the road.
    • by AmiMoJo ( 196126 )

      It's a problem even for modern shows filmed in 8k. It also requires the camera operator to change the way they work a little, and directors to account for it. For example the cameras use auto-focus because with the small screen mounted on the camera itself there is zero chance of ever getting it focused enough for 8k.

      • by dwywit ( 1109409 )

        I think anyone with the money to use an 8K camera will have a stonking great external monitor for the Director and DoP to use.

        Professional cameras still have focus controls, and facilities for things like follow-focus.

    • Let's take a hypothetical. For all scenes not involving people, they build every single model in a computer and use photon tracing plus photon mapping for each and every frame, so you've got as good a render as the science allows us to produce.

      From there, they've a few options.

      They can transfer shading onto the digitized frames, to bring the dynamic range up to whatever they like. That won't alter the content but will restore colours and intensities to something nearer the original.

      They can repair film defects with

      • "Radiosity"? "Renderman-style shading"?

        You're about 10 years behind modern thinking when it comes to production VFX rendering. Almost everything is path tracing with postprocessed denoising now.

        • by jd ( 1658 )

          Path tracing is different from any other form of ray tracing how? Still has the same limitation, because light doesn't reflect in lines. There is no path.

          • Path tracing subsumes both Whitted-style ray tracing and radiosity. It solves the rendering equation by constructing a random variable, the mean of which is the integral.

            Having said that, the main reason the industry (including RenderMan) has moved over to path tracing isn't primarily to get reflection and refraction right; it's to get GI and subsurface scattering right.
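
            A toy 1D analogue of that "random variable whose mean is the integral" construction (not renderer code, just the estimator idea; the helper name is mine): integrate f(x) = x^2 over [0, 1], whose exact value is 1/3.

              import random

              # Monte Carlo estimation: the sample mean of f(X)/p(X) converges to the integral of f.
              def mc_integrate(n):
                  total = 0.0
                  for _ in range(n):
                      x = random.random()   # X ~ Uniform(0, 1), so p(x) = 1 on [0, 1]
                      total += x * x        # accumulate f(X) / p(X)
                  return total / n          # sample mean -> 1/3 as n grows

              print(mc_integrate(100_000))  # prints roughly 0.333

            A path tracer does the same thing, except the random sample is a light path and f is that path's throughput under the rendering equation.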

    • by Hadlock ( 143607 )

      We bought a 65" Name Brand 4K HDTV online during holiday season 2016 for $900 with "partial" (90%) HDR color/brightness/contrast. Even two years later it's still in the top third of 65" 4K TVs you can buy.

      This year we bought a 1080p Nintendo Switch.

      There is a dramatic difference in the quality/sharpness in the UI. It is about 15' from TV wall to back of couch, probably 14' from screen to eyeball. Even though it's wall mounted, we had to buy a larger, 67" wide cabinet below it to fit properly,

      • by dwywit ( 1109409 )

        If I had "Texan mega bucks" worth of income, I'd go digital cinema.

      • Been to many mega large homes in Texas, and lived in one too, and have never seen anyone with a TV screen larger than 60-70ish inches. That's because not everyone in the real world is an actual TV-specs-worshipping nerd. Saw someone with a dedicated media room just once.

    • by vlad30 ( 44644 )
      Agreed. However, it might cure porn addiction when you get to see what the actors/actresses really look like in detail, or maybe the Japanese need the higher resolution to get rid of the pixelation problem in their porn.
  • by fahrbot-bot ( 874524 ) on Saturday December 01, 2018 @03:59PM (#57732840)
    ... with current ISP bandwidth and monthly data limitations. Not to mention the lack of 8k TVs and Blu-ray devices -- or affordable ones anyway. And... there's no real benefit to 8k for a typical home setting. So, who's this for? People with money to burn?
    • by Kjella ( 173770 )

      ... with current ISP bandwidth and monthly data limitations. Not to mention the lack of 8k TVs and Blu-ray devices -- or affordable ones anyway. And... there's no real benefit to 8k for a typical home setting. So, who's this for? People with money to burn?

      Twenty years ago I was on 64 kbps ISDN and DVDs were the hot new shit; now I've got fiber and there's 4K on Blu-ray and Netflix. Today it's for the very early adopter... in 10 years? I dunno, 1080p -> 4K went much quicker than I thought considering how much 1080p beat SD.

      • There is practically no benefit to even a 4k screen considering typical screen sizes and viewing distances; that's simply how much the human eye can resolve. I sit 6 ft away from my 55-inch 1080p screen, but to see the extra detail of 4k at this size, I'd have to either sit 4.5 ft away, or keep sitting at 6 ft while replacing the TV with a 70+ inch one.

        I dunno, 1080p -> 4K went much quicker than I thought considering how much 1080p beat SD

        But did it? Where

  • by fahrbot-bot ( 874524 ) on Saturday December 01, 2018 @04:03PM (#57732846)
    Look. I like this movie, but why didn't they pick something with more action, like March of the Penguins?
    • by Anonymous Coward

      Probably because MOTP was filmed in 35mm and 2001 in Super Panavision 70 (https://en.wikipedia.org/wiki/Super_Panavision_70), i.e. on a 65mm negative? Probably easier/more suitable to remaster at that size. My guess anyway.

    • by drnb ( 2434720 )
      For iconic visuals the go-tos are 2001 and Lawrence of Arabia. I'd lean towards the latter but the former is politically safer.
    • Modern stuff is shot on digital cameras that are no better in quality than the resolution the images are designed to be shown at.

      Old film stock, particularly if it was good quality, doubly so if it was also a larger format, supported a very high dynamic range and a reasonably impressive resolution. You need an 80 megapixel camera to match the very best film camera.

      So it depends on how good 2001's film stock was.

      You must also consider audience. Those likely to have the money will be the richer end of the arthouse types, and 2001 is an arthou

    • There are literally thousands of films with amazing visuals that could be used for a first 8K transmission. Personally, March of the Penguins would be pretty far down on the list.

      Thinking of great visuals, I would suggest:
      - Empire Strikes Back
      - The Fifth Element
      - Independence Day
      - drnb suggested Lawrence of Arabia
      - Thunderball
      - Life of Pi
      - 20,000 Leagues Under the Sea
      - Saving Private Ryan
      - The Sound of Music
      - Close Encounters of the Third Kind
      - Apocalypse Now
      - Raiders of the Lost Ark
      and so on...

      I think wha

      • Personally, March of the Penguins would be pretty far down on the list.

        It was a joke commenting on the actual lack of action in 2001: ...

    • by AHuxley ( 892839 )
      The content was 8K-ready and broadcast-ready in terms of resolution, preservation and quality.
      A lot of other movies might have legal, resolution, restoration and ownership problems.
      Movies are being prepared for 4K media projects; NHK's 8K content will be ready for its own network use.
  • by Anonymous Coward on Saturday December 01, 2018 @04:07PM (#57732864)

    This is silly. Please someone instead work on increasing the color resolution (bit depth) instead, and turn down the digital compression.

    I'd much rather see 2k uncompressed with 16-bits per channel of color. That's what a videophile standard should be about.

    • There's no need for compression in a movie. You're not transmitting over slow data links unless it's a broadcast. Lossless compression is tolerable but what's the point?

      I agree on colour. OpenEXR is good and is used by ILM, who also invented it. Any 48-bit format should work, but to get movie-level dynamic range, you need a mantissa-exponent format for your three colours.
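
      A quick sketch of why the mantissa-exponent layout matters, using NumPy's 16-bit float (the same bit layout as OpenEXR's "half" channel type):

        import numpy as np

        # A half float spends 5 bits on an exponent and 10 on a mantissa, so its normal
        # range runs from about 6.1e-5 to 65504 -- roughly 30 stops of dynamic range,
        # versus ~16 stops (65536:1) from a plain 16-bit integer with uniform steps.
        h = np.finfo(np.float16)
        print(h.tiny, h.max)                          # 6.104e-05 65504.0
        print(np.log2(float(h.max) / float(h.tiny)))  # ~30.0 stops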

      • by noodler ( 724788 )

        How do you mean there's no need for compression?
        All the video you get to see at home is compressed in some way. Most video people get to see will have had several compression schemes applied to it.

        An 8k video at 24fps and 8 bits per pixel takes up more than 6Gb/s of bandwidth.
        And that already has chroma subsampling compression applied.
        If you have more bits per pixel then it gets bigger of course.
        16 bit per channel RGB would take the video up to 38Gb/s.

        That would be pretty much unworkable in a typical home setti
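
        For reference, the arithmetic behind those figures, as a rough sketch (raw pixel payload only; audio, blanking and container overhead are ignored, and the helper name is mine):

          # Raw (uncompressed) video bitrate = width x height x frame rate x bits per pixel.
          def raw_gbps(width, height, fps, bits_per_pixel):
              return width * height * fps * bits_per_pixel / 1e9

          print(raw_gbps(7680, 4320, 24, 8))    # ~6.4 Gb/s: 8k at 24 fps, 8 bits per pixel as above
          print(raw_gbps(7680, 4320, 24, 48))   # ~38.2 Gb/s: 16 bits per channel, full RGB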

        • by jd ( 1658 )

          When you've 10 gigabits to the home, 6 gigabits isn't so bad.

          Interlaced degrades resolution, but nobody needs the full resolution.

          You make the assumption that lossy compression and lossless compression are the same.

          Efficient representation is not compression. OpenEXR doesn't use image compression but supports a much wider dynamic range than the bit count suggests.

    • This is silly. Please someone instead work on increasing the color resolution (bit depth) instead

      Beyond bit depth, it's important to consider dynamic range and color gamut as well.

      HDMI has slowly addressed both of those things - with HDMI 1.3, 8- to 16-bit-per-channel color was supported, and UltraHD covers the P3 color gamut.

    • by AmiMoJo ( 196126 )

      8k does include 12 bits per channel of colour information; 16 bits is pointless. Its colour model covers 76% of the human-visible colour spectrum, compared to about 50% for digital cinema/Adobe RGB and 36% for HD.

    • Please someone instead work on increasing the color resolution (bit depth) instead

      Err ... done that. Rec.2020 with its 12-bit encoding (you don't want to go higher, it's just a waste) and wide colour space is the standard for UHD, and you can happily enjoy it with a Blu-ray player and a not offensively expensive TV.

      I'd much rather see 2k uncompressed with 16-bits per channel of color. That's what a videophile standard should be about.

      I'm sure you would however I don't want to change discs 10 times while watching a movie.

    • Comment removed based on user account deletion
    • There is WCG (wide color gamut), and two or three (depending on how you count) HDR schemes: Dolby Vision, HDR10 and HDR10+. Don't buy a 4k TV without one or both of these. Looks like HDR10 will win, because it's cheaper. I just bought a 4k TCL Roku TV with both schemes. It is awesome. I bought one that was too large though - first world problems.
  • by 93 Escort Wagon ( 326346 ) on Saturday December 01, 2018 @04:15PM (#57732896)

    The manufacturers have to keep coming up with some differentiator in order to entice people to buy their new products... I get that. But it does seem kind of pointless from the point of view of the typical consumer.

    Of course, I realize what they’re really doing is pandering to those people who think “typical consumer” is a derogatory phrase - those folks who are convinced other people care about what television they own.

    • I agree [slashdot.org] with what you're saying, but the problem is that in their drive to make this a 'new standard', they force people who otherwise were perfectly satisfied with what they have, to either buy something they can't or don't want to afford, or be left without. I suppose there'll be 'converter boxes' like they had at the OTA HD changeover, but that'll break things for many people just like those converter boxes did. In my case for instance, I'd have to toss out TiVo because the internal tuners would no longe
      • by Megane ( 129182 )
        You may or may not know that there is already talk of replacing the current ATSC standard with a new and incompatible one. Current 8VSB channels would be ghettoized into legacy transmitters, presumably at lower bandwidth than now. The only good thing about it for me is that I mostly watch recordings from my MythTV system, and I can just swap out the tuner cards, though it may also need a newer version of Linux to support them.
        • Yes, I've heard this, and I'm sure there will be push-back on that, for some reasons I've already stated: people who already invested in TVs that work just fine and will continue to work fine for years and years to come, now being told, again, too soon, that they're obsolete.
      • Force? How do they force people to buy a new TV?

  • At what point does this simply start to make the film grains bigger? I suppose that may still be a benefit, making a digital transfer look even more like projected film and less like pixels. A Cinerama-sized, highly curved screen (as I saw it originally) is far too big for my house, though; probably needs to be VR to get a theater experience in a home.
    • I was curious about that and did a quick check and there is no simple answer.

      Film grain size on a frame depends on a number of factors, including when the film was shot, the size of the negative (16mm, 35mm or 70mm) and the sensitivity (ISO rating) of the film; the higher the sensitivity, the larger the grains. How visible the grains are is also affected by the speed of the filming (faster means fewer grains visible), how the image is placed on the film, and how the scanning was carried out.

      I think the short answer

    • by Anonymous Coward

      Even 35 mm film is capable of being far higher than 8K in resolution. If it's shot on IMAX or medium format I imagine you need 100K resolution. If it's on 1960s era color film, maybe even 4K is enough.

      A bigger issue is that on a TV, you don't really need more than 720p. OK, maybe 1080p, depending on TV size and how far away you sit. 8K is crazy overkill except for specialist applications like projecting it huge, or being able to observe fine details in photographs in a specialized work environment.

      • IMAX is estimated to be around 18K for the original negative, and less than 12K for the final print that gets projected at the cinema.

        Regarding TV, it depends on your eyes, but I can clearly see the difference between 720p and 1080p, and on a big enough screen (around 50"), between 1080p and 2160p.

        • by Megane ( 129182 )
          That is meaningless without the distance at which you can see the difference. Of course you can see the difference at a few inches, or even at desk distance (2-3 feet), but can you see the difference at couch distance (6-10 feet)? Don't worry, as your eyes age, you will understand.
  • Even the 1080p and 2160p re-mastered BluRay and UHD-BluRay editions already show the limitations of the analog film from back then. The movie is fine, and certainly one of the best produced of its time, but it definitely does not reach anywhere near true 8k resolution.

    Not even the 8k-sample-video from the ISS recently released by NASA demonstrates proper 8k resolution - most parts cannot even use a 4k TV to its fullest.
  • by Rick Schumann ( 4662797 ) on Saturday December 01, 2018 @04:31PM (#57732972) Journal
    Is this really going to be better? Or is it just a solution in search of a problem, driven by an industry's need to continue to increase profits?

    Unless you have a theatre-sized 8k screen, does this really make any difference over 1080?
    What about OTA signals? How much bandwidth does an 8k full-resolution signal need? How much will compression affect picture quality during motion?
    Then there's cable and satellite companies. I can't speak for satellite, but I know that the dirty little secret of cable TV is that the content is re-compressed to within an inch of its life, so they can fit those hundreds of channels into the available bandwidth. The result is poor picture quality during motion. How bad will it be for 8k?
    Even over the Internet, bandwidth will be large, won't it? Again: compression. Also: data caps.

    I think the TV industry knows that once someone buys a TV, that's that for up to, say, 10 years? If nothing changes, and the set still works like it's supposed to, no one goes out and buys a replacement. If you build shitty TVs that break every couple years, people complain and won't buy from you, so you can't just build poorly and get repeat sales that way. So, hey, let's keep 'upgrading' the standards every so often, just so we can make people feel like their current set is 'obsolete', regardless of whether it's still in perfect working order, so we can sell them a brand-new one! Brilliant idea!

    Except I think it's already at the point of diminishing returns. Does the average person even care about this? Or is 1080 more than enough? Does the average person have a ten foot TV in their house? What really makes this worth having? Just not convinced it's worthwhile. Going from a CRT TV that could only handle standard definition NTSC signals to an HDTV that can handle 1080p was great, don't regret it, but this? Not convinced.
    • How much will compression affect picture quality during motion?

      Compression is not evil, it's a savior. If the bitrate isn't there to describe the detail and motion, it will look shitty instantly. So if people aren't giving it the bitrate it needs, they're at fault, not the codec.

      • You're missing the point I was making. When, 10 years ago, I dumped cable TV and started using an antenna for OTA broadcast TV, the thing I noticed immediately was that during motion on the screen the compression artifacts were at least an order of magnitude less than with cable. Cable TV (and perhaps satellite) re-compresses the content to a higher extent to fit all those hundreds of channels onto the limited bandwidth of the wire, so they can claim 1080 resolution on as many channels as possible. It's a f
        • Ok, but framing it as 'it's compression's fault' instead of 'they crush the bitrate' is just irritating, since codecs are fantastic from my point of view.
          • You still don't get it.

            Hey Fred we only have so much bandwidth on the wire and we need to add another hundred channels!
            No problem Steve, just crank the compression rate from 50% to 90%, idiot customers won't know the difference, LOL!

            Do you get it now???

  • by Ecuador ( 740021 ) on Saturday December 01, 2018 @05:00PM (#57733090) Homepage

    OK, this is just silly. Apart from the fact that we switched from vertical to horizontal resolution to get bigger numbers, 4k was already beyond the limit of the resolution I can discern without sitting unusually close to a monitor. I don't know if the rest of humanity has some sort of super-vision, but from my own experience I find that I certainly can't see better than the 1 arcminute resolution often quoted - probably a little worse. And that resolution, for a 50 inch 8k TV, would mean I'd have to be sitting 0.5m away! Sure, if you are one of those who claim they can "see" 0.5 arcminute detail, you could marvel at the same 50 inch TV from as far away as... 1m!
    It all seems to me like the ol' "fuck it, we'll do 5 blades" gimmick. I could see some value in 8k media, which is reportedly about the full effective resolution of 65mm negative film stock (only IMAX 70mm is higher res at around 12k, as it runs the same 65mm film horizontally instead of vertically), for example for Cinema projection, or for allowing zooming in on details for smaller monitors. But 8k TVs are just silly. And you just know somebody will eventually manage to put 8k on a phone screen and boast about it..
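
    The geometry behind those distances, as a small sketch (assuming a 16:9 panel and the one-arcminute acuity figure above; the function name is just illustrative):

      import math

      # Distance at which a single pixel subtends a given visual angle (default: 1 arcminute).
      def max_useful_distance_m(diagonal_inches, horizontal_pixels, arcmin=1.0):
          panel_width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)
          pixel_pitch_m = panel_width_m / horizontal_pixels
          return pixel_pitch_m / math.radians(arcmin / 60.0)

      print(max_useful_distance_m(50, 7680))       # ~0.5 m for a 50" 8K panel
      print(max_useful_distance_m(50, 7680, 0.5))  # ~1.0 m if you can resolve half an arcminute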

    • For things where details are not terribly important, like live action stories, 480p is fine. Hell, even primitive stick figures are fine to tell stories (thanks Randall!).

      If you are displaying information where each detail has meaning, finer resolution means being able to put more information into a display of size X.

      You are arguing from use-case 1 and completely disregarding use-case 2... which is weird, because this is supposed to be a site for people interested in details, not pretty pictures.

      People like

  • 8K Fallacy (Score:5, Informative)

    by markdavis ( 642305 ) on Saturday December 01, 2018 @05:14PM (#57733152)

    My estimates based on a nice, large 70" TV at a normal 10 foot viewing distance for a random set of people (with all content being a mix of typical movie material, with high-quality recording/encoding, and high bitrate, identical in every way except resolution):

    20% of people can NOT tell any res difference between 480P native and 720P native. This was HUGE.

    50% of people can NOT tell any res difference between 720P native and 1080P native. This was good.

    94% of people can NOT tell any res difference between native 1080P and native 4K.

    98% of people can NOT tell any res difference between 1080P upscaled to 4K and native 4K.

    99.9% of people can NOT tell any res difference between native 4K and native 8K.

    Now, in special cases, with huge, huge screens and sitting close, 8K might have some tiny value. But as it is, quality 1080P content, upscaled to a modern 4K TV is "good enough" for nearly everyone. 4K native content will please only a very few. 8K for any normal purpose is just a total waste of bandwidth/storage/money. It is just a meaningless spec war that confuses and robs consumers, or gives techno-ego-snobs something to brag about, even though none of them can tell any difference, either.

    What *has* been helpful is HDR and increased color info... but even that is minor compared to what came before; and only helpful to a limited point. So what's next on the marketing train? 20 trillion colors more than the human eye can distinguish? Refresh rates 1,000 times higher than the human brain can ever distinguish?

    • I don't care what's coming next. I'm staying at 1080p, which is more than good enough for my eyes.

      • by mentil ( 1748130 )

        I upgraded my eyes to 8k, so they're future-proof.
        Now 16k screens, that's a fool's errand that will never take off.

    • 50% of people can NOT tell any res difference between 720P native and 1080P native. This was good.

      I'd put this closer to 90%-95%. The test I always use is that some of the major TV networks broadcast in 720p, some of them broadcast in 1080i (which your TV converts to 1080p). I ask people to identify which networks are 720p, which are 1080i. Despite having watched these networks on their HDTVs for a decade, nobody has been able to answer me correctly. Try it yourself - of ABC, CBS, Fox, and NBC, which are 7

      • One of the problems is compression. A 1080p video doesn’t look that much better than 720p if both are using the same bitrate (which they probably are on cable). I used to have Comcast TV - it looked awful at any resolution, because all they cared about was jamming in as many channels as possible. But for something like YouTube that gives you higher bitrates to go along with the increased resolution, the difference is night and day. Sure, that makes it an unfair comparison, but if you were to select a
    • by ledow ( 319597 )

      I have a 75" projected screen.

      Not only can I see the difference between 480 and 720... the really important thing... is that I don't care.

      If I sat and squinted at dots, sure, I could see them. But I know the image is made of dots. It's always been made of dots. My old CRT had coloured dots just the same (i.e. three colours).

      I didn't care then, and I don't care now. Because... when those dots are moving, you can't see them.

      The real test is "at what point were you swearing at your TV because it wasn't

      • by Megane ( 129182 )
        Back in the early 2Ks, I got a Sony Wega "HD-Ready" 4:3 CRT (that weighed about 80 kilos), and a separate HD tuner. It wasn't long before I got tired of the TV switching scan modes all the time, especially between wide and 4:3, so in the end I set the tuner to always output 480p. It was still a good picture with a rock-solid DVD-quality 480p as opposed to a snowy NTSC 480i.
    • by AmiMoJo ( 196126 )

      Even if you sit too far away for your eyes to see every pixel, 8k resolution still has advantages. Due to the way digital sampling works the maximum frequency it can reproduce is half the sampling frequency, called the Nyquist frequency.

      So a 1920 pixel wide image can only reproduce detail down to about 960 cycles across its width, meaning that even if your eyes can't see every pixel they will still see the aliasing effects of any detail finer than 2 pixels wide. Increasing the resolution reduces the aliasing.

      With 8k you also get the benefit of 120 frames per second motion, which many TVs already fake by interpolating 30 frames per second material (and thus introducing more aliasing, typically visible as halos around moving objects).
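
      A small numerical illustration of that folding effect (a sketch with NumPy, using a pure sinusoidal test pattern as the "too fine" detail):

        import numpy as np

        fs = 1920          # horizontal samples (pixels) across the image width
        f_detail = 1200    # detail frequency in cycles across that width; the Nyquist limit is 960
        x = np.arange(fs) / fs
        sampled = np.cos(2 * np.pi * f_detail * x)

        # The energy folds down to fs - f_detail = 720 cycles: detail finer than Nyquist
        # isn't dropped cleanly, it reappears as a coarser, false pattern (aliasing).
        spectrum = np.abs(np.fft.rfft(sampled))
        print(spectrum.argmax())   # 720, not 1200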

      • >"With 8k you also get the benefit of 120 frames per second motion, which many TVs already fake by interpolating 30 frames per second material (and thus introducing more aliasing, typically visible as halos around moving objects)."

        Actually, I *despise* motion interpolation or high frame rates. Absolutely hate it. So I turn all that off and watch at 24 frames (or native 30 of TV sources). I don't know why I hate it so much- I have tried over and over again to watch it, and to me it looks "too real" whi

        • by AmiMoJo ( 196126 )

          It depends on the material. For movies many people prefer 24 fps to give it that distinct look. For sport 120 FPS is great.

          And actually it's not a binary choice between 120Hz motion on or off. Most TVs allow you to choose the "strength" of the effect, which mostly boils down to how far something can move before it isn't interpolated any more. I prefer a fairly low setting, so you don't get that "soap opera" effect but small motions are also clearer than an LCD can normally provide, resulting in a display th

          • Yep. Mine is a Samsung. So they have a funky name for the setting, but it allows for a strength. I could just tolerate the weakest setting and tried that for a month. Eventually I just turned it off because it was introducing some slight but noticeable other artifacts.

            Oh, remember how I was saying that xx% of people can't notice a difference in fine resolution. The same is with the motion interpolation. When my friend's family got a new TV, that damn interpolation is on by default. I was there watchi

    • by ljw1004 ( 764174 )

      My estimates based on a nice, large 70" TV at a normal 10 foot viewing distance for a random set of people... Now, in special cases, with huge, huge screens and sitting close, 8K might have some tiny value. But as it is, quality 1080P content, upscaled to a modern 4K TV is "good enough" for nearly everyone

      I think your post amounts to "I conjecture that my experience and that of my friends is typical, and we don't benefit from higher resolutions, so likely most other people won't."

      As for me, I and my friends game on 120"+ projection screens at 6-8' distance, where the pixels of 1080p are indeed very noticeable. I sit along with 15% of the audience in the front five rows of an IMAX theater (for me it's because having the screen fill my peripheral vision makes it feel more immersive), and again the pixels are n

    • Yes. We get it. Since moving images (movies, tv shows) are not improved, there is absolutely no use for a display that is 8k. Nobody ever uses these for the display of information, they are only used to display moving images where details are not necessarily meaningful. All display devices are to be measured on how useful displaying moving images is. Nothing else matters.

      CGA was the pinnacle of displaying information. Being able to actually view the picture elements (pixels for you newbs) provides definitio

      • >"Yes. We get it. Since moving images (movies, tv shows) are not improved, there is absolutely no use for a display that is 8k"

        The article is about 8K TV channels (video). Not 8K touchscreens or 100" computer monitors...

  • Fad (Score:4, Insightful)

    by Artem S. Tashkinov ( 764309 ) on Saturday December 01, 2018 @05:37PM (#57733240) Homepage

    When you're a few meters away from a 60" 4K screen you already cannot see individual pixels, so any sharpness increase beyond that doesn't really make a lot of sense unless you're looking at the screen with a spyglass.

    So, what's the point of 8K resolution for the average consumer again? I can imagine it being useful for medical professionals but beyond that? Not really sure.

    • It's useful for ISPs to rent a bigger pipe to you every month. It's also useful for Netflix/Hulu/etc to make you switch to a more expensive account. Etc.

      But for you? Unless you like sitting right in front of your TV with a magnifying glass, it's useless.

  • Two minutes.....

  • On TV on 4k, let alone 8k
  • Contrary to what I originally believed, higher resolution makes a massive difference.

    I bought my first retina display iMac last year, and man does the screen look crisp. You notice it mostly in text or small details, that is why most test pictures don't show a difference.

    I'd like to see 8K in action. Maybe no difference to 5K, but maybe I'd be surprised.

    • Higher resolution makes a clearly discernible difference for text and computer-generated graphics, such as in video games. However, we're discussing movies and TV here. The benefits of 4k or 8k are far from clear in the real world, except apparently for the folks building a cinema with a wall-sized display.

      • by Tom ( 822 )

        Well, I happen to have a cinema with a wall-sized display. (around 450 cm diagonal) I can easily tell the difference between full HD and anything less. Nature movies ask for full HD, and I'd love to watch them in 4K but my projector doesn't do 4K. The next one will.

        A 4K resolution at around 4m width gives me pixels of 1 mm size.
        8K resolution would cut that in half. I don't think I'll see much of a difference (viewing distance is almost 5m) but it could make scenes appear more crisp.

        I'm not some Bill Gates.

    • My current resolution is 5760 x 1080, 3 HD screens in Eyefinity mode, on my gaming PC. What's that, 5k? And a bit? :)

      I have 4K on my HTPC.

      Eyefinity (or the Nvidia equivalent) is great for games, particularly car racing games - it's like looking out of the windscreen. When you add the force feedback steering wheel and pedals it is VERY immersive (and yes, I have 5.1 sound using my old home theatre amp & sub and the Logitech 5.1 speakers that I haven't blown up yet to make it even more immersive).

      My tv tha

  • There was an interesting article in SMPTE Journal recently about 8K (dead tree magazine or paywalled, so no link), pointing out two problems with 8K TV beyond the obvious ones of lack of bandwidth to the home and content.

    The first is motion blur. Still images on an 8K monitor look stunning, particularly if WCG (wide color gamut) and HDR (high dynamic range) are also part of the display. However, once the image starts moving (which is the point of TV after all) motion blur becomes a real problem. If you keep
