The World's First 8K TV Channel Launches With '2001: A Space Odyssey' (bbc.co.uk) 146
AmiMoJo writes:
Japanese broadcaster NHK is launching the world's first 8K TV channel with a special edition of 2001: A Space Odyssey. NHK asked Warner Bros. to scan the original negatives at 8K specially for the channel.
8K offers 16 times the resolution of full HD (1080p), 120 frames per second progressive scan, and 24 channels of sound. NHK is hoping to broadcast the 2020 Tokyo Olympics on the channel.
17 other channels also began broadcasting 4K programming today, according to Japan Times, even though, as Engadget points out, "almost no one has an 8K display, and most of the people who do need a special receiver and antenna just to pick up the signal... Also, HDMI 2.1 hasn't been implemented in any of these displays yet, so just getting the signal from box to TV requires plugging in four HDMI cables."
NHK's channel will broadcast for 12 hours a day, reports the BBC, adding that Samsung already sells an 8K TV for $15,000, and that LG has announced one too, while Engadget reports that Sharp sells one for $6,600.
Be warned, higher res not necessarily better ... (Score:1)
Re: (Score:1)
Uh, you're saying the content was of poor quality and that upping the displayed resolution revealed that. That's not a problem with the resolution.
But it is a valid gripe that content for 8k doesn't really exist yet.
Re: (Score:1)
Re: (Score:2)
It's a problem even for modern shows filmed in 8k. It also requires camera operators to change the way they work a little, and directors to account for it. For example, the cameras use auto-focus, because with the small screen mounted on the camera itself there is zero chance of ever getting it focused well enough for 8k by eye.
Re: (Score:2)
I think anyone with the money to use an 8K camera will have a stonking great external monitor for the Director and DoP to use.
Professional cameras still have focus controls, and facilities for things like follow-focus.
Re: Be warned, higher res not necessarily better . (Score:2)
Let's take a hypothetical. For all scenes not involving people, they build every single model in a computer and use photon tracing plus photon mapping for each and every frame, so you've as good a render as we've the science to produce.
From there, they've a few options.
They can transfer shading onto the digitized frames, to bring the dynamic range up to whatever they like. That won't alter the content but will restore colours and intensities to something nearer the original.
They can repair film defects with
Re: (Score:2)
"Radiosity"? "Renderman-style shading"?
You're about 10 years behind modern thinking when it comes to production VFX rendering. Almost everything is path tracing with postprocessed denoising now.
Re: (Score:2)
Path tracing is different from any other form of raytracing how? Still has the same limitation, because light doesn't reflect in straight lines. There is no path.
Re: (Score:2)
Path tracing subsumes both Whitted-style ray tracing and radiosity. It solves the rendering equation by constructing a random variable, the mean of which is the integral.
Having said that, the main reason why the industry (including Renderman) has moved over to path tracing isn't primarily to get reflection and refraction right, it's to get GI and subsurface scattering right.
Re: (Score:2)
We bought a 65" Name Brand 4K HDTV online during holiday season 2016 for $900 with "partial" (90%) HDR color/brightness/contrast. Even two years later it's still in the top third of 65" 4K TVs you can buy.
This year we bought a 1080p Nintendo Switch.
There is a dramatic difference in the quality/sharpness of the UI. It is about 15' from TV wall to back of couch, probably 14' from screen to eyeball. Even though it's wall mounted, we had to buy a larger, 67"-wide cabinet below it to fit properly,
Re: (Score:2)
If I had "Texan mega bucks" worth of income, I'd go digital cinema.
Re: (Score:2)
Been to many mega-large homes in Texas, and lived in one too, and I have never seen anyone with a TV screen larger than 60-70ish inches. That's because not everyone in the real world is an actual TV-specs-worshiping nerd. Saw someone with a dedicated media room just once.
Re: (Score:2)
Re: High-Resolution Delusions (Score:2)
You understand your website has more errors than an Enron account book?
Re: (Score:2)
It's like trying to use stone age arrow heads with a carbon fiber compound bow.
That stone age obsidian arrowhead is still highly effective, razor sharp, nicely weighted for penetration (inertia), etc. :-)
Streaming 8k vid will be fun ... (Score:3)
Re:The same thing was said... (Score:5, Insightful)
Beyond 4k we've hit the point of diminishing returns. The jump from SD to 720p had several advantages. First, the switch from composite to component (huge analog quality difference, significantly better color representation), or from analog to digital in general (no signal degradation). Next was the jump from interlaced to progressive scan. But the jump from 4k to 8k is only higher pixel density, when we've already got extremely crisp and clear visuals. This jump won't matter anywhere near as much.
Re: (Score:2)
I'm looking forward to affordable 8k monitors. 4k isn't enough at a decent size, say 28", at normal viewing distances. It's better than 1080p but you can still see the pixels and aliasing. 8k monitors are more like the kind of quality you get from a decent phone display.
Re: (Score:1)
Re: (Score:2)
... with current ISP bandwidth and monthly data limitations. Not to mention the lack of 8k TVs and Blu-ray devices -- or affordable ones anyway. And... there's no real benefit to 8k in a typical home setting. So, who's this for? People with money to burn?
Twenty years ago I was on 64 kbps ISDN and DVDs were the hot new shit; now I've got fiber and there's 4K on Blu-ray and Netflix. Today it's for the very early adopter... in 10 years? I dunno, 1080p -> 4K went much quicker than I thought, considering how much 1080p beat SD.
Re: (Score:3)
There is practically no benefit to even a 4k-resolution screen at the typical screen size and viewing distance. That's the limit of how much the human eye can resolve. I sit 6 ft from my 55-inch 1080p screen, but to be able to tell the better detail of a 4k screen this size, I'd have to either sit 4.5 ft away, or keep sitting at 6 ft while replacing the TV with a 70+ inch one.
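The arithmetic behind those distances can be sketched with the common 1-arcminute visual-acuity rule of thumb. This is a back-of-the-envelope illustration; the function name and the acuity figure are my assumptions, not from the comment:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, a common acuity rule of thumb

def max_useful_distance_m(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest distance at which one pixel still subtends ~1 arcminute."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_m = width_in * 0.0254 / horizontal_px
    return pixel_m / math.tan(ARCMIN)

# 55" screen, as in the comment above
d1080 = max_useful_distance_m(55, 1920)
d2160 = max_useful_distance_m(55, 3840)
print(f"1080p: {d1080:.1f} m (~{d1080 / 0.3048:.0f} ft)")
print(f"4K:    {d2160:.1f} m (~{d2160 / 0.3048:.0f} ft)")
```

This gives roughly 2.2 m (about 7 ft) for 1080p and half that for 4K, in the same ballpark as the distances quoted above.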
I dunno, 1080p -> 4K went much quicker than I thought considering how much 1080p beat SD
But did it? Where
Why 2001: A Space Odyssey? (Score:5, Funny)
Re: (Score:1)
Probably because MOTP was filmed in 35mm and 2001 in https://en.wikipedia.org/wiki/Super_Panavision_70 (65mm negative). Probably easier/more suitable to remaster at that size. My guess anyway.
Re: (Score:3)
Re: Why 2001: A Space Odyssey? (Score:3)
Modern stuff is filmed on digital devices whose quality is no better than the resolution the images are designed to be shown at.
Old film stock, particularly if it was good quality, doubly so if it was also medium format, supported a very high dynamic range and a reasonably impressive resolution. You need an 80 megapixel camera to match the very best film camera.
So it depends on how good 2001's film stock was.
You must also consider the audience. Those likely to have the money will be the richer end of the arthouse types, and 2001 is an arthouse film.
Why Not 2001? (Score:3)
There are literally thousands of films with amazing visuals that could be used for a first 8K transmission. Personally, March of the Penguins would be pretty far down on the list.
Thinking of great visuals, I would suggest:
- Empire Strikes Back
- The Fifth Element
- Independence Day
- drnb suggested Lawrence of Arabia
- Thunderball
- Life of Pi
- 20,000 Leagues Under the Sea
- Saving Private Ryan
- The Sound of Music
- Close Encounters of the Third Kind
- Apocalypse Now
- Raiders of the Lost Ark
and so on...
I think wha
Re: (Score:3)
Personally, March of the Penguins would be pretty far down on the list.
It was a joke commenting on the actual lack of action in 2001: ...
Re: (Score:3)
A lot of other movies might have legal, resolution, restoration and ownership problems.
Movies are being readied for 4K media projects. Their 8K content will then be ready for their own network use.
How about color depth and compression? (Score:5, Insightful)
This is silly. Please someone instead work on increasing the color resolution (bit depth) instead, and turn down the digital compression.
I'd much rather see 2k uncompressed with 16-bits per channel of color. That's what a videophile standard should be about.
Re: How about color depth and compression? (Score:2)
There's no need for compression in a movie. You're not transmitting over slow data links unless it's a broadcast. Lossless compression is tolerable but what's the point?
I agree on colour. OpenEXR is good and is used by ILM, who also invented it. Any 48-bit format should work, but to get movie-level dynamic range you need a mantissa-exponent format for your three colours.
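The mantissa-exponent idea above is exactly what OpenEXR's 16-bit "half" format is (IEEE 754 binary16). A quick sketch of why a float encoding beats a plain 16-bit integer for dynamic range, using Python's struct module, which supports binary16 via the 'e' format:

```python
import struct

def roundtrip_half(x):
    """Encode a float as IEEE 754 binary16 (OpenEXR 'half') and decode it back."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A 16-bit integer covers its range in fixed linear steps; a half float
# keeps roughly 3 decimal digits of precision from tiny values up to 65504,
# so deep shadows and bright highlights survive in the same frame.
for v in (0.0001, 0.5, 1.0, 1000.0, 65504.0):
    print(v, '->', roundtrip_half(v))
```

Values across five orders of magnitude round-trip with small relative error, which is the property you want for scene-referred movie imagery.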
Re: (Score:1)
How do you mean there's no need for compression?
All videos you get to see at home are compressed in some way. Most videos people get to see will have had several compression schemes applied to them.
An 8k video at 24fps and 8 bits per pixel takes up more than 6Gb/s of bandwidth.
And that already has chroma subsampling compression applied.
If you have more bits per pixel then it gets bigger of course.
16 bit per channel RGB would take the video up to 38Gb/s.
That would be pretty much unworkable in a typical home setti
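The parent's figures check out. A quick sketch of the raw-bitrate arithmetic, assuming a 7680x4320 frame at 24 fps and taking the parent's 8 bits per pixel after chroma subsampling at face value:

```python
W, H, FPS = 7680, 4320, 24           # 8K frame at film frame rate
pixels_per_sec = W * H * FPS

# the parent's figure: ~8 bits per pixel after chroma subsampling
subsampled = pixels_per_sec * 8
# 16 bits per channel, full RGB: 48 bits per pixel
rgb16 = pixels_per_sec * 48

print(f"subsampled: {subsampled / 1e9:.1f} Gb/s")  # just over 6 Gb/s
print(f"16-bit RGB: {rgb16 / 1e9:.1f} Gb/s")       # ~38 Gb/s
```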
Re: (Score:2)
When you've 10 gigabits to the home, 6 gigabits isn't so bad.
Interlaced degrades resolution, but nobody needs the full resolution.
You make the assumption that lossy compression and lossless compression are the same.
Efficient representation is not compression. OpenEXR doesn't depend on image compression, yet supports a much wider dynamic range than its bit count suggests.
Re: (Score:2)
They are working on that, UltraHD or HDMI specs (Score:2)
This is silly. Please someone instead work on increasing the color resolution (bit depth) instead
More than bit depth it's important to consider the dynamic range, and color gamut as well.
HDMI has slowly addressed both those things - HDMI 1.3 added support for 10- to 16-bit deep color, and UltraHD covers the P3 color gamut.
Re: (Score:2)
8k does include 12 bits per channel of colour information. 16 bits is pointless. Its colour model covers 76% of the human-visible colour spectrum, compared to about 50% for digital cinema/Adobe RGB and 36% for HD.
Re: (Score:2)
Please someone instead work on increasing the color resolution (bit depth) instead
Err ... done that. Rec2020 with its 12-bit encoding (you don't want to go more, it's just a waste) and wide colour space is the standard for UHD, and you can happily enjoy it with a bluray player and a not offensively expensive TV.
I'd much rather see 2k uncompressed with 16-bits per channel of color. That's what a videophile standard should be about.
I'm sure you would however I don't want to change discs 10 times while watching a movie.
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: Compression? (Score:2)
Japan has gigabit and ten gigabit links to the home.
8K at 24bpp at 60 fps would be 47,775,744,000 bits per second, roughly 48 Gb/s.
OK, not going to be able to do that on a 10 gig link.
But you only need to compress it to about a fifth of that.
No problem: even with lossless compression you can reduce the frames substantially, and an old movie can be shown at a far lower frame rate.
And not a compression artefact in sight.
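The numbers above can be verified directly. A sketch of the raw rate and the compression ratio needed to fit a 10 Gb/s link (the link speed is the poster's assumption):

```python
W, H, BPP, FPS = 7680, 4320, 24, 60
raw_bps = W * H * BPP * FPS                # raw, uncompressed bitrate
link_bps = 10e9                            # 10 gigabit fibre to the home

print(f"raw:   {raw_bps / 1e9:.1f} Gb/s")  # ~47.8 Gb/s
print(f"ratio: {raw_bps / link_bps:.1f}x needed to fit the link")
```

The required ratio comes out just under 5x, consistent with "a fifth of that".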
Re: (Score:1)
I'm currently living in Japan and on a good day I get 80/80 Mbps, but usually the download bandwidth is smaller. Not everybody in Japan lives in Tokyo.
Parallels with phones, computers, etc. (Score:3)
The manufacturers have to keep coming up with some differentiator in order to entice people to buy their new products... I get that. But it does seem kind of pointless from the point of view of the typical consumer.
Of course, I realize what they’re really doing is pandering to those people who think “typical consumer” is a derogatory phrase - those folks who are convinced other people care about what television they own.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Force? How do they force people to buy a new TV?
Where's the cut-over? (Score:2)
Re: (Score:3)
I was curious about that and did a quick check and there is no simple answer.
Film grain size on a frame depends on a number of factors, including when the film was shot, the size of the negative (16mm, 35mm or 70mm), and the sensitivity (ISO rating) of the film; the higher the sensitivity, the larger the grains. Also affecting how visible they are are the speed of the filming (faster means fewer grains visible), how the image is placed on the film, and how the scanning was carried out.
I think the short answer
Re: (Score:1)
Even 35 mm film is capable of far higher than 8K resolution. If it's shot on IMAX or medium format, I imagine you'd need 100K resolution. If it's on 1960s-era color film, maybe even 4K is enough.
A bigger issue is that on a TV, you don't really need more than 720p. OK, maybe 1080p, depending on TV size and how far away you sit. 8K is crazy overkill except for specialist applications like projecting it huge, or being able to observe fine details in photographs in a specialized work environment.
Re: (Score:1)
IMAX is estimated to be around 18K for the original negative, and less than 12K for the final print that gets projected at the cinema.
Regarding TV, it depends on your eyes, but I can clearly see the difference between 720p and 1080p, and on a big enough screen (around 50"), between 1080p and 2160p.
Re: (Score:2)
Absurd - 2001 does not reach 8k resolution (Score:2)
Not even the 8k sample video from the ISS recently released by NASA demonstrates proper 8k resolution - most parts cannot even use a 4k TV to its fullest.
Solution in search of a problem? (Score:5, Insightful)
Unless you have a theatre-sized 8k screen, does this really make any difference over 1080?
What about OTA signals? How much bandwidth does an 8k full-resolution signal need? How much will compression affect picture quality during motion?
Then there's cable and satellite companies. I can't speak for satellite, but I know that the dirty little secret of cable TV is that the content is re-compressed to within an inch of its life, so they can fit those hundreds of channels into the available bandwidth. The result is poor picture quality during motion. How bad will it be for 8k?
Even over the Internet, bandwidth will be large, won't it? Again: compression. Also: data caps.
I think the TV industry knows that once someone buys a TV, that's that for up to, say, 10 years. If nothing changes, and the set still works like it's supposed to, no one goes out and buys a replacement. If you build shitty TVs that break every couple of years, people complain and won't buy from you, so you can't just build poorly and get repeat sales that way. So, hey, let's keep 'upgrading' the standards every so often, just so we can make people feel like their current set is 'obsolete', regardless of whether it's still in perfect working order, so we can sell them a brand-new one! Brilliant idea!

Except I think it's already at the point of diminishing returns. Does the average person even care about this? Or is 1080 more than enough? Does the average person have a ten foot TV in their house? What really makes this worth having? Just not convinced it's worthwhile. Going from a CRT TV that could only handle standard definition NTSC signals to an HDTV that can handle 1080p was great, and I don't regret it, but this? Not convinced.
Re: (Score:2)
How much will compression affect picture quality during motion?
Compression is not evil, it's a savior. If the bitrate isn't there to represent the detail and motion, it will look shitty instantly. So if broadcasters aren't giving the codec the bitrate it needs, they're at fault, not the codec.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Hey Fred we only have so much bandwidth on the wire and we need to add another hundred channels!
No problem Steve, just crank the compression rate from 50% to 90%, idiot customers won't know the difference, LOL!
Do you get it now???
Re: (Score:2)
Re: (Score:2)
Reducing the bitrate for the same resolution and framerate == increasing the compression
Anything else?
Re: (Score:3)
hospitals, assisted living and nursing homes
I am sure the people living there have hawk-eye vision and can benefit from 8k resolution. LOL
Re: (Score:2)
Re: (Score:2)
This is just silly (Score:3)
OK, this is just silly. Apart from the fact that we switched from vertical to horizontal resolution to get bigger numbers, 4k was already beyond the limit of the resolution I can discern without sitting unusually close to a monitor. I don't know if the rest of humanity has some sort of super-vision, but from my own experience I find that I certainly can't see better than the 1 arcminute resolution often quoted - probably a little worse. And this resolution, for a 50 inch 8k TV, would mean I'd have to be sitting 0.5m away! Sure, if you are one of those who claim they can "see" 0.5 arcminute detail, you could marvel at the same 50 inch TV from as far away as... 1m!
It all seems to me like the ol' "fuck it, we'll do 5 blades" gimmick. I could see some value in 8k media, which is reportedly about the full effective resolution of 65mm negative film stock (only IMAX 70mm is higher res at around 12k, as it runs the same 65mm film horizontally instead of vertically), for example for Cinema projection, or for allowing zooming in on details for smaller monitors. But 8k TVs are just silly. And you just know somebody will eventually manage to put 8k on a phone screen and boast about it..
Re: (Score:2)
For things where details are not terribly important, like live action stories, 480p is fine. Hell, even primitive stick figures are fine to tell stories (thanks Randall!).
If you are displaying information where each detail has meaning, finer resolution means being able to put more information into a display of size X.
You are arguing from use-case 1 and completely disregarding use-case 2... which is weird, because this is supposed to be a site for people interested in details, not pretty pictures.
People like
Re: (Score:2)
You realize that you simply doubled my screen size? Which means I'd need to be 1m away from the 100" screen to be able to "see" the 4k resolution. I can see myself sitting at about 2-2.5m from it, which would allow me to "see" a resolution between full HD and 4k and would fill my field of vision, but I don't think I'd sit any closer without feeling less comfortable.
8K Fallacy (Score:5, Informative)
My estimates based on a nice, large 70" TV at a normal 10 foot viewing distance for a random set of people (with all content being a mix of typical movie material, with high-quality recording/encoding, and high bitrate, identical in every way except resolution):
20% of people can NOT tell any res difference between 480P native and 720P native. This was HUGE.
50% of people can NOT tell any res difference between 720P native and 1080P native. This was good.
94% of people can NOT tell any res difference between native 1080P and native 4K.
98% of people can NOT tell any res difference between 1080P upscaled to 4K and native 4K.
99.9% of people can NOT tell any res difference between native 4K and native 8K.
Now, in special cases, with huge, huge screens and sitting close, 8K might have some tiny value. But as it is, quality 1080P content upscaled to a modern 4K TV is "good enough" for nearly everyone. 4K native content will please only a very few. 8K for any normal purpose is just a total waste of bandwidth/storage/money. It is just a meaningless spec war that confuses and robs consumers, or gives techno-ego-snobs something to brag about, even though none of them can tell any difference either.
What *has* been helpful is HDR and increased color info... but even that is minor compared to what came before; and only helpful to a limited point. So what's next on the marketing train? 20 trillion colors more than the human eye can distinguish? Refresh rates 1,000 times higher than the human brain can ever distinguish?
Re: (Score:3)
I don't care what's coming next. I'm staying at 1080p, which is more than good enough for my eyes.
Re: (Score:2)
I upgraded my eyes to 8k, so they're future-proof.
Now 16k screens, that's a fool's errand that will never take off.
Re: (Score:3)
I'd put this closer to 90%-95%. The test I always use is that some of the major TV networks broadcast in 720p, some of them broadcast in 1080i (which your TV converts to 1080p). I ask people to identify which networks are 720p, which are 1080i. Despite having watched these networks on their HDTVs for a decade, nobody has been able to answer me correctly. Try it yourself - of ABC, CBS, Fox, and NBC, which are 7
Re: (Score:2)
Re: (Score:2)
I have a 75" projected screen.
Not only can I see the difference between 480 and 720... the really important thing... is that I don't care.
If I sat and squinted at dots, sure, I can see them. But I know the image is made of dots. It's always been made of dots. My old CRT had coloured dots just the same (i.e. three colours).
I didn't care then, and I don't care now. Because... when those dots are moving, you can't see them.
The real test is "at what point were you swearing at your TV because it wasn't
Re: (Score:2)
Re: (Score:2)
Even if you sit too far away for your eyes to see every pixel, 8k resolution still has advantages. Due to the way digital sampling works the maximum frequency it can reproduce is half the sampling frequency, called the Nyquist frequency.
So a 1920-pixel-wide image can only reproduce detail up to a frequency of 960 cycles across its width, meaning that even if your eyes can't resolve individual pixels, they will still see the aliasing artefacts of any detail finer than 2 pixels wide. Increasing the resolution reduces the aliasing.
With 8k
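The fold-back effect the parent describes can be shown numerically: sampled on a 1920-pixel grid, a pattern above the Nyquist limit is indistinguishable from a lower-frequency one. A toy sketch with illustrative frequencies of my choosing:

```python
import math

N = 1920                       # horizontal samples, i.e. a 1080p-width grid
f_detail, f_alias = 1300, 620  # 1300 > N/2 = 960, and 1920 - 1300 = 620

# Sample a cosine pattern at each of the N pixel positions.
def samples(freq):
    return [math.cos(2 * math.pi * freq * n / N) for n in range(N)]

# Above the Nyquist limit (N/2), the high-frequency pattern folds back:
# its sampled values are identical to those of the lower frequency.
err = max(abs(a - b) for a, b in zip(samples(f_detail), samples(f_alias)))
print(f"max difference: {err:.2e}")  # effectively zero
```

The two patterns agree at every sample to within floating-point noise, which is exactly why sub-2-pixel detail shows up as false coarse detail rather than disappearing.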
Re: (Score:3)
>"With 8k you also get the benefit of 120 frames per second motion, which many TVs already fake by interpolating 30 frames per second material (and thus introducing more aliasing, typically visible as halos around moving objects)."
Actually, I *despise* motion interpolation or high frame rates. Absolutely hate it. So I turn all that off and watch at 24 frames (or native 30 of TV sources). I don't know why I hate it so much- I have tried over and over again to watch it, and to me it looks "too real" whi
Re: (Score:2)
It depends on the material. For movies many people prefer 24 fps to give it that distinct look. For sport 120 FPS is great.
And actually it's not a binary choice between 120Hz motion on or off. Most TVs allow you to choose the "strength" of the effect, which mostly boils down to how far something can move before it isn't interpolated any more. I prefer a fairly low setting, so you don't get that "soap opera" effect but small motions are also clearer than an LCD can normally provide, resulting in a display th
Re: (Score:2)
Yep. Mine is a Samsung. So they have a funky name for the setting, but it allows for a strength. I could just tolerate the weakest setting and tried that for a month. Eventually I just turned it off because it was introducing some slight but noticeable other artifacts.
Oh, remember how I was saying that xx% of people can't notice a difference in fine resolution. The same is with the motion interpolation. When my friend's family got a new TV, that damn interpolation is on by default. I was there watchi
Re: (Score:2)
My estimates based on a nice, large 70" TV at a normal 10 foot viewing distance for a random set of people... Now, in special cases, with huge, huge screens and sitting close, 8K might have some tiny value. But as it is, quality 1080P content, upscaled to a modern 4K TV is "good enough" for nearly everyone
I think your post amounts to "I conjecture that my experience and that of my friends is typical, and we don't benefit from higher resolutions, so likely most other people won't."
As for me, I and my friends game on 120"+ projection screens at 6-8' distance, where the pixels of 1080p are indeed very noticeable. I sit along with 15% of the audience in the front five rows of an IMAX theater (for me it's because having the screen fill my peripheral vision makes it feel more immersive), and again the pixels are n
Re: (Score:2)
Yes. We get it. Since moving images (movies, tv shows) are not improved, there is absolutely no use for a display that is 8k. Nobody ever uses these for the display of information, they are only used to display moving images where details are not necessarily meaningful. All display devices are to be measured on how useful displaying moving images is. Nothing else matters.
CGA was the pinnacle of displaying information. Being able to actually view the picture elements (pixels for you newbs) provides definitio
Re: (Score:2)
>"Yes. We get it. Since moving images (movies, tv shows) are not improved, there is absolutely no use for a display that is 8k"
The article is about 8K TV channels (video). Not 8K touchscreens or 100" computer monitors...
Fad (Score:4, Insightful)
When you're a few meters away from a 60" 4K screen you already cannot see individual pixels, so any sharpness increase beyond that doesn't really make a lot of sense unless you're looking at the screen with a spyglass.
So, what's the point of 8K resolution for the average consumer again? I can imagine it being useful for medical professionals, but beyond that? Not really sure.
Re: (Score:2)
It's useful for ISPs to rent a bigger pipe to you every month. It's also useful for Netflix/Hulu/etc to make you switch to a more expensive account. Etc.
But for you? Unless you like sitting right in front of your TV with a magnifying glass, it's useless.
Monthly bandwidth used in (Score:2)
Two minutes.....
Nothing WORTH the bandwidth (Score:2)
yes, higher resolution matters (Score:2)
Contrary to what I originally believed, higher resolution makes a massive difference.
I bought my first retina display iMac last year, and man does the screen look crisp. You notice it mostly in text or small details, that is why most test pictures don't show a difference.
I'd like to see 8K in action. Maybe no difference to 5K, but maybe I'd be surprised.
Re: (Score:2)
Higher resolution makes a clearly discernible difference for text and computer-generated graphics, such as in video games. However, we're discussing movies and TV here. The benefits of 4k or 8k are far from clear in the real world, except apparently for the folks building a cinema with a wall-sized display.
Re: (Score:2)
Well, I happen to have a cinema with a wall-sized display. (around 450 cm diagonal) I can easily tell the difference between full HD and anything less. Nature movies ask for full HD, and I'd love to watch them in 4K but my projector doesn't do 4K. The next one will.
A 4K resolution at around 4m width gives me pixels of 1 mm size.
8K resolution would cut that in half. I don't think I'll see much of a difference (viewing distance is almost 5m) but it could make scenes appear more crisp.
I'm not some Bill Gates.
Re: (Score:1)
My current resolution is 5760x1080, 3 HD screens in Eyefinity mode, on my gaming PC. What's that, 5k? And a bit? :)
I have 4K on my HTPC.
Eyefinity (or the Nvidia equivalent) is great for games, particularly car racing games - it's like looking out of the windscreen. When you add the force feedback steering wheel and pedals it is VERY immersive (and yes, I have 5.1 sound using my old home theatre amp & sub and the Logitech 5.1 speakers that I haven't blown up yet to make it even more immersive).
My tv tha
Two Other Issues with 8K (Score:2)
There was an interesting article in SMPTE Journal recently about 8K (dead tree magazine or paywalled, so no link), pointing out two problems with 8K TV beyond the obvious ones of lack of bandwidth to the home and content.
The first is motion blur. Still images on an 8K monitor look stunning, particularly if WCG (wide color gamut) and HDR (high dynamic range) are also part of the display. However, once the image starts moving (which is the point of TV after all) motion blur becomes a real problem. If you keep
Re: (Score:2)
The first post that's not about how technology is dumb and how 1080p ought to be enough for everyone.
I wish I had mod points.