The Trouble With 4K TV 442
An anonymous reader sends this quote from an article about the difficulties in bringing so-called '4K resolution' video — 3840x2160 — to consumers.
"Though 4K resolutions represent the next step in high-definition video, standards for the format have yet to emerge and no one’s really figured out how to distribute video, with its massive file footprint, efficiently and cost effectively. How exactly does one distribute files that can run to hundreds of gigabytes? ... Given that uncompressed 4K footage has a bit-rate of about 600MB/s, and even the fastest solid-state drives operate at only about 500MB/s, compression isn’t merely likely, it’s necessary. ... Kotsaftis says manufacturers will probably begin shipping and promoting larger TVs. 'In coming years, 50-inch or 55-inch screens will have become the sort of standard that 40-inch TVs are now. To exploit 4K, you need a larger form factor. You’re just not going to notice enough of a difference on smaller screens.' The same quality/convenience argument leads him to believe that physical media for 4K content will struggle to gain traction among consumers. '4K implies going back to physical media. Even over the Internet, it’s going to require massive files and, given the choice, most people would happily settle for a 720p or 1080p file anyway.'"
cable and sat don't have the bandwidth for it (Score:5, Insightful)
cable and sat don't have the bandwidth for it and that's on the broadcast side.
Maybe 1-2 channels, but most cable systems are loaded with SD channels and old MPEG-2 HD boxes.
Sat has moved to all-MPEG-4 HD but still has lots of SD boxes out there as well.
Re: (Score:3)
Re: (Score:2)
4K ?
4K Q 2 !
Re: (Score:2)
Re:cable and sat don't have the bandwidth for it (Score:5, Informative)
It's the term used on the production side, and is pretty much the standard for pro video at the high end. And it's notorious for being really hard and expensive to work with, because it's simply taxing for the cameras to output and even more taxing to work through post-production with. Something like a RED or whatever camera will sell itself on "4K", but generally, unless you're filming for cinema, it's hardly needed.
With one exception, however: when dealing with chromakeying and the like, the higher resolution provides more information to key out backgrounds without the telltale green halos around the characters. So even on TV work, 4K cameras are ideal for special-effects stuff, just to give the post-production crew more to work with.
Which is, of course, why those newfangled DSLR cameras might look seriously fantastic for normal footage, but are simply the wrong choice for special-effects work: whilst the effect of compressing it all down might be acceptable for most video, it removes too much detail for chromakeying without a lot of (expensive) extra work in post-production. That said, for DIY amateur stuff it's not time that's the problem but gear, so people can spend more time getting the keying right and work with looser tolerances.
Re: (Score:3)
One of the big problems with 4K, and even more so with 8K, is that the camera operator can't focus it by eye any more. With that much resolution, being even slightly off will be noticeable, so auto-focus is the only option.
Re: (Score:3)
Some sports broadcasters use 1080i60 ... of course 720p60 is superior, but sheep like high numbers.
Re: (Score:3)
My TV is 720p native (well, 768) but supports 1080i. Despite the nerd in me wanting to prefer 720p for techie reasons, the picture is unquestionably better with 1080i input.
Re: (Score:3)
that's what I'm thinking. I think we won't see consumer 4K devices for another 20 years.
but it's good we get a spec now, and start hammering out bugs, so by the time the rest of the computing world can handle the bandwidth, we're ready.
Re:cable and sat don't have the bandwidth for it (Score:4, Insightful)
What is the issue? (Score:5, Insightful)
Digital television is ALWAYS compressed.
It will require 4 times the data throughput as it is only 4 times as many pixels. Period. There isn't a downside. If it is only getting a 1080p signal then it will be at least that good and you know that they will have a lot of processing to anti alias the upscaled image. It will probably really help on 3D movies where they are cheesing out by cutting the vertical resolution in half.
The only issue will be getting the infrastructure caught up with it. The cable companies may have a problem but if they don't take care of it they will go the way of the buggy whip because the Internet and Netflix will scale to take care of it.
The only real issue that 4K may have is if it makes enough visual difference that anyone will care enough to pay the premium. I really think the only place it will really noticeably shine is 3D. We will just need to see how fast meaningful 3D content becomes available. And with the limitations on how much 3D content you should reasonably watch in a day that will slow the "need" for it.
But it does require 150MBPs (Score:3)
Really, it does, uncompressed it does. Yes, we can compress very effectively and we do, but he does have a point that the current infrastructure is struggling hard to keep up with a few 720P channels, let alone 1080P. One 4K channel will probably take the same bandwidth that 8 or 10 720P "HD" channels take. Given the amount of 720P channels one user can choose from at the moment and the amount they can play simultaneously over their connection, only very few people will be able to receive 4K broadcasts in
Re: (Score:3)
It will probably really help on 3D movies where they are cheesing out by cutting the vertical resolution in half.
If 3D films only have half the vertical resolution, wouldn't a more sensible fix be to distribute files with double the normal height (or width in the case of SBS formats)? I've sometimes wondered why 1080p 3D files don't come in 1920x2160, or 3840x1080 resolution, which could be chopped and displayed on a normal 1080p display without the loss. It's like interlacing all over again.
Re:What is the issue? (Score:4, Informative)
The article said 600 MB/s, not "Mbps". There is a difference. The former is megabytes per second, the latter is megabits per second. And, yes, it does.
3840*2160 = 8294400 px/frame
Colour depth is 24 bits per pixel
And either 24, 25 or 30 frames per second, depending on whether it's native film rate, or adjusted for PAL or NTSC. In any case, the calculation is:
3840*2160 px/frame * 24 bit/px * 24 frame/s = 4,777,574,400 bits/s = 4.78 Gb/s or around 597.2 MB/s
It's obviously more for the higher frame rates.
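The arithmetic above is easy to verify. A minimal sketch in Python (illustrative only; uses SI megabytes, matching the figures above):

```python
# Uncompressed video bit-rate: width * height * bits-per-pixel * frames-per-second.
def uncompressed_rate(width, height, bits_per_px, fps):
    """Return (bits per second, megabytes per second in SI units)."""
    bps = width * height * bits_per_px * fps
    return bps, bps / 8 / 1e6

# 4K at 24-bit colour, for the three frame rates mentioned above.
for fps in (24, 25, 30):
    bps, mbs = uncompressed_rate(3840, 2160, 24, fps)
    print(f"{fps} fps: {bps:,} bit/s = {mbs:.1f} MB/s")
```

At 24 fps this reproduces the 4,777,574,400 bit/s (597.2 MB/s) figure above; 25 and 30 fps come out to roughly 622 and 747 MB/s respectively.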
Re:What is the issue? Obsolete already? (Score:5, Funny)
Well, I'll just wait for the 640K TV. That should be enough for anybody.
Re:cable and sat don't have the bandwidth for it (Score:4, Informative)
OTA gives 19 Mbps to each channel, usually split into 3 stations. So 20 for a single 4x channel sounds about right.
OTA uses MPEG-2, and (in my experience) a good, decent 1080i signal requires between 12 and 15 Mb/s; below that, things start to look a bit blurry. I do have an OTA EyeTV setup with a 1080p monitor, and I can quite easily see the bitrates.
Ironically, I've taken to streaming my PBS shows, because although my downstream bandwidth is a mere fraction of a full ATSC stream, the image quality is far superior to what is possible with a 9 Mb/s "720p" sub channel.
For Bluray, 25 Mb/s AVC is very common, and that doesn't include the lossless audio. It's in another league entirely, provided that the grain hasn't been scrubbed out of existence. (Pan's Labyrinth isn't realistic, though it is shiny)
More efficient coding standards may help cut the bandwidth requirements quite severely, but at a certain point, you'll have to decide that you just don't care about image quality. And that sort of attitude won't sell 4k screens.
Re: (Score:3)
OTA uses MPEG-2, and (in my experience) for a good, decent 1080i signal, between 12-15 Mb/s is required, before things look a bit blurry.
Blurry? When bitrate gets low, I see blockiness - the opposite of blurry. And Blu-ray is almost intentionally large (an artifact from the HD-DVD war). There's no need for uncompressed audio, given that double-blind tests show that people can't really tell a difference. But yes, OTA digital sucks; just watch an action sequence with strobe or lightning (no, I can't think of one at the moment), and the flashes are all blocky, as the frames change too much too fast and the resolution drops to meet the specifi
Re: (Score:3)
Lowpass filters are often used in MPEG encoders. I'm not talking about a signal glitch. I'm talking about a conscious effort by the stations to limit the video bandwidth of any one subchannel. As a result, it never seems as if the video can really make use of the full frame resolution. You should be able to see more, but you can't.
Re: (Score:2)
I fail to see any point in your ramblings. 4K / 8K are clearly future techs intended to be delivered over equally future networks. Your argument is invalid. Get with the programme.
Re: (Score:3, Insightful)
Yeah, because there is no readily accessible source of 1080p content..
Re: cable and sat don't have the bandwidth for it (Score:5, Informative)
Re: (Score:3, Informative)
The display may be not technically be interlaced, but the content certainly can be.
Re: (Score:3)
They use the higher framerate to simulate the way scanlines faded on a real CRT. Among other things, but that's the big thing you get from 120hz or 240hz -- you can watch 1080i60 video without it being completely ugly and ruined by bob/weave artifacts.
I did a lot of research before I bought my first TV that was over 60Hz and I've never heard that before. I've read lots of lengthy discussions and even read manuals and that was never mentioned.
I just googled "240Hz LCD" and the first three results talked about reducing motion blur, with not a single hint about reproducing CRT effects. Where did you hear that?
Re: (Score:3)
Re: (Score:3)
That's not true
With any technology based on photolithography, any increase in density decreases yield, since as you increase density you make the "lines" smaller, which increases the number of errors per wafer or sheet of glass.
Since they normally cut these screens out of a much larger sheet of glass, they get a higher yield with smaller screens (a single pixel fault ruins a smaller percentage of the glass) or lower pixel densities (fewer faults). The reason that the current round of tech has gotten cheaper has
Re: (Score:3)
a move to ala-carte purchasing might eliminate a lot of crap no one watches and free up more bandwidth
How so?
Satellite and cable broadcast the same signal to everyone ... even if only 1 person is watching it, it needs to be broadcast.
A-la-carte suggests an on-demand model - that is WAY more bandwidth, because everyone is now watching something different!
Re:cable and sat don't have the bandwidth for it (Score:4, Interesting)
Once the fiber is there the 4K shouldn't be a problem. Replace the old neighborhood boxes with streaming servers and harddisks and just pull the 4K data from there. Then the connection between the main switches and the neighborhood boxes isn't as heavily taxed and the real internet can still be fast.
I have no problem.... (Score:3)
I prefer it, in fact.... it's far easier to account for than bits stored on a disk drive I can't possibly see without an electron microscope.
The biggest grievance I have with 4k is that the devices are too bloody costly.
Re: (Score:3)
Although it may not matter - if nobody has a physical DVD player anymore in thirty years, passing on my DVD collection to the kids or offering to lend it to friends is meaningless.
Re:I have no problem.... (Score:5, Interesting)
The CD is over 30 years old, you can still buy CD players.
Welcome back to 2005 (Score:3, Insightful)
The same thing happened when the first 1080P screens came out. The market will adapt, there's no problem here.
Re:Welcome back to 2005 (Score:5, Interesting)
the problem is that HD is still more than is needed, and a fair amount of programming is still made for SD (and most still broadcast in SD).
broadcast facilities dragged their feet with HD adoption - the single factor that made all facilities HD capable was the move away from hardware to software and masters-on-HDD.
so no... the market didn't adapt in 2005, it didn't adapt in 2010, it hasn't adapted now and it will be a long time before anything other than big budget movies or events like the olympics will get the 4k treatment.
also consider the optimum viewing distance of 2.5 screen heights. if Jobs were still here, he'd stop at 2k and call it "retina television". unless you're doing it well wrong, you're not going to get much benefit. even the jump from SD to HD was marginal - most of the gains were in compression quality (a macroblock is harder to see when it's 1/4 the size, and in h.264 it's impossible to see as it's filtered out by design).
but i suppose 4k will be interesting for perving on individual audience members at sporting events...
Re: (Score:3, Insightful)
consider the optimum viewing distance of 2.5 screen heights
I keep seeing that pop up, and I don't know who came up with it, but they're wrong. If you have to turn your head to see the action going on in the corners, you're too close. If you don't have to turn your head, you're not too close. That point is closer than 2.5 screen heights.
if Jobs were still here, he'd stop at 2k and call it "retina television".
Doubtful, considering the iPad's 2048x1536 is on only a 10" screen.
even the jump from SD to HD was marginal
Holy shit, and this is how you know that you have no idea what you're talking about. The difference of SD to HD was more significant by far than the change from black and white to color.
Re:Welcome back to 2005 (Score:4, Interesting)
even the jump from SD to HD was marginal
Holy shit, and this is how you know that you have no idea what you're talking about. The difference of SD to HD was more significant by far than the change from black and white to color. It's huge! Do you have a 10" tv that you're watching from 7 ft away when making this comparison or something?
Are you old enough to remember the B&W TV days? I think you're underestimating the scale of the switch from B&W to color. I still remember when my parents got a color TV (we had a B&W set far longer than most people) and the difference was amazing and quite apparent to everyone. It didn't take a side by side comparison to see the difference between B&W and color, and you could see the difference no matter the size of the screen or how close you were.
On my current 37" LCD (capable of 720p, 1080i), I notice only a minimal difference between SD DVDs (480i) and HD Blu-rays. The difference is so minimal that I stopped paying the extra dollar or two for Blu-ray discs from Netflix because I couldn't really tell the difference. Perhaps if I had a bigger 1080p-capable set I might notice more of a difference, but at my normal viewing distance (10-12 feet) the difference is quite minimal on my current set. I don't think I'd notice any difference at all between 720p and 4K without a much larger TV, or sitting much closer to the TV.
This chart doesn't go up to 4K, but suggests that you'd have to sit closer than 10 feet away from a 100" screen to take advantage of even 1440p:
http://www.engadget.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/ [engadget.com]
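The geometry behind charts like that one is simple to reproduce. A rough sketch in Python, assuming a 16:9 panel and the commonly quoted 1-arcminute resolving power of 20/20 vision (both simplifications):

```python
import math

def max_useful_distance(diag_in, horiz_px, acuity_arcmin=1.0):
    """Viewing distance (feet) beyond which a 20/20 eye can no
    longer resolve individual pixels on a 16:9 screen."""
    width_in = diag_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    pitch_in = width_in / horiz_px                # size of one pixel
    dist_in = pitch_in / math.tan(math.radians(acuity_arcmin / 60))
    return dist_in / 12

for px, label in ((1920, "1080p"), (3840, "4K")):
    print(f'50" {label}: pixels blend together beyond ~{max_useful_distance(50, px):.1f} ft')
```

For a 50-inch set this gives roughly 6.5 ft for 1080p and 3.3 ft for 4K, which is in line with the chart's message: at typical couch distances, the extra pixels go unseen.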
Re: (Score:3)
On my current 37" LCD (capable of 720p, 1080i), I notice only a minimal difference between SD DVD's (480i) and HD Blu-rays
In descending order of probability:
1) You are WAY too far away from your TV. The THX standard is for the display to fill 40° of your field of view diagonally, or a viewing distance of 1.2x the screen diagonal. A little under 4 feet would be the recommended viewing distance.
2) Your TV is still set to the factory/showroom default profile, and will look like shit whatever you feed it with. Use a calibration disc (e.g. Spears & Munsil) or the basic THX calibration included on many BD (e.g. Disney/Pixar films) t
Re: (Score:3)
Re:Welcome back to 2005 (Score:4, Insightful)
I don't give a damn about 4K TV but i want it to become stupidly popular because a 4K TV LCD panel is also a 4K computer monitor, and mainstream purchases of 1080p TVs are why 1080p monitors are less than half the price of 16:10 1920x1200 monitors.
"good enough for TV" is a huge limiting factor on the affordability of high-resolution monitors, so if the plebs think they need 4K to watch TV then that's just fabulous.
Re:Welcome back to 2005 (Score:5, Insightful)
also consider the optimum viewing distance of 2.5 screen heights. if Jobs were still here, he'd stop at 2k and call it "retina television". unless you're doing it well wrong, you're not going to get much benefit. even the jump from SD to HD was marginal - most of the gains were in compression quality (a macroblock is harder to see when it's 1/4 the size, and in h.264 it's impossible to see as it's filtered out by design).
I thought the jump from SD to HD was great....for a while. Lately most of the free to air TV channels in Australia have been going terribly downhill with overly compressed or badly up-scaled video sources. It's rare now to get HD content that looks like HD - the best thing I've watched lately was a 3 year old doco I had saved on hard drive.
Which then raises the question: why go to 4K when we can't even get 1080p right consistently?
Re: (Score:2)
Sooner or later, the physical size of consumers' living rooms will determine the upper limit on how high-res a screen can usefully be.
Re: (Score:2)
Re:Welcome back to 2005 (Score:4, Insightful)
I'm not sure that's true. 1080p had always been the goal of HD, even with the original HD spec developed in Japan in the 80s. No matter what, everyone knew we were going to get there and understood the advantages over NTSC and PAL. Consumers and content creators could see the improvements brought by HD. Most of the people who cared about 1080p just waited until prices dropped and skipped 720p and 1080i. That all occurred as part of the big HD uptake over the past 5 years.
The problems with 4k are twofold. First, it isn't part of the existing HD spec. It is a new standard that doesn't have the imprimatur of governments and cable companies designating it as a target to be achieved. Second, it is a move driven entirely by the consumer electronics industry. There isn't demand from users and there is certainly no interest on the production side.
I work in post production and the data hassles of 3D have been enough to keep our company (and many others) away from it. The substantially larger file sizes associated with 4K are even worse. For a production company like ours, we'd have to move to a petabyte SAN just to manage an equivalent amount of 4K footage to what we do now in HD. Transcoding times would go through the roof, bandwidth would be heavily taxed, even the hardware requirements for decoding a single compressed stream (to say nothing of editors handling MANY simultaneous streams) for playback would be much higher. And for what? The only quality improvements would be in resolution (as opposed to the jump to HD, which brought a massive change to color handling over NTSC). Networks don't want to pay higher budgets for something that won't help make them any more competitive. Satellite providers, who already compress the shit out of their HD signals, don't have spare bandwidth for fatter streams. Cable companies, who are basically in the data business now, don't want to waste their bandwidth on it. Even with SDV it would add a lot to their overhead. Game consoles are still struggling to make the most out of HD, so are nowhere near ready to handle that many additional pixels. You might have videophiles willing to spend a ton of money on ultra-premium gear, but even they would be limited to using specialty playback hardware that would have to download massive files because 4k exceeds the storage capacity of any commercially available blu-ray media.
TV manufacturers are pushing this because the great upgrade is over and 3D has failed to excite consumers. They need something to try and convince consumers to replace a perfectly functional, nearly new 1080p TV. So they're going to run with 4K in 2013.
Re:Welcome back to 2005 (Score:5, Informative)
I'm calling bullshit. The summary talks about uncompressed video, glossing over the fact that even 1080p uncompressed requires a bitrate of 190 MB/s (at 24FPS) - faster than most HDDs can handle. Storage space required? 667GB per HOUR, or a solid TB for a 90min film. Do you need a massive SSD to play 1080p files? Do you even need an SSD to edit and encode them? No, you don't.
Source: http://en.wikipedia.org/wiki/Uncompressed_video#1080i_and_1080p_HDTV_RGB_.284:4:4.29_uncompressed [wikipedia.org]
Compression has always been essential, even for DVD. Uncompressed SD video would still fill a dual-layer Blu-ray (50GB) after about 30 minutes. Yes, we'll need better codecs to handle 4K, and yes, a lot of work needs to be done, but the size of uncompressed video isn't and never has been the issue. At worst, it'll take a slightly newer disc format (I don't see why a 4-layer Blu-ray disc - which exists today - couldn't do the job) and better internet connections to stream.
Devices will be released THIS YEAR capable of outputting 4k - just look at Tegra 4, or nVidia's SHIELD, which has been demonstrated live supplying a 4K TV (Admittedly, probably up-scaled content, but obviously the technology is there today).
The real issue is content: almost nothing is actually in 4K right now. The transport and storage methods are not the problem.
Re: (Score:3)
Whelp, I've gone and made a bit of an ass of myself. The actual bandwidth requirement for uncompressed 1080p (24FPS) is 95MB/s, not the 190MB/s I originally stated. The 190MB/s figure was for the RGB 4:4:4 format. The rest of my point still stands, though.
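Both figures fall out of the same formula; whether you get ~95 or ~190 MB/s depends entirely on the chroma subsampling and bit depth you assume. A quick comparison for 1080p at 24 fps (Python, SI megabytes; the format list is illustrative):

```python
def rate_mb_s(width, height, bits_per_px, fps):
    """Uncompressed data rate in megabytes per second (SI)."""
    return width * height * bits_per_px * fps / 8 / 1e6

# Average bits per pixel for common sampling/bit-depth combinations.
formats = {
    "4:2:0 8-bit":       12,
    "4:2:2 8-bit":       16,
    "4:2:2 10-bit":      20,
    "4:4:4 8-bit (RGB)": 24,
    "4:4:4 10-bit":      30,
}
for name, bpp in formats.items():
    print(f"1080p24 {name:18s} {rate_mb_s(1920, 1080, bpp, 24):6.1f} MB/s")
```

The spread runs from about 75 MB/s (4:2:0 8-bit) to about 187 MB/s (4:4:4 10-bit), so quoting any single number without naming the format invites exactly this kind of correction.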
Re: (Score:3)
The 190MB/s figure was for the RGB 4:4:4 format
Which is what you should be editing with (well, R'G'B'), unless you want to progressively lose more and more information through colourspace conversion.
Re: (Score:3)
Re: (Score:3)
The integrated graphics chip in Intel's Ivy Bridge processors is capable of displaying and decoding h.264 video in hardware. They've been on the market since late 2011. I'm not talking about expensive nVidia cards here; I'm saying the integrated graphics Intel has been selling for over a year already supports 4K (and although they only actually enabled 4K output/decoding in October 2012, anybody who already had Ivy Bridge hardware benefitted).
The downside is that it requires two DisplayPort outputs, and few motherboards
Re:Welcome back to 2005 (Score:5, Insightful)
Exactly, it's just early days and all this doom and gloom about the format is ridiculous. I remember when 1080p video started appearing and I tried playing a sample on my AMD 3400+ processor - of course, it skipped and jumped all over the place. The file was also huge, my paltry 2Mbit internet connection took ages downloading it and my monitor was too small to display it properly. It was excessive, took up a lot of space and required fast, new hardware. How could such a thing ever catch on?!
Oh easy, this is technology and technology constantly moves forward. If you're the kind of person that has ever complained about having a quad-core processor in your phone as "unnecessary", then please hand in your geek card and get off slashdot. Technology always moves forward, things always get better and nothing will stop that. There is never a "Good enough", things can always be done faster or with less power.
Re:Welcome back to 2005 (Score:4, Insightful)
This is the year of HEVC/H.265, which is expected to deliver bitrate reductions of up to 50% compared with AVC/H.264 at the same quality. Expect to see content in this format later in the year.
Ultimately though you're right: without 4K content there'll be little demand. Upscaling 1080p will only go so far.
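Combining that claimed ~50% saving with the earlier observation in the thread that compression tends to scale sub-linearly with pixel count gives a back-of-envelope estimate. Both scaling factors below are assumptions, not measurements:

```python
# Rough estimate of a 4K HEVC stream relative to a typical 1080p AVC Blu-ray.
avc_1080p_mbps = 25                            # common Blu-ray AVC video bitrate
pixel_ratio = (3840 * 2160) / (1920 * 1080)    # 4x the pixels
sublinear_factor = 0.5                         # assumed: compression scales better
                                               # than linearly with resolution
hevc_saving = 0.5                              # assumed: HEVC halves AVC bitrates

estimate_mbps = avc_1080p_mbps * pixel_ratio * sublinear_factor * hevc_saving
print(f"~{estimate_mbps:.0f} Mb/s")  # lands right back at 1080p AVC rates
```

Under those assumptions a 4K HEVC stream costs about the same as today's 1080p AVC Blu-ray stream, which is why codec improvements matter so much here.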
And don't forget.. (Score:5, Informative)
.. the cable companies would compress the signal as they presently do with "HDTV" to the point that it looks like crap. They have ruined the HDTV quality with this compression and I can only imagine how much they would ruin 4k TV content. The best HDTV experience I have ever had was pulling HDTV signals from the Over The Air broadcasts. The first time I saw this (after spending so much time watching compressed HDTV on Comcast) I couldn't believe how great it looked. If you don't believe me, give it a try. The OTA HDTV signals blow Comcast's HDTV signals out of the water with crispness and detail.
Hopefully the means of getting this type of signal dramatically improves so that compression is not needed and we can watch 4k the way it was meant to be..
Re: (Score:3)
.. the cable companies would compress the signal as they presently do with "HDTV" to the point that it looks like crap
Pretty much this.
There is so much artifacting going on with Verizon "hdtv" that I may as well be watching sdtv DVD rips from Pirate Bay, and no, nothing is wrong with the signal. The signal itself is fine without dropouts. It's just crap.
--
BMO
Eh... (Score:2)
I still have a 1-DVD-out-at-a-time plan along with Netflix streaming because it's better than $5 iTunes rentals for recent stuff (and I can rip DVDs for anything I want to keep), so staying with discs for a while longer is no big deal to me. It is a shame we can't get the infrastructure's bandwidth up at a better pace, though.
Re: (Score:2, Offtopic)
you rip rentals? that's pretty scummy, dude.
Re: (Score:3)
I still have a 1 DVD out at a time along with Netflix streaming because it's better than $5 iTunes rentals for recent stuff (and I can rip DVDs for anything I want to keep),...
you rip rentals? that's pretty scummy, dude.
Well... Technically... As long as he has an active Netflix by-mail account, he's simply being efficient and saving them postage.
Re: (Score:3)
What's your point?
Re: (Score:2, Offtopic)
I've got him beat - I just download them.
Solid state drives. (Score:2)
Put movies on cartridges. By the time 4K is ready to become a standard, it will make more sense to use solid-state storage than optical. They should focus on making flash memory faster and distribute films on jump drives. Kingston has a 1TB key drive in the lab now.
Re: (Score:2)
Re: (Score:2)
Thunderbolt goes a bit higher, so a RAID array over thunderbolt might do over 600MB/s...
There's a lot of professional video gear that supports thunderbolt. Probably because a lot of professional video gear targets Mac. That's actually kind of annoying, and the third-party Windows drivers for HFS+ have spotty support. I couldn't get them working with the CF cards recorded on a KiPro Mini, for example. I had to hunt down somebody on-site with a macbook to read the damned things.
I'll live with it (Score:2)
Mis-guided claims.. (Score:5, Insightful)
Many of the bandwidth/media etc. claims are rubbish. 4K has (approximately) 4 times the pixels of standard full HD, so at most a given format will grow by 4 times. HOWEVER, most lossy compression methods (for example AVC/MPEG-4) on real footage scale better than linearly with pixel count, as detail becomes more repeated at higher resolutions, so a more likely estimate for such formats is 2 times, which is not crazy (Blu-ray, for example, can already deliver that for many movies if needed). Newer compression methods are coming online that can deliver close to double the compression for equivalent quality, meaning we end up back at normal HD data sizes.
Is it needed? That's a whole different story; with the size of living rooms and the available, comfortable wall space for a screen, it is pretty marginal. But trying to use raw uncompressed bitrates as a scare tactic is rubbish.
Their raw figures are of course not even right, as they seem to be assuming 4:4:4/12-bit storage, which would be rather rare in real life; 4:2:2 10-bit would be MUCH more common, and most workflows would actually use compressed storage (as they do now for HD).
The real problem (Score:2)
Re: (Score:3)
Or perhaps the 1080 sets will start to be 1152 to make 4K look better than regular HD even with 1080 content.
I'd like to be able to buy the 1600x1200 monitors I bought for many years before 1080 HDTV became popular and forced a lower resolution for PC users.
Re: (Score:2)
This means scaling up or down will work very well and people won't be able to tell the difference between this and 1080p.
I don't think pixels work the way you think they do.
That aside, you may not realise that most 1080p TVs, by default, scale up the image by about 5%, and yet that somehow doesn't look "smeared to shit." For any natural image source the difference is marginal at worst, and probably likewise for any digitally originated images, since broadcasters deliberately keep them soft for a variety of reasons.
And yes, I'm claiming industry-wide effort to make 1080 appear visibly better than 720.
You do know that it's also actually better than 720p too, right? The clue is in the numbers.
Internet speed does make a difference (Score:4, Insightful)
1. Find a torrent on a laptop and click on it to start downloading.
2. Wait a couple of minutes.
3. Navigate TV to the specific file on HDD and start watching.
It is amazing how much the experience changes for the better with faster connection speeds and more reasonable laws on downloading/uploading the content.
Re: (Score:2)
Not sure if you're trolling or just uninformed. So maybe I'm feeding.
Torrent files don't download sequentially, from start to finish. Clients that follow the spec will randomly select a piece to download, possibly influenced by the availability or speed of peers with pieces. The larger the file gets, the higher the odds that you're missing an early piece. For a movie that has 1000 pieces, the odds of having ALL of the first 250 pieces at any point before you hit the 95% downloaded point are astronomically low.
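That claim is easy to sanity-check with a toy model. The sketch below assumes purely random piece selection (an idealization; real clients bias toward rarest-first, which is still effectively non-sequential):

```python
import random

def prob_have_prefix(total=1000, have=950, prefix=250, trials=100_000):
    """Monte Carlo estimate: with pieces arriving in random order, how often
    do we hold ALL of the first `prefix` pieces once `have` are downloaded?"""
    hits = 0
    for _ in range(trials):
        # The pieces we are still missing at the 95% point.
        missing = random.sample(range(total), total - have)
        # Success only if no missing piece falls within the first `prefix`.
        if min(missing) >= prefix:
            hits += 1
    return hits / trials

print(prob_have_prefix())  # almost always prints 0.0 (analytically ~4e-7)
```

So under random selection, watching-while-downloading essentially never works until the torrent is nearly complete, which is the parent's point.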
Re: (Score:3)
Another sales ploy (Score:2)
Form factor will be an issue for 4K adoption (Score:3)
In coming years, 50-inch or 55-inch screens will have become the sort of standard that 40-inch TVs are now. To exploit 4K, you need a larger form factor. You're just not going to notice enough of a difference on smaller screens.' The same quality/convenience argument leads him to believe that physical media for 4K content will struggle to gain traction among consumers.
I don't get how this person has the foresight to note all these things, but totally gloss over the fact that for many (I may even say most) living rooms, any TV above 46"-47" is simply too large. I will never have a 4K TV in my living room because there is simply nowhere to put a TV that is 55"... it doesn't matter if the manufacturer is selling it, heck it doesn't matter if it is FREE, I have nowhere to put the damn thing. 46" is already way bigger than needed.
4K is so 2007. 8K is already here. (Score:5, Interesting)
4K is so 2007; I have seen complete 8K broadcast chains (all the equipment needed to acquire, store, transmit, compress, scale, play back and display) for years, as shown at the International Broadcasting Convention.
http://en.wikipedia.org/wiki/Super_Hi-Vision_television
Before anyone comes up with, "but the eyes cannot resolve that kind of details", YOU ARE WRONG!
8K is not even a little comparable to HDTV.
I have also seen 4K being displayed, often scanned from 35mm prints; it doesn't have much impact beyond 2K. But that may be because it was not captured on a digital camera, and the grain (effective resolution) of 35mm is worse than pixels at 4K. The 8K footage I've seen was captured on an 8K digital camera.
Also, 300 fps video is freaking amazing; this was a demo from the BBC. Your eyes can track fast-moving objects and therefore focus on them razor-sharp, like when you track a moving object in the real world. Finally we could actually watch Hollywood action sequences, which at 24 fps are just a blurry mess of motion blur, or a vomit-inducing slideshow.
A station wagon full of Betamax tapes. (Score:5, Funny)
The benefit of 4K TV (Score:5, Insightful)
If they do crank these out, 4K computer monitors should come down in price. I don't care what happens to the TV market as long as that happens.
Re: (Score:3)
Indeed. Better monitors are really the primary advantage I see for typical consumers, with the only other one being large-screen applications, such as projectors and the like.
Otherwise, if we're talking about TVs, 1080p is already sufficient. I did the math a while back when I was deciding whether to wait for 4K or not, and at typical viewing distances with the sizes of HDTVs we have today, the individual pixels of a 1080p TV are already far smaller than a person with 20/20 vision is able to discern, m
Personally (Score:2)
I have absolutely no issue with physical media. Sure, streaming is convenient. But I can tell you that physical media saved me from absolute boredom during severe snowstorms, when my only power source was an extension cord, an inverter, and my laptop. For flying, physical media (whether thumb drive or DVD) is a necessity. And for driving, I do not want to be bound by an internet connection to enjoy a TV show or movie that I have purchased. I still get DVDs by mail from Netflix, because my month
The MPAA must be downright giddy (Score:3)
Re: (Score:3)
Who cares, this is not the important point! (Score:5, Insightful)
The important point is that at last, there'll be computer screens with non-stupid resolutions again! They took my 1920x1200 away, and though I would prefer 3840x2400, I can live with 3840x2160.
At least resolutions are going up again.
Bigger problem. Visually irrelevant (Score:5, Insightful)
Remember when Blu-ray came out and a number of people claimed they couldn't see much difference?
Well this time it will actually be true for almost everyone.
Most people don't even have their TVs close enough to visually discern 1080p.
This kind of TV resolution is irrelevant in a normal home setup.
Re: (Score:3)
Precisely. When I was deciding whether or not to wait on 4K a few months back, I did the math and realized that unless you have a ridiculously large TV or are sitting ridiculously close, 1080p is already well past the point where it can be considered a "retina display" (to borrow Apple's term for any display where the pixels are indistinguishable from one another at typical viewing distances to a person with 20/20 vision). 4K provides no additional benefits in those contexts, which is how most people will l
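The viewing-distance math this comment alludes to is easy to sketch. A minimal version in Python, assuming the conventional one-arcminute acuity limit for 20/20 vision and a 16:9 panel (numbers are illustrative, not a vision-science result):

```python
import math

def min_retina_distance_m(diagonal_in, h_pixels=1920, v_pixels=1080):
    """Distance beyond which individual pixels subtend less than
    one arcminute (the usual 20/20 acuity rule of thumb)."""
    aspect = h_pixels / v_pixels
    # Screen width from the diagonal, via the aspect ratio
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / h_pixels
    one_arcmin = math.radians(1 / 60)
    # Small-angle approximation: distance at which a pixel spans 1 arcmin
    return pixel_in * 0.0254 / one_arcmin

# A 50" 1080p panel: pixels blend together beyond roughly 2 m,
# i.e. closer than most living-room seating positions.
print(round(min_retina_distance_m(50), 2))
```

By this estimate, a 50" 1080p set is already "retina" past about two metres, so a 4K panel of the same size only pays off if you sit unusually close.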
Compression, of course. Even for 4K 3D (sic!) (Score:4, Interesting)
A pity the submitter/editor did not research the topic further.
There are already standards (JPEG 2000 DCI) that can compress a 4K stream from about 5 Gbit/s down to 250 Mbit/s, which is much more manageable. There is at least one commercial vendor (intoPIX) that makes such hardware de/compressors.
If you want to stretch your imagination, start thinking about 3D movies in 4K, an obvious next step. That is 12 Gbit/s uncompressed, but 500 Mbit/s in normal transmission.
Oh, by the way - 8K is already being worked on. And 8K 3D (48Gbit/s uncompressed)...
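The figures in this comment are easy to sanity-check. A back-of-the-envelope sketch in Python; the 10-bit RGB (30 bits/pixel) and 24 fps assumptions are mine, chosen because they reproduce the quoted 12 and 48 Gbit/s numbers:

```python
def uncompressed_gbps(width, height, fps=24, bits_per_pixel=30, eyes=1):
    """Raw video bitrate in Gbit/s.
    Assumes 10-bit RGB (30 bits/pixel) at 24 fps; eyes=2 models 3D."""
    return width * height * fps * bits_per_pixel * eyes / 1e9

print(round(uncompressed_gbps(3840, 2160)))          # ~6, near the quoted ~5 Gbit/s
print(round(uncompressed_gbps(3840, 2160, eyes=2)))  # ~12 Gbit/s for 4K 3D
print(round(uncompressed_gbps(7680, 4320, eyes=2)))  # ~48 Gbit/s for 8K 3D
# Ratio down to the 250 Mbit/s DCI-style compressed stream
print(round(uncompressed_gbps(3840, 2160) * 1000 / 250))  # roughly 24:1
```

The exact baseline depends on chroma subsampling and bit depth, but the order of magnitude, and hence the "compression is necessary" conclusion, holds either way.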
Who Wants This? (Score:2)
Re: (Score:2)
I want it on my monitor. Who watches TV anymore anyway?
OK, not a very good point, but basically I can absolutely use more pixels. YouTube videos will still be crappy, but when I write, code or draw, I'll see much more :). I don't think the fact that those pixels require such large files is very relevant compared to the fact that economies of scale will yield much larger and better screens.
I am still bitter about the whole full HD scam -- for a scam it was: monitor resolutions went down. And in the 21st
Re:Who Wants This? (Score:5, Insightful)
In whose mind is 2K good enough for theatres? Speaking as a former motion picture projectionist who ran 35mm and 70mm film for almost 20 years, I can tell you the "quality" you get in a 2K auditorium is significantly inferior to what was delivered by a 35mm print, albeit with no jitter or weave. 4K cinematic presentations are actually quite good, even on a 40- or 50-foot screen, but I steadfastly refuse to see anything in a theatre that's shown in 2K. What's worse, most exhibitors run their 2K machines with the 3D lenses in place even when they're not showing 3D, cutting the available light in half. So what the vast majority of patrons experience in a movie theatre today is a dark, washed-out image with lower overall quality than they were seeing just 5 years ago. The only winners here are the companies that no longer have to ship 12,000 feet of film (for a 2-hour movie), which weighs about 40-50 pounds per print, to 2,000 screens -- and pay to ship it back again at the end of the run. The exhibitors also win, because they got the 2K machines for free and they don't have to employ skilled projectionists to run them either.
So yeah, I'll take 4K home presentation once the price comes down to the level that mere mortals can afford. I have a 53" Aquos screen now that's OK at 9' viewing distance but a 65" class screen at 4K and using HFR would rock my world once content becomes available.
My bet is that flat-panel manufacturers are quickly realizing that 3D in the home is a dud, and they'll concentrate their efforts on amping up 4K in the coming years, even though content will be quite minimal for a very long time. Since you'll never see anything more than 1080i or 720p from OTA broadcast (the 6 MHz channel size ain't changing any time soon), it'll only be a selling point for movies or DVDs of TV series. I don't know about everyone else, but 95% of what I watch is broadcast TV dramas, comedies and sports. I don't see the studios converting to shooting and editing in 4K in the foreseeable future, either.
Who cares about uncompressed size? (Score:2)
The only people who are going to care about uncompressed size are those who make movies, and movie theaters (I'm assuming theaters use uncompressed files, but I honestly don't know). And a movie-maker or theater won't have any problem with it; a simple drive array can cope with the bandwidth demands.
Just as few (any?) consumers ever get their hands on uncompressed 1080p, so it will be with 4K.
Unless I did the math wrong, it's only four times the size... hardly an insurmountable problem; not even an
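For what it's worth, the pixel arithmetic is quick to verify, taking 1080p as the baseline:

```python
# Raw pixel counts: 4K (3840x2160) versus 1080p (1920x1080).
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
# Double the width and double the height means four times the pixels,
# so uncompressed size scales by the same factor of four.
print(pixels_4k // pixels_1080p)
```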
Comment removed (Score:5, Interesting)
content content content (Score:2, Insightful)
HEVC (Score:2)
École Polytechnique Fédérale de Lausanne (EPFL) did a study to evaluate the subjective video quality of HEVC at resolutions higher than HDTV. The study was done with three videos with resolutions of 3840×1744 at 24 fps, 3840×2048 at 30 fps, and 3840×2160 at 30 fps. The five-second video sequences showed people on a street, traffic, and a scene from the open source computer animated movie Sintel. The video sequences were encoded at five different bitrates using the HM-6.1.1 HEVC encoder and the JM-18.3 H.264/MPEG-4 AVC encoder. The subjective bit rate reductions were determined based on subjective assessment using mean opinion score values. The study compared HEVC MP with H.264/MPEG-4 AVC HP and showed that for HEVC MP the average bitrate reduction based on PSNR was 44.4% while the average bitrate reduction based on subjective video quality was 66.5%.
High Efficiency Video Coding [wikipedia.org]
Do larger images compress better? (Score:2)
Do 4 times the pixels need 4 times the bandwidth? I would think larger blocks of solid colors, simple gradients, etc. would compress at a much higher ratio than smaller ones. Or do codecs still encode the same size of pixel blocks as the old standards?
As for digital artifacts, I find that applying a very light noise filter (artificial 'film grain') conceals obvious banding, blockiness, etc., improving perceived (but not actual) quality.
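On the block-size question: newer codecs do not keep the old block sizes. H.264 partitions frames into 16x16 macroblocks, while HEVC allows coding tree units up to 64x64, which is part of why it handles the large smooth regions of a 4K frame more efficiently. A quick comparison of per-frame top-level block counts:

```python
def blocks_per_frame(width, height, block_size):
    """Number of top-level coding blocks needed to tile one frame
    (partial blocks at the edges are rounded up)."""
    across = (width + block_size - 1) // block_size
    down = (height + block_size - 1) // block_size
    return across * down

# A 4K frame as H.264 16x16 macroblocks vs HEVC 64x64 CTUs:
print(blocks_per_frame(3840, 2160, 16))  # 32400 macroblocks
print(blocks_per_frame(3840, 2160, 64))  # 2040 max-size CTUs
```

Fewer, larger blocks mean less per-block overhead on flat areas, though the encoder can still subdivide where there is detail.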
Where have I heard this before? (Score:2)
So a repeat of the argument against HDTV. It took the media companies 10 years to catch up, and the same will happen again. Early adopters get what they deserve.
New possible storage media for transfer (Score:2)
I remember seeing articles about the use of a holographic storage medium with 500 GB potential http://www.crn.com/news/storage/217200230/ge-unveils-500-gb-holographic-disc-storage-technology.htm [crn.com]. Don't know if it will ever come around, but it would be a possible physical medium (assuming the read speeds were fast enough).
Well past the biological limit (Score:5, Insightful)
Use the panels for 1080P Passive 3D (Score:5, Insightful)
The more interesting step to me would be 1920x2160 panels for 1080p passive 3D. Right now passive 3D polarizes alternate lines, so at 1080p it is more like 1920x540 per eye, which the brain probably perceives as something like 1920x700. If no one makes a 1920x2160 panel, I presume it could be done with a 4K panel.
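The per-eye arithmetic in this comment is straightforward to sketch (the 1920x700 perceptual figure is the commenter's guess, not something computed here):

```python
def lines_per_eye(native_vertical_lines):
    """Line-interleaved passive 3D: alternate rows are polarized for
    each eye, halving the vertical resolution delivered per eye."""
    return native_vertical_lines // 2

print(lines_per_eye(1080))  # 540 -- today's passive 3D panels
print(lines_per_eye(2160))  # 1080 -- full 1080p per eye on a 2160-line panel
```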
Not so much resolution... (Score:5, Insightful)
Re:what about... (Score:5, Funny)
We've still got 636k to go then!
Re: (Score:2)
A lot of hi-def production looks terrible nowadays because it's too real; it looks like actors standing on a set.
You know what else looks like actors standing on a set? Live theater.
Re: (Score:2)
Don't blame the messenger - tell the people who make the shows!
If you get a crap makeup artist, it'll look like a stage show.
If you get crap actors, it will look like people standing around talking.
If you get a crap DoP, everyone will look boring.
Maybe viewer discretion should be advised?
Re: (Score:2)
Half the file size of the JM reference H.264 encoder released aeons ago.
About 0.9x the size of an x264 encode with mbtree enabled.
Re: (Score:2)
Not sure how true that is, but two things are immediately obvious:
- Not a lot of hardware would implement the latest and greatest, so something like a standard cable STB will indeed see a substantial reduction... particularly those still using old-school MPEG-2, which would see closer to 4x the efficiency and 4x the resolution (well, technically 4x the pixels, not resolution)
- H.265 HEVC is just a baseline implementation, and while many MPEG-2/MPEG-4 optimisations still apply, I don't think are includ
Re: (Score:2)
Re: (Score:2)
Yes, dramatically higher - but manageable by, for example, today's smartphones, so not really all that onerous. You'll just be at 100% of a weak CPU rather than 10%, which is fine for an STB!
Re: (Score:3)
I'm far more excited about OLED displays, because (A) even though they are still clinging to 1080p, they're pushing display technology forward in other ways, such as improved contrast and viewing angles, and (B) the fundamental technology of OLEDs is far more exciting than the same old LED-backlit LCD technology simply being scaled up for 4K televisions.
In other words... 4k isn't geek tech enough to be that exciting.