I dislike those charts claiming to be gospel. Real life testing shows otherwise:
http://www.hdtvtest.co.uk/news/4k-resolution-201312153517.htm
Only because you don't control the source. Trust me, those studies are very, very precise; but as precise as they are, they're not covering the nuances people are appreciating as "improvements".
I was trying to kill this early yesterday, hence all the "motherfucking" being used, which is not my style at all (and I apologize if I came across as cocky; in reality I'm not too eager to discuss this if I can avoid it). I know it always escalates to this, or to "I saw it in a store and it blew me away, like looking through a window".
I already touched on the biggest difference against a normal LCD, which is the final colors your eyes perceive. By Panasonic's own admission at this CES, their current flagship (WT600) falls roughly 20% short of the marketed "DCI 98%" coverage. It's not a bad LCD, but take a 10-step gradation of any sort and 20% means a lot, especially if you have something better to compare it to... LCDs have always had problems with color reproduction. Anyway, Triluminos and Sharp Quattron (not Quattron 4K, as I haven't seen those yet, nor seen reviews) are closer than that, but still not there in color reproduction (relative to DCI 98%).
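For a rough feel of what those coverage percentages mean, here's a back-of-envelope sketch of my own (nothing to do with how Panasonic actually measures it): gamut "coverage" taken as the ratio of CIE xy triangle areas, comparing the standard DCI-P3 primaries against a plain Rec.709 panel.

```python
# Rough sketch: approximate gamut "coverage" as the ratio of CIE xy triangle
# areas. This ignores the proper intersection/3D treatment; it's just to give
# a feel for the numbers being thrown around (DCI-P3 vs a Rec.709-ish panel).

def triangle_area(p):
    # Shoelace formula for a triangle given as [(x, y), (x, y), (x, y)]
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# Standard CIE 1931 xy chromaticities of the primaries
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # plain HDTV panel

coverage = triangle_area(rec709) / triangle_area(dci_p3)
print(f"Rec.709 triangle is ~{coverage:.0%} the area of DCI-P3")  # ~74% in this crude approximation
```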
Still, the comparisons are valid:
Triluminos vs regular LCD
It's quite silly to judge them through a regular LCD just the same, kinda like gasping at pictures of the black levels of a plasma... on an IPS LCD. The relative difference still comes through, which makes that somewhat possible, but it's not overly representative.
When a color can't be reproduced it falls back onto another, and that's a huge problem; the crappier a screen is, the more likely you'll see a big red blur and think it looks exactly as it should, despite the fact that it swallowed a bunch of different nuances you could actually see if the screen displayed them (this is different from dithering). No LCD out there actually hits the "DCI 98%" marketing term Panasonic penned last year, a spec that would mean that when a color can't match the originally intended one, the error never amounts to more than a 2% variation on a calibrated screen.
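To make the "falls back onto another" point concrete, a tiny hypothetical sketch of my own (hard clipping standing in for whatever gamut mapping a real TV does): two different intended colors outside the panel's range collapse to the same displayed value, and the nuance between them is simply gone.

```python
import numpy as np

# Hypothetical sketch: two *different* intended colors, both more saturated
# than the panel can produce. A simple gamut "fallback" (hard clip to the
# panel's 0..1 range) maps them to the SAME displayed color, so whatever
# nuance separated them is gone. Real TVs do smarter gamut mapping, but the
# information loss is the same idea.

intended_a = np.array([1.15, 0.10, 0.05])   # deep saturated red, out of range
intended_b = np.array([1.30, 0.10, 0.05])   # an even deeper red

displayed_a = np.clip(intended_a, 0.0, 1.0)
displayed_b = np.clip(intended_b, 0.0, 1.0)

print(displayed_a, displayed_b)                      # both become [1. 0.1 0.05]
print("distinguishable on screen:", not np.allclose(displayed_a, displayed_b))
```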
With 4K you have a 4:1 pixel ratio versus 1080p to make up for it (provided you use a lookup table to create macroblocks representing each and every "missing" color out of solid pixel values, plus natural grain and oscillations to further mask it in native 4K). It's a tech better suited to benefiting crappy LCDs than proper good screens with good color accuracy, like plasmas or OLEDs, not least because pixels in plasma and OLED are nowhere near as cheap as they are on a TFT as of now. Had it not been for this we might have had flagship OLEDs ready for market sooner; yields would be better for the simple reason that there would be fewer pixels per panel.
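Here's a toy version of that macroblock idea, purely my own illustration (no vendor's actual LUT): one 1080p pixel becomes a 2x2 block of 4K pixels, and mixing the two nearest displayable levels inside the block lets the block average land on a value the panel can't show as a solid color.

```python
import numpy as np

# Toy sketch of the 4:1 idea: one 1080p pixel becomes a 2x2 block of 4K
# pixels. If the panel can only hit a coarse set of levels per channel, mix
# the two nearest displayable levels inside the block so the block's
# *average* approximates the intended value.

DISPLAYABLE = np.linspace(0.0, 1.0, 9)   # pretend the panel only hits 9 levels

def macroblock(target):
    """Return a 2x2 block whose mean approximates `target` (single channel)."""
    lo = DISPLAYABLE[DISPLAYABLE <= target].max()
    hi = DISPLAYABLE[DISPLAYABLE >= target].min()
    if hi == lo:
        return np.full((2, 2), lo)
    # how many of the 4 sub-pixels should take the high level
    n_hi = int(round(4 * (target - lo) / (hi - lo)))
    block = np.full(4, lo)
    block[:n_hi] = hi
    return block.reshape(2, 2)

blk = macroblock(0.30)                 # 0.30 is not one of the 9 raw levels
print(blk)
print("block average:", blk.mean())    # 0.3125, closer than either raw level
```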
I've told you I've already been everywhere we're going here, so I'll proceed: the other "advantage" of 4K is, drumroll, natural supersampling.
In a no-AA console game environment 1:1 1080p means seeing this:
If you pull 4K/2160p and keep your original viewing distance, though, you'll get something closer to this:
(note: that's multisampling 8x, whereas 4K/2160p would be the equivalent of 4x)
It's not that you can take in more detail; that's a fallacy. You're not taking all that detail in, for the very simple reason that pixel by pixel it really didn't change, it just increased four-fold, and your eyes are doing the rest, because it surpassed their ability to distinguish between pixels - at the ideal distance.
Thing is, pixels are really not the way we perceive things, just like frames are not "it", and hence a natural image will never be composed of rectangles. The point is achieving the degree of softness you expect just so you don't notice, kinda like frames in games: nobody bitches that a 24 fps movie is a slideshow, but we'll bitch to no end about a videogame. The reason is simple: in a movie we're seeing still images, 24 of them, instead of 24 moving ones. Of course the ramifications of what I'm saying are huge and I don't intend to go there in detail; everyone can connect the dots once pointed in the right direction.
Fact is, pixels without AA are of substandard quality, but pushing pixels in quantity is still the single most expensive thing to do on a graphics chip, and hence it's a better tradeoff to pull 1080p looking like 4K/2160p at the ideal viewing distance than to render 2160p just so you downsample it anyway (and if you get close you'll see the same crappy image quality the original game picture showed). If you must ask, though: 2160p and 2K with better pixels won't appear the same, because one is filtering the edges, making them softer, while the other is not doing anything and will thus appear slightly sharper.
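For the "natural supersampling" part, a minimal sketch of what the downsample amounts to, assuming a plain 2x2 box filter: averaging a 2160p render down to 1080p is exactly 4x ordered-grid supersampling, and a hard aliased edge picks up the intermediate values AA would otherwise provide.

```python
import numpy as np

# Sketch of the "natural supersampling" claim: rendering at 2160p and
# averaging every 2x2 block down to 1080p is 4x ordered-grid SSAA. Toy example
# on a hard black/white edge: the full-res image only has 0s and 1s, the
# downsampled one gains intermediate values along the edge, i.e. the softness
# you'd otherwise get from AA.

h, w = 8, 8                                   # tiny stand-in for a 2160p render
hi_res = np.zeros((h, w))
for y in range(h):                            # a jagged diagonal edge, no AA
    hi_res[y, : y + 1] = 1.0

# box-filter downscale: mean of each non-overlapping 2x2 block
lo_res = hi_res.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print(hi_res)   # only 0.0 and 1.0 -> stair-stepping
print(lo_res)   # 0.25 / 0.5 / 0.75 values appear along the edge
```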
I could make it so that my game, whilst being merged down via supersampling, came out sharper, because the only way to properly pull off sharpening is to sharpen the higher resolution source and then merge it down. The results could then be impossible to tell apart.
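A rough sketch of that "sharpen the higher resolution source, then merge it down" idea, with a generic unsharp-mask kernel and box downscale standing in for whatever a real mastering chain uses; the only point being made is that the order of the two operations changes the result.

```python
import numpy as np
from scipy.ndimage import convolve

# Sketch: sharpen a 4x-size image, then box-downscale by 2 in each axis,
# versus sharpening the already-downscaled image. The two do not commute.

sharpen_kernel = np.array([[ 0, -1,  0],
                           [-1,  5, -1],
                           [ 0, -1,  0]], dtype=float)   # simple sharpen filter

def box_down2(img):
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(0)
hi_res = rng.random((16, 16))                  # stand-in for the higher-res master

sharpened_then_down = box_down2(convolve(hi_res, sharpen_kernel))
down_then_sharpened = convolve(box_down2(hi_res), sharpen_kernel)

# Non-zero: sharpening before or after the merge gives different pictures,
# which is why where the sharpening happens in the chain matters.
print(np.abs(sharpened_then_down - down_then_sharpened).mean())
```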
This also applies to films, because studios will try to sell 4K content as being sharper even when it's only artificially sharpened,
which is why Sony is pulling these 2K Blu-rays, ironically advertised to sell 4K TVs, whose difference, apart from a very good encode job, is that they were sharpened at 4K and then merged down to 2K (the link has a comparison going on). The difference? Slightly more sharpening going on, and ironically they benefit 2K TVs more than they do 4K ones; especially since, if you had the aforementioned mastering (sharpened or not) running on a 4K set and pitted it against a 2K one, 99.9% of people would see no difference provided I can control the source.
Hence: 4K Blu-ray is actually exciting... for 2K TVs, because while downsampling it I can control the sharpness of the source (hence the difference will effectively be 0 on a cutting-edge TV of today) and attain something close to 4:4:4 reproduction... on a perfectly fine 2K set (the same can't be said at 4K, where 4:2:0 will keep being the standard). But it's not better than proper mastering (sharpening on the original higher resolution source has been done loads of times in the past too; it always worked to some extent, but now we've reached diminishing returns).
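And a quick sketch of that 4:2:0 point, using made-up planes just to show the resolutions involved: a 4K 4:2:0 frame already carries 1920x1080 chroma, so downscaling only the luma leaves you with one chroma sample per pixel at 1080p, i.e. effectively 4:4:4.

```python
import numpy as np

# Sketch of why a 4K 4:2:0 source can land as (near) 4:4:4 on a 1080p set:
# in 4:2:0, chroma is stored at half resolution in each axis, so a 3840x2160
# frame already carries 1920x1080 chroma planes. Downscale only the luma by 2
# and you end up with one chroma sample per output pixel.

Y_4k  = np.random.default_rng(1).random((2160, 3840))   # full-res luma
Cb_4k = np.random.default_rng(2).random((1080, 1920))   # 4:2:0 chroma planes
Cr_4k = np.random.default_rng(3).random((1080, 1920))

# box-filter the luma down to 1080p; the chroma needs no resampling at all
Y_1080 = Y_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(Y_1080.shape, Cb_4k.shape, Cr_4k.shape)   # all (1080, 1920) -> 4:4:4
```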
This changes nothing; 4K is the next big thing, and you'll have dudes swearing their desktop screen is so much better for being 4K while taxing a top-range graphics card for no good reason to churn out pixels they won't see anyway (and at some point they'll supersample or antialias those pixels they can't discern anyway). But one might as well understand why that is; this is a tech forum, after all.
All of us will have 4K screens eventually, if anything because quality TV sets will move there fast. That's also not necessarily a bad thing, as there are some advantages, minor ones I insist - like if I need to plug in and use said TV as a working PC screen (not respecting the ideal viewing distance for a TV, which we often don't do on PCs anyway, because the point is not seeing the whole screen at once but focusing on parts of it). But right now 4K means the death of plasma and good OLED yields being pushed further out. It only benefits LCD, and that means it's a huge setback.