Digital Foundry: Will 8K Gaming Ever Become Relevant?

Short answer: No

Long answer: Nooooooooooooooo.
 
4K is overrated, so 8K seems like a giant waste of everyone's time and money. Just another buzzword thrown into the PR machine. 8K will also probably contribute to even longer and costlier game development, considering how fucking long it takes to render anything in 4K.
 
If I can play = relevant.

You're either purposefully or unintentionally missing the point of the question.

The question isn't whether or not it's possible (which obviously it is and has been for years). The question is whether it will ever be relevant in the greater gaming landscape.
 
Of course it will. Tech progress isn't going to stop. It'll just take a while. Probably much longer than 4K took. But once the tech is cheap enough to have 55" 8K TVs for $400, people will start picking them up.
 
The greater gaming landscape makes zero difference to me if I can play right now.
 
Given that most games have to sacrifice performance when activating native 4K, the question is a bit of a joke right now. Even if people had 8K TVs, gaming would take like 2-3 gens to catch up.

So, no.
 
It's completely pointless and a buzzword right now.

Only in the distant future when that's the norm. Everyone will be focused on the next buzzword by then, though.
 
Even Sony abandoned the 8K push with their latest TV lineup. Nobody gives an actual dump about 8K except a small contingent of super nerds invoking "my smartphone camera has more megapixels" to a laughing audience.
 
We can't even pull off 4K60 without a seriously overpriced gaming rig for modern AAA games, and even then it isn't possible for every game. 4K is already sharp enough, so what would be the advantage of 8K?
 
Just look for 16K gaming on YouTube.
Yes, it would just require a multi-monitor setup to display, as there are no 16K monitors. You can downsample, of course, but that's rather pointless.

So yes, 16K gaming is possible, but nobody is going to seriously use it.
 
The problem is diminishing returns. For relatively normal-sized TVs (let's say 60"), going from 1080p to 4K is a huge difference, but from 4K to 8K, it's very hard to spot even at close distance. So why bother? Lol.
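To put rough numbers on the diminishing-returns point, here's a quick back-of-the-envelope sketch in Python, assuming the commonly quoted ~1 arcminute resolution limit for 20/20 vision (real perception also depends on contrast, motion and the individual eye; the screen size and distance below are just illustrative):

```python
import math

# Angular width of a single pixel, in arcminutes, for a 16:9 display.
def pixel_arcmin(diag_in, h_pixels, distance_ft, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # 16:9 -> width ~ 0.872 * diagonal
    pitch_in = width_in / h_pixels                       # width of one pixel
    return math.degrees(math.atan2(pitch_in, distance_ft * 12)) * 60

# 60" TV viewed from 8 feet (a typical living-room setup):
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(name, round(pixel_arcmin(60, px, 8), 3))
# 1080p ~ 0.98 arcmin, 4K ~ 0.49, 8K ~ 0.24: 4K is already at or below
# the nominal 1-arcmin acuity threshold here, and 8K is far below it.
```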
 
This is the same nonsense as "people can only perceive 30fps".

No, it isn't the same at all. You quickly run into the limits of human biology, living room size and budget. More likely all three at once.


Most people are not even taking full advantage of their 4K screens, let alone 8K.

The 8K difference will be the exact same story as the people who swear up and down that they can hear the difference between 192kHz and 44.1kHz music (i.e. a 22.05kHz Nyquist limit, since a sample rate can only capture frequencies up to half that rate) - despite the audible range in humans maxing out at 20kHz. And that's in newborns; it declines to around 16kHz by your twenties.
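For the curious, a minimal numpy sketch of why the extra sample rate buys nothing audible, using an arbitrary 15kHz test tone over a short window (per Nyquist-Shannon, 44.1kHz already represents everything below 22.05kHz exactly):

```python
import numpy as np

dur = 0.01          # 10 ms window
f_tone = 15_000.0   # 15 kHz tone: near the top of human hearing

def sampled(rate):
    t = np.arange(0.0, dur, 1.0 / rate)
    return t, np.sin(2 * np.pi * f_tone * t)

t_hi, x_hi = sampled(192_000)  # "hi-res" reference grid
t_lo, x_lo = sampled(44_100)   # CD-quality capture

# Whittaker-Shannon (sinc) reconstruction of the 44.1 kHz samples,
# evaluated on the 192 kHz time grid.
T = 1.0 / 44_100
x_rec = np.array([np.sum(x_lo * np.sinc((t - t_lo) / T)) for t in t_hi])

# Compare away from the window edges: the small residual comes from the
# finite window, not from the lower sample rate.
mid = slice(len(t_hi) // 4, 3 * len(t_hi) // 4)
print(np.max(np.abs(x_rec[mid] - x_hi[mid])))
```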

4K is the end game for video content. No company or consumer is going to reasonably invest so much money into such a minuscule gain nobody can even see.

At the recent 2020 Hollywood Professional Association Tech Retreat, Michael Zink, vice president of technology at Warner Bros., presented "Tested Perceptual Difference Between 4K & 8K," a double-blind visual perception experiment to gauge whether 8K offered a better quality experience than 4K on an 88" 8K OLED consumer display. 139 participants rated seven different 8K and 4K clips from three different viewing distances, and the study determined that 8K "did not result in significantly improved visual difference" in the testing environment.
 
With modular MicroLEDs (and possibly/hopefully QDEL too) built from seamlessly joined wafers under a flexible glass panel, I suspect we'll see 8K on 80"+ sets just by way of using more of the same wafers to reduce costs and simplify manufacture. Regardless of content, they'll just use good scaling. See my old post below:

When MicroLED and QDEL become a thing, and assuming they take the modular approach of joining lots of fine-tolerance, bezel-less mini panels under a monolithic flexible glass substrate, I could see them using just a few different mini panel ["wafer"] sizes for better production efficiency, with larger displays simply having an 8K default as a consequence. All displays could take all inputs up to 8K and just use high-quality scaling where applicable.

E.g.

5.2" 216p Wafers
26"
| 1080p via 5 x 5 @ 384x216p 5.2"
52" | 2160p via 10 x 10 @ 384x216p 5.2"
104" | 4320p via 20 x 20 @ 384x216p 5.2"

6.4" 216p Wafers
32"
| 1080p via 5 x 5 @ 384x216p 6.4"
64" | 2160p via 10 x 10 @ 384x216p 6.4"
128" | 4320p via 20 x 20 @ 384x216p 6.4"

8.0" 432p Wafers
40"
| 2160p via 5 x 5 @ 768x432p 8.0"
80" | 4320p via 10 x 10 @ 768x432p 8.0"

That way the cost per inch won't rise as steeply (though the chassis, additional electronics, shipping and packaging will still add a small premium) and the scaling of pixels relative to screen size would be more linear. Plus, creating just 2-3 smaller "wafers" would be far more economical than a bunch of different large screen sizes with different pixel densities. If your image scaling is good enough, you could theoretically reduce it down to just 1 or 2 wafer sizes/densities and use intermediate resolutions.
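A tiny sketch of that tiling arithmetic, using the hypothetical wafer sizes from the table above (not real parts): an n x n grid of same-aspect wafers scales the diagonal and the resolution linearly together.

```python
# Diagonal and resolution of an n x n tiling of identical wafers.
def tiled_panel(wafer_diag_in, wafer_px, n):
    w, h = wafer_px
    return wafer_diag_in * n, (w * n, h * n)

for n in (5, 10, 20):
    print(tiled_panel(5.2, (384, 216), n))
# (26.0,  (1920, 1080))  -> 26"  1080p
# (52.0,  (3840, 2160))  -> 52"  4K
# (104.0, (7680, 4320))  -> 104" 8K
```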

...

Outside traditional displays, VR/AR [in the long term] will probably need to go out to >12Kx12K per eye for an effectively perfect image.

Even though I want regular games to target 4K with good scaling from roughly half-res and use that horsepower on FX, perf and simulation, there are plenty of rudimentary games out there that will have loads of headroom, so I have no aversion to those cranking out as many pixels as possible. I also expect 1440p-2160p native titles will scale nicely to 8K with future AI/ML scalers. It may not add a whole lot of detail at most display sizes in most conditions, but it does lend a greater sense of presence, not to mention an intrinsic AA.

On the display side again, there's also the issue of light output: it might make more sense to decimate those pixels/subpixels down further and run more units at a lower light output, rather than one larger unit going full-tilt, then group them in a lens array.

I think 8K will happen just as a matter of course, even if most content remains scaled to it, and I think it's enough overkill to assure perfection in any conventional setup, should the content or scaling be up to the task. 4K is well along the curve of diminishing returns, but I don't agree that the returns diminish entirely at that point; I think that happens somewhere in between, around 5-7K. I've seen high-quality 4K content on 4K displays and 8K content on 8K displays... again, even though it doesn't feel more detailed, it's as if the display stops existing between me and the content; it has a presence more akin to looking through a window.

I believe 8K truly is the end of the road in raw resolution for large, conventional displays in conventional settings; it'll just slowly creep up on us as a matter of course rather than being some big marketable buzzword.

After this, though, and beyond brightness increases (which can also indirectly help with motion issues, colour accuracy, efficiency and lifespan), I'd like to see the industry look at adding more primaries: 4, 6 or 9. RGB/BT.2020 can only present ~70% of human vision, and the only way to increase that is to move beyond the triangle. Film scanners and many cameras can already technically do it with minimal updates, the workflow should be straightforward to update, and on the display side it's just a case of more subpixels. Max light output, number of primaries and panel bit depth are the last three things left to solve for the spatial image in conventional displays.

See: https://6pcolor.com/faqs/
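If you want to sanity-check the gamut claim, here's a small sketch using the published BT.709/BT.2020 xy chromaticity primaries (it compares the two triangles only; the ~70% figure is measured against the full spectral locus, which isn't computed here):

```python
# Area of an RGB gamut triangle in CIE 1931 xy chromaticity space
# (shoelace formula).
def tri_area(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published R, G, B primaries (x, y).
bt709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, a2020 = tri_area(*bt709), tri_area(*bt2020)
print(a709, a2020, a2020 / a709)  # ~0.112, ~0.212, ~1.9x
# Any three-primary triangle still misses the curved part of the
# spectral locus - hence the case for 4+ primaries.
```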
 
For a person with 20/20 vision sitting 10 feet away, one would need about a 75-inch display diagonal for HD, 120 inches for 4K, and a whopping 280 inches for 8K to be able to distinguish the resolution!
Absolute and utter bullshit, as anyone who's actually been able to compare knows.
Have you ever actually seen a 120" screen with a 4K pixel matrix? It's not pretty. Pretty atrocious, actually, if you don't have a very large viewing distance.

And that's before we've even talked about monitors, where the PPI is much, much more "in your face" due to the shorter viewing distance.
I haven't seen 16K devices yet, but the difference between 4K and 8K is very much perceivable on 77"+ devices at about 3m, and on our designer's monitors even more so.
 
It will when it becomes a simple task for the hardware & a cheap upgrade for your TV.

Things kinda went crazy in the last 4 years, so prices went up on things that were supposed to go down, but don't be fooled: 8K will be a thing in the next few years.


Do I need to remind you all of how you acted towards 4K before PS4 Pro was revealed?
 
At some point the gains in visual fidelity will be so marginal they won't be a topic of conversation. Maybe the gens after next, in 2030/2040, will be hitting 8K, but by then new tech will probably have changed what we think of as screens and resolution. Then again, it could become an argument of 8K versus 600fps.
 
That depends on the content.

The next consoles will probably finally get stable 4K, many high-end PC gamers barely give a shit about 4K, and devs won't want to waste resources on 8K output.

8K is a long way off, but it's only a matter of time.
 