Digital Foundry: Will 8K Gaming Ever Become Relevant?

The main advantage I've heard for 8k is that you can use it for integer scaling of 720p, 1080p, 1440p, and 4k so they don't look blurry. (But then again my 4k OLED doesn't even do 1080p integer scaling, so 1080p content looks blurry even though it shouldn't.)
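For illustration, here's a quick sketch (my own arithmetic, nothing official) of which common resolutions divide evenly into a 4k panel versus an 8k panel:

```python
# A rough sketch (my own arithmetic): which common resolutions divide
# evenly into a 4k panel vs an 8k panel, i.e. where integer scaling works.
TARGETS = {"4k": (3840, 2160), "8k": (7680, 4320)}
SOURCES = {"720p": (1280, 720), "1080p": (1920, 1080),
           "1440p": (2560, 1440), "4k": (3840, 2160)}

for target_name, (tw, th) in TARGETS.items():
    for source_name, (sw, sh) in SOURCES.items():
        if (sw, sh) == (tw, th):
            continue  # same resolution, nothing to scale
        # Integer scaling only works if both axes divide evenly.
        fits = tw % sw == 0 and th % sh == 0
        factor = f"{tw // sw}x" if fits else "no"
        print(f"{source_name:>6} -> {target_name}: {factor}")
```

8k comes out as the first panel that integer-scales all four (6x, 4x, 3x, 2x), while 4k only manages 720p and 1080p.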
 
It will when it becomes a simple task for the hardware & a cheap upgrade for your TV.

Things kinda went crazy in the last 4 years, so prices went up on things that were supposed to go down, but don't be fooled: 8K will be a thing in the next few years.


Do I need to remind you all of how you acted towards 4K before PS4 Pro was revealed?
Not anytime soon. A 4090 is probably around the power of what we can expect from next gen (a bit higher or lower), and nobody is seriously using that to game at 8k. And 8k TVs have dropped in price quite a bit; you can pick them up for as low as $1499 these days. The problem is the lack of 8k media; broadcasting, internet streaming, and physical media all have significant challenges to overcome with regards to 8k content.

Maybe 10 years from now it will be more relevant.
 
It will when it becomes a simple task for the hardware & a cheap upgrade for your TV.

True. But the caveat to that is that the reduction in costs over time requires demand to get the ball rolling. There was a strong natural demand for 4k because 1080p is muddy on larger displays; that demand allowed economies of scale to help lower the cost of the TVs along with technological progression. If 8k is a lot less noticeable to viewers (haven't bothered to compare myself) at normal sizes and viewing distances, then you may never get enough demand.

Good point @Zathalus about needing video content with the resolution and bit rate at a level that makes it worthwhile as well.
 
Absolute and utter bullshit as anyone who's actually been able to compare knows.....
Have you ever actually seen a 120" screen with a 4k pixel matrix? It's not pretty... pretty atrocious, actually, if you don't have a very large viewing distance.

And that is before we've even talked about monitors where the PPI is much much more "in your face" due to the shorter viewing distance....
I haven't seen 16k devices yet, but the difference between 4k and 8k is very much perceivable on 77+" devices at about 3m....

Yes, I'm sure the formulas for calculating viewing distances devised by Lechner and THX, which are used by home theater specialists everywhere, are "utter bullshit".

How far were you seated? Is your eyesight better than average? Both of these can radically change the result. For many people, all it can take is one additional step back from the screen to make the difference indistinguishable. I guess you missed the part about the double-blind tests where 139 participants couldn't see a significant difference.
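To put rough numbers on how much seating distance matters, here's a back-of-the-envelope sketch using the common 1-arcminute (20/20 acuity) rule of thumb, not the exact THX/Lechner formulas:

```python
import math

# Back-of-the-envelope sketch using the 1-arcminute (20/20 acuity) rule of
# thumb, NOT the exact THX/Lechner formulas: how close do you have to sit
# before individual pixels are resolvable at all?
def max_resolving_distance_m(diagonal_in, horizontal_px, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_in = width_in / horizontal_px                      # width of one pixel
    # Distance at which one pixel subtends 1 arcminute, converted to metres.
    return pixel_in / math.tan(math.radians(1 / 60)) * 0.0254

for px, label in ((3840, "4k"), (7680, "8k")):
    d = max_resolving_distance_m(65, px)
    print(f'65" {label}: pixels only resolvable within ~{d:.1f} m')
```

On a 65" panel that works out to roughly 1.3 m for 4k and about half that for 8k; one step back past those distances and a 20/20 eye can no longer resolve the extra pixels.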

There is also no point even asking someone if they've ever seen a 120" screen - the home is not a tech showroom like CES. Sure, if you step all the way up to a 120" screen you might be able to pick apart flaws. But nobody is a) ever going to do this in a normal viewing scenario and b) ever going to have enough physical space in the home to fit in a TV of that caliber (and likely also c) ever have enough money to afford it).

This is why I said:

You quickly run into the limits of human biology, living room size and budget.

Let's not forget that a typical 'good' home cinema setup looks something like this:

[image: a typical living-room home cinema setup]


Where the TV is miles away from the couch. We don't even need to refer to THX viewing formulas or viewing angles or whatnot, because your average consumer, like in the pic above, is watching their TV from outside the stratosphere by comparison.

... and it's this same average consumer who all of these companies need to convince to invest in the 8K difference!

It's just not happening.
 
No, it isn't the same at all. You quickly run into the limits of human biology, living room size and budget. More likely all three at once.



Most people are not even taking full advantage of their 4K screens, let alone 8K.

The 8K difference will be the exact same story as the people who swear up and down that they can hear the difference in 192 kHz music vs 44.1 kHz (i.e. a 22.05 kHz Nyquist ceiling) - despite the audible hearing range in humans maxing out at 20 kHz. And that's in newborns; it declines to around 16 kHz by your twenties.

4K is the end game for video content. No company or consumer is going to reasonably invest so much money into such a minuscule gain nobody can even see.

Utterly false. I can easily spot differences in 1080p vs. 4K on a 55" OLED from 10 feet away, let alone at 75" and above.

Regarding gaming, consoles will follow 8K penetration. Since that seems to be very low, I don't think it is a consideration outside of symbolic "8K" support.
 
I'm still the only one in my group of friends who even has a 4K TV, and I always play in performance mode anyway, which never really runs at 4K.
I bought my first 4K TV eight years ago, for the PS4 Pro launch. Eight years, and I still don't see widespread adoption of that resolution.
I'm sure at some point 8K may become relevant, but I don't see it happening anytime soon, let alone this decade.
 
Yes, I'm sure the formulas for calculating viewing distances devised by Lechner and THX, which are used by home theater specialists everywhere, are "utter bullshit".
Yes they are, because 99% of people buying big TVs don't start doing math before positioning them.
It's theoretical bullshit.

And ofc you're ignoring the elephant in the room altogether: monitors.
 
Consoles already jumped to wasting GPU power on 4K too early; I hope they don't push this bullshit any time soon. With enough time they'll eventually circle back to it, yeah, but we're nowhere near tapping out 4K, and the benefits of more resolution are diminishing.
 
I'm a tech enthusiast and always have been. Buying the highest end PC, TV, accessories, etc.. that I can afford for decades at this point. I jumped in to blu-ray day one. I jumped into 4K UHD very early on. I own a 77" OLED. My PC is powered by an RTX 4090.

I'll be the first to tell you that 8k is pointless for 99% of consumers. It offers no real-world benefit in the vast, vast majority of use cases.


And all that rendering power spent rendering at 8k would be so catastrophically wasted instead of being used for improved lighting/animations/detail/volumetrics/etc., etc.
 
The same people in this thread who tell you 4k is overrated are the same who thought HD didn't matter in 2006.
The difference between SD and HD was huge, but the difference between 1440p and 4k on my living-room 55" TV is negligible.
 
This is actually a good example of diminishing returns (for once), because it's tied to something that probably won't change for most people: the sizes of our TVs and monitors.
 
I am glad I saw this video from HDTVTest when it was originally released in 2018, when the first consumer 8K screens started coming out. It is more about the TV screen and broadcast technology, but it helped to put things into perspective and not have any false expectations.

Main points raised were:
  • as the average size of consumer TVs keeps growing every year, it makes sense that resolutions will also increase
  • 8K resolution will be relevant for screens sized 70" and up
  • BBC and NHK, who are pushing the 8K TV broadcast format, expect it to become more relevant by the end of this decade; until then we are just paving the way for mainstream adoption
 
Gaming is just modern demanding titles?
No? But people with a 4090 and an interest in 8k would usually want to play games that have more demanding graphics. If all you are interested in is older or less demanding titles, then a 3060 is sufficient really, even at 8k. But even then I'd say 4k240 would be a far better choice than 8k60.

Look, if you are really interested in 8k then go for it, but it's not going to have any real relevance for the wider PC gaming space, nor would you really get a meaningful difference in IQ for what requires an insane amount of rendering budget.
 
Do people really think that in the year 2040+ we are still gonna be playing games at 1440p/4k? By that time 8k will probably be standard.
 
Right now of course it's pointless. In 10-15 years I don't see why 8K couldn't happen. Hell, we would be at crazy PC specs and PS7 by then.
 
Yes they are, because 99% of people buying big TVs don't start doing math before positioning them.
It's theoretical bullshit.



I mean, you're correct that 99% of people buy a big TV and set it up without much thought.

But that doesn't change the scientific underpinning of why they wanted to invest in a new TV to begin with. Ask someone why they wanted a 4K TV and they will tell you it's because it's 1) larger and 2) looks sharper. The first point I've already gone over: the physical and budget limitations. The second: they actually have to see the difference for it to be a selling point.

There is one piece of math they will do - whether the enormous asking prices that 8K demands over 4K are worth it when they're staring down two TVs in the showroom and can't actually tell one from the other, like 139 people concluded in the Warner Bros research. And in all likelihood it will be the panel quality and HDR brightness that will sway the purchase of one over the other, not resolution.

And ofc you're ignoring the elephant in the room altogether: monitors.

Monitors are a somewhat different story, and I'll say there are advantages above 4K there. Whether that will justify 8K is another matter. At most I could get behind an interim resolution like 5K given the closer viewing distance. This is a fairly good article on it:


Greater resolutions allow you to 'get away' with sitting closer to larger monitor sizes, so to speak. At some point though you are simply sitting too close to even take in everything on screen. The reason why things like THX measurements are useful and aren't just "theoretical bullshit" is that they also take into account viewing angle. Yeah, you can make the argument that with 8K you can do this:

8k is totally relevant...

for this dude:
[image]


Assuming you're at least somewhat normal, outside of VR it's pretty useless.

But then you're missing out on tons of screen real estate around the periphery of your vision. And once you end up sitting far enough away from the monitor to correct that, then you start losing the resolution benefits! Get far enough away and you may as well just get a TV at that point.
 
No? But people with a 4090 and an interest in 8k would usually want to play games that have more demanding graphics. If all you are interested in is older or less demanding titles, then a 3060 is sufficient really, even at 8k. But even then I'd say 4k240 would be a far better choice than 8k60.

Look, if you are really interested in 8k then go for it, but it's not going to have any real relevance for the wider PC gaming space, nor would you really get a meaningful difference in IQ for what requires an insane amount of rendering budget.
Why not both?

And if the IQ gain is minimal, then why do we still need anti-aliasing at 4k?
 
Do people really think that in the year 2040+ we are still gonna be playing games at 1440p/4k? By that time 8k will probably be standard.

We're still listening to most music at 44.1 kHz, and that standard came along in the early '80s with CD.
 
Why not both?

And if the IQ gain is minimal, then why do we still need anti-aliasing at 4k?
As I said, 4k240 would be better than 8k60 in most situations. Buying an 8k monitor to get a minimal fidelity increase on older titles (or zero fidelity increase, depending on monitor size, eyesight, and viewing distance) while not being able to use it on the latest titles is mostly pointless.

As to why we still need AA? To address shimmering and pixel edges, which is solved far better with DLAA than by upping the resolution to 8k (which will still have shimmering and pixel edges anyway).
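For context on the 4k240 vs 8k60 trade-off, a quick back-of-the-envelope check (my own arithmetic, not from the post above): both modes push the same raw pixel throughput, so it's purely a question of spending it on motion clarity or on spatial resolution.

```python
# Quick arithmetic check: 4k at 240 Hz and 8k at 60 Hz push exactly the
# same number of raw pixels per second.
modes = {"4k @ 240 Hz": (3840, 2160, 240), "8k @ 60 Hz": (7680, 4320, 60)}

for name, (w, h, hz) in modes.items():
    gpix_per_s = w * h * hz / 1e9
    print(f"{name}: {gpix_per_s:.2f} gigapixels/second")
```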
 
We're still listening to most music at 44.1 kHz, and that standard came along in the early '80s with CD.
Well, that's because humans can't hear frequencies above 20 kHz, and you need to sample the sound at twice that frequency according to the Nyquist theorem to be able to reproduce it perfectly in digital form.

So for playback to humans there's no advantage to a higher sample rate. A bit simplified, but that's pretty much it. So it's not a very relevant comparison in this case.
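For anyone who wants the arithmetic spelled out, a tiny sketch (my own illustration) of the Nyquist ceiling for common sample rates against a ~20 kHz hearing limit:

```python
# Tiny illustration of the Nyquist limit: a sampled signal can only
# represent frequencies up to half the sample rate.
HEARING_CEILING_HZ = 20_000  # roughly the upper limit of human hearing

for sample_rate in (44_100, 48_000, 96_000, 192_000):
    nyquist = sample_rate / 2
    surplus = nyquist - HEARING_CEILING_HZ
    print(f"{sample_rate / 1000:>5.1f} kHz sampling -> {nyquist / 1000:.2f} kHz "
          f"ceiling ({surplus / 1000:+.2f} kHz beyond audible range)")
```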
 
I don't think it will be relevant anytime soon, unless people start buying 100+ inch televisions. I have a 65'' 4k TV and I can barely distinguish between 1080p and 4k from 3 meters away; I need to sit close to even see a difference.
 
I mean, you're correct that 99% of people buy a big TV and set it up without much thought.
And yet you tried to paint the "mathematically correct setup" as the one and only truth when it's simply irrelevant in private households... Talk about misplaced sarcasm.

But that doesn't change the scientific underpinning of why they wanted to invest in a new TV to begin with. Ask someone why they wanted a 4K TV and they will tell you it's because it's 1) larger and 2) looks sharper. The first point I've already gone over: the physical and budget limitations. The second: they actually have to see the difference for it to be a selling point.
Which you absolutely can if you are either near enough or the screen is big enough... given you have good enough quality viewing material and not just some crappily encoded stream. The difference is much smaller than it was from FHD to UHD, but still visible.

There is one piece of math they will do - whether the enormous asking prices that 8K demands over 4K
As if price wasn't subject to change... Remember how expensive the first 4k or OLED displays were?

, like 139 people concluded in the Warner Bros research.
Probably the same people that couldn't tell the difference between 30 and 60+ fps and are happily watching crappy 720p TV on their 4k setups, people like my mother. That's not a standard, that's anecdotal evidence...


Switch out the 70"+ screen sizes we've been talking about here for 50-something and I swear this is exactly the same discussion I had back when 4k started to establish itself.
"You can't see a difference if you're 50m away bro", "but but the viewing guidelines say so bro"...


But then you're missing out on tons of screen real estate around the periphery of your vision. And once you end up sitting far enough away from the monitor to correct that, then you start losing the resolution benefits! Get far enough away and you may as well just get a TV at that point.
Someone has never worked with a super UW monitor....
 
You can use AI to upscale to any resolution. I'm sure we will eventually see a push to higher supported resolutions once AI is able to handle most of the performance lift needed to output in that resolution.

I've seen a few 8k TVs, the difference is considerable. It'll eventually happen, just probably not for several more years at least.

I remember all the posts on gaming sites about 4k being unnecessary, "Why can't we just push for 1080p 60fps instead?", so many comments like that... and within a decade 4k is the new standard. Things can happen fast in tech adoption.
 
VR needs 8K
Correct. The Pimax Crystal Super at 3840 x 3840 per eye already achieves ~89% of the pixel count of 8K. At 130 degrees horizontal FoV, that comes out to 50 pixels-per-degree (PPD), or around 83% of human visual acuity. But we'll eventually want to increase the horizontal FoV to 180 degrees and beyond, requiring even more pixels. 16K will at some point become relevant for VR, not sure about 32K.
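A quick sanity check of that pixel-count comparison (my own arithmetic, assuming the quoted 3840 x 3840 per-eye spec):

```python
# Sanity-checking the pixel-count claim, assuming the quoted
# 3840 x 3840 per-eye spec (marketing numbers, not measured).
per_eye = 3840 * 3840
both_eyes = 2 * per_eye          # 29,491,200 px
eight_k = 7680 * 4320            # 33,177,600 px

print(f"Headset total: {both_eyes:,} px")
print(f"8K panel:      {eight_k:,} px")
print(f"Ratio:         {both_eyes / eight_k:.0%}")  # ~89%
```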
 
VR needs 8K

Correct. The Pimax Crystal Super at 3840 x 3840 per eye already achieves ~89% of the pixel count of 8K. At 130 degrees horizontal FoV, that comes out to 50 pixels-per-degree (PPD), or around 83% of human visual acuity. But we'll eventually want to increase the horizontal FoV to 180 degrees and beyond, requiring even more pixels. 16K will eventually become relevant for VR, not sure about 32K.

Yep, this is the one use case I can agree that 8K would be truly beneficial.
 
8k is held back by its power consumption; a standard 8k model with HDR draws upwards of 200-400 watts depending on how big the panel is.
 
I feel like this conversation is the same every time with a resolution increase.

4K is great right now. But when technology can get to a point where resolution is never a bottleneck, why not push it further?

I'd love to never have to worry about aliasing.

I remember seeing 1080p for the first time and thinking it couldn't get any better. 4x the resolution later and it certainly can and does look better objectively.
 
The Nintendo Switch is 540-720p on the go and 720-1080p docked, and it's about to reach PS2 numbers. I think the console market doesn't really care about anything much beyond 1080p at the moment. Nintendo has shown you don't need 4K or PS4/5 visuals to be relevant.
 
I'm normally all for having the absolute best in quality, but I genuinely don't see the point in 8K. 4K is plenty. Seems a lot of people think the same, and I don't know anyone that's ever actually mentioned 8K IRL or said they wanted an 8K TV.
 