Digital Foundry: Will 8K Gaming Ever Become Relevant?

I feel like this conversation is the same every time with a resolution increase.

4K is great right now. But when technology can get to a point where resolution is never a bottleneck, why not push it further?

I'd love to never have to worry about aliasing.

I remember seeing 1080p for the first time and thinking it couldn't get any better. 4x the resolution later and it certainly can and does look better objectively.

There's the law of diminishing returns and we've reached that point with resolution.

And frankly in the age of DLAA and other options, increasing resolution as an anti-aliasing solution is the least effective method.
 
People are expecting the same difference between 480p and 1080p when they jump from 1080p to 2160p. I prefer playing in 4K when it's possible, but the gap isn't as important as it was from SD to HD. I would even say that HDR/DV is more noticeable than the difference between FHD and 4K. I will take 1080p60 over 2160p30 anytime, and I will take 2160p@60 over 4320p@30 as well.
 
There's the law of diminishing returns and we've reached that point with resolution.

And frankly in the age of DLAA and other options, increasing resolution as an anti-aliasing solution is the least effective method.
I don't think we've remotely reached the point where an increase in resolution is meaningless. Maybe at 8k, 16k, 32k? I don't really know. But we are definitely not there yet.

People have been making your argument since 720p became a thing.
 
People are expecting the same difference between 480p and 1080p when they jump from 1080p to 2160p. I prefer playing in 4K when it's possible, but the gap isn't as important as it was from SD to HD. I would even say that HDR/DV is more noticeable than the difference between FHD and 4K. I will take 1080p60 over 2160p30 anytime, and I will take 2160p@60 over 4320p@30 as well.
But would you take 1080p60 over 2160p@60? Because I think that's the point. As long as it's not bottlenecking something else, why would you NOT want more resolution?
 
But would you take 1080p60 over 2160p@60? Because I think that's the point. As long as it's not bottlenecking something else, why would you NOT want more resolution?
Sure, I'll take the best quality available! But 8K TVs will be a tough sell, I think. Sports are still filmed in 1080p; only a few events or Olympic Games are in 4K (I think there were some Tokyo broadcasts at 8K though). Running a game engine from 1080 to 2160 isn't double the power, it's quadruple. So it would take 16x the GPU power of 1080p to get an 8K render. Currently, only a 4090 is able to get 4K60 in most recent games, so I think it will take quite a while before we get 8K60. It's diminishing returns at its best.
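To put quick numbers on those 4x and 16x claims, here's a rough sketch (a plain pixel-count comparison; it assumes render cost scales with pixel count, which is a simplification since real GPU workloads also have per-frame costs that don't scale with resolution):

```python
# Rough pixel-count comparison of common 16:9 resolutions.
# Simplification: assumes render cost scales with pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")
```

Which is where the quadruple (4K) and 16x (8K) figures come from.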
 
The difference between SD and HD was huge, but the difference between 1440p and 4K on my living room 55" TV is negligible.

That's dependent on your distance to the screen and your eyesight. I can tell the difference between 1440p and 4K on my TV no problem. 1440p doesn't scale evenly to 4K.
 
And yet you tried to paint the "mathematically correct setup" as the one and only truth when it's simply irrelevant in private households... Talk about misplaced sarcasm.

People's ignorance of how they arrive at their own decisions isn't proof that the "mathematically correct setup" is incorrect in any way.

You keep mentioning framerate which is itself a perfect example of what I mean: people KNOW instinctively that 60fps feels and looks smoother and given the choice they would buy something at 60fps over 30fps any day of the week. They often can't really put into words WHY, other than "it feels better", and they don't need a technical breakdown to back that up. But that doesn't stop us from still scientifically exploring exactly why it is the case (e.g. lower input lag, better motion fluidity etc), and perhaps at what point we stop being able to feel a difference.

No, a THX chart specifically doesn't factor into most people's decision making. I described what actually does - being able to see a difference. With their own two eyes. It just so happens that when you work backwards from that, the THX chart explains an awful lot about what point a person stops being able to tell.

THX has become a standard in home cinema certification for a reason as it has a lot of predictive power. There isn't any guesswork in how the seating is arranged or which screen size to pick or the angle of the speakers as it's already been worked out (they did the math™). If their research is telling me that most people won't be able to tell the difference then I think there is a very good chance the consumer - who their research was designed to aid - won't be able to tell either. We'll see.
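For anyone wondering where viewing-distance charts like that come from, here's a rough sketch of the usual underlying math, assuming the common 20/20-vision figure of about one arcminute of resolving power (the function name and the 65" example are just illustrative; THX's published recommendations also factor in field of view, so their numbers differ somewhat):

```python
import math

# Rough "retina distance": the viewing distance beyond which a viewer with
# ~20/20 vision (about 1 arcminute of resolving power) can no longer make
# out individual pixels, so extra resolution stops being visible.
ARCMINUTE = math.radians(1 / 60)

def retina_distance_inches(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    w, h = aspect
    screen_width = diagonal_in * w / math.hypot(w, h)  # panel width in inches
    pixel_size = screen_width / horizontal_pixels      # one pixel's width
    return pixel_size / math.tan(ARCMINUTE)

for res, px in [("4K", 3840), ("8K", 7680)]:
    d_ft = retina_distance_inches(65, px) / 12
    print(f'65" {res}: individual pixels blend together beyond ~{d_ft:.1f} ft')
```

On a 65" panel that works out to roughly four feet for 4K and two feet for 8K, i.e. you'd have to sit within about two feet before 8K can show detail that 4K can't.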

Which you absolutely can, if you are either near enough or have a big enough screen diagonal... given you have good enough quality viewing material and not just some crappily encoded stream.

You seemed to just gloss over what I said re: monitors, so I'll just copy and paste my answer:

"Greater resolutions allow you to 'get away' with sitting closer to larger monitor sizes, so to speak. At some point though you are simply sitting too close to even take in everything on screen. The reason why things like THX measurements are useful and aren't just "theoretical bullshit" is that they also take into account viewing angle. Yeah, you can make the argument that with 8K you can [sit right up against the screen.jpg]

But then you're missing out on tons of screen real estate around the periphery of the vision. And once you end up sitting far enough away from the monitor to correct that, then you start losing the resolution benefits! Get far enough away and you may as well just get a TV at that point."

People know when they are sitting too close once they begin to lose important parts of the image, e.g. an ammo counter in the corner of the HUD or an enemy creeping up on you at the far edge of the screen. Again, viewing distance recommendations take that into consideration as they don't just start from one foot away.

As if price wasn't subject to change... Remember how expensive the first 4K or OLED displays were?

Prices get lower but there is always a floor, particularly with the highest monitor sizes. A 98" Samsung, over a decade after 4K was introduced, is still going to be retailing at $8K this year.

Chinese-farmed caviar may be cheaper than the Russian stuff was decades ago, but it's still caviar and it's still €3,000/kilo.

Probably the same people that couldn't tell the difference between 30 and 60+ fps, people like my mother. That's not a standard, that's anecdotal evidence...

It was a double blind study which is about as good as you can get in research. Literally the opposite of anecdote, which is all you've offered so far this whole thread.

You can read the full thing here: https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html

I don't even know what you're trying to argue here either.

Me: most people can't really tell the difference
You: that's bullshit I can totally tell the difference on a 120" screen if I sit close enough
Me: here's research showing most people couldn't tell the difference
You: nah they probably don't know shit anyway that research doesn't count

Well... yeah. That was kind of the whole point. If people can't tell, they can't tell, and they shouldn't need any extra insight or added "theoretical bullshit" to do so.

(the theoretical bullshit is just the cherry on top to explain why after ;))

Switch out the 70"+ display diagonals we've been talking about here for 50-something and I swear this is exactly the same discussion I had back when 4K started to establish itself.
"You can't see a difference if you're 50m away bro"...

Just because a similar argument has been had before doesn't mean that you can't get a different conclusion later by tweaking some of the variables in the argument.

Well that's because humans can't hear frequencies above 20 kHz, and you need to sample the sound at a rate of at least twice that frequency according to the Nyquist theorem to be able to reproduce it perfectly in digital form.

So for playback to humans there's no advantage to having a higher sample rate. A bit simplified, but pretty much. So it's not a very relevant comparison in this case.

I've already mentioned that and the reason I bring it up is because I think there are parallels between it and 4K in the sense that we reach biological diminishing returns, making advancing beyond the standard kind of pointless. With audio frequency it's our hearing failing to notice a difference and with 4K it's our vision.

Of course there are those who claim to be able to hear a difference up to 96 kHz (and even beyond lol) but they are outliers. I think 8K will have the same fate for most use cases.
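For reference, the sampling arithmetic behind that parallel, as a small sketch (taking ~20 kHz as the upper limit of human hearing):

```python
# Nyquist: to reconstruct a signal, sample at more than twice its highest
# frequency. With hearing topping out around 20 kHz, CD audio's 44.1 kHz
# rate already covers the audible range; higher rates only add headroom
# above what anyone can hear.
HEARING_LIMIT_HZ = 20_000

for name, rate in [("CD, 44.1 kHz", 44_100),
                   ("Hi-res, 96 kHz", 96_000),
                   ("Hi-res, 192 kHz", 192_000)]:
    nyquist = rate / 2
    print(f"{name}: represents up to {nyquist / 1000:.2f} kHz "
          f"({(nyquist - HEARING_LIMIT_HZ) / 1000:.2f} kHz above audible)")
```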
 
Considering I game on consoles these days, there are far more important areas to focus on technically/graphically. And on my 8K panel, I'd generally rather have my PC desktop at 4K 120fps full chroma than 4:2:0 8K 60fps when we consider bandwidth limitations.

That said, I love the 8K upscale of 4K content, and as panels are forever increasing in size, the pixel density benefit is tangible on larger screens. I'm so used to my upscale, I notice the subtle loss in detail when I go and look at 4K TVs now. Is it a must-have? No. But I'd definitely miss it if it were taken away from me.
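A rough sketch of the bandwidth trade-off being described there, assuming 10-bit colour and ignoring blanking intervals and link overhead (real HDMI 2.1 signalling adds both, so actual requirements run higher than these figures):

```python
# Approximate uncompressed video data rates. 4:4:4 carries full chroma
# (30 bits/pixel at 10-bit); 4:2:0 carries a quarter of the chroma samples
# (~15 bits/pixel). That's why 4K120 full chroma and 8K60 4:2:0 land in the
# same ballpark on a ~48 Gbps HDMI 2.1 link, while 8K60 full chroma doesn't fit.
def gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

modes = [
    ("4K 120Hz 4:4:4 10-bit", 3840, 2160, 120, 30),
    ("8K 60Hz 4:2:0 10-bit",  7680, 4320, 60, 15),
    ("8K 60Hz 4:4:4 10-bit",  7680, 4320, 60, 30),
]
for name, w, h, fps, bpp in modes:
    print(f"{name}: ~{gbps(w, h, fps, bpp):.1f} Gbps")
```

The first two land around 30 Gbps each, while 8K60 at full chroma is roughly double that, which is why something has to give.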
 
Of course not, and there is no reason for any individual to have a computer in his home, and certainly 640KB is more memory than anyone will ever need*
 
That's dependent on your distance to the screen and your eyesight. I can tell the difference between 1440p and 4K on my TV no problem. 1440p doesn't scale evenly to 4K.
Obviously.

I'm just speaking from my own experience. I'm 50, and you could stick a 70" 16K screen in front of me and I wouldn't be able to tell the difference.
 
People's ignorance of how they arrive at their own decisions isn't proof that the "mathematically correct setup" is incorrect in any way.
Keep on clinging to a setup model that's simply not found in private households... that totally doesn't get sillier every time you do it.
THX has become a standard in home cinema certification for a reason as it has a lot of predictive power. There isn't any guesswork in how the seating is arranged or which screen size to pick or the angle of the speakers as it's already been worked out (they did the math™). If their research is telling me that most people won't be able to tell the difference then I think there is a very good chance the consumer - who their research was designed to aid - won't be able to tell either. We'll see.
And again more of the same shit as with the 4k introduction, just slightly different numbers.

Prices get lower but there is always a floor, particularly with the highest monitor sizes. A 98" Samsung, over a decade after 4K was introduced, is still going to be retailing at $8K this year.
More made-up nonsense. A Hisense 100" 4K LCD is already available below €2k...
Just depends what quality level you are looking at. Tech always trickles down.

You seemed to just gloss over what I said re: monitors, so I'll just copy and paste my answer:
You seem to take an hour editing your answers, so you didn't read my monitor answer edit, hence I'll just copy and paste my answer:
"Someone has never worked with a super UW monitor..."

People know when they are sitting too close once they begin to lose important parts of the image, e.g. an ammo counter in the corner of the HUD or an enemy creeping up on you at the far edge of the screen. Again, viewing distance recommendations take that into consideration as they don't just start from one foot away.
That UW resolutions aren't made to be 100% in focus at all times but instead to fill up the peripheral vision too, as well as simply to provide more screen space for work, seems to have slipped your attention. SUW-optimized HUDs also usually don't put anything in the corners... which you'd know if you'd ever used one...
As said before, this sounds like you know super UW at best from hearsay... I'm sitting in front of a 57" DualUHD monitor right now and have a double setup of the same monitor at work, so I'm already at 8K in the professional space, just like a lot of my colleagues and especially our designers.

Me: most people can't really tell the difference
You: that's bullshit I can totally tell the difference on a 120" screen if I sit close enough
Me: here's research showing most people couldn't tell the difference
You: nah they probably don't know shit anyway that research doesn't count
If you don't have anything to say, you'd better get into polemics. Classic.

I've already mentioned that and the reason I bring it up is because I think there are parallels between it and 4K in the sense that we reach biological diminishing returns, making advancing beyond the standard kind of pointless.
No one claimed that the jumps are the same as previously; they are much, much smaller, but still discernible, as even your own "study" says... Now put people in that test who are used to looking for differences with high-FPS material and see if the scrutiny is even comparable...

Why do I have the impression that I'm talking from the enthusiast niche (we're on an enthusiast board after all) while you are trying to discuss based on... Joe Shmoe you found at the bus stop... as if those people had ever been the ones to pave the way for such tech, or could even see what was in front of them if it punched them on the nose.
 
Imagine thinking TVs will be 4K forever. Lol.
Why not? Do you think the human eye will evolve enough in the next 500 years or so to be able to discern smaller pixels? Maybe it makes sense for larger panels? But you are still not supposed to sit very close; there's a minimum recommended distance for each panel size.

In what other way is a resolution higher than 4K beneficial? You do realize 8K or even 6K is going to be significantly more demanding for a very small gain, if any?

Higher refresh rates though? That's much more beneficial. You need at least 360Hz to reach CRT-like motion clarity on modern panels, for instance. So let's make this the standard first.
 
I think 8k is more useful on PC desktop monitors than TVs. I have a 27" 4k monitor and the difference when downsampling from 8k is still fairly visible at my seating distance. Native 8k should make for a pretty decent improvement at 32" and over.

The whole thing changes at TV distances though. I'd much rather see ultrawide become the next TV standard than 8k.
 
Sure, I'll take the best quality available! But 8K TVs will be a tough sell, I think. Sports are still filmed in 1080p; only a few events or Olympic Games are in 4K (I think there were some Tokyo broadcasts at 8K though). Running a game engine from 1080 to 2160 isn't double the power, it's quadruple. So it would take 16x the GPU power of 1080p to get an 8K render. Currently, only a 4090 is able to get 4K60 in most recent games, so I think it will take quite a while before we get 8K60. It's diminishing returns at its best.
No doubt it takes more power. I didn't think this was a conversation about trade-offs. I agree I'd rather have higher frame rates than resolution. I'm just making the point that if the hardware can handle it, I'll always take more resolution.

Also, all sports are shot with 4K cameras at this point. It might only be broadcast in 1080p, but they are shot at a higher resolution. It would be like the internal resolution being 4K and it outputting 1080p.
 
You seem to take an hour editing your answers, so you didn't read my monitor answer edit, hence I'll just copy and paste my answer:
"Someone has never worked with a super UW monitor..."


That UW resolutions aren't made to be 100% in focus at all times but instead to fill up the peripheral vision too, as well as simply to provide more screen space for work, seems to have slipped your attention. SUW-optimized HUDs also usually don't put anything in the corners... which you'd know if you'd ever used one...
As said before, this sounds like you know super UW at best from hearsay... I'm sitting in front of a 57" DualUHD monitor right now and have a double setup of the same monitor at work, so I'm already at 8K in the professional space...

Right, so now you're shifting the argument away from just 8K to advocating for ultra-wide setups to solve the problem of viewing angles. A problem which was only created because you have to sit closer to the screen to appreciate the resolution difference...

Completely delusional.

UW will never be standard considering 99% of content is authored for 16:9.

Why do I have the impression that I'm talking from the enthusiast niche (we're on an enthusiast board after all)

Because you are!

while you are trying to discuss based on... Joe Shmoe you found at the bus stop... as if those people had ever been the ones to pave the way for such tech, or could even see what was in front of them if it punched them on the nose.

Because I am!

Tech only gets momentum if Joe Shmoe can see the value in it. It's not up to tech enthusiasts to issue marching orders about what the masses should adopt (if you come back with the Steve Jobs quote I may kill myself). History is littered with tech which fell into insignificance for one reason or another.

If you show 139 people 4K versus 8K and they can't tell the difference and probably wouldn't pay $8K to see it, that's it. Game over.

You can't tell them "well actually I'm sitting here with a 57" DualUHD monitor right now and have a double setup of the same monitor at work, so I'm already at 8K in the professional space - here is why you are all wrong". They still won't buy into it.

It's the 96kHz music fan with the $5k audiophile setup all over again. A 1%-er niche which claims they can tell the difference, and who knows, maybe they even can, but which 99% of other people can't and are more than happy listening to their ripped MP3s on a pair of $30 wireless Anker earbuds.
 
Right, so now you're shifting the argument away from just 8K to advocating for ultra-wide setups to solve the problem of viewing angles. A problem which was only created because you have to sit closer to the screen to appreciate the resolution difference...

Completely delusional.
I'm not shifting anything away from 8K, nor am I advocating anything to solve anything. Stop making up discussions that have only happened in your head... Monitors only came into this because they are prime examples of why PPI matters. The whole "focus" thing was just you not having a clue how SUW works...

Despite that, however, I have to say that you are completely ignorant of the reality in companies, and you have the audacity to call others delusional...
(S)UW is already the standard in IT and it trickles down into the PC gaming space because people like feeling immersed.
As said before, you have absolutely no clue what UW actually means... you are literally standing on the sidelines yelling at a cloud.

UW will never be standard considering 99% of content is authored for 16:9.
For movies, no. Peripheral vision doesn't make much sense there, even though stuff like IMAX exists.
In games the support is actually already rather good on PC. It is more immersive after all.
However, nobody ever claimed anything like UW becoming the standard for movies/TVs, so I'm not sure why you are even addressing this?

If you show 139 people 4K versus 8K and they can't tell the difference and probably wouldn't pay $8K to see it, that's it. Game over.
It's like I'm talking to a toddler here who has never ever seen how high-tech markets work. Early adopters ring a bell? And since when does new high tech have to be sold cheaply or offer earth-shattering differences to succeed? Have you ever looked at the smartphone market?
BTW, a 65" 8K Samsung starts at <2k... So much for yet another of your fantasy numbers...
You can't tell them "well actually I'm sitting here with a 57" DualUHD monitor right now and have a double setup of the same monitor at work, so I'm already at 8K in the professional space - here is why you are all wrong". They still won't buy into it.
If they don't care about that stuff then they are not the target audience at that point.
Early adopters, the enthusiasts, pay the price until everything scales up so Joe Shmoe doesn't have to think too hard anymore. Not exactly a new principle... or hard to understand.
 
No doubt it takes more power. I didn't think this was a conversation about trade-offs. I agree I'd rather have higher frame rates than resolution. I'm just making the point that if the hardware can handle it, I'll always take more resolution.

Also, all sports are shot with 4K cameras at this point. It might only be broadcast in 1080p, but they are shot at a higher resolution. It would be like the internal resolution being 4K and it outputting 1080p.

Football here is 4k already.
You mean European football, right? Because as far as I know, American football is still in Full HD.
 
Back in 2012, AV cords (yellow, white, red) were still bundled with PS3s. HDMI and 1080p played their part in immersing gamers in HD gaming. I've tried 4K gaming and it is very close to 1080p.
 
I don't own an 8K TV

But I don't see why I wouldn't in the future

Agreed.

This is an easy yes.

Yes it will be relevant in the future.

Anyone saying no is deeply stuck in some cognitive recency mindset. It's this weird thing where someone can't see the forest for the trees; they are just stuck in the "now" and/or "yolo" of how something currently is, and can't grasp how the future may or may not be.

That's like fucking thinking in 2006 that 720p would continue to be this deeply relevant thing. Did they just forget we moved on from 480p? Why would we not move on to 1080p? Then 4K?

Then clearly 8K?

Soooo someone has to be deeply slow, stuck or something to believe 8k would not be relevant in the future.

It's inevitable that it will be.

What the fuck are the PS6 and Nextbox supposed to flex? Future GPUs, what the fuck are the selling points going to be? I'm fucking shocked this shit is even a topic, let alone how many just think things would remain the same. They never do in tech, and I've yet to see one post on here with anything even remotely compelling arguing otherwise.

Imagine thinking TVs will be 4K forever. Lol.
lol I'm saying! I'm telling you, people like this really exist. This is the yolo type thinking lol
 
I've yet to see one 8K image or video lol. My TV doesn't even have HDR.

On the contrary, I would absolutely prefer companies making CRTs once again, but lighter and thinner. 😎
 
4K is completely sufficient for large TVs. 8K would only make sense on a wall-sized screen, but not many people have the space or environment for that kind of setup…
 
It already takes an 80in+ TV just to fully resolve 4K for our eyes. 8K would require 120in or more to be fully noticeable. I don't see the public adopting a 120in TV as standard, and if they can't make out the difference then why would they even be motivated to adopt it?

I'd much rather push for ultrawide 5K-6K TVs. At least then the benefits would be obvious to the consumer.
 
I feel like this conversation is the same every time with a resolution increase.

4K is great right now. But when technology can get to a point where resolution is never a bottleneck, why not push it further?

I'd love to never have to worry about aliasing.

I remember seeing 1080p for the first time and thinking it couldn't get any better. 4x the resolution later and it certainly can and does look better objectively.
The jump from 480p to 720p was enormous. The jump from 720p to 1080p was almost as large. The jump from 1080p to 4K is... large, but not as much as the two prior jumps.

There are diminishing returns. There is a practical limit based on distance from the screen, screen size, and what our eyes can perceive, and if 4K doesn't hit that limit, it's realllllllly close to it. If we need like a 100" screen and need to stand like 4 feet from it to perceive 8K, that's just out of the realm of possibility for most people.

And in terms of games, the tech behind graphics is advancing faster than the graphics hardware to render it, hence all the DLSS, PSSR, FSR, etc. The most advanced graphics card cannot render a game with all these features at native 4K. This is after like 6 or 7 years of 4K being commonplace in displays. And 8K has 4x the pixels of 4K (16x 1080p). Next gen consoles will likely be the first time since 32-bit (more or less) that the baseline resolution is the same as the last one.
 
Last edited:
480p to 1080p - HUGE jump
1080p to 4K - peak jump
4K to 8K - diminishing returns.

4k gaming is where it's at (and ultrawide). Resolution is the next thing getting maxed out. After that it will just be further graphical tech upgrades like how RTX raised the bar.
 
The jump from 480p to 720p was enormous. The jump from 720p to 1080p was almost as large. The jump from 1080p to 4K is... large, but not as much as the two prior jumps.

There are diminishing returns. There is a practical limit based on distance from the screen, screen size, and what our eyes can perceive, and if 4K doesn't hit that limit, it's realllllllly close to it. If we need like a 100" screen and need to stand like 4 feet from it to perceive 8K, that's just out of the realm of possibility for most people.

And in terms of games, the tech behind graphics is advancing faster than the graphics hardware to render it, hence all the DLSS, PSSR, FSR, etc. The most advanced graphics card cannot render a game with all these features at native 4K. This is after like 6 or 7 years of 4K being commonplace in displays. And 8K has 4x the pixels of 4K (16x 1080p). Next gen consoles will likely be the first time since 32-bit (more or less) that the baseline resolution is the same as the last one.
I think you are arguing a point that I'm not making. I agree diminishing returns is a thing and we have gotten closer to that.

I was just saying, if there is NO bottleneck, let's say 6 years from now, then why wouldn't you want it higher? It might not make as big a difference but there will still be a difference. Just look at text on a 4K display currently. It's still pixelated to some degree.

2x the resolution in each dimension = 4 times the pixels. 4K to 8K is 4 times the pixels.
 
just try it for yourself

play a game at 4K on a 4K display, then downsample from a higher res... if it instantly looks better to you, congrats, 8K will help you.
It helps me with my 77" TV.

AA can help a lot, but it still can't do everything raw pixel count can.
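What that experiment amounts to is supersampling: render more pixels than the display has, then average them down. A minimal sketch of the downsampling step (toy arrays and a hypothetical helper name; driver features like DSR/VSR use smarter resampling filters than a plain box average, but the idea is the same):

```python
import numpy as np

# Toy supersample-style downscale: average each 2x2 block of a higher-res
# render into one output pixel. The extra samples per output pixel smooth
# out aliasing that post-process AA alone can't fully remove.
def downsample_2x(image: np.ndarray) -> np.ndarray:
    h, w = image.shape[:2]
    blocks = image[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2, -1)
    return blocks.mean(axis=(1, 3))

# stand-in for a rendered frame (random pixels, just to show the shapes)
frame_8k_like = np.random.rand(8, 8, 3)
frame_4k_like = downsample_2x(frame_8k_like)
print(frame_8k_like.shape, "->", frame_4k_like.shape)  # (8, 8, 3) -> (4, 4, 3)
```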
 
At some point it has to be, right? Or are we expecting a jump to 16K before anyone bothers with 8K?

The only thing that's certain is that it won't stop at 4K I guess.
 
just try it for yourself

play a game at 4K on a 4K display, then downsample from a higher res... if it instantly looks better to you, congrats, 8K will help you.
It helps me with my 77" TV.

AA can help a lot, but it still can't do everything raw pixel count can.

If you're downsampling 8K to 4K on a 4K display and you're seeing a benefit, you've just proved to yourself that you don't need an 8K HDTV, because what you really want is better anti-aliasing.
 
The next step should be solid 60fps and then 120fps gaming at 4K. Then we can talk about 8K. But when you already output 4K, image quality and framerate are far more important than reaching 8K at lower IQ/FPS. There is such a thing as temporal resolution. 30fps was OK at 720p, 60fps is OK at 1080p, 120fps should be the sweet spot at 4K, and 240fps should be OK at 8K.

The higher the resolution, the higher the framerate needs to be, otherwise the gap between temporal resolution in motion and in stills is too big! This is how I think many people perceive these things, and why many people owning $3,000 PCs are playing at low resolutions and >120fps.
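One way to put rough numbers on that idea is raw pixel throughput per second, a crude metric that assumes cost scales with pixels drawn per second and ignores everything else that changes between hardware generations:

```python
# Pixels drawn per second for the resolution/framerate pairings mentioned above.
pairings = [
    ("720p30",  1280, 720,  30),
    ("1080p60", 1920, 1080, 60),
    ("4K120",   3840, 2160, 120),
    ("8K240",   7680, 4320, 240),
]
base = 1280 * 720 * 30
for name, w, h, fps in pairings:
    throughput = w * h * fps
    print(f"{name}: {throughput / 1e6:,.0f} Mpixels/s ({throughput / base:.0f}x 720p30)")
```

Each pairing is several times the throughput of the one before it, which is the point about framerate needing to scale alongside resolution.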
 
Yes it will be relevant in the future.

Anyone saying no is deeply stuck in some cognitive recency mindset. It's this weird thing where someone can't see the forest for the trees; they are just stuck in the "now" and/or "yolo" of how something currently is, and can't grasp how the future may or may not be.

That's like fucking thinking in 2006 that 720p would continue to be this deeply relevant thing. Did they just forget we moved on from 480p? Why would we not move on to 1080p? Then 4K?

At a certain point you can't convince the average consumer to invest in new technology anymore when the old tech is good enough. Look what happened to hi-res audio formats like SACD and DVD-Audio that were supposed to make the CD format obsolete. All failed because mainstream consumers were more interested in convenience and portability than better sound quality. So MP3 won out.

This chart is pretty enlightening (or depressing) as well.

[chart: physical video format market share over time]


DVD had more than 50% market share just 18 months ago. I bet a huge number of the people still buying DVDs actually watch them on HDTVs (either 1080p or 4K), but they simply don't care about getting the best image quality. How are you going to make these people ever see the benefit of 8K when they couldn't be bothered to upgrade to Blu-ray, let alone UHD?

And why did DVD become such a success? Because playing DVDs was much more convenient than playing videotapes that had to be rewound and were more prone to defects.
 