DF: 4K gaming: what can PC learn from PlayStation Pro?

That's where you're wrong.

A lot of people use their PCs on a TV, and 4K TVs are becoming common. I'm only running a GTX 980 Ti, which is enough for 4K in some games but falls short in many others.

What you're not getting is that checkerboard 4K looks virtually identical to "real" 4K when played on a TV at a normal or even close viewing distance. You need to be very close to the screen to appreciate any difference and the difference is often very subtle.

...but it saves a TON of performance and looks dramatically better than standard 1440p. Being able to do this on my TV would be an amazing thing that would allow me to enjoy better image quality on more games.

Dynamic resolution scaling would also be a great thing, since a lot of games can hit 60fps at 4K maybe 75% of the time, but the drops are just too annoying, forcing me to drop resolution. With resolution scaling, it would be possible to enjoy higher resolutions whenever possible, with dips in image quality rather than dips in performance, which are FAR more noticeable and distracting.
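To sketch the idea (purely illustrative; the thresholds, step sizes, and function name are invented for this post, not from any shipping game):

```python
# A minimal sketch of a dynamic resolution controller, assuming a 60 fps
# target: if the last frame got close to missing the budget, shrink the
# render scale; if there is headroom, creep back toward native.

TARGET_MS = 1000.0 / 60.0   # ~16.7 ms frame budget

def update_render_scale(scale, frame_ms, lo=0.5, hi=1.0):
    """Return the render scale (fraction of native resolution) for the next frame."""
    if frame_ms > TARGET_MS * 0.95:      # dangerously close to missing the budget
        scale -= 0.05                    # drop resolution to protect the framerate
    elif frame_ms < TARGET_MS * 0.80:    # comfortable headroom
        scale += 0.02                    # recover toward native resolution
    return max(lo, min(hi, scale))

# Example: a 20 ms frame pushes the scale down; a 10 ms frame lets it recover.
```

The point is exactly what's described above: trade a brief, subtle dip in image quality for a steady framerate, instead of a visible stutter.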

I don't understand why anyone would be against these options.


I'm already running a 4K OLED and, I'm telling you, the difference between checkerboard and "real" 4K is very minimal unless you're sitting a foot away from the TV. I don't think people get just how convincing it can be.


Because you're conversing with people who are most likely either an immature platform warrior or a close-minded fool. Maybe both.

Let's be real here: it's a great option to have, and it lowers the barrier of entry to 4K-quality gaming. But it's associated with weak console hardware, and it compromises the elite experience that PC gaming should be.
 
So basically, PC could learn by doing fake 4K, and dynamic resolution scaling?
Ehhh.

It's not only about 4K. The same technique can be used for lower-resolution displays too. Checkerboard rendering is an additional performance saver, so it could be useful on laptops with slow GPUs, for example.
 
Never fails: people argue for fewer options simply because they spent more money on their GPUs.

If checkerboarding enters the PC space in a big way, what's stopping you from checkerboarding 8K for downsampling if your GPU can already handle 4K?

DF's comparisons have shown that checkerboarding looks close enough to regular 4K that it's damned near indistinguishable unless you freeze-frame and zoom in. So it really looks like native 4K is an unoptimized way of rendering the image: it wastes resources for no immediately distinguishable benefit.
I can currently handle 4K30 in a lot of games, but with checkerboarding I'd be able to do 4K60 with a minuscule drop in image quality and an enormous increase in performance and playability.
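On the 8K-for-downsampling idea, some back-of-the-envelope arithmetic (shading cost only; real checkerboarding adds reconstruction overhead on top): checkerboarding shades roughly half the target resolution's pixels per frame, which for 8K is still about twice the work of native 4K, so "already handles 4K" would need a healthy margin:

```python
# Pixels shaded per frame, ignoring reconstruction overhead.
native_4k = 3840 * 2160          # 8,294,400 pixels
cb_4k     = native_4k // 2       # ~half the shading work of native 4K
cb_8k     = (7680 * 4320) // 2   # 16,588,800 pixels per frame

assert cb_8k == 2 * native_4k    # checkerboard 8K ~ twice native 4K's shading
```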
 
Because you're conversing with people who are most likely either an immature platform warrior or a close-minded fool. Maybe both.

Let's be real here: it's a great option to have, and it lowers the barrier of entry to 4K-quality gaming. But it's associated with weak console hardware, and it compromises the elite experience that PC gaming should be.

What makes PC gaming great isn't just about top-end performance, though; most PC gamers don't have top-end rigs anyway. It's the options and flexibility that really make PC gaming better.
 
This is really disappointing to hear from DF:

As I'm sure somebody must already have mentioned, enabling GPU scaling lets you run at whatever resolution you want. I run games at 1080p, 12XXp, 1440p, 1620p, 1800p and 2160p, all depending on the game and target framerate.

It's super easy to set up on AMD cards at least, and I reckon it's the same for Nvidia.

If the scaling algorithm is bad then I can see why people don't use GPU scaling.
 
GPU scaling is garbage compared to checkerboarding. Someone already posted a comparison between native, checkerboarding and upscaling. And yes, PC games can and should definitely start looking into various methods of doing 4K without the need for a high-end GPU. PC gamers act like there is no such thing, but Nvidia themselves already support multi-res rendering for VR to improve performance, and Shadow Warrior 2 already supports it as a standard feature, not VR-only.
 
I mean, sure, the more options the merrier. Faux 4K looks very damn good. If it can be combined with supersampling to reproduce the crispness of 4K on sub-4K displays without the heavy GPU load that usually comes with such clarity, everyone wins.
 
Checkerboarding does look pretty good, though after playing a few games with it I think the difference is bigger than what DF is claiming. I'll just continue to brute-force actual 4K with absurdly powerful GPUs. It'd be nice if the feature eventually came to PC, though, since options are always good.
 
Lessons like "Don't let people actually use their GPU power unless the developer specifically implements support for their hardware model".

:P

That said, an option to perform some type of modern subsampling ("checkerboarding" is a bit too restrictive a term) in very demanding games is a good thing.
 
Lessons like "Don't let people actually use their GPU power unless the developer specifically implements support for their hardware model".

:P

That said, an option to perform some type of modern subsampling ("checkerboarding" is a bit too restrictive a term) in very demanding games is a good thing.

What was it that Quantum Break did? Because that's how not to do it.
 
Didn't a recent PC game have checkerboarding? Is this a software feature added by devs?
Multiple PC games have some form of subsampling with temporal reuse. Yes, this is a software feature added by devs.

What was it that Quantum Break did? Because that's how not to do it.
It's what every game that uses "checkerboarding" does. It's just a broader term.

I wouldn't use "checkerboarding" in this context, since it might well not be the be-all end-all of spatial subsampling and temporal reuse.
It's also a bit of an unclear term; people use it both for 1x1 "checkers" (generally implemented using MSAA patterns) and for 2x2 "checkers".
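To make the 2x2 usage concrete, here's a toy sketch (purely illustrative; real implementations reconstruct from jittered samples and motion vectors, not whole quads) of which pixels get shaded each frame:

```python
# Toy 2x2 checkerboard: each frame shades half of the 2x2 pixel quads in
# an alternating pattern; the missing quads are filled from the previous
# frame (temporal reuse).

def shaded_this_frame(x, y, frame):
    """True if the 2x2 quad containing pixel (x, y) is rendered this frame."""
    quad_x, quad_y = x // 2, y // 2
    return (quad_x + quad_y + frame) % 2 == 0

# Over any two consecutive frames every pixel is covered exactly once,
# so a static scene converges to the full-resolution image.
w, h = 8, 8
covered = {(x, y) for x in range(w) for y in range(h)
           if shaded_this_frame(x, y, 0) or shaded_this_frame(x, y, 1)}
assert len(covered) == w * h
```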
 
Multiple PC games have some form of subsampling with temporal reuse. Yes, this is a software feature added by devs.

It's what every game that uses "checkerboarding" does. It's just a broader term.

I wouldn't use "checkerboarding" in this context, since it might well not be the be-all end-all of spatial subsampling and temporal reuse.
It's also a bit of an unclear term; people use it both for 1x1 "checkers" (generally implemented using MSAA patterns) and for 2x2 "checkers".
I just remember Quantum Break having very obvious ghosting, loss of IQ, and artifacts on PC that I don't see on PS4 Pro checkerboard games.
 
I thought PC gamers liked options
We do.

If we are talking about a very-high-budget "AAA" game where implementing competent spatial subsampling with temporal reuse is not a big drain on resources, then I'm all for including such an option.

However, if we are talking about an "A", "B" or indie game, which is the vast majority of what I play, then we do need to take into account that these types of rendering features are non-trivial to implement and integrate.
(That's why tons of games in these categories don't even support them on PS4)
In such cases, call me an elitist, but I'd rather that the development time (which is always a limited quantity) be spent on forward-looking high-end features, like better AO or LOD or shadow filtering, than these kind of techniques.

I just remember Quantum Break having very obvious ghosting, loss of IQ, and artifacts on PC that I don't see on PS4 Pro checkerboard games.
Well, yes, different implementations of the same category of feature can be more or less successful. There are also PC games with better-received implementations of the same concept.
 
I'm a total PC fanboy, but even I hate the snobbery of those against checkerboarding based on some sort of weird elitist principle. If there's a way to get better graphics with lower hardware requirements at the expense of some image quality, then why not have the option? If you don't want to use it, then turn it off.

Yep, the best part of PC gaming is options and control over your experience.
 
I dunno, it seems to be an article about the Pro's best-case scenario, which is Horizon, making a point that might not hold up when most Pro games don't even reach 1800p native.

I mean, I don't disagree that reconstruction techniques make for an impressive image, but at least what I saw from other retina devices (Surface Pro and Surface Book) was the opposite: sub-native-resolution content is extremely noticeable up close, and it usually looks worse than I'd expect for a screen of that size.
 
Shit, not only do we have console-war BS, but now we have Tru4K warriors trying to slay any CBR talk simply because it isn't "true 4K", when the actual benefits in performance are real. It is up to the developer to implement the solution properly. CBR or dynamic scaling isn't a problem in itself; it's the implementation by developers that gets people's panties in a wad.
 
We do.

If we are talking about a very-high-budget "AAA" game where implementing competent spatial subsampling with temporal reuse is not a big drain on resources, then I'm all for including such an option.

However, if we are talking about an "A", "B" or indie game, which is the vast majority of what I play, then we do need to take into account that these types of rendering features are non-trivial to implement and integrate.
(That's why tons of games in these categories don't even support them on PS4)
In such cases, call me an elitist, but I'd rather that the development time (which is always a limited quantity) be spent on forward-looking high-end features, like better AO or LOD or shadow filtering, than these kind of techniques.

Well, yes, different implementations of the same category of feature can be more or less successful. There are also PC games with better-received implementations of the same concept.
So, this is starting to sound more like a feature engine developers should implement and standardize than something that each smaller dev should be trying to shoehorn onto someone else's engine?

So, for instance, Epic could add their own flavor of the technique to UE4 instead of various devs trying to force UE4 to play well with their own solutions?

I realize this doesn't apply to Remedy and their Northlight Engine, but it got me thinking.


Also, doesn't Rainbow Six: Siege use this technique?
 
Just curious though: who needs to step in for checkerboard rendering on PC?
GPU makers? OS makers? Game developers?
(sorry, I haven't read the article)
 
More options are never a bad thing. Not everyone is running the fastest GPUs around.
That said, I don't necessarily agree with some of the conclusions drawn from this video.

On a 4K native screen, and even in games which use a soft TAA implementation, non-native rendering is usually quite noticeable to me if I'm actually playing the game.
I will typically turn down every other option available first before resorting to dropping the resolution.

Though sample-and-hold is an issue for most displays now, the perceived drop in resolution with motion on these displays does not mask the artifacts caused by rendering at lower resolutions.
If the rendering quality was perfect that may not be the case, but rendering games at lower resolutions typically introduces more artifacts into the image which manage to persist through that motion blur.
TAA solutions usually have a very noticeable drop in resolution/blurring of the image in motion, and checkerboarding artifacts really stand out.
I believe that at least part of the reason this stands out so much with TAA in motion could be caused by a loss of edge contrast, more than the drop in resolution itself.

If the type of checkerboarding used in PS4 Pro games like Horizon is anything like the technique Ubisoft have used in Watch Dogs 2, or other techniques like Resident Evil 7 or Quantum Break used, it's not a good option when gaming up close on a monitor in my opinion.
I don't agree that it's a subtle drop in image quality when the image is static, nor do I agree that it isn't noticeable in motion.

I generally prefer to use a static lower resolution than checkerboarding or dynamic resolutions.
Even if you were to compare a static 1800p to 1800p that can reach 4K maybe 20% of the time, I find the dynamic change in resolution to be more distracting than keeping it fixed - even though it is lower.

Here's a comparison from Watch Dogs 2 in motion that I made a while back.
The settings were targeting 4K30 on a GTX 1070, and I believe this was before they patched in SMAA T2x - not that it affects the 1800p result in any significant way. If anything, it hurts the native 4K image more.
That said, I am at least starting to come around to more recent TAA implementations which include a post-TAA sharpening filter - as long as it is optional, and preferably has a slider to control the strength.
They do still blur the image quite a bit in motion, but it really does seem like the only valid anti-aliasing option for games which use modern rendering techniques like PBR, unless you're doing something like 16x DSR - which is obviously unplayable in such games on today's GPUs.
It's always disappointing to see new games released which are still using old techniques like SMAA T2x instead of a good TAA solution. (comparison between SMAA and TAA in Alien: Isolation)

And I think it says a lot that we have gone from post-process AA techniques like MLAA and FXAA that PC gamers were very down on, to modern TAA implementations that have significantly better image quality yet still have a very low performance hit.
Developers trying to get the most out of console hardware probably played a big role in the development of these techniques, instead of just brute-forcing MSAA or SSAA with ever-faster hardware on PC.
Hopefully today's checkerboarding techniques will be seen as "primitive" compared to what will be available a few years from now, just as post-process AA techniques have improved so much.

---

Something that I do feel needs mentioning is that there is a decreasing number of PC gamers who need to target a 60 FPS lock now.
Quite a lot of the DigitalFoundry PC videos seem to mention a "60 FPS lock" and it's starting to feel a bit outdated.

If you are building a high-end gaming system today, you're probably going to (or should) pair it with a variable refresh rate display - whether that's 4K or not.
Targeting an absolute lock on 60 FPS can leave a considerable amount of performance on the table.
I still don't like it when a game dips below 60 FPS, but with a VRR display, you can allow the minimum framerate to dip into the mid-50s, while average framerates might now be in the 70-80 FPS range.

Personally I still feel that a high refresh rate 1440p display is the sweet spot for today's hardware rather than targeting 4K60 though.
Assuming your CPU can keep up, 4K60 translates to 2560x1440 at 135 FPS, or 3440x1440 at 100 FPS.
And let's say that the minimum framerate dips 25% below the average. At 4K that is only 45 FPS, while you are still at around 100 FPS at 2560x1440, and 75 FPS at 3440x1440.
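For what it's worth, the throughput arithmetic above checks out (pixels per second = width x height x fps):

```python
# Equal pixel-throughput framerates for the resolutions mentioned above.
pps_4k60 = 3840 * 2160 * 60                 # 497,664,000 pixels/second
print(pps_4k60 // (2560 * 1440))            # 135 fps at 2560x1440
print(round(pps_4k60 / (3440 * 1440)))      # 100 fps at 3440x1440
```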

---

For what it's worth, it is not required to use CRU if you have an NVIDIA GPU - you can add custom resolutions directly in the NVIDIA Control Panel.
Just set the timings to manual to lock the output resolution to native before you touch the display mode to set the render resolution.

Here's my custom "1080p" ultrawide resolution for example:
[screenshot: custom resolution settings]

The only downside to this is that NVIDIA do not allow custom resolutions and DSR to be active at the same time for some reason.
So if you use DSR a lot, perhaps CRU is the way to go.

What you're not getting is that checkerboard 4K looks virtually identical to "real" 4K when played on a TV at a normal or even close viewing distance. You need to be very close to the screen to appreciate any difference and the difference is often very subtle.
I guess it's because I've been a PC gamer all my life, but I always end up sitting around 3ft from my TV anyway.
If I don't, it feels much smaller than a monitor. (actually, sitting up close to a 34" UW monitor still feels larger)

Resident Evil 7 already supports HDR on PC.
There are only a handful of games which do support HDR output on PC though.
There are a number of games which support HDR on console, such as Deus Ex: Mankind Divided, Forza Horizon 3, and Gears of War 4 that still lack an option for it on PC.
The reverse is also true with some games like Shadow Warrior 2 not supporting HDR on console. (is it an NVIDIA-only implementation on PC?)

It's kind of a mess, which is why I'm still waiting to pick up an HDR TV.
Hopefully they will have things sorted out by the time OLEDs supporting 120Hz HDMI 2.1 VRR are available.
 
Watch Dogs 2 had some kind of "temporal reconstruction" option and I really liked it. It had better picture quality than the technique Quantum Break used.

I would like to see more games with that option; it gave me 30-40 fps more with minimal impact on image quality.
 
I have a GTX 1080, and while it is capable of 4K/30fps, it's not capable of 4K/60fps. If I had the option of checkerboarding or dynamic resolution scaling, then I might be able to play a lot more games at a high image quality and framerate at the same time.

This is what DF means when they say what PC gaming can learn from consoles. The advantages are NOT just limited to old GPUs because while the older GPUs get the option to play at a higher image quality at 30FPS, the modern more powerful GPUs get the option to do the same at 60FPS. So people dismissing this with comments such as "only helps old GPUs" or comments such as "oh anyone who has 4K will have a powerful GPU" (which is totally false) need to understand this.

As for checkerboarding itself, if you think it's easily noticeable or looks like crap, then you haven't come across good implementations.

It absolutely can do 4K60. I play nothing but that with that card. Not everything has to be cranked up to max settings, you know.

Fake 4K and real 4K have very noticeable differences. I'd rather they not start adding that kind of stuff to PC gaming. If your card can't do 4K gaming, well, you have a console as a cheaper option. As someone who games on a KS8000 with their PC, I'd rather they concentrate on a useful feature, such as more HDR support.
 
It absolutely can do 4K60. I play nothing but that with that card. Not everything has to be cranked up to max settings, you know.

Fake 4K and real 4K have very noticeable differences. I'd rather they not start adding that kind of stuff to PC gaming. If your card can't do 4K gaming, well, you have a console as a cheaper option. As someone who games on a KS8000 with their PC, I'd rather they concentrate on a useful feature, such as more HDR support.
To the bolded: not really, if it's done well. We aren't talking about just upscaling to 4K; if we could get the same quality we've seen on PS4 Pro, it's almost indistinguishable from native 4K but with some serious performance increases.
 
Because you're conversing with people who are most likely either an immature platform warrior or a close-minded fool. Maybe both.

Let's be real here: it's a great option to have, and it lowers the barrier of entry to 4K-quality gaming. But it's associated with weak console hardware, and it compromises the elite experience that PC gaming should be.

...and if the bar is lowered, it lets more people into the 4K ballpark, ready to be served and tantalized into making now relatively minor upgrades to get to true 4K within a year or two (or immediately if they catch the fever). They'll be someone who has the monitor or TV and an almost-there rig... then 4K-readiness is expedited towards the mainstream, instead of being an elite niche that gets underserved for longer than it has to be.

You'd think that would result in the opposite of the lazy devs and lack of true-4K incentive that some seem to be worried about in here, and that an intermediary resolution between the high and medium levels they've already produced wouldn't make them take the true 4K option away, or weigh on devs as a significant amount of added work that guts or influences the game's development and design in some other way.

I can understand their thinking a bit... I have a TV with great HDR, but many people just bought whatever 4K TV, with limited or no HDR, due to confusion or just price vs. importance. Now HDR sometimes feels like a maybe and an afterthought on some games with PS4 Pro support, when that's what I kinda like the most. It would be easy to go "stupid low-nit, non-10-bit plebs, holding me back", but that's on devs, and 4K + HDR is still more common than it would otherwise be, just because regular folk are starting to adopt any 4K TVs at all, whatever make they may be.
 
Multiple PC games have some form of subsampling with temporal reuse. Yes, this is a software feature added by devs.
Oh, OK. So it's nothing to do with what PC can learn from PS4 Pro, and everything to do with devs including the feature. Some games don't even have Pro support, or have very little effort put into it. That goes back to the devs again.
 
I think what DF doesn't realize is that a lot of PC gamers don't care about resolutions above 1080p and 1440p, because there are other improvements they might prefer to focus on. Higher framerates and 144Hz+ are way bigger improvements to the actual experience than fauxK or real 4K. When you hear that they still don't have a display above 60Hz (Vanquish DF video), it makes sense why they aren't really up to date on this stuff.
 
So basically, PC could learn by doing fake 4K, and dynamic resolution scaling?
Ehhh.

The performance-to-quality ratio of checkerboard-type rendering is WAY higher than native 4K. "Learning from consoles" means taking a look at the more efficient rendering techniques that console games sometimes use, and adding them, or something similar, as an option in your PC games.
 
Unless you're up close sniffing the pixels on screen, the difference between CBR and native 4K really is minimal at best. Why anyone would be against CBR is baffling, considering PC gamers love options. But then, if one has bought the latest and greatest GPU and the CBR image looks the same from a normal distance, no wonder one's nose would be out of joint.

HDR is my most wanted feature; that's what makes scenery pop and wow.
 
So are these rendering techniques on GPU makers or developers? If its on devs, then perhaps the article should be aimed at pc devs?
 
I think what DF doesn't realize is that a lot of PC gamers don't care about resolutions above 1080p and 1440p, because there are other improvements they might prefer to focus on. Higher framerates and 144Hz+ are way bigger improvements to the actual experience than fauxK or real 4K. When you hear that they still don't have a display above 60Hz (Vanquish DF video), it makes sense why they aren't really up to date on this stuff.
What if I told you those same techniques would give you higher framerates at 1080 or 1440?
 
More options are always welcome; that's one of the biggest advantages of PC gaming.

I haven't played a whole lot of 4K games, so I'm no expert, but Horizon looked awesome with checkerboarding.


So basically, PC could learn by doing fake 4K, and dynamic resolution scaling?
Ehhh.

Because everyone who plays on PC obviously has a high-end machine capable of running all modern games at 4K/60fps.
I mean, why would you even play on PC if you don't have at least a 1080 Ti... as if having only one of those wasn't embarrassing enough.
 
Is bandwidth a concern with 4K gaming? With the PC market being mostly downloads, and consoles heading that way, there's going to be a lot of pressure on data caps.
 
I don't agree with fauxK looking anything like 4K - in screenshots, maybe, but in motion there's so much aliasing that it just doesn't give you the same feeling as true 4K. Furthermore, while checkerboarding might be good to have for a mid-range 4K setup, most mid-range cards already handle 4K @ 30 with some settings turned down.

I'd still like it, as it gives more options, and when it works, dynamic resolution scaling is great, but I doubt it'll be that useful overall.
 
What you're not getting is that checkerboard 4K looks virtually identical to "real" 4K when played on a TV at a normal or even close viewing distance. You need to be very close to the screen to appreciate any difference and the difference is often very subtle.

I'm already running a 4K OLED and, I'm telling you, the difference between checkerboard and "real" 4K is very minimal unless you're sitting a foot away from the TV. I don't think people get just how convincing it can be.

Someone should save this and quote it every time the console warriors start fighting about 4K.

I hope MS goes for "sparse rendering" and uses most of Scorpio's extra GPU power to improve things like AA, AF, AO and shadows.
 
Are you arguing or agreeing with me? Should I rage out or high five you?


A bit of both, I guess, lol. I mean, you are right: it does lower the entry level for 4K, which is really great. But why is it such a bad thing to be associated with weaker console hardware, just because it would use the same technique?

People wouldn't suddenly start calling PCs worse than consoles or anything. People really should get over the elite-PC mindset; it's just a more powerful system. There's nothing wrong with owning a console, a PC, or, best-case scenario, both.
 

To say it in layman's terms:

Why not get 90% of the visual quality of 4K (like the PS4 Pro) and have the extra resources spent on AI, lighting, etc.?

It would seem PC could really explore this, right? Or is PC so laser-focused on 100% 4K pixel-pushing that it's missing HUGE other opportunities to explore and push the boundaries?

I mean, look at Horizon in 4K on a PS4 Pro and tell me it's not one of the best-looking games you have ever seen. If PC did checkerboarding, it could explore other aspects of gaming that IMHO actually should be explored, specifically AI.
 
Also, doesn't Rainbow Six: Siege use this technique?

Siege was one of the first implementations of checkerboard rendering. Watch Dogs 2 also has it (also by Ubisoft).

I don't agree with fauxK looking anything like 4K - in screenshots, maybe, but in motion there's so much aliasing that it just doesn't give you the same feeling as true 4K. Furthermore, while checkerboarding might be good to have for a mid-range 4K setup, most mid-range cards already handle 4K @ 30 with some settings turned down.

I'd still like it, as it gives more options, and when it works, dynamic resolution scaling is great, but I doubt it'll be that useful overall.

I agree. You can use games like R6 Siege and Watch_Dogs 2 as a good comparison, since you can switch between the modes on the fly and see the difference very easily. I don't think fauxK is a close match for native resolution at all. A lot of the comparison shots people post miss the point by using a static image, and then they're sitting at TV distance and using a gamepad to slowly pan the camera.

Using Watch_Dogs 2 as an example, sitting at monitor distance and using a more direct control method like a mouse to move the camera around makes it really obvious that the CBR image is very ghosty and fuzzy compared to a native one.
 
Checkerboarding is whatever; as someone who's done it for years, I think upscaling works just fine. However, dynamic resolution (or resolution sliders in general, especially ones that let you go above 100%) should definitely be more of a standard on PC. I don't think it'd be too complicated to implement, and I don't mind a momentary resolution drop if it means keeping a consistent 60fps. It doesn't hurt to have options, and PC is all about that.
 
If a PC gamer doesn't have a 4K-capable GPU, they most likely don't have a 4K monitor to take advantage of fake 4K at all.
That's the main issue with 4K on PC. 144hz 4K monitors aren't available yet and will probably be super expensive when they arrive.
So do we downgrade to 60hz or do we stick with 1080p/1440p on PC until prices go down?
For me it's the latter. And many of us game on multiple screens but triple monitor 4K just isn't happening right now on PC. So 4K just isn't there yet on PC for me, not without compromises. And upgrading something while downgrading something else is never fun.

On consoles we're getting the 60hz screen update we're used to and the 30fps performance we're used to. It's not better than on PC but from a console gamer perspective we're not getting any downgrades at least. It may sound wrong but looking at it from that perspective I think consoles are more ready for 4K than PCs right now.
 