DF: 4K gaming: what can PC learn from PlayStation Pro?

I don't agree with FauxK looking anything like 4K - screenshots maybe. But in motion, there's so much aliasing, it just doesn't give you the same feeling as true 4K. Furthermore, while checkerboarding might be good to have for a mid-range 4K setup, most midrange cards already handle 4K @ 30 with some settings turned down.

I'd still like it as it gives more options and when it works, dynamic resolution scaling is great, but I doubt it'll be that useful overall.

That is an area where I think temporal injection wins out over checkerboard. It's a much more stable final image and the data from it can be used to improve other effects that use temporal data, such as motion blur. I really hope Insomniac talks more about their method soon because I think it's spectacular.
 
That's the main issue with 4K on PC. 144hz 4K monitors aren't available yet and will probably be super expensive when they arrive.
So do we downgrade to 60hz or do we stick with 1080p/1440p on PC until prices go down?
For me it's the latter. And many of us game on multiple screens but triple monitor 4K just isn't happening right now on PC. So 4K just isn't there yet on PC for me, not without compromises. And upgrading something while downgrading something else is never fun.

On consoles we're getting the 60hz screen update we're used to and the 30fps performance we're used to. It's not better than on PC but from a console gamer perspective we're not getting any downgrades at least. It may sound wrong but looking at it from that perspective I think consoles are more ready for 4K than PCs right now.

The number of people using 144 Hz displays or triple-screen setups is, combined, still a very small minority. I doubt it is affecting the decisions of most PC gamers.
 
Here are some shots of CBR comparisons from two of the PC games where it's available:

R6: Siege
[image: temporalupscalenxsp3.png]

Watch_Dogs 2 (from Paragon's post)
Native 4K Rendering

1800p Checkerboard Rendering

Even in stills I don't think it holds up to native image quality and in my experience it is much more jarring in motion. I tried to run WD2 in CBR for a while but just couldn't deal with the ghosting/frayed edges on everything so went back to native.
 
More options are never a bad thing. Not everyone is running the fastest GPUs around.
That said, I don't necessarily agree with some of the conclusions drawn from this video.

On a 4K native screen, and even in games which use a soft TAA implementation, non-native rendering is usually quite noticeable to me if I'm actually playing the game.
I will typically turn down every other option available first before resorting to dropping the resolution.

Though sample-and-hold is an issue for most displays now, the perceived drop in resolution with motion on these displays does not mask the artifacts caused by rendering at lower resolutions.
If the rendering quality was perfect that may not be the case, but rendering games at lower resolutions typically introduces more artifacts into the image which manage to persist through that motion blur.
TAA solutions usually have a very noticeable drop in resolution/blurring of the image in motion, and checkerboarding artifacts really stand out.
I believe that at least part of the reason this stands out so much with TAA in motion could be caused by a loss of edge contrast, more than the drop in resolution itself.

If the type of checkerboarding used in PS4 Pro games like Horizon is anything like the technique Ubisoft used in Watch Dogs 2, or the techniques used in Resident Evil 7 or Quantum Break, it's not a good option when gaming up close on a monitor, in my opinion.
I don't agree that it's a subtle drop in image quality when the image is static, nor do I agree that it isn't noticeable in motion.

I generally prefer to use a static lower resolution than checkerboarding or dynamic resolutions.
Even if you were to compare a fixed 1800p against a dynamic resolution that starts at 1800p and can reach 4K maybe 20% of the time, I find the change in resolution to be more distracting than keeping it fixed - even though it is lower.

Here's a comparison from Watch Dogs 2 in motion that I made a while back.
The settings were targeting 4K30 on a GTX 1070, and I believe this was before they patched in SMAA T2x - not that it affects the 1800p result in any significant way. If anything, it hurts the native 4K image more.
That said, I am at least starting to come around to more recent TAA implementations which include a post-TAA sharpening filter - as long as it is optional, and preferably has a slider to control the strength (there's a rough sketch of what I mean just below).
They do still blur the image quite a bit in motion, but it really does seem like the only valid anti-aliasing option for games which use modern rendering techniques like PBR, unless you're doing something like 16x DSR - which is obviously unplayable in such games on today's GPUs.
It's always disappointing to see new games released which are still using old techniques like SMAA T2x instead of a good TAA solution. (comparison between SMAA and TAA in Alien: Isolation)
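To illustrate the kind of sharpening pass I mean, here's a rough sketch (Python/NumPy, purely illustrative and not any engine's actual shader): an unsharp mask applied to the TAA output, with a strength value that a slider could drive.

```python
import numpy as np

def post_taa_sharpen(resolved: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Unsharp-mask style sharpening applied after the TAA resolve.

    resolved: H x W x 3 float image in [0, 1] (the TAA output).
    strength: 0.0 disables the pass; higher values restore more of the
              high-frequency detail that TAA tends to blur away.
    """
    # 3x3 box blur as the low-pass reference (a real engine would do this
    # in a shader with a small kernel around each pixel).
    blurred = np.zeros_like(resolved)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            blurred += np.roll(np.roll(resolved, dy, axis=0), dx, axis=1)
    blurred /= 9.0

    # Add back a scaled copy of the detail the blur removed.
    detail = resolved - blurred
    return np.clip(resolved + strength * detail, 0.0, 1.0)
```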

And I think it says a lot that we have gone from post-process AA techniques like MLAA and FXAA that PC gamers were very down on, to modern TAA implementations that have significantly better image quality yet still have a very low performance hit.
Developers trying to get the most out of console hardware probably played a big role in the development of these techniques, instead of just brute-forcing MSAA or SSAA with ever-faster hardware on PC.
Hopefully today's checkerboarding techniques will be seen as "primitive" compared to what will be available a few years from now, just as post-process AA techniques have improved so much.

---

Something that I do feel needs mentioning is that there is a decreasing number of PC gamers who need to target a 60 FPS lock now.
Quite a lot of the DigitalFoundry PC videos seem to mention a "60 FPS lock" and it's starting to feel a bit outdated.

If you are building a high-end gaming system today, you're probably going to (or should) pair it with a variable refresh rate display - whether that's 4K or not.
Targeting an absolute lock on 60 FPS can be leaving a considerable amount of performance on the table.
I still don't like it when a game dips below 60 FPS, but with a VRR display, you can allow the minimum framerate to dip into the mid-50s, while average framerates might now be in the 70-80 FPS range.

Personally I still feel that a high refresh rate 1440p display is the sweet spot for today's hardware rather than targeting 4K60 though.
Assuming your CPU can keep up, in terms of raw pixel throughput 4K60 translates to roughly 2560x1440 at 135 FPS, or 3440x1440 at 100 FPS.
And let's say that the minimum framerate dips 25% below the average: at 4K that is only 45 FPS, while you would still be at roughly 100 FPS at 2560x1440, and 75 FPS at 3440x1440.
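For anyone who wants to check those numbers, it's just pixel throughput (pixels per frame times frames per second), which of course assumes performance scales roughly linearly with pixel count - only approximately true in practice:

```python
def equivalent_fps(ref_w, ref_h, ref_fps, w, h):
    """Frame rate at (w, h) that pushes the same pixels per second as the reference mode."""
    return ref_w * ref_h * ref_fps / (w * h)

print(equivalent_fps(3840, 2160, 60, 2560, 1440))  # 135.0
print(equivalent_fps(3840, 2160, 60, 3440, 1440))  # ~100.4

# Minimums dipping 25% below those averages:
for avg in (60, 135, 100):
    print(avg * 0.75)  # 45.0, 101.25, 75.0
```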

---

For what it's worth, it is not required to use CRU if you have an NVIDIA GPU - you can add custom resolutions directly in the NVIDIA Control Panel.
Just set the timings to manual to lock the output resolution to native before you touch the display mode to set the render resolution.

Here's my custom "1080p" ultrawide resolution for example:


The only downside to this is that NVIDIA do not allow custom resolutions and DSR to be active at the same time for some reason.
So if you use DSR a lot, perhaps CRU is the way to go.

I guess it's because I've been a PC gamer all my life, but I always end up sitting around 3ft from my TV anyway.
If I don't, it feels much smaller than a monitor. (actually, sitting up close to a 34" UW monitor still feels larger)

There are only a handful of games which do support HDR output on PC though.
There are a number of games which support HDR on console, such as Deus Ex: Mankind Divided, Forza Horizon 3, and Gears of War 4 that still lack an option for it on PC.
The reverse is also true with some games like Shadow Warrior 2 not supporting HDR on console. (is it an NVIDIA-only implementation on PC?)

It's kind of a mess, which is why I'm still waiting to pick up an HDR TV.
Hopefully they will have things sorted out by the time OLEDs supporting 120Hz HDMI 2.1 VRR are available.
That's a great post, the only thing I'd add is that it is in fact possible to use arbitrary downsampling resolutions with DSR.
(See here)

What if I told you those same techniques would give you higher framerates at 1080 or 1440?
At those resolutions, I'd personally find the artifacts really distracting in a PC gaming setting (as in on a monitor). They obviously become more easily noticeable the lower you go in terms of native resolution.
 
This suggests Scorpio could learn from Pro too.

It didn't occur to me until now that some games on Pro with checkerboard 4K might actually look better than on Scorpio with native 4K, if the extra headroom is used for better frame rates, lighting & shadows, improved IQ, etc.

This would be solved with Scorpio going to checkerboard 4K for those games, but I wonder how many games will let you choose down to that level.

Absolutely. If MS push hard on 'real 4K' then they risk wasting some performance drawing pixels that will not be immediately noticeable, and leaving themselves open to close comparisons with PS Pro. If instead they looked at reconstruction, they could likely turn up some of the GPU settings and have a more noticeable difference.
 
I don't think the question is whether cb looks as good or close to as good as native 4K.
The question is how cb compares to 1440p or 1620p (or whatever is the closest in performance) when traditionally upscaled to 4K. Those are the comparisons I want to see more of.

I recall seeing distracting artifacts in the RotTR cb implementation which made the 1440p upscaled look better in my eyes. I don't know if artifacts like that are common though, or if that was just a Tomb Raider thing.
 
I like options. If possible I would like every PC game to offer every possible rendering and display option, including checkerboard rendering, resolution scaling and dynamic resolutions and settings.

I also accept Digital Foundry's opinion that these alternative options may be good enough for many people and that they offer better performance at a minimal decrease in quality. However, there is a clear and observable difference between native 4K and checkerboard 4K, never mind native 4K and the checkerboard 1800p that many PS4 Pro games use. Whether or not the increase in performance is worth the decrease in quality is debatable and subjective. Personally I think it is, because I don't care much about resolution; framerate is king for me. I can understand image quality purists disagreeing.
 
This suggests Scorpio could learn from Pro too.

It didn't occur to me until now that some games on Pro with checkerboard 4K might actually look better than on Scorpio with native 4K, if the extra headroom is used for better frame rates, lighting & shadows, improved IQ, etc.

This would be solved with Scorpio going to checkerboard 4K for those games, but I wonder how many games will let you choose down to that level.
In theory, yes, but in practice, for whatever reason (likely the bandwidth), Pro upgrades have been about as minimal as they could be, falling short even on the resolution increase.

Granted it's only one game, but MS showing Forza 6 at 4K with 4K assets and ultra settings is already a much bigger upgrade than any Pro version to date.

Edit: That's an unlikely proposition anyway; the developer wouldn't force Scorpio to 4K at the expense of visuals. They would either maintain the same effects at a higher resolution than Pro (but not 4K), or keep a similar resolution with higher-quality effects.
 
Not sure about checkerboard rendering, but I would like PC games to have dynamic resolution scaling similar to some console games like Halo 5, in order to hold 60 FPS with lesser GPUs (if the CPU is up to the task, of course).
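Something along these lines, I imagine (a purely hypothetical sketch, not how Halo 5 or any real engine actually does it): nudge the render resolution up or down each frame based on how the last frame's GPU time compares to the ~16.7 ms budget.

```python
class DynamicResolution:
    """Hypothetical dynamic resolution controller targeting a fixed frame rate."""

    def __init__(self, native_w, native_h, target_fps=60.0,
                 min_scale=0.7, max_scale=1.0):
        self.native_w, self.native_h = native_w, native_h
        self.budget_ms = 1000.0 / target_fps      # ~16.67 ms for 60 FPS
        self.min_scale, self.max_scale = min_scale, max_scale
        self.scale = max_scale                     # start at native resolution

    def update(self, gpu_frame_ms):
        """Call once per frame with the measured GPU time of the last frame."""
        # Nudge the resolution scale toward whatever would hit the budget.
        error = self.budget_ms / max(gpu_frame_ms, 0.1)
        self.scale *= error ** 0.25                # damped so it doesn't oscillate
        self.scale = min(self.max_scale, max(self.min_scale, self.scale))

    def render_size(self):
        # Scale is applied per axis, so pixel count changes by scale**2.
        return int(self.native_w * self.scale), int(self.native_h * self.scale)


# Example: a frame came in at 20 ms (50 FPS) while targeting 60 FPS.
drs = DynamicResolution(3840, 2160)
drs.update(20.0)
print(drs.render_size())   # somewhat below 3840x2160
```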
 
Absolutely. If MS push hard on 'real 4K' then they risk wasting some performance drawing pixels that will not be immediately noticeable, and leaving themselves open to close comparisons with PS Pro. If instead they looked at reconstruction, they could likely turn up some of the GPU settings and have a more noticeable difference.

Agreed
 
Sure why not? Like the article says, the beauty of PC gaming is that the user is in control as to what they want to prioritize in terms of visual fidelity vs performance. Personally, I don't give a flying fuck about that "every game must be played on ultra" mantra that has become a running joke regarding PC gamers.
 
More options are never a bad thing. Not everyone is running the fastest GPUs around.
That said, I don't necessarily agree with some of the conclusions drawn from this video.

On a 4K native screen, and even in games which use a soft TAA implementation, non-native rendering is usually quite noticeable to me if I'm actually playing the game.
I will typically turn down every other option available first before resorting to dropping the resolution.

Yup. On monitors, you do notice non-native rendering as softer and blurrier. In motion it doesn't look good either. Resolution is just one of those things I never drop on my native screen. Though if you're someone who sits far from a TV or whatever and never gets up close to inspect things, you likely won't care.

I recall being told my 360 was broken when I started complaining about how blurry it looked on my 1080p TV back then. Turns out the game was rendering at 600p and I was used to 1080p on PC, but I was young, so I figured it was broken. Monitors spoil you when it comes to a crisp image, and a TV just isn't as good if you're one of those guys. Besides, there are already some games with checkerboarding. It's up to devs to decide whether they're going to implement it or not.
 
Not much to say but that I agree with them completely - I'd love to see checkerboarding and dynamic res come to PC. I'd rather a small detail loss to keep things perfectly fluid than a varying framerate.

I also stand by my stance that it was far too early for consoles to be squandering GPUs that are 2x and 4x more powerful on merely pushing more pixels; there's still a world of detail to fill in at plain old 1080p.
 
So basically, PC could learn by doing fake 4K, and dynamic resolution scaling?
Ehhh.

edit: since some like to quote without reading other posts: I've never said those options couldn't be in. Just that the former is a bad priority, and the latter is already done on PC.

Looking at top PS4 Pro games like Horizon or Ratchet on my 4K HDR TV, I'd happily trade 70 fps native 1440p gaming for 140 fps checkerboard 1440p gaming ;)

And when 4K 27" 144 Hz displays arrive in 2017, the trade-off in image quality is going to be even less visible.
 
That's a great post, the only thing I'd add is that it is in fact possible to use arbitrary downsampling resolutions with DSR.
(See here)

At those resolutions, I'd personally find the artifacts really distracting in a PC gaming setting (as in on a monitor). They obviously become more easily noticeable the lower you go in terms of native resolution.
Yes, but I'd gladly trade that for higher framerate.
Those artifacts would be less distracting than lower resolution or lower framerate.
 
I think what DF doesn't realize is that a lot of PC gamers don't care about resolutions above 1080p and 1440p because there are other improvements that they can focus on that they might prefer. Higher framerates and 144hz+ are way bigger improvements to the actual experience than fauxK or real 4k. When you hear that they still don't have a display above 60hz (Vanquish DF video) then it makes sense why they aren't really up to date on this stuff.
It's not about being up to date. I simply did not find one that fits my needs. At this point, the only way I would consider it, is if it were available at 40" with a 4K resolution, a non-IPS panel with 144hz variable refresh support and local dimming with lots of zones. So many monitors still seem stuck at 27", which is way too small, and 32" is also rather small for 4K. Already using a 32" monitor now and I could definitely go bigger.

Unless...there is such a product? I haven't run across anything like that but I WOULD absolutely go pick up a 40" G-sync monitor right now if it were available. Also...and this is kind of a silly thing, but I really hate the "gaming" bezel and stand design of these monitors. I'm being picky but I'm happy with my current monitor and have no desire to upgrade right now. For the time being, I much prefer playing PC games at 60hz on an OLED display as the inherent picture quality more than makes up for the standard refresh rate.

Since I work from home, I can't really just have lots of monitors lying around for testing but we have several at the DF office anyways so it's not like they aren't available to use.

Fine details are just ruined by checkerboard rendering. Just check out those power lines in the top left corner on the 1800p version - yuck.

EDIT: There are even gaps in the wire as it occupies less space. Seems like it goes sub pixel at some points.
You have to realize that there isn't a standard here - this type of rendering varies heavily per implementation. I really don't think Ubi's implementation in Watch Dogs 2 is very good, and it definitely shows artefacts. In the best case, artefacts are minimized (something like Horizon Zero Dawn) to the point where you basically can't see them unless you very closely analyze the image from close proximity.
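For anyone wondering why the results vary so much: at its simplest (a toy sketch, nothing like what any of these games actually ship), each frame only shades half the pixels in a 2x2 checker pattern and the gaps are filled from the previous frame; virtually all of the quality difference comes down to how the reused history is validated and rejected.

```python
import numpy as np

def checkerboard_reconstruct(curr_half, prev_full, frame_index):
    """Toy checkerboard reconstruction for a grayscale image in [0, 1] (no motion vectors).

    curr_half:   H x W image where only this frame's checker pixels are valid.
    prev_full:   previous reconstructed H x W frame.
    frame_index: selects which half of the 2x2 pattern was shaded this frame.
    """
    h, w = curr_half.shape
    yy, xx = np.mgrid[0:h, 0:w]
    shaded = ((xx + yy + frame_index) % 2) == 0   # pixels rendered this frame

    out = np.where(shaded, curr_half, prev_full)

    # Crude history rejection: if a reused pixel disagrees strongly with its
    # horizontal neighbours (which were shaded this frame), fall back to their
    # average. Real implementations use motion vectors, object/primitive IDs
    # and colour clamping here, and that is exactly where artefacts like
    # ghosting and frayed edges are either handled or not.
    neighbour_avg = 0.5 * (np.roll(out, 1, axis=1) + np.roll(out, -1, axis=1))
    stale = (~shaded) & (np.abs(out - neighbour_avg) > 0.2)
    return np.where(stale, neighbour_avg, out)
```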
 
Using Watch_Dogs 2 as an example, sitting at monitor distance and using a more direct control method like a mouse to move the camera around makes it really obvious that the CBR image is very ghosty and fuzzy compared to a native one.
That's a very interesting point.
Something else which occurs to me is that you don't have the option to disable motion blur in most console games either, while many (most?) PC gamers will disable it.
So perhaps the combination of camera motion blur, and using a controller which has a restricted turning speed, is doing a lot to mask the artifacts of these subsampled rendering techniques.

That's a great post, the only thing I'd add is that it is in fact possible to use arbitrary downsampling resolutions with DSR. (See here)
Thank you. It's nice to know that I'm not just posting a wall of text that everyone skips over.
Custom DSR resolutions are indeed very nice to have. It's great being able to run older games at >4x DSR resolutions if there are issues using MSAA with them.
For example: when I tried to play the original Max Payne recently, there were a lot of "seams" appearing on the level geometry when I enabled MSAA.
At 1080p, 4x DSR looked great, but a custom 16x DSR resolution looked perfect.
 
Something else which occurs to me is that you don't have the option to disable motion blur in most console games either, while many (most?) PC gamers will disable it.
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.
 
I'm a total PC fanboy, but even I hate the snobbery of those against checkerboarding based on some sort of weird elitist principle. If there's a way to get better graphics with lower hardware requirements at the expense of image quality, then why not have the option? If you don't want to use it, then turn it off.

I don't see any "snobbery" and "elitist" attitudes from anyone saying they don't need it right now.

If anyone tells you no one should have it on PC or that it is completely unnecessary - then I would agree with you, since obviously the idea has a ton of uses regardless of spec.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.
Plus you have motion blur in real life too.

I think people just have experience with bad motion blur implementations, especially bad camera motion blur that is heavy on the blur and looks even worse at low framerates. I do not see how anyone can ever dislike per-object motion blur though, even the mediocre-quality implementations.
 
But I simply can do 4K60 on any PC games I want by lowering the graphics settings, and if I feel like I want maximum graphical quality then I amp everything up and settle for 4K30.
 
But I simply can do 4K60 on any PC games I want by lowering the graphics settings, and if I feel like I want maximum graphical quality then I amp everything up and settle for 4K30.
But what if you also had the option to keep the settings turned up and get ~4K60?
 
I'd rather just pay an extra $2k to get visuals that are only better if I sit and stare long enough to pick it apart.
 
Plus you have motion blur in real life too.

I think people just have experience with bad motion blur implementations, especially bad camera motion blur that is heavy on the blur and looks even worse at low framerates. I do not see how anyone can ever dislike per-object motion blur though, even the mediocre-quality implementations.

Yeah, good object motion blur is great.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.

Because most implementations are shit. The camera motion blur when moving the view is one of the worst effects as it just makes everything look blurry.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.
More often than not I feel it just looks like Vaseline smears every time the camera pans.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.

Apart from implementations that fall outside personal standards and don't allow for customization? Some people focus more on gameplay than presentation. There's enough motion blur going on with your eyes and the monitor pixels themselves. Some may say too much, even. There are industry-standard monitor features designed to combat blur for a reason. Same reasoning with DOF, where your eyes are already doing it for you, so a game doing it is just overkill. Games aren't CGI that you sit back and passively take in. It's basically completely unnecessary and distracting in FPS, action, and anything else you're actively participating in, if you're trying to actually perceive what's going on.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.

Same

One of the absolute most important effects in games.
 
Apart from implementations that fall outside personal standards and don't allow for customization? Some people focus more on gameplay than presentation. There's enough motion blur going on with your eyes and the monitor pixels themselves. Some may say too much, even. There are industry-standard monitor features designed to combat blur for a reason. Same reasoning with DOF, where your eyes are already doing it for you, so a game doing it is just overkill. Games aren't CGI that you sit back and passively take in. It's basically completely unnecessary and distracting in FPS, action, and anything else you're actively participating in, if you're trying to actually perceive what's going on.
I feel like you and certain others are focusing entirely on camera blur. Is that right? I can understand the distaste for that (though I don't mind it when done well) but I'm mainly talking about per-object motion blur.

If you really think that is in any way comparable to sample and hold blur of an LCD monitor, well, we'll just have to disagree. Motion blur is especially incredible on low persistence displays like a CRT monitor.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.
A good motion blur implementation adds a lot to a presentation. It's hugely important to me.
I don't dislike how motion blur looks, but it typically blurs the image far more than the display does - even 60Hz sample-and-hold.
Sometimes, it feels like I may as well be closing my eyes when turning the camera quickly because there's so much motion blur.
Even when I'm just moving the mouse slowly motion blur often kicks in, instead of only appearing at high speeds.
I generally don't mind object motion blur, but most camera motion blur implementations are too aggressive in my opinion.

This is especially noticeable on an impulse-type display that has very little motion blur of its own.
What's the point of using a display with good motion resolution, if the game is just blurring the image?

Some amount of motion blur is required to eliminate stroboscopic effects though.
And maybe some of these games that I feel add too much motion blur are actually applying the required amount to eliminate them - it's just that I prefer the ability to see while I'm moving the camera instead.

I'm not against the idea of using motion blur, it's just that I don't like most implementations of it.
Unfortunately most games only give you the option to turn it on or off.
They don't let you adjust the strength, or disable camera motion blur but leave object motion blur enabled.
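Which is frustrating, because in a typical velocity-buffer implementation the camera contribution (reprojecting the depth buffer through the previous and current camera matrices) and the per-object contribution are available separately before the blur gather runs, so independent strength controls seem cheap to expose. A rough sketch of the idea (hypothetical, not any particular engine's code):

```python
import numpy as np

def scaled_blur_velocity(total_velocity, camera_velocity,
                         object_strength=1.0, camera_strength=0.25):
    """Combine camera and per-object motion vectors with independent strengths.

    total_velocity:  H x W x 2 screen-space velocity sampled from the velocity buffer.
    camera_velocity: H x W x 2 velocity caused purely by camera movement
                     (reprojection of static geometry through the camera matrices).
    Setting camera_strength to 0.0 keeps per-object blur while disabling
    camera blur entirely; the result would then drive the usual blur gather
    along the velocity direction.
    """
    object_velocity = total_velocity - camera_velocity
    return object_strength * object_velocity + camera_strength * camera_velocity
```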

Plus you have motion blur in real life too.
Well that's an argument I often see people using against having motion blur in games.
You have motion blur in real life, and then motion blur in the game being placed on top of that.
So even things that you might have been able to see clearly if they happened in real life are being blurred by the game.
With object motion blur, there's no way to track an object in motion using your eyes so that it stays in clear focus.
 
Why is it that so many people dislike motion blur? I just don't get it. It's one component that brings real-time rendering closer to pre-rendered CGI.

A good motion blur implementation adds a lot to a presentation. It's hugely important to me.

I'd rather have clarity with a high refresh than motion blur, and I think displays should try to minimize any motion blur added by the display itself. The tests on blurbusters that show the difference between ULMB on/off were eye opening to me.
 
I'll take whatever I can get. I have a 1080 right now so it's not exactly a big issue for me, but checkerboard 4k can stretch my gpu's life a bit longer and I don't think that's a bad thing. I'm no PC elitist, but I want the best possible picture while maximizing performance and the checkerboard solution seems optimal for any game I might not be able to run at 4k/60. Not to mention how good this would be for weaker GPUs
 
Well that's an argument I often see people using against having motion blur in games.
You have motion blur in real life, and then motion blur in the game being placed on top of that.
So even things that you might have been able to see clearly if they happened in real life are being blurred by the game.
With object motion blur, there's no way to track an object in motion using your eyes so that it stays in clear focus.
See, that argument is the same as "why do I need DoF when my eyes can do the job of focusing?" and I believe it is flawed because it does not consider that looking at objects on a TV screen is different from looking at objects in real life, because the screen is a flat 2D image rather than an image with actual depth. Firstly, the framerate is low, so especially at 30 FPS without motion blur I can literally see the frame skipping that happens when panning the camera or when characters move. Secondly, you don't focus on the objects on your screen the same way you focus on real objects because of how little screen space they take up... but in the context of the in-game environment that area itself can be huge, and as such not having motion blur on them looks odd.

Basically, because of the way you look at a TV, your eyes do not really apply motion blur and DoF in the same way they do in real life. HOWEVER, it does hold true for VR/3D. In VR/3D you have actual depth, so you are actually focusing on a point with depth... which is why you get natural DoF, and perhaps even motion blur, from your eyes themselves.



I can't really elucidate it well so maybe someone else who is technically aware of this can do a better job explaining, but I know that the example argument mentioned in your post is flawed.
 
I'd rather have clarity with a high refresh than motion blur, and I think displays should try to minimize any motion blur added by the display itself. The tests on blurbusters that show the difference between ULMB on/off were eye opening to me.
Yep, I love the fact that my Sony Bravia has a clarity setting to adjust the backlight timing to reduce blur. Everything from my HTPC to the Switch looks better in motion with the blur reduced.
 
The more options the better. It could be interesting, but I don't know how much use it would have, since 4K-capable GPUs are becoming more and more common and accessible, together with 4K screens.

If they add such a thing for PC it's a win-win. But if they don't... well...
 
So basically, PC could learn by doing fake 4K, and dynamic resolution scaling?
Ehhh.

edit: since some like to quote without reading other posts: I've never said those options couldn't be in. Just that the former is a bad priority, and the latter is already done on PC.

PC needs to embrace Locked FPS/Dynamic Resolution the same way they've embraced Locked Resolution/Dynamic FPS.
 
The more options the better. It could be interesting, but I don't know how much use it would have, since 4K-capable GPUs are becoming more and more common and accessible, together with 4K screens.

If they add such a thing for PC it's a win-win. But if they don't... well...
As GPUs get more powerful, games become more demanding as devs make use of said power. There will always be a need; that's why there's no such thing as future-proofing. Hell, the 290X could play games from several years ago at 4K... but not modern AAA games. The same will be true for the 1080 Ti in a few years' time. Also, just because the 1080 Ti exists doesn't mean that's where the PC gaming market is. That market is the 1060/580 and below.


PC needs to embrace Locked FPS/Dynamic Resolution the same way they've embraced Locked Resolution/Dynamic FPS.
This too, set everything in Forza to dynamic except refresh rate.
 
Siege was one of the first implementations of checkerboard rendering. Watch Dogs 2 also has it (also by Ubisoft).



I agree. You can use games like R6 Siege and Watch_Dogs 2 as a good comparison, since you can switch between the modes on the fly and see the difference very easily. I don't think fauxK is a close match for native resolution at all. A lot of the comparison shots people post miss the point by doing it on a static image, and then they are sitting at TV distance and using a gamepad to slowly pan the camera.

Using Watch_Dogs 2 as an example, sitting at monitor distance and using a more direct control method like a mouse to move the camera around makes it really obvious that the CBR image is very ghosty and fuzzy compared to a native one.
Hold on, sorry for the double post, but I have to ask. Edit: sending a PM about your username. Don't want to derail.
 
I'd rather have clarity with a high refresh than motion blur, and I think displays should try to minimize any motion blur added by the display itself. The tests on blurbusters that show the difference between ULMB on/off were eye opening to me.

My take on this is that blur inherent to the source is fine as long as it's implemented well. Displays should blur as little as possible.
 
If checkerboarding is that good (not saying it isn't), why do devs go for native 4K when they could do checkerboarding and add bells and whistles instead?
 