Also, with the HDMI 2.1 standard and VRR coming up, you'll see the benefits of having different rendering solutions/techniques like this even more, where every single extra frame per second makes a bigger difference to the overall user experience. It'll be fantastic having the option of utilizing these solutions on VRR OLEDs in late 2018-9. *fingers crossed*
Exactly, I loved Richard's really strong support for bringing CBR, dynamic resolution scaling, dynamic resolution adjustments, etc... to PC as a standard. More options for end users is a good thing, especially when it gives our graphics cards longer legs before the eventual upgrade.
Man, when HDMI 2.1 becomes the norm on HDTVs and monitors is when I'll replace my current displays. Big changes like VRR are going to be standardized across display devices.
I think so, here's the info direct from the organization's official page:
Higher Video Resolutions support a range of higher resolutions and faster refresh rates including 8K60Hz and 4K120Hz for immersive viewing and smooth fast-action detail.
Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.
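To make the "displays the image at the moment it is rendered" part concrete, here's a rough sketch of my own (not from the spec, and the render times are made up) comparing when frames actually hit the screen on a fixed 60 Hz display versus a VRR one:

```python
# Rough illustration (not from the HDMI spec): when frames become visible
# on a fixed 60 Hz display vs. a VRR display, for the same hypothetical
# render times. Simplified model: with vsync, a finished frame waits for
# the next refresh boundary; with VRR, the display refreshes when it's ready.
import math

render_times_ms = [14.0, 18.0, 15.0, 21.0, 16.0]  # made-up per-frame GPU times
REFRESH_MS = 1000 / 60                            # fixed refresh interval (~16.7 ms)

t_fixed = t_vrr = 0.0
for i, rt in enumerate(render_times_ms):
    ready_fixed = t_fixed + rt
    shown_fixed = math.ceil(ready_fixed / REFRESH_MS) * REFRESH_MS  # wait for vsync
    shown_vrr = t_vrr + rt                                          # shown when rendered
    print(f"frame {i}: render {rt:4.1f} ms | fixed 60 Hz @ {shown_fixed:5.1f} ms "
          f"| VRR @ {shown_vrr:5.1f} ms")
    t_fixed, t_vrr = shown_fixed, shown_vrr
```

In this toy model the 18 ms and 21 ms frames miss a refresh on the fixed display and effectively cost two intervals (a visible hitch), while the VRR display just shows them as soon as they're done.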
I'm a total PC fanboy, but even I hate the snobbery of those against checker-boarding based on some sort of weird elitist principle. If there's a way to get better graphics with lower hardware requirements at the expense of image quality, then why not have the option? If you don't want to use it, then turn it off.
As someone who never has a top-of-the-line PC and hovers between mid and low end depending on my current upgrade status, I would greatly appreciate as many of these kinds of options as possible to help me run things well without having to drop everything to low.
Most people are like you; the vast majority of PC gamers do not have a high-end gaming rig. It's why I've been so vocal about Nvidia and the way they've handled G-Sync and blocked adaptive sync (even though they use it on laptops). Adaptive sync is another thing that I think benefits console and lower-end PC users potentially more than anyone else.
All of these things should be shared and utilized across platforms. I can't wait to see Foveated rendering for VR either, as it will really help proliferate it and increase the potential user-base due to reduced requirements. I really hope we see all of these things pushed across as many platforms as possible. Happiness lies in options.
The number of people using 144 Hz displays or triple-screen setups is, combined, still a very small minority. I doubt it is affecting the decisions of most PC gamers.
Motion blur hate comes from fast-paced console games using it to cover up a too-low framerate, which results in a blurry picture that still stutters and generally looks like ass in motion.
The Pro's issue is with the CPU, not the GPU; no amount of checkerboarding can fix that. Damned near all PCs have better CPUs.
PS4 Pro is CPU bottlenecked, and the Scorpio will be as well.
I think 4K will depend on developer priorities, as always. When devs route all additional power to details then you can never do 4K because the games keep eating resources that you'd need for 4K. Same for 60 FPS on consoles, no console can do 60 FPS when the devs scale their per-frame detail up to the console's power until it gets choked down to 30.
So either play older games that your hardware can already handle at 4K or wait for 4K to become such an important selling point that devs hold back on the details to achieve it.
As I'm sure somebody must already have mentioned, enabling GPU scaling allows whatever resolution you want. I run games at 1080p, 12XXp, 1440p, 1620p, 1800p and 2160p. All depending on the game and target framerate.
It's super easy to set up on AMD cards at least, and I reckon it's the same for Nvidia.
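If anyone's curious what those resolution steps actually cost, here's a rough sketch with my own numbers, assuming GPU load scales roughly with pixel count (and skipping the "12XXp" step since the exact figure isn't given):

```python
# Rough sketch: relative GPU pixel load for the render resolutions mentioned
# above, assuming cost scales roughly with pixel count (a simplification;
# plenty of per-frame work doesn't scale with resolution).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1620p": (2880, 1620),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}

native_4k = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, ~{100 * pixels / native_4k:.0f}% of native 4K load")
```

That works out to roughly 25%, 44%, 56%, 69% and 100% of the native 4K pixel load, which is why stepping through intermediate resolutions is such a useful tool for hitting a target framerate.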
Sorry I put you in with the other quote, but do you honestly believe that if devs/pubs can get away with checkerboard 4K, they will care about optimizing proper 4K support?
A lot of people use their PCs on a TV, and 4K TVs are becoming common. I'm only running a GTX 980 Ti, which is enough for 4K in some games but falls short in many others.
What you're not getting is that checkerboard 4K looks virtually identical to "real" 4K when played on a TV at a normal or even close viewing distance. You need to be very close to the screen to appreciate any difference and the difference is often very subtle.
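As a back-of-the-envelope check on the viewing-distance point (my own illustrative numbers, assuming a 55-inch 16:9 panel and the ~60 pixels per degree often quoted for 20/20 vision):

```python
# Back-of-the-envelope: pixels per degree for a 4K image on a 55" 16:9 TV
# at a few viewing distances, versus the ~60 px/deg commonly quoted for
# 20/20 vision. Panel size, distances and the 60 px/deg figure are
# illustrative assumptions.
import math

DIAGONAL_IN = 55.0
H_RES = 3840
width_m = DIAGONAL_IN * 0.0254 * 16 / math.hypot(16, 9)  # ~1.22 m picture width

for distance_m in (1.0, 2.0, 3.0):
    h_fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    ppd = H_RES / h_fov_deg
    verdict = "above" if ppd > 60 else "below"
    print(f"{distance_m:.0f} m: ~{h_fov_deg:.0f} deg wide, ~{ppd:.0f} px/deg ({verdict} ~60 px/deg)")
```

Even at 2 m the image is already far denser than 20/20 vision resolves, which is why the reconstructed half of a checkerboarded frame is so hard to pick out from the couch.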
Well, that's why all of this hullabaloo over 4K is rather silly for gaming. On a TV at least. That applies to movies as well, but to a lesser degree, since AFAIK there's nothing mastered at an intermediary resolution between 1080p and 4K. While the difference isn't nearly as pronounced as SD to HD, it's still "somewhat" there, but HDR and WCG are the real improvements.
...but it saves a TON of performance and looks dramatically better than standard 1440p. Being able to do this on my TV would be an amazing thing that would allow me to enjoy better image quality on more games.
Dramatically better? I actually play almost everything exclusively at 1440p on my TV, and some games look crisper, but not that much. I'd much rather play 1440p/60 than 4K/30 any day of the week too. I really wonder how good the vision of some people is, because while most games are in motion you can only focus on so many objects at once, and almost everything else around them becomes blurred. Games in motion almost never look like they do in screenshots. I do have Horizon on the Pro and think it looks great, but it would probably look just as good at 1440p. Maybe even better, due to being able to add slightly more to the geometry or lighting.
I'm already running a 4K OLED and, I'm telling you, the difference between checkerboard and "real" 4K is very minimal unless you're sitting a foot away from the TV. I don't think people get just how convincing it can be.
I agree, but I also don't see there being this night and day difference in most games going from 1440p to 4K either. Very minor gain for a huge performance hit. The option for checkerboard, or some PC specific version of it would be nice for those who want it though. I personally don't find it necessary since 1440p already looks good on a TV. Those that own, or will own a 4K monitor however might get more mileage out of this. On top of that, by the time I upgrade again (fall of next year), 4K/60 capable cards for modern games will be out.
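To put rough numbers on the performance argument a couple of posts up (my own arithmetic, assuming checkerboarding shades roughly half of the output grid each frame and reconstructs the rest, which is how the technique is usually described):

```python
# Rough shaded-pixel count per frame. Assumption: checkerboard 4K shades
# about half of the 3840x2160 grid each frame and reconstructs the rest
# (reconstruction adds some cost on top, not modelled here).
native_1440p = 2560 * 1440      # ~3.7 M pixels
native_4k    = 3840 * 2160      # ~8.3 M pixels
cbr_4k       = native_4k // 2   # ~4.1 M pixels

for name, px in [("1440p", native_1440p), ("4K CBR", cbr_4k), ("native 4K", native_4k)]:
    print(f"{name:>9}: {px / 1e6:.1f} M shaded pixels ({px / native_4k:.0%} of native 4K)")
```

Reconstruction isn't free, but shading is usually the dominant cost, so the workload of 4K CBR tends to land much closer to 1440p than to native 4K.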
Movies that fully resolve 4k are very rare.
1) Many movies are still mastered in 2k and then upscaled for UHD bluray.
2) Very few cameras actually resolve 4k. Either they have fewer than 4k pixels (Arri Alexa @ 2880 x 1620) and/or lose resolution to the Bayer filter.
For example Red Dragon 6k resolves around 3400x1900.
3) VFX are rarely done in 4k. Even though Guardians of the Galaxy was shot on 8k RAW, they still used 2k VFX most of the time.
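Quick sketch of the Bayer point; the photosite grid and the resolving factor below are my own assumptions, picked to land in the same ballpark as the ~3400x1900 figure quoted above:

```python
# Rough sketch: effective resolved detail after Bayer demosaicing.
# The photosite grid and resolving factor are assumptions chosen to land
# near the ~3400x1900 figure quoted above for the 6K Red Dragon.
SENSOR_W, SENSOR_H = 6144, 3160   # assumed "6K" photosite grid
RESOLVING_FACTOR = 0.57           # assumed fraction of the grid actually resolved per axis

resolved_w = round(SENSOR_W * RESOLVING_FACTOR)
resolved_h = round(SENSOR_H * RESOLVING_FACTOR)
print(f"~{resolved_w} x {resolved_h} resolved, vs. the 3840 x 2160 needed to fully resolve a UHD master")
```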
Not gonna lie, it's so annoying how people and Sony throw 4K around like it's basically the definition of the best, which it's not.
If you play on a decent rig (especially with a good GPU) you know there's much more to it than just resolution.
Stuff like settings (from low to ultra), combined with a minimum of 60 fps (or around 144 fps if possible), is much more important most of the time than having native 4K output.
Console versions are often specially tweaked and use lowered settings to reach 60 fps (and often end up unstable, or just at 30) to get there.
I personally like the idea of the PS4 Pro with the option to play at higher framerates if developers allow it.
That's much more important than just a higher resolution image.
I don't think anyone is saying that resolution is the only thing that matters, though. 4K is basically the best resolution at a consumer level at this point in time. The biggest selling point with PS4 Pro is 4K support. It will be the same with Scorpio.
Correct me if I'm wrong, but I've yet to see anyone mention that there are 4K 120/144 Hz VRR displays coming out this year. When we get GPUs capable of 4K 60 fps, imagine what you could do with CBR on a high refresh rate display, assuming your game of choice is GPU-limited and capable of arbitrary frame rates.
And - correct me if I'm wrong again - CBR inherently benefits from higher framerates because you're getting more relevant information from the previous frame, per frame. In other words, less ghosting and artifacts.
So you're not only getting a better experience from higher refresh rates; the difference between 4k and 4k CBR becomes even less discernible. Every PC gamer should be excited if CBR becomes a widely accessible option, especially if it can be enabled at the driver level.
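For anyone unfamiliar with what CBR actually does under the hood, here's a heavily simplified toy version of my own (ignoring the motion-vector reprojection and ID buffers real implementations use): shade half the pixels in a checker pattern each frame, alternate the pattern, and fill the gaps from the previous frame. The higher the framerate, the fresher that previous-frame data is, which is why the artifacts shrink.

```python
import numpy as np

# Toy checkerboard reconstruction (heavily simplified: static scene, no motion
# vectors, no ID buffer; real CBR reprojects the previous frame's samples
# along per-pixel motion before filling the gaps).

def checker_mask(h, w, parity):
    """True where this frame actually shades pixels; the pattern flips each frame."""
    yy, xx = np.mgrid[0:h, 0:w]
    return (xx + yy) % 2 == parity

def reconstruct(shaded, mask, previous_full):
    """Keep last frame's reconstruction, overwrite the freshly shaded half."""
    out = previous_full.copy()
    out[mask] = shaded[mask]
    return out

h, w = 4, 8
frame_n  = np.arange(h * w, dtype=float).reshape(h, w)   # "rendered" frame n
frame_n1 = frame_n + 0.5                                  # frame n+1, scene changed slightly

full = np.zeros((h, w))
full = reconstruct(frame_n,  checker_mask(h, w, parity=0), full)  # frame n: even cells fresh
full = reconstruct(frame_n1, checker_mask(h, w, parity=1), full)  # frame n+1: odd cells fresh

# After frame n+1, half the pixels are current and half are one frame old.
# At high framerates that one-frame-old data is much closer to correct,
# which is why artifacts shrink as fps goes up.
print(full)
```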
Yeah, that's what I think too; it's just not fun to upgrade something that will cause something else to be downgraded. I think most core PC gamers will stick with 1440p screens, until prices go down on 4K 144 Hz screens that is.
So what's the takeaway here? The fact is that PC gaming still has a lot to learn from the consoles, and PS4 Pro in particular, when it comes to addressing a 4K screen.
Acer has announced a 4K 144 Hz G-Sync screen which is coming this year. There is no price for it yet though. I guess it'll be super expensive since they didn't even hint at the price range. :/
For me with a triple screen setup it'll still take years until 4K is a reality, on PC that is. The GPU must be seriously powerful and the prices need to go down to at least 500/screen for me to even consider it.
So I'll most likely go with a Scorpio for my 4K gaming instead. Sad but true. :/
"Good implementation" is the keyword here. It sticks out (or it used to stick out) in many games as an visibly additional effect on top, outside of the coherency of the fluidity of the games overall motion.
If you can easily identify per-object motion blur in a running game (not in screenshots) without particularly looking for it, then it is not a good implementation. I can see the strobing argument in favor of it at lower framerates, but not at higher framerates.
Wanted to go for a Ryzen 7 and RX 580, but between the TDP increase of the RX 580 vs. the RX 480 and the scarcity and high prices, I ended up just going with a Kaby Lake Pentium and a 1050 Ti until things improve.
Thing is a beast compared to my last computer (Phenom II 1055T and a 560 Ti). Shocked it held up at 2160p (even if they are older games like Sonic Generations, Cities Skylines, Supreme Commander 2, Sim City 2013, Sims 4 and the like). Wasn't expecting that much since the raw teraflops weren't that significantly increased, but I suppose efficiency gains over half a decade work some magic.
I play my AAA multiplats on PS4 Pro by default and have a 4K monitor for productivity purposes. I have tried some games like the Forza Horizon 3 demo at 2160p, and it varies between playable (targeting mostly medium settings at 2160p and 30 fps) and a bit stuttery*. I think if you worked checkerboarding and dynamic resolution in, it would be playable for me even on this lowly 1050 Ti.
*Certainly many PC gamers would find it unplayable and I would certainly turn down the settings so it is a little smoother, but I remember playing GTA3 on a Pentium MMX 166mhz with a TNT2 M64 at about 10 fps so my threshold can go quite low for short periods at least.
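On the checkerboarding and dynamic resolution point, this is roughly the idea behind a dynamic resolution controller (a minimal sketch of the general technique, not any specific engine's implementation, with made-up numbers): measure GPU frame time and nudge the render scale so it stays inside the frame budget.

```python
# Minimal sketch of a dynamic resolution controller (the general idea, not any
# particular engine's implementation; all numbers are made up): measure GPU
# frame time and nudge the render scale so it stays inside the frame budget.

TARGET_MS = 33.3                 # 30 fps budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp so the image never gets too soft

def update_scale(scale, gpu_ms):
    # Assume GPU cost ~ pixel count ~ scale^2, solve for the scale that would
    # hit the budget, then move only part of the way there to avoid oscillation.
    ideal = scale * (TARGET_MS / gpu_ms) ** 0.5
    new_scale = scale + 0.5 * (ideal - scale)
    return min(MAX_SCALE, max(MIN_SCALE, new_scale))

scale = 1.0
for gpu_ms in [45.0, 40.0, 36.0, 30.0, 26.0]:   # simulated measured frame times
    scale = update_scale(scale, gpu_ms)
    w, h = int(3840 * scale), int(2160 * scale)
    print(f"gpu {gpu_ms:4.1f} ms -> render scale {scale:.2f} ({w}x{h})")
```

The damped adjustment is the important part: react too aggressively and the resolution visibly pumps up and down every few frames.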
I knew that (I guess I was half asleep when posting)! Hell, the first thing I noticed when watching some of my first "4K" UHDs on my TV was how utterly "okay" they looked, and how they weren't anywhere near as impressive as the native 4K/HDR demo videos I had previously watched on my set when I first got it. So I guess my statement was more about how things currently are, and not so much how they could potentially be.
Right now, most of the "4K" films are akin to how a lot of the initial movies and games were in 720p and labeled "HD". It technically was HD, but it wasn't "full HD" or whatever, and this was still going on long after 1080p sets were the norm.