Digital Foundry PS4 Pro Launch Coverage Begins

Which? Uncharted 4, while clean, looked a tad too smudged for my taste, and The Order was even worse. Driveclub's AA was downright poor, and then there's Bloodborne, which IIRC has no AA and awful IQ thanks to the chromatic aberration.
Off the top of my head, UC4 in particular has phenomenal AA. I couldn't believe what it looked like on my TV. Infamous and Ratchet and Clank were really good too; The Order was softer looking, but it had practically no aliasing, and the softer image made sense for the cinematic look the game was going for. Then you have smaller games like Bound or Tearaway, which use obscene amounts of AA for practically perfect IQ.
 
Supersampling is one of the best ways to enhance the image quality of a game. There's a reason why it's still an option on PC even though it's very expensive.
 
Off the top of my head, UC4 in particular has phenomenal AA. I couldn't believe what it looked like on my TV. Infamous and Ratchet and Clank were really good too; The Order was softer looking, but it had practically no aliasing, and the softer image made sense for the cinematic look the game was going for. Then you have smaller games like Bound or Tearaway, which use obscene amounts of AA for practically perfect IQ.

Going from playing games at 4K with SMAA downsampled to 1080p, Uncharted 4 looks a bit smudged. Of course it has very nice AA for a console game, but I wouldn't call it a "playable bullshot".

Order and DC [while in motion, which means almost always] were also amazing.

The Order (even in motion) was way too soft for my liking.
 
I'm excited for the highest quality pixels. Sold my Destiny PS4 for a decent amount, so now just need to wait for next week.

 
Supersampling is one of the best ways to enhance the image quality of a game. There's a reason why it's still an option on PC even though it's very expensive.
Exactly. It's a huge deal for 1080p TV owners and I'm not sure why it's being downplayed (other than Sony not doing a good job of spelling it out). Even if the game tops out at, say, 1440p (just an example), downscaling that to 1080p provides a massive boost in image quality and stability.

Even if the game still only uses FXAA, rendering at a higher internal resolution will improve visual quality tremendously.
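To illustrate why (a toy sketch, not how any game or the Pro actually does it): downscaling amounts to averaging several rendered samples into each output pixel, so hard edges land on in-between values instead of stair-stepping.

```python
import numpy as np

def box_downsample_2x(img):
    """Average each 2x2 block of rendered samples into one output pixel.
    Simplest possible downscale filter; real scalers use nicer kernels,
    but the anti-aliasing effect is the same idea."""
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

# A hard diagonal edge rendered at 2x resolution...
hi_res = np.triu(np.ones((4, 4)))
print(box_downsample_2x(hi_res))
# ...comes out with in-between grey values along the edge instead of
# a jagged 0/1 staircase: [[0.75 1.  ] [0.   0.75]]
```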
 
Infamous:SS runs at ~45FPS, so it stands to reason that it might be able to run at or very close to 60FPS if they choose to focus on boosting performance rather than visuals on Pro.

Sure, but that's a lot different from expecting 30fps games that are demanding on the CPU to suddenly run at 60, and calling any dev who doesn't do it lazy.
 
Exactly. It's a huge deal for 1080p TV owners and I'm not sure why it's being downplayed (other than Sony not doing a good job of spelling it out). Even if the game tops out at, say, 1440p (just an example), downscaling that to 1080p provides a massive boost in image quality and stability.

Even if the game still only uses FXAA, rendering at a higher internal resolution will improve visual quality tremendously.

I've got to be honest: when downsampling from 1440p to 1080p I don't see all that much difference; downsampling from 4K, however, is a completely different story. 8K is where the fun really starts :P
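The raw sample counts line up with that impression (simple arithmetic, nothing more):

```python
# Samples feeding each 1080p output pixel at various render resolutions.
target = 1920 * 1080
for name, (w, h) in [("1440p", (2560, 1440)),
                     ("4K",    (3840, 2160)),
                     ("8K",    (7680, 4320))]:
    print(f"{name}: {w * h / target:.2f} samples per 1080p pixel")
# 1440p: 1.78, 4K: 4.00, 8K: 16.00 -- each step roughly quadruples
```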
 
Going from playing games at 4K with SMAA downsampled to 1080p, Uncharted 4 looks a bit smudged. Of course it has very nice AA for a console game, but I wouldn't call it a "playable bullshot".
He said something along the lines of "almost". Although I guess with Pro this will literally be the case now with supersampled visuals.

Sure, but that's a lot different from expecting 30fps games that are demanding on the CPU to suddenly run at 60, and calling any dev who doesn't do it lazy.
It all depends, really, on what the game runs at internally. A game that rarely, if ever, drops below 30 is usually running at something much higher than that on average; it's just capped. That's, for example, why ROTR offers the unlocked framerate option.
 
Exactly. It's a huge deal for 1080p TV owners and I'm not sure why it's being downplayed (other than Sony not doing a good job of spelling it out). Even if the game tops out at, say, 1440p (just an example), downscaling that to 1080p provides a massive boost in image quality and stability.

Even if the game still only uses FXAA, rendering at a higher internal resolution will improve visual quality tremendously.

I sort of wish they actually had 1440p support for those of us with those monitors. Before people say that's just a small segment: they have 4K 4:2:0 support on the Pro, and I find it hard to believe there are more 4K monitors with that spec out there than 1440p ones.
 
Is this only confusing because the quoted user has now edited their post? The system definitely comes with an HDMI cable; I can't imagine it wouldn't be rated for 4K.

Edit: Someone already pointed it out to you.

Yep, I edited it. Fixed. Thank you.
 
Exactly. It's a huge deal for 1080p TV owners and I'm not sure why it's being downplayed (other than Sony not doing a good job of spelling it out). Even if the game tops out at, say, 1440p (just an example), downscaling that to 1080p provides a massive boost in image quality and stability.

Even if the game still only uses FXAA, rendering at a higher internal resolution will improve visual quality tremendously.

I think this is hyperbole. You can see an improvement but it really isn't the day and night difference you are presenting it to be.
 
I sort of wish they actually had 1440p support for those of us with those monitors. Before people say that's just a small segment: they have 4K 4:2:0 support on the Pro, and I find it hard to believe there are more 4K monitors with that spec out there than 1440p ones.

Isn't that just because it's the color subsampling used by movies/Blu-rays? (YCbCr/YUV colorspace)
 
I think this is hyperbole. You can see an improvement but it really isn't the day and night difference you are presenting it to be.

Downsampling from 1440p-1800p is a huge improvement for me. Low resolutions are the worst thing about consoles for me. I'm still playing on my PS3 (I take my time with consoles ^^), and 720p is a mess now. So giving the PS4 a boost in image quality is a fantastic thing for the future.
 
I sort of wish they actually had 1440p support for those of us with those monitors. Before people say that's just a small segment: they have 4K 4:2:0 support on the Pro, and I find it hard to believe there are more 4K monitors with that spec out there than 1440p ones.
Well, there's a good reason for that...

The current HDMI spec cannot support 4K60 using 10-bit color per channel at RGB or 4:4:4. So proper HDR actually requires the use of 4:2:0 which, admittedly, looks very good on a 4K display due to the pixel density. It's tough to pick out the difference from a normal viewing distance.

Still, getting proper 10-bit color at 4K60 in RGB mode is where we need to go but no display supports that yet (well, no HDR capable displays using HDMI).
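The bandwidth math behind that, roughly (simplified: real HDMI carries 4:2:2 in a fixed 12-bit container and halves the TMDS clock for 4:2:0, but the totals land in the same place):

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check for 4K60.
PIXEL_CLOCK = 594e6        # Hz: standard 4K60 timing, blanking included
HDMI20_GBPS = 14.4         # effective data rate (18 Gbit/s raw, 8b/10b coded)

def required_gbps(bits_per_channel, chroma):
    # Channels per pixel: luma/RGB at full rate, chroma cut by subsampling.
    channels = {"RGB/4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return PIXEL_CLOCK * bits_per_channel * channels / 1e9

for chroma, bits in [("RGB/4:4:4", 8), ("RGB/4:4:4", 10), ("4:2:0", 10)]:
    need = required_gbps(bits, chroma)
    verdict = "fits" if need <= HDMI20_GBPS else "doesn't fit"
    print(f"4K60 {chroma} {bits}-bit: {need:.2f} Gbit/s -> {verdict}")
# 8-bit RGB just squeezes in at 14.26; 10-bit RGB needs 17.82 and doesn't;
# 10-bit 4:2:0 needs only 8.91, which is why HDR falls back to it.
```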
 
Well, there's a good reason for that...

The current HDMI spec cannot support 4K60 using 10-bit color per channel at RGB or 4:4:4. So proper HDR actually requires the use of 4:2:0 which, admittedly, looks very good on a 4K display due to the pixel density. It's tough to pick out the difference from a normal viewing distance.

Still, getting proper 10-bit color at 4K60 in RGB mode is where we need to go but no display supports that yet (well, no HDR capable displays using HDMI).
So when picking our output settings, should we pick 2160p420 then if we want HDR? I thought that was for older 4K TVs, from what people said in the early Walmart thread.
 
...but it's literally happening right in front of you with Pro and Scorpio, especially with Microsoft's "beyond generations" thing

There is no way they are going to go every 2 years for games consoles.

Every 3-5 years? That's probably more feasible, or they go every 3 years for a mid-cycle refresh, then a new console every 5-6 years.
 
...but it's literally happening right in front of you with Pro and Scorpio, especially with Microsoft's "beyond generations" thing
Not really. Every 3-5 years isn't really the phone model and Sony are more than likely going to reset at the PS5.

Are there any screenshots of supersampled 1080p in the wild, guys?
 
I think this is hyperbole. You can see an improvement but it really isn't the day and night difference you are presenting it to be.

I don't know; discovering how to supersample on my PC via a GAF thread that popped up a couple of years ago was pretty nuts for me to see in some of the games that still had performance overhead on my i7-3770K + GTX 680 rig at the time. I remember supersampling DmC: Devil May Cry at just 1440p for some very noticeable gains, and doing the same with the relatively more recent Guilty Gear Xrd -SIGN- was pretty crazy too. I was able to 4K-supersample Guilty Gear on my GTX 970 (same i7-3770K) this past December for VERY noticeable IQ gains.
 
I think this is hyperbole. You can see an improvement but it really isn't the day and night difference you are presenting it to be.

Mate, listen to people who work in the field and know exactly what they are talking about, rather than spouting your assumptions which are largely based on nothing but guesswork.
 
There is no way they are going to go every 2 years for games consoles.

Every 3-5 years? That's probably more feasible, or they go every 3 years for a mid-cycle refresh, then a new console every 5-6 years.

Not really. Every 3-5 years isn't really the phone model and Sony are more than likely going to reset at the PS5.

Are there any screenshots of supersampled 1080p in the wild, guys?

"smartphone model" doesn't have to mean every 2 years. It just means there won't be a reset.
 
I think this is hyperbole. You can see an improvement but it really isn't the day and night difference you are presenting it to be.

This is a matter of personal opinion. There are plenty of people who claim to be unable to see much visual improvement when it comes to resolution bumps or supersampling. It's a thing you have to see for yourself on a proper screen to know whether it matters to you. To some, it will be night and day. To others, there's still nothing wrong or easily discernible with 720p.

I could argue the latter are being disingenuous, but it won't get me anywhere. The best I can say is that we're entitled to our opinions, and I feel bad for anyone who can't discern improved image quality, as it really seems like they're missing out.
 
Mate, listen to people who work in the field and know exactly what they are talking about, rather than spouting your assumptions which are largely based on nothing but guesswork.

You do know that people have been using downsampling for quite some time on PC and thus don't need anyone's opinion on the subject?
 
So when picking our output settings, should we pick 2160p420 then if we want HDR? I thought that was for older 4K TVs, from what people said in the early Walmart thread.
No TV right now supports 10 bits per channel RGB. If you do 2160p RGB, then you are limited to 8-bit, which means no HDR.

I've used 8-bit with HDR before on PC (Shadow Warrior 2), and it introduces a lot of nasty color banding, as HDR requires a much wider palette. You need to use 4:2:0.

I know it sounds bad, but the results are so much better than you'd expect on a 4K display.
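To put the palette difference in numbers (a toy linear quantization, not the actual PQ curve HDR uses):

```python
import numpy as np

def distinct_levels(signal, bits):
    # Quantize to 2**bits code values and count how many actually occur.
    return len(np.unique(np.round(signal * (2 ** bits - 1))))

gradient = np.linspace(0.0, 1.0, 100_000)   # a perfectly smooth ramp
print(distinct_levels(gradient, 8))    # 256 steps
print(distinct_levels(gradient, 10))   # 1024 steps
# Stretch those same steps over HDR's far wider brightness range and
# the 8-bit version shows visible bands where 10-bit usually doesn't.
```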
 
So when picking our output settings, should we pick 2160p420 then if we want HDR? I thought that was for older 4K TVs, from what people said in the early Walmart thread.

On PC? The highest you can get with 10-bit color is normally 4:2:2 (for 10/12-bit color) on an HDMI 2.0a GPU.
 
I think this is hyperbole. You can see an improvement but it really isn't the day and night difference you are presenting it to be.

Lol, the dude works at Eurogamer and writes and produces videos for Digital Foundry, so I think he knows what he's talking about.
 
Mate, listen to people who work in the field and know exactly what they are talking about, rather than spouting your assumptions which are largely based on nothing but guesswork.

It's not guesswork; downsampling has been available through GeDoSaTo and driver tools to everyone for quite some time. I've used it extensively. The difference in quality is there, but it is hardly tremendous or monumental.

Lol, the dude works at Eurogamer and writes and produces videos for Digital Foundry, so I think he knows what he's talking about.

Seriously, do you people even know what downsampling is?
 
It's not guesswork; downsampling has been available through GeDoSaTo and driver tools to everyone for quite some time. I've used it extensively. The difference in quality is there, but it is hardly tremendous or monumental.
Yeah, it's definitely a matter-of-opinion thing.

I DO think it makes a tremendous difference, but I can see why others would not. A 1080p image with FXAA versus a 1440p image downscaled to 1080p with FXAA is a world of difference in my opinion, and that's the kind of situation we're dealing with here.

Downscaling from 4K or higher is where the meat is, though.

Thankfully, 1440p isn't something we'll be seeing that often on Pro, I'm sure.
 
No TV right now supports 10 bits per channel RGB. If you do 2160p RGB, then you are limited to 8-bit, which means no HDR.

I've used 8-bit with HDR before on PC (Shadow Warrior 2), and it introduces a lot of nasty color banding, as HDR requires a much wider palette. You need to use 4:2:0.

I know it sounds bad, but the results are so much better than you'd expect on a 4K display.

But wait: in my case, since my TV only does 4K but not HDR, I can choose RGB without fear of having issues, right? All the HDMI ports on my TV are HDMI 2.0.
 
Leadbetter says not to get your hopes up regarding SATA III due to the "copying bandwidth" being really low. If SATA II is getting hobbled like that (20 MB/s), then he would expect SATA III to be hobbled as well.

Is that the weirdo SATA over USB internal bridge rearing its head again?
 
Is that the weirdo SATA over USB internal bridge rearing its head again?

It was removed from the slim, so it's unlikely they put it in the Pro.

I don't think it's a hard data cap, as in "thou shalt not pass xxMB/s"; more a limitation of either hardware or software in there somewhere. Just having a fast drive and a faster interface doesn't mean everything speeds up accordingly.
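Just to put that 20 MB/s figure in perspective (hypothetical drive size, plain arithmetic):

```python
def copy_hours(size_gb, rate_mb_s):
    # size in decimal GB, rate in MB/s -> hours
    return size_gb * 1000 / rate_mb_s / 3600

for rate in (20, 100, 300):   # observed cap vs. faster interface speeds
    print(f"500 GB at {rate:3d} MB/s: {copy_hours(500, rate):.1f} h")
# 20 MB/s -> ~6.9 hours; 300 MB/s (SATA II-ish ceiling) -> ~0.5 hours
```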
 
Yeah, it's definitely a matter-of-opinion thing.

That it is. For me, an increase in framerate is much more important than downsampling, as it directly affects the smoothness of gameplay. A clean image is nice to look at and the reduction in shimmering is great, but in the end the game still feels and plays the same. Still, since a better framerate is largely out of the question, I would rather have more and better effects than developers using all the available extra juice for downsampling.
 
Why is this a bad thing?

Because it makes every game perpetually cross-gen, and the appeal of the PS5 is diminished. The PS5 can still be backward compatible yet be home to games that exercise its full capability without having to run on the vanilla PS4. An easy example is a game that would depend on the more powerful CPU in the PS5.
 