Killzone: Shadow Fall Multiplayer Runs at 960x1080, Vertically Interlaced

The moment I got KZ:SF and fired up the multiplayer, I thought it looked extremely pretty, but something was off with the image. I popped in Battlefield and for some reason it looked smoother. Now I know why :(
 
What a fun(ny) thread.

Much like I didn't care about Ryse, or Dead Rising 3, or the resolution of any of these other console games... I don't care about Killzone's resolution.

It is funny that no one figured out why it was blurry/messy, even after some people noticed. Surprised detective-GAF didn't work it out.

As I've said every time a resolution thread comes up: people need to let go of their expectations on this subject. We are going to see some 1080p games, but the ones that really push the envelope (especially at launch... see Ryse/Killzone) have to make sacrifices in some situations. Sometimes in single player, sometimes in multiplayer, and sometimes across the board for a singular look.

These consoles don't have the hardware of the gods. Enjoy what they deliver (if it's fun/entertaining)...and don't stress about it.

Now if you're just curious about the tricks these developers use to pull off this stuff, I agree that it's very interesting. The image quality looks excellent in motion from where I sit. My wife and I play sitting back on our bed, a good 10-12' from the TV... so there's not a huge difference (image-wise) between the PS4/X1/PC games we play.

I only notice if I sit on the edge of the bed (5' away)...and then I can definitely see it, but don't care. I rarely sit that close anyway.

:)
 
Actually, that's quite significantly different. A dynamic framebuffer is simply upscaled. Here you apparently have a spatial and a temporal sample (or actually a range of temporal samples?) per two pixels, which are fed into some heuristic to derive the final color values.
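If you're curious what that could look like, here's a minimal sketch of the general idea (my own illustration, not Guerilla's actual code; the alternating-column layout, the motion-based rejection test and all names are assumptions on my part, and a real implementation would reproject the history along motion vectors rather than reuse it directly):

```python
import numpy as np

W, H = 1920, 1080  # output size; only W // 2 = 960 columns are freshly shaded per frame

def reconstruct(shaded, prev_full, phase, unreliable):
    """Rebuild a full 1920x1080 frame from a 960x1080 render.

    shaded     : (H, W // 2) float array, the columns shaded this frame (spatial samples)
    prev_full  : (H, W) float array, last frame's reconstruction (temporal samples)
    phase      : 0 or 1, which of the two alternating column sets was shaded
    unreliable : (H, W) bool array, True where the heuristic rejects the temporal
                 sample (e.g. because of motion or depth discontinuities)
    """
    full = prev_full.copy()        # start from the temporal history
    full[:, phase::2] = shaded     # drop in the freshly shaded columns

    # For the stale columns, fall back to a spatial blend of the horizontal
    # neighbours wherever the temporal sample was rejected.
    spatial = 0.5 * (np.roll(full, 1, axis=1) + np.roll(full, -1, axis=1))
    stale = np.zeros((H, W), dtype=bool)
    stale[:, 1 - phase::2] = True
    return np.where(stale & unreliable, spatial, full)
```

So every output pixel pair ends up with one spatial sample from the current frame and one temporal sample from the previous one, merged by the heuristic, which is exactly the trade described above.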

Hey, I agree, but WipEout on PS3 made use of the horizontal scaler, as did GT5, while still outputting 1080p. But those are not the same as the KZ:SF implementation, yep; the interpolation based on the previous frame is pretty clever.
 
But people did notice it. The multiplayer was widely considered a blurry mess.

Exactly. The only difference is that since they said "it is native 1080p", people just scratched their heads and went, "well, then it must be their implementation of FXAA, 'cause this looks like shit". I've seen at least two people claim that BF4 at 900p was way sharper than Killzone's multiplayer.

I don't know where this "no one noticed" thing is coming from; a lot of people did. They just can't count pixels, and since they were being told it was 1080p, they assumed it was a blurry mess for other reasons.

The moment I got KZ:SF and fired up the multiplayer, I thought it looked extremely pretty, but something was off with the image. I popped in Battlefield and for some reason it looked smoother. Now I know why :(

See, just as I mention it, another example pops up.

People noticed it; they just couldn't say for sure why, since it was supposed to be 1920x1080.
 
It makes you wonder how much Sony must have paid Digital Foundry to not 'notice' this until so late in the day, though...
 
Wouldn't it have been easier to simply turn down some effects, AA, texture resolution, etc. to keep MP at 60? This seems like a rushed, last-minute solution. As if they developed MP at 30 and changed their minds late in development.

Indeed. I'm inclined to think you're dead on there. Going to 60fps was a late decision.
 
Exactly. The only difference is that since they said "it is native 1080p", people just scratched their heads and went, "well, then it must be their implementation of FXAA, 'cause this looks like shit". I've seen at least two people claim that BF4 at 900p was way sharper than Killzone's multiplayer.

I don't know where this "no one noticed" thing is coming from; a lot of people did. They just can't count pixels, and since they were being told it was 1080p, they assumed it was a blurry mess for other reasons.



So they noticed something... they just didn't notice the resolution??
In other words, the resolution didn't matter once they THOUGHT that it was 1080p.
 
General question to those saying "see, this shows resolution doesn't matter": you all saved money and got 720p TVs, right? I mean, why bother getting a 1080p set if it doesn't matter?
 
So they noticed something... they just didn't notice the resolution??
In other words, the resolution didn't matter once they THOUGHT that it was 1080p.

They noticed the resolution.

The problem is, when a company or developer claims "1080p resolution," it causes confusion. People took those official statements at face value.
 
So they noticed something... they just didn't notice the resolution??
In other words, the resolution didn't matter once they THOUGHT that it was 1080p.

No, they noticed it was blurry. The resolution is usually the first suspect in those cases, but people who work on this stuff, and know more than they do, told them it was 1080p. That didn't make the image look as clear as 1080p does (like in single player); it was still a blurry mess. But since the people they were supposed to trust told them it was native 1080p, they just figured it must be something else.

It still looked like shit, and the reason was the resolution after all. If anything, this proves that even people who can't count pixels do notice the resolution difference. They may not have enough knowledge to instantly say "this definitely isn't rendering at 1920x1080", but just by looking they can tell something's off.

It's worth saying that I don't own a PS4 and haven't played Shadow Fall, so I'm just going by what I've read around here. Hyperbole from either side may have affected my view.

EDIT: I don't even think they were wrong to lower the resolution. It's multiplayer; the framerate should be their priority, and I'm glad it was. But this "no one noticed, stop crying" nonsense needs to stop.
 
Exactly. The only difference is that since they said "it is native 1080p", people just scratched their heads and went, "well, then it must be their implementation of FXAA, 'cause this looks like shit". I've seen at least two people claim that BF4 at 900p was way sharper than Killzone's multiplayer.

I don't know where this "no one noticed" thing is coming from; a lot of people did. They just can't count pixels, and since they were being told it was 1080p, they assumed it was a blurry mess for other reasons.



See, just as I mention it, another example pops up.

People noticed it; they just couldn't say for sure why, since it was supposed to be 1920x1080.

Can someone provide a link where Guerilla states that KZ:SF multiplayer renders at native 1080p?
 
General question to those saying "see, this shows resolution doesn't matter": you all saved money and got 720p TVs, right? I mean, why bother getting a 1080p set if it doesn't matter?


Well... TV programs aren't really 1080p either. For the most part, Blu-ray and video games are the only times your TV gets fed a true 1080p signal, and even then, only plasma can do 1080 lines in motion, unless the new OLEDs can as well.
 
Well that's funny, because I've heard so much about the differences in resolution here lately.

And I wonder what fraction of the total Shadow Fall player base the people on NeoGAF who moan and whine about resolution represent.

How significantly below 1% do you think we're talking? 0.05%? Less?
 
So the multiplayer runs at 1080p/30fps.

The method they are using is pretty good. I noticed worse image quality, but I thought it was because of the AA and the LOD.

Not bad if it's true, but I would have loved to have it at 1080p/60fps.
 
So the multiplayer runs at 1080p/30fps.

The method they are using is pretty good. I noticed worse image quality, but I thought it was because of the AA and the LOD.

Not bad if it's true, but I would have loved to have it at 1080p/60fps.

No, it does not. In fact, this resolution and the game's inability to hold 60fps even with it suggest that full 1080p at 30 might itself be questionable in some multiplayer scenarios.
 
So the multiplayer runs at 1080p/30fps.

The method they are using is pretty good. I noticed worse image quality, but I thought it was because of the AA and the LOD.

Not bad if it's true, but I would have loved to have it at 1080p/60fps.
No, the single player runs at 1080p at 30fps. The multiplayer is around 50fps.
 
So the multiplayer runs at 1080p/30fps.

The method they are using is pretty good. I noticed worse image quality, but I thought it was because of the AA and the LOD.

Not bad if it's true, but I would have loved to have it at 1080p/60fps.

No, it runs at better than 30fps, at 960x1080 with some image processing; hence the worse image quality.
 
So the multiplayer runs at 1080p/30fps.

The method they are using is pretty good. I noticed worse image quality, but I thought it was because of the AA and the LOD.

Not bad if it's true, but I would have loved to have it at 1080p/60fps.
It actually runs at 960x1080 plus a heuristic temporal reprojection to get a full 1080p frame, at ~50fps. There's really no way to make an accurate shorter statement :P
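The pixel arithmetic behind that trade is straightforward, assuming (plausibly, but it's an assumption) that shading cost scales roughly linearly with the number of pixels shaded:

```python
# Per-frame shaded pixel counts (back-of-the-envelope, not a profiling result)
full = 1920 * 1080   # 2,073,600 pixels at native 1080p
half = 960 * 1080    # 1,036,800 pixels at 960x1080
print(half / full)   # 0.5 -> roughly half the shading work per frame,
                     # which is what buys the climb from ~30fps toward ~60fps
```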
 
And I wonder what fraction of the total Shadow Fall player base the people on NeoGAF who moan and whine about resolution represent.

I never quite get this argument. You don't have to reflect on the technicalities involved to notice that a game has deficits in its IQ or frame rate. It's hard to reflect on them anyway when you don't have two examples running side by side, but they make a difference and they add up. Otherwise, this argument could have been made for every single evolutionary step in computer graphics. If people had stopped taking these small steps because nobody cares, nobody understands and nobody notices anyway, game visuals would not have progressed the way they did.
 
Why did Guerilla even do this? I mean, I could kind of understand if it had guaranteed a 60fps framerate, but the average is not really anywhere close to that. It kind of reminds me of Bungie when they made the absurd decision to trash IQ to implement higher-precision HDR lighting through two framebuffers. Not at all worth the tradeoff.
 
Can someone provide a link where Guerilla states that KZ:SF multiplayer renders at native 1080p?

I believe Eric Boltjes stated it at the Eurogamer Expo. I'll have to look through the video again.

Edit: just watched the whole video and found nothing... sorry.
 
Why did Guerilla even do this?

Yeah, they should have patched it to native 1080p at a locked 30fps. 60fps is of course better, but not when the frame rate fluctuates like it does in this game; it's really annoying.
 
Seems like an interesting approach. I'd be interested to know how much of the engine/game was designed around 4GB before the jump to 8GB of RAM, and how much GG was able to adapt it to use the additional RAM.

I also wish DF would make up their minds on whether resolution matters or not - they seem to keep bouncing back and forth in recent articles.
 
Why did Guerilla even do this? I mean, I could kind of understand if it had guaranteed a 60fps framerate, but the average is not really anywhere close to that. It kind of reminds me of Bungie when they made the absurd decision to trash IQ to implement higher-precision HDR lighting through two framebuffers. Not at all worth the tradeoff.

Debatable. Halo's lighting was both fantastic and extremely characteristic. Reach probably had the best lighting last gen.
I also wish DF would make up their minds on whether resolution matters or not - they seem to keep bouncing back and forth in recent articles.

In an article debating FPS versus resolution...
 
Why did Guerilla even do this? I mean, I could kind of understand if it had guaranteed a 60fps framerate, but the average is not really anywhere close to that. It kind of reminds me of Bungie when they made the absurd decision to trash IQ to implement higher-precision HDR lighting through two framebuffers. Not at all worth the tradeoff.
Well, one could speculate that if they are shading/fillrate limited, and the game often drops significantly below 60fps in multiplayer even with this measure, then rendering at full 1080p would make it drop below 30fps. That really would be unacceptable in a multiplayer FPS.

Debatable. Halo's lighting was both fantastic and extremely characteristic. Reach probably had the best lighting last gen.
I don't think it's debatable in the case of Halo 3. Its IQ was utterly atrocious.
 
They noticed the resolution.

The problem is, when a company or developer claims "1080p resolution," it causes confusion. People took those official statements at face value.

So it's a matter of the messenger, and of who GAF, me included, considers trustworthy?
I'll play devil's advocate by asking: if it didn't look right, wouldn't it have been worth finding out why?




No, they noticed it was blurry. The resolution is usually the first suspect in those cases, but people who work on this stuff, and know more than they do, told them it was 1080p. That didn't make the image look as clear as 1080p does (like in single player); it was still a blurry mess. But since the people they were supposed to trust told them it was native 1080p, they just figured it must be something else.

It still looked like shit, and the reason was the resolution after all. If anything, this proves that even people who can't count pixels do notice the resolution difference. They may not have enough knowledge to instantly say "this definitely isn't rendering at 1920x1080", but just by looking they can tell something's off.

It's worth saying that I don't own a PS4 and haven't played Shadow Fall, so I'm just going by what I've read around here. Hyperbole from either side may have affected my view.

EDIT: I don't even think they were wrong to lower the resolution. It's multiplayer; the framerate should be their priority, and I'm glad it was. But this "no one noticed, stop crying" nonsense needs to stop.

So, in other words: once they were told by a trusted source, the resolution didn't matter. The game's faults were attributed to something else.

It's just crazy. KZ's definitive 1080p-ness... I know, I know... had been trumpeted so hard for months. It's just a shame.
 
Why did Guerilla even do this? I mean, I could kind of understand if it had guaranteed a 60fps framerate, but the average is not really anywhere close to that. It kind of reminds me of Bungie when they made the absurd decision to trash IQ to implement higher-precision HDR lighting through two framebuffers. Not at all worth the tradeoff.
I'm glad they did. Halo 3 had the best natural lighting of any Halo. I loved it.
 
Well... TV programs aren't really 1080p either. For the most part, Blu-ray and video games are the only times your TV gets fed a true 1080p signal, and even then, only plasma can do 1080 lines in motion, unless the new OLEDs can as well.

OMG, it's not 2005 anymore; there are plenty of affordable LCDs now that can display full 1080p in motion (aka "motion resolution").
 
Well, one could speculate that if they are shading/fillrate limited, and the game often drops significantly below 60fps in multiplayer even with this measure, then rendering at full 1080p would make it drop below 30fps. That really would be unacceptable in a multiplayer FPS.

The SP doesn't drop below 30fps, so the MP should have been able to hit that target as well.
 
That's what happens when your 30fps game needs to run at 60 because of a last-minute decision.

When the marketing team realised they could win a few battles with 1080p.

It's like the PS3 with its 1080p support, 3D support, Move support, etc. It will be dropped and disregarded once they realise it's not technically feasible, or that 99% of their userbase doesn't care for it.
 
The SP doesn't drop below 30fps, so the MP should have been able to hit that target as well.
That depends; the MP could be more demanding. It's almost always more demanding CPU-wise (which is of course irrelevant w.r.t. rendering resolution), but it could also have a more unpredictable/demanding GPU load.
 
Well, I've seen more threads about resolution in the past couple of months on NeoGAF than I have in the 8+ years combined that I've visited the site.

So my question is: doesn't it?

I might be playing devil's advocate a bit here, but the proof is in the pudding, I think. This is a first-party game, in a lot of people's hands, and no one noticed until now.
 
So, in other words: once they were told by a trusted source, the resolution didn't matter. The game's faults were attributed to something else.

It's just crazy. KZ's definitive 1080p-ness... I know, I know... had been trumpeted so hard for months. It's just a shame.
The single player is still running at 1080p though, so it depends on what people have been referring to. It's likely that both single player and multiplayer were believed to be 1080p; I think that's indeed the case, I agree. But at least a solid part of the game is still running at 1080p.
 
Why did Guerilla even do this? I mean, I could kind of understand if it had guaranteed a 60fps framerate, but the average is not really anywhere close to that.

50fps is still a big leap in perceptible fluidity, and also a significant decrease in input lag. If there were no framerate tests done and posted on YouTube, not a single soul would have been able to tell that it wasn't actually running at a full 60fps clip.

A lot of people seem to be under the preconception that it's either "all or nothing" with 60fps. But every increase in frames rendered is beneficial, especially in a competitive MP game.
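To put rough numbers on the input-lag point (simple frame-time arithmetic, nothing measured):

```python
# Frame time in milliseconds is 1000 / fps; input latency scales with it.
for fps in (30, 50, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 50 fps -> 20.0 ms, 60 fps -> 16.7 ms:
# going from 30 to 50 already recovers about 80% of the 30-to-60 improvement.
```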
 
Debatable. Halo's lighting was both fantastic and extremely characteristic. Reach probably had the best lighting last gen.


In an article debating FPS versus resolution...

I was speaking to the fact that they seem to alternate between articles that debate the issue and face-offs where they make definitive statements on what matters more (possibly because they feel locked into having to declare a "winner", even when games have different resolutions/effects that make this challenging).

It just feels like they move the goalposts depending on the article type.
 