ThaRagingAsian
The moment I got KZ:SF and fired up the multiplayer, I thought it looked extremely pretty, but something was off with the frames. I popped in Battlefield and for some reason it looked smoother. Now I know why.

This is pretty damning. Guerrilla needs to make a statement. I wonder if people will lose their jobs over this.
It looks the same as 1080p when you're not moving. It's 960x1080 with some post processing when you are moving.
How did Digital Foundry not notice this during their testing?
They noticed that the MP was blurry and blamed it on the AA used. If people were as sensitive to resolution as most claim they are, then this should have come to light a lot sooner.
Actually, that's quite significantly different. The dynamic framebuffer is simply upscaled. Here you apparently have a spatial and a temporal sample (or actually a range of temporal samples?) per 2 pixels which are fed into some heuristic to derive the final color values.
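To make the "one spatial and one temporal sample per 2 pixels" idea concrete, here is a deliberately simplified 1-D sketch of column-interleaved temporal reconstruction. Guerrilla's actual heuristic (motion-vector reprojection, per-pixel confidence, etc.) has not been published, so the function name, the neighbour-agreement test, and the threshold below are all illustrative assumptions, not the real implementation:

```python
# Toy sketch: the engine shades only half the columns (960 of 1920) each
# frame; the missing columns are filled from the previous frame's result
# when history looks trustworthy, with a spatial fallback otherwise.
# The "trust" test here (history agrees with neighbours) is a stand-in
# for whatever heuristic the real renderer uses.

def reconstruct_row(curr_half, prev_full, threshold=0.1):
    """curr_half: freshly shaded samples for the even columns of one row.
    prev_full: the fully reconstructed row from the previous frame.
    Returns a full-width row (2 * len(curr_half) values)."""
    width = 2 * len(curr_half)
    out = [0.0] * width
    for i, sample in enumerate(curr_half):
        out[2 * i] = sample                       # freshly shaded column
    for x in range(1, width, 2):                  # missing (odd) columns
        left = out[x - 1]
        right = out[x + 1] if x + 1 < width else left
        spatial = 0.5 * (left + right)            # spatial fallback
        temporal = prev_full[x]                   # previous frame's value
        # Trust history only if it roughly agrees with the neighbours,
        # i.e. the pixel has probably not changed since last frame.
        out[x] = temporal if abs(temporal - spatial) < threshold else spatial
    return out
```

On a static image the temporal path wins and the frame converges to full 1080p detail; under motion the spatial average takes over and softens the image, which matches the "sharp when still, blurry when moving" impressions reported in this thread.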
But people did notice it. The multiplayer was generally thought to be a blurry mess by lots of players.
It makes you wonder how much Sony must have paid Digital Foundry to not 'notice' this until so late in the day though...
Wouldn't it have been easier to simply turn down some effects, AA, texture res, etc. to keep MP at 60? This seems like a rushed or last-minute solution. As if they developed MP at 30 and changed their minds at the last minute.
Exactly. The only difference is that since they said "it is native 1080p", people just scratched their heads and went like "well, then it must be their implementation of FXAA, cause this looks like shit". I've seen at least two people claim that BF4 at 900p was way sharper than Killzone's multiplayer.
I don't know where this "no one noticed" thing is coming from; a lot of people did. They just can't count pixels, and since they were being told it was 1080p, they assumed it was a blurry mess for other reasons.
Or possibly doesn't care about what is a laughably minor issue that has absolutely no impact on their ability to enjoy the game they're playing or not.
So they noticed something.....they just didn't notice the resolution??
In other words, the resolution didn't matter once they THOUGHT that it was 1080p.
See, just as I mention it another example pops up.
People noticed it, they just couldn't say for sure why, since it was supposed to be 1920x1080.
General question to those saying "see, this shows resolution doesn't matter": you all saved money and got 720p TVs, right? I mean, why bother getting a 1080p one if it doesn't matter.
Well that's funny, because I've heard so much about the differences in resolution here lately.
so the multiplayer runs at 1080p30fps
The method they are using is pretty good, I noticed a worse image quality but I thought it was because of the AA and the LOD.
Not bad if it's true, but I would have loved to have it at 1080p60fps
Come on now... Why would somebody get fired over this?
No, the single player runs at 1080p at 30 FPS. The multiplayer is around 50 FPS.
They lied to us! Many people bought this game under the assumption that it ran at 1080p in multiplayer; it doesn't. This sets a bad precedent. Resolution matters now more than ever.
It actually runs at 960x1080 + a heuristic temporal reprojection to get a full 1080p frame, at ~50 FPS. There's really no way to make an accurate shorter statement.
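The arithmetic behind the dispute is worth spelling out: 960x1080 shades exactly half the pixels of a native 1920x1080 frame, with the other half synthesized by reprojection. Comparing against the 1600x900 figure mentioned upthread for BF4 is illustrative (a rough count of freshly shaded pixels per frame, nothing more):

```python
# Shaded-pixel counts per frame; reconstruction cost is ignored.
native = 1920 * 1080          # full 1080p: 2,073,600 pixels
shadow_fall_mp = 960 * 1080   # KZ:SF MP: 1,036,800 newly shaded pixels
bf4_900p = 1600 * 900         # BF4 at 900p: 1,440,000 pixels

assert shadow_fall_mp / native == 0.5
# Wrinkle: per frame, KZ:SF MP shades *fewer* new pixels than BF4 at
# 900p, which may explain the "BF4 looked sharper" impressions whenever
# the reconstruction falls back to interpolation.
```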
What a fun(ny) thread...
And I wonder what share of the total Shadow Fall player base the people on NeoGAF who moan and whine about resolution actually represent?
Does it?
Can someone provide a link where Guerrilla states that KZ:SF multiplayer renders at native 1080p?
Why did Guerrilla even do this? I mean, I could kind of understand if it would have guaranteed a 60fps framerate, but the average is not really anywhere close to that. It kind of reminds me of Bungie when they made the absurd decision to trash IQ to implement higher precision HDR lighting through two framebuffers. Not at all worth the tradeoff.
I also wish DF would make up their minds on whether resolution matters or not - they seem to keep bouncing back and forth in recent articles.
Well, one could speculate that, if they are shading/fillrate limited, and the game often drops significantly below 60 FPS in multiplayer even with this measure, rendering at full 1080p would make it drop below 30 FPS. Which really would be unacceptable in a multiplayer FPS.
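A back-of-envelope version of that speculation, assuming (naively) that GPU frame time scales linearly with shaded pixel count and taking the ~50 FPS multiplayer average quoted upthread. Real scaling is not linear (vertex, CPU, and fixed per-frame costs don't double), so these numbers are illustrative only:

```python
# Crude model: doubling the shaded pixels doubles the GPU frame time.
fps_at_half_res = 50.0                        # observed MP average
frame_time_half = 1000.0 / fps_at_half_res    # 20 ms per frame
frame_time_full = frame_time_half * 2         # full 1080p: twice the pixels
fps_at_full_res = 1000.0 / frame_time_full    # -> 25 FPS under this model
```

Under this crude model, native 1080p multiplayer would land around 25 FPS, i.e. below the 30 FPS floor, which is consistent with the speculation above.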
Debatable. Halo's lighting was both fantastic and extremely characteristic. Reach probably had the best lighting last gen.
I don't think it's debatable in the case of Halo 3. Its IQ was utterly atrocious.
They noticed the resolution.
The problem is, when a company or developer claims "1080p resolution," it causes confusion. People took those official statements at face value.
No, they noticed it was blurry; the resolution is usually the first suspect in those cases, but they were told by people who work on this and know more than they do that it was 1080p. That didn't make the image look as clear as 1080p does (like in single player), it was still a blurry mess, but since the people they were supposed to trust on this told them it was native 1080p, they just thought it must be some other reason.
It still looked like shit, and the reason was the resolution after all. If anything, this proves that even people who can't count pixels do notice the resolution difference. They may not have enough knowledge to instantly say "this definitely isn't rendering at 1920x1080", but just by looking they can see something's off.
It's worth saying that I don't own a PS4 or played Shadow Fall, so I'm just going by what I've read around here. So hyperboles from any side may have affected my view.
EDIT: I don't even think they did wrong with the lower resolution. It's multiplayer, the framerate should be their priority and I'm glad it was, but this "no one noticed, stop crying" nonsense needs to stop.
I'm glad they did. Halo 3 had the best natural lighting of any Halo. I loved it.
Well... TV programs aren't really 1080p either. For the most part, Blu-ray and video games are the only time your TV gets fed a true 1080p signal, and even then, only plasma can do 1080 lines in motion, unless the new OLEDs can as well.
That's what happens when your 30fps game needs to run at 60 because of a last minute decision.
The SP doesn't drop below 30fps, so the MP should have been able to hit that target as well.
That depends, the MP could be more demanding. It's almost always more demanding CPU-wise (which is of course irrelevant w.r.t. rendering resolution), but it could also have more unpredictable/demanding GPU load.
Well, I've seen more threads about resolution in the past couple of months on NeoGAF than I have in the 8+ years combined that I've visited the site.
So my question is: doesn't it?
So in other words: once they were told by a trusted source, the resolution didn't matter. The game's faults were thought to be something else.
The single player is still running at 1080p though, so it depends on what people have been referring to. It's likely that it was believed that both single player and multiplayer were 1080p; I think that's the case indeed, I agree, but at least a solid part of the game is still running at 1080p.
It's just crazy. KZ's definitive 1080p-ness... I know, I know... had been trumpeted so hard for months, it's just a shame.
In an article debating FPS versus resolution...