Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

I wonder then why this approach hasn't been used more frequently in the past.

Takes processing power and a good data pipe. I'm also guessing it looks like turds at non-1:1 pixel resolutions, as the blurring and artifacts would be compounded by up-scaling.

The first post example doesn't seem to have it right either. From my reading every other frame is doing the 1/2 render and interpolation with the next full frame.

1080P frame 1 > 960 + 960 Calc between frame 1-3 > 1080P frame 3

So you're getting 2/3 frames at 1080P, and the one in between is a best guess using data from 1/2 a frame and the two good frames. It also might kick in dynamically based on performance, and in the future it could become much more complicated (maybe just sections/tiles of a frame are done this way, or certain shaders)

Shame on GG for not being upfront, but can you blame them with the DERP going on in this thread? Even the title is wrong.
 
Well, it's good to finally know why the MP was so janky-looking compared to the campaign. I think, in retrospect, they should have just aimed for 30fps across the board. MP wound up being this weird batch of compromises. They couldn't commit in one direction or the other, so we wound up getting a soft-looking, weird-feeling MP that usually runs about 40 or 45 fps. It's too bad, because the lighting and levels themselves are often gorgeous.
 
But people did notice it. The multiplayer was generally thought to be a blurry mess by lots of players.
I'm talking about last gen. People would probably be shocked at how many games weren't actually 1280x720.

I'm not saying it doesn't affect image quality or that it's not noticeable in Killzone, just that I think there's a precedent showing this isn't a 'scummy' thing to do. It seems like a ridiculous thing to be upset about.

Would this really have put anybody off buying?
 
I wonder then why this approach hasn't been used more frequently in the past.

Fear of Gaf backlash obviously.

With some fine tuning, I wonder how this technique could hold up in VR. Maybe this is how we'll get our higher perceived framerate? SW:TFU2 explored similar framerate interpolation techniques.
 
Fear of Gaf backlash obviously.

With some fine tuning, I wonder how this technique could hold up in VR. Maybe this is how we'll get our higher perceived framerate? SW:TFU2 explored similar framerate interpolation techniques.

I was wondering this too, but with the screen being stretched so much, wouldn't you notice the artifacts more? I was thinking maybe they could update the centre of the screen every frame, and the outsides using a method like this, as those areas would be more in the periphery of your vision. The problem there is that those areas would also be moving the most, so would be more susceptible to artifacts.
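The centre/periphery split being described could be sketched like this (a pure toy in Python; the function and the 50% centre split are made up for illustration, not from any actual VR runtime):

```python
# Toy sketch of a foveated update schedule: refresh the centre region
# every frame, and refresh the periphery only on alternating frames.
def needs_update(x, width, frame_index, centre_fraction=0.5):
    lo = int(width * (1 - centre_fraction) / 2)
    hi = int(width * (1 + centre_fraction) / 2)
    if lo <= x < hi:
        return True                       # centre: always fresh
    return (x + frame_index) % 2 == 0     # periphery: half the columns per frame

# For an 8-column strip, frame 0 refreshes the 4 centre columns plus
# half of the 4 peripheral ones.
updated = [x for x in range(8) if needs_update(x, 8, frame_index=0)]
print(updated)  # -> [0, 2, 3, 4, 5, 6]
```

The artifact concern above still applies: the periphery is exactly where fast head motion produces the largest pixel displacements, so the stale half would be the hardest to hide.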
 
DF is proving this generation that they are literally no better than forum posters on GAF at picking things up. Their evaluation of Thief was so faulty as to be essentially worthless to anyone trying to decide on a version of the game based on technical merits. DF pretends to be experts, but in reality they seem to know much less than some posters here on GAF, and they are more biased too.

 
Seems like there is a fuss about every games resolution and framerate on next gen systems unless it is an exclusive.
Fixed. The whole point is to know which version is better so that we can make an informed choice on which version of a game to buy or which console to buy if we haven't bought one yet.

If it's an exclusive it's not like there's a better version to get.
 
Of course it does.

NTSC 3:2 pulldown


The only big differences are that instead of the images being baked in, in KZ they are calculated, and that it's along the vertical axis.
If you really want to draw the similarity between them:

3:2 pulldown alternates stretching a single time-sample over 3 fields of video (1.5 frames)
KZSF MP technique generates a new time-sample during the render of every frame of video

I guess they're vaguely similar enough to be compared, but they're two techniques that were created for two wildly different reasons and have wildly different results (3:2 pulldown basically just resulting in a stuttery representation of the original 24fps footage).
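The 3:2 cadence mentioned above can be sketched in a few lines of Python (a toy model of the pulldown pattern only, nothing to do with the game's renderer):

```python
# Toy model of NTSC 3:2 pulldown: 24 fps film is spread over ~60 fields/s
# of interlaced video by holding alternate film frames for 3 fields and
# 2 fields, so 4 film frames become 10 video fields (5 frames).
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # alternate 3 fields, then 2
        fields.extend([frame] * repeats)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven hold times (3 fields vs 2) are exactly what produces the stutter ("judder") described above, whereas the KZ technique generates a genuinely new time-sample every frame.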
 
Fixed. The whole point is to know which version is better so that we can make an informed choice on which version of a game to buy or which console to buy if we haven't bought one yet.

If it's an exclusive it's not like there's a better version to get.

That's why no one made/is making a fuss about Titanfall's resolution?
 
720P 900P 1080P 13P who cares

The game is free for a week starting tomorrow even if you don't have PSN+

Even at 13P that's pretty kool of Those devs & sony.

Play games quit arguing about them so much.
 
Must admit this news is mildly hilarious - still haven't completed the single player yet, which I got on launch!

AC4 and Fifa are the only games I have really spent some time on so far.
 
"Resolution doesn't matter, KZ:SF MP proves that"

I'm afraid comments like the line above will be used a lot in future DF threads.
 
Takes processing power and a good data pipe. I'm also guessing it looks like turds at non 1:1 pixel resolutions, as the blurring and artifacts would be compounded by up-scaling.

The first post example doesn't seem to have it right either. From my reading every other frame is doing the 1/2 render and interpolation with the next full frame.

1080P frame 1 > 960 + 960 Calc between frame 1-3 > 1080P frame 3

So you're getting 2/3 frames at 1080P, and the one in between is a best guess using data from 1/2 a frame and the two good frames. It also might kick in dynamically based on performance, and in the future it could become much more complicated (maybe just sections/tiles of a frame are done this way, or certain shaders)

Shame on GG for not being upfront, but can you blame them with the DERP going on in this thread? Even the title is wrong.

That's more like what frame interpolation does on a TV. I think each frame is a 960x1080 + 960x1080 blend here. There's more consistency, and more blur.
 
This explains so very much.

Many of us complained about the motion blur. I even got massive headaches from it for the first couple of weeks. It was the kind of thing you'd adjust to by training yourself not to look at the edges of the screen when running... but it was never something you could fully get used to. Looks great when not moving much.
 
The first post example doesn't seem to have it right either. From my reading every other frame is doing the 1/2 render and interpolation with the next full frame.

1080P frame 1 > 960 + 960 Calc between frame 1-3 > 1080P frame 3

So you're getting 2/3 frames at 1080P, and the one in between is a best guess using data from 1/2 a frame and the two good frames.
You're understanding it wrong: every frame does the half render, it just alternates which half it's rendering.

Even lines > last frame's even lines + new odd lines > new even lines + last frame's odd lines > repeat steps 2/3 forever.
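The alternation described above can be sketched as a toy loop (pure illustration with a 1-D "frame"; the real engine reprojects the stale half with motion data rather than copying it verbatim):

```python
# Toy model of the alternating half-resolution update: each output frame
# renders only every other column and reuses the other half from the
# previous output, swapping which half is fresh each frame.
def composite(prev_output, half_render, frame_index, width):
    start = frame_index % 2  # even columns one frame, odd columns the next
    out = list(prev_output)
    for x in range(start, width, 2):
        out[x] = half_render[x // 2]
    return out

width = 8
output = ["stale"] * width
# Frame 0 renders even columns e0..e3; frame 1 renders odd columns o0..o3.
output = composite(output, ["e0", "e1", "e2", "e3"], 0, width)
output = composite(output, ["o0", "o1", "o2", "o3"], 1, width)
print(output)  # after two frames, every column holds freshly rendered data
```

Note that at any instant half the pixels are one frame old, which is where the ghosting in motion comes from.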
 
"Resolution doesn't matter, KZ:SF MP proves that"

I'm afraid comments like the line above will be used a lot in future DF threads.

it's almost like DF set the perfect trap which would forever undermine inter-platform resolution shit fights.
 
So, it's neither 1920x1080 nor 60fps. Yet people act like it's somehow both. By the same logic I could run Crysis 3 ray-traced on a laptop at 1080/60.

Ghosting is great! It's a clever, fascinating technique, unlike upscaling! Can't wait to see what the wizards at ND will do with this.
 
720P 900P 1080P 13P who cares

The game is free for a week starting tomorrow even if you don't have PSN+

Even at 13P that's pretty kool of Those devs & sony.

Play games quit arguing about them so much.

You came to the wrong neighborhood.
 
All that matters is that the viewable screen area maintains the 1080 fill rate.

I don't care if it's "cropped".

As long as the image is crystal clear and the framerate is high, I'm happy.

Same deal with The Order 1886 - 1080 fill with a high framerate is all I expect.
 
I guess they're vaguely similar enough to be compared, but they're two techniques that were created for two wildly different reasons and have wildly different results (3:2 pulldown basically just resulting in a stuttery representation of the original 24fps footage).

Agree there. It just seems the most similar to anything else we have to compare it to, since the composite frames are not "real frames" but approximations of what's expected in both implementations.

DF is claiming a 960x1080 frame-buffer, but I don't see how that's true, because it should still show up in pixel counting and/or would simply be an interlaced image. More seems to be going on than that.
 
My buddies Onslaught and Indy must be scratching their heads on this one. Indy for sure. He saw 1080p on AC when it was still 900p and there was no patch yet, and he's been seeing 1080p on his KZ MP this whole time. And he's been having fun. Imagine that. He had no idea, yet was having fun this whole time playing it.

Having fun playing video games. Wild! ha ha!
 
720P 900P 1080P 13P who cares

The game is free for a week starting tomorrow even if you don't have PSN+

Even at 13P that's pretty kool of Those devs & sony.

Play games quit arguing about them so much.

Right, I guess we should stop discussing how these games are made and the methods used.

"It's just games dude, just play the games"

If you want to just play games and not add to the discussion you're free to do that, but don't tell other people to "Just play the games lol".
 
The fact that no one knew or cared about this shows just how little all this shit matters outside of petty console wars on either side. Always built up into the worst news ever when publicized, but if no one bothers to check, no one knows or cares at all.

That's completely false.
 
I might be playing devil's advocate a bit here, but the proof is in the pudding, I think. This is a 1st party game, in a lot of people's hands. No one noticed until now.

But the thing is people did notice, as many have said in this thread. They noticed something was up but couldn't pinpoint that it was a lower resolution. Why would they expect that when Guerrilla themselves said it was 1080p??!
 
I love the plethora of posters who immediately stated "See? No one even noticed! Resolution doesn't matter".

How many posters on this board are they speaking for? Tens of thousands?
 
DF is claiming a 960x1080 frame-buffer, but I don't see how that's true, because it should still show up in pixel counting and/or would simply be an interlaced image. More seems to be going on than that.
They're referring to the 960x1080 that constitutes the 100% new component generated with each frame. The other pixels are reprojected from the previous frame, presumably being shuffled around based on motion buffers and such to minimize combing artifacts.
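A minimal 1-D sketch of that reprojection step (hypothetical, with made-up motion vectors; the real pipeline works on full 2-D colour and velocity buffers):

```python
# Toy reprojection: pixels not rendered this frame are fetched from the
# previous frame at a position offset by a per-pixel motion vector,
# which is what reduces combing when the camera or objects move.
def reproject(prev_frame, motion, width):
    out = []
    for x in range(width):
        src = x - motion[x]                 # where this pixel was last frame
        src = max(0, min(width - 1, src))   # clamp at the frame borders
        out.append(prev_frame[src])
    return out

prev = [10, 20, 30, 40]
motion = [0, 1, 1, 0]  # pixels 1 and 2 moved right by one since last frame
print(reproject(prev, motion, 4))  # -> [10, 10, 20, 40]
```

When the motion vectors are accurate, the reused half lands where it should and the image reads as full 1080p; when they aren't (disocclusions, transparency), you get the ghosting people describe.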
 
Agree there. It just seems the most similar to anything else we have to compare it to, since the composite frames are not "real frames" but approximations of what's expected in both implementations.

DF is claiming a 960x1080 frame-buffer, but I don't see how that's true, because it should still show up in pixel counting and/or would simply be an interlaced image. More seems to be going on than that.

The blended frames produce a clean 1080p image if you keep still. You can see the interlace though if you look for it. See the fence and rain in this pic:

Multiplayer:
 
But the thing is people did notice, as many have said in this thread. They noticed something was up but couldn't pinpoint that it was a lower resolution. Why would they expect that when Guerrilla themselves said it was 1080p??!

But it isn't a lower resolution... the resolution is 1920x1080 100% of the time. Their code simply only updates 50% of it per frame and works some magic (with blurring etc) so you don't notice that 1/2 the pixels aren't changing.

Also, Guerrilla is right, it is 1080p 100% of the time.
 
We sure Shadow Fall doesnt have dynamic resolution? Even the Vita Killzone game does that.

It looks pretty solid that it's the 960x1080 frame-buffer, with alternating lines on one axis for the active frame and an approximation filling in the rest. That explains the interlaced effect in motion, the ghosting, and the overall blurry appearance compared to single-player.
 
But it isn't a lower resolution... the resolution is 1920x1080 100% of the time. Their code simply only updates 50% of it per frame and works some magic (with blurring etc) so you don't notice that 1/2 the pixels aren't changing.

Also, Guerrilla is right, it is 1080p 100% of the time.
1080i, but with vertical interlace instead of horizontal.
 
So, it's neither 1920x1080 nor 60fps. Yet people act like it's somehow both. By the same logic I could run Crysis 3 ray-traced on a laptop at 1080/60.

Ghosting is great! It's a clever, fascinating technique, unlike upscaling! Can't wait to see what the wizards at ND will do with this.
It's 1080p30 for static parts of the scene. Half res@60fps for moving portions.
It's the same tradeoff as interlaced video, basically.
 
Since you need to combine your image with a weighted version of the previous one, making it random is not a good idea: you'd want to fill only the pixels you haven't touched in the last frame.
Good point, but you can shoot for a 75/25 average (75% real, 25% interpolated pixels) and make sure the interpolated pixels from the last frame are generated for real, so you don't produce any MPEG-style interpolated->interpolated pixels; the remaining budget for generating real pixels is distributed randomly.
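The 75/25 idea above can be sketched as a sampling mask (a toy in Python; the function name, counts, and the flat pixel indexing are made up for illustration):

```python
import random

# Toy sketch: pixels that were interpolated last frame MUST be rendered
# for real this frame (so nothing is interpolated twice in a row), and
# whatever render budget remains is spent on randomly chosen pixels.
def pick_rendered(prev_interpolated, total, budget, rng):
    rendered = set(prev_interpolated)        # forced: fix last frame's guesses
    remaining = [p for p in range(total) if p not in rendered]
    extra = budget - len(rendered)           # leftover budget after the forced set
    rendered.update(rng.sample(remaining, extra))
    return rendered

rng = random.Random(0)
total, budget = 100, 75          # 75% real, 25% interpolated per frame
interpolated = set()             # pretend frame 0 was fully rendered
for _ in range(3):
    rendered = pick_rendered(interpolated, total, budget, rng)
    interpolated = set(range(total)) - rendered
print(len(rendered), len(interpolated))  # -> 75 25
```

The invariant is the point: every pixel that was a guess this frame is guaranteed a real sample next frame, so errors never compound.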
 
I can't believe some posters are defending/spinning Sony's EXPLICITLY stating that the multiplayer was native 1080p. I do not blame outsiders who call this forum "SonyGAF."

Deception is deception. I do not care if they produced the best console. They lied, and they at least should apologize.

If you defend this shit, then you probably have no problem with EA deceptive practices too.
 
Man you guys, if this was an XB1 game and not PS4 we'd have 28 pages by now.

I can't believe some posters are defending/spinning Sony's EXPLICITLY stating that the multiplayer was native 1080p. I do not blame outsiders who call this forum "SonyGAF."

Deception is deception. I do not care if they produced the best console. They lied, and they at least should apologize.

If you defend this shit, then you probably have no problem with EA deceptive practices too.

This is a dumb thing to say, because there have been PLENTY of people in here, like me, saying that it was deceiving/misleading for them to say that. There are people that defend and have opinions on EVERYTHING, and I think its a GREAT thing that opinions are so diverse on here.
 
I can't believe some posters are defending/spinning Sony's EXPLICITLY stating that the multiplayer was native 1080p. I do not blame outsiders who call this forum "SonyGAF."

Deception is deception. I do not care if they produced the best console. They lied, and they at least should apologize.

If you defend this shit, then you probably have no problem with EA deceptive practices too.
Can you quote those posters that are defending Sony about explicitly stating it was native 1080p when it wasn't?
 
I can't believe some posters are defending/spinning Sony's EXPLICITLY stating that the multiplayer was native 1080p. I do not blame outsiders who call this forum "SonyGAF."

Deception is deception. I do not care if they produced the best console. They lied, and they at least should apologize.

If you defend this shit, then you probably have no problem with EA deceptive practices too.

Some posters out of thousands defend Sony's deception makes the "SonyGAF" claims justified?
 
This explains so very much.

Many of us complained about the motion blur. I even got massive headaches from it for the first couple of weeks. It was the kind of thing you'd adjust to by training yourself not to look at the edges of the screen when running... but it was never something you could fully get used to. Looks great when not moving much.

Yea I got headaches as well. Something just looked off, so now I know why.
 
I can't believe some posters are defending/spinning Sony's EXPLICITLY stating that the multiplayer was native 1080p. I do not blame outsiders who call this forum "SonyGAF."

Deception is deception. I do not care if they produced the best console. They lied, and they at least should apologize.

If you defend this shit, then you probably have no problem with EA deceptive practices too.

I'm not trying to defend Sony, but they haven't lied at all.
 