Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

Not really, because it doesn't just add the lines; it approximates the non-rendered lines from information gathered during the rendering of the last frame. That's an additional step of computation, and it's responsible for that method not being easily detectable by pixel counting.

If I have this right... There's another buffer that records the velocity of each rendered object, so that the missing information in the current frame can be filled by looking at the previous frame and gathering pixel information from there (with the offset provided by the object's movement) to estimate how the missing pixels should be coloured. Is this a basic understanding of what's happening?

If that's the case, would we expect the interlacing effect to be worse near the edges of the screen, where there's a greater chance that information needed to fill the pixels possibly wasn't rendered in the previous frame?
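For what it's worth, here's a rough Python sketch of how that kind of velocity-based reprojection could work under the scheme described above. Every name and detail here is just my own illustration, not Guerrilla's actual implementation:

```python
import numpy as np

def reproject_missing_columns(prev_frame, cur_half, velocity, frame_index):
    # Sketch of the idea described above (assumptions are mine): each frame only
    # every other column is rendered; the missing columns are fetched from the
    # previous full frame, offset by per-pixel velocity recorded while rendering.
    h, full_w, _ = prev_frame.shape
    out = prev_frame.copy()
    rendered = frame_index % 2                 # column parity rendered this frame
    out[:, rendered::2] = cur_half             # drop in the freshly rendered half
    for y in range(h):
        for x in range(1 - rendered, full_w, 2):   # the columns we did NOT render
            dx, dy = velocity[y, x]            # screen-space motion since last frame
            sx, sy = int(round(x - dx)), int(round(y - dy))
            if 0 <= sx < full_w and 0 <= sy < h:
                out[y, x] = prev_frame[sy, sx]
            else:
                # History sample falls off-screen: fall back to a neighbouring
                # freshly rendered column, which is why you'd expect worse
                # artifacts near the screen edges.
                out[y, x] = out[y, min(x + 1, full_w - 1)]
    return out

# Toy usage with tiny arrays just to show the shapes involved:
prev = np.zeros((4, 8, 3), dtype=np.uint8)          # last full frame
half = np.full((4, 4, 3), 255, dtype=np.uint8)      # newly rendered half-width frame
vel  = np.zeros((4, 8, 2), dtype=np.float32)        # per-pixel screen-space motion
print(reproject_missing_columns(prev, half, vel, frame_index=1).shape)  # (4, 8, 3)
```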
 
I'm assuming that this temporal interpolation method uses a nice, big chunk of video memory. 32MB of ESRAM would prohibit this method from ever being used in an Xbox One game.

Which is a shame (and ironic), because if there were ever a console that desperately needed a higher-quality scaling technique, it would be the Xbone.

You'd be assuming wrongly. It uses much less bandwidth and space rendering half-res frames. The blended framebuffer can even live in DDR3.

It would be a good move for the X1 too, if people can tolerate the interlacing.
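As a rough illustration of why memory isn't the blocker, here are my own back-of-the-envelope numbers, assuming plain RGBA8 buffers and a half-width velocity buffer; these are not figures from the game:

```python
# Back-of-the-envelope buffer sizes, assuming 4 bytes per pixel (RGBA8).
MB = 1024 ** 2
half_colour  = 960 * 1080 * 4        # one half-width colour buffer (~4 MB)
velocity     = 960 * 1080 * 4        # per-pixel motion vectors (~4 MB)
blended_1080 = 1920 * 1080 * 4       # the full blended output frame (~8 MB)

total = 2 * half_colour + velocity + blended_1080   # current + previous half frames
print(round(total / MB, 1))          # ~19.8 MB, and the blended frame could sit in DDR3
```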
 
It's essentially how MPEG encoding works: they can reduce overhead on the GPU by re-using buffered frames. They couldn't do this before because they simply didn't have the required memory available to the GPU; the PS4 has ample amounts!
 
Scummy move, but it's telling that nobody played Killzone Shadow Fall multiplayer enough to realize they were pulling the wool over our eyes.
 
The discussion was starting to take over the other thread about this article and since it's only one small point and we didn't know this before, it would be best to make another thread.



http://www.eurogamer.net/articles/digitalfoundry-2014-in-theory-1080p30-or-720p60

I was very disappointed to learn this. I always thought that the added blur in MP was due to a poor implementation of FXAA, but in reality it's not rendering many more pixels than 720p. I have no idea why this wasn't mentioned in their previous article about the game's tech, since he clearly talked to GG about it back then. For me this really kills my opinion of the tech on the multiplayer side of the game. Low res and can't hit 60fps regularly. They even claimed full res, too.

EDIT
Explanation of the upscaling process

So we have a game running at 960i x 1080p at 50 fps (avg.).

I can only guess that interlacing horizontally has less of a noticeable effect than doing so vertically.

I own the game but never bothered with multiplayer. Now I am just curious enough to check it out, and see what this looks like.
 
I guess from now on we should be skeptical when Sony says it's a 1080p or 60fps game, but what about games like Ground Zeroes?

To be honest, I feel like if Microsoft had pulled this, especially on Forza MP, the thread would not have been pretty.

Nonetheless, I kinda want to get this game now to see how the MP looks myself. Too bad I can't go back in time and get it (because I want to know if I would have gotten fooled).
 
You know, if Killzone was multiplatform or an X1 exclusive, someone would have found this out on the first day multiplayer screens were released.

Resolution only matters when comparing multiplats. With exclusives it doesn't matter much.
lol go look at X1 threads for exclusive games before release. The preview threads are dominated by resolution talk.
 
You wouldn't have been fooled; Leadbetter is the fool. 1080p is a video standard, and this method of using the preceding frame is like an MPEG GOP. Are Blu-rays not 1080p? You don't have constant I-frames in any digital video, but it's still 1080p.
 
Wouldn't it have been easier to simply turn down some effects, AA, texture res, etc. to keep MP at 60? This seems like a rushed or last-minute solution. As if they developed MP at 30 and changed their minds at the last minute.
 
Scummy move, but it's telling that nobody played Killzone Shadow Fall multiplayer enough to realize they were pulling the wool over our eyes.

I figured the blurry graphics were due to their high-performance anti-aliasing in MP. To be honest, I found the shitty, narrow field of view more annoying than anything else.
 
So we can rename the PS4 the PS-960i? OK, no.

This is good for showing that the PS4 is not a TITAN compared to the One.

So we have a game running at 960i x 1080p at 50 fps (avg.).

I can only guess that interlacing horizontally has less of a noticeable effect than doing so vertically.

I own the game but never bothered with multiplayer. Now I am just curious enough to check it out, and see what this looks like.

I believe 1920i would be the more accurate term; otherwise, 960i would mean rendering 480 pixels along that dimension, in the same way that 1080i refers to alternating fields of 540 lines.
 
You wouldn't have been fooled; Leadbetter is the fool. 1080p is a video standard, and this method of using the preceding frame is like an MPEG GOP. Are Blu-rays not 1080p? You don't have constant I-frames in any digital video, but it's still 1080p.
I'm interested in what you're saying, but I would need someone else to back you up, or some independent source. Then we could have a real discussion on 1080p. If that's true it could be worth a new topic; I'll create one if this is commonplace and has a significant difference from scaling.
 
Wouldn't it have been easier to simply turn down some effects, AA, texture res, etc. to keep MP at 60? This seems like a rushed or last-minute solution.

This way they are pretty much cutting rendering time in half, while keeping all effects. And it will still look just as sharp if nothing moves. The downside is interlace artefacts.
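The rough per-frame arithmetic behind "half the rendering time" (simple pixel counts of my own, not figures from GG):

```python
full_1080p  = 1920 * 1080      # 2,073,600 pixels per frame at native 1080p
rendered    =  960 * 1080      # 1,036,800 pixels actually shaded per frame here
native_720p = 1280 *  720      #   921,600 pixels, for comparison
print(rendered / full_1080p)   # 0.5   - half the per-frame pixel work, effects intact
print(rendered / native_720p)  # 1.125 - only ~12% more pixels than a 720p frame
```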
 
I believe 1920i would be the more accurate term; otherwise, 960i would mean rendering 480 pixels along that dimension, in the same way that 1080i refers to alternating fields of 540 lines.

Not accurate; it's not simple interlacing, and at the very least they seem to be doing far more than any TV does when interlacing.
 
I'm interested in what you're saying, but I would need someone else to back you up, or some independent source. Then we could have a real discussion on 1080p. If that's true it could be worth a new topic; I'll create one if this is commonplace and has a significant difference from scaling.

I think it's definitely worthy of discussion. It's a new technique for games, but only because GPUs didn't have the memory available to them before. This is absolutely not an upscale, though, not from a video standpoint: the GPU is presenting a 1920x1080 image, it's just not drawing all 1920 columns; it's drawing 960 and taking 960 from the preceding frame. It's still a 1080p image in the framebuffer, it's just filling that framebuffer using the preceding frame.
 
You wouldn't have been fooled; Leadbetter is the fool. 1080p is a video standard, and this method of using the preceding frame is like an MPEG GOP. Are Blu-rays not 1080p? You don't have constant I-frames in any digital video, but it's still 1080p.

I do a lot of x264 encoding, so this rings a bell with me. Are you saying it's similar to the implementation of I-, P- and B-frames?
 
So they practically lied about native 1080p; that's not nice. I hate it when the consumer gets bullshitted. They just shouldn't have touted 1080p 60fps if it isn't true.

The tech behind it seems interesting though.
 
I am reminded of that episode of Seinfeld where the little person who acts as a stand-in for a child actor is caught using lifts in his shoes because the kid is growing up.
 
I do a lot of x264 encoding, so this rings a bell with me. Are you saying it's similar to the implementation of I-, P- and B-frames?

Exactly, and GPUs have always stored frames for this sort of thing. They will be using triple buffering or maybe even higher, since they have the memory. Leadbetter is just clickbaiting; he's a fool.
 
I think it's definitely worthy of discussion. It's a new technique for games, but only because GPUs didn't have the memory available to them before. This is absolutely not an upscale, though, not from a video standpoint: the GPU is presenting a 1920x1080 image, it's just not drawing all 1920 columns; it's drawing 960 and taking 960 from the preceding frame. It's still a 1080p image in the framebuffer, it's just filling that framebuffer using the preceding frame.

It's still not right to say 1080p 60, when it's only updating half the frame.


It's clear enough it isn't 1920x1080 whenever you move the camera. Leadbetter seems to be spot on with the detail, but you'd rather the reason for worse IQ in MP be a secret?
 
It's still not right to say 1080p 60, when it's only updating half the frame.
Well, technically it's updating both portions alternately, which is different from just rendering half straight up. But as you mentioned, the screens indicate not much else is going on, though others say there is more. I'm no graphics expert, but to some extent this is just an approximation, like scaling, just a much better one, I guess. Worth a discussion if people chip in with relevant details from video encoding, as seems to be the case. I mean, we don't doubt movies are 1080p, and if they do something similar, then what exactly is 1080p?
 
Haven't actually noticed since I haven't even accessed the multiplayer section of the game. Although I'm sure I would've noticed immediately considering how surprised I was when I started playing SP and couldn't believe how good it looked and how clean the IQ was.

That blur is impossible to miss on a 55 inch LCD from 7-8 feet away. 1080p really goes a long way.
 
1080p implies 1920x1080 pixels, not just the vertical resolution.

But the final outcome on screen is still a 1080p image. It's just made up of two frames' worth of data. It isn't like the image is upscaled. For all intents and purposes, the image IS 1080p.
 
I doubt there will be any. The game has already been out for four months now. What damage control needs to be done when Infamous, The Order and other anticipated games with higher resolutions are coming out? The only people who are going to hold on to this are disgruntled Xbox users who can finally find a game in the Sony arsenal that isn't exactly what it was claimed to be. AC4, BF4, NBA2K14 and others remain unbothered. At this point it's a non-issue, but moving forward (Sony exclusives only) we will have to keep a keen eye out.

I think there will be, simply because this makes for juicy clickbait. I expect GG to have a rough PR day today.

I'm pretty sure every single person that bought the game thought this was running at 1080P native, even though there was noticeable blur, especially when running. Some even asked GG about it on the forums. They said they would look into it. :/
 
But the final outcome on screen is still a 1080p image. It's just made up of two frames' worth of data. It isn't like the image is upscaled. For all intents and purposes, the image IS 1080p.

The p means something. The image is interlaced, therefore by definition it is not 1080p. After upscaling, Ryse's final image is also 1920x1080; that doesn't mean it is native 1080p.
 
But the final outcome on screen is still a 1080p image. It's just made up of two frames' worth of data. It isn't like the image is upscaled. For all intents and purposes, the image IS 1080p.

Eeh, that's iffy. Using that logic, an upscaled 720p image is also 1080p; it's also approximating what the image would look like at the higher resolution... The question is when the approximation becomes good enough, and what the significant differences and advantages between them are, such that a solid distinction can be made.
 
I am one of the people who really hates all the resolution talk and fanboyism. Heck, my brain cells are still decreasing hours after reading certain topics here or on N4G. Especially since my preference lies with MS consoles.

But to me this isn't such a big deal. Yes, Sony and Guerrilla Games shouldn't have said it was 1080p. But is it really a thing to be upset about? If you thought it looked good and enjoyed the game, will knowing this really ruin your day or the fun you've had with the game? I seriously hope not; otherwise you should really take a look at your priorities.

Becoming more skeptical about what Sony states is the only thing you can really take out of this, nothing else. The game hasn't changed one bit and you shouldn't enjoy it any less than you did before.
 
Not accurate; it's not simple interlacing, and at the very least they seem to be doing far more than any TV does when interlacing.

The rain streaks suggest they aren't doing much.

I realise it's not entirely accurate to call it 1920i; I was just correcting the use of 960i. It's like a smart de-interlace (if my understanding is correct). Regarding the rain streaks: I'm probably talking a bit above my knowledge here, but it might be possible that their transparent objects aren't writing to the velocity buffer. Somebody with more experience please feel free to jump in and stomp out this idea, but here's something interesting I found after thinking about this and about what else velocity buffers are used for:

http://john-chapman-graphics.blogspot.com.au/2013/01/per-object-motion-blur.html

Transparency presents similar difficulties with this technique as with deferred rendering: since the velocity buffer only contains information for the nearest pixels, we can't correctly apply a post-process blur when pixels at different depths all contribute to the result. In practice this results in 'background' pixels (whatever is visible through the transparent surface) being blurred (or not blurred) incorrectly.

The simplest solution to this is to prevent transparent objects from writing to the velocity buffer. Whether this improves the result depends largely on the number of transparent objects in the scene.

Another idea might be to use blending when writing to the velocity buffer for transparent objects, using the transparent material's opacity to control the contribution to the velocity buffer. Theoretically this could produce an acceptable compromise although in practice it may not be possible depending on how the velocity buffer is set up.

A correct, but much more expensive approach would be to render and blur each transparent object separately and then recombine with the original image.

This hinges on my understanding of what they're doing being correct, though.
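To make that idea concrete, here's a tiny, purely hypothetical sketch of what "transparent objects skip the velocity buffer" would mean; none of these names come from GG's engine or the blog above:

```python
from dataclasses import dataclass

@dataclass
class Draw:
    name: str
    transparent: bool
    motion: tuple          # screen-space motion (dx, dy) since the last frame

def build_velocity_buffer(draws):
    # Only opaque draws record their motion; transparent ones (rain streaks,
    # glass) are skipped, so the later reprojection step has no vector to
    # offset them by and they pick up interlace-style artifacts.
    return {d.name: d.motion for d in draws if not d.transparent}

scene = [Draw("soldier", False, (6, 0)), Draw("rain_streak", True, (0, 14))]
print(build_velocity_buffer(scene))   # {'soldier': (6, 0)} - no entry for the rain
```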

Exactly, and GPUs have always stored frames for this sort of thing. They will be using triple buffering or maybe even higher, since they have the memory. Leadbetter is just clickbaiting; he's a fool.

I don't think the current conversation has much to do with triple buffering?
 
I think it's definitely worthy of discussion. It's a new technique for games, but only because GPUs didn't have the memory available to them before. This is absolutely not an upscale, though, not from a video standpoint: the GPU is presenting a 1920x1080 image, it's just not drawing all 1920 columns; it's drawing 960 and taking 960 from the preceding frame. It's still a 1080p image in the framebuffer, it's just filling that framebuffer using the preceding frame.

This is, as always, just semantics. They could not get the targeted framerate at 1920x1080, so they reduced the resolution they had to render at any given time. It is not rendering a 1920x1080 image, so it isn't 1080p. At least not in a sense that anyone cares about.

If you upscale you also get a 1920x1080 image, but you don't get any additional information, which causes artifacts and blurring.

Is 10x1080 1080p if you interlace it? You could make that argument, but it would look like crap. The same is true regarding upscaling.

I don't understand people's insistence on decoupling 1080p as a term from resolution.
It makes the term worthless, just so they can claim that their favorite game is "1080p" regardless of what its actual rendering resolution is.
 
It's still not clear to me what's going on. Is it that half the pixels are preserved from the previous frame?

Preserved, and then some sort of extra calculation is done to predict or mitigate the differences (I'm seeing differing views). A prediction would be almost like simulating something closer to real 1080p; mitigating would be more of an AA/scaling solution, I guess.
 
I've only played KZ:SF's single player, but nonetheless, this is pretty damn slimy. The person I put the blame on first is Guerrilla, for doubling down and spreading this lie of 1080p MP. I don't know if the Sony higher-ups knew exactly what resolution the MP was (if they did, fuck them too), but I know 100% the guys at Guerrilla knew. The game running at the lower resolution in MP isn't the problem. The fact that they constantly lied rubs me the wrong way.
 
The p means something. The image is interlaced, therefore by definition it is not 1080p. After upscaling, Ryse's final image is also 1920x1080; that doesn't mean it is native 1080p.

No, the p means progressive scan (as opposed to interlaced).

And gooner4life_uk seems to be one of the few that actually understand what's happening here.

If you don't think this is "native" then fine, but in that case you have also never seen a native 1080p Blu-ray in your life.
 