Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

I hope more developers implement this tech. Aim for at least 30-40fps, use varying amounts of interlacing for 60fps. Surely the blurriness can be reduced with time.

I don't. While not a total mess, it's still a complicated solution to a problem that's probably better addressed in other ways, and it hurts one of the more fundamental aspects of image quality. Like the article says, it's computationally expensive and wasn't enough to lock 60 despite the fairly large drop in fill-rate requirements.
 
It's not the end of the world, and everyone tends to turn every little thing that pops up into "console war" fodder, but I am disappointed in Sony. We were misled. I want transparency.



I don't hate them. I'm not selling my ps4. No PRE-ORDER CANCELLED LOL>OLOL. I merely think that it's disappointing and I expect better out of them and every other developer that I like.

Yeah, I expect more transparency and hopefully they will learn from this in the future.
 
There sure is a lot of sky falling in this thread. Why does this really matter? Either the game looks good and plays well, or it doesn't. Do the minute technical details really change your enjoyment of the game?

Very reasonable post.... unless you use it to defend an Xbox game, then prepare to be stoned to death.
 
That still makes it interlaced instead of p.
It's like deinterlacing (blending) a 1080i video.
It is 1080i horizontal.
All that is incorrect. The thread title is wrong. Shadow Fall is not using any kind of interlacing, vertical or horizontal. Nor is it using horizontal upscaling. This is a new technique, and can't be described using terms applied to previous methods.

Interlacing and upscaling have particular meanings, and refer to methods of using pixels of an undersized video stream to fill in display pixels on the same or subsequent frames. In distinction, the Shadow Fall method cannot be used on a video stream. It requires data only used within 3D rendering engines, and therefore couldn't (for example) make existing Youtube videos look better at higher resolutions.

Because it uses this extra data, it gives much better results than interlacing or upscaling--so much better, that in certain cases pixel counting can't detect the native rendering resolution. (In other cases it can.) The drawback is artifacts--manifested as blurred textures or motion ghosts--when the approximation fails. Despite this, the final image still has much higher IQ than an equivalent upscale or interlace.
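To make the idea above concrete, here is a toy numpy sketch of alternating-column temporal reprojection. This is NOT Guerrilla's actual pipeline; the function names, the tiny 8x4 "screen", and the one-pixel-per-frame motion model are all invented purely for illustration of the principle: shade half the columns each frame, and fill the rest by shifting the previous frame along known motion.

```python
import numpy as np

W, H = 8, 4  # tiny "screen" for demonstration

def render_full(t):
    """Stand-in for fully shading a frame: content scrolls right 1 px/frame."""
    return np.tile(np.arange(W) - t, (H, 1)).astype(float)

def render_half(t, parity):
    """Shade only columns with index % 2 == parity (half the fill cost)."""
    frame = np.full((H, W), np.nan)
    frame[:, parity::2] = render_full(t)[:, parity::2]
    return frame

def reproject(prev_frame, motion_x):
    """Guess the missing pixels by shifting last frame along the motion."""
    return np.roll(prev_frame, motion_x, axis=1)

prev = render_full(0)                  # assume frame 0 was complete
cur = render_half(1, parity=1)         # frame 1: only odd columns shaded
filled = np.where(np.isnan(cur), reproject(prev, motion_x=1), cur)

native = render_full(1)
# Where the motion guess holds, the result is pixel-identical to a native
# full render; at the wrapped screen edge the guess fails.
assert np.allclose(filled[:, 1:], native[:, 1:])
assert not np.allclose(filled[:, 0], native[:, 0])
```

The failing edge column is the toy version of the artifacts described above: wherever the motion estimate is wrong (disocclusion, high parallax), the reprojected guess diverges from a native render, which is the blur/ghosting posters are seeing.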
 
^ It's quite blurry in motion, only if you keep the camera still do you get the illusion of 1920x1080p.

I don't. While not a total mess, it's still a complicated solution to a problem that's probably better addressed in other ways, and it hurts one of the more fundamental aspects of image quality. Like the article says, it's computationally expensive and wasn't enough to lock 60 despite the fairly large drop in fill-rate requirements.

I doubt it's very computationally expensive. It looks like normal video processing with a few more artefacts. There could be other bottlenecks like CPU for multiplayer?
 
I doubt it's very computationally expensive. It looks like normal video processing with a few more artefacts. There could be other bottlenecks like CPU for multiplayer?

Yeah, the real bottleneck(s) is/are elsewhere. It could probably be attributed to the decision to target 60 for MP being made too late, or to some weird kink in GG's notoriously complex rendering pipelines.
 
It's not the end of the world that everyone tends to make every little thing that pops up in this "console war" but I am disappointed in Sony. We were mislead. I want transparency.



I don't hate them. I'm not selling my ps4. No PRE-ORDER CANCELLED LOL>OLOL. I merely think that it's disappointing and I expect better out of them and every other developer that I like.

Same here. I'm a big KZ fan and avid supporter of GG, but this is a lie by omission.

Tighten up Hulst!
 
All that is incorrect. The thread title is wrong. Shadow Fall is not using any kind of interlacing, vertical or horizontal. Nor is it using horizontal upscaling. This is a new technique, and can't be described using terms applied to previous methods.

But like interlaced video there's an odd and an even frame. In KZ:SF it's using an evolution of that, plus any extra information (such as motion) it has, to help blend the next frame from the previous frame for 1080p 60fps?
 
The really odd thing about this story is that Digital Foundry apparently learned of this technique back when they visited Guerrilla for their Killzone tech article, yet they did not think it was worth mentioning in that article. At least we can be sure that Guerrilla must have explained the technique to DF themselves, since the technique is quite unique and DF's description of it is so specific. It is highly unlikely that DF found this out on their own.

If Guerrilla wanted to "lie", as many here say, then why did they tell DF, and why did DF not report this info until now? That is really odd.

not trying to defend sony or gg but yeah they posted it like it was breaking news. seems convenient to report it now when it fits the narrative to some kind of message they are trying to push.

::puts on tinfoil beanie:: :D
 
I think this looks better than 720p if that were the alternative to get to ~60 FPS. Yeah, the edges are fuzzy if you're focusing on it but for fast or slow camera pans it's very slight.

Like sebbbi on B3D said, we're going to see more of this as the gen goes on. I expect these kinds of interpolation algos to improve by a lot, and we'll probably see them used in some form for VR too. Multiplat Bone games would be well served to look into techniques like this.
 
Well at least I know now why it had that odd blur in multiplayer.

Every game in every mode gets The Count from now on. No exceptions!

countvoncount.jpg


ah hah hah hah
 
But like interlaced video there's an odd and an even frame. In KZ:SF it's using an evolution of that, plus any extra information (such as motion) it has, to help blend those two frames together to 1080p?
The old terminology of "odd and even fields" (not frames) and "interlacing" was established to describe a very specific method of generating a video stream. Guerrilla's "half-sample temporal reprojection analytic AA" (let's call it HSTRAA for short) does different work on alternating columns of pixels. But this is a very superficial similarity to interlacing, and calling it that is both inaccurate and far more confusing than helpful. Here are some fundamental differences:

- Interlacing can be applied to any video stream, including those already recorded; HSTRAA can only be used by a 3D rendering engine at runtime.
- Interlacing produces comb artifacts on any object in motion; HSTRAA on only some.
- On progressive displays interlacing can only be a render-saving method if it's either upscaled or half framerate; HSTRAA saves render time without either of those effects.

And probably more that don't occur to me. It's a very different method than interlacing, with much better results. It's best that we call it something else.
 
All that is incorrect. The thread title is wrong. Shadow Fall is not using any kind of interlacing, vertical or horizontal. Nor is it using horizontal upscaling. This is a new technique, and can't be described using terms applied to previous methods.

Interlacing and upscaling have particular meanings, and refer to methods of using pixels of an undersized video stream to fill in display pixels on the same or subsequent frames. In distinction, the Shadow Fall method cannot be used on a video stream. It requires data only used within 3D rendering engines, and therefore couldn't (for example) make existing Youtube videos look better at higher resolutions.

Because it uses this extra data, it gives much better results than interlacing or upscaling--so much better, that in certain cases pixel counting can't detect the native rendering resolution. (In other cases it can.) The drawback is artifacts--manifested as blurred textures or motion ghosts--when the approximation fails. Despite this, the final image still has much higher IQ than an equivalent upscale or interlace.

Wow. You are really arguing semantics here.

So because the unrendered lines are using frame-old data, it doesn't count?

Call it interlacing+ then, like SuperAMOLED+. It's still the exact same technique.

Guerrilla's "half-sample temporal reprojection analytic AA" (let's call it HSTRAA for short)

WOW. You really don't like the word "interlaced", do you?
 
I think this looks better than 720p if that were the alternative to get to ~60 FPS. Yeah, the edges are fuzzy if you're focusing on it but for fast or slow camera pans it's very slight.

Like sebbbi on B3D said, we're going to see more of this as the gen goes on. I expect these kinds of interpolation algos to improve by a lot, and we'll probably see them used in some form for VR too. Multiplat Bone games would be well served to look into techniques like this.

I hope he is wrong. I would rather have straight 720p than this trick. It is awful and needs to be run out of town by gamers.
 
The really odd thing about this story is that Digital Foundry apparently learned of this technique back when they visited Guerrilla for their Killzone tech article, yet they did not think it was worth mentioning in that article. At least we can be sure that Guerrilla must have explained the technique to DF themselves, since the technique is quite unique and DF's description of it is so specific. It is highly unlikely that DF found this out on their own.

If Guerrilla wanted to "lie", as many here say, then why did they tell DF, and why did DF not report this info until now? That is really odd.
Is it incompetence? Leadbetter had to know an article on this subject would get tons of clicks. Is it an agenda? This was around when he tore the XB1 a new one, I believe. Was it a purposeful agreement between Sony/Guerrilla and DF to omit such a fact? Seriously, I think this is the bigger question now that we've decidedly figured this resolution thing out.
*clap*
All that is incorrect. The thread title is wrong. Shadow Fall is not using any kind of interlacing, vertical or horizontal. Nor is it using horizontal upscaling. This is a new technique, and can't be described using terms applied to previous methods.

Interlacing and upscaling have particular meanings, and refer to methods of using pixels of an undersized video stream to fill in display pixels on the same or subsequent frames. In distinction, the Shadow Fall method cannot be used on a video stream. It requires data only used within 3D rendering engines, and therefore couldn't (for example) make existing Youtube videos look better at higher resolutions.

Because it uses this extra data, it gives much better results than interlacing or upscaling--so much better, that in certain cases pixel counting can't detect the native rendering resolution. (In other cases it can.) The drawback is artifacts--manifested as blurred textures or motion ghosts--when the approximation fails. Despite this, the final image still has much higher IQ than an equivalent upscale or interlace.
Seriously. Whoever changed the title needs to change it to something that's you know... accurate. This is NeoGAF, there's a standard for accuracy, and that title is not accurate. It's about as disingenuous as calling it 1080p. Once again, unless something has changed in the last few hours, the most accurate description is

Killzone: Shadow Fall's multiplayer runs at 960x1080 temporally reprojected to 1080p

Don't coddle the people who don't understand, or misrepresent the facts we have in front of us, that's what got us here in the first place, make them come into the thread and learn why it says what it says.
Wow. You are really arguing semantics here.
So because the unrendered lines are using frame-old data, it doesn't count?
Call it interlacing+ then, like SuperAMOLED+. It's still the exact same technique.
WOW. You really don't like the word "interlaced", do you?
This entire thread is semantics. And it looks like you're accusing him of the same thing you're doing right now: creating new terms to explain what is happening in the engine.
 
Though the thread it's in does, the post you're referring to has nothing to do with the merits of 1080p over other resolutions.

I think the sentence "A developer like Guerilla Games can produce a game that looks like this at 1080p x 60fps variable" suggests he was talking about the technical achievement of the resolution.

And considering his next post after someone points out that it looks blurry for that resolution is, "I can see Microsoft's pure BS approach is actually swaying some people. How ..... sad. FYI Killzone does not use FXAA. Dat 900p FEEL LMAO," I think that cements what I was noting about the comparison.
 
Seriously. Whoever changed the title needs to change it to something that's you know... accurate. This is NeoGAF, there's a standard for accuracy, and that title is not accurate. It's about as disingenuous as calling it 1080p. Once again, unless something has changed in the last few hours, the most accurate description is

Killzone: Shadow Fall's multiplayer runs at 960x1080 temporally reprojected to 1080p

960x1080 interlaced is accurate, but with a filter that uses the previous frame to interpolate the gaps. You keep the frames and get reduced interlacing effect, but more blur in motion.
 
I hope he is wrong. I would rather have straight 720p than this trick. It is awful and needs to be run out of town by gamers.

I've no doubt it will get better; the KZ:SF technique is just the beginning. Research into these kinds of things by console devs is how things like post-AA become viable and rendering becomes more efficient. Hopefully GG explains it at GDC.
 
Heard people complain that the multiplayer was blurry. FXAA took all the blame I guess, lol. Man, but FXAA on top of a sub native res? Well there's your blurriness right there.
 
I've no doubt it will get better; the KZ:SF technique is just the beginning. Research into these kinds of things by console devs is how things like post-AA become viable and rendering becomes more efficient. Hopefully GG explains it at GDC.

I hate interlacing, period. It should have died with SD TVs. It is horrible for fast motion; I don't care what fancy technique they use. Give me the progressive image at a lower resolution. I can't believe these consoles are so underpowered they would need to resort to interlacing. Cut back the effects or whatever it takes, anything but interlacing.
 
960x1080 interlaced is accurate, but with a filter that uses the previous frame to interpolate the gaps. You keep the frames and get reduced interlacing effect, but more blur in motion.
Having "but" in your explanation means the thread title is still wrong. An analogy I just came up with is having a thread title say

Title :
Uncharted 4 announced
Post :
But it's actually a really big Uncharted 3 expansion.

It's just as technically wrong as calling it 1080p. It needs to be fixed.
 
I'm not particularly fussed as to what it's called, as long as it's not 960 x 1080 interlaced. 1920 x 1080 interlaced minimum, 1920i x 1080p if you wanted to avoid confusion with 1080i.

But really, this thread is getting pretty hilarious in the cycles it's taking.
 
Wow. You are really arguing semantics here.

So because the unrendered lines are using frame-old data, it doesn't count?

Call it interlacing+ then, like SuperAMOLED+. It's still the exact same technique.
Yes, I am arguing semantics. It actually matters to use the correct terms for things. If you'd like to call this "interlacing+" that's better than simply "interlacing". But it's definitely not "the exact same technique". This isn't some personal idiosyncrasy, and the term you find WOW-worthy ("half-sample temporal reprojection") isn't my creation. Read back through the thread and you'll see technical gurus much more knowledgeable than me saying all this stuff.
It is hard to detect because 1080i and 1080p both output at 1920x1080
No, that's not why. Interlaced video running at 60fps would have to be 120 fields per second. That would require the game engine to render half the frame twice as often as the target framerate. In other words, there would be literally zero efficiency gain--indeed, the game would actually be native 1080p! On the other hand, if the game were rendering at 60 fields per second, then when displayed at 60fps on a progressive display, upscaling would be taking place every frame and pixel counting would always find it, on every edge.

[Artifacts]Sounds like the same problem as 1080i
Sort of, but unlike interlacing it doesn't happen everywhere there's motion. It only occurs with high parallax, so is in scattered areas onscreen.
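The field-rate argument above is easy to check with back-of-envelope arithmetic (shaded-pixel counts only; real GPU cost obviously isn't purely fill-rate, as the thread notes):

```python
# Pixels shaded per second under each scheme (illustrative arithmetic only).
full_1080p60 = 1920 * 1080 * 60   # native 1080p at 60fps
kzsf_mp      = 960 * 1080 * 60    # Shadow Fall MP: half the columns per frame

assert kzsf_mp * 2 == full_1080p60   # ~50% fill-rate saving per frame

# True interlacing shown on a 60fps progressive display would need
# 120 fields of 1920x540 per second -- i.e. zero saving over native 1080p60.
interlaced_120fields = 1920 * 540 * 120
assert interlaced_120fields == full_1080p60
```

This is the point being made: 1080i at a full 60fps display rate buys nothing, whereas the 960x1080 reprojection scheme genuinely halves the shaded pixels per frame.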
 
I hope he is wrong. I would rather have strait 720p than this trick. It is aweful and needs to be run out of town by gamers.

I don't really understand this. Yes, it is worse than native 1080p, but preferring upscaled 720p to this is pure insanity. This technique can produce image quality on par with 1080p (at best), and at worst looks like a 1080p image of reduced quality (blurring, ghosting, and artifacts).

Just the very fact that no one thought KZ:SF MP was sub-1080p despite being pixel counted is amazing. If anything, I'd love to see games on the Xbone pick up this trick if the system has the horsepower for it. Upscaled 720p looks pretty bad.
 
I don't really understand this. Yes, it is worse than native 1080p, but preferring upscaled 720p to this is pure insanity. This technique can produce image quality on par with 1080p (at best), and at worst looks like a 1080p image of reduced quality (blurring, ghosting, and artifacts).

Just the very fact that no one thought KZ:SF MP was sub-1080p despite being pixel counted is amazing. If anything, I'd love to see games on the Xbone pick up this trick if the system has the horsepower for it. Upscaled 720p looks pretty bad.

The trick of lying?
 
I think the sentence "A developer like Guerilla Games can produce a game that looks like this at 1080p x 60fps variable" suggests he was talking about the technical achievement of the resolution.
I disagree, since the game he compared it to was also 1080p; in that post I think he was emphasizing that the game "looks like this", i.e. better than other games at the same res (especially since he mentions different developers' varying abilities).

All that said...

And considering his next post after someone points out that it looks blurry for that resolution is, "I can see Microsoft's pure BS approach is actually swaying some people. How ..... sad. FYI Killzone does not use FXAA. Dat 900p FEEL LMAO," I think that cements what I was noting about the comparison.
...that stuff surely is just as you described. Sorry for the confusion, I didn't read that thread myself so I didn't know about that followup.
 
Yes the multiplayer looks significantly worse than the single player. But no one can deny the single player graphics are a revelation. It blows everything out of the water. Truly mind blowing shit.
 
Having "but" in your explanation means the thread title is still wrong. An analogy I just came up with is having a thread title say

Title :
Uncharted 4 announced
Post :
But it's actually a really big Uncharted 3 expansion.

It's just as technically wrong as calling it 1080p. It needs to be fixed.

960x1080 pixels are rendered anew each frame, so it's not technically wrong. "1080i with some fancy GG interpolation" would be close enough. It's interlaced, and "1080" here means the standard 1920x1080 pixel grid.

Filling in the gaps this way causes blurred, lower-res motion, and there are still visible interlacing problems. It's a good solution though, especially if they can clean it up further.
 
Shadow Fall really is gorgeous, single and multi player .. the gameplay is of course a horrible turd, but man is it the prettiest turd I've seen.
 
I just hope the people who have been making a lot of noise about X1's resolution don't all of a sudden change their opinion, and that they are as loud about the PS4's res gate.

Otherwise the bias is exposed.

Can't say I'm really bothered. Resolution alone has never been the be-all and end-all of graphics; I just go by whether the game looks good or improved to me or not.

I've said it once and I'll say it again: to the naked eye, the difference in visuals is about equal to the difference between last gen's HD twins.

If people don't like a console with lower-res/framerate games, don't buy it. Why complain?


So to all those PS4 owners who were like "720p vs 1080p is a massive difference": well, it clearly isn't if you can't even notice it, lol.
 
I just hope the people who have been making a lot of noise about X1's resolution don't all of a sudden change their opinion and are as loud about the PS4's res gate.

Otherwise the bias is exposed.

Can't say I'm really bothered. Resolution alone has never been the be-all and end-all of graphics; I just go by whether the game looks good or improved to me or not.

I've said it once and I'll say it again: to the naked eye, the difference in visuals is about equal to the difference between last gen's HD twins.

If people don't like a console with lower-res/framerate games, don't buy it. Why complain?

I don't think resolution gate is actually about resolutions. It's more about "this console is more powerful than that console because it plays game xy in a higher resolution".

There is only a discussion if you can compare a Ps4 and XboxOne version of a game.
Resolutions of exclusive titles will never spark huge discussions. What we have right now is an issue because they lied to us.
 
Strange that GG hasn't replied to the issue yet; maybe in their hearts they believe it is still 1080p.


Or they are just ashamed of being caught lying
 
I don't think resolution gate is actually about resolutions. It's more about "this console is more powerful than that console because it plays game xy in a higher resolution".

There is only a discussion if you can compare a Ps4 and XboxOne version of a game.
Resolutions of exclusive titles will never spark huge discussions. What we have right now is an issue because they lied to us.

Isn't that a straight-up contradiction?
 
1080i with some fancy GG interpolation would be close enough. It's interlaced, and 1080 here is 1920x1080 pixels standard.
Except even if we accepted the use of the term "interlaced", it still wouldn't be "1080i" because it's actually working on columns, not rows. But calling it "1920i" would also be out of whack, since all other "i" and "p" numbers are vertical.

Seriously, it's best just to avoid the term "interlaced" completely. Guerrilla's method uses a different axis, employs a different fill technique, achieves different efficiencies, and produces better results. It's almost as different as possible. Do you think felines and canines should all just be called "dogs" because, man, they're "close enough"? On a less silly and more germane note, do you think the terms FXAA, TMAA, etc. are meaningless hair-splitting?
 
I just hope the people who have been making a lot of noise about X1's resolution don't all of a sudden change their opinion, and that they are as loud about the PS4's res gate.

Otherwise the bias is exposed.

People are making a lot of noise about X1's resolution compared to the same titles on the PS4, for a direct apples-to-apples comparison. Simply put - better resolution is preferable.

However, I think the whole notion of 1080p or bust is completely overblown; Ryse is a gorgeous looking game at 900p, and I don't care one bit that The Order is letterboxed.

In the end it's a developer's choice. If they want to push the limits of the system and lower the resolution, I'm fine with that.
 