Killzone: Shadow Fall Multiplayer Runs at 960x1080 vertically interlaced

They noticed that the MP was blurry and blamed it on the AA used. If people were as sensitive to res as most claim to be, then this should have come to light a lot sooner.
The problem with lower resolutions really lies with modern display technology. With fixed pixels, any resolution that fails to match the native screen resolution needs to be scaled. It's the scaling, not the resolution itself, that is the real problem. If displays weren't limited by this factor, the whole argument would go out the window.

Killzone uses a method that sidesteps the need for scaling and instead produces a very different look in motion. The issues most people have with lower resolutions don't apply here.
 
Seems like an interesting approach. I'd be interested to know how much of the engine/game was designed around 4GB before the jump to 8GB RAM, and how much GG were able to adapt to use the additional RAM.

I also wish DF would make up their minds on whether resolution matters or not - they seem to keep bouncing back and forth in recent articles.

DF aren't doing that, since this isn't a "face-off"; it's just more information.

More RAM is always good, but wouldn't make any difference to framerate or resolution here.
 
Well... TV programs aren't really 1080p either. For the most part, Blu-ray and video games are the only times your TV gets fed a true 1080p signal, and even then, only plasma can resolve 1080 lines in motion, unless the new OLEDs can as well.

They can't without interpolation, sadly.
 
Well, one could speculate that, if they are shading/fillrate limited, and the game often drops significantly below 60 FPS in multiplayer even with this measure, that rendering at full 1080p would make it drop below 30 FPS. Which really would be unacceptable in a multiplayer FPS.

I don't think it's debatable in the case of Halo 3. Its IQ was utterly atrocious.

Are you telling me that the online portion of the game is more demanding than the single-player campaign in KZ:SF? How is that even possible, when the single-player environments are much bigger and more detailed, coupled with the fact that the system also needs to handle the enemy AI?
 
Are you telling me that the online portion of the game is more demanding than the single-player campaign in KZ:SF? How is that even possible, when the single-player environments are much bigger and more detailed, coupled with the fact that the system also needs to handle the enemy AI?
A lot of games' multiplayer portions are more strenuous than their single-player portions. It's not rare at all.
 
50fps is still a big leap in perceptible fluidity, and also a significant decrease in input lag. If there were no framerate tests done and posted on YouTube, not a single soul would have ever been able to tell that it wasn't actually running at a full 60fps clip.

A lot of people seem to be under the misconception that it's either "all or nothing" with 60fps. But every increase in frames rendered is beneficial, especially in a competitive MP game.
Uhhh totally not true. MP has consistently been a framey mess for me.
 
This is fascinating; no, I don't think anybody gets fired over such an ingenious solution.
Actually, the game renders an entire 1920x1080 frame (over two frames); I know it's all about the semantics here, but still, very, very impressive for a launch game.
Maybe the term "full HD" needs to be specified better.
 
The single player is still running at 1080p though, so it depends on what people have been referring to. It's likely that it was believed that both single player and multiplayer were in 1080p; I think that's the case indeed, I agree, but at least a solid part of the game is still running in 1080p.

I think it's fair to say that the references were to the game as a whole, not parts of it.
 
What? But everyone told me 1080p makes such a difference and you can spot lower res from a mile away!
People noticed.

There have been plenty of posts here and on other forums complaining about lower IQ in multiplayer, but most didn't know why that was the case and assumed an inferior AA solution. Let's not pretend people are blind and IQ has no impact on how a game looks just for the convenience of winning stupid online arguments. That's just plain dumb.

Complaining about Sony and GG misleading people with their 1080p/60fps multiplayer statements is fair game though.
 
So no one noticed this difference in resolution? That's very interesting. It would seem the general console gaming public doesn't have the eye for it.

Are you addressing the general console gaming public by responding to a thread on NeoGAF? You do understand how dumb that sounds, right?
 
Good job missing the point. You don't care about resolutions? Good for you. That still doesn't make it okay for GG/Sony to mislead consumers.

I don't buy games based on resolution, so I don't care if other people fall for false PR.

Obviously I get the "point" when it comes to these companies saying whatever best suits them in the eyes of picky gamers. I'd rather they didn't even say...but of course they would get reamed once it was "found out"....as if they're hiding something that actually matters. bleh

I simply don't care what they or any other PR says. They could have said it was 1440p/120...I would have laughed, said "whatever", and played (or not) the game based on my enjoyment of the content...no matter the resolution.

These developers need to drop resolution from their vocabulary. Tell us what your game does to entertain us. Tell us about new concepts. Tell us about your new tech/physics/gameplay designs.

The crying over resolution (and developers keying on it) has become boring. Some of the most fun I've had in the last few months didn't come close to 1080p.

Again, it's only going to get worse with time/pushing of these systems. These companies need to drop it (or they'll be lying a LOT).
 
This opens the door at many more extreme rendering methods such as:

- Instead of rendering every other column, randomize that shit, same pattern as random noise...
- Based on shader load, decide how much of the next frame you're going to render. 50% (as in KZSF) to 100% variable. Every pixel that's rendered is selected at random.
- You can improve the random selection by making sure transparencies are rendered 100%.

This technique would be very interesting, I'd wager most people won't be able to distinguish it from 1080p if you're rendering 90% of the pixels and making up the other 10% from the previous frame. I hope more developers look into this option.
 
So, in other words, once they were told by a trusted source, the resolution didn't matter. The game's faults were thought to be something else.

Of course it mattered. People noticed a significant difference in image quality between the MP and SP modes. That difference was a result of resolution. Thus, the resolution had a direct, tangible, demonstrable effect on the perceived image quality of the game. That matters.

Whether or not the public knew precisely why the image quality suffered, is a different matter.

Let's say that the government decides to secretly test a weaponized toxin on a small town. All of the people in this town get sick. But they don't know why they got sick. So they all just assumed it was the flu, and blamed it on the flu. Does that mean the weaponized toxic gas "didn't matter?"
 
Of course it mattered. People noticed a significant difference in image quality between the MP and SP modes. That difference was a result of resolution. Thus, the resolution had a direct, tangible, demonstrable effect on the perceived image quality of the game. That matters.

Whether or not the public knew precisely why the image quality suffered, is a different matter.

Let's say that the government decides to secretly test a weaponized toxin on a small town. All of the people in this town get sick. But they don't know why they got sick. So they all just assumed it was the flu virus. Does that mean the weaponized toxic gas "didn't matter?"

Wait, I left this thread for a minute. And this is what I come back to?

You can't be serious? We've gone to Sess levels of argument?

What the fuck? Lol.
 
Debatable. Halo's lighting was both fantastic and extremely characteristic. Reach probably had the best lighting last gen.


In an article debating FPS versus resolution...

I'm glad they did. Halo 3 had the best natural lighting of any Halo. I loved it.
Well, let's agree to disagree; I thought Halo 3 was ugly trash and the lighting was absolutely nothing special. There are plenty of games that have far more natural-looking lighting, run at full res on 360, and also have real-time shadows (RDR, for instance).

Also, as far as I know, Reach actually dropped the higher precision HDR lighting, too.
Well, one could speculate that, if they are shading/fillrate limited, and the game often drops significantly below 60 FPS in multiplayer even with this measure, that rendering at full 1080p would make it drop below 30 FPS. Which really would be unacceptable in a multiplayer FPS.

Isn't the campaign a pretty consistent 30fps, though?
 
Wait, I left this thread for a minute. And this is what I come back to?

You can't be serious? We've gone to Sess levels of argument?

What the fuck? Lol.

His point is perfectly valid, and if I understood your earlier posts correctly, you both actually agree.
 
Really? He buried it in a story about something else.
It's like when FOX News personalities talk about Librul bias in the mainstream media. You see it because you feel it, even when it isn't there.

Well, let's agree to disagree; I thought Halo 3 was ugly trash and the lighting was absolutely nothing special. There are plenty of games that have far more natural-looking lighting, run at full res on 360, and also have real-time shadows (RDR, for instance).

Also, as far as I know, Reach actually dropped the higher precision HDR lighting, too.

That's why Halo 3 had the best lighting. You don't get the proper daylight of Tsavo Highway in the more recent games. A high-res Halo 3 would beat Reach for that reason, in spite of the terrible character models.
 
It's like when FOX News personalities talk about Librul bias in the mainstream media. You see it because you feel it, even when it isn't there.



That's why Halo 3 had the best lighting. You don't get the proper daylight of Tsavo Highway in the more recent games. A high-res Halo 3 would beat Reach for that reason, in spite of the terrible character models.

I can assure you that the extra precision in the HDR code from Halo 3 (which most people's displays probably cannot even show correctly) would not edge out the countless other things Reach did to make the game look better.

Reach has good animations, character models, higher-quality particle effects, and better lighting (deferred rendering actually making muzzle flashes and plasma light the environment correctly). Its toned-down HDR precision/range is not a big deal in light of everything else that it does.
 
Funny how no one noticed this.

Just like they didn't notice PlayStation 4 was running Ghosts at 720p instead of 1080p during previews.

Pixel Counters be sleeping on the job. ;)
I actually noticed the game looked funny. I'm not a pixel counter so I wasn't sure. Didn't really play much of the MP so I hadn't given it much thought.
 
As a PC gamer with no bias towards dev or console, my thoughts on this are: pretty damn neat way of hitting 60fps while kinda sorta hitting 1080p! I'd prefer this over the game running at a straight-up lower res, and tbh I'd even love to be able to enable this kind of vertical interlacing trick on a game like BF4 on my PC to double my framerate while upping my AA. Very cool. Should they have been honest about it in interviews? I guess; I didn't really read much of the interviews, so I don't know how explicitly they lied about it.
 
Well, let's agree to disagree; I thought Halo 3 was ugly trash and the lighting was absolutely nothing special. There are plenty of games that have far more natural-looking lighting, run at full res on 360, and also have real-time shadows (RDR, for instance).

Also, as far as I know, Reach actually dropped the higher precision HDR lighting, too.

Bloody opinions. FWIW, DF say the double-buffered HDR remains:
Reach manages to up the resolution to nigh-on full 720p, while retaining HDR and employing an inordinate amount of dynamic lights - every needle from the needler is a bespoke light source, for example.

http://www.eurogamer.net/articles/digitalfoundry-halo-reach-tech-analysis-article
 
His point is perfectly valid, and if I understood your earlier posts correctly, you both actually agree.

The weaponized toxin analogy is perfectly valid?

I can't. In the weaponized toxin scenario I would want heads to roll. It's a ridiculous comparison.
 
Seems like a cool trick. I just wonder if it might have been better to do full 1920x1080 and lock it at 30fps, possibly combined with a minor reduction in effects (reflections?). But I like seeing new ideas, so the only thing I can really complain about is that they didn't come clean about it right away. They could have spun it as a new interlacing 1080 technology that delivers much of the image quality of full 1920x1080, but with a big boost to framerates. But instead they left it to get discovered and now they look like they tried to hide it.
 
I wonder how people would have reacted if Guerrilla had just sacrificed some graphical effects in MP to get it running at straight-up 1080p/60. It seems no console developer ever considers doing what is actually necessary to get that image quality.
 
Seems like a cool trick. I just wonder if it might have been better to do full 1920x1080 and lock it at 30fps, possibly combined with a minor reduction in effects (reflections?). But I like seeing new ideas, so the only thing I can really complain about is that they didn't come clean about it right away. They could have spun it as a new interlacing 1080 technology that delivers much of the image quality of full 1920x1080, but with a big boost to framerates. But instead they left it to get discovered and now they look like they tried to hide it.
It's hard to say they hid it when it was right in front of our faces the whole time. :p
 
Bloody opinions. FWIW, DF say the double-buffered HDR remains:

You're putting words into that sentence that aren't true. :p

http://www.eurogamer.net/articles/digitalfoundry-halo-reach-tech-interview

Digital Foundry: How is HDR being handled this time? The dual framebuffer seemed to get a lot of flak in Halo 3 in terms of the resolution downgrade, but there wasn't much explained about it. Were other framebuffer formats (7e3/FP10 or INT16) just nowhere near comparable? Your previous GDC presentation only described the differences in terms of numbers, but the real-world comparison is difficult to visualise otherwise. What's the approach in Reach?

Chris Tchou: We use a single 7e3 buffer for our final render target in Reach. This results in a more limited HDR (about 8x over the white point, as opposed to 128x in Halo 3) but is much faster for transparents and post-processing. In practice, the difference between 8x and 128x HDR is slight - the main thing you may notice is that the bloom around bright areas loses its color more often, desaturating to white.

And yes, a single 7e3 buffer gives us more available EDRAM for the final lighting pass, but the render resolution is still limited by the three buffers used in the main deferred pass. The resolution in Halo 3 was more limited because we save some EDRAM for dynamic shadows during the lighting pass, alongside the 2 HDR buffers and a depth buffer. But with a single 7e3 buffer, we have plenty of extra room available for the shadows, and it's only limited by the 3 buffers used during the deferred pass.

Anyways, this is off-topic.
 
These developers need to drop resolution from their vocabulary. Tell us what your game does to entertain us. Tell us about new concepts. Tell us about your new tech/physics/gameplay designs.

The crying over resolution (and developers keying on it) has become boring. Some of the most fun I've had in the last few months didn't come close to 1080p.
The display industry needs to free us from fixed pixel displays first.
 
Of course it mattered. People noticed a significant difference in image quality between the MP and SP modes. That difference was a result of resolution. Thus, the resolution had a direct, tangible, demonstrable effect on the perceived image quality of the game. That matters.

Whether or not the public knew precisely why the image quality suffered, is a different matter.

Let's say that the government decides to secretly test a weaponized toxin on a small town. All of the people in this town get sick. But they don't know why they got sick. So they all just assumed it was the flu, and blamed it on the flu. Does that mean the weaponized toxic gas "didn't matter?"


Ummm no. More like, they showed you two samples, told you both were 1080p and even though you knew something was off, you rolled with it anyway and looked for some other reason to justify the difference in image quality. A little closer to the placebo effect, hence my statement about resolution. IQ is subjective, resolution is not.
 
Well, it's pretty clear GG's engine can't render full geometry and effects at more than 1080p@30. Is compromising IQ for 60fps responsiveness in MP the right answer?
Are interlacing artifacts better or worse than flat-out upscaling vertically and horizontally?

In MP, I think the framerate target is right. It gets framey sometimes, but after struggling through KZ2's constant fps drops, I much prefer them setting a higher fps bar.

Is the IQ better than 720p upscaled? I dunno; most of the time it just seems like really aggressive motion blur to me. I guess this method just prioritizes the vertical pixels and vertical detail over horizontal. I would say my personal preference for FPS multiplayer would be 1080p@30 < 720p@60 < GG's interlacing, given a budget of ~62M pixels per second (1920 x 1080 x 30Hz).
 
I wonder how people would have reacted if Guerrilla had just sacrificed some graphical effects in MP to get it running at straight-up 1080p/60. It seems no console developer ever considers doing what is actually necessary to get that image quality.

Well, Forza 5 did with the crowd, and it gets crapped on constantly, even though it's rock solid 1080p/60.
 
This is fascinating; no, I don't think anybody gets fired over such an ingenious solution.
Actually, the game renders an entire 1920x1080 frame (over two frames); I know it's all about the semantics here, but still, very, very impressive for a launch game.
Maybe the term "full HD" needs to be specified better.

Seriously?

1080p is defined as 1920 pixels horizontally x 1080 pixels vertically, with an aspect ratio of 16:9. That's 2,073,600 pixels that have to be calculated within the time budget for a frame; in the case of 60fps, that's 16.6ms. As I understand it, GG uses a method to calculate that same amount of pixels in the timeframe of 2 frames, by some sort of interlacing of 2 half-pictures -> which means in 33.3ms.

Just for comparison purposes I did a little calculation:

960 x 1080 pixels = 1,036,800 pixels; the problem is that the aspect ratio is 8:9, not 16:9, which makes comparison a little bit difficult.

After an extrapolation to an aspect ratio of 16:9 I got the following result:

1,036,800 pixels equals a rounded resolution of 764p (approx. 1358 x 764 pixels)

You get the feeling ...
 
A lot of people are saying this is why the game is blurry; however, we don't know that. It could be that this technique is very close to 1080p and it's actually the FXAA being applied to two different frames and then merged that causes the blur...
 
Seriously?

1080p is defined as 1920 pixels horizontally x 1080 pixels vertically, with an aspect ratio of 16:9. That's 2,073,600 pixels that have to be calculated within the time budget for a frame; in the case of 60fps, that's 16.6ms. As I understand it, GG uses a method to calculate that same amount of pixels in the timeframe of 2 frames, by some sort of interlacing of 2 half-pictures -> which means in 33.3ms.

Just for comparison purposes I did a little calculation:

960 x 1080 pixels = 1,036,800 pixels; the problem is that the aspect ratio is 8:9, not 16:9, which makes comparison a little bit difficult.

After an extrapolation to an aspect ratio of 16:9 I got the following result:

1,036,800 pixels equals a rounded resolution of 764p (approx. 1358 x 764 pixels)

You get the feeling ...
It's unlikely the technique they're using is equivalent to rendering at 764p as there is an overhead.

More importantly, the technique used produces a much better image than native 764p. It's hugely improved.
 
It's not that "nobody noticed" it looked less than stellar; it's that whenever someone did notice, it wasn't worth crashing up against the wall of noise that would accompany having a less than rabidly enthusiastic opinion of the visuals of a PS4 launch title.

The fact that some people are using this occasion more as a springboard to congratulate GG on a well-executed deception, rather than take the opportunity for a much-needed moment of reflection, is certainly telling.
 
This opens the door at many more extreme rendering methods such as:

- Instead of rendering every other column, randomize that shit, same pattern as random noise...
- Based on shader load, decide how much of the next frame you're going to render. 50% (as in KZSF) to 100% variable. Every pixel that's rendered is selected at random.
- You can improve the random selection by making sure transparencies are rendered 100%.

This technique would be very interesting, I'd wager most people won't be able to distinguish it from 1080p if you're rendering 90% of the pixels and making up the other 10% from the previous frame. I hope more developers look into this option.

Kind of like real-time MPEG encoding - you vary your compression based on the complexity of the scene, but bias towards areas of specific interest. Could be a really interesting area of development for this generation.
 
A lot of people are saying this is why the game is blurry; however, we don't know that. It could be that this technique is very close to 1080p and it's actually the FXAA being applied to two different frames and then merged that causes the blur...

It isn't blur that people mean, though; it's the obvious artifacting that you'd get from something low-precision.
 
Though plenty of other GAFfers are capable pixel counters, I believe you're referring to me. I'm not infallible, but you can quit with the character assassination. I couldn't tell before because in some situations this technique will defeat pixel counting. I can now tell, given new screenshots, but even with some of those the true resolution is not apparent.

I hope you can forgive me for not detecting an interpolation method that has apparently never been used before, just as I will forgive you for being a jerk in the name of winning an argument.

Pointing out that you missed something like this isn't "character assassination," and I was responding specifically to someone attacking the group that did actually identify and document the process. This process results in obvious artifacting in all of the direct screen-grabs we've looked at here with even slight camera motion. Considering people are simultaneously criticizing DF, who did eventually identify it, while celebrating your work, which didn't identify it until it was pointed out to you, I hope you'll "forgive" accurate, non-exaggerated statements of fact.
 
guerilla are bullshitters, it's what they do

killzone 2 input lag
killzone mercenary 'native res' (designed so that it'll always be native res when you take a screenshot but not during gameplay)
killzone shadowfall 'dedicated servers' peer2peer multiplayer
now this

The fact that people didn't immediately scream 'this isn't 1080p' is probably testament to how big a crap fxaa takes on image quality


Pretty much. People want 60fps but apparently without any sacrifices.
People crap on Forza because of the downgrades from the original trailer, as they should;
that is separate from the game being 1080p/60fps (which is highly commendable for gameplay and image quality).
 
guerilla are bullshitters, it's what they do

killzone 2 input lag
killzone mercenary 'native res' (designed so that it'll always be native res when you take a screenshot but not during gameplay)
killzone shadowfall 'dedicated servers' peer2peer multiplayer
now this

The fact that people didn't immediately scream 'this isn't 1080p' is probably testament to how big a crap fxaa takes on image quality
Mercenary was advertised as variable resolution. It's also a different studio.
 
A lot of people are saying this is why the game is blurry; however, we don't know that. It could be that this technique is very close to 1080p and it's actually the FXAA being applied to two different frames and then merged that causes the blur...

It's temporal 1080p. You get a full 1920x1080 of new pixels every 2 frames. Without the blurring/image processing, it'd look like every other column of pixels was a frame late.
 