Digital Foundry vs Metro Redux

Oh yeah, also wanna add that on top of the apparently great port, it's also very reasonably priced. Square Enix could really learn something from Deep Silver here.
 
912p? Wtf is up with these funky X1 resolutions? Hopefully they can reach 1080p, pre- or post-release.
Did you see the resolutions for the last gen versions of these games? Weird resolutions were downright common last-gen.

Rock-solid 60fps is more important anyway.

Sounds like they did top notch work with Metro Redux overall.
 
What are all the resolutions the Xbone has hit so far? 792p was also strange.

1080p is elitist and pretentious. 792p and 912p on the other hand are tried-and-true working man's resolutions.
 
giphy.gif

Is this animation canned or dynamic?
 
Indeed, locking it to 60fps regardless of platform is quite the accomplishment. 2033 in particular ran like crap on even the best PCs, so running it the way they are on the new consoles is amazing.
Ran like crap if you turned on some very, very taxing special effects. Chances are those effects are not applied in these console releases.
Canned, just like the ones from MGS games.
On PC? Are you sure? Because it supported PhysX for GPU-accelerated soft materials, at least in the first Metro.
 
1152x640 framebuffer on PS3 (1280x672 on 360)

So much for reading the OP. This is nothing new.

So much for reading my comment:

912p? Wtf is up with these funky X1 resolutions? Hopefully they can reach 1080p, pre- or post-release.

I wasn't talking about last-gen, and I'm well aware of last-gen having funky resolutions. I'm just surprised to see the trend carrying over to this gen with the X1.
 
Devs will do anything to stay away from that 720p label, even if the resolution doesn't make a lick of sense. Could it be mandated by MS?
 
Ran like crap if you turned on some very, very taxing special effects. Chances are those effects are not applied in these console releases.

On PC? Are you sure? Because it supported PhysX for GPU-accelerated soft materials, at least in the first Metro.

Yes, I am sure. Every time you watch it, the cloth falls down in exactly the same way.
 
Devs will do anything to stay away from that 720p label, even if the resolution doesn't make a lick of sense. Could it be mandated by MS?

A resolution itself doesn't have to make "sense". They probably just fine-tuned all other performance parameters (ESRAM usage, bandwidth, GPU time, etc.) and chose the highest possible resolution that matches them the best while delivering the target frame rate.
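To put some made-up numbers on that idea, here's a minimal sketch of the kind of budgeting involved. It assumes the resolution-dependent GPU cost scales roughly linearly with pixel count, which is only an approximation of real profiling; the function and the numbers are invented for illustration, not anything 4A actually used.

    # Sketch: pick the largest 16:9 resolution that fits a 60fps frame budget.
    # measured_ms = frame time at a known resolution, fixed_ms = resolution-independent work.
    def max_resolution(measured_ms, measured_px, fixed_ms, budget_ms=1000.0 / 60.0):
        per_pixel_ms = (measured_ms - fixed_ms) / measured_px
        max_px = (budget_ms - fixed_ms) / per_pixel_ms
        height = int((max_px * 9 / 16) ** 0.5)
        return int(height * 16 / 9), height

    print(max_resolution(measured_ms=20.0, measured_px=1920 * 1080, fixed_ms=4.0))
    # -> roughly (1706, 960) with these invented numbers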
 
Indeed. Still, it's a fucking joke that the X1 can't hit 1080p in 2014. Native resolution is the second most important thing, right after framerate.

It can hit 1080p. There are a bunch of titles which are 1080p; it's just that the devs decided they'd rather have the same graphical effects for both PS4 and X1 than the same res.

It obviously looks better with great effects at 912p than less great effects at 1080p.
 
Indeed. Still, it's a fucking joke that the X1 can't hit 1080p in 2014. Native resolution is the second most important thing, right after framerate.


Posts like these are bollocks. The PS3 and Xbox 360 weren't hitting 720p in 2007. It doesn't mean anything. There are games on the PS4 that aren't 1080p; is that also a joke? Or I'm guessing that's the devs' fault. Comments like these are what will stop graphical effects being pushed by devs, because when they read them they'll think gamers just want native resolution.

I'll take 60fps all day. Resolution is the first thing to tweak, in my opinion.
 
It can hit 1080p. There are a bunch of titles which are 1080p; it's just that the devs decided they'd rather have the same graphical effects for both PS4 and X1 than the same res.

It obviously looks better with great effects at 912p than less great effects at 1080p.
It doesn't work exactly that way. The problem with native 1080p isn't just some fancy effects at a lower res. I thought that had been pretty clear for a while now: ESRAM is the big problem.
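For a rough sense of scale (my own back-of-the-envelope numbers, not 4A's actual render target setup): the Xbox One has 32 MB of ESRAM, and 1080p render targets chew through that very quickly.

    # Back-of-the-envelope render target footprint vs. the Xbox One's 32 MB of ESRAM.
    # Target count and formats here are illustrative assumptions, not Metro's real setup.
    def framebuffer_mb(width, height, bytes_per_pixel, targets):
        return width * height * bytes_per_pixel * targets / (1024 * 1024)

    # e.g. four 32-bit colour targets plus a 32-bit depth buffer:
    print(framebuffer_mb(1920, 1080, 4, 5))  # ~39.6 MB - already over 32 MB
    print(framebuffer_mb(1620, 912, 4, 5))   # ~28.2 MB - fits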
 
A resolution itself doesn't have to make "sense". They probably just fine-tuned all other performance parameters (ESRAM usage, bandwidth, GPU time, etc.) and chose the highest possible resolution that matches them the best while delivering the target frame rate.

I was under the impression that resolution scaling works better in absolute steps, hence the comment. Isn't 720p with a superior AA solution better than these weirdo resolutions?
 
Posts like these are bollocks. The PS3 and Xbox 360 weren't hitting 720p in 2007. It doesn't mean anything. There are games on the PS4 that aren't 1080p; is that also a joke? Or I'm guessing that's the devs' fault. Comments like these are what will stop graphical effects being pushed by devs, because when they read them they'll think gamers just want native resolution.

I'll take 60fps all day. Resolution is the first thing to tweak, in my opinion.
There were plenty of 720p games on both consoles in 2007, including games like Uncharted. You aren't reflecting reality here.
 
So, (1920 * 1080) / (1620 * 912) = 1.4035 --> PS4 pushing about 40% more pixels than Xbox One?

Hmm, I'm not a mathematician, but is that really the correct way of looking at it? I prefer to look at it like this.

(1620 * 912) / (1920 * 1080) = 0.7125

Meaning, the Xbox One version has 71% of the pixels the PS4 version has.

PS4 has 2 073 600 pixels - Xbox One has 1 477 440 pixels = the difference is 596 160 pixels.

596 160 is ~29% of 2 073 600 (596 160 / 2 073 600 = 0.2875)

Shouldn't it be that the PS4 version has roughly 29% more pixels?
 
Hmm, I'm not a mathematician, but is that really the correct way of looking at it? I prefer to look at it like this.

(1620 * 912) / (1920 * 1080) = 0.7125

Meaning, the Xbox One version has 71% of the pixels the PS4 version has.

PS4 has 2 073 600 pixels - Xbox One has 1 477 440 pixels = the difference is 596 160 pixels.

596 160 is ~29% of 2 073 600 (596 160 / 2 073 600 = 0.2875)

Shouldn't it be that the PS4 version has roughly 29% more pixels?

Not this again.
 
Hmm, I'm not a mathematician, but is that really the correct way of looking at it? I prefer to look at it like this.

(1620 * 912) / (1920 * 1080) = 0.7125

Meaning, the Xbox One version has 71% of the pixels the PS4 version has.

PS4 has 2 073 600 pixels - Xbox One has 1 477 440 pixels = the difference is 596 160 pixels.

596 160 is ~29% of 2 073 600 (596 160 / 2 073 600 = 0.2875)

Shouldn't it be that the PS4 version has roughly 29% more pixels?
Time to go back to school.
 
Yes, I am sure. Every time you watch it, the cloth falls down in exactly the same way.
To be clear, I'm not saying you are wrong, but (always a but) the fact that it always folds down the same way doesn't mean it's necessarily precomputed. What it means is that the set of variables that affect the simulation is always the same, so the results remain the same. Like I said, I'm not saying you are incorrect for the Last Light case.

But if the results are consistent when the variables always play out the same, what would be the benefit of computing the physics on the fly? Well, for animation work: it becomes a faster, more automated process to get results.
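A toy illustration of that point (not Metro's actual cloth code, just a minimal sketch): a simulation stepped with a fixed timestep from a fixed initial state produces identical results every run, even though it's computed live.

    # Toy fixed-timestep drop: deterministic, yet computed on the fly.
    # Purely illustrative; a real cloth solver integrates many coupled particles.
    def simulate_drop(height=2.0, dt=1.0 / 60.0, gravity=-9.81, steps=60):
        y, velocity = height, 0.0
        for _ in range(steps):
            velocity += gravity * dt
            y = max(0.0, y + velocity * dt)
        return y

    # Same inputs in, same result out, every single run:
    assert simulate_drop() == simulate_drop()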
 
Shouldn't it be that the PS4 version has roughly 29% more pixels?

I always wondered why Albert Penello talked about a "30%" advantage in his famous meltdown, not a 40% one. We might have discovered the answer.
 
Shouldn't it be that the PS4 version has roughly 29% more pixels?
No. When you say console X renders (some percentage) more or fewer pixels than console Y, the percentage is based on console Y's pixel count. So the PS4 version renders 40% more pixels than the Xbox One version, while the Xbox One version renders 29% fewer pixels than the PS4 version.
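Quick sanity check of both directions, using the resolutions from the article (the snippet itself is just for illustration):

    # Both ways of expressing the Metro Redux resolution gap.
    ps4_px = 1920 * 1080   # 2,073,600
    xb1_px = 1620 * 912    # 1,477,440

    print((ps4_px - xb1_px) / xb1_px)  # ~0.40 -> PS4 renders ~40% more pixels
    print((ps4_px - xb1_px) / ps4_px)  # ~0.29 -> Xbox One renders ~29% fewer pixels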
 
There were plenty of 720p games on both consoles in 2007, including games like Uncharted. You aren't reflecting reality here.
You're right, but there are plenty of 1080p games on XB1 as well. I think his point is that nothing has really changed since last gen in terms of resolution. Some people are acting like this is something new.
 
Glad they chose to prioritize frame rate. I can barely tell the difference between 800p and 1080p, but 30 vs 60 is extremely noticeable to me.
 
What about Destiny then?
Unless I'm mistaken, the final code hasn't been compared side by side. And while parity in resolution and frame rate isn't out of the question, some other sacrifices to IQ, features, and effects can be made instead.

I'm looking at this from the idea that the hardware gap cannot be plugged. Having said that, Destiny should provide the closest performance of any PS4/XBO versions, if we go by the news reports of MS sending their programmers to help bring the port up to speed.
 
PS4 has 2 073 600 pixels - Xbox One has 1 477 440 pixels = the difference is 596 160 pixels.

596 160 is ~29% of 2 073 600 (596 160 / 2 073 600 = 0.2875)

Shouldn't it be that the PS4 version has roughly 29% more pixels?

A has 100 pies - B has 10 pies = the difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words, 100 is 90% more than 10.

All you have to do is divide them to get the percentage.
 
No. When you say console X renders (some percentage) more or fewer pixels than console Y, the percentage is based on console Y's pixel count. So the PS4 version renders 40% more pixels than the Xbox One version, while the Xbox One version renders 29% fewer pixels than the PS4 version.

My mistake, thank you. =)
 
A has 100 pies - B has 10 pies = the difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words, 100 is 90% more than 10.

All you have to do is divide them to get the percentage.

You're making the exact same mistake as Ason was, dividing by the wrong number. A actually has 900% more pies than B.
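In other words, the "more than" percentage has to be taken relative to the smaller baseline. A quick sketch of both directions, nothing more:

    a, b = 100, 10
    print((a - b) / b)  # 9.0 -> A has 900% more pies than B
    print((a - b) / a)  # 0.9 -> B has 90% fewer pies than A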
 
A has 100 pies - B has 10 pies = the difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words, 100 is 90% more than 10.

All you have to do is divide them to get the percentage.
This thread just took an awesome turn.
 
A has 100 pies - B has 10 pies = the difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words, 100 is 90% more than 10.

All you have to do is divide them to get the percentage.

Please let this be a joke
 
People talk about 'strange resolutions' as if they're still not beneficial. Maybe TitanFall should have been 720p, but when you're practically maintaining 60fps already, bumping the resolution however much you can is nice.
 
A has 100 pies - B has 10 pies = the difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words, 100 is 90% more than 10.

All you have to do is divide them to get the percentage.

lmao

People talk about 'strange resolutions' as if they're still not beneficial. Maybe TitanFall should have been 720p, but when you're practically maintaining 60fps already, bumping the resolution however much you can is nice.

Pretty much, why not.
 
A has 100 pies - B has 10 pies = the difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words, 100 is 90% more than 10.

All you have to do is divide them to get the percentage.

Maybe basic arithmetic should be part of Gaf's account approval process.
 