Digital Foundry vs Metro Redux

900% more pies? Can I haz pie? You seem to have plenty. Also this game looks amazing, happy the PC version gets the remade goods this time around too.
 
A has 100 pies - B has 10 pies = The difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words 100 is 90% more than 10.

All you have to do is divide them to get the percentage


You mean like this?

PS4 has 2 073 600 pixels - Xbox One 1 477 440 pixels = The difference is 596 160 pixels.

596 160 is ~29% out of 2 073 600 (596 160 / 2 073 600 = 0.2875)

lol

That doesn't help much, because that is exactly what I did. =)
 
Posts like these are bollocks. The PS3 and Xbox 360 weren't hitting 720p in 2007. It doesn't mean anything. There are games on the PS4 that aren't 1080p - is that also a joke? Or I'm guessing that's the devs' fault. Comments like these are what will stop graphical effects being pushed by devs, because when they read them they'll think gamers just want native resolution.

I'll take 60fps all day. Resolution is the first thing to tweak, in my opinion.
It's unfortunate you have to hearken back to last generation to explain the shortcomings of performance of the Xbox One. Resolution may be subjective for some, but I don't see how. Image clarity and pixel density is a huge benefit to graphics performance.
 
There were plenty of 720p games on both consoles in 2007, including games like Uncharted. You aren't reflecting reality here.


I meant there were titles not hitting 720p, not that every single one missed it.


And to the post above: this will happen in every gen from now on. Do you think that if 4K becomes standard within this console generation's life cycle, every game will be 4K native next gen?

I'm trying to say that things were the same at this point last gen. There will be games on the PS4 that aren't 1080p to come - I promise people that. The resolution thing just needs to be let go. As for comments like "it's a joke that this console isn't hitting 1080p": both consoles will have their chances to miss that mark, and it doesn't mean shit. It just means the Xbox One will miss it more often than the PS4, just like the PS3 did last gen.


It also means fanboys will use resolution to downplay years of hard work by individuals and try to ruin good games, only for LTTP threads to appear later about how good the game is now. Alan Wake, for example.

I.e. resolution doesn't make a game good or bad.
 
It's unfortunate you have to hearken back to last generation to explain the shortcomings of performance of the Xbox One. Resolution may be subjective for some, but I don't see how. Image clarity and pixel density is a huge benefit to graphics performance.
Clarity IS the reason I could not wait to upgrade from my PS3 to my PS4... clarity.
 
Hmm, I'm not a mathematician, but is that really the correct way of looking at it? I prefer to look at it like this.

That's your problem right there. You are bringing preference into the equation.


(1620 * 912) / (1920 * 1080) = 0.7125


Meaning, the Xbox One version has 71% of the pixels the PS4 version has.

So far so good.

PS4 has 2 073 600 pixels - Xbox One 1 477 440 pixels = The difference is 596 160 pixels.

596 160 is ~29% out of 2 073 600 (596 160 / 2 073 600 = 0.2875)


Shouldn't it be that the PS4 version has roughly 29% more pixels?

And this is where you are wrong. Finish that sentence. "Shouldn't it be that the PS4 version has roughly X% more pixels....than the XBox One."

We are comparing how many pixels the PS4 has in relation to the XBox One. Notice how for the first, correct calculation you did, you were comparing how many pixels the XBox One had in relation to the PS4. When you did that calculation, which number went into the divisor (the bottom part of the fraction)? The number that you want to compare the other one to. In the first case, it was the PS4. So you used the PS4's power as the baseline, what you are comparing the XBox One's power to.

In the second example, your calculations are still done with the PS4 as the baseline, but your conclusion is wrong. Your conclusion is assuming the XBox One was used as the baseline. So for your second conclusion, you would be correct if you said "The XBox One has 29% less pixels than the PS4." You can not just switch it around and say "The PS4 has 29% more pixels than the XBox One."

So how do we correct what you said? We use the XBox One as the baseline. Our calculation thus becomes:

(2073600 / 1477440) * 100 = 140.35

So the PS4 renders 140% of the XBox One's pixel count. Or: the PS4 has 40% more pixels than the XBox One. We cannot do what you did and say the XBox One has 40% fewer pixels than the PS4.
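If it helps, the whole baseline rule fits in a few lines of Python; the only inputs are the two pixel counts already quoted in this thread, and the only thing that changes between the two halves is which console sits in the divisor:

```python
# Percentage comparisons depend entirely on which number is the baseline
# (the divisor). These are the two pixel counts from the comparison above.

ps4 = 1920 * 1080   # 2 073 600 pixels
xb1 = 1620 * 912    # 1 477 440 pixels

# Baseline = PS4 (PS4 in the divisor):
print(f"{xb1 / ps4:.2%}")      # 71.25%  -> XBox One renders 71% of the PS4's pixels
print(f"{1 - xb1 / ps4:.2%}")  # 28.75%  -> XBox One renders ~29% fewer pixels

# Baseline = XBox One (XBox One in the divisor):
print(f"{ps4 / xb1:.2%}")      # 140.35% -> PS4 renders 140% of the XBox One's pixels
print(f"{ps4 / xb1 - 1:.2%}")  # 40.35%  -> PS4 renders ~40% more pixels
```

Notice the asymmetry: ~29% fewer one way, ~40% more the other. That's why you can't just flip the sentence around.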
 
So awesome to see, and a good price to boot.

The Remaster Era is upon us, ladies and gentlemen, and what better time to release "Final Fantasy XII & Final Fantasy XII IZJS HD" than now.
Get to it, Square.
 
1080p/60fps is great, but it doesn't sound like DF are too excited about the game's visuals and technology. I guess it's not much of a looker?
I would much prefer better eye candy at 30fps. I hope the game will look good enough and not like an up-rezzed PS3 game.
 
Ah ok. I'm not trying to take anything away from it. I just would have been doubly impressed if that kind of physics was done on the fly.

And what animations are you referring to in the MGS games?
Check the start: it looks so natural, but it's canned. Even MGS3 and MGS4 had these kinds of cloth animations that looked smooth and natural but were canned.


https://www.youtube.com/watch?v=K28vLfwR3Oc


What about Destiny then?

Destiny is a 30FPS game.
Achieving a consistent 60FPS is much harder.

Destiny on PS4 could be running upwards of 40-45FPS, like TR: DE, while hovering around 30-35 on Xbox One. Capped at 30, both get the same performance, but that doesn't mean they are equal. (I'm only using those numbers as an example; I'm not actually aware of the real ones.)
 
Guys, I'm sorry I derailed this thread, that was definitely not my intention.

No. When you say console X renders (percentage) more / less pixels than console Y, the percentage is based on console Y's pixel count. So the PS4 version renders 40% more pixels than the XBox version, while the XBox version renders 29% less pixels than the PS4 version.

This is of course correct.


But truth is, my simple brain has an easier time coping with (1620 * 912) / (1920 * 1080), meaning the Xbox One version has 71% of the pixels the PS4 version has.

It just gives me a better understanding of the pixel count, hence the misunderstanding.


=)
 
Man, if the only difference this gen is going to be 1080p vs 900p, then I don't think the Xbox One has anything to worry about, and it was seriously put down on here. The performance difference may have been drastically exaggerated compared to real-world performance.

I would put money on the average user not being able to notice the difference between 1080p and 912p (lol)
If you keep saying it, it might actually become true!

It obviously looks better with great effects at 912p than less great effects at 1080p.
I disagree. It's better to tune other variables than have the whole picture degraded by sub-native res on a 1080p display.
 
That's your problem right there. You are bringing preference into the equation.




So far so good.



And this is where you are wrong. Finish that sentence. "Shouldn't it be that the PS4 version has roughly X% more pixels....than the XBox One."

We are comparing how many pixels the PS4 has in relation to the XBox One. Notice how for the first, correct calculation you did, you were comparing how many pixels the XBox One had in relation to the PS4. When you did that calculation, which number went into the divisor (the bottom part of the fraction)? The number that you want to compare the other one to. In the first case, it was the PS4. So you used the PS4's power as the baseline, what you are comparing the XBox One's power to.

In the second example, your calculations are still done with the PS4 as the baseline, but your conclusion is wrong. Your conclusion is assuming the XBox One was used as the baseline. So for your second conclusion, you would be correct if you said "The XBox One has 29% less pixels than the PS4." You can not just switch it around and say "The PS4 has 29% more pixels than the XBox One."

So how do we correct what you said? We use the XBox One as the baseline. Our calculation thus becomes:

(2073600 / 1477440) * 100 = 140.35

So the PS4 renders 140% of the XBox One's pixel count. Or: the PS4 has 40% more pixels than the XBox One. We cannot do what you did and say the XBox One has 40% fewer pixels than the PS4.

Yupp, got it! =)
 
I meant there were titles not hitting 720p, not that every single one missed it.


And to the post above: this will happen in every gen from now on. Do you think that if 4K becomes standard within this console generation's life cycle, every game will be 4K native next gen?

I'm trying to say that things were the same at this point last gen. There will be games on the PS4 that aren't 1080p to come - I promise people that. The resolution thing just needs to be let go. As for comments like "it's a joke that this console isn't hitting 1080p": both consoles will have their chances to miss that mark, and it doesn't mean shit. It just means the Xbox One will miss it more often than the PS4, just like the PS3 did last gen.


It also means fanboys will use resolution to downplay years of hard work by individuals and try to ruin good games, only for LTTP threads to appear later about how good the game is now. Alan Wake, for example.

I.e. resolution doesn't make a game good or bad.

I don't really think anyone feels resolution makes a game good or bad. But higher resolution is better, and it does matter, and I feel like I have this conversation in every one of these threads when I shouldn't have to.
 
This is why most of the technical talk should be thrown out the window when it comes to consoles. I bet the PC versions smoke the current console versions.




I think he meant a comparison between the original PC releases and the Redux PC releases. They talked about performance improvements!
 
If you keep saying it, it might actually become true!


I disagree. It's better to tune other variables than have the whole picture degraded by sub-native res on a 1080p display.


So I'm watching the latest Walking Dead season on DVD because I borrowed it off a friend. I've normally watched the Blu-rays. What do you know: I'm quite enjoying it quality-wise. Not as crisp as the Blu-ray, but I'm still getting the whole story. When I look back and talk about episodes, I can't even remember the quality, just what happened.

I will buy all multiplats on PS4 if they are the best version. It's not like Xbox One users are losing out. There's so much hyperbole on here, it's just embarrassing. Did you play PS3 last gen, or just PC? Have you only ever accepted 1080p? Some people just focus too much on bullshit resolution and miss out on great games.

Alan Wake was amazing at 540p. I thought it actually added to the game: that kind of gritty, grainy look. I play it maxed at 1080p now and it sure as hell looks crisp, but it's not how I remembered it on my big TV with surround sound.
 
912p?
 
I was under the impression that resolution scaling works better in absolute steps, hence the comment. Doesn't 720p with a superior AA solution > weirdo resolutions?

More pixel information allows the scaler to guess less when scaling up. Garbage in garbage out.

Resize two images of different size in Photoshop and the one with the larger start size will look better. Basically the same idea.
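To put a toy number on the "guess less" point: here's a hypothetical pure-Python sketch using 1-D nearest-neighbour resampling on a single gradient scanline (real video scalers are far more sophisticated, so treat this only as an illustration that a larger source strays less from the original when scaled back up):

```python
# Toy demonstration: round-trip a smooth gradient through two different
# source sizes and measure how far each reconstruction drifts from the truth.

def resize(pixels, target_len):
    """Nearest-neighbour resample of a list of values, using pixel centres."""
    src_len = len(pixels)
    return [pixels[int((i + 0.5) * src_len / target_len)] for i in range(target_len)]

# A smooth 1080-sample "ground truth" gradient (one scanline's worth).
truth = [i / 1079 for i in range(1080)]

# Downsample to two source sizes, then scale both back up to 1080.
from_720 = resize(resize(truth, 720), 1080)
from_912 = resize(resize(truth, 912), 1080)

err_720 = sum(abs(a - b) for a, b in zip(from_720, truth))
err_912 = sum(abs(a - b) for a, b in zip(from_912, truth))

print(err_912 < err_720)  # True: the larger source reconstructs more faithfully
```

Same idea as the Photoshop example: more source samples in, less guessing on the way out.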
 
All these uberatives wouldn't exist if the majority had been nicer towards the PS3 during the past generation.

This is now the reaping of what was seeded.
 
I don't really think anyone feels resolution makes a game good or bad. But higher resolution is better, and it does matter, and I feel like I have this conversation in every one of these threads when I shouldn't have to.


I completely agree that it's better (apart from Alan Wake for me; that's personal). I think it only gets discussed because of the people that say "900p, that's embarrassing", etc. It's not embarrassing. It is what it is.
 
I'd rather have a steady frame-rate than 1080p with big fluctuations. Good to see them make that choice and use the little boost they got from the June SDK to get that right.
 
The fact that these guys provided DF with a copy for testing speaks amazingly well of their confidence in this product and shows how awesome they are.
 
You mean like this?



lol

That doesn't help much, because that is exactly what I did. =)

Really?!?
I thought he and everyone else would get that this is obviously wrong from an example as simple as possible, one that ends in an absurd statement like
"100 is 90% more than 10".
I even put a spoiler in there with the correct answer.
Guess I was wrong, lol.
 
All these uberatives wouldn't exist if the majority had been nicer towards the PS3 during the past generation.

This is now the reaping of what was seeded.
People who took anything personally last gen and are now lashing out as a result of it need to seriously evaluate their own maturity level.
 
I completely agree that it's better (apart from Alan Wake for me; that's personal). I think it only gets discussed because of the people that say "900p, that's embarrassing", etc. It's not embarrassing. It is what it is.

It's a discussion because this is a comparison thread just like any other DF thread. Differences between the consoles are going to be discussed.
 
So I'm watching the latest Walking Dead season on DVD because I borrowed it off a friend. I've normally watched the Blu-rays. What do you know: I'm quite enjoying it quality-wise. Not as crisp as the Blu-ray, but I'm still getting the whole story.
DVD is a working man's Blu-ray. There's nothing wrong with quietly enjoying working man's resolution every now and then.

Back on topic, question:

Screen-tearing on Xbox One is a problem at this point. Can it be avoided by lowering resolution/fps, or is there a problem at the hardware level?
 
A has 100 pies - B has 10 pies = The difference is 90 pies.

90 is 90% out of 100 (90 / 100 = 0.9)

So A has 90% more pies than B.
In other words 100 is 90% more than 10.

All you have to do is divide them to get the percentage

Fuck! I'm hungry now.
Want some pie :(
 
Can you elaborate?

Selectively choosing facts to draw a negative conclusion. Destiny went from 900p to 1080p thanks to the June SDK improvements.

Yet Metro comes along, they mention the SDK allowed them to go from 900p to 912p, and the poster infers negativity. The poster has no idea whether other changes occurred as well but weren't mentioned, and the article referenced in the OP suggests as much.
 
Didn't they say Xbone 1080p?

I don't think they have ever explicitly stated the XBox One version would be 1080p. A lot of people seemed to infer it though, for various reasons. For example, the Metro Redux video released yesterday mentions 60fps with "native" resolution.
 