At launch? Probably little to no difference.
Probably not after either, sadly.
At launch? Probably little to no difference.
Why would there be a difference after Xbone got CPU megaboosted?
It's going to vary from dev to dev and game to game. There's certainly no good reason for any multiplat to look worse on PS4, though, unless the dev just doesn't try.
Why would there be a difference after Xbone got CPU megaboosted?
How big is the power gap between PS4 and Xbox One compared to the gap between PS3 and 360?
Before, the PS4 version of a game should have run at least somewhat faster in certain areas. With the bump to the Xbone CPU, and assuming the PS4's is clocked at 1.6 GHz with 2 cores not usable for games, there are now situations where the Xbone version runs a little faster.
The same type? Uh?
Why would there be a difference after Xbone got CPU megaboosted?
It isn't GDDR3 (which was used in the Xbox 360) vs. GDDR5. As in, both are GDDR. Yes, 3 vs. 5, but my point is that we aren't comparing GDDR vs. XDR RAM this time around. They are on an even playing field for the most part.
This is clearly how all potential differences should be measured going from one platform generation to the next.
Xbox employees say 40% GPU power gap will be unnoticeable.
Not sure how much I believe that.
At launch? Probably little to no difference.
The framerates are always locked at 30/60.
This is clearly how all potential differences should be measured going from one platform generation to the next.
Considering how it was in the last gen, I'd say there won't be much difference.
Multiplat devs are lazy. I stand by it. 40% difference in power, 99% the same graphics. Magic Microsoft.
X1 = Will look like total shit
PS4 = Glorious HD 1080p goodness
/s
Honestly, the real goodness will shine in the first-party efforts a few years later. I'm talking ND, SSM, Black Tusk, etc. Those studios will really do something special.
BigPapaGlueHands said: Multiplat devs are lazy.
What? I'm just curious. The PS3 had the better specs, yet generally had worse versions of multiplats than 360. I don't know if that was due to Cell, 360 versions being the lead platform, etc. Maybe the architecture change going to PS4 will change things.
There is nothing needed to get more power out of a stronger GPU when the architecture is so similar. I think it's reasonable to assume it will take some time before developers learn how to really take advantage of the extra power.
Very evenly matched at launch, significant difference in a few years.
How big is the power gap between PS4 and Xbox One compared to the gap between PS3 and 360?
Should be more noticeable than it was this generation given the hardware gap.
I still don't follow this line of thought. Why is it going to take a few years when the architecture between the two systems is so similar and so much simpler now? I would expect the gap for multi-platform titles to remain pretty consistent throughout the gen.
How significant will it be? I don't think your average Joe will be able to tell, but we'll see. I do expect lower IQ on the X1 at the least.
Multiplat devs are lazy. I stand by it.
What? I'm just curious. The PS3 had the better specs, yet generally had worse versions of multiplats than 360. I don't know if that was due to Cell, 360 versions being the lead platform, etc. Maybe the architecture change going to PS4 will change things.
Multiplat devs are lazy. I stand by it.
What should that tell me?
It was due to the complexity of Cell. The 360 had a slightly better GPU as well. When Cell was used correctly, you could run graphical tasks on the SPEs, which freed up resources on the GPU. That's why you have games like The Last of Us, Uncharted, Beyond, etc. But it was not practical for third-party devs to put in that much effort (especially when 360 was the lead platform), so some third-party games suffered on PS3 as a result.
PS4 and Xbone won't repeat that situation. The PS4 is more powerful no matter how you look at it, and even games built from the ground up on Xbone will benefit from the extra raw GPU power on PS4. And that's not even taking into consideration the compute benefits.