Assassin's Creed "Parity": Unity is 900p/30fps on both PS4 & Xbox One

Status
Not open for further replies.
Actually it was the home of the 6GB VRAM controversy, where we got dozens of meltdowns from some PC gamers thinking that their video cards were now "worthless". It also went through a bit of a console roller-coaster, where people were mocking the XBO version for being "720p30", based on some sort of vague information that the PS4 was expected to be 1080p60. It turned out that XBO was 900p30 and PS4 was 1080p30, so nearly everybody was wrong until we got our hands on the game.

There were also differences in settings, so the difference was bigger than just 900p vs. 1080p.
Still, the difference was not remotely as big as 1080p/60 vs. 720p/30.
 
[image: nXeb.gif]

There it is
 
It's not about the resolution; it's about them gimping the game for a certain reason that doesn't sound right.
I'm sure you wouldn't like it if Ubi said they were gimping the PC version to 900p.

It's already kinda gimped with Ubisoft Kyiv and Uplay. Might be the first AC title I won't buy; gonna wait until some other PC suckers buy the game to test it out.
 
They gave back 10% reserve of a weaker GPU than what the PS4 has. Does it give the XB1 a boost? Yes. Will it make the two machines equal? Never.

Shadows of Mordor is a good example of how to do a title right. It was tailored for each platform. Hardly anyone complained. That flimsy excuse in the OP is disgusting because this has sparked more controversy and backlash full of conspiracies and fanfiction by going the parity route.

In case you're being serious: no, you can't make up a 50% difference with 10%.
This is a GPU problem, not a CPU one.

Sorry, my original question was misinterpreted. Someone said that Ubi games always had a lower resolution on X1, but found it suspicious that ACU has parity, equating this to dirty MS tactics. As happened with Diablo 3 and Destiny, though, I was asking if that 10% was enough to up the resolution.

For absolute clarity, I'm not saying the 10% is enough to make X1 and PS4 equal but it has been proven to help increase resolution :)

Sorry, I'm tired and need sleep!
 

It's hyperbolic comparisons like that gif that fuel the flames of the resolution topic.
There is a difference, but put in perspective it's not that big of a difference.

Me, I usually get multiplats on PC where possible, but AC4 was a complete train wreck on PC, so I don't know :(
 
Sucks... Brad from GB said they're not going to talk about this (or at least hinted that they're not).

"why would we when Ubi already explained the technical reason behind it?"

:(

Games journalists in action! Too busy working up an op-ed piece on feminism or violence in gaming, you know, way more pertinent topics than forced parity in games.

Help me Jim Sterling.. You are our only hope...
Help me Jim Sterling.. You are our only hope...
Help me Jim Sterling.. You are our only hope...
 
How were your Xbox games 'gimped all the time'? Name them.

MGS2 was clearly and intentionally gimped. They just took the PS2 texture files and compressed them again for the Xbox version.
That said, Xbox versions were consistently better. 480p (or 720p) vs. 480i and better frame rates being the biggest differences usually.
 
It's hyperbolic comparisons like that gif that fuel the flames of the resolution topic.
There is a difference, but put in perspective it's not that big of a difference.

Me, I usually get multiplats on PC where possible, but AC4 was a complete train wreck on PC, so I don't know :(

I agree it's not that massive. But if you have a 55-inch TV and aren't sitting across the room (like in a living room), it can be a pretty big difference. AC IV was really noticeable.

Edit: I mean look at the AC IV gifs, that is pretty substantial. I noticed this myself before people were making the gifs.
 
Games journalists in action! Too busy working up an op-ed piece on feminism or violence in gaming, you know, way more pertinent topics than forced parity in games.

Help me Jim Sterling.. You are our only hope...
Help me Jim Sterling.. You are our only hope...
Help me Jim Sterling.. You are our only hope...

brad shoemaker is my one stop shop for articles on feminism and violence
 
MGS2 was clearly and intentionally gimped. They just took the PS2 texture files and compressed them again for the Xbox version.
That said, Xbox versions were consistently better. 480p (or 720p) vs. 480i and better frame rates being the biggest differences usually.

How is that gimped when it ran better?
 
How is that gimped when it ran better?

?
The Xbox version ran worse. (In terms of frame rate, at least. It was running at 640x480 instead of the weird sub-SD resolution the PS2 version was running at.)
 
My friend has been telling me you can't really see a difference, but I can. I had the beta of Destiny for both my X1 and PS4. I used the same exact TV settings for both. In comparison, the X1 version looked like I was viewing it through a slightly fogged window, while the PS4 version looked like it was through a window that had just been cleaned with Windex. It really did.

So if I have the option, I will buy the higher resolution version.

That's weird, isn't Destiny 1080p on Xbox One too?
 
The salty PS4-only owners are making this great. I love my PS4 and XBone. I'll be picking this up on PS4, assuming performance is better. But if it's not, I'll pick it up on the XBone, as I haven't played much of anything on it in a while.


General question... would XB1 owners be "salty" if their version of a game was gimped for parity with the Wii U?
 
MGS2 was clearly and intentionally gimped. They just took the PS2 texture files and compressed them again for the Xbox version.
That said, Xbox versions were consistently better. 480p (or 720p) vs. 480i and better frame rates being the biggest differences usually.

That game was specifically tailored to the PS2 hardware, it wasn't as simple as copying and pasting, so it suffered in the translation.
 
This was possibly my game of the year and my most anticipated but on a matter of principle alone I canceled this preorder. Oh well, I have more Dragon Age Inquisition time now.
 
If they were CPU bound, why talk about choosing parity? With both CPUs being roughly the same, it would make sense that both are crippled to similar degrees.
 
You're right. As you can see, I edited my post. That part was unnecessary, but I still can't see how anybody can support this.

Ahhhh, we've all been there. I called someone here an unfunny pile of stupid once and it has haunted me ever since. You're right, though, about it being tough to see how someone can support this.
 