Assassin's Creed "Parity": Unity is 900p/30fps on both PS4 & Xbox One

Status
Not open for further replies.
True, but I mean, GAF is still extremely console-centric, so that can't really be a surprise.
Well, it's also that people get frustrated over resolution/fps locking specifically, as those should scale easily for 3D games (especially resolution; fps can be trickier due to physics breaking or whatever weird surprises a game has in its code). When we see true parity, like with Dark Souls (well, close enough, since the FPS could be ironed out on the right computer), everyone becomes livid.

Well, you still saw a few people playing down the PC advantage and acting as if people were being entitled babies, but they were probably far fewer in number, because no, seriously, what the hell.
 
Again, how is software making hardware "weaker"? It's the same dumb logic blind Xbox fanboys use with "DX12 will close dat gap" making the Xbox One "stronger". The Matt guy clearly said "PS4 CPU is more powerful" and people ran with that statement as if it were fact, when the fact is that the Xbox One's CPU is actually slightly better from a hardware point of view, due to its slim overclock. Memory systems have nothing to do with CPU performance; they change the overall picture, but not the CPU performance being discussed here.

What?

Matt clearly says you "can get more out of the PS4 CPU than the XB1 CPU"

Yes, you can get more out of the PS4's CPU than you can the Xbox's.

How is that the same as saying

The Matt guy clearly said "PS4 CPU is more powerful" and people ran with that statement as if it were fact,

It could easily be on the software side of things
 
This entire thread, all 118 pages, is hilarious to me, by the way

I truly believe that, in the same way we dig up that Uncharted 3 review thread and say "Oh my god, that's so embarrassing, I can't believe GAF was like that back then", we're going to open this thread (and those similar to it) in a few years and have the same reaction
 
Again, how is software making hardware "weaker"? It's the same dumb logic blind Xbox fanboys use with "DX12 will close dat gap" making the Xbox One "stronger". The Matt guy clearly said "PS4 CPU is more powerful" and people ran with that statement as if it were fact, when the fact is that the Xbox One's CPU is actually slightly better from a hardware point of view, due to its slim overclock. Memory systems have nothing to do with CPU performance; they change the overall picture, but not the CPU performance being discussed here.

The reason I am even arguing is that recently there has been huge hypocrisy on NeoGAF: people get outraged at the slightest sign of devs improving the Xbox version's resolution, yet when the PS4 version gets upgrades you don't see similar outrage from the Xbox crowd or the PC crowd (see Destiny, Diablo 3 and now Unity). People have become so obsessed with resolution that they ignore every other difference, be it effects, draw distance, LoD or AA. It has come down to "the PS4 version needs a better resolution than the Xbox one or no buy", despite differences in framerate stability, AA quality and post processing.


Wait, how is a downgrade the same as an upgrade? In the instances you brought up it was the Xbox One version that was given a bump, so why would there be outrage? This is the next game in the series and we're seeing a downgrade; that's what's pissing people off. That, and the fact we've been told in plain words they did this for parity.
 
Cancelled as soon as I heard about it. Took my son to school, walked to EB Games and transferred the funds from Unity to GTAV.


Gaming hero

 
I don't really get the blowout

Destiny did this?

is it because they had the balls to come out and say "we are going for parity", to stop all the claims and counterclaims from both camps?

if, as many believe, Destiny was gimped, then why no backlash over that... (and whilst there were some grumbles, it was certainly not drama on this scale)

I am not saying it is not frustrating, but Ubisoft are like EA... you dance with the devil!

oh, and if no one buys it on the ps4 they will get the message, but we all know that is not going to happen...

also, this has been going on for years against the PC; was that ok?

Don't think Destiny is really pushing too hard, though... I think we can say it's a last-gen standard game, up-ressed with some extra effects thrown in for the next-gen version.
And they were planning to make the PS4 1080p and the X1 900p (therefore pushing both systems to their max); it was only after the Kinect reserve could be turned off that 1080p became possible in Destiny.

I guess with AC:U, in fairness, we don't know if they can push it to 1080p on PS4 or not, seeing as it seems more of a game made with next-gen specs in mind, but I'm betting they can. Seeing as the PS4 is around 40% more powerful, I expect games to make use of that power.
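For what it's worth, the resolution gap maps directly onto pixel counts; here's a quick back-of-the-envelope check using nothing but the standard resolution definitions (the comparison to the "~40% more powerful" figure is my own rough framing, not a claim from either platform holder):

```python
# Pixels per frame at the resolutions being argued about.
res_720p = 1280 * 720     # 921,600 pixels
res_900p = 1600 * 900     # 1,440,000 pixels
res_1080p = 1920 * 1080   # 2,073,600 pixels

# 1080p pushes 44% more pixels per frame than 900p...
print(res_1080p / res_900p)   # 1.44
# ...roughly in line with the ~40% GPU power gap often cited for PS4.
# And 900p is itself ~56% more pixels than the leaked 720p shots.
print(res_900p / res_720p)    # 1.5625
```

So a ~40% hardware gap is exactly the kind of difference that would normally surface as 900p versus 1080p, which is why identical resolutions draw suspicion.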
 
Kinect doesn't change the equation in the benchmark; Kinect's removal did not impact the CPU reservation. The benchmark also gives results on a per-core basis. Results are written out, but only as a 12-14MB/s stream, which does not stress the RAM at all. It pretty obviously suggests that, for whatever reason, each Xbox One Jaguar core at 1.75GHz can do less work than one PS4 Jaguar core at 1.6GHz. I maintain it is likely the hardware virtualization on the Xbox One sapping performance.
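The per-core comparison being described can be illustrated with a trivial microbenchmark: time a fixed, CPU-bound workload on a single thread and compare throughput between machines. This is only a hedged sketch of the idea; the function name and workload here are made up, and this is not the actual test under discussion (which streamed generated data out at 12-14MB/s):

```python
import time

def single_core_throughput(iterations=1_000_000):
    """Time a fixed ALU-bound loop on one thread; return iterations/sec."""
    start = time.perf_counter()
    acc = 0
    for i in range(iterations):
        acc = (acc * 31 + i) & 0xFFFFFFFF  # cheap integer work, little RAM pressure
    elapsed = time.perf_counter() - start
    return iterations / elapsed

# Run the same workload on two machines; the ratio of the scores is the
# per-core comparison. Clock speed alone doesn't decide the outcome:
# OS reservations and virtualization overhead also shape the result.
print(f"{single_core_throughput():,.0f} iterations/sec on this core")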

My guess is that Kinect was also an integral part of the running OS, which MS could have changed.
I don't need you to explain the test to me; to me it is still lacking in details, as we need far more data to see why there is a difference. Of course it could be the virtualization layer blocking the CPUs, but don't Sony and MS dedicate 2 cores to OS purposes? That wouldn't stop the other 6 cores from running "free". Or maybe that's not true and it's a task for the scheduler, where we also don't have any details.
Either way, the test is so sparse on info that it tells us close to nothing, in my opinion.
 
Destiny has a nice art style, but it still is, and looks like, a last-gen game. Unity vastly outdoes Destiny on a technical level.
Talking strictly graphically, I care more about the art style and the game looking clean and running smoothly. Unity is a mess from what we know; even though its animation model and lighting and whatever may be more demanding, it doesn't really look like a big step up from AC4 (it arguably looks worse).

With Destiny, what I assume happened is that they had a modest target they wanted to hit, and they hit it on both consoles. They didn't go beyond that or take advantage of the specific systems, but the end result was satisfying.
 
Thanks, some people just don't get it, though; it's the principle rather than the res, and I won't give money to a company that does this.

I love AC games, so I can't bring myself to not play it, but I'll be getting it second hand at Christmas instead of day 1. Good on you though, bud.
 
Wait, how is a downgrade the same as an upgrade? In the instances you brought up it was the Xbox One version that was given a bump, so why would there be outrage? This is the next game in the series and we're seeing a downgrade; that's what's pissing people off. That, and the fact we've been told in plain words they did this for parity.

Have you seen those threads? There was outrage, and that's what was so hilarious. People got angry that the Xbox One version got a bump: overnight, from being happy with a solid 1080p, 60FPS version (in the case of Diablo 3), people got upset that they were now at the same resolution as the Xbox One, even though it doesn't have as stable a framerate.

For Unity it was very likely to be 900p on PS4 from the get-go: Watch Dogs was 900p, and the Unity internal beta screens that leaked were 720p, with later ones at 798p (according to pixel counters). So clearly the Xbox One was underperforming before. I would assume they actually pulled the Xbox One version up to 900p, rather than "downgrading" the PS4 from 1080p to 900p. I am not sure why they chose the wording about parity; I am 99% sure there will be differences in object detail, post processing and AA between the two versions anyway, the only parity being 30FPS and 900p. Which is exactly what's wrong with the gaming community lately: suddenly resolution became more important than anything else.
 
The idea that Microsoft DIDN'T somehow do this is pretty out there. Sony didn't. Nintendo sure as hell didn't. So, y'know...

I'm arguing that the only one who reasonably did anything was Ubisoft, and what they did was make the call that optimizing for 1080p on PS4 wasn't worth the effort. Maybe the PS4 version will get some extra bells and whistles, maybe it won't. But collusion between two parties to sabotage a third is lawsuit territory, something neither Microsoft nor Ubisoft would want to get involved in.
 
Wait, how is a downgrade the same as an upgrade? In the instances you brought up it was the Xbox One version that was given a bump, so why would there be outrage? This is the next game in the series and we're seeing a downgrade; that's what's pissing people off. That, and the fact we've been told in plain words they did this for parity.

We're not seeing a downgrade, but rather a lack of an upgrade.
 
We're not seeing a downgrade, but rather a lack of an upgrade.

Considering the game is still performing poorly on Xbox One, it wouldn't be unfair to think the Xbox One version was upgraded from 720p to 900p or something along those lines. That said, if the Xbox One version was able to receive that upgrade, there's no reason the PS4 version couldn't have received a 900p-to-1080p upgrade.
 
Again, how is software making hardware "weaker"? It's the same dumb logic blind Xbox fanboys use with "DX12 will close dat gap" making the Xbox One "stronger". The Matt guy clearly said "PS4 CPU is more powerful" and people ran with that statement as if it were fact, when the fact is that the Xbox One's CPU is actually slightly better from a hardware point of view, due to its slim overclock. Memory systems have nothing to do with CPU performance; they change the overall picture, but not the CPU performance being discussed here.

The "facts" are in the evidence you keep choosing to ignore, and they point to an advantage for the PS4. If you are going to pretend to be an authority on the matter, it would behoove you to have a working knowledge of things like hardware virtualization:

Virtualization often exacts performance penalties, both in the resources required to run the hypervisor and in reduced performance on the virtual machine compared to running natively on the physical machine.

The Xbox One is literally running three operating systems in parallel. The Game OS runs on "simulated" hardware resources managed and arbitrated by the hypervisor. This has an inevitable performance penalty even with explicit hardware acceleration built into the CPU.

Granted, the Xbox One CPU hardware is "more powerful" if considered in a vacuum. But games run inside a real-life software environment, and in those situations the PS4 CPU takes the lead.
 
Again, how is software making hardware "weaker"? It's the same dumb logic blind Xbox fanboys use with "DX12 will close dat gap" making the Xbox One "stronger". The Matt guy clearly said "PS4 CPU is more powerful" and people ran with that statement as if it were fact, when the fact is that the Xbox One's CPU is actually slightly better from a hardware point of view, due to its slim overclock. Memory systems have nothing to do with CPU performance; they change the overall picture, but not the CPU performance being discussed here.

The reason I am even arguing is that recently there has been huge hypocrisy on NeoGAF: people get outraged at the slightest sign of devs improving the Xbox version's resolution, yet when the PS4 version gets upgrades you don't see similar outrage from the Xbox crowd or the PC crowd (see Destiny, Diablo 3 and now Unity). People have become so obsessed with resolution that they ignore every other difference, be it effects, draw distance, LoD or AA. It has come down to "the PS4 version needs a better resolution than the Xbox one or no buy", despite differences in framerate stability, AA quality and post processing.

Software does not make hardware weaker; software can increase or decrease your ability to fully utilise the hardware. In this case the Xbox software has more CPU overhead, which means fewer CPU resources are available for doing the important stuff.

When the PR is that the CPU is the bottleneck and they are maintaining parity to avoid the debates, that means there is GPU headroom available on both consoles. Now, if there is not enough on the Xbox One to get it up to 1080p, then fine; I would hope they turn up effects until it is balanced in terms of CPU and GPU bottlenecking. On the PS4 you have greater headroom, so you can turn on those effects and then make it 1080p; not doing so is the same as gimping the better-performing platform.

With Diablo 3, nobody really cared that they had hit parity, because it was parity across all platforms, which meant the graphics settings were at their highest; that is a perfectly acceptable scenario.

With Destiny there was some minor backlash, because it meant there was GPU headroom left on the PS4 version that could have been used to improve the IQ, and there are a few simple ways to achieve that; but ultimately, because it was at 1080p already, it was no big deal.

With AC:Unity, though, not only do we know there is GPU headroom available, but we are not even at 1080p, which is just pure laziness. If they wanted, they could either increase the resolution on the PS4 version or set the frame cap to 60 FPS (and I bet it would be locked there). I would prefer they increased the resolution to 1080p, but others prefer 60 FPS, so that is a personal choice and I would be happy with either.
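The resolution-vs-framerate trade-off above is really a frame-budget question: doubling the frame cap halves the time available to render each frame. A minimal illustration of the arithmetic:

```python
# Per-frame time budget at each frame cap discussed in the thread.
for fps in (30, 60):
    print(f"{fps} fps cap -> {1000 / fps:.1f} ms to render each frame")
# 30 fps cap -> 33.3 ms to render each frame
# 60 fps cap -> 16.7 ms to render each frame
```

That is why "just unlock it to 60" is a much bigger ask than a resolution bump: the whole frame, CPU work included, has to fit in half the time.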
 
Thanks, some people just don't get it, though; it's the principle rather than the res, and I won't give money to a company that does this.

Yeah, I don't get what's too hard to get, really. I was worried about companies doing this at the PS4/X1 launch, matching versions simply to not upset Microsoft, but I was very happy to see they were fine with pushing each system to its max; that's the best you can ask for. So this move is just unacceptable. It seems purely political, which is simply BS.
Makes me hate Microsoft more too; I'm sure they had a hand in it. This move doesn't benefit anyone other than Microsoft. More evidence of them not being worthy of a place in the games industry, ha ha. All their moves simply make one version worse, or make it not exist at all, instead of actually using their console's strengths and/or making new games for it.

Cancelled my pre-order too, but too lazy to get a screenshot and post it, ha ha.
 
Have you seen those threads? There was outrage, and that's what was so hilarious. People got angry that the Xbox One version got a bump: overnight, from being happy with a solid 1080p, 60FPS version (in the case of Diablo 3), people got upset that they were now at the same resolution as the Xbox One, even though it doesn't have as stable a framerate.

For Unity it was very likely to be 900p on PS4 from the get-go: Watch Dogs was 900p, and the Unity internal beta screens that leaked were 720p, with later ones at 798p (according to pixel counters). So clearly the Xbox One was underperforming before. I would assume they actually pulled the Xbox One version up to 900p, rather than "downgrading" the PS4 from 1080p to 900p. I am not sure why they chose the wording about parity; I am 99% sure there will be differences in object detail, post processing and AA between the two versions anyway, the only parity being 30FPS and 900p.
True, usually when we get proper parity it's from games grotesquely overshooting what the consoles are even capable of, and occasionally extra perks show up anyway, as was the case with The Hobbit downsampling on PS4. With a game that's hitting 900p/30fps, though, possibly in a situation where the XB1 was actually catching up anyway (going by those posts about older shots)... you're probably going to see differences on PS4 easily enough.
 
Considering the game is still performing poorly on Xbox One, it wouldn't be unfair to think the Xbox One version was upgraded from 720p to 900p or something along those lines. That said, if the Xbox One version was able to receive that upgrade, there's no reason the PS4 version couldn't have received a 900p-to-1080p upgrade.

But there is a reason! Ubisoft doesn't want us to debate and stuff. That's your reason! ;)
 
Of course it could be the virtualization layer blocking the CPUs, but don't Sony and MS dedicate 2 cores to OS purposes? That wouldn't stop the other 6 cores from running "free". Or maybe that's not true and it's a task for the scheduler, where we also don't have any details.
Either way, the test is so sparse on info that it tells us close to nothing, in my opinion.

All cores on the Xbox One are always going through the hypervisor. There is virtualization overhead for every single thing. Nothing is ever "running free". That's the whole point of using a VM.

The test isn't that sparse. More details would be great, but the results are persuasive as they are. You can choose to ignore them because you don't like the outcome, but we have more convincing theories and corroboration to go on than speculation that there was secret, unspoken "Kinect stuff" dragging down the CPU.
 
When the PR is that the CPU is the bottleneck and they are maintaining parity to avoid the debates, that means there is GPU headroom available on both consoles. Now, if there is not enough on the Xbox One to get it up to 1080p, then fine; I would hope they turn up effects until it is balanced in terms of CPU and GPU bottlenecking. On the PS4 you have greater headroom, so you can turn on those effects and then make it 1080p; not doing so is the same as gimping the better-performing platform.

That's what I said before it got sidetracked into "the PS4 CPU is better, period". CPU performance would bottleneck the framerate, just like in the case of Alien: Isolation, where a 7850, which is weaker than the PS4 GPU, can maintain 70+ FPS on average while the console version runs locked at 30FPS.

I have no idea how the CPU would bottleneck the resolution, though. As I said, that was some horrible PR on all fronts, from the talk about parity to the whole confusion over CPU bottlenecking.
 
Have you seen those threads? There was outrage, and that's what was so hilarious. People got angry that the Xbox One version got a bump: overnight, from being happy with a solid 1080p, 60FPS version (in the case of Diablo 3), people got upset that they were now at the same resolution as the Xbox One, even though it doesn't have as stable a framerate.

For Unity it was very likely to be 900p on PS4 from the get-go: Watch Dogs was 900p, and the Unity internal beta screens that leaked were 720p, with later ones at 798p (according to pixel counters). So clearly the Xbox One was underperforming before. I would assume they actually pulled the Xbox One version up to 900p, rather than "downgrading" the PS4 from 1080p to 900p. I am not sure why they chose the wording about parity; I am 99% sure there will be differences in object detail, post processing and AA between the two versions anyway, the only parity being 30FPS and 900p. Which is exactly what's wrong with the gaming community lately: suddenly resolution became more important than anything else.

The truth is we don't know either way. The problem with WD may have been that it was a slapdash engine which married previous-gen and current-gen elements and didn't perform particularly well on either system. AC Unity does look much better, but on the other hand you have no real proof that the Xbone was underperforming relative to WD. Given the clear difference in specs, if you relate them to PC equivalents, the differences we have seen in other games have been in keeping with what we might expect. The differences in architecture (unified vs split memory) work in the PS4's favour, if anything.
 
All cores on the Xbox One are always going through the hypervisor. There is virtualization overhead for every single thing. Nothing is ever "running free". That's the whole point of using a VM.

The test isn't that sparse. More details would be great, but the results are persuasive as they are. You can choose to ignore them because you don't like the outcome, but we have more convincing theories and corroboration to go on than speculation that there was secret, unspoken "Kinect stuff" dragging down the CPU.

Don't do this... Please. As a former researcher: this test tells us close to nothing, and that won't change until there is a clear statement of WHO tested WHAT in WHICH environment. And of course the result data would need to be much richer than a graph...
As to the virtual machines: yes, that is true for hardware virtualization, but this virtualization may only mean software virtualization, in my understanding. This is not VirtualBox running the Xbox OS or something, trying to mimic a whole computer. If there is a low-level ring-0 kernel running with a specifically designed scheduler, anything running above it will not stop cores from running free.
 
Considering the game is still performing poorly on Xbox One, it wouldn't be unfair to think the Xbox One version was upgraded from 720p to 900p or something along those lines. That said, if the Xbox One version was able to receive that upgrade, there's no reason the PS4 version couldn't have received a 900p-to-1080p upgrade.

They probably ran out of time to work on the PS4 version.
It makes more sense that they got the PS4 stable at 900/30 early on but the X1 was lagging behind, so they needed to spend more time working on the X1.

And if that's the case, their PR statement makes sense as well.
They didn't downgrade anything; they just didn't feel it was worth the hassle to keep working on the PS4 version.

It was already locked at 900/30 from the get-go, just like they stated.
And from a business standpoint, it will always be cheaper not to upgrade and optimize your product.

Who actually knows, right?
But let's be honest, Ubisoft is known for this kind of thing; they were the definition of lazy devs and liars last generation.
 
Well, it seems to have been all over the gaming press, so I'm not sure. Then again, after reading a few of your posts, the agenda is laughably obvious. Some people will defend and shill for anything. Blows my mind.

no consumer in their right mind should be defending this parity practice, unless they work for the respective company (or companies). besides that, I can't comprehend any reasonable person coming to the rescue of a company that's screwing over consumers. again, how would Xbox One owners feel if their games were gimped for parity with the Wii U?
 
They probably ran out of time to work on the PS4 version.
It makes more sense that they got the PS4 stable at 900/30 early on but the X1 was lagging behind, so they needed to spend more time working on the X1.

And if that's the case, their PR statement makes sense as well.
They didn't downgrade anything; they just didn't feel it was worth the hassle to keep working on the PS4 version.

It was already locked at 900/30 from the get-go, just like they stated.
And from a business standpoint, it will always be cheaper not to upgrade and optimize your product.

Who actually knows, right?
But let's be honest, Ubisoft is known for this kind of thing; they were the definition of lazy devs and liars last generation.

We'll see next month when the game releases. If the two are exactly the same in effects and performance (good or bad), I'd probably call shenanigans.
 
The truth is we don't know either way. The problem with WD may have been that it was a slapdash engine which married previous-gen and current-gen elements and didn't perform particularly well on either system. AC Unity does look much better, but on the other hand you have no real proof that the Xbone was underperforming relative to WD. Given the clear difference in specs, if you relate them to PC equivalents, the differences we have seen in other games have been in keeping with what we might expect. The differences in architecture (unified vs split memory) work in the PS4's favour, if anything.

The first leaked screenshots from the Xbox One Unity beta were all 720p; later ones were 798p. So it seems they brought the Xbox One version up to 900p, rather than downgrading the PS4 from 1080p to 900p. There should still be ~30% headroom left on PS4, which might result in better AA, draw distance and post effects. I see no reason for outrage unless the two versions are exactly the same; then you can point fingers at Ubi for not using the extra GPU power for at least a better AA solution.
 
They probably ran out of time to work on the PS4 version.
It makes more sense that they got the PS4 stable at 900/30 early on but the X1 was lagging behind, so they needed to spend more time working on the X1.

And if that's the case, their PR statement makes sense as well.
They didn't downgrade anything; they just didn't feel it was worth the hassle to keep working on the PS4 version.

It was already locked at 900/30 from the get-go, just like they stated.
And from a business standpoint, it will always be cheaper not to upgrade and optimize your product.

Who actually knows, right?
But let's be honest, Ubisoft is known for this kind of thing; they were the definition of lazy devs and liars last generation.

just how small a company is Ubisoft? they don't have the manpower to squeeze all the performance out of the PS4?
 
The first leaked screenshots from the Xbox One Unity beta were all 720p; later ones were 798p. So it seems they brought the Xbox One version up to 900p, rather than downgrading the PS4 from 1080p to 900p. There should still be ~30% headroom left on PS4, which might result in better AA, draw distance and post effects. I see no reason for outrage unless the two versions are exactly the same; then you can point fingers at Ubi for not using the extra GPU power for at least a better AA solution.

Even then, the worst thing you could accuse them of is failing to optimize for the platform; there's still no evidence that they sabotaged the PS4 version. A depressing number of people seem to think the game literally was native 1080p or 60fps, and that they just turned the settings down because MS is paying them money.
 
Do people realize this might not always be the PS4 version being pulled back for X1 parity, but rather both versions reaching the same point, with the publisher then choosing not to spend extra money adding bells and whistles to the PS4 version? Similar to lazy PC ports.

I think the key to all of this is the comment that it was held back so people wouldn't debate. Keep that in mind.
 
So refunds on PSN preorders should be granted based on false advertising.
Nah, legalese would claim that the "1080p Full HD" referenced in that text means the PS4 scales any internal resolution to 1080p output, and that you need a 1080p screen to display it.

The game's actual rendering resolution does not enter into it.
 
Even then, the worst thing you could accuse them of is failing to optimize for the platform; there's still no evidence that they sabotaged the PS4 version. A depressing number of people seem to think the game literally was native 1080p or 60fps, and that they just turned the settings down because MS is paying them money.
60 fps is interesting to me because I can imagine a lot of games locked at 30 fps could run smoother on PS4... just likely not at a stable 60 fps, and thus there's the design call to stick with a stable framerate rather than let it vary wildly for everyone.

Though you do see teams deciding to go 30 because a game's a horror title or whatever; I wonder if that was really the case for Alien: Isolation, and it apparently is for The Evil Within. But that's kind of the opposite: a game that could POTENTIALLY be 60, or mostly 60, on both platforms but isn't, because of the development team's creative calls.
 
What? You can get a refund and cancel a pre-order at any time on PSN. Go to your purchase history on the account page, find the pre-ordered game, go to details, and press "Cancel Pre-order".

Or is that an EU-only thing?
 