Performance Analysis: Assassin's Creed Unity (Digital Foundry)

Clearly, a 9% increase accumulates across the 6 active game CPU cores.

I wonder if the crowd AI is fixed to a few CPU cores or scheduled across all the available cores as they free up.

It does not work like that, though. If the AI were locked to 2 cores, it would still only be 9% faster.

Besides, performance sucks with or without crowds, so I doubt CPU bottlenecking is the reason for the shitty performance anyway.
 
Pushing more parallel CPU work across the 8 cores and then porting it to PS4, for example, could penalize it.

That's not what optimization is... If your game demands a lot of CPU time, what are you gonna do?

Also, the Xbone has a pretty crappy CPU too; piling too much work onto it would hardly be optimizing.

It might come as a surprise, but MS's performance guidelines for the Xbone are very similar to Sony's regarding CPU usage. Bottom line: use the GPU to alleviate the CPU whenever possible.
 
Why the fuck are so many people being smug about getting this on Xbone now, when both versions are utter shit?

OK, so one version is slightly less shit, but it's shit all the same and nothing to be proud of supporting.

I'm a huge AC fan who has owned every past game across Microsoft, Sony, and PC formats, but I won't support this shit in any form.

The worst part is Ubi won't give two fucks about it if everyone else just laps this crap up.
 
Why should I care about the framerate in cutscenes? Movies are 24 fps and nobody cares.

FPS in gameplay is way more important than in cutscenes.

Where exactly in my post did I talk about the framerate?
I mentioned the perspective bug (both versions) and the inferior lighting on the XB1 version.

And LOL at the "movies are 24FPS" bit.
 
It is not additive... It is a flat 9% increase.

The clock speed difference between Xbox One and PS4 is 9%. That would mean each core has theoretically 9% more ability to process loads. That must be accumulative.
 
Some other "fun with ACU cutscenes" moments.

Perspective, how does it work?

[screenshot: acv-21msbq.jpg]



Pssssst, hey, XB1 lighting, that window isn't wide open.

[screenshot: acv-3s1skm.jpg]

It's interesting how in the cutscenes, which usually have more graphical detail but need less CPU because everything is predetermined, the PS4 is the one with 4-5 more fps. It really looks like a case of a CPU-bound game.
 
Holy shit, is this real?

well there is a whole crowd of people saying

'i dont mind sub 60 fps i dont even notice it'

which has already slid into

'i dont mind sub 30 fps i dont even notice it'

and now we get to the fun

'i dont mind that its sub 20 fps most of the time i dont even notice it'

whatever, it makes for pretty steam page screenshots and they can just brute force it on stage with titan z's, who cares how it runs
 
Can't believe the terrible PS4 performance. But alas Ubisoft have no doubt sold millions of copies so they won't care about the complaints.
 
If the game were CPU bound, dropping the resolution on the PC version would not have improved the framerate, but it did. The game is not CPU bound. The framerate dropping only when crowds are in view points to a rendering problem, since the AI would need to be simulated whether the crowds are in view or not.
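To make that logic concrete, here's a toy frame-time model (completely made-up numbers, purely illustrative): a frame takes as long as the slower of the CPU and GPU, and only the GPU cost scales with pixel count.

```cpp
// Toy bottleneck model with made-up numbers: the frame takes as long as the
// slower of the CPU and the GPU, and only the GPU cost scales with resolution.
#include <algorithm>
#include <cstdio>

int main() {
    const double cpu_ms = 40.0;        // AI, simulation, draw-call submission (resolution-independent)
    const double gpu_ms_1080p = 50.0;  // rendering cost at 1920x1080

    const double pixel_scales[] = {1.00, 0.69, 0.44};  // 1080p, ~900p, ~720p pixel ratios
    for (double scale : pixel_scales) {
        const double gpu_ms = gpu_ms_1080p * scale;
        const double frame_ms = std::max(cpu_ms, gpu_ms);
        std::printf("pixel scale %.2f -> %.1f fps\n", scale, 1000.0 / frame_ms);
    }
    // Output: 20.0 fps, 25.0 fps, 25.0 fps.
    // The first resolution drop helps because the GPU was the bottleneck;
    // once the CPU becomes the limit, dropping resolution further does nothing.
    return 0;
}
```

So "lowering resolution helped on PC" only tells you that particular rig was GPU-bound at the higher resolution; it doesn't by itself rule out the consoles, with much weaker CPUs, being CPU-bound at the same time.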
 
I've said this in almost every AC: Unity thread, but I am really disappointed I bought this, and I'm never trusting UBI again with any of its games. From now on, without a review of how the game performs, I won't be purchasing their games.
 
Some other "fun with ACU cutscenes" moments.

Perspective, how does it work?

[screenshot: acv-21msbq.jpg]

LMFAO.

After an initial WTF feeling about this game, I'm slowly warming to it as I go along, and their presentation efforts, at least in terms of design, are working on me too. But, man, some things are just unforgivable.
 
So the analysis shows:
- Xbox One has a better framerate (still not locked at 30) when running around outside
- Both consoles are roughly 30 FPS while indoors
- PS4 has a better frame rate during fights via DF? Weird
- PS4 has better cutscene FPS
- Both versions suffer from graphical glitches

Sounds like both versions are train wrecks to me. Arguing over which train wreck is better than another train wreck seems like a waste of time when we can just roll our eyes at Ubisoft.
 
If the game were CPU bound, dropping the resolution on the PC version would not have improved the framerate, but it did. The game is not CPU bound. The framerate dropping only when crowds are in view points to a rendering problem, since the AI would need to be simulated whether the crowds are in view or not.

Depends on the GPU you are using.
 
Oh Ubishit, you never fail to impress. Horrible performance on both consoles and PC. Hopefully it bombs.
It won't. :(

Actually, compared to this, the PC performance isn't that bad, and a GTX 680 with a mid-level CPU as the minimum requirement sounds logical...

I suddenly feel like saying 'jehova'
 
Something seems awry. Traversing through the crowds, the framerate disparity is 3-4 fps at best; on rooftops, during chases, the Xbox seems to have a consistent advantage. In combat the PS4 hardly drops below 30 fps while the Xbox does. Later on in the video it holds 30 fps a bit more consistently, and while going through crowded areas the Xbone drops to the 28s and 29s while the PS4 stays closer to 30.


Despite that, this game looks like it was optimized a bit more for the XBONE. I really don't think a 1080p PS4 version would have affected the PS4 framerate much from where it is currently. Also, I would not like to have seen this game before the patch. GPGPU should have been used to bridge the small CPU divide, and better AA should have been implemented on the PS4 as well.
 
The clock speed difference between Xbox One and PS4 is 9%. That would mean each core has theoretically 9% more ability to process loads. That must be accumulative.
That's not how it works. If it's using the same number of threads on both CPUs, which it should be, it's just a 9% difference.
 
Shouldn't we take into account that there are 8 cores? A 0.15 GHz bonus per core should be useful in a CPU-bound situation. No?

NO. A 9% increase is still a 9% increase when its competitor has the same number of cores. You don't multiply it.

I have two cores. You have two cores. I increase my clock speed by 20%. Does that now make me 40% faster than you? No, it's still 20 freakin' percent, because we still have the same number of cores.
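Putting numbers on it (a minimal worked example, assuming the commonly reported clocks of 1.75 GHz vs 1.6 GHz and 6 cores available to games on each console; as noted elsewhere in the thread, the PS4 clock was never officially confirmed):

$$\frac{6 \times 1.75\ \text{GHz}}{6 \times 1.60\ \text{GHz}} = \frac{1.75}{1.60} \approx 1.09$$

The core count cancels out, so the theoretical gap stays around 9% no matter how many matching cores each side has.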


Past me -
Was this facetious, or...
You realize "on each core" is irrelevant to the performance gain. A 9% clock increase on 6 usable cores, versus a competitor with the same number of cores, is still a 9% increase. More cores don't make a clock increase relatively bigger when compared against the same number of cores.
 
The clock speed difference between Xbox One and PS4 is 9%. That would mean each core has theoretically 9% more ability to process loads. That must be accumulative.

Not how it works. (1.75 × 6) ÷ (1.6 × 6) = 1.09375. As the number of cores is the same, the clock speed difference is all that matters.
 
I totally called it. I was 100% sure that the PS4 would at least match the Xbox, but to have lower performance? This is what parity truly means: they are not getting the best out of all platforms.
" Barring a few glitches here and there, overall image quality looks like a match between both PS4 and Xbox One. A 900p resolution is confirmed, while anti-aliasing, NPC count and other factors that may impact performance appear to be a complete match."
 
They shouldn't need to. The PS4 has additional CUs which can be dedicated to CPU-type tasks. This is perfect for NPCs, and would give the PS4 much more processing capacity than the XB1 while still keeping a GPU edge.

DF alluded to it in many ways, but it looks like forced parity to me...

The developer not being given time/budget to specifically optimise for GPU compute doesn't mean forced parity.
 
Wouldn't GPGPU seriously help the PS4 version with regard to its lower CPU speed? Did they not use any compute techniques to offload CPU work onto the GPU?
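In principle that's exactly the kind of work GPU compute is suited to: per-NPC updates are embarrassingly parallel. Below is a rough one-thread-per-agent sketch written in CUDA purely for illustration (the PS4 would use its own compute APIs, not CUDA, and real crowd AI with avoidance, pathfinding and animation is far more involved than this hypothetical steering step):

```cuda
#include <cuda_runtime.h>
#include <cstdio>

struct Agent { float x, y, vx, vy; };

// One GPU thread per crowd agent: steer toward a goal point and integrate
// position for one simulation step. This is only the shape of the idea, not
// how ACU's crowd system actually works.
__global__ void updateAgents(Agent* agents, int n, float goalX, float goalY, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Agent a = agents[i];
    float dx = goalX - a.x;
    float dy = goalY - a.y;
    float len = sqrtf(dx * dx + dy * dy) + 1e-6f;  // avoid divide-by-zero
    a.vx += (dx / len) * dt;
    a.vy += (dy / len) * dt;
    a.x += a.vx * dt;
    a.y += a.vy * dt;
    agents[i] = a;
}

int main() {
    const int n = 10000;  // "thousands of NPCs"
    Agent* d_agents = nullptr;
    cudaMalloc(&d_agents, n * sizeof(Agent));
    cudaMemset(d_agents, 0, n * sizeof(Agent));

    const int block = 256;
    const int grid = (n + block - 1) / block;
    updateAgents<<<grid, block>>>(d_agents, n, 100.0f, 50.0f, 1.0f / 30.0f);
    cudaDeviceSynchronize();

    std::printf("updated %d agents on the GPU\n", n);
    cudaFree(d_agents);
    return 0;
}
```

Whether that would have actually helped here depends on where the frame time is really going; if the cost is in rendering or draw-call submission rather than the AI itself, offloading agent updates to the GPU wouldn't move the needle much.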
 
My guess is this was a straight-up port from the Xbox One over to the PS4, and little attention was paid to the PS4 in particular. Though I am sure the Xbox One version could have done with more time in the optimization oven for better frame-rate performance as well.

Horrible!!!!!!!!!
 
Ubisoft really fucked up with this game.
Seems like it was totally rushed to the market to meet the holiday season deadline (while it should have been delayed like Watch Dogs last year) with the Xbox One as a clear lead platform because of marketing deals or whatever.
It's still a hot mess everywhere, but the PS4 version has been left totally unoptimized. There's really no excuse for this; they're not using the extra GPU power at all, even if that meant using GPGPU to get better performance on the CPU side.
Maybe they can improve things with patches, but this year has been absolutely terrible for Ubisoft: two big games, two failures at quality.
 
I'm still confused about people saying 'at least the game looks good'. It doesn't look good at all.

Based on the 4K PC screenshots I've seen, the 40 ft radius circle around your character can look pretty good. Beyond that, the level of detail drops off a cliff and NPCs are phasing in from oblivion.
 
Crowds require CPU time, and the CPU in the Xbox One is clocked faster. In this corner case, the game is playing to the strengths of the Xbox One rather than the PS4.

Many previous reports indicated that the PS4 CPU, depending on the task, was at least equal to (or better than) the XB1 CPU.
The PS4 CPU clock was never officially revealed either.

At most, in the "worst" case scenario, the clock difference is 10%, which would never translate into a flat 10% difference in a real-world scenario (i.e., outside of a benchmark).

How do you then explain up to a 25% difference in framerate (20 vs 25)?

At best for Ubi, it looks like sheer technical incompetence and that they simply spent less time optimizing the PS4 version. At worst, it's possibly downright sabotage of the PS4 version, which will then be "magically" corrected in a patch that shows up, gasp, after the tech analysis is published and most media have reported "PS4 has the worst version!" (hint: it was also the case for both COD and AC4 last year).
If it's option 2, it looks to me like their "shackle" on the PS4 worked a little too well (it was probably meant to achieve either complete parity or a 1-2 fps XB1 advantage that could be explained away by the "CPU offset").

/teamtinfoilhat

Disgusting.
 