Performance Analysis: Assassin's Creed Unity (Digital Foundry)

You guys are still fighting console warz even though everyone with half a brain should be blaming Ubisoft for rushing this shit out, and nothing else?
Come on!

Everyone else is busy laughing their asses off in the "framerate is abysmal" thread. I laughed so hard my textures stopped loading.

But yeah, the game's performance is a travesty, and trying to drag the actual consoles into that is crazy. Ubi fucked up, end of discussion.
 
Picard-facepalm-o.gif


Well, you would have a point if the PS4 didn't have hUMA and simultaneous read/write access to the RAM for the CPU and GPU. The fact is, the single advantageous hardware feature for the XB1 is a 150MHz upclock on the CPU; it is outperformed in literally every other capacity by the competing platform. The FPS gap between versions is not proportional to a minor CPU clock boost.
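For scale (my own arithmetic, using the commonly reported clocks of 1.75GHz for the XB1 CPU and 1.6GHz for the PS4's):

```latex
\frac{1.75\,\mathrm{GHz}}{1.60\,\mathrm{GHz}} \approx 1.094
```

That 150MHz buys at most a ~9% advantage, and only in frames that are completely CPU-limited.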

This, combined with the fact that we have members going on record saying that the XB1 was the lead platform, would indicate it's an optimization issue, not the result of any hardware advantage. There's also the matter of resolution and effect parity, which essentially means the GPU advantage the PS4 enjoys is being left unused for some unknown reason. Readily available, easily utilized power on fixed hardware is sitting idle, and you're claiming some superiority for the system that served as lead platform and as such received extensive optimization. Either you're being purposefully obtuse or you're incapable of grasping the big picture. There are more factors involved here than hardware limitations.

But hey, don't just take my word for it (look below).

We have a vetted dev (Matt) who said (paraphrasing here) that there's no practical scenario where the Xbone beats the PS4 in performance. He also commented in this thread saying there's no excuse for this.
Just keep reposting every few pages.


So can we agree that, since both versions are bugged as hell, this is just a rush/$hit job from Ubi and not some money-grabbing, parity-causing deal, i.e. "MS paid Ubi to $hit on the PS4 version"? Or are we still at an impasse on the whole thing, with not a word from Ubi on the issues yet?


If they wanted to avoid debates and stuff, why wouldn't they tell everyone they're working on fixing both versions so they run in parity? Or are they hiding in their bunker, still hoping it will all just blow over and the game will sell millions regardless?
 
Because GPUs are good at maths (physics, collision, graphics) but not at decision making, which makes them pretty bad at AI and game logic.

Depends on what the AI is doing, though. In a scene with 10,000 NPCs, the AI for each NPC is going to be very simple and not at all branchy. Putting the load of 10,000 AIs onto the GPU makes a lot of sense in that scenario, as that sort of parallelism is what GPUs are amazing at.

The combat AI and other, more branchy AI-type code can still sit on the CPU, which handles divergent scenarios more effectively.
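To make that concrete, here's a toy sketch (entirely my own illustration, nothing to do with Ubisoft's actual engine) of the kind of branch-light, one-thread-per-NPC update that maps well onto a GPU:

```cuda
// Toy example (my names, not Ubisoft's): one thread per NPC, branch-light
// "steer toward a goal" update over structure-of-arrays data.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void updateCrowd(float* posX, float* posY,
                            const float* goalX, const float* goalY,
                            float speed, float dt, int count)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;

    // Every thread runs the same straight-line math: no divergent branches.
    float dx = goalX[i] - posX[i];
    float dy = goalY[i] - posY[i];
    float len = sqrtf(dx * dx + dy * dy) + 1e-6f;  // avoid divide-by-zero
    posX[i] += (dx / len) * speed * dt;
    posY[i] += (dy / len) * speed * dt;
}

int main()
{
    const int n = 10000;  // the oft-quoted NPC count
    const size_t bytes = n * sizeof(float);
    float *posX, *posY, *goalX, *goalY;
    cudaMallocManaged(&posX, bytes);
    cudaMallocManaged(&posY, bytes);
    cudaMallocManaged(&goalX, bytes);
    cudaMallocManaged(&goalY, bytes);
    for (int i = 0; i < n; ++i) {
        posX[i] = 0.0f;
        posY[i] = 0.0f;
        goalX[i] = (float)(i % 100);  // arbitrary per-NPC goals
        goalY[i] = (float)(i / 100);
    }

    // One launch per simulated frame: 10,000 NPCs in 40 blocks of 256 threads.
    updateCrowd<<<(n + 255) / 256, 256>>>(posX, posY, goalX, goalY,
                                          1.4f, 1.0f / 30.0f, n);
    cudaDeviceSynchronize();
    printf("NPC 9999 is at (%f, %f)\n", posX[n - 1], posY[n - 1]);

    cudaFree(posX); cudaFree(posY); cudaFree(goalX); cudaFree(goalY);
    return 0;
}
```

Each thread executes the same arithmetic with no divergent branches, which is exactly the shape of work GPUs are amazing at; the branchy combat logic stays on the CPU as described above.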
 
It's not just that it isn't an easy task; it's also simply not a good idea for some types of code. It's not like the GPU can do everything -- if that were the case, we wouldn't have CPUs.

These things take time; they have an engine designed around strong CPU performance, and they cannot move everything to the GPU. BUT you can run pretty much anything on a GPU if you want to; how practical and efficient it would be is debatable.

All that aside, right now this game is proof of too many cooks and of chasing sales figures. It is various levels of crap on all platforms, with poor performance and bugs well beyond the NPCs: pop-in, stuttering 1-2fps sections with nothing going on, morphing models, falling through scenery -- the list of clear signs this game was unfinished is endless. It was most likely dropped onto the consoles with what little time and effort there was going to the X1 due to MS involvement.

It really is a mess of a game and engine that they need to rethink and redesign from the ground up, as the inconsistent performance is legendary at this point.
 
Just keep reposting every few pages.


So can we agree that, since both versions are bugged as hell, this is just a rush/$hit job from Ubi and not some money-grabbing, parity-causing deal, i.e. "MS paid Ubi to $hit on the PS4 version"? Or are we still at an impasse on the whole thing, with not a word from Ubi on the issues yet?


If they wanted to avoid debates and stuff, why wouldn't they tell everyone they're working on fixing both versions so they run in parity? Or are they hiding in their bunker, still hoping it will all just blow over and the game will sell millions regardless?

I don't really think reposting is necessary, as pretty much anyone with even the slightest amount of common sense can see the truth of the matter here. It's obvious to anyone who considers the whole picture. Those who rail against it are basically just fucking themselves.
RikerStopFuckingYourself.gif

Stop it guys. It's not a pleasant sight to behold.


As far as Ubisoft goes, it's likely the latter, as it's unfortunately true that this will have minimal impact on sales for this release. The problem for them will come next year, when AC:Next is released. People who would otherwise buy on name alone won't forget the piss-poor, bug-ridden mess they were sold at release. It'll hurt them dearly on sales for the next game in the franchise. Probably.
Hopefully.
 
Depends on what the AI is doing, though. In a scene with 10,000 NPCs, the AI for each NPC is going to be very simple and not at all branchy. Putting the load of 10,000 AIs onto the GPU makes a lot of sense in that scenario, as that sort of parallelism is what GPUs are amazing at.

The combat AI and other, more branchy AI-type code can still sit on the CPU, which handles divergent scenarios more effectively.

That is a fair point. Sounds like an interesting take, splitting it up into enemy/civilian AI like that.
 
Likely the latter, as it's unfortunately true that this will have minimal impact on sales for this release. The problem for them will come next year, when AC:Next is released. People who would otherwise buy on name alone won't forget the piss-poor, bug-ridden mess they were sold at release. It'll hurt them dearly on sales for the next game in the franchise. Probably.
Hopefully.

Yeah, we'll see how smart consumers are next year. I don't have a great deal of hope, though; people will probably eat up whatever phony apology gets thrown out.

So we're in agreement then that a cake and a BJ both provide much better entertainment and satisfaction than this piece of $hit game ever could...

Oh yeah, but it's not a high bar. Running around your home in slow motion would probably provide more.
 
Maybe, but with the graphical fidelity on display in those cutscenes, you would expect a rock-solid 30fps on both consoles. They are no better than other games', so what is causing the crappy frame rates other than inefficient code?
I totally agree with you that it could look better and run better with better optimization, but given the circumstances of AC:U's launch, a resolution boost would have hurt the PS4 version more than helped it.
 
But you wouldn't run AI tasks on the GPU; it would probably be slower.

There's lots more than just AI. The point is to run things traditionally done only on the CPU on the GPU instead, freeing up the CPU to spend more time on CPU-suited things, like AI.
 
I've never seen anyone talking about the PS4 CPU being overclocked in any thread (other than speculation threads from 2013). Can you give us 2 or 3 examples of this? Thanks.

From today, in this very same thread:

Isn't the clock speed of the PS4 CPU just a guess anyway? We don't know for certain what it's clocked at?

Don't pretend it isn't easy to spot people in this forum who, to this date, refuse to accept 1.6GHz as the PS4's final clocks.


We have a vetted dev (Matt) who said (paraphrasing here) that there's no practical scenario where the Xbone beats the PS4 in performance. He also commented in this thread saying there's no excuse for this.

Wasn't that from a year ago? Several SDK updates have arrived since then, including the removal of the Kinect reservation.

You have this game showing better performance in crowded scenes on Xbox One, and better performance in cutscenes and some other emptier scenes on PS4. That says something. Most likely, you have a console that is both CPU-bound and GPU-bound (One) and a console that is slightly more CPU-bound (PS4). Lawl.

Well, you would have a point if the PS4 didn't have hUMA and simultaneous read/write access to the RAM for the CPU and GPU. The fact is, the single advantageous hardware feature for the XB1 is a 150MHz upclock on the CPU; it is outperformed in literally every other capacity by the competing platform. The FPS gap between versions is not proportional to a minor CPU clock boost.

hUMA can't prevent the missed cycles and performance loss that come from both components contending for the same RAM. It is a logic solution that adds coherent simultaneous access to the same pool of RAM for both GPU and CPU; it doesn't fix the hardware penalties.

The bandwidth the CPU steals from the GPU is above a proportional ratio, meaning that if it takes 20GB/s, the GPU has much less than the mathematical 156GB/s remainder. It isn't such a problem because the GPU still has plenty, but it's far from ideal.

The problem comes when the CPU misses enough read/write cycles to stall. The way the memory subsystem is designed, the PS4 CPU will be slightly more prone to that. Nothing that can't be prevented with proper code, especially for devs coming off the crappy PS3.
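Roughly, in symbols (my own simplification of the claim; k is an assumed contention penalty, not a measured number):

```latex
B_{\mathrm{GPU}} \approx B_{\mathrm{total}} - k \cdot B_{\mathrm{CPU}}, \qquad k > 1
```

So with B_total = 176GB/s and B_CPU = 20GB/s, saying k > 1 is exactly the claim that the GPU is left with less than the naive 156GB/s.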

This, combined with the fact that we have members going on record saying that the XB1 was the lead platform, would indicate it's an optimization issue, not the result of any hardware advantage. There's also the matter of resolution and effect parity, which essentially means the GPU advantage the PS4 enjoys is being left unused for some unknown reason. Readily available, easily utilized power on fixed hardware is sitting idle, and you're claiming some superiority for the system that served as lead platform and as such received extensive optimization. Either you're being purposefully obtuse or you're incapable of grasping the big picture. There are more factors involved here than hardware limitations.

But hey, don't just take my word for it.

Whether Xbox or PS4 is the lead platform makes no difference. They are running the same code, so everything you code for one will run mostly the same or better on the other.

Where you all say there are unused GPU resources on the PS4, I clearly see a system struggling to hold 30fps even in cutscenes, while still performing better than the Xbox. Maybe your expectations are unrealistic.

The minor CPU advantage on the Xbox is a fact, and the AC4 analysis makes sense given what we know about both consoles.
 
Ubisoft mentioned that much of the processing power goes into having all those NPCs. They have like 1,000 (or is it 10,000?) on screen at once?

I can't speak for everybody else, but I would much rather they scale back that number to give much more consistent performance.
 
Ubisoft mentioned that much of the processing power goes into having all those NPCs. They have like 1,000 (or is it 10,000?) on screen at once?

I can't speak for everybody else, but I would much rather they scale back that number to give much more consistent performance.

1080people.
 
If the game looks exactly the same on both platforms, is CPU-bound in some scenarios (where the Xbox One performs better) and GPU-bound in others (where the PS4 performs better), but the differences are mostly small, then it follows that on one platform the extra 50% of CUs (the PS4 has 18 to the Xbox One's 12) is mostly sitting idle.

Hopefully next time they will have had a chance to consider using CUs for their AI. But the Xbox One wouldn't benefit much from that, so for a multi-platform game it wouldn't have been worth it for them in the short run. Still, it's the first year, and this game has been a long time coming, so hopefully things get better. Most AI in this game is pathfinding and collision detection, and that should be something GPU compute is better at.
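For reference, the raw compute figures behind that (public specs, my own arithmetic):

```latex
\frac{18 \times 64 \times 2 \times 0.800\,\mathrm{GHz}}{12 \times 64 \times 2 \times 0.853\,\mathrm{GHz}}
= \frac{1.84\,\mathrm{TFLOPS}}{1.31\,\mathrm{TFLOPS}} \approx 1.4
```

So a purely GPU-bound scene ought to run roughly 40% faster on PS4, if those CUs were actually being fed.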
 
From today, in this very same thread:



Don't pretend it isn't easy to spot people in this forum who, to this date, refuse to accept 1.6GHz as the PS4's final clocks.




Wasn't that from a year ago? Several SDK updates have arrived since then, including the removal of the Kinect reservation.

You have this game showing better performance in crowded scenes on Xbox One, and better performance in cutscenes and some other emptier scenes on PS4. That says something. Most likely, you have a console that is both CPU-bound and GPU-bound (One) and a console that is slightly more CPU-bound (PS4). Lawl.



hUMA can't prevent the missed cycles and performance loss that come from both components contending for the same RAM. It is a logic solution that adds coherent simultaneous access to the same pool of RAM for both GPU and CPU; it doesn't fix the hardware penalties.

The bandwidth the CPU steals from the GPU is above a proportional ratio, meaning that if it takes 20GB/s, the GPU has much less than the mathematical 156GB/s remainder. It isn't such a problem because the GPU still has plenty, but it's far from ideal.

The problem comes when the CPU misses enough read/write cycles to stall. The way the memory subsystem is designed, the PS4 CPU will be slightly more prone to that. Nothing that can't be prevented with proper code, especially for devs coming off the crappy PS3.



Whether Xbox or PS4 is the lead platform makes no difference. They are running the same code, so everything you code for one will run mostly the same or better on the other.

Where you all say there are unused GPU resources on the PS4, I clearly see a system struggling to hold 30fps even in cutscenes, while still performing better than the Xbox. Maybe your expectations are unrealistic.

The minor CPU advantage on the Xbox is a fact, and the AC4 analysis makes sense given what we know about both consoles.

So let me get this straight, just to make sure I'm reading this right: you're actually arguing that platform optimization has no impact because "they are running the same code"?

RikerCANT.gif


Yeah, no. I'm done. You're clearly completely divorced from reality and common sense.
 
It's probably simple:
The X1 was the "lead" platform and the PS4 port didn't get much love.

More like the Xbox One was the main platform: developing for the lowest common denominator so nobody can bitch about better graphics on the PS4.

Bingo. That's definitely what happened.

There's absolutely no way the Xbox One version would be the "superior" one if it were the other way around, with the PS4 as the lead platform.
 
Ubisoft mentioned that much of the processing power goes into having all those NPCs. They have like 1,000 (or is it 10,000?) on screen at once?

I can't speak for everybody else, but I would much rather they scale back that number to give much more consistent performance.

Yeah, it would just make sense. Either delay it and fix it up (I'm sure some design decisions were made around the number of people), or just reduce the count. Based on the game's performance, I'd rather have a Forza 5-style crowd than deal with this.
 
Worse on PS4. I can't believe this shit. Ubisoft really are shitty devs. 10 million people working on the game and they can't use the hardware properly :(

Ubisoft development consists of putting as much shit in as possible. Optimization and bug fixing seem to always take a back seat.
 
Whether Xbox or PS4 is the lead platform makes no difference. They are running the same code, so everything you code for one will run mostly the same or better on the other.

The issue here is that the APIs are different, so the code required to issue a draw call in DX is going to be a bit different from the code required in LibGCM. The API takes your command, translates it into something the GPU will understand, and the GPU does the work. In some cases, what works well in DX will not work well in LibGCM, so if you just port your code as-is, you are giving up a lot of performance.

The minor CPU advantage on the Xbox is a fact, and the AC4 analysis makes sense given what we know about both consoles.

It is only a 9% clock speed increase; there is no way the FPS advantage in a 100% CPU-limited scenario could ever be > 9% on Xbox One. The fact that the FPS difference here is as much as 20% suggests something else is going on, and the only real way to explain it is that they did a quick port without performance tuning on the PS4 because they ran out of time.
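Spelled out (my arithmetic, assuming the usual 1.75GHz vs. 1.6GHz clocks; the ~20% is the gap reported in the analysis):

```latex
\frac{\mathrm{FPS}_{\mathrm{XB1}}}{\mathrm{FPS}_{\mathrm{PS4}}} \;\le\; \frac{1.75}{1.60} \approx 1.09 \;<\; 1.20
```

Equality holds only if the frame is 100% CPU-limited; any GPU-bound portion favors the PS4 and pulls the ratio down further.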
 
I think Sony may have made things TOO easy for developers. They can get their games into a "working state" so easily on PS4, while on XBO they have to work a lot harder with ESRAM management, etc., so developers throw WAY more man-hours at the XBO version than the PS4 version. Add in the marketing partnership and console bundling, and Ubisoft made sure to get that version running as well as possible, and didn't have the manpower to devote to improving what they had going on PS4.

It's baffling, though, why any company would want the version releasing on the console with a 2x+ larger install base to be the worse-performing one, when it seems more man-hours devoted to optimization would have improved it, given the spec differences.

In the end, though, it's a technical failure of a game. These consoles should not already be struggling to hit 30fps, and based on how poorly this game runs on PC as well, it's not the consoles' fault here.

They should have made Rogue cross-gen this time and released Unity next spring or something.
 
How are the sales of ACU in your shop? Good, OK or bad?



I'll check the numbers later tonight when I get in. I'll assume they are pretty good on PS4 and so-so on XB1, because we are selling so many AC bundles right now... But who knows; last time I checked, on Monday night, we had sold a few more copies of AW on XB1 than PS4, so it's up in the air. Everyone is starting their holiday shopping and dumping those bundles on layaway, just in case they can't get one of the BF bundles.


Let's face it: outside of forums, unless it hits more mainstream blogs/websites, nobody knows about the issues or cares. They're just pumped for a new Ass Creed game, ya know.
 
It is only a 9% clock speed increase; there is no way the FPS advantage in a 100% CPU-limited scenario could ever be > 9% on Xbox One. The fact that the FPS difference here is as much as 20% suggests something else is going on, and the only real way to explain it is that they did a quick port without performance tuning on the PS4 because they ran out of time.

It isn't necessarily a 1:1 relationship.

The game sucks and is poorly optimized, it seems, on every platform. No need for these silly conspiracy theories about MS paying people off or developers being fanboys.
 
Why are you guys paying attention to dr. apocalipsis when it's been clear for a while now that he doesn't know what he is talking about?
 
It's probably simple:
The X1 was the "lead" platform and the PS4 port didn't get much love.

This is a rational thought. To think about it logically: I think we can all agree that the game runs badly on both systems. I'm not going to put my tinfoil hat on and say this is some conspiracy between Ubi and MS to shit on Sony; that is ridiculous. MS did have marketing rights for the game, this is true. So, the way I see it, the XB1 definitely got more priority when Ubisoft was getting builds ready to show at events and to the media, because that was the deal with MS: whenever the game was shown, it had to be running on an XB1 (except, of course, E3, where very specific showcase demos were running on PC using XB1 controllers). Judging from the final release, neither version had much, if any, time for optimization, since it had to make its stupid holiday release window. The game really should have been delayed for at least another couple of months.
 
I think Sony may have made things TOO easy for developers. They can get their games into a "working state" so easily on PS4, while on XBO they have to work a lot harder with ESRAM management, etc., so developers throw WAY more man-hours at the XBO version than the PS4 version. Add in the marketing partnership and console bundling, and Ubisoft made sure to get that version running as well as possible, and didn't have the manpower to devote to improving what they had going on PS4.

It's baffling, though, why any company would want the version releasing on the console with a 2x+ larger install base to be the worse-performing one, when it seems more man-hours devoted to optimization would have improved it, given the spec differences.

Don't worry. Once they see the PS4 version selling a lot more than the Xbox One version, Ubisoft will think twice before making the Xbox One the lead platform for their games.
 
Whether Xbox or PS4 is the lead platform makes no difference. They are running the same code, so everything you code for one will run mostly the same or better on the other.

Where you all say there are unused GPU resources on the PS4, I clearly see a system struggling to hold 30fps even in cutscenes, while still performing better than the Xbox. Maybe your expectations are unrealistic.

The minor CPU advantage on the Xbox is a fact, and the AC4 analysis makes sense given what we know about both consoles.

They are not running the same code; that is ridiculous. At best they run the same algorithms with similar code.

The rest of your hypothesis is just confirmation bias; you are taking data (CPU clocks) and fitting it to the results. The best case you could build without profiling the games would be knowing the game is 100% bottlenecked by the CPU and then showing the XB1 version has a frame-rate advantage that exactly matches the percentage upclock. Of course, you don't have that, and there are so many other variables that you cannot make that case.

I could just as easily make the case that Ubi has a bundle and marketing deal with MS and therefore spent 1.5x the man-hours on the XB1 version. That is far more controversial, so no one will ever admit it publicly.

If Ubi were smart, they would have trimmed 9% of the crowds and made the two versions equal, but of course the excuse about the AI is probably just that: an excuse. The engine is probably new and a POS at this point, but hey, MS paid for some extra TLC.
 
I think Sony may have made things TOO easy for developers. They can get their games into a "working state" so easily on PS4, while on XBO they have to work a lot harder with ESRAM management, etc., so developers throw WAY more man-hours at the XBO version than the PS4 version. Add in the marketing partnership and console bundling, and Ubisoft made sure to get that version running as well as possible, and didn't have the manpower to devote to improving what they had going on PS4.

I think so too. In other words, they got lazy. In project-management terms, they optimized internal resources because they could get away with it. Ubi-Sony relations must be kind of awkward now.
 
It isn't necessarily a 1:1 relationship.

The game sucks and is poorly optimized, it seems, on every platform. No need for these silly conspiracy theories about MS paying people off or developers being fanboys.

Usually it is a lot less than 1:1 in terms of % CPU clock speed increase vs. FPS, but in very CPU-limited games such as StarCraft 2 it approaches 1:1. That is a best case, though.
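A simple frame-time model shows why (my own sketch: p is the fraction of frame time that scales with CPU clock, s the clock ratio):

```latex
t' = (1-p)\,t + \frac{p\,t}{s}
\quad\Longrightarrow\quad
\frac{\mathrm{FPS}'}{\mathrm{FPS}} = \frac{1}{\,1-p+p/s\,}
```

With s = 1.09, p = 1 gives the full +9% (the StarCraft 2 case), while p = 0.5 yields only about +4%.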
 
The issue here is that the APIs are different, so the code required to issue a draw call in DX is going to be a bit different from the code required in LibGCM. The API takes your command, translates it into something the GPU will understand, and the GPU does the work. In some cases, what works well in DX will not work well in LibGCM, so if you just port your code as-is, you are giving up a lot of performance.

Then you are blaming Sony for a worse API or compiler, not Ubi. AFAIK that isn't an issue, so I don't agree.

It is only a 9% clock speed increase; there is no way the FPS advantage in a 100% CPU-limited scenario could ever be > 9% on Xbox One. The fact that the FPS difference here is as much as 20% suggests something else is going on, and the only real way to explain it is that they did a quick port without performance tuning on the PS4 because they ran out of time.

My point is that:

a. The Xbone's CPU has an advantage in both clock speed and memory setup, just not a flat 9%; it would be more or less depending on the load scenario.

b. You can't expect linear scaling even in a heavily CPU-bound scenario.
 