Assassin's Creed Unity - PC Performance thread

Corpekata

Banned
A shame the shitty performance is going to be the legacy of this game because I am sort of floored at how much I'm enjoying the gameplay.
 

iNvid02

Member
just played a bit and it looks really good, if only they could sort out these pop-in/LOD issues with this engine. still, an open world game doing this in year one makes me excited to see what linear games from other devs will be pushing.

also, god damn this gtx 980 at full load is quiet.
 

Herne

Member
Getting okay performance out of my i5 2500, 12GB and Radeon 7970 with everything set to ultra but shadows (low) and anti-aliasing (2x msaa), but I'm experiencing a lot of noticeable pop-in and bugs like characters appearing and disappearing in and out of the ground. Haven't tested the frame rate yet but it's a tad choppy. Hope the next patch comes out soon...
 

Melpontro

Member
Getting about 30 - 60 FPS maxed out on my 970 paired with a Core i5 2500k at 4.8 GHz. I'm pleased with the performance for now; Unity has one of the best Global Illumination implementations I have seen:
[screenshots]
 
Looking at the Russian site, this is the breakdown of the recent games, so that you can see the difference. Similar presets between games, almost identical hardware (Mordor's setup is very slightly weaker).

Min / Max:

Unity: 33 / 39 (900p, 22-30 fps)
Advanced Warfare: 85 / 96 (1080p, 50-60 fps)
Mordor: 54 / 66 (1080p, 30 fps)

So basically Unity runs at half or less the framerate of its contemporary games.

We know that even on consoles there are differences, but they're smaller than the performance gap on PC.
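For what it's worth, the "half or less" claim can be sanity-checked with a throwaway script; the figures below are just the min/max numbers quoted above, nothing else is assumed:

```python
# Min/max fps from the GameGPU comparison quoted above,
# measured on near-identical hardware at similar presets.
games = {
    "Unity": (33, 39),
    "Advanced Warfare": (85, 96),
    "Mordor": (54, 66),
}

unity_min, unity_max = games["Unity"]
for name, (fps_min, fps_max) in games.items():
    if name == "Unity":
        continue
    # Ratio of Unity's framerate to the other game's, at min and at max.
    print(f"Unity vs {name}: "
          f"{unity_min / fps_min:.2f}x (min), {unity_max / fps_max:.2f}x (max)")
```

This gives roughly 0.4x against Advanced Warfare and 0.6x against Mordor, so "half or less" holds against Advanced Warfare and is close against Mordor.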
 

Kevyt

Member
Getting about 30 - 60 FPS maxed out on my 970 paired with a Core i5 2500k at 4.8 GHz. I'm pleased with the performance for now; Unity has one of the best Global Illumination implementations I have seen.

Are those Ultra settings? That seems to be the same performance I'm getting. It reminds me of Tomb Raider DE on PS4, with the variable frame-rate. Also, is it just me, or does the lag not feel as bad? I mean, I don't have an Asus ROG Swift monitor with G-Sync, but when the frame rate drops and I'm in the middle of action, like running or fighting, it doesn't feel as bad as I thought it would. This is in contrast to other titles like State of Decay, where sudden frame-rate drops can be felt and in some situations the controls don't feel responsive. That's not the case with Unity.

whats with the MASSIVE stuttering my god. i am literally dropping to 1fps with a 4790k, 32 gb ram and a 680..

It could be because of VRAM.
 

Melpontro

Member
Are those Ultra settings? That seems to be the same performance I'm getting. It reminds me of Tomb Raider DE on PS4, with the variable frame-rate. Also, is it just me, or does the lag not feel as bad? I mean, I don't have an Asus ROG Swift monitor with G-Sync, but when the frame rate drops and I'm in the middle of action, like running or fighting, it doesn't feel as bad as I thought it would. This is in contrast to other titles like State of Decay, where sudden frame-rate drops can be felt and in some situations the controls don't feel responsive. That's not the case with Unity.



It could be because of VRAM.
Yeah, for whatever reason, even when it drops it doesn't bother me too much in this title. Usually I'm very sensitive to FPS drops, but they're tolerable here. Much better than Watch_Dogs, which I had to lock to 30 because the drops bothered me too much.
 
I was waiting for this chart in particular:

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Assassins_Creed_Unity/test/ac_proz.jpg


Seems like the game scales really well up to 6 cores. As I expected from a "next-gen" engine in such a CPU-heavy game. 5820k looking good :p

Also, the results on low-end CPUs are much better than I thought they would be. i3 comfortably beating the consoles :p

This might prove the opposite, though. If a standard, non-overclocked 2500k can run the basic simulation at a MINIMUM of 60fps at max detail, then it means the CPU is basically a non-issue for this game.

The 2500k, without any overclock, is plenty to run Unity at its max desirable framerate.

So it would mean that the great majority of us are instead entirely bottlenecked by the video card.

The interesting chart is the one that is missing: seeing how all the video cards behave on a 2500k, to see how much they lose on a weaker CPU.
 
Well'p, decided to go ultra, TXAA, 1440p locked to 30fps. Not smooth, but it's consistent. Tried 1620p and, although it stayed at 30fps for the short time I tested it, I started getting strange graphical glitches almost immediately, like seams in between textures and lighting effects that moved with me as I walked around.

i5-3570k @ 4.4ghz
16gb ddr3 @ 2133mhz
Gigabyte G1 gtx 970 sli
250gb ssd - games
100gb ssd - OS
Win7 64
 
The new SLI profile is giving me significantly better performance, from 50-75 fps in the starting town to 65-80+, and one of the graphical glitches I was getting seems to have gone away (a white flash when the camera changed quickly).
 
I'm fine with the performance. I locked the framerate to 30 in the driver and can set everything to high: PCSS shadows, FXAA, and 1080p.
The only problem is the cutscenes, where the framerate drops below 30.
System :
i5-2500 non-oc
770 2gb
8gb ram

Worth noting that "first impressions" are misleading, in the sense that the framerate tanks greatly in the mid-game compared to when you start. So you get the illusion (hence so many optimistic reports here) that the game actually runs well.
 
The new SLI profile is giving me significantly better performance, from 50-75 fps in the starting town to 65-80+, and one of the graphical glitches I was getting seems to have gone away (a white flash when the camera changed quickly).

I'm still getting that white flash, and I'm pretty sure I have the updated sli profile. =/
 

Lulubop

Member
So uh, first areas as adult Arno

Getting 70-80 fps...

Environmental Quality at Very High

Shadows at high

Textures at Ultra

FXAA

HBAO+

Bloom

Can't complain.

780ti
3570k Stock
@ 1080p.

Runs and looks better than Watch_Dogs for me.
 

rBose

Banned
They won't ever bother fixing shit on PC because they don't give a shit about PC. They'll mostly fix some LOD and crash issues, but that's all; don't expect the game to miraculously get a 30 fps boost from 2-3 patches.

It's not going to happen. People need to learn and understand that they need to stop preordering bullshit based on E3 hype. We all know what those E3 presentations are: prototype builds pumped with a high number of polys, running on God knows how many Nvidia K6000s, that will never be available in the final build.

It's going to be fixed on consoles because that's their market, but on PC?
 

Renekton

Member
This might prove the opposite, though. If a standard, non-overclocked 2500k can run the basic simulation at a MINIMUM of 60fps at max detail, then it means the CPU is basically a non-issue for this game.

The 2500k, without any overclock, is plenty to run Unity at its max desirable framerate.

So it would mean that the great majority of us are instead entirely bottlenecked by the video card.
It doesn't prove the opposite. Durante said the game scales with CPU, and that's what the chart shows once the GPU limitation is removed (SLI 980s).

"Max desirable" is just an arbitrary cut-off point you made.
 

Kevyt

Member
They won't ever bother fixing shit on PC because they don't give a shit about PC. They'll mostly fix some LOD and crash issues, but that's all; don't expect the game to miraculously get a 30 fps boost from 2-3 patches.

It's not going to happen. People need to learn and understand that they need to stop preordering bullshit based on E3 hype. We all know what those E3 presentations are: prototype builds pumped with a high number of polys, running on God knows how many Nvidia K6000s, that will never be available in the final build.

It's going to be fixed on consoles because that's their market, but on PC?

That's a shame...

But the game has problems on both PC and console. I think they ought to patch both versions. I actually think Ubisoft Kiev didn't have enough time to optimize the game as much as they would have wanted. I feel like Unity is a bit rushed. Just my two cents.
 

def sim

Member
7970 GHz Edition 3GB VRAM 1150 core 1600 mem
3570k @ 4.5GHz

Environment, Texture, and Shadow are set to high. SSAO, FXAA, and bloom.

I'm just trying this out until my new card comes in. The framerate hangs around 40 and jumps up and down depending on where I am or if I'm in a cutscene. Mid 20s does happen. I have not yet reached Paris.

I've been playing the console version so my expectations were low. This is, so far, absolutely so much more playable than the mess on XBO or PS4.
 
They won't ever bother fixing shit on PC because they don't give a shit about PC. They'll mostly fix some LOD and crash issues, but that's all; don't expect the game to miraculously get a 30 fps boost from 2-3 patches.

It's not going to happen. People need to learn and understand that they need to stop preordering bullshit based on E3 hype. We all know what those E3 presentations are: prototype builds pumped with a high number of polys, running on God knows how many Nvidia K6000s, that will never be available in the final build.

It's going to be fixed on consoles because that's their market, but on PC?
What a wonderfully well-thought-out and enlightening post.

Somewhat related: I'm going to wait for AMD to get their drivers in order before I attempt to play the game again. It's fantastic, but I can't handle the framerate I'm getting right now. Worst case, I put it on the back burner until Jan/Feb when I build a new PC.
 
It doesn't prove the opposite. Durante said the game scales with CPU, and that's what the chart shows once the GPU limitation is removed (SLI 980s).

"Max desirable" is just an arbitrary cut-off point you made.

I simply meant there's virtually no reason to have something bigger than a 2500k, or the massive overclocks that people here sport.

Of course 60 fps is an arbitrary cut-off, but it also works for most people. Moreover, Unity is a borderline case in CPU usage, so you can expect upcoming games to use LESS CPU (unless game companies really love their games running at 20 fps on PS4, and I don't think that's a mistake they'll repeat).

So saying that buying a new CPU pays off when it comes to gaming is still largely false. Unless 60 fps is still not enough for you.

Bottom line: even Unity is largely fine on a stock 2500k and above, CPU-wise.
 
TB gave up on Unity because of the performance issues, with some crashing as the cherry on top. This is with his new rig.

Here he talks about his experience with the game: https://www.youtube.com/watch?v=SgpzT5V5Mgs
Holy shit the game has terrible bugs. Everything is glitching out of place. I honestly can't remember a game this buggy since getting into gaming. I thought that gaf was just dramatizing things but this is ridiculous. Sucks as the game looks stunning.
 
Here's a short Shadowplay video I captured that shows the performance I'm getting with the new SLI profile. The video itself doesn't play back very smoothly (I'm not sure why), but aside from a couple of big hitches that you can see in it, the game actually runs very smoothly in that starting area.

1080p everything maxed except I'm using 4xMSAA
https://www.youtube.com/watch?v=iNwEsw35fJQ&feature=youtu.be

Running on:
i7 5820K @ 4.1 Ghz
16 GB DDR4 RAM
2x Gigabyte GTX 970 G1 Gaming
2x MX100 512 SSDs
MSI X99S SLI Plus MB

Edit:
Unsurprisingly, I guess, the 60fps video runs much better.
 

HariKari

Member
The new SLI profile is giving me significantly better performance, from 50-75 fps in the starting town to 65-80+, and one of the graphical glitches I was getting seems to have gone away (a white flash when the camera changed quickly).

Yeah, seems better for my setup as well (2x770 2GB). I highly recommend the special "smooth" v-sync setting for SLI setups. Cured pretty much all the stuttering. The new profile downloads automatically if you check for updates IIRC. It isn't included in a driver release or anything.
 
Yeah, seems better for my setup as well (2x770 2GB). I highly recommend the special "smooth" v-sync setting for SLI setups. Cured pretty much all the stuttering. The new profile downloads automatically if you check for updates IIRC. It isn't included in a driver release or anything.

Does it tell you that it downloaded an updated profile or is there any way to tell if you have it?
 

Darkman M

Member
Getting about 30 - 60 FPS maxed out on my 970 paired with a Core i5 2500k at 4.8 GHz. I'm pleased with the performance for now; Unity has one of the best Global Illumination implementations I have seen.

This is my experience as well with a 2600k @4.4 and a 970.
 
Could someone explain the oddities of these reports?

If the game is not CPU-bottlenecked, considering a 2500k can do 60 fps at max detail, then we assume the video card is what's relevant here.

Yet, if this is true, it means resolution is the biggest factor in performance.

So why have I read that running the game at a low resolution only gives a handful of fps more? If lowering the resolution doesn't MASSIVELY help, it means we are CPU-capped.

So how come there's so much inconsistency in these reports?
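The diagnostic being described, lower the resolution and see whether the framerate moves, can be sketched as a simple heuristic. The function name, the 1.25x threshold, and the example numbers below are my own illustrative choices, not anything measured:

```python
def classify_bottleneck(fps_native: float, fps_low_res: float,
                        threshold: float = 1.25) -> str:
    """Rough bottleneck heuristic: drop the resolution and re-measure fps.

    If the framerate barely moves, the GPU wasn't the limiter, so the CPU
    (or the engine itself) is the cap; if it scales up noticeably, you
    were GPU-bound. The 1.25x threshold is an illustrative guess.
    """
    if fps_low_res >= fps_native * threshold:
        return "GPU-bound"
    return "CPU-bound (or engine-limited)"

# Examples with made-up but thread-typical numbers:
print(classify_bottleneck(20, 32))  # dropping resolution helped a lot
print(classify_bottleneck(40, 43))  # dropping resolution barely helped
```

Where you set the threshold decides the call, which may be exactly why the reports in this thread disagree: a 10-15 fps gain from a big resolution drop sits in the ambiguous middle between the two cases.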
 
Could someone explain the oddities of these reports?

If the game is not CPU-bottlenecked, considering a 2500k can do 60 fps at max detail, then we assume the video card is what's relevant here.

Yet, if this is true, it means resolution is the biggest factor in performance.

So why have I read that running the game at a low resolution only gives a handful of fps more? If lowering the resolution doesn't MASSIVELY help, it means we are CPU-capped.

So how come there's so much inconsistency in these reports?

Probably because the game is a mess and every PC seems to run it differently performance-wise. I get 50-60 at 1440p but 20-25 at 4K with 780 Ti SLI. Also there was a portion in the game where the framerate randomly dropped to about 20 fps on 1440p and lowering it to 900-fucking-p only got me about 10-15 frames more.
 

HariKari

Member
Also there was a portion in the game where the framerate randomly dropped to about 20 fps on 1440p and lowering it to 900-fucking-p only got me about 10-15 frames more.

The game does horribly when you switch settings in-game and then try to play. Restart, and the new settings are much, much more effective.
 

Kevyt

Member
Anyone with a 7850 2gb and a i5-2500k try this yet? Or something similar to these specs?

Yes, somewhat similar (r9 270 which is a renamed 7850 paired with a 4790k).

Anyways, here are my previous posts on my experience with it.

I just tried the game on my Sapphire R9 270.

I had to set environment detail to medium, shadows to low, no SSAO, FXAA for AA, bloom off, and V-sync off. I was getting 30-40 fps in the very first sequence. The game isn't that bad; it still looks better than on consoles. I was also using a controller, so the lag didn't feel as bad as when playing with mouse + keyboard. Overall, it's not bad. I can't wait to get to Paris.

Oh and this is without beta drivers. I'll install those and see if my performance improves.

I got to the Paris sequence and the performance was atrocious. I was getting 30-40 fps on my 270, but when I moved the camera around and ran for a good distance, the game would stutter a lot and the frame rate would drop to 15 fps. This is with environment set to medium, the other settings on high, FXAA, bloom off, and V-sync off, running at 1920 x 1080. I switched to 1600 x 900 with everything set to low and only managed to gain 10 fps. There's not a lot of difference between low and high; there should be more options. But yeah, even at the lowest settings at 900p the performance is still terrible. Too much stuttering. I did not have a chance to download the AMD beta drivers; I'll do that tonight and see if it makes a difference.

This is my last report with my r9 270. It's pretty much unplayable. I have tried lowering all the settings, installing beta drivers, and running the game at 720p and the game stutters a lot. It's insane, frame-rates will drop so violently when running around the city that it's just atrocious. In addition, during combat, the game stutters a lot too. I was getting 30-50 fps, but while running, jumping and climbing things the frame rate drops to the 20s. So... I don't think my 650m will cut it. I won't even bother. Now it's time to try my zotac 970.

My biggest issue is the stuttering; I can't stand it. I think it has to do with the 2 GB of VRAM. It got really bad for me in Paris, even at low settings, when running and fighting guards. I hope the new AMD drivers improve performance.
 

def sim

Member
I arrived in Paris. Performance seemed the same, so I changed Environment and Texture quality to ultra. The rest remained as is.

Framerate hangs around low to mid 30s with the most taxing area (Palais de Justice crowd) going to 28. Minimum framerate is 25 and only in some cut scenes. My hardware does not like stained glass; the stutter while climbing around on one is present here as it is on console.
 
Another thing to test: does environmental quality (or any other setting) have an effect on crowd pop-in, both pop-in from nothing and transitions between LOD levels?

Or is the static environment the only thing you have control over?
 

Evo X

Member
What kind of performance should I expect with a 4670k @4.4ghz and a single MSI GTX 980 TF at ~1500mhz?

What resolution do you play at?

I've got similar hardware and I'm running everything max, except for shadows on High and FXAA at 2560x1440.

Gameplay in Paris fluctuates from 45-60 usually, although cut scenes go down to 30 sometimes. GSync is a godsend in this game.
 
Yes, somewhat similar (r9 270 which is a renamed 7850 paired with a 4790k).

Anyways, here are my previous posts on my experience with it.

My biggest issue is the stuttering; I can't stand it. I think it has to do with the 2 GB of VRAM. It got really bad for me in Paris, even at low settings, when running and fighting guards. I hope the new AMD drivers improve performance.

Thanks for the info, I hope some AMD drivers come out soon.
 