Assassin's Creed Unity - PC Performance thread

Next card I buy will have at minimum 6GB of VRAM, and it'll be the fast kind, not the slow kind.

Can't run ultra textures, PFFFT. High looks bad on NPCs; it's more like medium, which makes sense given the options are just low, high and ultra.
 
Thanks, this seemed to improve input lag somewhat. Sadly, I am testing it in Versailles instead of Paris: the game just deleted my single-player progress for no reason. Part of me wants to check whether anything is left that can actually be salvaged, but the more logical thing seems to be to leave the game behind at this point.

Where is the Unity save game located usually?

Edit: I now get the following message when starting the game: "Data corrupt! Your save has become corrupted. Do you wish to overwrite it and start a new game? A - YES, B - NO"

My computer was not forcefully reset or powered down the last time I played the game. I guess this is some Uplay bug. Did syncing with Uplay corrupt my save somehow?

I had this happen on day of release, and I disabled cloud sync in the Uplay settings; I haven't had another problem since. I almost gave up when I had to play the beginning of this game 4 times.
 
Weirdly, cloud sync already seems to be disabled in Uplay. When I logged in to play the game the first time, it was as if there was no save file at all. I restarted Uplay and Steam and was asked to enter my Uplay email and password. Now when I start the game, I get the "corrupted" message. So either this is not related to cloud syncing, or my Uplay profile reconfigured/reset itself somehow.
 
After 20 hours of Unity, I'm still torn on whether I prefer 4x MSAA at a locked 30fps, or 60fps with dips to 45-55fps and no AA (I prefer both to FXAA; FXAA is too blurry). 60fps feels nice, but every time it dips it's really noticeable. 30fps at least is consistent, and goddamn, with 4x MSAA the game looks incredible.
 
ewww
 
I purchased the GTX 980. These fucking Ubisoft games. Seriously, what gives? Assassin's Creed Unity will stay locked at 60fps with everything maxed out except AA, which I have set to 2x MSAA. When it gets to a cutscene, it instantly drops to the 30-45 range. I just don't understand why this game does this. Watch Dogs stays in the 52-60fps range with everything maxed out except AA at 4x MSAA, but if I enable TXAA it drops to the low 40s.

Is TXAA so demanding that a single 980 can't handle it, or are all Ubisoft ports just fucking shit? Or could it be my CPU (i5 3570k)?

Reference point: Call of Duty: Advanced Warfare, Shadow of Mordor and Borderlands: The Pre-Sequel, all settings maxed out, give me a smooth, stutter-free 1080p/60fps experience.
 
I also deleted the most recent save game files that were in the Uplay savegames folder, because I figured they were corrupted when I had the problem. I don't know if that helps.
 
Switch to FXAA and drop shadows to High, and you should have great performance at 1080p. If you're still not satisfied, pick up a second 980.

Nobody owes you a flawless experience at max settings. This game has a hell of a lot more going on technically speaking than any of those games you listed.
 
The CPU is probably what's holding you back. The scale of this game is what separates it from the others you mentioned; it's better thanks to its scale and crowd density. Moving through a crowd, shanking somebody as you walk by, then moving on without anyone taking notice of you specifically is satisfying.
 
Reference point... Call of Duty: Advance Warfare, Shadows of Mordor, Borderlands Pre-Sequel all settings maxed out give me a smooth and stutter free 1080p 60fps experience.
Shit, you could probably downsample from 4K and get 60fps in those games. But Unity is doing so much more. All the NPCs and stuff might not be necessary, but the lighting, detail and scale really shine; Mordor is really plain and simple in comparison.
 
Man, I can't wait till I get my hands on a future graphics card. Thing is, I recently got my rig with the 760 around April, so I'll wait until next summer and see if another card gets released. But I've got my eye on that 970/980.

One thing I'll give Ubisoft credit for is controller functionality. You can plug in a PS4 controller via micro-USB and play, with vibration and PlayStation button prompts. Neat stuff for sure!
 
That's great.
 
Hopefully, G-Sync monitors (or some similar tech) become ubiquitous. It makes such a difference in that 45-55fps range.

One thing I'll give Ubisoft credit for is controller functionality. You can plug in a PS4 controller via micro-USB and play, with vibration and PlayStation button prompts. Neat stuff for sure!

Yeah, that was a pleasant surprise. I hope we see more of it, and it would be really great if it worked over Bluetooth as well.
 
I know Unity is under-optimized and all, but I'm starting to regret my 290X purchase. It doesn't seem to be that future-proof compared to the 970 and 980...
 
That's unfortunate. Yes, it definitely sounds like yet another Uplay bug. In any case, save files can generally be found at the following location:

"C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\savegames\41290769-ecea-4acf-8a7d-8d56dfb47fc3\720"
 
That's unfortunate. Yes, it definitely sounds like yet another Uplay bug. In any case, save files can generally be found at the following location:

"C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\savegames\41290769-ecea-4acf-8a7d-8d56dfb47fc3\720"

My files were found on a similar file path, but with 857 instead of the bolded part. Sadly, there were no working copies of my corrupted (still existing) save file. In a sense, I may have been better off if Uplay syncing had been activated, because there could have been a copy of the last save that was successfully uploaded.

Edit: To clarify, it was only after writing that post that I discovered I actually had Uplay syncing deactivated.
 
FYI: if you ever need to find the save-data folder for a Uplay game, create a desktop shortcut for the game from Uplay, then right-click it and check the target.

For AC:U it shows "uplay://launch/720/0" for me; the 720 is the game ID and matches the folder inside the "savegames" folder in the Uplay install location.

This method doesn't always work (for games bought on Steam that require Uplay to launch, you can't create a desktop shortcut from Uplay), but it can help.
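If you'd rather script the lookup than click through shortcuts, the folder layout described above (savegames\&lt;user GUID&gt;\&lt;game ID&gt;) is easy to walk. Here is a minimal sketch in Python, assuming only that layout; the default install path and the 720 game ID come from the posts above, and `list_uplay_saves` is a hypothetical helper name, not an official tool:

```python
import os

def list_uplay_saves(savegames_root):
    """Return {game_id: [save files]} for every game found under a
    Uplay savegames folder. Per the posts above, the layout is
    savegames\\<user GUID>\\<game ID>\\<files>."""
    found = {}
    for user_guid in os.listdir(savegames_root):
        user_dir = os.path.join(savegames_root, user_guid)
        if not os.path.isdir(user_dir):
            continue
        for game_id in os.listdir(user_dir):
            game_dir = os.path.join(user_dir, game_id)
            if os.path.isdir(game_dir):
                found.setdefault(game_id, []).extend(os.listdir(game_dir))
    return found

if __name__ == "__main__":
    # Default install path quoted earlier in the thread; adjust for your system.
    root = r"C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\savegames"
    if os.path.isdir(root):
        for game_id, files in list_uplay_saves(root).items():
            print(game_id, len(files), "file(s)")  # 720 = AC Unity
```

Back up anything it finds before answering yes to that "corrupted save" prompt; the overwrite is destructive.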
 
I'm going to post a video of a bug that's happened to me 3 times today: once this morning and then twice about an hour ago. It never happened before today. The game world freezes. I can still rotate the camera and access the map and menus, but fast traveling makes the game load forever. I get the sense that it's Uplay-related: the menu options that don't work are the Uplay-related ones, like the character customization pages that use the stupid hack currency, and the Uplay option itself. The video is taking forever to process on YouTube. Anyone else having this issue?
 
Yes! Finally. AFR-Friendly works wonders on my 295X2. Holding steady so far with everything on High, HBAO+ and FXAA at 2560x1440.

 
The new patch seems to have broken dual-GPU performance. I have two HD 7970s in CrossFire; performance was good before the patch, but now everything is a stuttering, hitching mess. In fact, I'm even getting negative scaling now.

Disabling the second GPU brings better performance for me at the moment.

Well, nothing seems to work for me. I still get very noticeable stuttering (and the added resetting of the cloth physics every time it happens) with the newest patch, Low textures and SLI disabled. I'm pretty sure Ubisoft is not going to fix it either. They'll stall with minor patches that amount to nothing until people move on from the game, and then they'll begin the cycle again for next year's AC.

This game is a fucking trainwreck. Never again, Ubisoft.
 
Oh shit, where does this game keep its saves? I just booted the game and got hit with the intro.

Try:

That's unfortunate. Yes, it definitely sounds like yet another Uplay bug. In any case, save files can generally be found at the following location:

"C:\Program Files (x86)\Ubisoft\Ubisoft Game Launcher\savegames\41290769-ecea-4acf-8a7d-8d56dfb47fc3\720"

Although if you get the message that your save file is corrupted, that probably means you are out of luck :(
 
From Ubi forums

What Ubi says, and WCCF's response to it. I don't have much to add to what WCCF said, as they hit the nail on the head, really.

But there is one thing that I will say at the end of all this.

Ubi

We are aware that the graphics performance of Assassin’s Creed Unity on PC may be adversely affected by certain AMD CPU and GPU configurations. This should not affect the vast majority of PC players, but rest assured that AMD and Ubisoft are continuing to work together closely to resolve the issue, and will provide more information as soon as it is available.

WCCF
It goes without saying that I had serious trouble believing that the entirety of the glitches present in Assassin's Creed Unity are caused by Catalyst drivers (AMD). While modern drivers can be the cause of low frame rates in certain cases, they are not usually behind texture popping and entity glitches. One of the primary selling points of Assassin's Creed Unity (from Ubisoft's marketing) was the fact that the game supported 'thousands of NPCs on screen'. Well, they were right about that, but it looks like they conveniently forgot to mention the performance hit that would ensue from using so many dynamic objects. We sent some emails and found out what is really happening:


Ubi
The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. The problem is, DX11 is only equipped to handle ~10,000 peak draw calls. What happens after that is a severe bottleneck, with most draw calls culled or incorrectly rendered, resulting in texture/NPC popping all over the place. On the other hand, consoles have to-the-metal access and almost non-existent API overhead, but significantly underpowered hardware that is not able to cope with the stress of the multitude of polygons. Simply put, it's a very, very bad port for the PC platform, and an unoptimized (some would go as far as saying unfinished) title on the consoles.

WCCF
Games should be created with the target hardware in mind. And from what I have seen so far, high-end rigs built with the likes of Titans (Nvidia) and R9 295Xs are glitching as well. So unless the Titan GPU was secretly made by AMD, I am not really sure what Ubisoft PR is on about. The game appears to be barely functional, something that would automatically merit low scores. The post-launch embargo on reviews seems to have foreshadowed the condition of the title. I really enjoyed Assassin's Creed Black Flag, but to me, Ubisoft has been making bad call after bad call lately, and their PR is heading towards a colossal train wreck. Alienating PC users is one thing, but at this rate, pretty soon even console users will be wary of their games. Still, Far Cry 4 has yet to be released, so maybe not all hope is lost yet (fingers crossed).

http://wccftech.com/ubisoft-points-finger-amd-technical-bugs-assassins-creed-unity/#ixzz3IzQdlUNs

This is what I want to address:
The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls

This is absolutely true. If they are pushing 50K draw calls, they have a problem with DX11 on both AMD and Nvidia; it's no good saying it's only an AMD problem, because the internet is full of Nvidia users with the same performance problems. Of course there would be: DirectX has some serious efficiency problems affecting draw-call performance, among other things.
That is exactly why AMD got together with 'other' developers to create Mantle, and it's why Apple, Khronos (OpenGL) and Microsoft are now following AMD's lead.
Existing APIs like DX11 are too high-level and bloated.

What Ubisoft needs is Mantle. Their partner Nvidia might not like it; be that as it may, hold AMD to their word. They said Mantle will be open, so get AMD and Nvidia together to make it work for both.

http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/mantle#overview

Link
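As a sanity check on the numbers in that quote, the draw-call claim is really just frame-budget arithmetic. Here is a rough sketch in Python, where the per-call CPU overhead is an assumed illustrative figure (real DX11 overhead varies hugely with state changes and batching), not a measured one:

```python
def max_draw_calls(target_fps, per_call_overhead_us, other_cpu_ms=0.0):
    """How many draw calls fit in one frame's CPU budget?

    target_fps           -- desired frame rate
    per_call_overhead_us -- assumed CPU cost per DX11 draw call, in
                            microseconds (illustrative, not measured)
    other_cpu_ms         -- CPU time per frame spent on everything else
    """
    frame_budget_ms = 1000.0 / target_fps
    budget_for_draws_ms = frame_budget_ms - other_cpu_ms
    return int(budget_for_draws_ms * 1000.0 / per_call_overhead_us)

# At 30 fps (a 33.3 ms frame), an assumed 1 us per call and 10 ms of
# other per-frame CPU work leave room for roughly 23,000 draw calls,
# well short of the ~50,000 the post claims Unity issues.
print(max_draw_calls(30, 1.0, other_cpu_ms=10.0))
```

Whatever the exact per-call figure, the point survives: at tens of thousands of calls per frame, submission cost alone can dominate the CPU budget, which is exactly the overhead problem Mantle-style APIs target.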
 
I played for 3 hours with High textures on 2GB of VRAM without a single dip.

That's impressive, but how is that possible unless Ubi has been very conservative with its targets? Usually when I set textures to a level my GPU can't handle, the game stutters like mad, pausing every 10 seconds or so.
I tried ultra textures, and 3GB doesn't seem to be enough for smooth gameplay.
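The 2GB-vs-3GB texture question is largely size arithmetic. A back-of-the-envelope sketch; every figure here is an illustrative assumption (real games use compressed formats and streaming, so actual footprints are much smaller than this worst case):

```python
def texture_mb(width, height, bytes_per_pixel=4, mip_overhead=4 / 3):
    """Approximate VRAM footprint of one uncompressed texture in MB.
    A full mip chain adds roughly one third on top of the base level."""
    return width * height * bytes_per_pixel * mip_overhead / (1024 * 1024)

# An uncompressed 2048x2048 RGBA texture with mips is ~21 MB, so only
# a few dozen unique ones already eat a big slice of a 2GB card.
per_tex = texture_mb(2048, 2048)
print(round(per_tex, 1))    # MB per texture
print(int(2048 / per_tex))  # roughly how many fit in 2GB of VRAM
```

Which is why the difference between the High and Ultra pools can easily straddle the 2GB/3GB line even when the on-screen difference is subtle.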
 
I know Unity is under-optimized and all, but I'm starting to regret my 290X purchase. It doesn't seem to be that future-proof compared to the 970 and 980...

1) The performance difference between a 290X and a 970 isn't that big.
2) The Nvidia GPUs were released 10 months later than the 290X, so of course they are better: a 2013 GPU vs. 2014 GPUs. Later, AMD will launch an improved GPU in 2015 that's better than the 9x0 series, then Nvidia will launch an even better GPU in 2016, and the hardware race will go on.
 
Don't. The 970/980 are actually a surprisingly small jump in performance from the 780/290 series, and they came out quite a bit later than the 290X as well. You've got similar performance, the same amount of VRAM, and Mantle (for games that support it) well before DX12 will be a viable alternative for other GPUs. In some scenarios the 290X actually performs better. It's an excellent GPU that you got for a good price. Newer, better stuff will always come out, but unless power consumption is your thing, I would frankly hesitate to even call the 980/970 "better" in any significant way, which hasn't been true of a new series' debut in a very long time. I'd say you got quite lucky going with a 290X; it's still top-notch. I own two 970s myself, so I have no personal motive to sugarcoat things. :)
 
So yea, they just tried to do too much. What we all pretty much suspected.

Shame, because if they'd just pushed the development timeline back a year, they might have had time to implement DX12. It wouldn't help the consoles much, but at least the PC version could have realized their vision.
 
Yikes :/ I wonder how they can optimize this when the issue is due to the scalability of the API they are using.
 
I would have been completely happy playing Rogue on PC this year and Unity next year.
 
There is something wrong with the savegames in this game. To achieve that performance on my rig, my fix was:

-Start the game
-Change the texture option to Ultra
-Restart the game (texture quality only changes with a game restart)
-Play with Ultra textures until a new checkpoint
-Change the texture option to High
-Restart the game
-Profit. 60fps locked on my 3-way 680 (2GB) setup.

This was done with the internet disabled. Activating my net connection after Alt+Tabbing out of the game brings the stutters back every three seconds.
 
Finally got around to playing this on my desktop computer @ 1080p

i5-4670 @ 3.4 GHz
8GB DDR3 RAM
GeForce GTX 660 2GB VRAM

Environment and texture quality on High, shadows on High, FXAA, HBAO+ and blur enabled. The game averages 30fps, dipping to the mid-20s in some scenes and going as high as 40-45 in certain areas. Like everyone else, the game lags like crazy at the stained-glass window in the church, and I have the occasional stutter here and there, but overall the game is quite smooth.
 
Quick update from me after a couple of hours of playing.
I've managed to settle on settings for now, and I'm not bothering anymore. Fuck it. :/
I'll try to enjoy the game itself rather than fiddling with the settings.

rig: Core i5-4670K (3.4GHz), 8GB RAM, GTX 970 (Gigabyte G1), 1920x1200

Settings:
- in-game AA & vsync off
- triple buffering, vsync, FXAA and 16x AF forced via NVIDIA Inspector
- everything in game on Ultra except shadows on High
- HBAO+

I'm getting 55-60 free roaming through Paris.
Falls to ~45 (rarely) when those huge NPC crowds appear (can't believe this was their main selling point, ugh).

Every 10-15 minutes (ballpark), the game just hangs (draw-call limit/issue?) for a couple of seconds, only to resume afterwards.

So that's it. I hope they fix the game ASAP, because it's truly gorgeous and I can see myself losing 50+ hours in it.
 
I have the same setup minus the 970, which I'm getting soon. Do you personally get any boost in performance if you run it at 1080p instead?
 
I don't know if I quite buy that anonymous email from Ubisoft in the WCCF article as an explanation of the LOD and pop-in issues. I think they just implemented a really shitty, quick-and-dirty method to handle those two things to salvage performance. Saying that on PC it's caused by a limitation of DX11, with draw calls dynamically culled or improperly rendered, while the exact same symptom on consoles is caused by hardware limitations, doesn't sound right. I'd guess it's how the engine is configured for this game, and the implementation is complete shit. Too many people are getting decent performance on powerful hardware for it to be hamstrung by inherent DX11 limitations.

Edit:
Also, YouTube apparently failed to process the video I uploaded yesterday to demonstrate the issue I mentioned a little earlier in the thread.
 
Just looking at the textures, they look the same to me. The shadow resolution is different.

The ground textures are higher quality in one picture. It's not really noticeable unless it's a close up or you're really looking for it. Same as in Watch Dogs.
 
Thanks for your input, guys. I guess it was an overreaction on my part...
 
i5 3770K @ 4.5GHz
GTX 670 2GB Boost Clock 1226
8GB 1600

Using the following settings, I am getting between 35-55fps in Paris with the framerate unlocked @ 1080p. It usually hovers around 41fps.

Environment Quality - Medium
Texture Quality - High
Shadow Quality - Low
Ambient Occlusion - HBAO+
Antialiasing Quality - FXAA
Bloom - Off

I have since locked the game to 30fps in Nvidia Inspector. When the framerate was unlocked with these settings, I played for around 2 hours and never saw the framerate drop below 35 outside of loading and cutscenes. Cutscenes can drop as low as 18fps. HBAO+ had very little performance impact compared to SSAO, and I had a bit of headroom anyway when I locked to 30; the visual difference is night and day. GPU utilization was at 97-99% unlocked; locked, it's around 75-81%. With High textures the game takes around 1980MB of GPU memory, and I am not getting the hitching that people are reporting. When I bumped shadows to High, I was getting dips to around 28fps.
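The lock-to-30 reasoning in this post boils down to frame-time budgets: a cap only feels smooth if your worst frames still fit inside the capped frame time. A small sketch using the frame rates reported above (`cap_holds` is a hypothetical helper, not part of any tool):

```python
def frame_time_ms(fps):
    """Frame-time budget in milliseconds for a given frame rate."""
    return 1000.0 / fps

def cap_holds(min_observed_fps, cap_fps):
    """A cap is only smooth if the worst observed frame is still
    faster than the capped frame time."""
    return frame_time_ms(min_observed_fps) <= frame_time_ms(cap_fps)

# Numbers from the post above: unlocked lows of 35 fps vs a 30 fps cap.
print(round(frame_time_ms(30), 1))  # 33.3 ms budget at 30 fps
print(cap_holds(35, 30))            # True: 35 fps lows fit a 30 fps cap
print(cap_holds(28, 30))            # False: High-shadow dips to 28 break it
```

That is also why the 28fps dips with High shadows matter: they would break the 30fps lock, while the Low-shadow floor of 35fps leaves headroom.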
 
So, I turned textures to low, restarted the game, and am still having severe freezing. I've noticed that the hair physics in the game spaz every time it freezes and objects in the distance render when it freezes too. I'm just going to assume this is an engine issue, but does anyone have any ideas as to what might be wrong?
 
That happened to me before I updated to the latest drivers. Worth a shot.

I also wanted to report that changing shadows to Low fixed the 5-10-second lockups. I played for hours and it never happened with Low shadows, which visually aren't much of a downgrade. My hip-fire guess is that something's wrong with the PCSS implementation, but I didn't test High shadows long enough to substantiate this.
 
Performance is really good on my system.
i7-4770K @ stock (will OC again later)
GTX 780 (902/954)
8GB of RAM
344.65 WHQL drivers

I wish I could record a video with Fraps, but it destroys my FPS, and I don't like GeForce Experience at all.
Even in crowded areas the framerate is high enough for me; the recent patch had a positive impact for sure.
 
I'm using latest drivers, unfortunately. I'll wait for a patch and see what happens. Thanks for the response!
 
Why do you dislike GeForce ShadowPlay? It records at very high quality with a low performance cost. You really should give it a try!

By the new patch, you mean the 1.2 one, no? For a second I thought there was a brand-new patch.
 