Assassin's Creed Unity - PC Performance thread

UnrealEck

Member
Weird how much more FPS the GTX 780 gets over the 770 at VHQ. I thought it was the extra GB of VRAM, but then I saw the dual-GPU GTX 690 2GB getting much higher performance too. I don't know what to make of that.
Anyone with a 770, either 4GB or 2GB: can you get a stable, rock-solid 30 frames with a mix of high/medium settings?

At around 1080p with everything on high, a 2GB 770 will easily average over 30 FPS. On a 4GB 770 you can turn a few settings up too, like textures.
You can get around 40-50 FPS on a 2GB GTX 770 with very high environment quality, HBAO+ and high shadows.
 

SaberEdge

Member
There's too much shimmer and image instability to put this above Ryse. Also the LOD is extremely noticeable, even worse than BF4, and the shadows have a lot of weirdness.

Despite being normally sensitive to aliasing and image quality issues, I still found myself being impressed more often in Unity than in Ryse. It's the way it animates and the density and movement in the environments more than anything. They both have excellent lighting, shading, and materials...although even here I think Unity edges it. It's true, though, that Ryse is a bit more consistent...except for the horrible looking fire and waving flags/cloth in some areas.

I honestly didn't notice any real LOD issues on buildings or the environment, mostly just the nearby LOD changes on NPCs.
 
Weird how much more FPS the GTX 780 gets over the 770 at VHQ. I thought it was the extra GB of VRAM, but then I saw the dual-GPU GTX 690 2GB getting much higher performance too. I don't know what to make of that.

GK110 has more, and more capable, shaders, just like the 690 has more shading power than the 770. Makes sense IMO.
 

Damian.

Banned
I honestly didn't notice any real LOD issues on buildings or the environment, mostly just the nearby LOD changes on NPCs.



That is my only real beef with this game, other than the NPCs changing clothes as you move towards them; the game itself looks incredible for the most part. One of the best-looking games this year, if not the best.
 

Kinthalis

Banned
The recent SLI problem is mostly a development-resources/engine-design/DirectX 11 issue. There's not much time to turn around these ports, and DX11 does not help developers use multiple GPUs. Most developers need hands-on help from NVIDIA for SLI support, so you can imagine that it'll only happen if the developers can spare resources to apply to the problem.

Microsoft mentioned at GDC that DX12 will have constructs that will allow developers to better use all GPUs in the system. They showed nothing though, so who knows if the situation will ever improve. I know that DX11.3 and DX12 will co-exist, so even good multi-GPU support in DX12 doesn't guarantee that things will improve for all Windows games.

[image]


Still only need 8GB system RAM I see.

I believe DX12 and DX11.3 will have feature parity. The main difference will be that 11.3 retains the CPU overhead while DX12 significantly reduces it.
 

SaberEdge

Member
What is so difficult to understand? Me turning down a setting from Ultra to High so I can run at 1080p/60fps is *not* the same thing as you turning down a setting from High to Medium so you can run at 762p/30fps or whatever your laptop res is. Now, don't get me wrong, I have *nothing* against people who game on lower-end PC hardware, but 1080p/60fps is a far more impressive experience, all else being equal. That is a large part of why people pay for nice PC hardware. But just because I have to turn down a setting here and there doesn't mean that suddenly 1080p/60fps isn't impressive anymore and that it's all been a massive waste of money. It will *still* look great and way better than 762p or 900p/30fps.
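To put rough numbers on that, here's a quick back-of-the-envelope sketch of the pixel throughput each configuration demands; the exact 762p laptop resolution is my guess at a 16:9 equivalent, not something stated above.

```python
# Back-of-the-envelope pixel throughput for the configurations mentioned above.
# 762p is assumed to be roughly 1355x762 (16:9); the real laptop resolution may differ.
configs = {
    "1080p/60fps": (1920, 1080, 60),
    "900p/30fps":  (1600, 900, 30),
    "762p/30fps":  (1355, 762, 30),
}
base = 1920 * 1080 * 60  # pixels rendered per second at 1080p/60

for name, (w, h, fps) in configs.items():
    rate = w * h * fps  # pixels rendered per second
    print(f"{name}: {rate / 1e6:.0f} Mpx/s ({rate / base:.0%} of 1080p/60)")
```

By this arithmetic, 1080p/60 pushes roughly three to four times the pixels per second of the 30fps configurations, which is the gap those turned-down settings are buying.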


I realize it's a conflict of interest for you, and that the more console versions of games sacrifice performance for graphics, the better the potential gains on PC will be if you've got a good PC, but I'm willing to be unselfish here in the name of people finally getting that framerate and playability should be treated as a high priority. That might actually stimulate the popularity of 120Hz gaming as well.

And yes, it would mean that games should hit 1080p/60fps easier on PC, as well. Extra GPU power can go into even higher resolutions, extra PC-specific settings, mods or even higher framerates. It would also mean that you could probably spend less if you didn't need all that and just wanted something that will perform solidly, opening up PC gaming to more people.

VR would certainly benefit with more games being designed to run at appropriate levels for VR headsets, too. That's a biggie for me.

Graphics will always get better and better. That will not change. It doesn't mean that games will stop being impressive. It's not as if 60fps for a console game necessarily has to be 'ugly', either, ya know? It would be a mere temporary step back before we get back to where we were and could then move forward again. I would gladly take that step back for all the advantages it would bring.

You make some good points. I do think that developers should prioritize a stable 30fps for the consoles. I personally don't think 60fps is necessarily a good trade-off for fixed platforms, though. That performance can be better spent in other areas, in my opinion.
 

Marmac

Banned
I bought this retail and installed from the discs. It came on 5 DVDs. After installing I booted up Uplay and entered my CD key, and everything worked smoothly. The only thing it had to download was the Day 1 patch.


Great! I'll try that. I think what I did wrong was load up Uplay while it was installing from the discs.
 

SaberEdge

Member
I'm confident that when The Witcher 3 comes out, it will look beautiful and run much better than AC Unity across a broad range of cards.

I hope you are right, but I'm not so sure you are. I predict that The Witcher 3 at max settings will look somewhat better than Unity and will be very similar in terms of its hardware demands.
 

Dr Dogg

Member
I know. That's why I said what I said.

The thing is, Jase, you're asking a question no one in here can really answer. You could try to take it from what the (poor) translation spits out and work from there, or do even better if you're a native Russian speaker. Looking at the article (and all the ones they have previously done), they take their results from MSI Afterburner.
All cards are tested to the highest quality graphics program MSI Afterburner
Now does that mean they perform their run (demoed in this YouTube video here)
https://www.youtube.com/watch?v=QtmnXYVZOeA

And on each setup, and then take the lowest number and work out the average over the period of that run as well? You'd have to be them to know, but one thing is possible: that run could stutter like Porky Pig but still report a lowest frame rate in line with what they have shown. You know you're never going to get a definite answer to your question from a bar graph only reporting FPS. Now, if there were a frame-time analysis for each run, you'd get an awful lot more information.
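To illustrate that last point, here's a minimal frame-time analysis sketch (my own illustration, not from the article); it assumes per-frame times in milliseconds, the kind of data a FRAPS frametimes log gives you:

```python
# Minimal frame-time analysis sketch: average FPS hides stutter,
# while frame-time percentiles expose it.

def analyse(times_ms):
    """Return (average FPS, 99th-percentile frame time in ms)."""
    total_s = sum(times_ms) / 1000.0
    avg_fps = len(times_ms) / total_s
    ordered = sorted(times_ms)
    # 99th-percentile frame time, roughly the "1% low" stutter figure
    p99 = ordered[int(0.99 * (len(ordered) - 1))]
    return avg_fps, p99

# Two synthetic 600-frame runs with the SAME average FPS:
smooth = [16.7] * 600                  # steady ~60 FPS
stutter = [10.0] * 540 + [77.0] * 60   # same total time, periodic 77 ms spikes

for name, run in (("smooth", smooth), ("stutter", stutter)):
    avg, p99 = analyse(run)
    print(f"{name}: avg {avg:.1f} FPS, 99th-percentile frame time {p99:.1f} ms")
```

Both runs report the same ~60 FPS average, but the second would stutter visibly; that's exactly the information a bar graph of FPS alone throws away.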
 

888

Member
I am still not seeing a mess of issues yet. Still running between 65-80fps outside of cut scenes. My biggest issue right now is that everyone has assassin abilities in these games. It's like Oprah gave away assassin abilities.

Anyway, for specs:

MSI Z87-G45 Gaming
Intel Core i5 4670K OC'd to 4.2GHz
16GB DDR3 1600MHz Patriot Viper Memory
MX100 512GB SSD
Asus GTX 970 Strix

Edit: I found something of a mess. People's clothes totally change textures while moving through a crowd. Yeah...
 

Dysun

Member
Game runs nicely after some tweaking, everything on Ultra and SMAA injected via SweetFX. Average 60 FPS in game; it drops a bit during cutscenes, though.

i5 4690K Stock
GTX 970 Stock
16GB DDR3 1866MHz
 

Dr Dogg

Member

Sorry for my poor formatting, but you missed out...
And on each setup, and then take the lowest number and work out the average over the period of that run as well?

Because I can clearly see the video they have taken of their testing run, but I don't know how they are recording their results. If they are indeed using Afterburner to grab their numbers and then chart all the data, then hmmmmm. Sure, not everyone can have an FCAT setup, and FRAPS may still be a pain sometimes with certain titles, but I've got more faith in a FRAPS benchmark than one cobbled together from Afterburner.
 

Parsnip

Member
Looking at the benches, it seems I should maybe be able to get a solid 30fps with some settings turned down to accommodate my 2GB card. I'm just going to have to tweak a bit on what exactly I need to turn down. I think I'm fine with that.
 

GHG

Gold Member
God damn, AMD CPUs are still suffering even with games now being multithreaded:

[image: gamegpu.ru CPU benchmark chart]


And there was me thinking things would improve for owners of the 8350 once next-gen games started hitting...
 

ISee

Member
Just for those interested, this is a comparison screenshot between PC and PS4.
The game looks, obviously, better on PC. I already assumed that; no big deal.
But the combination of higher-resolution textures, HBAO+, better shadows and higher resolution makes the game look so much better on PC. It's like night and day.

Just look for yourself. Picture with slider.
 

Genio88

Member
God damn, AMD CPUs are still suffering even with games now being multithreaded:

[image: gamegpu.ru CPU benchmark chart]


And there was me thinking things would improve for owners of the 8350 once next-gen games started hitting...

I knew that; indeed, I upgraded last summer to a 4770K. Talking about the game, I've played it a few hours on my rig, a 4770K and an OC'd R9 290, and the game is running well with everything maxed out at 1080p with FXAA. I get like 40-60fps almost all the time in Paris, and I'm pretty happy with that. The graphics also look good; the only thing I complain about is the texture pop-in on the NPCs as you get closer to them. I hope they'll fix that with the new patch.
 

JaseC

gave away the keys to the kingdom.
The thing is, Jase, you're asking a question no one in here can really answer.

I was just thinking out loud with "I wonder what the benchmark was". If I were looking for an answer I'd have asked for one. ;) I mean, the reason I said that is because the framerates seem high to me and you're beating me over the head with "We don't have all the details, so we don't know why that's the case", which was my underlying point in the first place.
 

Tugatrix

Member
Game runs like ass even on low (I have the newest drivers). I also have an i5 4570... can I do something decent with that? :-/ I think that is a bottleneck too...

Jeez, I was about to get one for my new PC and wouldn't have been able to play AC. Better open the wallet more, or wait for an optimization (maybe).
 

Kezen

Banned
Just for those interested, this is a comparison screenshot between PC and PS4.
The game looks, obviously, better on PC. I already assumed that; no big deal.
But the combination of higher-resolution textures, HBAO+, better shadows and higher resolution makes the game look so much better on PC. It's like night and day.

Just look for yourself. Picture with slider.

Holy crap. If those screens are legit, consoles do not use the max texture setting.
 

JaseC

gave away the keys to the kingdom.
Holy crap. If those screens are legit, consoles do not use the max texture setting.

Watch Dogs doesn't, either, but the big difference here is that Unity eats up much more system RAM, which on consoles means even less of a budget for textures.
 
Just for those interested, this is a comparison screenshot between PC and PS4.
The game looks, obviously, better on PC. I already assumed that; no big deal.
But the combination of higher-resolution textures, HBAO+, better shadows and higher resolution makes the game look so much better on PC. It's like night and day.

Just look for yourself. Picture with slider.

That's interesting. Shots like that will look even better on PC when the tessellation update comes through. All those bricks and stones should pop a bit more.

PC image is obviously much sharper, but I don't think the PS4 looks bad at all.
 
Yeah, the 780 is GK110.

I am liking the CPU scaling in those benchmarks btw. It makes me all the more hungry for DX12.

OK, I have tried that SLI profile posted some hours ago. It's the same profile that I already have installed with the latest Nvidia driver. I'm waiting for patch 1.2, which will be released at the same time as the EU version of Unity.
 

Dr Dogg

Member
I was just thinking out loud with "I wonder what the benchmark was". If I were looking for an answer I'd have asked for one. ;) I mean, the reason I said that is because the framerates seem high to me and you're beating me over the head with "We don't have all the details, so we don't know why that's the case", which was my underlying point in the first place.

Hahaha, alright, fair dos. Sorry if that came out a little grumpy on my end, but in a roundabout way what I was trying to say is: take those numbers with a pinch of salt. But you know that from your own experience ;)
 

hoserx

Member
OK, I have tried that SLI profile posted some hours ago. It's the same profile that I already have installed with the latest Nvidia driver. I'm waiting for patch 1.2, which will be released at the same time as the EU version of Unity.

You mean the one that Andy from Nvidia suggested we all get by updating through GeForce Experience this morning?
 

Kezen

Banned
Watch Dogs doesn't, either, but the big difference here is that Unity eats up much more system RAM, which on consoles means even less of a budget for textures.

Watch Dogs uses a mix of medium, high and ultra textures, apparently. I haven't checked for myself; it's been a long time since I played it on PS4.
But it's a bit surprising to me. I mean, sure, on PC it requires 4GB of VRAM, but I was under the assumption that you could manage memory more efficiently on consoles.
 

JaseC

gave away the keys to the kingdom.
Hahaha, alright, fair dos. Sorry if that came out a little grumpy on my end, but in a roundabout way what I was trying to say is: take those numbers with a pinch of salt. But you know that from your own experience ;)

No harm, no foul. <3 The sooner I can get off these 670s, the better. (As I mentioned in passing in the Steam thread earlier, I have a job interview tomorrow arvo, so fingers crossed.)

Watch Dogs uses a mix of medium, high and ultra textures, apparently. I haven't checked for myself; it's been a long time since I played it on PS4.

Ah, well, if so, at least Ultra textures are partially there, haha.
 
The thing that creates the biggest difference is HBAO+; the other stuff is just little eye-candy features that many people won't notice. The difference between HBAO+, SSAO and no AO is like the difference between a finished game and a badly lit, unfinished game:

It adds to the realism very well; look at the statue: http://www.geforce.com/whats-new/guides/assassins-creed-unity-graphics-and-performance-guide

But sometimes it gets exaggerated and unrealistic, like the HBAO+ example with the towel. I wonder if AMD users can enable it, and how it influences performance compared to Nvidia cards.
 

Deleted member 17706

Unconfirmed Member
For some reason, in this game, FXAA looks way better than TXAA. On my system anyway.

I agree, actually. Last night I moved over to FXAA, and it wasn't just for the large FPS benefit. It looks better than TXAA and even 4x MSAA to my eyes in this game.
 
The thing that creates the biggest difference is HBAO+; the other stuff is just little eye-candy features that many people won't notice. The difference between HBAO+, SSAO and no AO is like the difference between a finished game and a badly lit, unfinished game:

It adds to the realism very well; look at the statue: http://www.geforce.com/whats-new/guides/assassins-creed-unity-graphics-and-performance-guide

But sometimes it gets exaggerated and unrealistic, like the HBAO+ example with the towel. I wonder if AMD users can enable it, and how it influences performance compared to Nvidia cards.

No, HBAO+ won't make that difference. It's the texture setting, which also changes lighting. HBAO+ is the definition of just a little eye candy.
 

RVinP

Unconfirmed Member
God damn, AMD CPUs are still suffering even with games now being multithreaded:

[image: gamegpu.ru CPU benchmark chart]


And there was me thinking things would improve for owners of the 8350 once next-gen games started hitting...

This explains a lot for those who are experiencing very low frame rates: they are bound by their AMD processor's performance in the game, not their GPUs.
 

SlickVic

Member
Agreed; even with an 860M laptop I can achieve a mostly locked 1080p30 while looking better than PS4/XBone. Ubisoft really fucked console players with this title. It literally should not have been released on console in the state that it is in.

What settings are you running, if you don't mind me asking? I have the same card, and with everything on high, vsync off (no tearing with borderless window for me), HBAO+, 1080p, I'm just barely getting low-mid 20s. I've tried turning some settings down, but the performance gains are so minimal it didn't seem worth it. The only thing I haven't tried tweaking is turning bloom off, so I'm not sure if that would make much of a difference.

I refuse to lower HBAO+ as well, since it looks much nicer than SSAO IMO, and my framerate only went up by like 1-2 with SSAO.
 

Jin

Member
You are using SLI. Double-buffered vsync does not affect you the way it affects those using a single GPU. NVIDIA's AFR SLI requires the use of an extra buffer so that the GPUs can work on alternate frames. This has two effects: 1) it adds latency of 1 frame; 2) it means that you have a triple-buffered-like experience, where frame rate can fluctuate freely, even when the game only uses a front and back buffer.

SLI is only supported in true full-screen mode. Windowed modes of any kind will decrease your performance.

I have vsync on with my single 980 and my framerate is about 45-60fps. It never drops to 30. I really think the vsync is triple buffered.
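That observation actually fits the double- vs triple-buffered distinction: with plain double-buffered vsync on a 60Hz display, a frame that misses a refresh waits for the next one, so the delivered rate snaps to 60/n (60, 30, 20...), whereas a 45-60fps reading that never touches 30 suggests a third buffer is in play. Here's a toy model of that, purely for intuition (not how the driver actually implements it):

```python
import math

REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms per refresh on a 60 Hz display

def double_buffered_fps(render_ms):
    # The finished frame waits for the next refresh boundary and the GPU
    # stalls until the swap, so the delivered rate quantises to 60/n.
    refreshes = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (refreshes * REFRESH_MS)

def triple_buffered_fps(render_ms):
    # A third buffer lets the GPU keep rendering instead of stalling,
    # so the delivered rate tracks render time, capped at the refresh rate.
    return min(1000.0 / render_ms, 1000.0 / REFRESH_MS)

for render_ms in (15.0, 18.0, 22.0):  # ~67, ~56, ~45 FPS worth of GPU work
    print(f"render {render_ms:.0f} ms -> "
          f"double-buffered: {double_buffered_fps(render_ms):.0f} FPS, "
          f"triple-buffered: {triple_buffered_fps(render_ms):.0f} FPS")
```

Under this model, 18-22 ms frames give a hard 30 FPS with double buffering but 45-56 FPS with triple buffering, which matches the behaviour described above.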
 
No, HBAO+ won't make that difference. It's the texture setting, which also changes lighting. HBAO+ is the definition of just a little eye candy.

I wasn't talking about texture resolution. That is a given, and has been for years; it can make even a 20-year-old game look better. I was talking about the new features Nvidia brought, like PCSS, TXAA, tessellation and HBAO+. We can drop any of those and the game stays great, but without HBAO+ the game will look dull and bland, lacking soul in many places (look at the statue).

Also, the textures won't change lighting, since Ubisoft aren't using any type of dynamic GI, and they won't (hence the use of AO of any type), since the most practical option, the VXGI shown yesterday, needs at least a GTX 970 to run a completely empty scene, let alone an open-world game.
 

Genio88

Member
I agree, actually. Last night I moved over to FXAA, and it wasn't just for the large FPS benefit. It looks better than TXAA and even 4x MSAA to my eyes in this game.

Yes, definitely, I've noticed that too. On my R9 290 there is no TXAA option, as it should be Nvidia-exclusive, but FXAA looks better than 4x MSAA, and the increase in fps is also huge, so FXAA all the way.
 
What settings are you running, if you don't mind me asking? I have the same card, and with everything on high, vsync off (no tearing with borderless window for me), HBAO+, 1080p, I'm just barely getting low-mid 20s. I've tried turning some settings down, but the performance gains are so minimal it didn't seem worth it. The only thing I haven't tried tweaking is turning bloom off, so I'm not sure if that would make much of a difference.

I refuse to lower HBAO+ as well, since it looks much nicer than SSAO IMO, and my framerate only went up by like 1-2 with SSAO.

Make sure your laptop is not on battery and the AC adapter is plugged in. What CPU do you have? Did you try 900p? Set it to fullscreen mode, not borderless.
 