Assassin's Creed Unity - PC Performance thread

Bought my 780 for 280€ brand new. :) Killer price but I should have waited.
No, unlike you I prefer to keep my expectations in check. There is no way to know how demanding future releases will be, so I have no issues with lowering settings regardless of the hardware I own.

Wow, that's a nice price, mate! On Amazon it costs 480€, lol. Did you buy it on eBay?

"I should have waited" You can say that with every piece of hardware, really. I bet Pascal with its eDRAM, will pee all over a GTX 970.

Maybe I have that expectation because I only play on laptops. Going from a 4100 3DMark11 graphics score to the 9000 of a GTX 970 (?) is a nice difference, so I expect it to run everything at 1080p60 maxed out, since my laptop already does 1080p30 on high. That's the way I think. I guess that's not how it works, based on these threads...?
 

Durante

Member
Those are borderline cases.

But what matters here is that on consoles this seems due to CPU issues. On PC we have HUGE, INSANE headroom on the CPU compared to a console.

So it really doesn't make any sense.
When you are talking about wanting to go from dropping to 19 FPS at worst to a solid 60 FPS, you are already talking about requiring >3x the CPU performance assuming equal efficiency (which isn't fully the case).

Gaming PC CPUs are a lot faster, but >3x is also a very significant factor.
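To make the arithmetic explicit (a toy calculation; it assumes the bottleneck is purely CPU-side and that performance scales linearly, which isn't quite true):

```python
# Back-of-the-envelope: the CPU speedup needed to turn a 19 FPS
# worst case into a locked 60 FPS, assuming a purely CPU-side
# bottleneck and linear scaling (both simplifications).
worst_case_fps = 19
target_fps = 60
required_speedup = target_fps / worst_case_fps
print(f"Required CPU speedup: {required_speedup:.2f}x")  # ~3.16x
```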
 

SaberEdge

Member
Yeah, in all the previous titles I've personally played, even 2x TXAA provides better IQ than downsampling from 4K (I value temporal stability more than super sharpness). The CGI look also goes amazingly well with PBR games (see Ryse, for example). It goes a long way in creating a more natural, lifelike appearance.

Couldn't agree more with you. I, too, value temporal stability and a cleaner image quality more than sharpness. Besides, I've found that if I can use a bit of downsampling along with the TXAA it really helps to reduce any associated blurring. I also agree that the slightly softer CGI look works really well with games that use PBR.
 

Damian.

Banned
Also, this can't be the Crysis of this gen. You still can't max Crysis and run it at a locked 1080p60. The engine simply doesn't scale past 2 CPU cores, so you will always get frame drops in a lot of areas.
 
Seems like Ubi ported this game from PC to consoles, and the XB1 being closer to PC in memory architecture and API made it run better than on PS4. This might please PC users with $600+ of CPU and GPU, but Ubi is going to lose money on console sales.

So you're saying:
1- That this was mainly developed on PC and downported to consoles.
2- This sells much better on PC.

Sadly, it's very likely far from the truth. Or even plausibility.

We know that the PC port was outsourced to a different studio. This means that the guys who made the game and coded it aren't even aware of what the other studio does. That is how outsourcing works: you outsource so that you can focus on something else.

Secondly, GAF is not representative of gamers out there. Only a tiny, tiny, tiny minority has PCs that run SLI configurations, overclocked processors and liquid cooling. This is enthusiast level, and enthusiast level is a niche.

If you make a game that works only for PC hardware enthusiasts, you make a game that sells very poorly and will be a commercial failure.

The simple idea that this was ALWAYS INTENDED to run poorly on consoles because the main target is PC is ludicrous. The reason this is so terrible on consoles is that the game was rushed out to make the holiday season.

The game is filled with bugs. It looks like it's barely in beta. It's an unfinished mess. The fact that PC offers so much hardware that you can brute-force it to run well isn't proof that this was Ubisoft's plan from the beginning.

It's just you deluding yourself because you are a PC enthusiast and now believe that Ubisoft cares about you and your expensive hardware.
 

Durante

Member
No I'm not trolling. That's the reason why gaming PCs are much better than consoles. A console (and my laptop) can do 1080p30 on good settings too.
But not in this game. Consoles (and probably your laptop) can do 900p with drops to 19 FPS in this game.
Your argument is all over the place since you don't take into account the massive inherent differences between games.
 

valkyre

Member
So you're saying:
1- That this was mainly developed on PC and downported to consoles.
2- This sells much better on PC.

Sadly, it's very likely far from the truth. Or even plausibility.

We know that the PC port was outsourced to a different studio. This means that the guys who made the game and coded it aren't even aware of what the other studio does. That is how outsourcing works: you outsource so that you can focus on something else.

Secondly, GAF is not representative of gamers out there. Only a tiny, tiny, tiny minority has PCs that run SLI configurations, overclocked processors and liquid cooling. This is enthusiast level, and enthusiast level is a niche.

If you make a game that works only for PC hardware enthusiasts, you make a game that sells very poorly and will be a commercial failure.

The simple idea that this was ALWAYS INTENDED to run poorly on consoles because the main target is PC is ludicrous. The reason this is so terrible on consoles is that the game was rushed out to make the holiday season.

The game is filled with bugs. It looks like it's barely in beta. It's an unfinished mess. The fact that PC offers so much hardware that you can brute-force it to run well isn't proof that this was Ubisoft's plan from the beginning.

It's just you deluding yourself because you are a PC enthusiast and now believe that Ubisoft cares about you and your expensive hardware.

Finally some fucking sense in here...

I salute you!
 
But not in this game. Consoles (and probably your laptop) can do 900p with drops to 19 FPS in this game.
Your argument is all over the place since you don't take into account the massive inherent differences between games.

Yeah, I know that; I was talking about well-optimized games in my post. That's why I said a GTX 970 should do 1080p60 with maxed settings in this game, but because it's unoptimized, that's of course not the case.

I'm actually very curious how it would run on my laptop. I will probably wait until it costs 20€. Black Flag runs at 28-34 FPS at 1080p and 34-43 FPS at 900p on high settings (with HBAO+ on low, SMAA, Vsync on, and god rays on low) in the first town. Based on that, can you come to a conclusion?
 
When you are talking about wanting to go from dropping to 19 FPS at worst to a solid 60 FPS, you are already talking about requiring >3x the CPU performance assuming equal efficiency (which isn't fully the case).

No, not what I was saying.

I was comparing Black Flag on consoles, versus Unity STILL on consoles.

I mean, try to calculate the difference in performance between the two games on the same console. Then calculate the difference in performance between the two games on the same PC.

I do believe that the yields are better for the consoles than they are for PC. Black Flag ran at a fixed 30 FPS and 1080p on PS4. So that's the difference in processing power between the two engines and games (actually, we don't even know how far above 30 FPS Black Flag can go on PS4).
 

SaberEdge

Member
People saying this is the Crysis of this gen are also forgetting one simple fact:

Crysis was monumentally better looking than everything else when it came out. It still stands up as a graphics showcase today (AnandTech still uses it as such).

AC:U - not so much.

Well, it's true that Crysis was much further ahead of its time, but Unity is still a mighty impressive game. I'd say it's probably the best-looking game this year, and by extension the best-looking game in general so far.

I wonder if we will ever again see something equivalent to what Crysis achieved. That game can still hang with many "current gen" games releasing today, even though it came out around 7 years ago.
 

Durante

Member
I do believe that the yields are better for the consoles than they are for PC. Black Flag ran at a fixed 30 FPS and 1080p on PS4. So that's the difference in processing power between the two engines and games (actually, we don't even know how far above 30 FPS Black Flag can go on PS4).
The fact that Black Flag was locked at 30 FPS on PS4 makes your argument almost impossible to maintain, though. We can assume it never hit above 60, but little beyond that.

E.g. if the lowest framerate Black Flag could actually maintain on PS4 was 40 FPS, that's twice Unity's framerate at a higher resolution. And I don't think that performance gap would be much different from what we see on PC.
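Rough numbers for that hypothetical (the 40 FPS floor is an assumption, as stated, not a measurement):

```python
# Hypothetical: Black Flag's actual floor on PS4 is 40 FPS (assumed).
# Unity drops to ~20 FPS at 900p; Black Flag runs at 1080p.
fps_ratio = 40 / 20                          # 2x the framerate
pixel_ratio = (1920 * 1080) / (1600 * 900)   # 1.44x the pixels
print(f"Combined throughput gap: ~{fps_ratio * pixel_ratio:.1f}x")  # ~2.9x
```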
 

Seanspeed

Banned
Yeah, I can understand you totally. I would be totally pissed off if I had bought a gaming PC. I for one would buy a gaming PC to max everything at 1080p60. Since that's not the case, what's the point? Just slightly better framerates, slightly higher-res shadows and 1080p instead of 900p? That's absolutely not worth it.
Just being able to do 1080p over 900p and game at higher framerates are the sort of things that justify spending more money, yes. These are the things that have the biggest and most tangible effect on the gaming experience.

But maxing every setting? That's where you start to see diminishing returns. Turning environmental detail from Ultra down two notches to High in this game provides a pretty much unnoticeable reduction in the graphics, for example. Not maxing this setting is something that only the psychologically compulsive will have an issue with. I think *most* PC gamers with half-decent rigs are typically happy with reasonably good settings. The difference between a game on High and a game on Ultra is usually pretty minuscule, while the difference between 900p and 1080p on a computer monitor is generally quite drastic.
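For some rough perspective on that resolution jump (simple pixel math; the perceptual difference also depends on screen size and viewing distance):

```python
# Raw pixel counts: 900p vs 1080p.
pixels_900p = 1600 * 900     # 1,440,000
pixels_1080p = 1920 * 1080   # 2,073,600
increase = pixels_1080p / pixels_900p - 1
print(f"1080p pushes {increase:.0%} more pixels than 900p")  # 44%
```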

I honestly don't even understand your stance anymore. You say that going to 1080p and higher framerates isn't worth spending money on, but if you can get 1080p/60fps at max settings, suddenly it is worth it? That makes no sense at all.

I don't think you understand much about PC gaming and I think you understand even less what you're actually saying.
 
Performance isn't great, but it's also not the worst I've come across.

I guess people with older 5/6 series cards are getting screwed, and that's a shame. There's no reason a 670 or so shouldn't be able to run it on medium/low at 1080p60.

But I gotta say, it's a fucking beautiful game, and I can see where the performance is going. It's the first next-gen-looking game I've come across. The cutscenes in particular are unbelievable.

I've tried a lot of combinations of DSR and in-game AA. I found DSR without AA is still very jagged. Only by maxing my DSR level (4x 1680x1050) was I able to get a jaggy-free image, but that also dropped me to 25 FPS.

Native resolution with 4xMSAA looked better to me than DSR at 1440p with 2xMSAA.

I also found that TXAA gives less of a hit than 4xMSAA (but not by much), despite what I read from some people. 8xMSAA is a deathblow: regardless of res, it kills me totally, like down to 10 FPS. Yet the shimmering is still present with TXAA and 4xMSAA.
 

hoserx

Member
People saying this is the Crysis of this gen are also forgetting one simple fact:

Crysis was monumentally better looking than everything else when it came out. It still stands up as a graphics showcase today (AnandTech still uses it as such).

AC:U - not so much.

AC:U is one of the most beautiful games I have ever played. At 2560x1600 with very high settings and some AA, it's truly jaw-dropping. Sure, it has bugs and problems... I am not arguing against that... so did Crysis, though. The comparison I am making between this game and Crysis is that they are both "hardware reality checks" for PC gamers who thought there was a formula relating money spent to framerate attained. Your GPU is at 99% because it is being pushed as far as it can go. Your 8-threaded i7 CPU is being utilized on all cores because there are a ton of people walking around a gigantic city filled with detailed buildings. I don't think the beauty of this game comes across in screenshots or videos, sadly... I think people are discrediting it based on that.
 
Just being able to do 1080p over 900p and game at higher framerates are the sort of things that justify spending more money, yes. These are the things that have the biggest and most tangible effect on the gaming experience.

But maxing every setting? That's where you start to see diminishing returns. Turning environmental detail from Ultra down two notches to High in this game provides a pretty much unnoticeable reduction in the graphics, for example. Not maxing this setting is something that only the psychologically compulsive will have an issue with. I think *most* PC gamers with half-decent rigs are typically happy with reasonably good settings. The difference between a game on High and a game on Ultra is usually pretty minuscule, while the difference between 900p and 1080p on a computer monitor is generally quite drastic.

I honestly don't even understand your stance anymore. You say that going to 1080p and higher framerates isn't worth spending money on, but if you can get 1080p/60fps at max settings, suddenly it is worth it? That makes no sense at all.

I don't think you understand much about PC gaming and I think you understand even less what you're actually saying.

Ok, that is your opinion. Personally, I don't think that justifies it.

No, no, I know that. There are many unnecessary settings which just eat FPS and do nothing for visual quality, and that's something I like about PC gaming: I can turn those down and get more FPS at the same quality. But the thing is, I already do that with my laptop. I know what my hardware can and can't do, and that's fascinating and the key reason why I play on PC. But why should I have to do that on a high-end PC built for gaming? Why shouldn't an energy-eating monster that is way more powerful than my laptop max games at 1080p60? That's something I don't understand.
I'm not trolling or anything, like some of you are saying (NeoGAF is the last board I would do that on; for that I have some different boards!)

My last gaming PC was one with an 8800GT and an AMD X2 4200+, heh ;)
 

ISee

Member
Just a quick question, as I am a bit baffled. There is no tessellation in the game? And they will patch it in later? I didn't even know that (the PC release is on the 13th for me). I wonder what impact tessellation is going to have on performance and why they are delaying it.
 

riflen

Member
I am not really understanding the vsync that can be activated in fullscreen mode. Some people have said that it is double-buffered, but my FPS seems to hover between 53-60 in fullscreen with vsync activated (try starting the game at 60 Hz with these settings; I noticed that messing around with settings could permanently decrease the FPS to the 45-55 range until a restart). Anyway, if vsync were double-buffered, shouldn't the FPS drop to 30 when it can't hold 60 FPS?

Subjectively, fullscreen + vsync appears much smoother than borderless windowed mode, even though the FPS fluctuates in the same range. Why is this? Remember to restart the game into fullscreen + vsync mode, as messing with settings can introduce chugginess that stays until you quit.

AndyBNV: What do the new SLI bits do? Improve scaling? Decrease graphics bugs? It is a bit hard to judge performance before/after. FPS seems to stay in the same range, but stuttering may have decreased somewhat.

Specs: 3570K @ 4.6 GHz, 8 GB RAM, 2x GeForce 780 SLI, SSD, Windows 7 64-bit, Steam version with patch 1.1.0, FPS tested with FRAPS.

You are using SLI. Double-buffered vsync does not affect you the way it affects those using a single GPU. NVIDIA's AFR SLI requires the use of an extra buffer so that the GPUs can work on alternate frames. This has two effects: 1) it adds one frame of latency; 2) it gives you a triple-buffered-like experience, where the frame rate can fluctuate freely even when the game only uses a front and back buffer.

SLI is only supported in true fullscreen mode. Windowed modes of any kind (including borderless) will decrease your performance.
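If it helps, here's a toy model of why double-buffered vsync quantizes the frame rate on a single GPU (an idealized sketch; real drivers, render-ahead queues and AFR all behave more subtly):

```python
# Toy model: with double-buffered vsync, a finished frame can only be
# shown at a vblank, so the effective frame time snaps up to a whole
# multiple of the refresh interval (60 -> 30 -> 20 FPS steps at 60 Hz).
import math

REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ  # ~16.7 ms

def effective_fps(render_time_s):
    intervals = math.ceil(render_time_s / VBLANK)  # vblanks waited
    return 1.0 / (intervals * VBLANK)

for ms in (15.0, 18.0, 25.0, 35.0):
    print(f"{ms:>4.1f} ms render -> {effective_fps(ms / 1000):.0f} FPS")
# 15.0 -> 60, 18.0 -> 30, 25.0 -> 30, 35.0 -> 20
```

With AFR's extra buffer in the chain, frames don't have to snap to those steps, which is why your FPS can hover at 53-60 instead of falling straight to 30.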
 
E.g. if the lowest framerate Black Flag could actually maintain on PS4 was 40 FPS, that's twice Unity's framerate at a higher resolution. And I don't think that performance gap would be much different from what we see on PC.

I think you're still misunderstanding me.

On PS4, in the best cases both Unity and IV run at 30 FPS. So the difference is 900p vs 1080p.

In the WORST cases, instead, the difference is 20 FPS vs 30. Still on top of the resolution difference.

Ok? To see how this compares on PC, one would need to keep settings equal, lower the resolution to 900p, and see if the loss in performance is proportional to the 20-to-30 gap we see on consoles. Imho, it's going to be bigger.
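To put numbers on the check I'm proposing (the PC figures below are placeholders, not measurements; you'd fill them in by benchmarking both games yourself at matched settings):

```python
# Console data points from the posts above: worst cases of ~30 FPS
# (Black Flag, 1080p) vs ~20 FPS (Unity, 900p) on PS4.
ps4_ratio = 30 / 20  # 1.5x, and that's with Unity at the lower resolution

# Hypothetical PC measurements, matched settings, Unity dropped to 900p
# (made-up numbers; replace with your own FRAPS results):
pc_blackflag_fps = 70
pc_unity_fps = 40
pc_ratio = pc_blackflag_fps / pc_unity_fps  # 1.75x in this example

print(f"PS4 worst-case ratio: {ps4_ratio:.2f}x")
print(f"PC ratio (hypothetical): {pc_ratio:.2f}x")
# If the PC ratio comes out above 1.5x, that supports my point.
```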
 

Durante

Member
I know what my hardware can and can't do, but why should I have to do that on a high-end PC built for gaming
Because, regardless of the performance profile of any individual game, that high-end PC will be able to do proportionally more than a laptop or console.

There is no reason to introduce absolutes (like 1080p, or 4K, or 60 FPS, or 120 FPS, or "maxed" settings) into the discussion, since the difference in performance profiles and settings across games is vast. Regardless of that difference, the fact that the high-end system can proportionally do more will remain.
 

hoserx

Member
Because, regardless of the performance profile of any individual game, that high-end PC will be able to do proportionally more than a laptop or console.

There is no reason to introduce absolutes (like 1080p, or 4K, or 60 FPS, or 120 FPS, or "maxed" settings) into the discussion, since the difference in performance profiles and settings across games is vast. Regardless of that difference, the fact that the high-end system can proportionally do more will remain.

Well said. Can we close this part of the discussion here? Just because we aren't hitting an arbitrary level of performance with a certain piece of hardware, does that mean it is worthless and not performing up to its capabilities?

I think a lot of people here are new to PC gaming and don't really know what to expect... I mean, people are asking "why does this game make my hardware so hot?" all over the place in this thread... look at the utilization... higher utilization = more power being used = more heat being created... They think that causing hardware to warm up is a byproduct of an "inefficient engine." While the game may not be 100% bug-free, it surely does a good job of putting your hardware to work. You should be pleased that this is happening.
 
Because, regardless of the performance profile of any individual game, that high-end PC will be able to do proportionally more than a laptop or console.

There is no reason to introduce absolutes (like 1080p, or 4K, or 60 FPS, or 120 FPS, or "maxed" settings) into the discussion, since the difference in performance profiles and settings across games is vast. Regardless of that difference, the fact that the high-end system can proportionally do more will remain.

Yeah, that's right: high-end PCs of course have way more power than consoles and laptops, and they do better even in unoptimized games (instead of dropping below 23 FPS, they drop below 40 FPS; that's a big difference). What confuses me is that there are people with quite powerful PCs, a GTX 770 and an i7 or so, who drop below 25 FPS on medium settings, lol.

Yeah, I guess that's my mistake: I'm really blinded by these absolutes. In the end we all tune our settings to match our hardware, and that's something every PC gamer does, whether high-end, mid-range, low-end or on a laptop. So everyone has different expectations too. I'm already happy to be between the Xbox One and PS4 in graphics and FPS; you're happy if you get 1080p and always over 30 FPS at better graphics quality than the PS4.

I guess I understand it now!
 

Dezzy

Member
I'm running an i7 4790K, a GTX 970 and 16 GB of RAM. I've only tried the first 5 minutes so far, but it seems to run at 60 FPS just fine. That could change once I get to the main part of the game with the big cities and crowds, of course.
Actually, it seems more like 59 FPS, since it runs smoothly but a frame gets skipped every second or so. Weird. (Like Mario Kart 8.)
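If it really is a steady 59 FPS on a 60 Hz vsynced display, the math matches a once-per-second hitch (a toy calculation assuming perfectly even frame delivery):

```python
# At 59 new frames per second on a 60 Hz display, one refresh per
# second has no new frame, so the previous frame is shown twice:
# a visible hitch roughly once a second.
refresh_hz = 60
game_fps = 59
repeats_per_second = refresh_hz - game_fps
print(f"~{repeats_per_second} repeated frame per second")
```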

Has anyone nailed down how much each graphics setting affects performance? For me it's 1080p/60 FPS > effects. I imagine SSAO is the first to go?
 

Rwinterhalter

Neo Member
I just had the weirdest bug climbing the side of Notre Dame. My framerate crashed down to 8 and then back up again, repeatedly. Checking MSI Afterburner, it looks like my GPU usage dropped to zero at that time. I have no idea why. Anybody else have this?
 

Rwinterhalter

Neo Member
Has anyone nailed down how much each graphics setting affects performance? For me it's 1080p/60 FPS > effects. I imagine SSAO is the first to go?

TXAA should be the first to go, because FXAA costs less and does a better job with temporal and shader aliasing.

Then PCSS, followed by environment quality. None of the other settings make that much of an impact, as the game is CPU-bound beyond those settings.
 

d00d3n

Member
You are using SLI. Double-buffered vsync does not affect you the way it affects those using a single GPU. NVIDIA's AFR SLI requires the use of an extra buffer so that the GPUs can work on alternate frames. This has two effects: 1) it adds one frame of latency; 2) it gives you a triple-buffered-like experience, where the frame rate can fluctuate freely even when the game only uses a front and back buffer.

SLI is only supported in true fullscreen mode. Windowed modes of any kind (including borderless) will decrease your performance.

Thanks, this explains a lot.
 
You're in a performance thread... and it doesn't perform badly. The game is truly demanding, and people have unfounded expectations as to what their hardware can do.

Gimme a break, some of the fucking cutscenes with two people in them render at 20 FPS on my 770. There are all kinds of sequences in this game where I've stood in an enclosed room with three NPCs and the game will churn out frames in the low 20s on moderately high settings. Really, an enclosed room with three people rendering at 20 FPS with soft shadows disabled is good optimization, the jewel of PC gaming graphics? Some of you people are out of your minds.
 

rivalchild

Neo Member
AC:U is one of the most beautiful games I have ever played. At 2560x1600 with very high settings and some AA, it's truly jaw-dropping. Sure, it has bugs and problems... I am not arguing against that... so did Crysis, though. The comparison I am making between this game and Crysis is that they are both "hardware reality checks" for PC gamers who thought there was a formula relating money spent to framerate attained. Your GPU is at 99% because it is being pushed as far as it can go. Your 8-threaded i7 CPU is being utilized on all cores because there are a ton of people walking around a gigantic city filled with detailed buildings. I don't think the beauty of this game comes across in screenshots or videos, sadly... I think people are discrediting it based on that.
I would argue that your hardware is being pushed to its limits because the game doesn't seem to be optimized very well, which has been a recurring trend with Ubisoft games on PC as of late.
 

hoserx

Member
Gimme a break, some of the fucking cutscenes with two people in them render at 20 FPS on my 770. There are all kinds of sequences in this game where I've stood in an enclosed room with three NPCs and the game will churn out frames in the low 20s on moderately high settings. Really, an enclosed room with three people rendering at 20 FPS with soft shadows disabled is good optimization, the jewel of PC gaming graphics? Some of you people are out of your minds.

I know it sucks, but look at the settings suggestions that Andy from NVIDIA has shared... mostly low settings for your card... What CPU are you using? What resolution are you playing at?
 
I'm running an i7 4790K, a GTX 970 and 16 GB of RAM. I've only tried the first 5 minutes so far, but it seems to run at 60 FPS just fine. That could change once I get to the main part of the game with the big cities and crowds, of course.
Actually, it seems more like 59 FPS, since it runs smoothly but a frame gets skipped every second or so. Weird. (Like Mario Kart 8.)

Has anyone nailed down how much each graphics setting affects performance? For me it's 1080p/60 FPS > effects. I imagine SSAO is the first to go?

The AO is not that big of a performance hog, and it's actually very noticeable in the image. I've found FXAA to be fine IQ-wise with the same PC specs as you, and the higher AA modes easily drop FPS by 10 or more.
 
The in-game FXAA option seems to produce visual output nearly identical to Watch Dogs' temporal SMAA. Maybe it's a temporal version of FXAA?
 

SaberEdge

Member
It might be a decent port job, but I'm not going to praise the game's performance, sorry. It seems excessively demanding for what the final output is. There may be valid reasons for why it runs the way it does, and it may not necessarily be 'unoptimized', either. That doesn't mean they didn't make a lot of bad choices to get here, though.

I bet if you took a poll and asked how many people would take a halving of NPC counts for a respectable increase in performance, people would be all for it. Or if there were some breakthrough in CPU performance on the near horizon, we could chalk it up to being 'ahead of its time', but that's not really the case.

Bottom line, some people want to enjoy *playing* the game and that's made difficult with a violently inconsistent and demanding framerate. Another example of why I fully support the push for 60fps console games, so we can at least get an emphasis on playability before thinking about tacking on the shiny.

So you basically want graphically pared-back games that don't look nearly as good, but at least they will run at 60fps on consoles? That doesn't sound like a very PC-gamer perspective to have. I thought you were a PC gamer, but maybe I'm wrong.

I have the completely opposite view: I want developers to push the consoles as hard as they reasonably can, even if it means rendering at a lower resolution. Then, on top of that, add extra graphical features for the PC versions.

The worst possible scenario I can imagine is if all this complaining actually makes devs decide to start delivering less graphically demanding games, just so that console gamers get their 60fps and PC gamers with mid-range gaming PCs can max them out at 60fps too. Then devs can brag about their PC versions supporting "resolutions above 1080p". But, hey, at least everybody can max out the game at 60fps, right?

Edit: By the way, I think some of you are assuming too much when you imply that the crowds are the only thing that makes this game demanding. I see a lot of different aspects of this game's visual design that explain why it is demanding.

Also, I don't experience any kind of "violently inconsistent" framerate in Unity. I cap at 30fps and it's very consistent.
 

Genio88

Member
I've just started the game on my rig (i7 4770K @ 4.2 GHz, R9 290 OC, 8 GB DDR3 RAM, on a Samsung EVO SSD), everything maxed out, 1080p, vsync off. I tried MSAA 4x first and then FXAA in the very initial scenes,
when there is a battle you have to pass through
, and I got 35 FPS with MSAA 4x and 55 with FXAA, a huge difference. Though from what I've seen so far, which is actually not very much and is pretty dark, FXAA seems better than MSAA; is that possible?
Unfortunately I can't play further for now; I'll try more later.
 
So

i5 2500K @ 3.3 GHz
MSI GTX 570 Twin Frozr III, no OC
8 GB RAM

Graphics settings, AC Unity:
Resolution: 1080p
Refresh rate: 60 Hz

Environment: Medium
Textures: High
Shadows: Low
Ambient occlusion: Off
AA: FXAA
V-Sync: Off
Bloom: Off

FPS: 22-23 average, with peaks of 33-34
Latest NVIDIA driver, 344.65

According to the experts, where can I improve?
 
So

i5 2500K @ 3.3 GHz
MSI GTX 570 Twin Frozr III, no OC
8 GB RAM

Graphics settings, AC Unity:
Resolution: 1080p
Refresh rate: 60 Hz

Environment: Medium
Textures: High
Shadows: Low
Ambient occlusion: Off
AA: FXAA
V-Sync: Off
Bloom: Off

FPS: 22-23 average, with peaks of 33-34
Latest NVIDIA driver, 344.65

According to the experts, where can I improve?

Try 900p instead of 1080p and see if that helps.
 

KKRT00

Member
So

i5 2500K @ 3.3 GHz
MSI GTX 570 Twin Frozr III, no OC
8 GB RAM

Graphics settings, AC Unity:
Resolution: 1080p
Refresh rate: 60 Hz

Environment: Medium
Textures: High
Shadows: Low
Ambient occlusion: Off
AA: FXAA
V-Sync: Off
Bloom: Off

FPS: 22-23 average, with peaks of 33-34
Latest NVIDIA driver, 344.65

According to the experts, where can I improve?

Seems like textures are bringing your performance down. Try low or medium instead.
 

hoserx

Member
So

i5 2500K @ 3.3 GHz
MSI GTX 570 Twin Frozr III, no OC
8 GB RAM

Graphics settings, AC Unity:
Resolution: 1080p
Refresh rate: 60 Hz

Environment: Medium
Textures: High
Shadows: Low
Ambient occlusion: Off
AA: FXAA
V-Sync: Off
Bloom: Off

FPS: 22-23 average, with peaks of 33-34
Latest NVIDIA driver, 344.65

According to the experts, where can I improve?



you have a 2500K

use that K.

K?
 