Assassin's Creed Unity - PC Performance thread

SaberEdge

Member
I think the last console generation has spoiled some people. It went on so long that games really weren't pushing PC hardware by the end. Couple that with the lack of real CPU/GPU performance jumps, and people were able to use the same hardware for years.

Now gamers are complaining when their single cards don't run things on max. That is how it always used to be. You had to have a beast machine for that, and even then there were games you couldn't play with everything turned up and still get good fps.

So true. I've been thinking the same thing. People have unreasonable expectations based on the fact that for the past few years even people with mid-range PCs could basically max everything out. Assassin's Creed Unity is one of the first fully new generation games, not a cross-gen game like most of the games we've got so far this generation. Dragon Age Inquisition is also pretty damn demanding. So is Shadow of Mordor for that matter, despite being a cross-gen game.

I think some people are in for a rude awakening when they realize that all the upcoming demanding games are going to have similar requirements to AC Unity.
 

sobaka770

Banned
Just how taxing is FXAA? I know it's pretty negligible, but still, I want to know if I can gain like 1 or 2 fps without it.

DON'T turn FXAA off. Decrease whatever else you want, put shadows on High, remove SSAO or anything, but don't go cheap on AA.

I made this mistake once and the jaggies murdered my eyes. Paris buildings are full of small sculptures, roof tiles etc. which all become a mess at a distance without AA. It's worth a 2 FPS decrease and the occasional blur. Don't fuck with FXAA in this game.
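
If you're curious why FXAA is so cheap in the first place (and why it blurs): it's a single full-screen post-process pass that estimates luma contrast between neighbouring pixels and blends wherever that contrast looks like an edge, without taking any extra geometry samples. Here's a toy Python/NumPy version of the idea, purely illustrative and nothing like the game's actual shader:

    import numpy as np

    # Rec. 709 luma weights -- the perceptual weighting FXAA-style filters
    # use to decide where the edges are.
    LUMA = np.array([0.2126, 0.7152, 0.0722])

    def fxaa_like(img, threshold=0.1):
        """img: HxWx3 float array in [0, 1]. Returns a softened copy."""
        luma = img @ LUMA
        p = np.pad(luma, 1, mode='edge')
        n, s = p[:-2, 1:-1], p[2:, 1:-1]        # neighbours above/below
        w, e = p[1:-1, :-2], p[1:-1, 2:]        # neighbours left/right
        contrast = (np.maximum.reduce([luma, n, s, w, e])
                    - np.minimum.reduce([luma, n, s, w, e]))
        edge = contrast > threshold             # only touch contrasty pixels
        ip = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
        # Average edge pixels with their 4 neighbours -- this is the blur
        # you notice on fine detail like roof tiles and sculptures.
        soft = (img + ip[:-2, 1:-1] + ip[2:, 1:-1]
                    + ip[1:-1, :-2] + ip[1:-1, 2:]) / 5.0
        return np.where(edge[..., None], soft, img)

One cheap pass over the framebuffer is why it only costs a couple of fps; blending across real detail is why it smears.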
 
Unreasonable expectations are an issue for both console and PC gaming because some people have problems grasping some very basic concepts. Examples:

1) "I bought a next gen console, it should be able to run everything at 1080p/60fps because of optimized metal Carmack!". No. This time around console hardware is rather weak and depending on how much developers are willing to push it you'll have to deal with lower than full HD resolutions and sub-30 framerates. Also, a midrange gaming PC will outperform both consoles and an entry-level one will match them.

2) "I paid $600 for a graphics card, I should be able to max out every game at 1080p/60fps!". No. The level of graphical quality differs from game to game and the fact that you run one maxed at that res and framerate doesn't mean that you will do so for every game. Also, the further you move up the power scale the bigger the price premium you have to pay for owning the absolute best, the level of performance you'll get doesn't scale linearly with price.
 

Setsuna

Member
People saying this is the Crysis of this gen are also forgetting one simple fact:

Crysis was monumentally better looking than everything else when it came out. It still stands up as a graphics showcase today (Anandtech still use it as such).

AC:U - not so much.
Big difference: Crysis is an FPS, Unity is an open world game. They both have different objectives.

EDIT: FPS games are usually about models, textures and particles. Open world games are about NPCs and view distances, while trying to keep textures and models as high quality as possible.
 

luca_29_bg

Member
[screenshot: CGilme7.jpg]


yeah, take a look at the clock.... 7950 here, and i7 860 @ 3.7 GHz....

I can't believe it!
 
Actually, FXAA is "going cheap on AA" ;)

What you are saying is "don't go with no AA". Which is always good advice.

Eh, I don't know guys. If my machine can't handle proper AA I often prefer running games without AA at all rather than blur the graphics even a little bit.
 

Arulan

Member
Eh, I don't know guys. If my machine can't handle proper AA I often prefer running games without AA at all rather than blur the graphics even a little bit.

Injecting SMAA is usually an option, but I would easily opt for lowering graphics settings over a heavily aliased image.
 
Eh, I don't know guys. If my machine can't handle proper AA I often prefer running games without AA at all rather than blur the graphics even a little bit.

Depends on the symptoms the individual game displays with no AA. BF3 and BF4, for example, have white speckly dots everywhere, so in that case FXAA is the lesser evil. You actually even need some with deferred AA on, just to blur over all the white speckles anytime a surface displays some specular.
 

UnrealEck

Member
Eh, I don't know guys. If my machine can't handle proper AA I often prefer running games without AA at all rather than blur the graphics even a little bit.

I go no AA. The game's FXAA takes a pretty noticeable chunk of performance. And I'm even running it at 1820x1024.
 

sobaka770

Banned
Eh, I don't know guys. If my machine can't handle proper AA I often prefer running games without AA at all rather than blur the graphics even a little bit.

I'm not saying you should always have AA on in every game; that depends on the performance you want out of it. But in AC: Unity specifically, it's pretty much mandatory.

Obviously you can go SMAA or TXAA for great picture quality, but instead of a 2 FPS drop you'll be looking at a 30-40% performance hit.
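
Back-of-the-envelope, using the figures above (illustrative only, your numbers will vary):

    base_fps = 40.0                   # say you get 40 fps with no AA at all

    fxaa_fps = base_fps - 2           # the flat ~2 fps cost mentioned earlier
    txaa_fps = base_fps * (1 - 0.35)  # midpoint of the quoted 30-40% hit

    print(f"FXAA: {fxaa_fps:.0f} fps, TXAA/SMAA-class: {txaa_fps:.0f} fps")
    # -> FXAA: 38 fps, TXAA/SMAA-class: 26 fps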
 

FLAguy954

Junior Member
Specs: i5 4670K @ 4.5, R9 290 @ 1007/1400, 8 GB of RAM, playing at 1920x1200 borderless

Okay, so I ran the game this morning and it runs fine outside of the cut scenes (40~60 fps). The game uses a VERY heavy DoF effect in the cut scenes that effectively brings my frames down to around 30 fps. Like others have mentioned, the visuals of this game are a mixed bag, but overall I think it looks great, especially the character models. I wonder how the game will run for me when Ubi patches in tessellation :/.

Edit: Oh yeah, as for the graphics, everything is maxed out except for shadows (high) and environmental quality (second highest setting) with SMAA injected via RadeonPro (thinking of forcing 2x MSAA because SMAA doesn't kill enough of the jaggies).
 

Rnr1224

Member
I'm getting about 30-40 FPS with some dips into the 20s at 720p at the lowest settings.

Radeon 7790 1 GB
FX-4130
8 GB RAM

The introduction gave me much better performance, but now I'm getting into the 20s when I am outside in Paris.
 
Specs: i5 4670K @ 4.5, R9 290 @ 1007/1400, 8 GB of RAM, playing at 1920x1200 borderless

Okay, so I ran the game this morning and it runs fine outside of the cut scenes (40~60 fps). The game uses a VERY heavy DoF effect in the cut scenes that effectively brings my frames down to around 30 fps. Like others have mentioned, the visuals of this game are a mixed bag, but overall I think it looks great, especially the character models. I wonder how the game will run for me when Ubi patches in tessellation :/.

The 280 series is much better at tessellation than all other Radeons, so it will probably be alright.

This game is seeming more and more like one that requires G-Sync or a 30fps lock.
 

MisterM

Member
I'm getting about 30-40 FPS with some dips into the 20s at 720p at the lowest settings.

Radeon 7790 1 GB
FX-4130
8 GB RAM

The introduction gave me much better performance, but now I'm getting into the 20s when I am outside in Paris.

Thank you.

I find reports of people playing these games on lower end hardware MUCH more useful than those with i7s and 970s etc, no offence to those with higher end kit.
 

AHA-Lambda

Member
Got my new PC parts coming very soon; how well/poorly will Unity run on them? I'm wondering just how bad it is, and maybe I'll get Far Cry 4 instead.

i7 @ 4GHz
GTX 970
16GB RAM
 

SaberEdge

Member
Very. With TXAA working like it does in Crysis 3 or Black Flag, for example, this might match or surpass Ryse as the best looking PC game

Once I got Unity running well on my PC, I thought it looked a bit better than Ryse on PC. Plus it's open world and has a lot more stuff on screen at any given time. The more areas of Unity I saw, the more impressed I became.
 
Once I got Unity running well on my PC, I thought it looked a bit better than Ryse on PC. Plus it's open world and has a lot more stuff on screen at any given time. The more areas of Unity I saw, the more impressed I became.

There's too much shimmer and image instability to put this above Ryse. Also, the LOD is extremely noticeable, even worse than BF4, and the shadows have a lot of weirdness.
 

FLAguy954

Junior Member
The 280 series is much better at tessellation than all other Radeons, so it will probably be alright.

This game is seeming more and more like one that requires G-Sync or a 30fps lock.

Yeah I might just go for a 30 fps lock (locking Watch_Dogs at 30 fps allowed me to play it @ 1600p with a mixture of high and ultra settings).
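
The arithmetic behind why a 30 fps lock buys that much settings headroom is just the per-frame time budget (my own illustration, nothing measured from the game):

    # Milliseconds available to render each frame at common targets.
    for fps in (60, 40, 30):
        print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
    # -> 60 fps -> 16.7 ms per frame
    #    40 fps -> 25.0 ms per frame
    #    30 fps -> 33.3 ms per frame

Locking to 30 doubles the 60 fps budget, which is the kind of headroom that paid for 1600p and the high/ultra mix in Watch_Dogs.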
 

UrbanRats

Member
DON'T turn FXAA off. Decrease whatever else you want, put shadows on High, remove SSAO or anything, but don't go cheap on AA.

I made this mistake once and the jaggies murdered my eyes. Paris buildings are full of small sculptures, roof tiles etc. which all become a mess at a distance without AA. It's worth a 2 FPS decrease and the occasional blur. Don't fuck with FXAA in this game.

lol, damn.
I wonder if I can keep a decent 30 with TXAA and HBAO+, everything else up for grabs.

(970 & 2500K, stock)
 

Seanspeed

Banned
Ok, that is your opinion. Personally, I don't think that justifies it.

No, no, I know that. There are many unnecessary settings which just eat FPS and do nothing for visual quality, and that's something I like about PC gaming: I can turn them lower and get more FPS with the same quality. But the thing is, I already do that with my laptop. I know what my hardware can and can't do, and that's fascinating and the key reason why I play on PC. But why should I have to do that on a high end PC built for gaming? Why shouldn't an energy-eating monster that is way more powerful than my laptop max games at 1080p60? That's something I don't understand.
I'm not trolling or anything like some of you are saying (NeoGAF is the last board I would do that on; for that I have some different boards!)

My last gaming PC was one with an 8800GT and an AMD X2 4200+, heh ;)
What is so difficult to understand? Me turning down a setting from Ultra to High so I can run at 1080p/60fps is *not* the same thing as you turning down a setting from High to Medium so you can run 768p/30fps or whatever your laptop res is. Now, don't get me wrong, I have *nothing* against people who game on lower end PC hardware, but 1080p/60fps is a far more impressive experience, all else being equal. That is a large part of why people pay for nice PC hardware. But just because I have to turn down a setting here and there doesn't mean that suddenly 1080p/60fps isn't impressive anymore and that it's all been a massive waste of money. It will *still* look great and way better than 768p or 900p/30fps.

So you basically want graphically pared back games that don't look nearly as good, but at least they will run at 60fps on consoles? That doesn't sound like a very PC gamer perspective to have. I thought you were a PC gamer, but maybe I'm wrong.

I have the completely opposite view: I want developers to push the consoles as hard as they reasonably can, even if it means rendering at a lower resolution. Then, on top of that, add extra graphical features for the PC versions.

The worst possible scenario I can imagine is if all this complaining actually makes devs decide to start delivering less graphically demanding games just so that console gamers get their 60fps and PC gamers with mid-range gaming PCs can max them out with 60fps too. Then devs can brag about their PC versions supporting "resolutions above 1080p". But, hey, at least everybody can max out the game with 60fps, right?

Edit: By the way, I think some of you are assuming too much when you imply that the crowds are the only thing that makes this game demanding. I see a lot of different aspects to this game's visual design that explain why it is demanding.

Also, I don't experience any kind of "violently inconsistent" framerate in Unity. I cap at 30fps and it's very consistent.
I realize it's a conflict of interest for you, and that the more console versions of games sacrifice performance for graphics, the better the potential gains on PC will be if you've got a good PC, but I'm willing to be unselfish here in the name of people finally getting framerate and playability treated as a high priority. That might actually stimulate the popularity of 120Hz gaming as well.

And yes, it would mean that games would hit 1080p/60fps more easily on PC as well. Extra GPU power can go into even higher resolutions, extra PC-specific settings, mods or even higher framerates. It would also mean that you could probably spend less if you didn't need all that and just wanted something that will perform solidly, opening up PC gaming to more people.

VR would certainly benefit with more games being designed to run at appropriate levels for VR headsets, too. That's a biggie for me.

Graphics will always get better and better. That will not change. It doesn't mean that games will stop being impressive. It's not as if 60fps for a console game necessarily has to be 'ugly', either, ya know? It would be a mere temporary step back before we get back to where we were and can move forward again. I would gladly take that step back for all the advantages it would bring.
 

Seik

Banned
I have already raised the frequency to 3.30 GHz; do I have to push it even further?

Dude, buy a bigger heatsink and overclock the shit out of your processor, it's begging for it.

The 2500K is still one of the best out there @ 4.5 GHz; I can't believe it's still so relevant after all these years.
 

SaberEdge

Member
How are people locking down the framerate? I cannot lock it to 30. I tried in-game v-sync and through the NVIDIA control panel.

i5 4670k @ stock
GTX 760 @ stock
8 GB RAM

Personally, I would recommend you download Nvidia Inspector and use the "half refresh rate" vsync option, since this allows you to use "standard" vsync to avoid screen tearing while capping at half your display's refresh rate.

If you don't care about screen tearing you can use the adaptive half refresh rate vsync option in the control panel though.

Those are the only ways I have found to avoid the micro-stutters that I get with any other kind of frame limiter.
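
For anyone wondering why half refresh rate vsync paces so much better than a generic limiter: vsync snaps every presentation onto a vblank, so at half rate each frame lands on every other 16.7 ms boundary and the frame-time deltas come out as exact 33.3 ms multiples, while a sleep-based cap presents whenever the frame happens to finish. A toy simulation of that idea (my own illustration, nothing to do with Inspector's internals):

    import random

    VBLANK = 1000.0 / 60   # 16.67 ms between refreshes on a 60 Hz display
    SLOT = 2 * VBLANK      # half refresh rate: present every other vblank

    def half_refresh_present_times(render_ms, frames):
        """Return the times (ms) at which frames reach the screen."""
        finished, next_slot, out = 0.0, SLOT, []
        for _ in range(frames):
            finished += render_ms + random.uniform(-2, 2)  # jittery render time
            while next_slot < finished:  # missed the slot: wait for the next one
                next_slot += SLOT        # (this would be a visible hitch)
            out.append(next_slot)
            next_slot += SLOT
        return out

    times = half_refresh_present_times(render_ms=25, frames=8)
    print([round(b - a, 1) for a, b in zip(times, times[1:])])
    # -> [33.3, 33.3, ...] as long as every frame finishes inside its window

A software limiter would present at those jittery finish times instead, which is exactly the micro-stutter I'm talking about.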
 