Assassin's Creed Unity - PC Performance thread

hoserx

Member
Those 970s are putting in some work, huh? :p

Glad you're enjoying the game, I am yet to try it on my zotac 970. I expect the performance to be similar to OP's 780. I was thinking I can still get another one and SLI, but just for this game, nah not worth it, imo.

Haha, they are, but that isn't the reason for the game crashing. These aren't graphics-driver-related issues, luckily; it's just AC being AC.
 

AndyBNV

Nvidia
Thanks. Is this profile any different/improved from the profile that came with the new drivers on Monday?

Yes.

It doesn't do much of anything to combat temporal aliasing and shimmering in this title. TXAA has always produced a 99% rock-solid/stable image, with the exception of The Crew beta, so I figured there was a chance it was bugged.

Without video it's hard to say for sure, but I'm guessing it's shader aliasing?
 

Renekton

Member
I don't have the game myself and haven't been collating the posts from people who have. Perhaps evidence is too strong a word at this stage, but the overwhelming impression I've gotten is that the game has all kinds of technical issues other than performance, and given Ubisoft's track record, I'm more inclined to go with Unity being rushed rather than too demanding for today's hardware.
Even from layman POV, it looks very demanding with more intricate architecture, environment variety and sheer number of NPCs. The other game of similar scope is Dead Rising 3, and it ran poorly on good PCs too.
 

ISee

Member
The problem is that my 970 runs Crysis 3 on High/Very high 1080p 60fps and looks much, much better than AC:U.

I understand that one is open world and one is not, but that's where the expectations are coming from.

1.) As you already said, you need a mix of high/very high settings to maintain 60 fps even in a 2-year-old PC game. On its highest settings, which were designed for PC enthusiasts in the first place, a GTX 970 does not maintain a stable 60 fps. At least not mine. On max with MSAA my GTX 970 sometimes even drops below 50 fps, especially in the "Welcome to the Jungle" level, or whatever it's called.

2.) I would like to ask you one stupid question. Do you expect you could run Ryse: Son of Rome at ultra settings with a stable 60 fps just because you have a GTX 970, the CryEngine is better, and Ryse isn't an open-world game? I do not want to provoke you, but just want to show you that maybe you expect too much out of your GTX 970.

And before someone responds with "but Ryse looks better": yeah, you are right.
 

axb2013

Member
I don't have the game myself and haven't been collating the posts from people who have. Perhaps evidence is too strong a word at this stage, but the overwhelming impression I've gotten is that the game has all kinds of technical issues other than performance, and given Ubisoft's track record, I'm more inclined to go with Unity being rushed rather than too demanding for today's hardware.
There is a lot of info that lines up with this theory, not just recent user observations, it goes back to Giant Bomb/Ubi dev "6 months ago we were at 10fps" emails.

I think people expecting more visual quality (for the performance they are getting) aren't taking into account that this is an open-world game set in a big, dense, complex city with thousands of NPCs.

Yes, you could have better visuals on the same computer if this were a corridor FPS. It isn't.
That's not good enough to issue a blank check to Ubi and call it a day.


People say that comparisons to Ryse aren't fair; I think that's debatable. I don't buy the "reasonable doubt" excuse Ubi is getting here. If the NPC count is that relevant to performance, insisting on having thousands was foolish to say the least. Why not reduce the number, or better yet, give the end user the ability to set the count on their end? They focused on quantity over quality: visually, the NPCs are the worst asset in Unity. They don't look right on any setting, they call attention to the draw distance with annoying pop-in, and packing so many onto the screen has another negative effect: the animation and face clones are even more accentuated with so many visible simultaneously. It hurts the visuals more than a more reasonable count would. "I want thousands of ugly NPCs at the cost of performance," said no one ever.
 

sobaka770

Banned
I don't have the game myself and haven't been collating the posts from people who have. Perhaps evidence is too strong a word at this stage, but the overwhelming impression I've gotten is that the game has all kinds of technical issues other than performance, and given Ubisoft's track record, I'm more inclined to go with Unity being rushed rather than too demanding for today's hardware.

Actually, there is a lot of evidence that the game is actually pretty well optimized for PC.

I think people forget a few key things when talking about PC optimization.

1) The game was made with the Xbox One as the lead platform. Therefore, it was built against an 8 GB shared-RAM target. Your PC cannot match that layout, and this game pushes so many textures at a time that even at low settings you eat 2 GB of VRAM. This is fine for new-gen consoles but not for older graphics cards.

2) The game is not being downscaled to match the demands of lower-end PCs. Watch Dogs was made so that, despite all the "next-gen" city talk, it could run on the PS3 and X360. Therefore, the underlying mechanics could be scaled down to fit the PS3's 256 MB of system RAM at worst.

Unity, on the other hand, could not run on last-gen, and no development effort went into downscaling anything for older hardware. Can you imagine the task facing the Kiev team when they receive next-gen console code and need to optimize it for older PCs? It's not like the Total War series, which is made for PC only and has a slider where you can adjust unit size.

3) This game is the best-looking game on PC. If you run into a tree in Crysis 3, you can make that game look like ass too. In motion, Assassin's Creed Unity is insane. It doesn't push millions of blades of grass to kill the framerate; instead it pushes people. This time, it pushes 10,000 NPCs even if you're sitting perched on top of Notre-Dame cathedral. The LOD on those is insane.

This game also runs like ass on consoles. Compared to the PS4 and Xbox One, you can run this game almost maxed out at 60 FPS with a midrange modern Nvidia card, without SLI.

4) People here clearly underestimate the potential of the new consoles and overestimate the power of the PC. The unified RAM, for example, is something that doesn't happen on PC. You all wanted next-gen games with no links to the old consoles, to "provide a better experience" and a "bolder vision", and here it is. Unity doesn't compromise. It is given 5 GB of RAM and it fills 3 GB with textures. It doesn't dial down the number of NPCs. It doesn't downscale the city size. It has a huge amount of sophisticated animation. It also barely runs at 20-30 FPS at 900p on modern consoles.

I think the results we're seeing from PC hardware are more than reasonable. People are able to run this thing at twice the FPS, with 50% more resolution and better textures and effects, on a mid-range GTX 970 and an i5 processor from 2 years ago. Nvidia cards perform admirably; AMD cards may have a bug, but that has happened before.

And yes, this is the Crysis 3 of the new generation. It doesn't care about performance and pushes its vision to the limits. Crysis was just the same. It ran like shit, and even now you can barely max it out. Yet it's seen not as a failure to optimize but as a benchmark for PC performance, even though we all know you can do Crysis-like graphics and make them run better.

There are bugs in this game. There are issues. It may not be fully optimised, but it performs admirably considering the source. All I want to say is that the guys at Kiev did a good job.
 

Sickbean

Member
1.) As you already said, you need a mix of high/very high settings to maintain 60 fps even in a 2-year-old PC game. On its highest settings, which were designed for PC enthusiasts in the first place, a GTX 970 does not maintain a stable 60 fps. At least not mine. On max with MSAA my GTX 970 sometimes even drops below 50 fps, especially in the "Welcome to the Jungle" level, or whatever it's called.

2.) I would like to ask you one stupid question. Do you expect you could run Ryse: Son of Rome at ultra settings with a stable 60 fps just because you have a GTX 970, the CryEngine is better, and Ryse isn't an open-world game? I do not want to provoke you, but just want to show you that maybe you expect too much out of your GTX 970.

And before someone responds with "but Ryse looks better": yeah, you are right.


I was agreeing with you, just playing devil's advocate.
 

Qassim

Member
1) The game was made with the Xbox One as the lead platform. Therefore, it was built against an 8 GB shared-RAM target. Your PC cannot match that layout, and this game pushes so many textures at a time that even at low settings you eat 2 GB of VRAM. This is fine for new-gen consoles but not for older graphics cards.

It's worth noting that only around 4.5-5 GB is available to games on the PS4/Xbox One. But I get, and largely agree with, your points.
 
@Isee

Wow, thank god I didn't buy a gaming PC. I would be so disappointed if my shiny new i7 4770K and GTX 970 couldn't max every game out at 1080p60. Lol.
 

elelunicy

Member
Crysis 3's and Ryse's graphics are just more consistent than Unity's. In a lot of areas I'd say Unity is more impressive.

The game runs very well for me as long as I don't use MSAA or TXAA: nearly a locked 60 fps with all other settings maxed out at 3440x1440. Great SLI utilization too, as both of my GPUs sit at a constant 90%+ usage.
 
Ubisoft is known for shipping very bad engine tech on PC. Black Flag was a disaster, and yet it's still much more acceptable than Unity. The gap between Black Flag and Unity on PC is probably much bigger than the difference on PS4 (which is 1080p -> 900p, and 25-30 fps to a stable 30).

But then, comparing this to another bad engine (CryEngine) is not a good argument.

Sadly the age of good engines on PC is basically over, as the best engineers and coders find their home in bigger console dev studios. That's where the money is.

What remains of good PC engines is basically left to Unreal. Frostbite is also decent enough, though it has gotten worse since Bad Company 2. Titanfall, even if not too sophisticated, is technically excellent on PC. The last CoD is very good, and Mordor also has at least a passable engine.

Even the engine behind Final Fantasy 14 is good, despite its obvious limits.

That's it.

Not Ryse, though. It's way too heavy for what it delivers, and it's a rushed product, done as fast as possible by a bankrupt studio. It's enough to take a look at ArcheAge or State of Decay to notice how subpar the technical quality is compared to the hardware cost.

But again, the fact that now PC universally gets the short end of the stick is not a justification for a disaster like Unity. Or Watch Dogs. Ubisoft is simply not delivering on the technical front.
 

Rwinterhalter

Neo Member
So, with a bit of testing, it looks like this is CPU-bound for sure.

On my 970, PCSS and Environment Quality are the only settings that impact performance, and even then the effect is small. Changing resolution makes absolutely no difference to framerate; even at sub-720p I'm hovering around 50 fps.

This makes me think that my FX-8350 just isn't up to 60 fps. At the same time, I don't see anyone in this thread with a Haswell i5 or i7 getting a solid sixty either. Could we infer from this that the game is exceptionally good at using multiple cores? Or am I just talking out of my ass?
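The "resolution doesn't change my fps" test above is the classic way to spot a CPU limit, and it can be sketched as a rough rule of thumb. The function below is purely illustrative (the 10% tolerance is an assumption, not a formal benchmarking method):

```python
# Sketch of the bottleneck test from the post: compare fps at native
# resolution vs. a much lower one. If fewer pixels barely move the
# framerate, the GPU isn't the limiting factor.
def likely_bottleneck(fps_native: float, fps_low_res: float,
                      tolerance: float = 0.10) -> str:
    """Classify the limiting factor from fps at two resolutions."""
    if fps_native <= 0 or fps_low_res <= 0:
        raise ValueError("fps values must be positive")
    gain = (fps_low_res - fps_native) / fps_native
    # Big gain from fewer pixels -> GPU was the wall; flat fps -> CPU side.
    return "GPU-bound" if gain > tolerance else "CPU-bound (or engine limit)"

# The FX-8350 case: ~50 fps at 1080p, ~51 fps at sub-720p -> flat.
print(likely_bottleneck(50, 51))  # CPU-bound (or engine limit)
```

A solid sixty on an i5/i7 wouldn't follow automatically from this, of course; it only says the GPU isn't the thing in the way.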
 
25 to 30? From what I've seen in the other thread, it's more like 19-25.

Those are borderline cases.

But what matters here is that on consoles this seems to be due to CPU issues. On PC we have HUGE, INSANE CPU headroom compared to a console.

So it really doesn't make any sense.
 

martino

Member
I think the PC gaming evangelists are partially to blame for this attitude. They ran around forums telling everyone that they could play games super-sampled, at 1440p, with high-res textures, at 120fps, for hardware that costs a pittance. So people went ahead and bought gaming PCs.

No, the problem is not being interested in these things and thinking $ = ultra.
If you want to talk about performance, learn what costs performance and what doesn't, tweak your settings, take an interest in what lies behind low, medium, high, and ultra in the game you are testing,
then analyse it: is the performance hit normal in the context of this game?
Etc., etc.
 

pa22word

Member
I think the PC gaming evangelists are partially to blame for this attitude. They ran around forums telling everyone that they could play games super-sampled, at 1440p, with high-res textures, at 120fps, for hardware that costs a pittance. So people went ahead and bought gaming PCs.

If people went out and spent $500-1000 on a machine on the whims of people with obvious agendas, without doing any research at all, then they deserve what they get in these situations. No sympathy from me at all on this subject, tbqh.
 

Leb

Member
Ubisoft is known for shipping very bad engine tech on PC. Black Flag was a disaster, and yet it's still much more acceptable than Unity. The gap between Black Flag and Unity on PC is probably much bigger than the difference on PS4 (which is 1080p -> 900p, and 25-30 fps to a stable 30).

But then, comparing this to another bad engine (CryEngine) is not a good argument.

Sadly the age of good engines on PC is basically over, as the best engineers and coders find their home in bigger console dev studios. That's where the money is.

What remains of good PC engines is basically left to Unreal. Frostbite is also decent enough, though it has gotten worse since Bad Company 2. Titanfall, even if not too sophisticated, is technically excellent on PC. The last CoD is very good, and Mordor also has at least a passable engine.

Even the engine behind Final Fantasy 14 is good, despite its obvious limits.

That's it.

Not Ryse, though. It's way too heavy for what it delivers, and it's a rushed product, done as fast as possible by a bankrupt studio. It's enough to take a look at ArcheAge or State of Decay to notice how subpar the technical quality is compared to the hardware cost.

But again, the fact that now PC universally gets the short end of the stick is not a justification for a disaster like Unity. Or Watch Dogs. Ubisoft is simply not delivering on the technical front.

This is a bafflingly incoherent post, but I agree with your central premise: the PC version is, indeed, surprisingly competent and I will join you in commending Ubisoft on their technical accomplishments.
 

Qassim

Member
I'm so annoyed that I put together a high-end gaming PC, because Assassin's Creed Unity performs as it does.

May as well sell it. What's the point in it now?
 

hoserx

Member
Actually, there is a lot of evidence that the game is actually pretty well optimized for PC.

I think people forget a few key things when talking about PC optimization.

1) The game was made with the Xbox One as the lead platform. Therefore, it was built against an 8 GB shared-RAM target. Your PC cannot match that layout, and this game pushes so many textures at a time that even at low settings you eat 2 GB of VRAM. This is fine for new-gen consoles but not for older graphics cards.

2) The game is not being downscaled to match the demands of lower-end PCs. Watch Dogs was made so that, despite all the "next-gen" city talk, it could run on the PS3 and X360. Therefore, the underlying mechanics could be scaled down to fit the PS3's 256 MB of system RAM at worst.

Unity, on the other hand, could not run on last-gen, and no development effort went into downscaling anything for older hardware. Can you imagine the task facing the Kiev team when they receive next-gen console code and need to optimize it for older PCs? It's not like the Total War series, which is made for PC only and has a slider where you can adjust unit size.

3) This game is the best-looking game on PC. If you run into a tree in Crysis 3, you can make that game look like ass too. In motion, Assassin's Creed Unity is insane. It doesn't push millions of blades of grass to kill the framerate; instead it pushes people. This time, it pushes 10,000 NPCs even if you're sitting perched on top of Notre-Dame cathedral. The LOD on those is insane.

This game also runs like ass on consoles. Compared to the PS4 and Xbox One, you can run this game almost maxed out at 60 FPS with a midrange modern Nvidia card, without SLI.

4) People here clearly underestimate the potential of the new consoles and overestimate the power of the PC. The unified RAM, for example, is something that doesn't happen on PC. You all wanted next-gen games with no links to the old consoles, to "provide a better experience" and a "bolder vision", and here it is. Unity doesn't compromise. It is given 5 GB of RAM and it fills 3 GB with textures. It doesn't dial down the number of NPCs. It doesn't downscale the city size. It has a huge amount of sophisticated animation. It also barely runs at 20-30 FPS at 900p on modern consoles.

I think the results we're seeing from PC hardware are more than reasonable. People are able to run this thing at twice the FPS, with 50% more resolution and better textures and effects, on a mid-range GTX 970 and an i5 processor from 2 years ago. Nvidia cards perform admirably; AMD cards may have a bug, but that has happened before.

And yes, this is the Crysis 3 of the new generation. It doesn't care about performance and pushes its vision to the limits. Crysis was just the same. It ran like shit, and even now you can barely max it out. Yet it's seen not as a failure to optimize but as a benchmark for PC performance, even though we all know you can do Crysis-like graphics and make them run better.

There are bugs in this game. There are issues. It may not be fully optimised, but it performs admirably considering the source. All I want to say is that the guys at Kiev did a good job.

I was going to make a similar post. The game takes hardware and spits it out. I remember freaking out that my relatively new 8800 GTS 640 MB card couldn't run Crysis at max detail / high framerates. In hindsight, we all look back at that as a game that really moved hardware forward.

I'm glad to see my GPUs pegged at 90+% and all my CPU cores getting used. Sure, there is room for optimization, of course, but it's good to have stuff pushing the envelope. Weren't we complaining about "cross-gen games" last week? We got our wish: a game that truly thrashes "next-gen" hardware.
 

sobaka770

Banned
Crysis 3's and Ryse's graphics are just more consistent than Unity's. In a lot of areas I'd say Unity is more impressive.

The game runs very well for me as long as I don't use MSAA or TXAA: nearly a locked 60 fps with all other settings maxed out at 3440x1440. Great SLI utilization too, as both of my GPUs sit at a constant 90%+ usage.

Just to add to previous points:

5) Unity is an open-world simulation/game. You cannot compare it to Ryse, because Ryse is a linear experience. If you make a linear game, such as Ryse or CoD's single-player, you know where the player can go, your arenas are small, and you don't simulate anything you don't need to. By limiting the interactions between the player and the arenas, games like Ryse and Crysis 3 achieve such a good-looking impression.

All the background assets in these kinds of games can be of outstanding visual quality, because the developers know you can only see them from certain angles, and they only author those specific angles. They also know you will never see them up close, so the textures can be adjusted to fit the bill. By contrast, in Unity what you see is what you get. You can turn into any street at will, enter any window. That means the developers can't cheat by creating false geometry, loading fewer textures, or simulating a small number of pre-determined NPC patterns. The whole pop-in issue happens because the game has trouble predicting your movements and loading things in time for you.

By having linearity in their design, Crysis, Ryse, and CoD all perform better. They don't tax your processor to generate and simulate the world, and they don't use up all your VRAM loading textures you may or may not need.

Look at how low-res vanilla Skyrim is, without any mods, because it had to run on old-gen hardware. The issues it had on PS3 were due to the low amount of RAM, which killed the simulation. There is no easy fix for that; the game has to be scaled down somewhere to reduce the number of variables processed at one time.
 
I'm so annoyed that I put together a high end gaming PC because Assassins Creed Unity performs as it does.

May as well sell it. What's the point in it now?

Yeah, I can understand you totally. I would be totally pissed off if I had bought a gaming PC. I for one would buy a gaming PC to max everything at 1080p60. Since that's not the case, what's the point? Just slightly better framerates, slightly higher-res shadows, and 1080p instead of 900p? That's absolutely not worth it.

And no, I'm not a console fanboy. More like a laptop fanboy. I just have a Wii U and a laptop and I'm happy with that.
 
We got our wish... a game that truly thrashes "next gen" hardware.

If it was merely a matter of low performance, sure.

But this is a game with bad performance on top of:

- Stale, boring gameplay
- Uneven graphics quality
- Terrible pop-in at the highest quality settings
- Mushy textures on buildings one street away
- Glitched animations everywhere
- Plenty of bugs and crashes, even on consoles

It's pretty obvious that this is a game rushed out just to make the holiday release window.
 

Kezen

Banned
I for one would buy a gaming PC to max everything at 1080p60. Since that's not the case, what's the point?
You have to be trolling at this point.

And no, I'm not a console fanboy. More like a laptop fanboy. I just have a Wii U and a laptop and I'm happy with that.
I would be pissed if I bought a 780€ laptop with crappy specs and awful performance.
 

d00d3n

Member
I tried Shadowplay with fullscreen + vsync, staying in the 53-60 fps range and feeling smooth, but the Shadowplay recording appeared to make the experience choppier. (I am not sure how Shadowplay works; maybe it took away necessary GPU resources, or it could not be combined with SLI properly.)

Why is the fps not halved to 30 when going below the 60 Hz refresh rate with fullscreen + vsync on? Is this vsync mode using triple buffering? Is the game using adaptive vsync by default? (I don't see how that would be possible; the game appeared to be tear-free even down in the 53 fps range.)
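One possible answer to the halving question: with strict double-buffered vsync, a missed refresh quantizes the framerate to 60/2, 60/3, and so on, while triple buffering lets it land anywhere in between without tearing. A toy model of that behavior, purely illustrative (the function names and the clean quantization are assumptions of the model, not how any specific driver implements it):

```python
import math

# Model: on a 60 Hz display, double-buffered vsync makes a late frame
# wait for the next vblank, so frame time snaps up to a whole multiple
# of the ~16.67 ms refresh interval.
REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per vblank

def effective_fps_double_buffered(render_ms: float) -> float:
    """fps after quantizing to whole refresh intervals."""
    intervals = max(1, math.ceil(render_ms / INTERVAL_MS))
    return REFRESH_HZ / intervals

def effective_fps_triple_buffered(render_ms: float) -> float:
    """With triple buffering the GPU keeps rendering into a spare
    buffer, so fps tracks actual render time (capped at refresh)."""
    return min(REFRESH_HZ, 1000.0 / render_ms)

# A frame taking 18 ms (~55 fps of raw work):
print(effective_fps_double_buffered(18.0))  # 30.0 -> the halving you'd expect
print(effective_fps_triple_buffered(18.0))  # ~55.6 -> the 53-60 fps observed
```

So a tear-free 53 fps under fullscreen vsync is consistent with triple buffering (or some equivalent flip-queue scheme) rather than classic double buffering.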
 

Evo X

Member
Fuck the reviewers and whiners.

I just played the game for the past 2 hours and had a blast. It is technically astonishing. Not only the quality of the assets, but the insane scale of the whole thing. I honestly think the game might be too ambitious for the current consoles. I visited Paris for the first time earlier this year, and it's so amazing roaming the virtual streets and checking out the landmarks.

I feel like a lot of people just hopped on the Ubisoft shitwagon because of some of the higher-up management decisions, without commending the core team on all of the effort they put in and the things they did right.
 

Damian.

Banned
But I bought and have a PC, just not a high-end gaming PC. And as I see in this thread, there was no need for one, because even high-end PCs can't do 1080p60 on max.

I can achieve a 95%-locked 1080p60 with a 2600k@4.5 | 970@980 levels if I bump Shadows and Environment Quality down to High and use FXAA. SLI usage on Nvidia hardware seems great in this game; I don't doubt that with an OC'd 6-core Haswell and SLI 980s you could achieve a locked 1080p60 with the game maxed out. This is NOT representative of past Ubi PC titles; this game actually scales well with higher-end hardware.
 

GHG

Gold Member
I tried Shadowplay with fullscreen + vsync, staying in the 53-60 fps range and feeling smooth, but the Shadowplay recording appeared to make the experience choppier. (I am not sure how Shadowplay works; maybe it took away necessary GPU resources, or it could not be combined with SLI properly.)

Why is the fps not halved to 30 when going below the 60 Hz refresh rate with fullscreen + vsync on? Is this vsync mode using triple buffering? Is the game using adaptive vsync by default? (I don't see how that would be possible; the game appeared to be tear-free even down in the 53 fps range.)

The performance impact of Shadowplay is much higher with SLI than it is on a single card.

It was 20-25% last time I tried it, although that was a while ago.
 

hoserx

Member
If it was merely a matter of low performance, sure.

But this is a game with bad performance on top of:

- Stale, boring gameplay
- Uneven graphic quality
- Terrible pop-in at highest quality
- Mush textures on the building 1 road away
- Glitched animations everywhere
- Plenty of bugs and crashes even on consoles

It's pretty obvious that it's a game rushed only to be in the holiday release window.

You're in a performance thread, and it doesn't perform badly. The game is genuinely demanding, and people have unfounded expectations of what their hardware can do.
 

Qassim

Member
Yeah, I can understand you totally. I would be totally pissed off if I had bought a gaming PC. I for one would buy a gaming PC to max everything at 1080p60. Since that's not the case, what's the point? Just slightly better framerates, slightly higher-res shadows, and 1080p instead of 900p? That's absolutely not worth it.

1. I and others play on PC for more than just the ability to play console games with better performance and visuals, and if you look at the data, most others do too. The console-port segment of the market is quite small compared to the rest of the PC gaming market. Even if console ports never looked better on PC, I'd still go PC.

2. "Max everything at <resolution>/<frame rate>" is a dumb expectation, and no one should buy hardware in pursuit of that sort of nonsense.
 
You have to be trolling at this point.


I would be pissed if I bought a 780€ laptop with crappy specs and awful performance.

No, I'm not trolling. That's the reason why gaming PCs are supposed to be much better than consoles. A console (and my laptop) can do 1080p30 on good settings too.

I think the opposite is the case. You are pissed off because your $1000+ hardware can't max this game out, so you adjusted your "realistic" expectations so that you cannot be disappointed anymore. I bet you thought exactly the same thing as I did (1080p60 at max) when you bought your GTX 770 (or was it even a 780?) for $500, and yet you were left disappointed that it could hardly push 50 FPS in modern games.

Awful performance, crappy specs? You seriously have no clue about laptop hardware and architecture. You bought a PC to feel superior to consoles; I bought a laptop to play at console settings and to be mobile with low energy costs. That's the difference between us.
You are also underrating ULV CPUs here. To give you an idea: I can edit my 1080p60 videos quickly and fluently, and everything is as smooth as possible. And as I said, I can play modern games like Tomb Raider, BF4, Crysis 3, and Dolphin-emulated GC games at 900p40/60 (Crysis 3 and BF4 on high) and at 1080p50-60 (Tomb Raider). Similar to a PS4, and I don't need more than that. Just because ULV processors have a low TDP doesn't mean they are bad; they are highly efficient Haswell processors, just at 2.4 GHz instead of 3 GHz. That's the only difference from a standard-voltage i5 CPU. Also, for 780€ you can't get a better laptop. Are you expecting something like a GTX 860M and an i7 4710HQ for that price? Hah, only with a crappy display, low battery life, and low-quality plastic, maybe outside of Germany, and the difference isn't as big as you want it to be.
 
I can achieve a 95%-locked 1080p60 with a 2600k@4.5 | 970@980 levels if I bump Shadows and Environment Quality down to High and use FXAA. SLI usage on Nvidia hardware seems great in this game; I don't doubt that with an OC'd 6-core Haswell and SLI 980s you could achieve a locked 1080p60 with the game maxed out. This is NOT representative of past Ubi PC titles; this game actually scales well with higher-end hardware.

Because, following your reasoning, past Ubi PC titles couldn't get a locked 1080p60 running on a 6-core and a 980?

The fact that you can brute-force an engine to run well (now that the hardware exists) doesn't mean the engine is good. It just means we have better hardware than a year ago.
 
I was going to make a similar post. The game takes hardware and spits it out. I remember freaking out that my relatively new 8800 gts 640mb card couldn't run crysis at max detail / high framerates....... in hindsight we all look back at that as a game that really moved hardware forward.

I'm glad to see my gpus pegged at 90+ %, and my cpu cores all getting used. Sure, there is room for optimization of course.......but it's good to have stuff pushing the envelope.... Weren't we complaining about "cross gen games" last week? We got our wish... a game that truly thrashes "next gen" hardware.
Yes, a game not designed around the consoles' hardware advantages thrashes them. This next-gen hardware is capable of more than this ACU mess, as their exclusives next year will prove. It seems like Ubi ported this game from PC to consoles, and the XB1, being closer to PC in memory architecture and API, came out running better than the PS4. This might please PC users with $600+ of CPU and GPU, but Ubi is going to lose money on console sales.
 

Seanspeed

Banned
2.)I would like to ask you one stupid question. Do you expect you could run ryse son of rome in ultra settings with stable 60 fps just because you have a gtx 970, the cry engine is better and ryse isn't an open world game? I do not want to provoke you, but just want to show you that maybe you expect too much out of yor gtx 970.
I would. And I can. With 1.5x supersampling to boot.
 

Kezen

Banned
I think the opposite is the case. You are pissed off because your $1000+ hardware can't max this game out, so you adjusted your "realistic" expectations so that you cannot be disappointed anymore. I bet you thought exactly the same thing as I did (1080p60 at max) when you bought your GTX 770 (or was it even a 780?) for $500, and yet you were left disappointed that it could hardly push 50 FPS in modern games.
I bought my 780 for 280€ brand new. :) Killer price, but I should have waited.
No, unlike you I prefer to keep my expectations in check. There is no way to know how demanding future releases will be, and I have no issue with lowering settings, regardless of the hardware I own.
 

Angry Fork

Member
So the performance is directly related to the CPU then?
Would the game only run better on Intel i5s?

I'm not sure; I don't know much about processors. I only got AMD a while back because they're way cheaper than Intel, but I know Intel is better.

But I just changed the resolution from 1080p to 900p and the stuttering was reduced significantly; at 720p it went away completely. I'm going to keep playing at 900p for now, since the stuttering at 1080p became too much.

So for anyone with less than the minimum specs: you might be able to get a stable 30 fps at low settings and 720p or 1600x900. I haven't tried increasing the graphics settings at 720p yet, though; I might be able to get away with better effects at a lower resolution rather than a higher resolution with low effects.
 

d00d3n

Member
The performance impact of Shadowplay is much higher with SLI than it is on a single card.

It was 20-25% last time I tried it, although that was a while ago.

Thanks. Yeah, in that case it is not very surprising that performance took a dive when using Shadowplay.

I really recommend that people who want vsync try going back to fullscreen + vsync. It does not drop the fps to 30 when going below the 60 fps target, and subjectively it feels much smoother than borderless windowed.
 

Damian.

Banned
Because, following your reasoning, past Ubi PC titles couldn't get a locked 1080p60 running on a 6-core and a 980?

The fact that you can brute-force an engine to run well (now that the hardware exists) doesn't mean the engine is good. It just means we have better hardware than a year ago.

AC III/IV really can't be brute-forced to a locked 1080p60 at max settings with high AA; the engines simply can't do it, mostly due to the CPU limitation of being single-thread heavy.



EDIT: On another note, this game has good frame pacing at 30 fps using the Nvidia Inspector 1/2-refresh-rate option and a 30 fps lock in RTSS. I can run it completely maxed out and never skip a beat like this. With solid frametimes I may be able to get used to it, seeing as the cutscenes will also not chug compared to the rest of the game.
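For what it's worth, the half-refresh trick feels smooth because every frame is delivered on a fixed 33.3 ms cadence instead of an uneven mix of 16.7 ms and 33.3 ms frames. A generic frame-limiter loop sketching that idea (this is an illustration, not RTSS's or Nvidia Inspector's actual implementation; `run_frames` and `render` are made-up names):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_frames(n, render):
    """Run n frames, padding each one out to the full frame budget."""
    next_deadline = time.perf_counter()
    for _ in range(n):
        render()                       # one frame's worth of game/render work
        next_deadline += FRAME_BUDGET  # fixed cadence; drift doesn't accumulate
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)      # wait out the rest of the 33.3 ms slot
```

Pacing against an absolute deadline, rather than sleeping a fixed amount after each frame, is what keeps frametimes even when individual frames vary in cost.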
 
Yes, a game not designed around the consoles' hardware advantages thrashes them. This next-gen hardware is capable of more than this ACU mess, as their exclusives next year will prove. It seems like Ubi ported this game from PC to consoles, and the XB1, being closer to PC in memory architecture and API, came out running better than the PS4. This might please PC users with $600+ of CPU and GPU, but Ubi is going to lose money on console sales.

So you're saying:
1. That this was mainly developed on PC and down-ported to consoles.
2. That it sells much better on PC.

Sadly, that's very likely far from the truth. Or even plausibility.

We know that the PC port was outsourced to a different studio. That means the guys who made and coded the game aren't even aware of what the other studio does. This is how outsourcing works: you outsource so that you can focus on something else.

Secondly, GAF is not representative of gamers out there. Only a tiny, tiny, tiny minority has PCs running SLI configurations, OC'd processors, and liquid cooling.
 

hoserx

Member
Yes, a game not designed around the consoles' hardware advantages thrashes them. This next-gen hardware is capable of more than this ACU mess, as their exclusives next year will prove. It seems like Ubi ported this game from PC to consoles, and the XB1, being closer to PC in memory architecture and API, came out running better than the PS4. This might please PC users with $600+ of CPU and GPU, but Ubi is going to lose money on console sales.

Games may look better on consoles over time, but there isn't any more magic sauce left in them. They have finite limits on their power: the GPU can do only so much, the CPU can do only so much, and these things will not change. Developers may learn tricks that help fool you into thinking more is being done in these games, but at this point this game has pushed the hardware harder than anything else on the PS4/XB1.

As for PC, yes, it does please me that my ~$700 of GPU power is being put to use, and put to use hard. The game is beautiful and puts a huge smile on my face when I play it. The entire package looks more impressive than any game I've played in 2014.
 

Sickbean

Member
People saying this is the Crysis of this gen are also forgetting one simple fact:

Crysis was monumentally better-looking than everything else when it came out. It still stands up as a graphics showcase today (AnandTech still uses it as one).

AC:U - not so much.
 