Far Cry Primal PC performance thread

Just played for an hour. GTX 970, everything on High except FXAA, locked at 30 through DXTory, and it never dropped below 30.

I'm thinking the day 1 patch alleviated a lot of the issues early reviewers saw.

Can't you get 60fps if you're playing on high settings?
 
2016 and we are back to bad PC ports lately. I know it's not easy with hundreds of specs out there, but I thought this one would be solid because Far Cry 4 was good. Might just play it on PS4 later then.
 
I do wonder why a lot of recent PC games also seem to be pretty CPU-bound, considering they usually run at a steady 30FPS or greater on the consoles with far weaker CPUs. Even a Haswell Core i3 at 3+ GHz has more processing power than all 8 cores on the consoles.
 
Plus the game still looks and runs better on PC. Sometimes I wonder about people.

Most people have no idea that console versions use much lower settings to achieve 30 fps. It happens with every game. "I cannot achieve 60 fps on Ultra, so I will go back to the console version, where I will happily play at 30 fps on low/medium."
 
Most people have no idea that console versions use much lower settings to achieve 30 fps. It happens with every game. "I cannot achieve 60 fps on Ultra, so I will go back to the console version, where I will happily play at 30 fps on low/medium."

Most console graphics settings in games this gen that are shared with PC have been medium-high, with low for shadows or LOD in the worst cases. They've actually kept up graphically otherwise.
 
With all this crying about bad ports whenever a game is demanding, I can foresee a future where PC games are in visual parity with console games.

I'm not saying bad ports don't exist, but what I am saying is that Ultra is far better looking than the console version, which is why it's more demanding.

However, I can see a future where developers go "screw it" and the max options look very similar to the console options, and the game runs at 120 fps and is called "optimised", but then people will cry about console parity.

Like how Assassin's Creed Syndicate "OPTIMISED" by making the visuals worse than Unity.

I do think a new Nvidia driver may help a bit though.

Yep.

To be frank, PC gamers who have this attitude are kind of ruining things for everyone.

Look at how they nerfed the PC version of Dying Light, for example, as a result of people moaning about their 970 not being able to max it out at 1080p/60 with draw distance maxed out and tons of foliage detail.

If you don't like having to tweak a few settings, either buy better hardware or stick to console gaming. It's quite simple really.
 
On my setup (970, i7-4790K, 16GB 2133MHz DDR4, loaded from a 240GB Kingston SSD, Windows 10) I run a mix of High and Very High and get a stable 55-65 fps. That's good enough for me.
 
Most console graphics settings in games this gen that are shared with PC have been medium-high, with low for shadows or LOD in the worst cases. They've actually kept up graphically otherwise.

Keeping up by playing at half the framerate (and worse in a lot of cases), at a lower resolution, with lower IQ and lower detail.

Good stuff.
 
Yep.

To be frank, PC gamers who have this attitude can fuck off back to console for all I care. They don't have a clue and it's just ruining things for everyone.

Look at how they nerfed the PC version of Dying Light, for example, as a result of these bitches crying about their 970 not being able to max it out at 1080p/60 with draw distance maxed out and tons of foliage detail.

If you don't like having to tweak a few settings, either buy better hardware or fuck off back to console. It's quite simple really.

Chill, man
 
Yep.

To be frank, PC gamers who have this attitude can fuck off back to console for all I care. They don't have a clue and it's just ruining things for everyone.

Look at how they nerfed the PC version of Dying Light, for example, as a result of these bitches crying about their 970 not being able to max it out at 1080p/60 with draw distance maxed out and tons of foliage detail.

If you don't like having to tweak a few settings, either buy better hardware or fuck off back to console. It's quite simple really.

Expect a ban for that if there's any consistency to the moderation here...
 
Dunia does a decent job at landscapes and stuff but man this game can look downright bad at times. Ubi needs to upgrade their engine already. Otherwise running pretty well for me.
 
This was the kind of game I'd consider picking up for $20 after a year in a Steam sale, but damn, if that's its current performance I doubt it will even run on this computer. 30 fps with a 4GB 960 and an i7-6700K is just damning to me, since I have the 4GB 960 but an old-ass i5-750.
 
That's not at all great performance given the visual quality of the game, IMO.

I'm happy. Looks gorgeous, plays well, and runs 60+ fps average at max settings, 1440p.

No stutters, hitches, mouse acceleration issues, or crashes. Could be a hell of a lot worse for sure.
 
[Screenshot: overclocked GTX 970]

[Screenshot: non-overclocked GTX 970]

My settings are all max, except for shadows at Very High instead of Ultra.

That's at 2560x1440

CPU: Intel i7-4770K @ 4.3GHz overclock (so far)
Cooler: Corsair H105 self-contained watercooling
Motherboard: Asus Sabertooth Z97
RAM: Corsair 2x8GB DDR3 1886MHz
Graphics Card: Gigabyte GTX 970, 1500MHz boost / 7666MHz memory
Hard drives: 1x Samsung 500GB SSD, 1x 1TB WD, 1x 3TB Seagate (sent for RMA)
Power Supply: Beats me, and can't be arsed opening my case!
 
Specs, homie?

Posted benchmark results and specs on the last page. Here they are again:

5820K @ 4.4GHz
Titan X @ 1,340MHz
16GB DDR4
1TB 850 Pro SSD
Windows 10 x64

Played the game over 2 hours and had no problems. Max VRAM usage was 4.3GB. Gameplay and cutscenes were buttery smooth.
 
It runs great in-game for me with SLI, no hitching or LOD streaming issues at all. I guess the benchmark is bugged in some way. It looks pretty damn good too, the foliage is some of the best I've seen.

I just wish I could enable SMAA instead of FXAA...
 
Runs great at 1440p using 970 SLI with Ultra settings on a 4770K @ 4.4GHz and 16GB RAM. Constant 60 fps (goes from 60-80ish without V-sync). Much better than how Far Cry 4 ran at launch, and this is without a Game Ready driver. I can't believe a game launched with excellent SLI performance, a very rare breed nowadays :(.
 
I'm happy. Looks gorgeous, plays well, and runs 60+ fps average at max settings, 1440p.

No stutters, hitches, mouse acceleration issues, or crashes. Could be a hell of a lot worse for sure.

Battlefront, for example, runs at double the average framerate and looks a HELL of a lot better, all while there's tons more shit going on.
 
Can we try turning one or two settings down (usually shadows) and work out what console settings are before we throw the port under the bus?

I don't have the game, but that TotalBiscuit dude whose video is linked on the first page said turning everything to low gets him like 90 fps on a single Titan X. I think it was 1080p?
 
Battlefront, for example, runs at double the average framerate and looks a HELL of a lot better, all while there's tons more shit going on.

Yet they are on small maps, with no AI.

Hardly an apples-to-apples comparison.

I love Frostbite as much as the next guy, but games are not that black and white.
 
Yep.

To be frank, PC gamers who have this attitude are kind of ruining things for everyone.

Look at how they nerfed the PC version of Dying Light, for example, as a result of people moaning about their 970 not being able to max it out at 1080p/60 with draw distance maxed out and tons of foliage detail.

If you don't like having to tweak a few settings, either buy better hardware or stick to console gaming. It's quite simple really.

This right here.

Also can we please stop throwing the term optimization around like no one's business?
 
Yet they are on small maps, with no AI.

Hardly an apples-to-apples comparison.

I love Frostbite as much as the next guy, but games are not that black and white.

What's going on in Primal shouldn't even qualify as AI, tbh, but that's beside the point. AI has absolutely nothing to do with graphics rendering. I think having 48 players doing all kinds of shit counts for just as much as Far Cry 4's map being larger.
 
What's going on in Primal shouldn't even qualify as AI, tbh, but that's beside the point. AI has absolutely nothing to do with graphics rendering. I think having 48 players doing all kinds of shit counts for just as much as Far Cry 4's map being larger.

It has a lot to do with performance though, primarily where the CPU is involved, and that's not the only thing in an engine that uses the CPU. There's a reason CPU-intensive games struggle on current-gen consoles, and no, having 48 players doesn't count for as much as Far Cry having a larger map; the two aren't comparable.

It comes down to how the engine streams textures and assets; LOD settings are also impossible to compare.

Engines are not created equal, and while they are both FPS titles, there's too much variation in engine, asset quality, shaders, and precision of effects for two completely different games to be compared meaningfully.

Battlefront is an amazing-looking game, there's no denying that, but you can't just declare in black and white that because it runs better, everything else should too.

I do believe Frostbite is the superior engine, but Dunia isn't that terrible either.
 
What's going on in Primal shouldn't even qualify as AI, tbh, but that's beside the point. AI has absolutely nothing to do with graphics rendering. I think having 48 players doing all kinds of shit counts for just as much as Far Cry 4's map being larger.

It absolutely does have something to do with graphics rendering since there is an opportunity cost involved. If you use up all your CPU cycles on AI then how do you think the draw calls are going to get done?

Until you know what is actually going on under the hood of the two respective games you are not at liberty to say whether one is more optimised than the other.

And for what it's worth, there will be a lot of active background AI processing in Primal, since every animal/herd will have its own AI along with any potential human interactions. Then there is also a day/night cycle and a weather system. Battlefront doesn't have to worry about any of this at all and can just focus on rendering statically lit scenes.
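To put that opportunity-cost point in concrete terms, here's a toy single-threaded frame loop in C++. The function names are made up and the sleeps just stand in for CPU work (this is not Dunia or Frostbite code); the point is that every millisecond the CPU spends simulating herds, weather, and time of day comes out of the same ~33 ms budget that draw-call building has to fit into.

// Toy example: AI/world simulation and draw-call submission share one CPU frame budget.
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stubs; the sleeps stand in for real CPU work.
static void updateAnimalAndHumanAI()   { std::this_thread::sleep_for(std::chrono::milliseconds(8));  }
static void updateWeatherAndDayNight() { std::this_thread::sleep_for(std::chrono::milliseconds(2));  }
static void buildAndSubmitDrawCalls()  { std::this_thread::sleep_for(std::chrono::milliseconds(10)); }

int main() {
    using Clock = std::chrono::steady_clock;
    const double frameBudgetMs = 1000.0 / 30.0;  // ~33.3 ms per frame at 30 fps
    for (int frame = 0; frame < 5; ++frame) {
        const auto start = Clock::now();
        updateAnimalAndHumanAI();      // herd/predator/NPC simulation
        updateWeatherAndDayNight();    // day/night cycle and weather
        buildAndSubmitDrawCalls();     // culling + draw-call submission
        const double spentMs =
            std::chrono::duration<double, std::milli>(Clock::now() - start).count();
        // If the three together take longer than ~33 ms, the frame is late and the framerate drops.
        std::printf("frame %d: %.1f ms of %.1f ms budget\n", frame, spentMs, frameBudgetMs);
    }
    return 0;
}

Bump the AI stub up to 25 ms and the loop blows its budget even though the "rendering" work didn't change, which is the whole argument here.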
 
It absolutely does have something to do with graphics rendering since there is an opportunity cost involved. If you use up all your CPU cycles on AI then how do you think the draw calls are going to get done?

Until you know what is actually going on under the hood of the two respective games you are not at liberty to say whether one is more optimised than the other.

Any borderline sane number of draw calls costs extremely little CPU-wise on consoles, and on PC the game is completely GPU-limited.
 
You expected something different? It's the same engine as in FC4, which is known to run better on GCN h/w AND on h/w with more VRAM. So this is a perfect storm situation for the 3xx series.

Well, almost all current games run better on AMD than Nvidia... expect even more differences with async compute in the future.

But hey...75% of the GPU market for Nvidia.

Cheers people.
 
Any borderline sane number of draw calls costs extremely little CPU-wise on consoles, and on PC the game is completely GPU-limited.

They cost very little on consoles because the commands are written in a format the GPU can understand directly. They don't have to be "translated" by a driver as is the case on PC, hence draw calls are absolutely a concern for PC development. If you have AI and a world state to take into account, that will also require resources from the CPU.

The comparison to Battlefront doesn't make any sense since the games are doing two completely different things. If we are going to start doing this then we should just go the whole hog and compare everything to the UE4 kite demo.
 
They cost very little on consoles because the commands are written in a format the GPU can understand directly. They don't have to be "translated" by a driver as is the case on PC, hence draw calls are absolutely a concern for PC development. If you have AI and a world state to take into account, that will also require resources from the CPU.

The comparison to Battlefront doesn't make any sense since the games are doing two completely different things. If we are going to start doing this then we should just go the whole hog and compare everything to the UE4 kite demo.

Do you honestly think any of the core decisions in this game's development factored in the PC? I think there's only been one Ubisoft game this generation where the PC version wasn't outsourced to the Kiev studio, and that was The Crew. Tbh I'm not even 100% sure on this. The Division would be the second.
 
Running well, but I can't believe the water surface in this game is still a flat animated texture. I mean, the very original Far Cry back in 2004 had better water shaders/refraction than this! Considering the emphasis in this game is more on exploring the stunning environments, you would have thought they might have at least brought it in line with most other games these days. Sheesh.

Engine upgrade time.
 
I wonder if it has to do with shadow draw distance, like Fallout 4.

That other video someone linked has shadows on medium and claims 60 fps on a 970.

Fallout 4 benefits similarly from turning shadows to medium, so it isn't drawing shadows a mile away that you can't see at 1080p. That can be a real resource hog for little to no benefit in open-world games.

It's kind of a tough one for developers. Thinking long term, having those extremely long shadow draw distances at 4K is nice, and in 10 years, when this game runs at 200 fps on a $200 GPU at 4K, having the option to push shadows out and draw a ton of them will be nice.

Maybe they should start adding a little disclaimer in the settings, like some games did for textures, stating that a max setting is "future-ready" and may cause performance issues on hardware available at release.
 
Getting really good performance at 4K Ultra (no AA) with Titan X SLI: 72-83 fps with V-sync off, and with it on I'm not getting any dips at all.

Honestly surprised, but you can tell some of the textures look crap every now and then.
 