Far Cry Primal PC performance thread

Also to note: the benchmark isn't 100% representative of gameplay FPS.

While my overclocked benchmark runs never went under 40, I have seen it go under 40 various times in the gameplay itself.

I know why they don't do it (people will scream "unoptimised"), but they should build benchmarks around really PC-stressing scenes and not just a flyby.
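If anyone wants to check for themselves how representative the flyby is, something like this works on a frametime capture. Rough sketch only: it assumes a bare one-column CSV of frame times in milliseconds, and real tools (FRAPS, PresentMon, etc.) all have their own layouts, so adjust the parsing to whatever your capture actually looks like.

```python
# Rough sketch: average FPS and 1% low from a frametime capture.
# Assumes a one-column CSV of frame times in milliseconds; adapt the
# parsing to your capture tool's real output format.
import csv

def fps_stats(path):
    with open(path, newline="") as f:
        frametimes_ms = [float(row[0]) for row in csv.reader(f) if row]
    avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)
    # 1% low: average over the slowest 1% of frames. A stressful scene
    # shows up here even when the flyby average looks perfectly fine.
    slowest = sorted(frametimes_ms, reverse=True)
    slowest = slowest[: max(1, len(slowest) // 100)]
    low_1pct = 1000.0 / (sum(slowest) / len(slowest))
    return avg_fps, low_1pct

avg, low = fps_stats("frametimes.csv")  # hypothetical capture file
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

A benchmark built around a stressing scene would close the gap between those two numbers; a flyby mostly just inflates the average.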
 
Far Cry 4 was definitely not good when it launched.

It was NEVER good.

They never fixed the stuttering; it's just that people have found ways to downgrade textures and avoid it.

But whereas the move from Far Cry 4 to Primal improved things on console, the PC engine got WORSE. It takes a whole lot of effort to spend a year making the same engine even worse.

Kudos to the Far Cry technical team.
 
I guess I'm lucky that I don't care for AA and turn it to something low or off.

I don't see anything higher than FXAA, but I'm sure we could mess with the registry or the Nvidia control panel.
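For anyone who wants to poke at it: Far Cry games have traditionally kept their settings in a GamerProfile.xml under Documents\My Games, so a sketch like the one below could flip the AA value there. Both the path and the "Antialiasing" attribute name are assumptions on my part; open the file and check what it's actually called before running anything.

```python
# Hypothetical sketch: flip the anti-aliasing value in GamerProfile.xml.
# The path and the "Antialiasing" attribute name are guesses based on
# earlier Far Cry games; verify against your own file first.
from pathlib import Path
import xml.etree.ElementTree as ET

profile = (Path.home() / "Documents" / "My Games"
           / "Far Cry Primal" / "GamerProfile.xml")

tree = ET.parse(profile)
changed = False
for node in tree.iter():
    if "Antialiasing" in node.attrib:      # assumed attribute name
        node.set("Antialiasing", "SMAA")   # assumed value string
        changed = True

if changed:
    tree.write(profile, encoding="utf-8", xml_declaration=True)
    print("Updated", profile)
else:
    print("No Antialiasing attribute found; inspect the XML manually.")
```

Back up the file first; the game may also just rewrite it on launch.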
 
i5 2500K@4.4 GHz
GeForce GTX 970 Twin FROZR V (Game Ready 361.91 drivers)
8 GB Corsair Vengeance 1600 MHz
Win 10

1080p/Ultra: benchmark says 52 FPS, in-game average seems more like 48 in the areas I saw.
 
i5-2500k @ 4.0GHz

8GB 1600MHz RAM

GTX 970 SSC

Here are a few benchmark results.

Ultra Benchmark:

Low: 46
Avg: 56

Ultra w/ Shadows on Very High

Low: 48
Avg: 62

I'm going to roll Ultra with shadows on VH and start the game. I'll try to alleviate the dips later. I almost think VH shadows look better than Ultra, at least in the benchmark. Ultra is too sharp IMO.
 
Does this still have movement stutter on "uneven" framerates? I noticed that in FC4 when I tried it on my Gsync monitor; anything that wasn't 30 or 60 (couldn't push it past that anyway at reasonable graphics) had super annoying stutter during movement.
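FC4's stutter persisted even on variable-refresh displays, so it was probably engine-side frame pacing, but the general reason "uneven" rates judder on a fixed-refresh screen is just v-sync arithmetic: each frame is held for a whole number of ~16.7ms scanouts, so any rate that doesn't divide evenly into 60 alternates hold times. A quick illustrative script (not game-specific, just the math):

```python
# Back-of-envelope: why framerates that don't divide the refresh rate
# judder under v-sync on a 60Hz display. Exact arithmetic via Fraction
# to avoid floating-point edge cases in the ceil().
from fractions import Fraction
import math

REFRESH = Fraction(1000, 60)  # ms per scanout on a 60Hz display

def onscreen_times(fps, n=9):
    """On-screen duration (ms) of the first n frames under v-sync."""
    frame = Fraction(1000, fps)
    # each frame is presented at the first vblank after it finishes
    present = [math.ceil((i + 1) * frame / REFRESH) * REFRESH
               for i in range(n + 1)]
    return [round(float(present[i + 1] - present[i]), 1) for i in range(n)]

print(onscreen_times(60))  # all 16.7ms holds: smooth
print(onscreen_times(30))  # all 33.3ms holds: consistent, so still even
print(onscreen_times(45))  # mix of 16.7 and 33.3ms holds: visible judder
```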
 
Runs above 30fps on all Normal settings @2160x1080 on my i3 Alienware Alpha. Turned off FXAA because I don't really notice a difference with it off and it improves the framerate. Good looking game so far and it's fun to play.
 
My benchmark with 2560x1080, 95 FOV, SMAA, fullscreen mode and everything else maxed:
min fps 51
avg fps 65
max fps 77
VRAM usage was 20%

specs: 3570k@4.6, Titan X, 16GB DDR3 RAM (1600), installed on an SSD, running Windows 7, latest Nvidia driver
 
1920x1080, Ultra preset; 16GB DDR3-2133, 2600K @ 4.5GHz, 960GB Crucial M500, 980 Ti, 361.82, Win10 x64.

Practically identical results with the latest drivers:

[benchmark screenshot]
 
Fired it up this morning for a bit (i7 4790k w/ GTX 980 SC and 24GB of RAM); getting 50fps average at 1440p on Ultra settings, and I haven't noticed any big performance drops about half an hour in.

Game looks so good, can't wait to play more.
 
Is there a jump in graphics between this and FC4? I'm wondering why it's so much more demanding. From what I've read it doesn't even have GameWorks integrated. Wonder what they changed.
 
Finally got to test it and it runs pretty wonderful for me so far. I set it up as a 1080p window for streaming and using the Very High preset with Shadows turned down to High and Motion Blur off, I haven't dipped below 65fps so far. No stuttering either!

970 MSI / 4790k @ 4,6 GHz / 16GB RAM
 
So this game is actually pretty well optimised? The first page looks kinda silly now.
 
Blame the benchmarking media that insists on Ultra everything for all benchmarks. I prefer a game shipping with settings aimed at future cards (or multi-GPU) instead of having to dig around in config files or even mods to get better graphics down the line. If the game shipped without any Ultra setting at all, there would be almost no playability complaints outside of most Nvidia cards underperforming.
 
With all this crying about bad ports whenever a game is demanding, I can foresee a future where PC games are in visual parity with console games.

I'm not saying bad ports don't exist, but what I am saying is that Ultra is far better looking than the console version, hence why it's more demanding.

However, I can see a future where developers go "screw it" and the max options look very similar to the console options, the game runs at 120 fps and it will be "optimised", but then people will cry about console parity.

Like how Assassin's Creed Syndicate "optimised" by making the visuals worse than Unity.

I do think a new Nvidia driver may help a bit though.


I don't think there are too many people who will disagree with you. But you're describing something that isn't happening in this thread. The biggest cause of concern was the worrying differences in brand performance and the pitiful pre-launch stats for the highest-end cards @ 1080 and 1440.

It's not entirely unreasonable to unconditionally expect 60 frames per second on a visually status-quo AAA title using the most expensive consumer-grade GPUs.
 
Same here. Played it a bit on my slightly overclocked GTX 970 (1400MHz effective boost clock), 3570K (3.8GHz) and 16GB RAM (1600MHz). 1080p with everything maxed out, except shadows on High instead of Ultra, and motion blur off: a solid 60fps. So if you don't mind turning shadows down a bit, the game runs flawlessly.
 
Game runs pretty consistently at 60fps for me by just turning shadows down to Very High on my 970.

Side note, SMAA looks alright, but I tried FXAA and it looks HORRENDOUS. Worst implementation I've seen. Anyone else notice this?
 
Yeah. I actually play with FXAA in most games and don't mind it, but I immediately noticed how bad it looks in this one.
 
Yes I noticed this. SMAA looks pretty smooth. FXAA is a noisy mess in motion. By the way, there is no Ambient Occlusion option in the menu. Is there no AO in this game at all?
 
There is no AO option in the menu. It's probably running Ubisoft's custom AO solution that the console versions use. It does seem pretty heavy-handed and reminds me of FC4's PS4 AO.

I wish the game also had the console's HRAA option.
 
I take back what I said earlier. This game looks and performs great on the 980 Ti. 60FPS feels so much better than 30. How do I get SLI working, though?
 
So I caved and bought it last night and played for about 2 hours. I was able to max out the game with no drops below 60 at 1080p (TV). I'll try it on my monitor later, which is 2560x1080. Game is pretty fun so far.

I'm running:
i7-4770k
780 Ti SLI (not sure if the game supports SLI, I didn't check)
16GB DDR3 RAM
Installed on SSD
 
Time for a peasant to chime in. 960 + i5-4460 here.

Managing V-High, 720p at v-synced 50Hz.
Dropping shadows from V-High to High allows me to play at 60Hz, but 50 will do me just fine. All in all, happy with the performance.
 
Is it just me, or does this game have zero ambient occlusion? There isn't even an option for it, not even HBAO. What is going on? Can it be forced through the Nvidia CP?
 
That doesn't actually support you. A 290 beating a 970 is a huge win for AMD.

Yeah, hence why I'm saying this: this game isn't representative of the average DX11 performance landscape. I expected as much prior to launch because FC4 was exactly the same.
 
AMD beating Nvidia with old GPUs is a normal thing.

The 970 is a f***ing joke of a card, Nvidia lying again with its planned obsolescence.

Prepare yourself, Maxwell users: Pascal is going to kill your cards.
 
Sure, because Maxwell has obviously "killed" both the 770 and the 780Ti in the benchmark in question:

[benchmark chart]
 
Are you blind?

Look at that 780Ti, a $600 GPU running almost like a $250 GPU from AMD.

And better not mention the 970 compared to the R9 290/390: instakill.

Modern architecture, worse performance, more expensive, less DX12 support, no async compute support, and obvious planned obsolescence.

Cheers, Nvidia... you're doing great, because even with all this shit you're leading the GPU market by an enormous margin, which means you're the "gods of marketing".
 
I think you're too aggressive, to the point where people who might notice Kepler trending downward are put off. Take it easy.

But yes, that chart does show Kepler performing 1-2 tiers below where it used to sit against AMD's GCN. The 770 originally competed with the 280X, and the 780 Ti competed with, and was priced above, the 290X. It's no guarantee that Pascal will do the same, of course. But if one were a Kepler user, it can't hurt to note these facts.
 
My point is, WTF is Nvidia thinking...

WHY make a new architecture like Maxwell that doesn't support DX12 or async compute, while AMD's GCN, an older architecture, has full support?

WHY lie to their customers with the 970 3.5GB fiasco and act like nothing happened?

WHY this planned obsolescence?

Just f***ing WHY. And the worst thing of all is that consumers are not aware of these things, or even if they are aware, they don't respond with acts like "OK, next time I will not buy an Nvidia GPU".

I'm an Nvidia user, and this is the last time I buy one of their GPUs; they are not going to lie to me again.
 