Far Cry Primal PC performance thread

Keeping up by playing at half the framerate (and less than half in a lot of cases), at lower resolution, lower IQ, and lower detail settings.

Good stuff.

Did I just walk into a fanboy war thread without warning?

I was simply responding to someone who said console graphics were low medium for this gen, which for the most part is absolutely not true.

Some people can't help themselves, can they...
 
My point is, WTF is Nvidia thinking...

WHY make a new architecture like Maxwell that doesn't support DX12 or async compute, while AMD's GCN, which is an older architecture, has full support?

WHY lie to their customers with the 970 3.5 GB fiasco, and then act like nothing happened?

WHY this planned obsolescence?

Just f***ing WHY. And the worst thing of all is that consumers are not aware of these things, or even if they are aware, they don't respond with action, like "OK, next time I will not buy an Nvidia GPU."

I'm an Nvidia user, and this is the last time I buy one of their GPUs. They are not going to lie to me again.

Historically, ATI/AMD have implemented new features ahead of their rivals.
For example:
-The Radeon 9700 was the first DirectX 9 GPU
-Phenom had an L3 cache before Intel started doing it with Nehalem
-Bobcat had out-of-order execution, which Intel later implemented in the Bay Trail Atom.

Maxwell is more of a power/efficiency design and still largely based on Kepler and targeting DX11 software. Additional DX12 features were probably added late in the game. GCN's design was more forward-thinking, but held back (perhaps mostly by bad drivers and CPU utilization) under DX11.
 
Are you blind?

Look at that 780 Ti, a $600 GPU running almost like a $250 GPU from AMD.

And better not to mention the 970 compared to the R9 290/390: instakill.

Modern architecture, worse performance, more expensive, less DX12 support, no async compute support, and an obvious case of planned obsolescence.

Cheers Nvidia... you're doing great, because even with all this shit you're leading the GPU market by an enormous margin. That means you're the "gods of marketing".

Oh, you're talking about prices? I hadn't realized that.

OK, let's look at how a $330 Radeon R9 390 runs almost like a $550 R9 290X! How can AMD do this to its users? How???

It has pretty much become the expected DX11 landscape, sadly.

No, it hasn't. It's pretty much still an exception to the rule in DX11.

Maxwell is more of a power/efficiency design and still largely based on Kepler and targeting DX11 software. Additional DX12 features were probably added late in the game. GCN's design was more forward-thinking, but held back (perhaps mostly by bad drivers and CPU utilization) under DX11.

Maxwell is more advanced in DX12 features than whatever the newest GCN version is.
 
Oh, you're talking about prices? I hadn't realized that.

OK, let's look at how a $230 Radeon R9 380X runs almost like a $550 R9 290X! How can AMD do this to its users? How???

[benchmark chart image]


Edit: Since you edited, I'll be fair and explain why you are still worthy of a facepalm for saying "let's look at how a $330 Radeon R9 390 runs almost like a $550 R9 290X! How can AMD do this to its users? How???"

The 280X and 780 Ti launched within less than a month of each other, at $300 and $700, respectively. These cards were contemporary competitors at those prices. The latter card is 9% faster in the graph you linked. A bit of an error when seph1roth said "Look at that 780 Ti, a $600 GPU running almost like a $250 GPU from AMD." He was off on the exact prices, but the price ratio is nearly right.

The 390 for $330 launched nearly two years after the $550 290X. How utterly misleading. It also stands strong above its equally priced competitor, the 970. And, btw, that $550 290X was competing against the $700 780 Ti at the time too, and now the cheaper card is 28% faster. Taking the 290X's or 390's price or performance in this game as a negative reeks of desperation and is just out of touch with reality.
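To make the price/performance point concrete, here's a quick back-of-the-envelope sketch (Python, purely illustrative; the relative-performance figures are just the ones cited above, not fresh benchmark data):

```python
# Rough perf-per-dollar comparison using the launch prices and the
# relative performance gap cited in this thread (illustrative only).
cards = {
    # name: (launch price in USD, relative performance index)
    "R9 280X": (300, 100),     # baseline
    "GTX 780 Ti": (700, 109),  # "9% faster" than the 280X in the linked graph
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price:.3f} perf points per dollar")

# At those launch prices the 280X works out to roughly 0.333 points/$
# versus ~0.156 points/$ for the 780 Ti, i.e. more than double the value.
```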
 
Am I the only one who has started to accept, and is now fine with, running games like this at the highest settings at 1080p/30fps when they can't hit 60fps? Sure, I'll try for 60, but if I can't get it, I'll happily settle for a stable 30fps at Ultra settings. More often than not, it will look better than the console versions anyway.
With Primal I noticed that the console 30fps *feels* better than a PC 30fps, and it's due to frame pacing. Is there a program I can use to make game profiles after tweaking settings to get good pacing? Thanks, guys!
 
I think you're too aggressive, to the point where people who might otherwise notice Kepler trending downward are put off. Take it easy.

But yes, that chart does show the trend of Kepler performing one to two tiers below where it used to sit against AMD's GCN. The 770 originally competed with the 280X, and the 780 Ti competed with, and was priced above, the 290X. It's no guarantee that Pascal will do the same, of course. But if one is a Kepler user, it can't hurt to note these facts.

This stuff is almost making me reconsider AMD.
I'm probably going AMD again when the new cards release. Nvidia has done so much wrong in the past two years.
Am I the only one who has started to accept, and is now fine with, running games like this at the highest settings at 1080p/30fps when they can't hit 60fps? Sure, I'll try for 60, but if I can't get it, I'll happily settle for a stable 30fps at Ultra settings. More often than not, it will look better than the console versions anyway.
With Primal I noticed that the console 30fps *feels* better than a PC 30fps, and it's due to frame pacing. Is there a program I can use to make game profiles after tweaking settings to get good pacing? Thanks, guys!

What are you even talking about? Consoles aren't getting 30fps and 30fps doesn't even feel good.
 
[benchmark chart image]


Edit: Since you edited, I'll be fair and explain why you are still worthy of a facepalm for saying "let's look at how a $330 Radeon R9 390 runs almost like a $550 R9 290X! How can AMD do this to its users? How???"

The 280X and 780 Ti launched within less than a month of each other, at $300 and $700, respectively. These cards were contemporary competitors at those prices. The latter card is 9% faster in the graph you linked. A bit of an error when seph1roth said "Look at that 780 Ti, a $600 GPU running almost like a $250 GPU from AMD." He was off on the exact prices, but the price ratio is nearly right.

The 390 for $330 launched nearly two years after the $550 290X. How utterly misleading. It also stands strong above its equally priced competitor, the 970. And, btw, that $550 290X was competing against the $700 780 Ti at the time too, and now the cheaper card is 28% faster. Taking the 290X's or 390's price or performance in this game as a negative reeks of desperation and is just out of touch with reality.

That's a lengthy explanation for something as trivial as a comparison of an old top-end video card to a new mid-range one.

The only reason AMD doesn't show the same price/performance difference is that AMD's 300 line consists of rebadges of the same GPUs used for the 200 line, while NV has actually produced a new architecture and a whole lineup of new GPUs on it which are better, faster, and cooler than their predecessors.

Other than that, this situation is a typical one on the market, and you have to be kind of a genius to present it as an AMD advantage: look, AMD hasn't really updated its GPUs since 2013! How great!
 
It's sad that Nvidia apparently doesn't optimize for Kepler and Fermi anymore, but then again AMD also ditched their VLIW/Terascale GPUs. A lot of models came out as late as 2012.
I'll wait for them to get stuck on the next process node for a few years, though. First designs on a new node usually aren't as efficient as they could be.
 
It's sad that Nvidia apparently doesn't optimize for Kepler and Fermi anymore, but then again AMD also ditched their VLIW/Terascale GPUs. A lot of models came out as late as 2012.
I'll wait for them to get stuck on the next process node for a few years, though. First designs on a new node usually aren't as efficient as they could be.

Kepler is performing just fine in the game this thread is about, as can be seen in all available benchmarks. The number of games where Kepler performs notably worse than expected is very low. This version of the Dunia engine is simply GCN-optimized in general, as the same results were seen in FC4 previously.
 
That's a lengthy explanation for something as trivial as a comparison of an old top-end video card to a new mid-range one.

I'm not surprised you don't get why your comparison is plainly and grossly misleading.


The only reason AMD doesn't show the same price/performance difference is that AMD's 300 line consists of rebadges of the same GPUs used for the 200 line, while NV has actually produced a new architecture and a whole lineup of new GPUs on it which are better, faster, and cooler than their predecessors.

Other than that, this situation is a typical one on the market, and you have to be kind of a genius to present it as an AMD advantage: look, AMD hasn't really updated its GPUs since 2013! How great!

What are you trying to shift the goalposts to? Nothing that discounts AMD cards offering far more value in Far Cry Primal. I just don't understand your issue with AMD, especially in relation to Nvidia's performance in this game, and how you could possibly try to spin it into a negative.
 
I'm not surprised you don't get why your comparison is plainly and grossly misleading.
I understand that my comparison is just a fact, so yeah, I don't understand how you can even argue with that.

What are you trying to shift the goalposts to? Nothing that discounts AMD cards offering far more value in Far Cry Primal. I just don't understand your issue with AMD, especially in relation to Nvidia's performance in this game, and how you could possibly try to spin it into a negative.
Goalposts? What? I was replying to a guy who is running around spewing nonsense like "Prepare yourself, Maxwell users, Pascal is going to kill your cards" and to a second guy who was saying "It's sad that Nvidia apparently doesn't optimize for Kepler and Fermi anymore", both of which are completely incorrect.

My issue isn't with AMD here, it's with several AMD fans who don't know jack shit about anything h/w related but are always running around screaming how async compute will bake bread in DX12 on their Pitcairns.
 
I understand that my comparison is just a fact, so yeah, I don't understand how you can even argue with that.

Right, your response to how bad it is that a $300 AMD card performs within 9% of a $700 Nvidia card (these cards launched at those prices within weeks of each other) is to point out how bad it is that AMD's $550 card from 2013 is within a few percent of AMD's $330 card from 2015. Really, now you're entering troll territory if you think you can firmly stand on this ground.
 
Am I the only one who has started to accept, and is now fine with, running games like this at the highest settings at 1080p/30fps when they can't hit 60fps? Sure, I'll try for 60, but if I can't get it, I'll happily settle for a stable 30fps at Ultra settings. More often than not, it will look better than the console versions anyway.
With Primal I noticed that the console 30fps *feels* better than a PC 30fps, and it's due to frame pacing. Is there a program I can use to make game profiles after tweaking settings to get good pacing? Thanks, guys!

Depends on the game. For a game like Far Cry, an FPS where I can get attacked from all angles, be it by animals or enemies, I prefer the higher framerate.
 
Am I the only one who has started to accept, and is now fine with, running games like this at the highest settings at 1080p/30fps when they can't hit 60fps? Sure, I'll try for 60, but if I can't get it, I'll happily settle for a stable 30fps at Ultra settings. More often than not, it will look better than the console versions anyway.
With Primal I noticed that the console 30fps *feels* better than a PC 30fps, and it's due to frame pacing. Is there a program I can use to make game profiles after tweaking settings to get good pacing? Thanks, guys!

I'm pretty late with a response, but I'll say that I understand why someone would value graphics and IQ over fps. Personally, sub-60fps is difficult for me to get used to; it just feels slow.

As for a program, RivaTuner Statistics Server works fine for managing frame pacing. It comes bundled with MSI Afterburner.
 
So this game runs like total crap with 980ti SLI enabled. Disable SLI and not only do I get the amazing SMAA implementation but the game runs at a fixed 60FPS. It's like a completely different game just running on a single card :|

(Ultra fyi)
 
I'm pretty late with a response, but I'll say that I understand why someone would value graphics and IQ over fps. Personally, sub-60fps is difficult for me to get used to; it just feels slow.

As for a program, RivaTuner Statistics Server works fine for managing frame pacing. It comes bundled with MSI Afterburner.

How do you correctly set up the statistics server to fix pacing issues? What settings in the SS should I use when limiting to 30 fps?

What should the game be set to? Vsync on/off/half? Any other settings I should change?
 
So this game runs like total crap with 980ti SLI enabled. Disable SLI and not only do I get the amazing SMAA implementation but the game runs at a fixed 60FPS. It's like a completely different game just running on a single card :|

(Ultra fyi)

But wait... on Nvidia's website they state: "By downloading our new Game Ready driver you'll receive performance optimizations and a SLI profile, ensuring optimal performance and smoothness when playing the prehistoric action title."
 
But wait... on Nvidia's website they state: "By downloading our new Game Ready driver you'll receive performance optimizations and a SLI profile, ensuring optimal performance and smoothness when playing the prehistoric action title."

NVIDIA's recent drivers have been pretty half-baked. Not surprising at all.
 
There seem to be some pretty weird mouse issues going on on my end. I have acceleration and all that jazz off, but if I'm aiming down sights I have to move my mouse quite a lot for the aim to even move, and I can't take super fine headshots because tiny mouse movements don't register.
 
But wait... on Nvidia's website they state: "By downloading our new Game Ready driver you'll receive performance optimizations and a SLI profile, ensuring optimal performance and smoothness when playing the prehistoric action title."

Well, I wasn't trying to troll nVidia owners (I'm obviously one myself), but I am a bit perturbed by the fact that I can't reliably keep SLI enabled and expect to be at least on par with single-card performance.

So SMAA is better than FXAA in this game?

Night and day better.
 
Well, I wasn't trying to troll nVidia owners (I'm obviously one myself), but I am a bit perturbed by the fact that I can't reliably keep SLI enabled and expect to be at least on par with single-card performance.



Night and day better.

Sorry, I was being sarcastic. I agree with you, SLI is borked. I was also running SLI, then changed to a single GPU and got better results.
 
There seem to be some pretty weird mouse issues going on on my end. I have acceleration and all that jazz off, but if I'm aiming down sights I have to move my mouse quite a lot for the aim to even move, and I can't take super fine headshots because tiny mouse movements don't register.

It's a mouse deadzone issue. A bunch of people on the Steam and Ubisoft forums have experienced the same thing. I've switched to playing it with a controller until they fix it.
 
I'm probably going AMD again when the new cards release. Nvidia has done so much wrong in the past two years.


What are you even talking about? Consoles aren't getting 30fps and 30fps doesn't even feel good.

What? Did you think before you made this post?

Similarly, performance also sees a welcome, if somewhat subtle boost - frame-rate drops and tearing seen in Far Cry 4's more demanding areas (particularly on Xbox One) now seem to be a thing of the past, with Primal offering up a smoother experience throughout. Aside from the occasional dropped or torn frame, performance remains locked at 30fps, with consistent controls and relatively smooth motion. Fast-paced scenes and shaky camera movement sometimes create visible judder, which mildly distracts during fierce hand-to-hand combat - but frame-rates remain locked at 30fps here regardless.

http://www.eurogamer.net/articles/digitalfoundry-2015-performance-analysis-far-cry-primal
 
It's a mouse deadzone issue. A bunch of people on the Steam and Ubisoft forums have experienced the same thing. I've switched to playing it with a controller until they fix it.

At least I know I'm not the only one now, thanks :)
Really annoying though, such an issue shouldn't happen in a 2016 PC game :/
 
Right, your response to how bad it is that a $300 AMD card performs within 9% of a $700 Nvidia card (these cards launched at those prices within weeks of each other) is to point out how bad it is that AMD's $550 card from 2013 is within a few percent of AMD's $330 card from 2015. Really, now you're entering troll territory if you think you can firmly stand on this ground.

The only place this is happening is the title in question, while my comparison holds true for all games out there. So yeah, I see who is entering what here.

Now, since you're a fan of judging a trend by one data point, here's some sand in your gears:

[benchmark chart image]


Look, an NV 980, which launched at $550, is beating AMD's Fury X, which launched nine months later at $650. AMD must be doomed.
 
How do you correctly set up the statistics server to fix pacing issues? What settings in the SS should I use when limiting to 30 fps?

What should the game be set to? Vsync on/off/half? Any other settings I should change?

Usually the best way to get a 30fps/33.3ms lock is to:

1. Set a 30fps limit in RivaTuner
2. Play in borderless window
3. Experiment with vsync on and off and see which feels smoother

I tried a 30fps lock briefly, and for me on a 970, a 30fps RivaTuner lock with borderless window and in-game vsync gave perfect 33.3 ms frame times.
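If you want to sanity-check the pacing yourself, here's a minimal sketch (Python). It assumes you've exported a plain text log with one frame time in milliseconds per line; the file name and format here are hypothetical, so adapt the parsing to whatever your logging tool actually writes:

```python
# Minimal frame-pacing check for a 30fps (33.3 ms) cap.
# Assumes "frametimes.txt" contains one frame time in ms per line
# (hypothetical log format; adjust parsing to your tool's output).
TARGET_MS = 1000.0 / 30.0   # ~33.33 ms per frame at 30fps
TOLERANCE_MS = 2.0          # how far off-target counts as a pacing hiccup

with open("frametimes.txt") as f:
    times = [float(line) for line in f if line.strip()]

avg = sum(times) / len(times)
hiccups = [t for t in times if abs(t - TARGET_MS) > TOLERANCE_MS]

print(f"frames: {len(times)}, average: {avg:.2f} ms (target {TARGET_MS:.2f} ms)")
print(f"frames off-target by >{TOLERANCE_MS} ms: {len(hiccups)} "
      f"({100.0 * len(hiccups) / len(times):.1f}%)")
```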
 
The only place this is happening is the title in question, while my comparison holds true for all games out there. So yeah, I see who is entering what here.

Now, since you're a fan of judging a trend by one data point, here's some sand in your gears:

[benchmark chart image]


Look, an NV 980, which launched at $550, is beating AMD's Fury X, which launched nine months later at $650. AMD must be doomed.

So now you're purposely using benchmarks that aren't even representative of how the game performs currently?
 
That's good news... what's your setup? I'm not doubting you, I just want to figure out what's going on.

TotalBiscuit did another FC Primal test with the new release patch version and the latest 362.00 drivers, and he still says SLI is borked on his system.

https://www.youtube.com/watch?v=vDpKx_a1bW8

Acer x34 GSYNC (3440x1440)
i7 4790k @ 4.4
Gigabyte G1 Gaming 980ti SLI
16GB RAM @ 2300
SSD

Updating to the latest Nvidia drivers still gives me no problems.
 
How do you correctly set up the statistics server to fix pacing issues? What settings in the SS should I use when limiting to 30 fps?

What should the game be set to? Vsync on/off/half? Any other settings I should change?

Someone already answered you, but Durante has a guide on using RTSS. It deals specifically with The Witcher 3 but works with any game, and it also outlines exactly what RTSS does.

create a profile for Witcher 3 by clicking on the “+” sign, and set the desired framerate limit. The final result should look like the configuration in the image to the right.
Just choose the .exe of the program you want to set a limit for and change the value under "Framerate limit".
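One small gotcha with that step: RTSS matches profiles by executable name, so you need the exact .exe the game actually runs under. If you're unsure, a quick sketch like this can list the running processes while the game is open (Python, using the third-party psutil package; this is just a convenience helper, not part of RTSS itself):

```python
# List running process names so you can spot the exact .exe to target
# when creating an RTSS profile (requires the third-party psutil package).
import psutil

names = sorted({p.info["name"] for p in psutil.process_iter(attrs=["name"]) if p.info["name"]})
for name in names:
    print(name)
```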
 
That's good news... what's your setup? I'm not doubting you, I just want to figure out what's going on.

TotalBiscuit did another FC Primal test with the new release patch version and the latest 362.00 drivers, and he still says SLI is borked on his system.

https://www.youtube.com/watch?v=vDpKx_a1bW8

I agree with him. SLI is boned. It doesn't matter to me anyway at this point because I cannot use SMAA with SLI :(
 
That's good news... what's your setup? I'm not doubting you, I just want to figure out what's going on.

TotalBiscuit did another FC Primal test with the new release patch version and the latest 362.00 drivers, and he still says SLI is borked on his system.

https://www.youtube.com/watch?v=vDpKx_a1bW8

My post from earlier in the thread. SLI is running pretty well for me, other than a LOD streaming issue in the benchmark; during gameplay it's fine:

Here are my benchmark results with a single card and SLI. There are noticeable LOD distance/streaming issues with SLI at the moment. It didn't seem like there was any stuttering/hitching, but I'll wait and see what the gameplay is like before judging that.

6700K stock(4.2ghz)
980Ti SLI w/361.91 drivers
16gb DDR4
SSD
Windows 10
1440p + Ultra settings

Single card: [benchmark image]

SLI: [benchmark image]
 
The only place this is happening is the title in question, while my comparison holds true for all games out there. So yeah, I see who is entering what here.

Now, since you're a fan of judging a trend by one data point, here's some sand in your gears:

[benchmark chart image]


Look, an NV 980, which launched at $550, is beating AMD's Fury X, which launched nine months later at $650. AMD must be doomed.

Damn, only two cards can max that game out at 1080p? 49fps with a 970 :(
 
So you can tell us how FCP will perform in two months?

What does that have to do with anything? We already know how RotTR performs now, so why are you using benchmarks that don't represent that? Benchmarks that show the most up-to-date performance are all that matter for any game.
 
What does that have to do with anything? We already know how RotTR performs now, so why are you using benchmarks that don't represent that? Benchmarks that show the most up-to-date performance are all that matter for any game.

This has to do with you assuming that the performance situation in the launch version of FCP won't change with later patches, even though in the same paragraph you're saying that it has changed in RotTR.
 
I don't notice any difference between high, very high, and ultra shadows, other than the 10-15 fps I gain in the benchmark.

That puts the game at a nice 60-70fps average on my 970.
 