
Witcher 3 PC Performance Thread

ACE 1991

Member
Any fix for the stuttering? If I limit the frame rate to 30 or reduce settings to achieve a locked 60 fps it goes away. I'm getting 45-60 on average and anything below 60 is rather stuttery. On an r9 290 and 2500K BTW.
 

jett

D-Member
Have you tried forcing 30fps with RadeonPro and using Unlimited in-game?

I'm using that plus Double Vsync (no triple buffering) and I felt it was acceptable for 30fps. Much better than in-game 30fps lock + in-game Vsync.

I did this and experienced no difference whatsoever.
 

Gbraga

Member
I'm following Durante's recommendation for the 30fps lock: no v-sync, borderless window, unlimited framerate and RTSS set to 30fps.
 

GavinUK86

Member
I'm following Durante's recommendation for the 30fps lock: no v-sync, borderless window, unlimited framerate and RTSS set to 30fps.

That creates even more stuttering for me, and that's on top of the rubbish performance outside of fullscreen.

Fullscreen, vsync and locking it to 30 with Rivatuner works fine if you wanna play it at 30fps.
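For what it's worth, the reason an external cap can feel smoother than an uncapped 45-60fps comes down to frame-time consistency: uncapped, the pacing swings between roughly 16.7ms and 22.2ms per frame, while a hard 30fps lock delivers a steady ~33.3ms every frame. A minimal sketch of the arithmetic (illustrative only):

# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Uncapped 45-60fps: frame times swing over a ~5.5 ms range.
print(f"{frame_time_ms(60):.1f} ms at 60fps")  # ~16.7 ms
print(f"{frame_time_ms(45):.1f} ms at 45fps")  # ~22.2 ms

# Hard 30fps cap (RTSS or RadeonPro): every frame takes the same time.
print(f"{frame_time_ms(30):.1f} ms at 30fps")  # ~33.3 ms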
 

xam3l

Member
For someone with a modest laptop like this:

i7 4710HQ @ 2.5-3.5GHz
12 GB RAM
GTX850M 2GB DDR3

How does it hold up?
Is it possible to get performance similar to a console with similar specs?
 

MisterM

Member
I did this and experienced no difference whatsoever.

I definitely noticed a difference but then everyone's experience will differ slightly.

Are you definitely running RadeonPro in 64-bit mode? At first I wondered why nothing seemed different and I had no OSD.
 

The Llama

Member
Saw this in the i3/750ti thread, thought it'd be worth posting in here to give people an idea of what to expect.

http://www.gamegpu.ru/images/stories/Test_GPU/RPG/The_Witcher_3_Wild_Hunt/game/new/1920_h_off.jpg


Seems to match up with other benchmarks and what we've seen from people ITT.
 

Iceternal

Member
Saw this in the i3/750ti thread, thought it'd be worth posting in here to give people an idea of what to expect.

http://www.gamegpu.ru/images/stories/Test_GPU/RPG/The_Witcher_3_Wild_Hunt/game/new/1920_h_off.jpg


Seems to match up with other benchmarks and what we've seen from people ITT.

This is bullshit.

No way am I getting 60+fps avg on a 780 ti with ultra settings without hairworks.
 

rc6886

Unconfirmed Member
Saw this in the i3/750ti thread, thought it'd be worth posting in here to give people an idea of what to expect.

http://www.gamegpu.ru/images/stories/Test_GPU/RPG/The_Witcher_3_Wild_Hunt/game/new/1920_h_off.jpg


Seems to match up with other benchmarks and what we've seen from people ITT.

The performance of the 970 SLI is looking very good.

I keep going back and forth between getting another 970 or waiting for the 980 Ti.
 
This is bullshit.

No way am I getting 60+fps avg on a 780 ti with ultra settings without hairworks.

Can confirm, that's BS.

With everything maxed bar hair on a slightly OC'd 970 (120/225), I don't hit 70fps ever. Chart says minimum 71? lol.

Unless it doesn't mean Ultra?

EDIT: Yeah, it's referring to HIGH settings, my bad.
 

UnrealEck

Member
How much heat can the MSI GTX970 generally withstand? Seems to be reaching around 75 celsius at times with TW3 (hovers around 50 to 70 usually). I wouldn't ask normally since it doesn't sound too high, but the fans are making quite a lot of noise.

Probably around 100C. Ideally I'd recommend keeping it in the 80s. It's entirely up to you, though. Higher temps (going close to the TJ max) might affect the lifespan of your card.
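If you'd rather watch the numbers than listen to the fans, the temperature can be polled from a script. A minimal sketch, assuming an NVIDIA card with nvidia-smi available on the PATH (the 80C warning threshold is just the rule of thumb from above):

import subprocess
import time

# Poll the GPU core temperature every second via nvidia-smi.
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    temp_c = int(out.stdout.splitlines()[0])  # first GPU only
    print(f"GPU temperature: {temp_c} C")
    if temp_c >= 80:
        print("Running hot; consider a more aggressive fan curve.")
    time.sleep(1)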
 

buffelo

Neo Member
Saw this in the i3/750ti thread, thought it'd be worth posting in here to give people an idea of what to expect.

http://www.gamegpu.ru/images/stories/Test_GPU/RPG/The_Witcher_3_Wild_Hunt/game/new/1920_h_off.jpg


Seems to match up with other benchmarks and what we've seen from people ITT.

It should be noted that this is a test with HIGH settings across the board. Just checked out the article on gamegpu.ru.
 

UnrealEck

Member
Unless it doesn't mean Ultra?

EDIT: Yeah, it means High. My bad.

Yep, HQ is High, VHQ is Ultra.
Those numbers still appear somewhat generous. If you benchmark the game in a city or village, the numbers will change drastically. From what I've also seen (and I know I'm being a parrot here), Maxwell generally has a bigger lead over Kepler GPUs.
 

Xasthure

Neo Member
Ok, so I've been able to play more without crashes since last night. What ultimately worked best for me was playing the game in fullscreen with the GTA V drivers, setting the frame rate to unlimited with V-sync off, and then setting V-sync to Adaptive in the Nvidia panel. I've still had one or two crashes, but maybe 3 hours apart.

STILL, I get these white artifacts in the game sometimes. Usually in the distance, but when they come nearer, bugs start to happen. For example, I may be riding Roach when one of those artifacts appears, and in a second I'm thrown off Roach. WTF?

Anyone else getting these?

I'm on i5 4690K (not overclocked), Gigabyte GTX 780 ti (factory overclocked) and 8gb RAM. On Windows 8.1.

If I check HWmonitor after a session the GPU-temperature is on max 80c. CPU around 50-60c.

Anyone?
 
This is bullshit.

No way am I getting 60+fps avg on a 780 ti with ultra settings without hairworks.
I believe those are High settings.

Here's how I'm doing so far.

Alienware X51, GTX 970, i7 3770 3.4, 16GB

Settings: Ultra with Hairworks (MSAA off for hair), foliage distance set to High

Performance:

Cutscenes are running well; when Geralt's on screen they can drop into the low 30s due to Hairworks (if MSAA isn't at 2x or off, fps tanks heavily).

Gameplay is running at 40-60 in most areas; the only exception is if the area requires a lot of AO (a dark forest area or dense grass late at night), when it can drop to the high 30s.

The game runs really well. I haven't noticed any streaming stutters like those GTA 5 had, which is great. Added bonus: Vsync is triple buffered \O/
 

FakeWayne

Neo Member
My experience so far has been pretty much on par with everyone else who has similar hardware.

I'm running with a 3570K @4.2, GTX 970, 16GB RAM, and the game is on an SSD. Pretty much locked 60fps at 1080p.

I've got Foliage, Shadows, and Background Characters all set to High, everything else is on Ultra. No hairworks, and I'm using FXAA in the NVCP which definitely looks better and costs less than in game AA.

I got 3 crashes in a little over 3 hours of play time. I'm like 99% certain it's not due to my overclocks on either my CPU or GPU.

On a semi-related note... I had been noticing that a lot of my games lately (DA:I, FC4, GTA, TW3), even when "locked" at 60fps, would very briefly "hitch" to about 56fps for less than a second and then jump right back up to 60. I had been completely stumped as to what the hell was causing it, until last night when I stumbled upon someone on Tom's Hardware a few months back who had the same issue and was using a DisplayPort cable to connect his monitor to his GPU, just like I was. He suggested that changing over to an HDMI cord solved his problems, so I tried it... and voilà. It's definitely better now. I'm not sure whether my cable was faulty, or if DisplayPort simply isn't the way to go for whatever reason. So if anyone is having a similar issue and is using DisplayPort... try it out.
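Those sub-second dips are easy to miss on an fps counter but show up clearly in a frame-time log (FRAPS and RTSS can both record one). A minimal sketch that flags them, assuming a hypothetical log file with one frame time in milliseconds per line:

# A 56fps dip corresponds to any frame slower than ~17.9 ms.
THRESHOLD_MS = 1000.0 / 56.0

with open("frametimes.log") as f:  # hypothetical log name
    times = [float(line) for line in f if line.strip()]

hitches = [(i, t) for i, t in enumerate(times) if t > THRESHOLD_MS]
print(f"{len(hitches)} hitches in {len(times)} frames")
for i, t in hitches[:10]:
    print(f"frame {i}: {t:.1f} ms (~{1000.0 / t:.0f} fps)")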
 
Think I am gonna totally reset my settings (tweaked a heap of config files yesterday night and something is keeping me below 60 all of a sudden) and run a load of tests to see how I perform overall.

Any suggestions for good places to test average fps that'll be consistent with the majority of the game world?
I'm only in the very first opening area right now as I've only really been fiddling with the damn graphics :p

Also, regarding BG characters:

Number of Background Characters

According to the game's config file, 'Number of Background Characters' limits the number of NPCs simultaneously rendered to 75, 100, 130, or 150, depending on the detail level chosen. To date we have been unable to find any location that features even 75 characters, let alone 150, so are unable to demonstrate the impact of this setting.

One has to assume a location where the setting is of use exists given its inclusion in the game. If you find it, please let us know in the Comments section.

from: http://www.geforce.com/whats-new/gu...r-3-wild-hunt-number-of-background-characters

It may not make much difference having it on High if that scenario never actually happens anyway.
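For reference, the four caps from the quoted guide as a lookup table; which label maps to which cap is the obvious ascending assumption, since the guide doesn't spell it out:

# NPC render caps per 'Number of Background Characters' level,
# per the GeForce.com guide quoted above (label mapping assumed).
BACKGROUND_CHARACTER_CAPS = {
    "Low": 75,
    "Medium": 100,
    "High": 130,
    "Ultra": 150,
}

print(f"High caps background NPCs at {BACKGROUND_CHARACTER_CAPS['High']}")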
 

Derp

Member
So has anyone done driver compares to see if these allegations are true?

Will do them soon. Game is downloading right now. Once it's done I'll test with the latest drivers and then the older drivers and let you know the results. People in the Nvidia thread are claiming improvements, so I'd do it regardless just in case. One guy said he could barely hit 35fps with the latest drivers, and when he reverted to the 350.12 drivers he was getting 45-55fps. Seems... exaggerated... which is why I'm dying to test this out.
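If anyone else wants to run the same comparison, the key is repeatability: same save, same location, same time of day and weather, then compare the averages. A minimal sketch of the comparison step, with placeholder fps samples standing in for real recordings:

from statistics import mean

# Placeholder samples; substitute fps values recorded on each driver.
runs = {
    "latest": [52, 55, 49, 57, 54],
    "350.12": [54, 56, 51, 58, 55],
}

baseline = mean(runs["latest"])
for driver, samples in runs.items():
    avg = mean(samples)
    delta = 100 * (avg - baseline) / baseline
    print(f"{driver}: avg {avg:.1f} fps, min {min(samples)} fps, "
          f"{delta:+.1f}% vs latest")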
 
Thank you for the info.

At the moment I've had it set to Clamp. Should I change it to Allow to check it out with driver-forced FXAA instead of the in-game AA?

The description says some apps use negative LOD bias to sharpen texture filtering. This sharpens the stationary image but introduces aliasing when the scene is in motion.

Clamp doesn't work with Fermi GPUs and newer.
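For context on what Clamp actually does: mip selection is roughly log2 of the screen-space texel footprint plus the LOD bias, a negative bias picks a sharper mip (hence the motion shimmer described above), and clamping stops the effective bias going below zero. A rough sketch of that relationship (simplified; real hardware filtering is more involved):

import math

def mip_level(footprint: float, lod_bias: float, clamp: bool) -> float:
    """Approximate mip selection: log2 of the texel footprint plus bias.
    Clamp disallows a negative effective bias."""
    bias = max(lod_bias, 0.0) if clamp else lod_bias
    return max(0.0, math.log2(footprint) + bias)

# A negative bias selects a sharper (lower) mip; Clamp cancels it.
print(mip_level(4.0, -1.0, clamp=False))  # 1.0 -> sharper, shimmers in motion
print(mip_level(4.0, -1.0, clamp=True))   # 2.0 -> standard selection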
 

synce

Member
Barely playable on my laptop (20-30fps at 720p and under - default LOW settings)

i7 2760qm @2.4ghz
gtx 560m 1.5gb gddr5
8gb ram

If I unthrottle the CPU to 3.2ghz it improves the cutscene framerate, but gameplay is still bottlenecked by the GPU.
 

buffelo

Neo Member
Will do them soon. Game is downloading right now. Once it's done I'll test with the latest drivers and then the older drivers and let you know the results. People in the Nvidia thread are claiming improvements, so I'd do it regardless just in case. One guy said he could barely hit 35fps with the latest drivers, and when he reverted to the 350.12 drivers he was getting 45-55fps. Seems... exaggerated... which is why I'm dying to test this out.

Hey Derp do you know anything about performing a driver roll-back and using nvidia inspector to implement the W3 sli profile? I could have sworn I read something about that in the nvidia forums but cant find it now..
 

UnrealEck

Member
Pretty sad that 780 sli is getting 4 fps more than a 290x right now.

Right after that image showing excellent SLI scaling on the Titan X.
It just reinforces this whole notion that the driver optimisations for Witcher 3 are based around Maxwell and not Kepler.
 

aravuus

Member
Probably around 100C. Ideally I'd recommend keeping it in the 80s. It's entirely up to you, though. Higher temps (going close to the TJ max) might affect the lifespan of your card.

Alright, thanks. I'll just play with headphones on and stop worrying, then.
 
Right after that image showing excellent SLI scaling on the Titan X.
It just reinforces this whole notion that the driver optimisations for Witcher 3 are based around Maxwell and not Kepler.

Couldn't it also point to this game perhaps using more tessellation and compute than we know of?

Heck, the entire post-processing chain could be a compute shader on top of the Forward+ compute.
 

Shifty1897

Member
I know I could play on High settings / Hairworks off to get a smooth 60fps, but I've been playing Ultra / Hairworks on with a lock on 30fps and the game is just breathtaking.

When the game looks this good, I may be okay with 30 fps.
 
I have to be honest, I find the "hair" setting obsession hilarious. Of all the things to focus on. The fact that there is a specific setting for it cracks me up.

Nvidia has to offer something to gpu enthusiasts!! It would be kind of funny to read through this thread when Witcher 4 comes out.
 

BlackEyes

Member
I have done the trick with the MSAA on the Hairworks, went from 8 to 2.
I get 60fps with SLI 970 GTX at 1440p with everything on. Some stutters here and there, but it's very smooth.
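The tweak being referenced is the widely circulated edit to the Hairworks MSAA level in the game's rendering config. A minimal sketch of automating it; the file path and key name below are as commonly reported at the time and should be treated as assumptions, so back up the file and verify against your own install first:

import re
from pathlib import Path

# Path and key name are assumptions from contemporary reports; verify first.
cfg = Path("The Witcher 3/bin/config/base/rendering.ini")

text = cfg.read_text()
text, count = re.subn(r"(?m)^(HairWorksAALevel=)\d+$", r"\g<1>2", text)
if count:
    cfg.write_text(text)
    print("HairWorksAALevel set to 2")
else:
    print("Key not found; config layout may differ")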
 

Soi-Fong

Member
Will do them soon. Game is downloading right now. Once it's done I'll test with the latest drivers and then the older drivers and let you know the results. People in the Nvidia thread are claiming improvements, so I'd do it regardless just in case. One guy said he could barely hit 35fps with the latest drivers, and when he reverted to the 350.12 drivers he was getting 45-55fps. Seems... exaggerated... which is why I'm dying to test this out.

Looking forward to seeing your results. If you end up with more fps, I'm definitely reverting back.

Of course, that would just put more suspicion on Nvidia as well.
 

Xyber

Member
Is driver-forced FXAA noticeably better than RED's own AA? I have a similar setup and haven't tried any manual settings in the Nvidia control panel yet.

I haven't compared, but I would think their own AA is better since it's temporal as well.
 

Barzul

Member
I'm still getting framerate drops into the 20s with a 970 and i5 3570S. When I first launch into the game the framerate is relatively high, staying in the 60s and 70s, but as I start moving around it flutters between the 50s and 20s. Is there a performance patch on the horizon for this? Because it's just annoying now.

None of my graphics settings are on Ultra: most are on High, foliage distance is on Medium, number of characters on Low, and shadows on Low also. Post-processing DOF, motion blur and sharpening are all turned off; everything else is on High. No AA; I'm using FXAA from Nvidia Inspector, changed the setting to Performance and the number of pre-rendered frames to 2, triple buffering off and anisotropic filtering set to 16x. I even overclocked my card, ffs. I thought my rig would be at least enough to get a steady 60fps. I'm also getting audio cracks and sync issues. Not sure what that's about.

GTX 970
i5 3570s
8gb RAM
 

buffelo

Neo Member
I haven't compared, but I would think their own AA is better since it's temporal as well.

In my experience it is. Sharp edges like sword blades, as well as the chain of the witcher medallion, look unnaturally shimmery and shiny when using FXAA.
 

jett

D-Member
Saw this in the i3/750ti thread, thought it'd be worth posting in here to give people an idea of what to expect.

http://www.gamegpu.ru/images/stories/Test_GPU/RPG/The_Witcher_3_Wild_Hunt/game/new/1920_h_off.jpg


Seems to match up with other benchmarks and what we've seen from people ITT.

I've come to realize those benchmarks are bullshit. I am NOT getting close to 60fps on a 280X. It hovers between 30 and 40 the vast majority of the time.
 

The Llama

Member
Couldn't it also point to this game perhaps using more tessellation and compute than we know of?

Heck, the entire post-processing chain could be a compute shader on top of the Forward+ compute.

I've thought about that before, but I don't believe it. AMD cards aren't that good at tessellation yet outperform the Kepler cards, and Maxwell isn't really that much better than Kepler at compute (IIRC) and still falls well behind AMD cards there.
 
I'm still getting framerate drops into the 20s with a 970 and i5 3570S. When I first launch into the game the framerate is relatively high, staying in the 60s and 70s, but as I start moving around it flutters between the 50s and 20s. Is there a performance patch on the horizon for this? Because it's just annoying now.

None of my graphics settings are on Ultra: most are on High, foliage distance is on Medium, number of characters on Low, and shadows on Low also. Post-processing DOF, motion blur and sharpening are all turned off; everything else is on High. No AA; I'm using FXAA from Nvidia Inspector, changed the setting to Performance and the number of pre-rendered frames to 2, triple buffering off and anisotropic filtering set to 16x. I even overclocked my card, ffs. I thought my rig would be at least enough to get a steady 60fps.

GTX 970
i5 3570s
8gb RAM

You should be getting 50ish minimum on a 970 with no Hairworks. With those settings, 60+ easy. Have you tried any other games to test for problems with your system?
 

Grassy

Member
i7 2600k @ 4.5
Sli 670's w/352.86 driver
8GB ram

I have everything at Ultra other than shadows and foliage view distance on High, with Hairworks off and a few of the post-processing options like sharpening and motion blur turned off. At 1080p I'm getting a pretty solid 60fps with some dips into the mid 50s in the starting area. Pretty much what I expected. I am getting some slight stuttering, however.

Foliage view distance is the killer; fps drops from 60 on High to the low 40s on Ultra in the same scene.
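In frame-time terms that drop is bigger than the fps delta suggests: 60fps is ~16.7ms per frame and 42fps is ~23.8ms, so Ultra foliage distance is costing roughly 7ms a frame in that scene. A quick check of the arithmetic:

# Per-frame cost of a setting change, from before/after frame rates.
def setting_cost_ms(fps_before: float, fps_after: float) -> float:
    return 1000.0 / fps_after - 1000.0 / fps_before

# High -> Ultra foliage view distance, 60fps down to ~42fps.
print(f"{setting_cost_ms(60, 42):.1f} ms per frame")  # ~7.1 ms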
 

Barzul

Member
My experience so far has been pretty much on par with everyone else who has similar hardware.

I'm running with a 3570K @4.2, GTX 970, 16GB RAM, and the game is on an SSD. Pretty much locked 60fps at 1080p.

I've got Foliage, Shadows, and Background Characters all set to High, everything else is on Ultra. No hairworks, and I'm using FXAA in the NVCP which definitely looks better and costs less than in game AA.

I got 3 crashes in a little over 3 hours of play time. I'm like 99% certain it's not due to my overclocks on either my CPU or GPU.

On a semi-related note... I had been noticing that a lot of my games lately (DA:I, FC4, GTA, TW3), even when "locked" at 60fps, would very briefly "hitch" to about 56fps for less than a second and then jump right back up to 60. I had been completely stumped as to what the hell was causing it, until last night when I stumbled upon someone on Tom's Hardware a few months back who had the same issue and was using a DisplayPort cable to connect his monitor to his GPU, just like I was. He suggested that changing over to an HDMI cord solved his problems, so I tried it... and voilà. It's definitely better now. I'm not sure whether my cable was faulty, or if DisplayPort simply isn't the way to go for whatever reason. So if anyone is having a similar issue and is using DisplayPort... try it out.

Man, can you screenshot your Nvidia Control Panel settings for the game? You seem to have a very similar rig to mine, bar the additional 8GB of RAM and a more powerful CPU, and your performance sounds like a dream compared to what I'm getting. Is FXAA all you forced on there?
 

Barzul

Member
You should be getting 50 ish minimum on a 970 with no hairworks. With those settings, 60+ easy. You tried any other games to test for problems with your system?

I haven't really tested with other games yet, but I will do so when I get home. It's just been driving me crazy.
 

UnrealEck

Member
Couldn't it also point to this game perhaps using more tessellation and compute than we know of?

Heck, the entire post-processing chain could be a compute shader on top of the Forward+ compute.

It could be that, and I can't say it's not, since I'm nowhere near technically adept enough to say.
From what I can tell though, the difference between Maxwell and Kepler in games like Ryse isn't similar to that of Witcher 3 (e.g. the 780 Ti being bested only slightly by the 980).
As for tessellation, I'm not sure Witcher 3 even uses much if any tessellation outside of Hairworks. I haven't found anything that suggests Maxwell is better at tessellation than Kepler either, but I have seen people citing it as the reason for the performance differences between the architectures in Witcher 3.

I know that Nvidia said they're 'investigating' things anyway. They wrote the drivers; I really don't think investigating is needed if Kepler is genuinely underperforming. I think honesty is what's needed.
With that said, some benchmarks are showing Kepler is almost as good as the Maxwell cards when looking at the typical equivalents (e.g. 780 Ti versus 970, or 960 versus 770).
 

Gumbie

Member
I've come to realize those benchmarks are bullshit. I am NOT getting close to 60fps on a 280X. It hovers between 30 and 40 the vast majority of the time.

Someone posted the Ultra chart a little bit further down. It shows the 280x at 40.
 
I can leave hairworks on and keep 60 fps on my 970 thanks to that MSAA tweak. I turned it down to 2x. It doesn't look as nice, but it's worth it. Roach gets to keep his flowing mane, I get to play with a smooth framerate. All is well.

I can't wait to try this out later on! I have a 970 as well, so it will be very interesting to see how much it reduces the cost of Hairworks.
 