
Witcher 3 PC Performance Thread

It's glorious. It's the amount of detail that makes it look so great with the clarity of 60fps. It's one of the games that truly benefits from it visually.

People who are locking at 30 to increase visual effects are losing out imo.



Giving an opinion that it looks bad is valid dude, nothing wrong with it. They're not just being negative for the sake of it.

I think vanilla looks better too. Every SweetFX mod and visual edit so far has sucked the life out of the visuals, imo.



No problem. Until you get into increasing voltage you don't have to worry; if you get stuck at all, feel free to PM me.



What?

The original person doesn't have a monopoly on this, if someone beat them to the punch that's no reason to boycott or shame them.

It's just another world.

I also tried to play it with 30fps locked & downsampling (3840x1620->2560x1080). The game looked better and sharper for sure, but it was way less fun that way.
Thanks for the impressions!
 

leng jai

Member
I tried playing at 30fps @ 1620p and unfortunately could not stomach how much it exacerbated all the game's movement issues. Trying to make precise movements with Geralt/Roach was bad at 60fps and just downright horrendous at 30. The game just feels infinitely worse and the IQ gains are not worth it (although the trees look almost photorealistic down sampled).

I'm guessing there's nothing we can do to fix the NPC pop in? So jarring.
 
I tried playing at 30fps @ 1620p and unfortunately could not stomach how much it exacerbated all the game's movement issues. Trying to make precise movements with Geralt/Roach was bad at 60fps and just downright horrendous at 30. The game just feels infinitely worse and the IQ gains are not worth it (although the trees look almost photorealistic down sampled).

I'm guessing there's nothing we can do to fix the NPC pop in? So jarring.

I'm guessing you've never actually seen a tree in real life.
 
I was wondering: how can you tell if a game is using the desktop color settings as the standard? When that's the case, even in Full Screen the game should be using the desktop's ICC profile, right?

I'm asking because I'm using an ICC profile, and when I switch between Borderless Window and Full Screen mode I don't see any difference at all. So I just assumed the game was applying my ICC profile whatever rendering mode I chose. I tried both TW3 and GTA V but never saw a difference.

The reason I want Full Screen is that I want my G-sync. But I also want games to make use of my ICC profile, which I got from tftcentral, btw.

I applied my ICC profile with help from this page: https://pcmonitors.info/articles/using-icc-profiles-in-windows/

They also talk a lot about ICC profiles and how games work with them.

Here is a thread that can explain it much better than I can. Basically, it varies from game to game whether or not a custom ICC profile will/can be used in fullscreen mode. They will always work in borderless windowed though. I can tell you that in the case of TW3, a custom ICC profile isn't used by default. There are some programs mentioned in the thread that might help with that, though.
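
For the curious, here's a rough sketch (not any particular tool, just the general idea) of what those profile-forcing programs tend to do: snapshot the desktop's calibrated gamma ramp while it looks correct, then keep pushing it back so a fullscreen game can't leave it reset. It assumes your profile's correction is loaded into the GPU gamma ramp, and it's Windows-only Python:

# Sketch: capture the calibrated desktop gamma ramp and keep re-applying it.
# Assumes the ICC profile's correction lives in the GPU gamma ramp (Windows only).
import ctypes
import time

gdi32 = ctypes.windll.gdi32
user32 = ctypes.windll.user32
user32.GetDC.restype = ctypes.c_void_p
user32.GetDC.argtypes = [ctypes.c_void_p]
user32.ReleaseDC.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.GetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]

Ramp = ctypes.c_ushort * (3 * 256)      # R, G and B ramps, 256 entries each

def snapshot_ramp():
    ramp = Ramp()
    dc = user32.GetDC(None)             # device context for the whole desktop
    gdi32.GetDeviceGammaRamp(dc, ctypes.byref(ramp))
    user32.ReleaseDC(None, dc)
    return ramp

def apply_ramp(ramp):
    dc = user32.GetDC(None)
    gdi32.SetDeviceGammaRamp(dc, ctypes.byref(ramp))
    user32.ReleaseDC(None, dc)

if __name__ == "__main__":
    calibrated = snapshot_ramp()        # run this while the desktop still looks right
    while True:
        time.sleep(5)
        apply_ramp(calibrated)          # force the calibrated ramp back every few seconds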

As long as performance allows, I will always use borderless windowed just so I can use my color profile (there are other benefits as well). The difference is truly there, and especially with a game like TW3, where such a rich color palette is used, it's worth it to at least try and make it work.

On another note, did the latest Nvidia drivers allow G-sync to work in windowed mode? I thought I read that in the notes. I don't have a G-sync monitor (yet!!!), so I can't be much help there.
 

mm04

Member
GTX 970 SLI, everything maxed out (HairWorks disabled), with SweetFX. Pretty solid framerate, around 60fps.

Is this on a widescreen monitor? What resolution are you playing at? The reason I ask is that I can set everything to max with AA and HairWorks on and stay over 60fps constantly with my 970 SLI setup. That said, I play at 1440p and turn foliage, shadows and grass down to High with AA off, keep everything else at Ultra with HairWorks on, and stay over 60fps consistently.
 

jorimt

Member
Does anyone know the setting in the renderings.ini file that will make the textures stream a lot further away from camera for the characters? I'm getting this annoying pop in of detailed textures when I'm in the city. It happens on the characters only. Or is this a bug?

Using a Titan X, so everyone else must be seeing it too.

This was driving me insane as well when I first noticed it. Fortunately, this is about the only draw-distance issue that can be easily fixed.

In your Documents/The Witcher 3/user.settings file, locate these two lines in the "[Rendering]" section:

TextureStreamingHeadsDistanceLimit
TextureStreamingCharacterDistanceLimit

I set them to "40000" to match the default "TextureStreamingDistanceLimit" value. Of course, I have no idea what upper limit these settings have, but better safe than sorry.

Save the file (no need to set it read-only), launch the game, and no more faces/clothing textures popping in 5 feet from your character.
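
If you'd rather script the change than hand-edit, something like this throwaway Python snippet should also do it; it assumes user.settings is a plain Key=Value text file and that these keys only appear once (make a backup first, and adjust the path/encoding if your setup differs):

# Sketch: raise the two character texture-streaming limits in user.settings.
# Assumes plain "Key=Value" lines; back the file up before running.
import os
import re

path = os.path.expanduser("~/Documents/The Witcher 3/user.settings")
keys = ("TextureStreamingHeadsDistanceLimit", "TextureStreamingCharacterDistanceLimit")

with open(path, "r", encoding="utf-8") as f:
    text = f.read()

for key in keys:
    # Replace whatever value currently follows the key with 40000.
    # If a key is missing nothing changes; add the line by hand in that case.
    text = re.sub(rf"^{key}=.*$", f"{key}=40000", text, flags=re.MULTILINE)

with open(path, "w", encoding="utf-8") as f:
    f.write(text)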

Screenshot comparison below for proof:
http://screenshotcomparison.com/comparison/130652

This also prevents Geralt's face from turning into a blob every time you load a save. Take note, however, that while this fixes texture pop-in on characters, it does not stop some NPC beards or hair from appearing/disappearing at a certain distance (a different mesh system must be controlling this), and thus it does nothing for the cutscene pop-in either.

As far as I can tell, upping these settings has zero performance impact and zero side effects, which makes it seem even more needless for the NPC textures to pop in so close to your character. Some of the LOD distance limits in this game are baffling.
 

Diablos

Member
The game on PS4 runs a mix of low, medium, and high settings, perhaps with one very high setting in there (post-processing value-wise). It generally looks similar at close to medium distances.

A 660 would give a similar-to-PS4 experience from what we know about the game's performance; in some areas it could even be noticeably better (the card is on average better than a 750 Ti, which with an overclock offers on-par performance or better).

You'll be able to tweak the visuals to hit a solid 30fps.

Not sure exactly how it will compare settings-wise to the PS4, but it won't be far off either way.

Will be a much smoother gameplay experience though. Plus mods!
Good to know. And when playing on my TV (720p) I'd imagine I can increase the visuals even more. Danke.
 

parabolee

Member
Okay, so I'm thinking this is the best place to ask:

I played Witcher 3 at my friend's place on his PS4 and was loving it. I was satisfied with the visual quality; what exactly does that translate to in PC settings? Has this been figured out yet?

I have an FX-6300 and a GTX 660. I wouldn't go above 1080p. If it matches or barely exceeds PS4 quality I'd be happy (even if locked at 30fps).

Thanks.

According to Digital Foundry a GTX 750ti will match the PS4 version for settings. So a 660 will have no problems.

This is a very handy guide for you (including PS4 equivalent settings) -

What does it take to run The Witcher 3 at 1080p60?

http://www.eurogamer.net/articles/digitalfoundry-2015-the-best-pc-hardware-for-the-witcher-3

Quote -

Consoles are not so different from PCs these days - in fact, the core processor in Xbox One and PS4 is based on PC technology. We start our PC analyses by comparing the effects of each PC quality preset to the console equivalent. The developers know best when it comes to getting the best bang for their buck from a resource-constrained piece of hardware and that experience should translate well across to PC tech too. We can then use these settings as the base for further experimentation, scaling up on more powerful hardware.

Console-equivalent settings also gives us an excellent basis for settings selection on entry-level enthusiast graphics cards like the Radeon R7 260X and the GeForce GTX 750 Ti.

Resolution: 1920x1080
Nvidia HairWorks: Off
Number of Background Characters: Low
Shadow Quality: Medium
Terrain Quality: Medium
Water Quality: High
Grass Density: Medium
Texture Quality: High (all GPUs tested here support ultra though so choose that)
Foliage Visibility Range: Medium
Detail Level: Medium
Ambient Occlusion: SSAO
All post-process effects on, except vignetting
 

Diablos

Member
According to Digital Foundry a GTX 750ti will match the PS4 version for settings. So a 660 will have no problems.

This is a very handy guide for you (including PS4 equivalent settings) -

What does it take to run The Witcher 3 at 1080p60?

http://www.eurogamer.net/articles/digitalfoundry-2015-the-best-pc-hardware-for-the-witcher-3

Quote -

Consoles are not so different from PCs these days - in fact, the core processor in Xbox One and PS4 is based on PC technology. We start our PC analyses by comparing the effects of each PC quality preset to the console equivalent. The developers know best when it comes to getting the best bang for their buck from a resource-constrained piece of hardware and that experience should translate well across to PC tech too. We can then use these settings as the base for further experimentation, scaling up on more powerful hardware.

Console-equivalent settings also gives us an excellent basis for settings selection on entry-level enthusiast graphics cards like the Radeon R7 260X and the GeForce GTX 750 Ti.

Resolution: 1920x1080
Nvidia HairWorks: Off
Number of Background Characters: Low
Shadow Quality: Medium
Terrain Quality: Medium
Water Quality: High
Grass Density: Medium
Texture Quality: High (all GPUs tested here support ultra though so choose that)
Foliage Visibility Range: Medium
Detail Level: Medium
Ambient Occlusion: SSAO
All post-process effects on, except vignetting
Thanks again.
 

Dries

Member
Here is a thread that can explain it much better than I can. Basically, it varies from game to game whether or not a custom ICC profile will/can be used in fullscreen mode. They will always work in borderless windowed though. I can tell you that in the case of TW3, a custom ICC profile isn't used by default. There are some programs mentioned in the thread that might help with that, though.

As long as performance allows, I will always use borderless windowed just so I can use my color profile (there are other benefits as well). The difference is truly there, and especially with a game like TW3, where such a rich color palette is used, it's worth it to at least try and make it work.

On another note, did the latest Nvidia drivers allow G-sync to work in windowed mode? I thought I read that in the notes. I don't have a G-sync monitor (yet!!!), so I can't be much help there.

This is very helpful, thank you. So it seems that Borderless Windowed Mode is the way to go then, especially now that it supports G-sync. You indeed read that correctly in the driver notes.

So Borderless Windowed Mode gives me both G-sync and my ICC profile. Shame I didn't find this out until now.

Just to be sure also: All I need to do is launch the game in borderless window, right? I'm asking because they said:

Windowed mode so far has a 100% success rate. For games that reset the color profile, using applications to force the color profile works.

Will TW3 reset my color profile?
 

Easy_D

never left the stone age
This is very helpful, thank you. So it seems that Borderless Windowed Mode is the way to go then, especially now that it supports G-sync. You indeed read that correctly in the driver notes.

So Borderless Windowed Mode gives me both G-sync and my ICC profile. Shame I didn't find this out until now.

Just to be sure also: All I need to do is launch the game in borderless window, right? I'm asking because they said:



Will TW3 reset my color profile?


I've run it in Borderless and Fullscreen (because the game flat out runs better in fullscreen) and it hasn't messed with my colours at all. I use MagicTune on my Samsung monitor.

Edit: NVM. Misunderstood your post completely, thought you were talking about games messing with your desktop colours after quitting them.
 

Faith

Member
Is this on a widescreen monitor? What resolution are you playing at? The reason I ask is that I can set everything to max with AA and HairWorks on and stay over 60fps constantly with my 970 SLI setup. That said, I play at 1440p and turn foliage, shadows and grass down to High with AA off, keep everything else at Ultra with HairWorks on, and stay over 60fps consistently.
Yeah it's a 29" 21:9 2560x1080 display :)
 
This is very helpful, thank you. So it seems that Borderless Windowed Mode is the way to go then, especially now that it supports G-sync. You indeed read that correctly in the driver notes.

So Borderless Windowed Mode gives me both G-sync and my ICC profile. Shame I didn't find this out until now.

Just to be sure also: All I need to do is launch the game in borderless window, right? I'm asking because they said:



Will TW3 reset my color profile?

Nope! Just launch in windowed mode (via the in-game setting) and your custom ICC profile will be the one used by default!

I've run it in Borderless and Fullscreen (because the game flat out runs better in fullscreen)

Eh, I think it varies from person to person. I have a much smoother experience and better performance with windowed borderless.
 

Easy_D

never left the stone age
Nope! Just launch in windowed mode (via the in-game setting) and your custom ICC profile will be the one used by default!



Eh, I think it varies from person to person. I have a much smoother experience and better performance with windowed borderless.

Yeah, it depends on the game too; Source games actually run better in Borderless for me.
 

Anteater

Member
I tried playing at 30fps @ 1620p and unfortunately could not stomach how much it exacerbated all the game's movement issues. Trying to make precise movements with Geralt/Roach was bad at 60fps and just downright horrendous at 30. The game just feels infinitely worse and the IQ gains are not worth it (although the trees look almost photorealistic down sampled).

I'm guessing there's nothing we can do to fix the NPC pop in? So jarring.

I hate this post, I really hate how the game feels at 30fps but I couldn't get it to run at 60 no matter what with my card :p not even 50 at 1920x800 with everything low, so I'm going to have to live with my 30 lock. Damn you all.
 

daninthemix

Member
I hate this post, I really hate how the game feels at 30fps but I couldn't get it to run at 60 no matter what with my card :p not even 50 at 1920x800, so I'm going to have to live with my 30 lock. Damn you all.

It's just opinions, man. I've run it at 60 and 30 and can tell you that with the two blur options enabled the game is very tolerable at 30.

Now Far Cry 4 - that is a game I cannot play at 30.
 

Anteater

Member
It's just opinions, man. I've run it at 60 and 30 and can tell you that with the two blur options enabled the game is very tolerable at 30.

Now Far Cry 4 - that is a game I cannot play at 30.

I'll get used to it :p I'm that dude that played hundreds of hours of dragon's dogma/skyrim/bayonetta on the ps3 and enjoyed it. Just kind of kidding around since that post made me go and tweak my settings again lol.
 
It's just opinions, man. I've run it at 60 and 30 and can tell you that with the two blur options enabled the game is very tolerable at 30.

Now Far Cry 4 - that is a game I cannot play at 30.

That applies to the entire genre of first person shooters for me. The closer perspective to the game world combined with the need for pinpoint accuracy makes 30 fps simply unacceptable to me for those.
 

daninthemix

Member
I'll get used to it :p I'm that dude that played hundreds of hours of dragon's dogma/skyrim/bayonetta on the ps3 and enjoyed it. Just kind of kidding around since that post made me go and tweak my settings again lol.

lol, that's the thing - PC performance threads will, by and large, always be full of people who are never happy. Everyone has to find their own compromise. Even guys like smokey who have 2 x Titan Xs.

p.s. hey smokey - don't you wish you were running 120fps @ 8k max settings??

Just kidding ;o)

That applies to the entire genre of first person shooters for me. The closer perspective to the game world combined with the need for pinpoint accuracy makes 30 fps simply unacceptable to me for those.

Agreed, although it's weird because I've played and enjoyed plenty of console FPSs in my time.
 
Just arrived in Novigrad and ran all over it looking for Gwent opponents. The performance seems good, even during rain storms. I am incredibly happy with my newly discovered fashion:

[screenshot of Geralt's new outfit]


Geralt knows he looks good.

http://fat.gfycat.com/CourteousFrayedJay.webm
 
This was driving me insane as well when I first noticed it. Fortunately, this is about the only draw-distance issue that can be easily fixed.

In your Documents/The Witcher 3/user.settings file, locate these two lines in the "[Rendering]" section:

TextureStreamingHeadsDistanceLimit
TextureStreamingCharacterDistanceLimit

I set them to "40000" to match the default "TextureStreamingDistanceLimit" value. Of course, I have no idea what upper limit these settings have, but better safe than sorry.

Save the file (no need to set it read-only), launch the game, and no more faces/clothing textures popping in 5 feet from your character.

Screenshot comparison below for proof:
http://screenshotcomparison.com/comparison/130652

This also prevents Geralt's face from turning into a blob every time you load a save. Take note, however, that while this fixes texture pop-in on characters, it does not stop some NPC beards or hair from appearing/disappearing at a certain distance (a different mesh system must be controlling this), and thus it does nothing for the cutscene pop-in either.

As far as I can tell, upping these settings has zero performance impact and zero side effects, which makes it seem even more needless for the NPC textures to pop in so close to your character. Some of the LOD distance limits in this game are baffling.

Excellent!! Thank you so much!
 

Methos#1975

Member
Ahhhhh, my performance has gone to hell today. I have been getting a consistent 30fps with no drops since launch, but today I can barely average 24 for some reason. No real changes, though I did download some Windows updates this morning. Any chance they mucked something up?
 

ramshot

Member
So now that 1.05 has been out for a while, and I haven't figured out what the problem is, or if there even is a problem, it might be a good time to ask...

Did anyone else experience a performance drop with 1.05, and/or with the Nvidia driver update that came out around the same time? I'm not entirely sure, because I took a break from the game just before the patch came out, but I'm pretty positive I was getting better performance with 1.04. For instance, in Novigrad I had a solid lock on 60. Now it's floating anywhere between 50 and 60. I've checked that all my settings are as they were.

3570k @ 4.2
16 gigs DDR3/2133
GTX980 1500/7600
Win 8.1
 

Easy_D

never left the stone age
It's just opinions, man. I've run it at 60 and 30 and can tell you that with the two blur options enabled the game is very tolerable at 30.

Now Far Cry 4 - that is a game I cannot play at 30.

It's funny that (for me at least) no game has managed to feel as good at 30 FPS as the original Crysis; must be thanks to the blur effects they used.
 

Grechy34

Member
I just upgraded from a 980 to a 980 Ti, and the 20-30% boost has allowed me to turn on HairWorks and play at a decent FPS, though the GameWorks features overall run like absolute horse shit. Without HairWorks it's easily pulling 75-80 FPS.
 

Qassim

Member
I'm using that mod now and I am satisfied. I keep nice, consistent frame-times pretty much all of the time, even on Roach with a griffin trophy. I purposely went looking for a pack of wolves to try to get my frame rate to dip, and it dipped into the 50s, which with G-Sync is acceptable.

So I'm happy, I think I found near a worst-case scenario as far as hairworks goes and was happy with the performance. I think I'll be keeping this on!
 
Just bought a GTX 970 4GB, ditching my ol' 580.

Will arrive in my box tomorrow or Friday.

Won't upgrade my mobo and CPU until next year though...

So: i7 2600K, 8GB RAM and a GTX 970.

Will I run this sucker at max settings at around 60fps? (HairWorks off.)
 
Just bought a GTX 970 4GB, ditching my ol' 580.

Will arrive in my box tomorrow or Friday.

Won't upgrade my mobo and CPU until next year though...

So: i7 2600K, 8GB RAM and a GTX 970.

Will I run this sucker at max settings at around 60fps? (HairWorks off.)

I have a 4690K, 8GB of RAM and a 970. I'm at 1080p, 60fps, with everything on Ultra except for shadows and foliage distance. It runs super smooth, except in Novigrad it drops to 55 in some areas of the city, which looks weird, but it's honestly not bad since it's locked at 60 the rest of the time.
 

Zakalwe

Banned
Just bought a GTX 970 4GB, ditching my ol' 580.

Will arrive in my box tomorrow or Friday.

Won't upgrade my mobo and CPU until next year though...

So: i7 2600K, 8GB RAM and a GTX 970.

Will I run this sucker at max settings at around 60fps? (HairWorks off.)

Yep.

I have:

i5 2500k@4.3
MSI 970 @ 1473/8010

1080p, Vsync on, Fullscreen, limited in RivaTuner to 60, all PP on, everything Ultra except shadows, foliage and grass on High. HairWorks off.

60 almost everywhere with drops to 50 in Novigrad and some other towns.
 
Ahhhhh, my performance has gone to hell today. I have been getting a consistent 30fps with no drops since launch, but today I can barely average 24 for some reason. No real changes, though I did download some Windows updates this morning. Any chance they mucked something up?

Is vsync on? Sounds like it might be locking to 24fps. Fairly common problem with TV screens. Try turning off vsync & see if that helps.
 

ss_lemonade

Member
That applies to the entire genre of first person shooters for me. The closer perspective to the game world combined with the need for pinpoint accuracy makes 30 fps simply unacceptable to me for those.

The original Crysis for some reason was playable to me at low framerates, but that might just have been me back then accepting what my 8800 was capable of in that game.
 

Shane86

Member
Really wish they'd fix the freezing. I'd buy a new card, but it's happening to people with 970s too.
After some further investigation, I tried underclocking my 660 Ti. I thought it was nonsense, but I've been playing for 4 hours without a freeze and was even able to crank most settings up to High. Dropped the memory and core clocks by 40; hope this helps someone else.
 

CHC

Member
Hair pop-in seems to continue getting worse for me. Half the NPCs are bald until they start speaking, which magically conjures hair on their heads.
 

didamangi

Member
ROTFL, just a few posts above mine. :D
Thanks for the feedback, I'm going to install it now!

Reading the comments on the mod's page, it looks like if you keep the beard the FPS hit will still be high. I tested it without the hair and beard, with 4x HairWorks AA, and got around 60fps riding around with Roach and the griffin's head outside of Novigrad. (3570K + 970)

Too bad the other trophies don't have HairWorks :/

Not finding wolves or fiends atm, but I'd guess it'll still drop somewhat when fighting 5 wolves or something.

Hopefully CDPR will put this in the options menu in the future.
 

FLAguy954

Junior Member
I have gotten two hard crashes in the last week where the game would just crash to a red screen and I would have to restart my computer completely. Anyone else with this issue?
 