Could anyone recommend good TextureMemoryBudget= value when running game on GTX980Ti? I tried googling about setting, but couldn't find any reliable information on how VRAM usage scales as that value is increased.
Anyone else have problems with textures (and especially people's hair) loading in after camera transitions in cutscenes? Is there a way to fix that? I'm running the game on RAID 0 SSDs, so I don't think it's my storage. This has been happening since day one and I'd like to fix it for my second playthrough.
It would be great to be able to tweak the game's VRAM usage so it uses as much as possible, to prevent texture streaming on characters and elsewhere. I have 12 GB and wouldn't mind if the game used as much as it could (currently 1080p uses only a bit more than a quarter of that at best).
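For anyone wanting to experiment: the streaming budget lives in `user.settings` (in the Documents\The Witcher 3 folder). A rough sketch of what the tweak looks like; the value below is just an illustrative guess, not a recommendation, since nobody seems to have documented how VRAM usage scales with it:

```ini
[Rendering]
; Texture streaming pool, in MB. The shipped default is much lower;
; raise it in steps and watch actual VRAM usage with a monitoring
; tool, since the scaling behaviour isn't documented anywhere.
TextureMemoryBudget=2400
```

Back up `user.settings` before editing, since a bad value can be reset by simply restoring the file.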
So I added my 3rd 980Ti mainly for better 4k performance in the game and now it just keeps crashing to the desktop left and right (and blanking the screen). With 2 cards it was rock solid. I've benchmarked the 3 cards and they seem to be fine so that's not it. Is it the game itself that's just unstable? It's pretty much unplayable for me now. 😢
The hair and beard pop-in during cutscenes is NOT a texture streaming issue. It's actually an LoD issue, and there is a mod on the Nexus which fixes it already.
Surely it's not a LoD issue in a cutscene where Geralt is sitting directly opposite whoever he's talking to?
Try the mod and see
I haven't noticed any FPS issues, and haven't heard of any either.
Would destroy my framerate though; I'm not running a beast rig. I have an FX 8350 and a GTX 660 Ti with 8GB RAM and have to run the game at 30 FPS.
Welp. The latest patch kinda messed with my performance a bit, now averaging 40-45 FPS when in the overworld, used to be 55-60 with the same settings. Unless it's the Battlefront beta driver, AMD 280X here. That said, I managed to run it at 40 FPS in 1440p somehow, looked pretty hot. Almost wondered if it was my FX6300 bottlenecking and that I was just misremembering, but nope, with a Titan X, even the FX6300 can hold 55-60 FPS in Novigrad on Ultra per Digital Foundry's testing.
Maybe I should just roll back to 15.7, those ran Witcher 3 no problem, not surprising a beta driver launched for a beta game would be slightly worse for other games.
Do other games work? Are the cards overclocked?
No, the cards aren't overclocked, at least not when I play The Witcher 3 (not that that seems to make a difference, I get CTDs either way). I even upped the voltage to 60% on Afterburner and that didn't help. I overclocked it while benchmarking and it worked fine. Ran Heaven for quite a while.
One of my cards maxed out at 91°C, but it did that when I was running with 2 cards too; that's to be expected, especially in graphics-intensive games/benchmarks.
I just purchased 3DMark, I'll see how Firestrike runs in 4k. I'll also test a few other games. Any suggestions?
Thank you very much! This is what I was waiting for.
Which 980 Ti cards do you have? Maybe not enough power. What PSU?
What settings are you using? I have a 4690k/stock 7950 3gb/oc and I average 40-45 fps with High preset/Ultra texture. Weird because on low I still get similar performance.
Wow! I found it =D It's a setting named ShadowDistanceScale, which the tweak guide claims has a "minimal increase in image quality" lol
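In case anyone else goes hunting for it, the tweak is another `user.settings` edit. A sketch only; the value here is illustrative and the FPS cost is whatever your own testing shows, not what the guide claims:

```ini
[Rendering]
; Scales how far out shadows are drawn. 1 is the default;
; higher values push the shadow draw distance out, with a
; bigger FPS cost than the tweak guide's "minimal" wording suggests.
ShadowDistanceScale=2
```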
The 280X is essentially a 7970 GHz Edition, so I should be pulling more frames. Your CPU is a lot better than mine for sure, but a stock FX6300 shouldn't bottleneck; hell, in the DF video the bottlenecks only show in Novigrad, where you get dips to the 40s.
Hairworks: Off
Background Characters: High (apparently this setting does nothing?)
Shadow Quality: Medium
Terrain Quality: Ultra (minuscule fps difference, maybe 1 or 2 frames all in all)
Water Quality: High
Grass Density: Ultra (no difference fps wise between high/ultra)
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Ultra (No difference that I've noticed between High/Ultra setting fps wise)
All post process effects on except CA and Vignetting and AO.
With these settings I used to average 55 FPS when in the open world, even in Velen swamps. I'd dip to 45ish if I had HBAO+ on.
What drivers are you currently using? Thinking it might be the Battlefront beta drivers after all.
Edit: If I put it on Low I get around 75ish FPS
Antec High Current Pro 1300W PSU with EVGA 980 Ti ACX 2.0+ cards. They upgraded me from the High Current Pro 1200W, and that might be the problem there. The 1300W could be shit, although I just ran Firestrike Ultra in a loop for quite a while with all 3 cards overclocked and it ran fine. I don't know.
Then the first time I ran the Mordor benchmark overclocked it locked up; I had to run at stock clocks.
I don't want to upgrade the PSU if I don't have to, that would be another $200+ investment in my machine which I don't have right now.
You'd kinda think Ultra texture quality would cache them fast enough to prevent pop-in during cutscenes. The game doesn't top out my 3GB either, so it'd be cool. Witcher 2 had the same problem though, only during cutscenes.
Has this been fixed? The animation loop issue where it appeared as if the renderer skipped a frame. Thanks.
Hey guys. It took me ages to narrow down this problem, but whenever I play The Witcher 3 and move Geralt, wall/ground textures subtly flicker. It's noticeable but not completely in your face. I've tried disabling all sorts of graphical options one at a time and changing stuff in the Nvidia control panel, but none of them seemed to fix it.
Then I decided to lower the resolution to 1440p and it went away. Same with 1080p.
It only shows up in 4K, and The Witcher 3 is the only game that does it. Any ideas? This didn't occur with my crossfired R9 290s; I had some slight crossfire flickering with them, but it was different.
Driver bug. Update your driver or wait for AMD to fix it.
Sorry, I have an Nvidia 980 Ti at the moment! And it has persisted through two driver updates. There's another one out today I may try, but I heard people have issues with it.
Oh, sorry. I saw many AMD owners with the exact same issue, so I assumed it was AMD-specific.
My mistake.
I have no idea what could cause this aside from drivers.
No worries! I'll try to figure it out. It's driving me nuts and really breaking the immersion.
Doesn't make sense that it only happens at 4K, and only in one game, either.
If I use Borderless Window mode, do I need to turn vsync off or still keep it on?
I know Borderless Window enables Triple Buffering, but I'm not sure if it does the same thing as vsync too.
You can turn in-game VSync off.
However, in TW3 some users have reported that doing so significantly reduces performance and causes stutter. The same thing happens to me too.
What I did was keep in-game VSync enabled and use borderless window. No more frame drops or stutter.
VSync: Off
Maximum Frames Per Second: Unlimited
Resolution: 1680x1050
Display Mode: Full Screen
NVIDIA HairWorks: Off
Number of Background Characters: Ultra
Shadow Quality: Medium
Terrain Quality: Ultra
Water Quality: High
Grass Density: Ultra
Texture Quality: Ultra
Foliage Visibility Range: High
Detail Level: Ultra
Hardware Cursor: On
Motion Blur: On
Blur: On
Anti-aliasing: On
Bloom: On
Sharpening: Low
Ambient Occlusion: HBAO+
Depth of Field: On
Chromatic Aberration: Off
Vignetting: On
Light Shafts: On
What's everyone's opinion of vignetting? Should I turn it on or off? What other settings should I adjust for optimal performance?
Guys, any idea what the difference between "hairworks high" and "hairworks low" is?
I have yet to really examine it up close. The MSAA samples for HairWorks and the option to apply it to Geralt or everything are obvious though.
Hairworks Low makes the hair less smooth and more segmented and also reduces the thickness of the hair.
High vs Low
High vs Low
How's Witcher 3 on Windows 10? Anyone have any experiences upgrading and comparing?
Depending upon where you are coming from, Win 10 should improve CPU-related performance in all DX11 games, as far as I have come to understand it.
I saw no performance regression upgrading to win 10.
Good to know, thanks. I am coming from 8.1. Did you have to reinstall the game after upgrading?
I believe not, but I did it anyway and just made sure my saves were safe (a good idea).
Same problem; the new Xbox integration thing is what did it for me.
Hopefully this should resolve the problem:
- Open your browser, hit Windows key+G
- A small window should appear, check "yes, this is a game"
- Open the settings button at the right end of the window
- Uncheck "open game bar using X on a controller" (to remove guide button conflicts) and uncheck "remember this as a game" (so your browser isn't considered a game should you want to keep using this new feature)
I've decided to play through to completion after playing about 40 hours just after launch. I can't remember much of what was happening, so I've decided to start a new game. Does anyone know if it's possible to re-bind quicksave to a gamepad button? It gets annoying reaching over to my keyboard to press F5 all the time; I'd like to bind it to the 'guide' button on the 360 gamepad. The in-game menu doesn't let me, but is it possible in a config file, or does anyone know of an external program that lets you bind keys to the guide button?
EDIT: Got a program called antimicro which lets you bind keys to gamepad buttons. The issue I'm having now is that the guide button doesn't appear to be working. =/ Completely unrecognized by Windows (10). Using a wireless 360 pad, anyone have any ideas?
EDIT 2: Got it working with help from this post on Reddit: