tioslash
IDK if this was posted, but a user found a GameWorks-related bug that is causing the terrible Kepler performance. Hope it helps anyone with Kepler.
https://www.youtube.com/watch?v=XnWkSFqo5A4
Well, that actually works.
After some crashes the first couple of days, the game has run like a dream on my main PC. I don't know if it was switching to unlimited frame rate, full-screen instead of windowed borderless, or the game updates, but I haven't had a crash in a week.
...Until I tried running on my secondary PC. Not that it's a big deal or anything as there's always the main PC to go back to, but sometimes the missus and I like to just kick back and play a game in the bedroom. Steam In-Home streaming is an option for this, but despite a wired connection, I'm not overly enamored with the image quality. So then I tried just playing on the machine itself:
1.) AMD Phenom II X4 940
2.) 8 GB Ram
3.) GeForce GTX 770
Honestly, it runs well enough on low/medium settings. And my wife was able to play for about an hour the other night with no incident. However, since then I can't seem to play more than 10 minutes without a crash that necessitates closing from Task Manager. I know the card has a factory overclock. I tried lowering the clock speed through Afterburner, but maybe I'm just not enough of a power user to know what I'm doing, as I never mess with overclock settings.
Anybody been able to fix constant crashing? Like I said, I'm not all that concerned about it as the game runs great on my main machine. But I just liked the flexibility of being able to play in different rooms without having to deal with In-Home Streaming.
Help. My sprint key has stopped working on Roach. Tried rebinding and swapping to toggle. I've reached a part of the game where I need my horse and I'm screwed — won't even canter! Geralt is fine, just horses. Any help greatly appreciated.
On PC fwiw. Anyone seen this bug?
This is interesting if actually true.
If the game doesn't crash when you simply delete the DLL, then that means one of two things:
- It didn't actually use it, in which case it also can't have had a performance impact.
- It uses a copy of the DLL in a system folder (installed by a driver) instead.
In case 2, it could be that the system DLL is a different version that shows better performance in TW3.
Still strange that it would make such a big difference; my first instinct with something like this is always user error!
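The second theory is testable: the loader searches the exe's folder before system paths, so a deleted local DLL can silently fall back to a driver-installed copy. A small illustrative sketch — ctypes follows the platform's normal library search order, so it tells you whether a given name resolves at all from wherever you run it:

```python
# Illustrates DLL/shared-library resolution: ctypes.CDLL uses the
# platform's standard search order, so a True result means *some* copy
# of the library was found on the search path.
import ctypes

def dll_resolves(dll_name: str) -> bool:
    """True if the loader can find and load dll_name from its search path."""
    try:
        ctypes.CDLL(dll_name)
        return True
    except OSError:
        return False

# A name that exists nowhere on the search path fails to load.
print(dll_resolves("no_such_library_abc123.dll"))  # -> False
```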
Thinking about it a bit and with minor research, I'm wondering if inadequate power may be the issue. I bought a 970 about a month ago to replace a 770, and put that 770 into my old secondary PC to replace a 560 Ti. Well, the old machine only has a 500W power supply. I didn't even think about that. Before I replace the PSU, does anyone think I'm barking up the right tree in eyeing that as the culprit?
For reference, here was my last post:
Might be anecdotal, but any time I have Afterburner open, even at stock OC (EVGA's OC), the game will crash. When I close Afterburner the game doesn't crash (outside of a memory leak once). I'm using an EVGA 970, latest drivers, everything on ultra except shadows and foliage (both on high), hair off, full screen, frames unlimited. No INI tweaks.
My 3570K is OC'd to 4.2GHz, for what that's worth. 8GB RAM.
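On the PSU question above, a back-of-envelope headroom check is easy to do. The TDP figures below are approximate public spec numbers, not measurements from that machine, and a factory-OC'd card can pull more than its rated TDP:

```python
# Rough PSU headroom estimate for the secondary PC described above.
# Wattages are approximate spec TDPs plus a generic allowance for the
# rest of the system -- estimates, not measured draw.
PARTS_W = {
    "Phenom II X4 940 (CPU, ~125W TDP)": 125,
    "GTX 770 (GPU, ~230W TDP)": 230,
    "board/RAM/drives/fans (rough allowance)": 100,
}

def headroom(psu_watts: int, parts: dict[str, int]) -> int:
    """Watts left over after the estimated peak draw of all parts."""
    return psu_watts - sum(parts.values())

print(headroom(500, PARTS_W))  # -> 45
```

45W of margin on a 500W unit is thin, especially with a factory overclock, so the PSU theory is at least plausible.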
Welp. There it is.
Screenshots:
APEX_ClothingGPU_x64 - OFF (renamed):
APEX_ClothingGPU_x64 - ON:
Pretty much the same spot; there's a slight change in camera angle, but from testing, the camera angle doesn't seem to be the reason for the difference (I checked after seeing the screenshots were slightly different).
Until recently, I was powering a 980 build with a 500W PSU and had no problems whatsoever. I upgraded about two days ago because I just wanted to be safe, and give myself the necessary room for overclocking purposes down the line, but I played TW3 pretty extensively before doing so and didn't have any stability issues.
So, I'd say it's worth upgrading, but I'm not so sure you'll see improvements.
The fact that these OC crashes are happening consistently across a wide range of cards and varying OC levels suggests that there is something in the game making crashes way more likely than normal.
My game also has the misplaced grass and floating trees bug. I wonder if it has something to do with config tweaking.
That would explain why the problem is quite area-specific. Like, where there's lots of... PhysX... stuff.

Okay, unless I'm being stupid, I think it has to do with GPU PhysX. I reverted the file change back to default (so the game loads it) and then went to the NVIDIA Control Panel and set PhysX processing to CPU.
Screenshot (PhysX - CPU) 71fps:
Screenshot (PhysX - Automatic (auto-selected my 2nd 780)) 62fps:
fps counter on the second is a bit hard to see, especially with imgur compression, but I've double checked, it's 62fps. Keep in mind, I'm running SLI also, so I'm not sure if that'll change anything in relation to your testing - but the guy in the video is running a single card.
Yeah, I can confirm that changing PhysX processing to CPU in the NVIDIA Control Panel has the exact same effect as deleting that file: a 10-15% increase in FPS in all the specific areas where I was getting drops.
I'm running a single 760.
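For what it's worth, the 62 → 71 fps screenshots above land in the same ballpark as that 10-15% figure. A quick check:

```python
# Sanity-check the reported gain against the screenshot numbers above.
def pct_gain(before_fps: float, after_fps: float) -> float:
    """Percentage FPS increase, rounded to one decimal place."""
    return round((after_fps - before_fps) / before_fps * 100, 1)

print(pct_gain(62, 71))  # -> 14.5
```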
If all this is true, then it may explain why the AMD counterparts were performing better: they just default PhysX to the CPU. I wonder if this affects performance on non-Maxwell cards. Has anyone tried?
Hrm.. does that mean the game DOES use GPU accelerated PhysX?
The PhysX identifier says it is running on the CPU though...
Has anyone tried with the file deleted AND PhysX set to CPU to see if there's a bigger improvement? Or do they just do the same thing?

Yup, before seeing your post, I thought of the exact same thing when I saw the original "fix." Earlier, I did the same test (running a GTX 770 4GB), first with the file deleted, the second time with PhysX set to "CPU."
Seems like both do the same thing, and only in certain areas; many areas, for me, show little to no improvement with either fix.
It looks as if the performance drop in affected areas is due to PhysX processes intended to run solely on the CPU being offloaded to the GPU instead. Strange, seeing as it's been confirmed that all PhysX in The Witcher 3 runs off the CPU.
You mean on Maxwell cards?
People already tested Kepler.
Has anyone tried with the file deleted AND PhysX set to CPU to see if there's a bigger improvement? Or do they just do the same thing?
Yes, I tried. No difference here.

Aww okay. Think I was being a bit greedy hoping for more!
Would I double the fps if I bought another gtx 680?
Another 680 is not a good investment in any way at the moment due to the VRAM limitations on that card.
Put the money towards a proper GPU upgrade.
Wow, thanks for this. I gave up on SweetFX after trying some popular presets, but this one seems to be EXACTLY how I wanted things to look.
Edit: I did turn off the bloom effect though; not a fan of SweetFX bloom.
Does anyone remember what post (or thread even) had the screenshot comparison of Ultra vs. Low shadow settings?
I can't find it!
I've just tested that Kepler fix; it seems to work and, from what I can tell, doesn't actually change anything in-game. I think it may just fall back to running on the CPU when it can't run on the GPU? Perhaps there's just a bug with the library on Kepler GPUs for now.
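One way to check whether the rename actually changed what the game loads (rather than the loader quietly falling back to a system copy) is to list the modules mapped into the running process. A sketch using the third-party psutil package — the process name `witcher3.exe` is a guess, adjust as needed:

```python
# Lists any APEX-related modules mapped into a running process.
# Requires the third-party "psutil" package; run while the game is open.
import psutil

def loaded_apex_modules(process_name: str = "witcher3.exe") -> list[str]:
    """Paths of mapped modules containing 'apex' in the named process."""
    hits: list[str] = []
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() != process_name:
            continue
        try:
            hits.extend(m.path for m in proc.memory_maps()
                        if "apex" in m.path.lower())
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            pass
    return hits

# With the game closed, or the DLL renamed away, this comes back empty.
```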
I'm pretty sure the clothing includes things like nets, sheets, flags waving in the wind, etc, but I could be wrong on that.
I have a dedicated GPU as a PhysX card (a GTX 670) and its usage sits at 0 when playing Witcher 3, according to MSI Afterburner.
IIRC PhysX includes fog, right? Any time fog is introduced in a scene my FPS goes to total shit, yet when I've got tons of enemies on screen it holds solid.