Even though I have Vsync enabled, I'm still getting screen tearing occasionally and it's pretty annoying. If I enable Vsync from the Nvidia Control Panel, will it go away?
Does the VT Compress texture option make a big difference to visual quality? It's the only setting I can't change right now. If I turn it off (to get better quality) the game drops to 2-3 fps; I get a solid 60 with the textures compressed...
I usually don't comment on these, but I just wanted to throw out there some weird things that I've noticed.
Forcing Triple Buffering through the Nvidia control panel doesn't seem to work for me... like, at all. The game bugs out every time I've tried it: textures won't load properly or glitch in hilarious ways. Disabling it again fixed that, so I have to roll with it off, I guess, which blows.
Disabling the intro screens also doesn't work in the way I hoped it would. It just sits at a black screen for what seems like the same length of time it would have taken to watch the logos and whatnot. I kind of hoped it would just skip to the start screen.
Setting launch arguments for the antialiasing level also wasn't working for me (as someone else here mentioned), and typing it in the console was the only way to get it to work. However, it reset to default on every death/checkpoint/game restart, so I had to keep typing in the command.
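For anyone wondering, the AA setting people are referring to is the r_multiSamples cvar mentioned later in this thread; typed in the console it's something like the line below, with the value being the MSAA level:
r_multiSamples "4"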
I got tired of that, so I just adjusted the profile information in graphicsprofiles.json in the base folder with the antialiasing setting I wanted, and that's the only way I've gotten it to persist without constantly having to reset it. The tradeoff is you have to use one of the default profiles (low, med, high, ultra) in the advanced options menu; custom doesn't work as far as I can tell. Also, adding the AF command information to this file didn't seem to work. I'm not entirely sure, though; I'm going to try it again later.
Just throwing it out there for anybody else who couldn't get the antialiasing setting to stick using the methods mentioned in this thread.
So has there been any news about a patch? Gameplay runs perfectly fine most of the time, but simply moving the camera around during a cinematic moment, like the drive to the London Nautica or watching Caroline and Anya chat in the hideout, causes random drops to 30. Even if I drop a bunch of settings to low, or lower the native resolution, the drops still occur.
To anyone playing this game with multiple nvidia GPUs in your system and getting worse performance than you'd expect (even taking into account SLI is unsupported), I advise you to do the following:
1. Disable SLI globally in the nvidia control panel.
2. Disable your extra GPUs in Device Manager.
3. Make sure you're using set vt_usecudatranscode = 1 either in WolfConfig.cfg or in the console.
This really improves performance for me. I believe this game has the same bug that was present in RAGE, which prevents CUDA transcoding from being used if the system has more than one GPU installed.
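If you want step 3 to stick between sessions, a single line like this in WolfConfig.cfg should do it (casing as written elsewhere in this thread):
vt_useCudaTranscode "1"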
After turning on Anisotropic Filtering via the Nvidia control panel, and even with Vsync on, I notice some kind of screen tearing, but nothing major; it looks more like 'overscan', I believe. What's a fix for this?
This'll have nothing to do with Anisotropic filtering directly. If you are using in-game vsync (you don't specify), then it is adaptive vsync and tearing will occur when your PC can't maintain 60fps.
If you don't like tearing, disable in-game vsync and use the Nvidia Control Panel or Nvidia Inspector to set Vertical Sync Tear Control to Standard, force vsync on, and enable triple buffering.
Well I haven't seen that in my configuration. Triple buffering is optional anyway. If you don't like tearing, disable in-game vsync and force standard vsync through nvidia utilities.
So is the train sequence with Frau Engel meant to be 30 FPS? It runs at 60 beforehand, but the moment the game shows her face, it drops. Even with everything set to low, it kept bouncing between 30 and 55 during her whole dialogue.
I get 60fps almost constantly, unless there's a big close-up of a character's face, or in the hideout when there are several characters in one area; looking at that area I'll get big frame drops, but the rest of the hideout is fine. For example, in the Belica level (game spoiler), when Frau Engel gets her face fucked up and there's a huge close-up of her bloodied, busted-up face while she's talking about how she's going to hunt you down, my FPS chugs like fucking crazy, to the point where her audio and animation are totally desynchronized.
So it seems to me there's some weird unoptimized shit going on around the main characters' models, or something.
Other than that it's 60fps in game, though. That's at 1920x1200, 560 Ti 2GB, 16GB RAM, i7 4770k.
That's exactly the issue I'm having. Do the console versions run at 60 during these moments? Because if they do, I'm guessing it's either a bug with the PC version, or they just didn't optimize shit properly.
Is anyone else getting flashing or flickering during gameplay? This is on a GTX 770 with the latest drivers. The flashes are sometimes purple, sometimes white, and pretty small.
Any update on the flickering / pop-in issues? I have a GTX 780 and I'm still experiencing them. Disabling triple buffering fixed it to an extent, but I still get occasional texture pop-in and flickering around light sources. It makes playing the game a really frustrating experience!
Is triple buffering all you've tried changing? Personally I've not seen the flickering that some have posted about. Using a 780 Ti and driver 337.88 with triple buffering enabled. Perhaps you need to perform a clean install of the driver? Also r_multisamples is known to introduce artifacts.
Slow texture streaming can be caused by a suboptimal config; there are plenty of posts in this thread explaining how best to configure things (example lines below):
- Make sure you're using a texture cache on fast storage (fs_cachepath).
- If you have two or more GPUs, disable the extras in Device Manager.
- Nvidia users: enable vt_useCudaTranscode 1.
- Experiment with vt_maxPPF values in multiples of 4.
- Enable VT Compression!
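For example, something like this in WolfConfig.cfg as a starting point (16 is just an example value, go up or down in steps of 4):
vt_maxPPF "16"
The texture cache is usually pointed at fast storage with a launch option such as +fs_cachepath "D:\wolf_cache" (example path), if I remember the old RAGE-era advice correctly.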
For the amount of time it took for me to get this game running decently through obscure (to me) config tweaks I almost wish I'd bought it for PS4 instead. What a pain in the ass!
vt_useCudaTranscode "1" made such a huge difference that I really don't understand why it's not a toggleable in-game setting, or just set to on automatically through hardware detection.
Do we know what causes the model flickering? I'm still seeing it occasionally.
I wish I knew more about these settings. I'm spending so much time tweaking that it's kind of ruining my enjoyment of the beginning of this game. I've found that the settings below get me close to 60fps, but the lack of AA is a real eyesore, even at 1440p. Hell, even at 1620p, but I bumped it back down because I thought I noticed slowdown. I know you can turn on AA, but it's a gigantic hit, even at 2x.
So this is what I'm set to, and it seems low considering my GPU (a 4GB GTX 770):
VT Cache Size : Ultra
VT Compress: Enabled
Max PPF: 16 (any higher and I see texture load-in lag... how does this affect image quality? I can't see any difference between the values at all)
Shadow Resolution: 2048 (again, I couldn't see any difference here, so I went with this)
Depth of Field: High
Screen Space Reflections: Disabled (is this the same frame killing setting that was in Shadow Warrior? Again I could not perceive a difference)
Additional Quality Settings: Ultra
Haze Flare: Enabled
I've also set vt_useCudaTranscode "1", as I said earlier.
Does this seem off to anyone else? Maybe I'm CPU bottlenecked? I have a Xeon W3540 @ 3GHz. Not the latest and greatest, but it seems to do well in most other games.
These settings seem OK to me. MaxPPF controls the number of pages it will try to transcode in a frame. If you're having trouble going higher than 16 without noticing the texture streaming, then it means your GPU is struggling. You're using quite high resolutions, so I expect that's the main cause.
Frankly there's not much left to enable or turn up. Screen Space Reflections are simply environmental reflections. Some here have said that Depth of Field is also a frame-rate hog. Have you placed your texture cache on the fastest available storage (fs_cachepath)?
Bear in mind that the 770 uses a two-year-old GPU and isn't top tier. Having 4GB of VRAM does not mean the GPU can always draw frames in 16ms while using features that consume 4GB.
I found that downsampling from some resolutions produces flickering on assets and the environment. Use MSI Afterburner's OSD to check your GPU utilisation while playing; it'll give you some idea of the cost certain settings have on performance.
Performance sounds like it's all over the place in this thread. I'm relieved to get clarification on the i7 requirement though (my specs are a few posts above). It's not like I plan to run the game on ultra.
People tend to post when they have problems. Far fewer people bother to post and say "everything is fine". This game requires no more setting up than any other, in my opinion. Still, it's your money. That you're supporting the game is what matters.
I was deciding between this or Watch Dogs and ended up getting WD in the end. While I enjoyed the game, it was a technical mess. I was waiting on a sale for this game, and it will be 40 on BB tomorrow; with the Gamers Club Unlocked discount I get 20% off, so it comes to 32 bucks. Not a big difference money-wise, and after reading the DF article about it, it seems the PS4 version ran quite well, so I'd rather enjoy the game this time (I've heard nothing but praise) than have the experience tainted by technical problems. In the end, what matters is what you said: supporting the game.
Picked it up on the Steam sale. Without any real tweaking, it seemed to run fine on an aging i7 930 and 560Ti on the high preset. Seemed pretty consistently at 60 FPS through the first hour of the game. Parts of the opening plane sequence chugged a bit, but that was more of an interactive cutscene than actual gameplay.
Doesn't seem too long to me. I have one of the fastest SSDs on the market, a Samsung 840 Pro, and even on that it takes a couple of seconds after each death and before each level load.
This has to be software-related on your end. You can see from the charts in the OP that your hardware should be capable of fine performance. GPU drivers would be my first port of call. Also make sure your GPU is in a 16x PCI-E slot; idTech 5 games need to stream a lot of data from storage over the PCI-E bus.
I just reverted my drivers back from 14.6 to 14.4 and that helped a little bit. Now I can run the game on Low settings in 1440p and get 40-50 FPS. My GPU is in a x16 slot, only possible concern is that it is a 2.0 slot and my card is a 3.0. This has never been an issue with any other game before though.
The game ran fine the first time I played it (mostly 60 with occasional drops in bigger fights/set-pieces). I've started replaying since the patch, and now it seems to run like ass comparatively (frame drops all the time in action, whatever I set the options to), so I'm wondering what the patch did, given I haven't had any driver updates or anything since then (on a Radeon HD 7850).
To those who have issues running AA with the game (specifically for AMD cards), I'll expand (aka tweak) upon a method given by the user "some goofy idiot" on Steam Forums:
Create a file called custom.cfg in (your program files)\Steam\Steamapps\Wolfenstein.The.New.Order\base.
Open that file in Notepad.
Put in the lines:
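As an illustration only (these aren't necessarily the exact lines from that Steam post), the cvar involved is the r_multiSamples one mentioned earlier in this thread, with the value being the MSAA level, for example:
r_multiSamples "4"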
14.4 is the recommended driver version for this game. PCI-E 2.0 16x is equal to PCI-E 3.0 8x, which should be sufficient for much better performance than you are getting. Can you use MSI Afterburner with the On-Screen Display to see what your GPU utilisation is while playing?
having some trouble with voices not syncing up with all of the cutscenes. Really breaks the immersion and story-telling. Anyone have a solution for this?
That happened to me early on, in the 2nd or 3rd mission, when you take the picture test at gunpoint before sleeping with Anya. No solution though.
FWIW, I'm running the game on my 2GB 7850 and an i5-3570. I'm getting between 50-60 FPS on custom settings (everything on ultra except I turned down the shadow resolution). I haven't seen anything in the 40s yet as others are seeing.
I'll also echo the load time statements. They aren't unbearable, but I noticed them immediately.
Could someone let me know if there's some trick to getting good performance on this setup? Game feels like it's rattling apart on the lowest possible settings.