Have you tried forcing 30fps with RadeonPro and using Unlimited in-game?
I'm using that plus Double Vsync (no triple buffering) and I felt it was acceptable for 30fps. Much better than in-game 30fps lock + in-game Vsync.
I'm following Durante's recommendation for the 30fps lock: no v-sync, borderless window, unlimited framerate and RTSS set to 30fps.
I did this and experienced no difference whatsoever.
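For anyone who wants to try the RTSS route, this is roughly what Durante's setup maps to in the game's config file. Treat it as a sketch: the key names and values below are from memory and may differ between game versions, and the 30fps cap itself lives in RivaTuner Statistics Server, not in the game.

```ini
; Documents\The Witcher 3\user.settings (relevant lines only; names/values from memory, verify on your install)
[Viewport]
VSync=false          ; in-game v-sync off
FullScreenMode=2     ; assumed to be the borderless-window mode here; safer to set it via the in-game menu
LimitFPS=0           ; assumed to correspond to "Unlimited" in the menu; the actual 30fps cap is set in RTSS
```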
Saw this in the i3/750ti thread, thought it'd be worth posting in here to give people an idea of what to expect.
Seems to match up with other benchmarks and what we've seen from people ITT.
This is bullshit.
No way am I getting 60+fps avg on a 780 ti with ultra settings without hairworks.
How much heat can the MSI GTX 970 generally withstand? It seems to reach around 75°C at times with TW3 (usually it hovers around 50-70°C). I wouldn't normally ask, since that doesn't sound too high, but the fans are making quite a lot of noise.
Unless that benchmark doesn't mean Ultra?
EDIT: Yeah, it means High. My bad.
Believe they are high settings, not Ultra.
Number of Background Characters
According to the game's config file, 'Number of Background Characters' limits the number of NPCs simultaneously rendered to 75, 100, 130, or 150, depending on the detail level chosen. To date we have been unable to find any location that features even 75 characters, let alone 150, so we are unable to demonstrate the impact of this setting.
One has to assume a location where the setting is of use exists given its inclusion in the game. If you find it, please let us know in the Comments section.
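If anyone wants to poke at it directly, the guide's thresholds presumably correspond to a single value in user.settings. A sketch of what that might look like, with a hypothetical key name (I haven't confirmed the real one):

```ini
; user.settings sketch — "NumberOfBackgroundCharacters" is a hypothetical key name used for illustration
[Rendering]
NumberOfBackgroundCharacters=150   ; the guide's 75/100/130/150 presumably map to the four detail levels, lowest to highest
```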
So has anyone done driver comparisons to see if these allegations are true?
Will do them soon. The game is downloading right now. Once it's done, I'll test with the latest drivers and then the older drivers and let you know the results. People in the Nvidia thread are claiming improvements, so I'd do it regardless, just in case. One guy said he could barely hit 35fps with the latest drivers, and when he reverted to the 350.12 drivers he was getting 45-55fps. Seems... exaggerated... which is why I'm dying to test this out.
Thank you for the info.
At the moment I've had it set to Clamp. Should I change it to Allow and check it out with driver-forced FXAA instead of the in-game AA?
The description says some apps use negative LOD bias to sharpen texture filtering. This sharpens the stationary image but introduces aliasing when the scene is in motion.
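For anyone wondering what the bias actually does under the hood: the driver picks a mip level from the texture's on-screen footprint and then adds the application's LOD bias, so a negative bias selects a sharper mip than the footprint justifies — crisp when the camera is still, shimmery in motion. In textbook form (generic GPU texturing math, nothing Witcher-specific), with the Clamp option effectively forcing the bias to be non-negative:

```latex
\lambda \;=\; \log_2\!\Big(\max\big(\big\lVert \tfrac{\partial(u,v)}{\partial x}\big\rVert,\ \big\lVert \tfrac{\partial(u,v)}{\partial y}\big\rVert\big)\Big) + \mathrm{bias},
\qquad \text{Clamp} \;\Rightarrow\; \mathrm{bias} \ge 0
```

Here λ is the selected mip level and the derivatives measure how many texels one screen pixel covers; a negative bias pushes λ down toward the full-resolution mip.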
Pretty sad that 780 SLI is getting 4fps more than a 290X right now.
Probably around 100°C. Ideally I'd recommend keeping it in the 80s. It's entirely up to you, though; higher temps (going close to the TJmax) might affect the lifespan of your card.
Right after that image showing excellent SLI scaling on the Titan X.
It just reinforces this whole notion that the driver optimisations for Witcher 3 are based around Maxwell and not Kepler.
I have to be honest, I find the "hair" setting obsession hilarious. Of all the things to focus on. The fact that there is a specific setting for it cracks me up.
Is driver-forced FXAA noticeably better than RED's own AA? I have a similar setup and haven't tried any manual settings in the Nvidia control panel yet.
I haven't compared, but I would think their own AA is better since it's temporal as well.
Couldn't it also point to this game perhaps using more tessellation and compute than we realize?
Heck, the entire post-processing chain could be a compute shader on top of the Forward+ compute.
I used Antimicro, which can bind any controller button to keyboard key, works well.
I'm still getting framerate drops into the 20s with a 970 and an i5-3570S. When I first launch into the game the framerate is relatively high, staying in the 60s and 70s, but as I start moving around it flutters between the 50s and 20s. Is there a performance patch on the horizon for this? Because it's just annoying now.
None of my graphics settings are on Ultra: most are on High, foliage distance is on Medium, and number of characters and shadows are on Low. DOF, motion blur, and sharpening are all turned off in post-processing, and everything else is on High. No in-game AA; I'm using FXAA forced through Nvidia Inspector, with the setting changed to Performance, pre-rendered frames at 2, triple buffering off, and anisotropic filtering set to 16x. I even overclocked my card, ffs. I thought my rig would at least be enough to get a steady 60fps.
GTX 970
i5-3570S
8GB RAM
I've come to realize those benchmarks are bullshit. I am NOT getting close to 60fps on a 280X. It hovers between 30 and 40 the vast majority of the time.
Agreed.
My experience so far has been pretty much on par with everyone else who has similar hardware.
I'm running a 3570K @ 4.2GHz, a GTX 970, and 16GB RAM, and the game is on an SSD. Pretty much locked 60fps at 1080p.
I've got Foliage, Shadows, and Background Characters all set to High, everything else is on Ultra. No hairworks, and I'm using FXAA in the NVCP which definitely looks better and costs less than in game AA.
I got 3 crashes in a little over 3 hours of play time. I'm like 99% certain it's not due to my overclocks on either my CPU or GPU.
On a semi-related note: I had noticed that a lot of my games lately (DA:I, FC4, GTA, TW3), even when "locked" at 60fps, would very briefly hitch to about 56fps for less than a second and then jump right back up to 60. I was completely stumped as to what the hell was causing it, until last night, when I stumbled upon a post on Tom's Hardware from a few months back by someone with the same issue who was using DisplayPort to connect his monitor to his GPU, just like I was. He suggested that changing over to an HDMI cable solved his problems, so I tried it... and voilà, it's definitely better now. I'm not sure whether my cable was faulty or whether DisplayPort simply isn't the way to go for whatever reason. So if anyone is having a similar issue and is using DisplayPort... try it out.
You should be getting 50-ish minimum on a 970 with no HairWorks; with those settings, 60+ easy. Have you tried any other games to check for problems with your system?
I can leave hairworks on and keep 60 fps on my 970 thanks to that MSAA tweak. I turned it down to 2x. It doesn't look as nice, but it's worth it. Roach gets to keep his flowing mane, I get to play with a smooth framerate. All is well.
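For anyone who missed it, the tweak being referred to is an edit to the HairWorks MSAA level in the game's rendering config. The path, key name, and default below are from memory, so double-check them against your own install:

```ini
; <Witcher 3 install>\bin\config\base\rendering.ini — path, key, and default quoted from memory
HairWorksAALevel=2   ; ships at 8; dropping it to 4 or 2 greatly reduces the HairWorks MSAA cost, 0 disables the AA entirely
```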