thehillissilent
Member
Try alt-tabbing in and out; it happens to me from time to time and doing that fixes it.
Thanks.
This is an "NVIDIA-sponsored" game (GameWorks) and yet the 4GB Fury X gets more FPS than the 6GB 980 Ti? And the Nano is almost as fast as the 980 Ti? I'd say that's unexpected.
I wonder which settings they were using. My VRAM usage was around 4.5GB.
EDIT: ok, no PCSS+ and HBAO+, that explains it. GimpWorks strikes again!
Calling two of the very best high-end diffuse lighting effects implemented in games "GimpWorks" reflects more on your perspective than those effects.
I swear, it often feels like you people would rather have a straight console port without any options.
The ridiculous phrase "strikes again" and an appropriate emoticon were there for a reason - i.e. so that people wouldn't take this dead seriously. Guess that didn't work out.
Gsync is bugged in Fullscreen mode, try borderless window and it works. At least gsync. Maybe SLI + Gsync works in borderless window mode.
Is it? I just set it to Windowed (Fullscreen) after reading this post, and my GPU usage tanks (GSYNC on) to the 50%-60% area. Put it back on Full Screen and it's right back up to the normal 99%.
I wonder why they are recommending a resolution in which the game runs at 40 fps at max settings.
NVIDIA defense force? I kid. It would be better if they were using in-house effects that could be optimized by each vendor and not only one.
Love HBAO+ btw, but we need open standards and not closed ones - but that's a discussion for another topic.
Windowed video modes do not work for AFR on windows under DX. Never have really. Should work under Mantle, Vulkan, and DX12 though.
Never ever used windowed fullscreen or otherwise with SLI.
I get 99% GPU load in both Fullscreen and windowed mode, but in Fullscreen the monitor refresh rate doesn't match my fps. The refresh rate jumps between 90Hz-144Hz and that's nowhere near my fps of around 45-60.
[img]http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-MMO-Tom_Clancys_The_Division_Beta_-test-nv.jpg[/img]
Because there are settings other than max? People seriously need to get over this weird obsession with 'max settings' or it's going to change PC games for the worse.
A little confused by this post. What do you mean the refresh rate jumps? A little clarification if you could.
I would be fine if they hid "ultra" or "extra high" or insane settings behind .ini files
Why?
To make people feel better about themselves so that they can ignorantly brag about "maxing" the game out on their mid range GPU?
I think people need to come to terms with the fact that PC games are becoming more demanding again and embrace it. It's not a bad thing if you can put all the options on "medium" or "high" and still have a damn good looking game while "ultra" is reserved for those with enthusiast hardware, or until a future upgrade for most people.
This is how it always used to be, until developers started making games for consoles first and then just porting them over to PC with minimal improvements. Now we are actually starting to get PC games that do things the consoles can only dream of, and yet we are getting more complaints? We have people saying games are "unoptimised" because they can't max them out at 60fps on a GTX 970. I mean, give me a break....
It doesn't make sense to me. A GTX 970 is not supposed to see you through a whole generation (4 years) while maxing every game out at 60fps. It doesn't work like that. If it did, the PC gaming space would be in a very bad place, as it would mean we are in a state of stagnation.
It's mostly a "well my video card cost $300+ so I should be able to max everything because reasons" thing. Massive have put a good amount of work into the PC version. It runs and looks great. I dunno where this need to "max" everything came from either, but it needs to die.
This game is 1080p, 30fps at high settings on consoles. A GTX 770 is about twice that power and should give double the fps of the console at similar settings and resolution.
The 970 is a step above the 770, several times the power of a PS4. It's NATURAL people expect double the framerate at way higher graphic settings than consoles.
This game doesn't even come close to that.
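The arithmetic in the post above is a naive linear-scaling model. As a back-of-the-envelope sketch (the ~2x and "several times" power figures are the poster's claims, not measured numbers, and real frame rates rarely scale linearly with GPU throughput):

```python
# Naive linear-scaling estimate: assumes fps scales 1:1 with relative
# GPU throughput at identical settings/resolution. Real games rarely
# behave this linearly; this only illustrates the poster's arithmetic.
CONSOLE_FPS = 30  # 1080p/30 at "high" on console, per the post above

def estimated_fps(relative_power: float) -> float:
    """Estimated fps for a GPU with `relative_power` times console throughput."""
    return CONSOLE_FPS * relative_power

print(estimated_fps(2.0))  # GTX 770 at a claimed ~2x console power -> 60.0
```

The later replies in the thread point out why this model breaks down: expensive settings (PCSS, HBAO+, volumetric fog) have non-linear costs, so doubling throughput does not double fps at higher settings.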
It was just a few days ago I said that, with time, you'd need a 970 just for console parity and everyone replied with "NO WAY".
These days it takes only a few days for predictions to come true. Just imagine what happens in the next couple of years when DX12 hardware comes out...
It's not natural, it's idiotic. Sorry to be blunt.
Without getting into whether the consoles are running the game at the equivalent of high settings or not, people need to take into account how resource intensive the settings they are selecting are. People are not thinking, they are just typing.
If you whack up things like reflections, PCSS, HBAO, subsurface scattering, particle detail and volumetric fog, why are you expecting linear performance gains?
And as for this post... Console parity means running at equivalent settings with the same target resolution AND framerate. Just think about that for a second.
Of course we have to see what console settings are at, on PC.
It's just experience saying that console settings are usually equivalent to "high". In that case the PC, even without PCSS or HBAO, still can't hold a stable 60fps at 1080p with a 970.
That's absurd.
It's not absurd if it's then discovered that console settings are dialed down even compared to "high" on PC. But that's hardly the case. We'll see.
There are countless Eurogamer articles out there demonstrating how console parity is achievable with an i3 and a 750 Ti. Now try to run this game, at similar settings, on that hardware.
Yes. I said two things. One is that the gap between PC hardware and consoles is narrowing. The other is that the 970 will reach parity with consoles within the next couple of years. Right now you can see that you can't even double the fps at the same resolution, which is INSANE for a 970 compared to a console.
As I said, right now and for all games up to this point (say, Battlefield), a 770 CAN double the fps of a PS4 at similar settings. There are almost zero exceptions, outside a few cases that are poorly optimized.
The 970 is not a 770. The 770 can hold a stable 60fps at 1080p in Battlefield at slightly better than console settings. Battlefield doesn't even run at 1080p on console. Do you see how big the gap is?
Now in this game the 770 might not even offer fixed 30 fps at console settings and resolution.
The gap is narrowing overnight.
How can you argue any of this without even knowing what the console equivalent settings are? You're just guessing at this point because it suits what you want to believe.
This game is 1080p, 30 fps at high setting on consoles. A GTX 770 is about twice that power and should give double the fps of the console at similar settings and resolution.
The 970 is a step above the 770, several times the power of a PS4. It's NATURAL people expect double the framerate at way higher graphic settings than consoles.
This game doesn't even come close to that.
It's not about "coming to terms that you can't max settings anymore". It's coming to term that the gap between expensive PC hardware and cheap console hardware is narrowing overnight.
Gah! When is Pascal out? I need the 1080Ti or whatever they will call it in my life.
So what should be the yardstick for good performance/optimization? If an overclocked 980 Ti paired with an overclocked i7-6700K cannot sustain 60fps in the Division beta on max settings, is that OK, or should we expect better performance? I wonder what the consensus is on that (if any).
I have this set-up. As stated above, going with high instead of PCSS will yield you 60+ FPS average.
Yes... it is OK. What difference does max make from not-quite-max? Is it a mental thing? I get an average of 62 fps with all settings at max except for PCSS at 2560x1440. Technically it's not maxed because of that one setting. In Rise of the Tomb Raider I get 60+ with only shadows one notch below max. Is that a problem too?
I'm getting performance far beyond the console versions here and over 60fps.
Going with Ultra instead of HBAO+ will get you even more FPS.
I remember NVIDIA marketing HBAO+ as a feature that offers superior quality with better performance, so in theory HBAO+ should be the best option for NVIDIA users, but maybe I misunderstood.