This is what I have.
Hynix memory. So far, so bad.
Not sure if this is relevant to anything, but after restarting my machine, I find that Windows 8.1 is using about 400MB of VRAM all by itself according to GPU-Z.
Is this being taken into account with this test?
See headless mode; most people are missing this, hence the myriad of incorrect results now popping up.
Even when people run basic Aero mode there is VRAM allocated, so the only way to run this is by using the IGP, else it's inaccurate.
Everyone should at the very least

- disable Shadowplay and
- make note of their prior Windows/application VRAM usage levels

before running this and posting their results. But it's still not particularly meaningful if you are using the same GPU for actual desktop graphics at the same time.
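On the point about noting prior VRAM usage: GPU-Z works fine, but if anyone wants to log the baseline from a script, nvidia-smi can report the same number. A rough Python sketch, which parses a captured sample of the CSV output so the parsing can be sanity-checked without a GPU (the 407 MiB figure is made up, not a real measurement):

```python
import subprocess

def query_vram_used_mib(sample_output=None):
    """Return GPU memory in use (MiB) as reported by nvidia-smi.

    If sample_output is given, parse that instead of running the tool,
    so the parsing logic can be checked on a machine without a GPU.
    """
    if sample_output is None:
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used", "--format=csv"],
            text=True,
        )
    # Expected shape of the CSV output:
    #   memory.used [MiB]
    #   407 MiB
    lines = sample_output.strip().splitlines()
    value_line = lines[1]              # skip the header row
    return int(value_line.split()[0])  # "407 MiB" -> 407

# Example with a captured sample (value is illustrative):
sample = "memory.used [MiB]\n407 MiB\n"
print(query_vram_used_mib(sample))  # -> 407
```

Run it once before the benchmark and once after a reboot and you at least know how much of your "used" VRAM is Windows' fault.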
I assume this is what you mean by VRAM usage:
[GPU-Z screenshot showing VRAM usage before the test]
So, that's before. During the test it seemed to go up to 3004 and stay there, but like others have said, the test crashed my video driver just about every time.
General question, does Windows give up its VRAM allocation if a game is rendering in exclusive full-screen mode? If so, could the benchmark be recoded that way to keep from having to do the hassle of IGP?
I have an MSI 980, so I'm not terribly concerned at this point, I'm just curious since I don't do any Windows or gfx programming.
Also, I second the request for source code of the benchmark, I'd like to see how it decides where an allocated chunk is in physical memory, because I don't know a reliable way to do that.
Not totally sure, I run basic theme on my win 7 steam box to not have to worry about it whilst gaming.
A user on the Nvidia forum has shown a GTX 980 in Aero mode giving degraded results, but normal results with the basic/classic Windows 7 theme.
This post on the Nvidia forums shows the opposite: the guy got better results with Aero and worse results with basic. Weird.
https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4430284/#4430284
Since installing my GTX 970 I've had problems with periodic stuttering/pauses when playing modern games. It can't be 100% down to this (for instance, I'm guessing CS:GO at 1920x1080 isn't taking anywhere near 4GB of VRAM), but hearing about it has definitely made me curious whether it might be related, especially as I have the memory-guzzling stuff maxed on most games and use DSR on some.
Everyone should at the very least

- disable Shadowplay and
- make note of their prior Windows/application VRAM usage levels

before running this and posting their results. But it's still not particularly meaningful if you are using the same GPU for actual desktop graphics at the same time.
Also, I second the request for source code of the benchmark, I'd like to see how it decides where an allocated chunk is in physical memory, because I don't know a reliable way to do that.
For anyone who is interested, here is the link to the source code of the benchmark (post 20 by Nai)
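Haven't read the actual source, but on the question of how it knows where a chunk physically lands: CUDA doesn't expose physical addresses at all, so from what I gather the benchmark just allocates VRAM in fixed-size chunks and times a bandwidth kernel over each one, inferring placement from the measured speed. A rough Python sketch of only that inference step, with made-up bandwidth numbers and an arbitrary threshold (none of these names or figures are from Nai's code):

```python
# Illustrative only: classify per-chunk bandwidth measurements as coming
# from a "fast" or "slow" VRAM segment by comparing against the median.

SLOW_THRESHOLD_FRACTION = 0.5  # flag chunks under half the median bandwidth

def classify_chunks(bandwidths_gbps):
    """Return a list of (chunk_index, bandwidth, 'fast'|'slow') tuples."""
    ordered = sorted(bandwidths_gbps)
    median = ordered[len(ordered) // 2]
    return [
        (i, bw, "slow" if bw < median * SLOW_THRESHOLD_FRACTION else "fast")
        for i, bw in enumerate(bandwidths_gbps)
    ]

# Made-up numbers shaped like the reported GTX 970 results: ~150 GB/s for
# the first 3.5 GB worth of 128 MB chunks, ~20 GB/s for the last 0.5 GB.
sample = [150.0] * 28 + [20.0] * 4
for idx, bw, label in classify_chunks(sample):
    if label == "slow":
        print(f"chunk {idx}: {bw} GB/s ({label})")
```

Point being, the tool can only ever say "this allocation behaved slowly", not "this allocation sits at physical address X", which is also why desktop VRAM usage skews the results so badly.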
I was super close to grabbing a GTX 970 in the next month or two, but I'm gonna keep an eye on this and see how it pans out before I drop ~£280 on one.
Just a thought (speculation). Could the drivers be "soft-capping" 970 vram use to about 3.5gb *because* Nvidia already knows about this vram performance issue?
Because allegedly the 980 goes right to 4gb of use in game scenes and settings where the 970 gets allocated much less, and you have to crank it much further to get the driver to feed the vram past that point on 970.
It fits with the "hardware design cause" scenario and implies horrible things about Nvidia, so this should be proven before believed.
Just a thought (speculation). Could the drivers be "soft-capping" 970 vram use to about 3.5gb *because* Nvidia already knows about this vram performance issue?
Because allegedly the 980 goes right to 4gb of use in game scenes and settings where the 970 gets allocated much less, and you have to crank it much further to get the driver to feed the vram past that point on 970.
It fits with the "hardware design cause" scenario and implies horrible things about Nvidia, so this should be proven before believed.
If the hardware angle is true, the only fix will be a loyalty upgrade to a GTX980.

I literally just bought this card yesterday for my new build -.- Luckily I have two weeks to send it back if I don't open it, but I still hope they manage to fix it somehow.
If the hardware angle is true, the only fix will be a loyalty upgrade to a GTX980.
If the hardware angle is true, the only fix will be a loyalty upgrade to a GTX980.

Yeah, my guess is they will draw it out as long as they can with the same BS excuses. Free games simply aren't going to cut it for permanently gimped cards.

Which will literally never happen.
If the hardware angle is true, the only fix will be a loyalty upgrade to a GTX980.
I wouldn't expect a 980, but I do expect a replacement 970 that utilizes all 4GB of RAM properly. That is, if this can't be fixed through software.
I wouldn't expect a 980, but I do expect a replacement 970 that utilizes all 4GB of RAM properly. That is, if this can't be fixed through software.
I doubt that Nvidia would replace a card like this. At most, maybe a free game or some such as compensation, that is if it turns out to be a hardware issue.
The only way that could possibly happen is if this fucks up your gaming experience with anomalies. If the only way to know about it is to read about it on an internet gaming forum, then there's not much chance. They can just say it was never meant to be a 980 in the first place.
It's false advertising. They misled customers with erroneous specs; that's potential class-action lawsuit territory.
We don't know shit yet, especially since we have people saying that the test we're doing might not matter at all. So anyone dreaming of a free upgrade to a 980 at this point is living in fantasy land.
I doubt that Nvidia would replace a card like this. At most, maybe a free game or some such as compensation, that is if it turns out to be a hardware issue.
Yep. It's the equivalent of plugging in a new monitor and having the refresh rate or resolution be different than what's on the box, or getting a new CPU set up and having its GHz come in lower. VRAM is a huge, huge selling point for graphics cards nowadays, and it's one of the primary specs you cannot and should not lie or mislead customers about. I am going to be pissed as shit if this ends up being a hardware issue and Nvidia knew about it the entire time (how could they not?).
The bigger news sites need to report this.
If this ends up being an unfixable hardware design issue, then #1) I don't for one second believe Nvidia didn't already know about this and #2) that may be partially why it was priced so much lower than its slightly bigger brother. Scummy if true.
And very interesting that none of the big review sites reported this. I wonder why.
If there's enough backlash, then hopefully they do something about it. Drop the 980 price plz.