So has this been debunked, or does it vary from person to person?
The reason Unity and Mordor stutter is that Unity and Mordor stutter. The reason Wolfenstein doesn't stutter is that Wolfenstein's engine doesn't stutter.
It is still 90% coding. Divinity: Original Sin is a cool game which performs like shit. It drops to 45 FPS in battles for no apparent reason. A 970 should outclass this game a hundred times over, but it doesn't. Because of... code.
Stop thinking that every stutter issue which occurs now is related to 500MB of RAM not being as fast as the other 3.5GB. If Nvidia addresses this with their driver so that it's a non-issue 90% of the time, I'm OK with it. It's not great, but nothing in life is perfect. The 970 is still a lot of performance per dollar, no matter the issue.
It's actually been nearly two weeks since it was discovered; go to the overclock.net thread or the Guru3D thread.
Everyone... Frametimes, not FPS. You're measuring the wrong thing. The symptom is abnormal stuttering or freezing with less than peak GPU usage. It's usually brief and is best captured by recording frametime spikes.
Bullshit. I have Unity, I tested Unity, I have played Unity from beginning to end, and it doesn't normally give me stuttering and frametime spikes. In my test at 1920x1080/30 FPS (I usually play at 1440p with the framerate uncapped up to 60 FPS), there was a clear VRAM issue: constant frametime spikes and hanging that do not occur in any other situation, but do whenever I manage to exceed ~3600MB. You're insulting my machine if you think I always get that kind of stuttering; I do not. My CPU doesn't struggle with Unity, and the game doesn't just do this by itself.
Neither does Skyrim. Neither does Space Engine, which really does have issues with VRAM once it allocates ~3500MB and usually refuses to go above that, with massive frametime spikes and hanging from what is essentially texture and geometry data overload. Space Engine is a VRAM hog.
I've looked at the Guru3D thread and I just see the same benchmark reposted again and again. Benchmarks are not games. There is a reason we don't use FurMark to gauge GPU temperatures anymore.
This thread makes me kinda happy, was tempted to pick up a 970 not that long ago, I'm glad that I waited.
SMH at people trying to downplay the problem
You do understand that frametimes are the inverse of FPS and that one can be figured out from the other, right?
Who is downplaying what problem, exactly? Point to specific examples. And what is the problem exactly? How does this translate to a real-world problem? Is it the marketing or a specific problem in a game? Where is the mismatch between benchmarks from before the issue was known and benchmarks from after? This is not downplaying; I'm in nobody's camp. This is me genuinely asking, because I want to know.

After almost 600 posts there isn't a single example I've seen here that answers that question, only the same customized benchmarks posted over and over again.
You do understand that frametimes are the inverse of FPS and that one can be figured out from the other, right?
Stuttering when you're near the limit of the card's VRAM is normal, and its cause is bus data swapping.
The only way to prove anything here is to present a situation where the 970 gives unexpectedly worse performance (FPS, frametimes, stutter, etc.) compared to a 980.
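To make that comparison concrete, here's a rough Python sketch of the numbers you'd actually want to put side by side for a 970 and a 980. The file names and log format are made up: it assumes per-frame times in milliseconds, one value per line, exported from FRAPS or a similar overlay.

```python
# Hypothetical comparison of two frametime logs (not real data).
# Assumes each file holds one frametime in milliseconds per line.
import numpy as np

def frame_stats(path):
    times = np.loadtxt(path)  # frametimes in ms
    return {
        "avg_fps": round(1000.0 / times.mean(), 1),
        "99th_pct_ms": round(float(np.percentile(times, 99)), 1),
        "frames_over_50ms": int((times > 50.0).sum()),
    }

for log in ("gtx970_run.txt", "gtx980_run.txt"):  # hypothetical file names
    print(log, frame_stats(log))
```

If the 970's 99th-percentile frametime and spike count blow up once VRAM use passes ~3.5GB while the 980's don't under the same settings, that's the kind of apples-to-apples evidence being asked for.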
You do understand that frametimes are the inverse of FPS and that one can be figured out from the other, right?
I don't know about performance, but it's been said that there are games that use less VRAM (staying below 3.5GB) on a 970 while using more on a 980 under the same settings. I've only seen it mentioned a few times, though, and haven't seen evidence of it yet. But if it were true, what would be a possible explanation?
No and no.
The driver avoiding allocation of more memory on the 970. Which would mean that NV has been aware of this issue since launch and that it's likely hardware and unfixable. But still, mentions and evidence are different things. We need proof that this affects real games out there.
Then what do you make of the results I posted in this thread? (post here)
There is way too much bandwagoning going on in this thread.
Maybe Nvidia dropped the ball, but we could use more data before jumping to conclusions (especially data not from the VRAM benchmark used so far).
Same thing happens with RAM and all graphics cards.
The more you have, the more your applications will use. It is there, so why not? I believe it has to do with caching and stuff, but it doesn't really matter.
EDIT: Oh wait, the GTX 980 doesn't have more VRAM, my bad. Still, it seems to be a pretty inconsistent thing to me, and I don't think the issue shown in this topic would manifest as lower VRAM usage in games. Unless the card somehow auto-throttles the amount of VRAM it uses, which I doubt.
The higher the frametimes, the lower the FPS you will have. Thus unusually high frametimes will manifest themselves as unusually low FPS. What we need is a point of comparison which will allow us to define "usual". You can't get one from a 970 alone.

Not necessarily. Sure, you can find the average FPS from the average frametime, but looking at an FPS counter isn't going to show you that one frame which took twice or three times as long to render.
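To put made-up numbers (not measurements) on that point: two runs can report essentially the same average FPS while one of them hides a single 100 ms hitch that you absolutely feel in-game.

```python
# Illustration with invented frametimes: same average FPS, very different feel.
smooth = [16.7] * 60             # one second of play, every frame ~16.7 ms
hitchy = [15.3] * 59 + [100.0]   # same second, but with one 100 ms spike

for name, times in (("smooth", smooth), ("hitchy", hitchy)):
    avg_fps = 1000.0 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.1f} FPS, worst frame {max(times):.1f} ms")
```

Both print roughly 60 FPS on average; only the worst-frame number exposes the hitch, which is why people keep asking for frametime graphs rather than FPS counters.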
Were people mentioning performance issues with the card before this was discovered? Granted, it's a relatively new model(?), but I notice conflicting remarks as to what degree the limitation has been, or is, affecting games.
We bought our cards because they were future-proof. Current games are already allocating 3.5-4GB of VRAM. In a year, 3.5GB won't be enough to max out some games, I believe (regardless of the framerate).
Otherwise, developers will be forced to work around this limitation and gimp their games to allocate at most 3.5GB, so essentially, REAL 4GB cards will be held back. Not a nice scenario for people who bought the highest-end cards...
I live in a third-world country, and I spent half my salary just on my MSI 970 4G. I made a (huge) commitment (PC gaming is all about this) by buying good components to be future-proof, and now I get this... Heck, there's no way I can even return the card (no official Nvidia sellers here), so I'm basically screwed if no solution comes out, and since it seems to be a hardware issue, we'll probably be stuck like this forever.
If this doesn't get solved, NVIDIA, please go to hell. I'll buy AMD in the future, or I'll just stay with the PS5 (or Xbox Two?) and screw PC gaming altogether.
You're clearly not informed on this matter and thus shouldn't be making judgements, IMO. And yes, auto-throttling of VRAM has been suspected since the beginning of the thread.
This thread makes me kinda happy, was tempted to pick up a 970 not that long ago, I'm glad that I waited.
Wow. Really?
Then what do you make of the results I posted in this thread? (post here)
There is way too much bandwagoning going on in this thread.
Maybe Nvidia dropped the ball, but we could use more data before jumping to conclusions (especially data not from the VRAM benchmark used so far).
So OK, now I'm lost, because this shit happened "right on time". As always.
What should I buy online this weekend (great two-day GPU sale at my local store), a 290X or a 970? Considering this issue, which GPU is more future-proof?
Not trolling. I'm having serious trouble choosing right now; I'd been considering a 970 before this news came in.
Then go to overclock.net or the Nvidia forum; there are plenty of benches, even with frametime analysis. There's even one in this thread.
650W FSP Aurum 80 GOLD

Unless you want the NVIDIA-exclusive features or prefer Nvidia's drivers, I would always advise getting the 290X over a 970, provided you have a decent PSU.
What is your full PC spec?
What nvidia driver version are you using?
And if possible, could you try again but show the GPU usage and frametimes too? Thanks in advance.
1920x1200: [screenshot]
2880x1800 (DSR): [screenshot]
3840x2400 (DSR): [screenshot]
PC: i7 4790K, Asus Z97 Pro, 16GB RAM, Gigabyte GTX 970 G1, no other PCIe slots in use.
Driver: 347.09
Here are the uncropped screenshots I took. There is GPU usage but no frametimes; I'll include those when I do some more tests. But it was actual gameplay and there was little to no stuttering.
He spent half his salary on a product that isn't as advertised, and all you can do is laugh? Why even bother posting.
You're a good poster, but the product is as advertised. Even if the product is proven beyond refutation to only be able to use 3584MB at full rate, it's still as advertised. The design featuring 8 x 512MB modules is not indicative of anything other than the fact that they add up to 4096MB.
If nvidia actually disabled one of those modules and advertised the card as 4GB, you would have a case for mis-selling.
We can certainly debate whether the product is worth buying, or whether GPU vendors would be wise to be more transparent, but saying the card has been mis-sold doesn't stand up to scrutiny.
you're having a fucking giggle there
Well, not really; those games don't stutter with settings that don't fill as much of the VRAM.
Some games just stutter on any configuration, you're right about that: it's about coding, and not every stutter is related to this. But this specific situation is a bit different, because the card has a clear memory allocation problem.
you're having a fucking giggle there
If you say so.
As stressful a memory test as you can get... 3.9GB in use, a smooth, consistent 60-65 FPS.
http://i.picpar.com/fymb.png
Guessing here, but it could be related to how the memory is being used/allocated/copied/written to that makes it show up as degraded in certain analyses.
But as some say, that could come down to game engine design; there are too many variables, and we need Nvidia to test it.
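For what it's worth, this is roughly what the chunk-by-chunk tests being passed around look like: a rough PyCUDA sketch (not the exact benchmark from the other threads, and it assumes PyCUDA is installed) that allocates VRAM in 128MB blocks until allocation fails, then times a trivial read/write kernel on each block, so a slower final memory segment would show up as lower bandwidth on the last couple of chunks.

```python
# Rough sketch of a chunk-by-chunk VRAM bandwidth test (PyCUDA assumed).
# Allocates device memory 128 MB at a time until allocation fails, then
# times a simple read+write kernel on each chunk and reports rough GB/s.
import time
import numpy as np
import pycuda.autoinit          # creates a CUDA context on the default GPU
import pycuda.driver as drv
from pycuda.compiler import SourceModule

CHUNK_BYTES = 128 * 1024 * 1024
FLOATS_PER_CHUNK = CHUNK_BYTES // 4

mod = SourceModule("""
__global__ void rw(float *buf, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        buf[i] = buf[i] + 1.0f;   // one read and one write per element
}
""")
rw = mod.get_function("rw")

chunks = []
while True:
    try:
        chunks.append(drv.mem_alloc(CHUNK_BYTES))
    except drv.MemoryError:
        break                     # stop once the card refuses another chunk

for idx, buf in enumerate(chunks):
    drv.Context.synchronize()
    start = time.perf_counter()
    rw(buf, np.int32(FLOATS_PER_CHUNK),
       block=(256, 1, 1), grid=((FLOATS_PER_CHUNK + 255) // 256, 1))
    drv.Context.synchronize()
    elapsed = time.perf_counter() - start
    gbps = 2 * CHUNK_BYTES / elapsed / 1e9   # read + write traffic
    print(f"chunk {idx:2d} (~{(idx + 1) * 128} MB allocated): {gbps:6.1f} GB/s")
```

Numbers from a single kernel launch per chunk are noisy, and a card driving a display will fail to allocate well before its nominal 4GB, so treat anything like this as a rough indicator rather than proof on its own.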
If you bought a 4GB DDR3 kit and the last 500MB was 10x slower, are you saying you wouldn't be even remotely annoyed? Especially when you could get another kit that didn't have that problem for the same price?
Anyway, it may be a driver issue, so let's wait for Nvidia to comment.
Did I say you can't be annoyed? I said it doesn't constitute false advertising.