Nvidia responds to GTX 970 memory issue


It didn't say they aren't accepting returns did it(I may have missed it reading)?

Newegg set up an FAQ entry specifically to address the issue, the info in which is nothing more than a regurgitation of what Nvidia has said. You do the math. ;)

Thanks, Jase, for that. The CSR showed that to me yesterday, so I started getting more aggressive with her about the return, saying it was total BS given this information, since what we bought does not work as advertised. She took my info and I had to send her the UPC, serial number, etc., and it's being escalated to their help desk, where I've seen nothing happen so far.

I was considering just selling this card and maybe getting a 290X till all the newer cards release later this year. I have the NZXT G10 on my MSI 970 right now to water-cool it, and the G10 would work on a 290X too, so I'm not worried about heat output and noise since I'm water-cooling. I was thinking of getting the 980, but why reward Nvidia by spending another $200+? And on top of that, what if something else comes out soon and the 980 turns out to have an issue too?

My thing is, I just bought a 4K TV that my gaming PC is hooked up to, and I can't take advantage of 4K with an even more gimped card, whereas the 290Xs are at least pretty decent at 4K. It just really sucks.
 
Yeah, as somebody who's felt held back by his 2GB cards since Watch Dogs/Wolf14, the configuration of the 970 has thrown me for a bit of a loop. One store here in Aus. is temporarily selling Galaxy 980s for $649, which is actually a decent price (roughly USD$500), but the price difference between a given 970 and said 980, even in light of the true nature of the former, is one very bitter pill. I know I should wait until Nvidia responds to AMD by repositioning the 980 to fill the slot that the 970 has slipped from, but, man, I was really looking forward to smashing through AssCreed Unity before Uni resumes and it wouldn't be wise to buy a 970 for that purpose seeing as SLI is the end game (one card now, another later, once I've sold my 670s).

(I've been quite vocal about SLI support -- or rather the patchy nature/lack thereof -- over the past 6 or so months, but as I only play a few "high-end" releases a year at best, the problem doesn't really affect me.)

Edit: Bah, stupid "compact" laptop keyboard. Fixed.
 
Personally, I think this issue is highlighting just how much better some stores' customer service is than others'. I can say without a doubt my next hardware purchase will be via Overclockers, considering how well they've treated customers over this issue. It's kind of amazing how much they're doing compared to most places.
 
UK retailers are probably realising that under the Sale of Goods Act they wouldn't have much choice if people started pushing for refunds. And the kind of people who would find out about this issue and ask for a refund are the kind of people who would easily be able to find out about such legislation.
 
On Boxing Day the guy at the shop kept telling me there was no point in buying a 980; he said the 970 would do the job. For some reason I just couldn't stop looking at the Gigabyte G1 980 and ended up buying it. While I don't feel like I dodged a bullet (the 970 is still a good card), I feel maybe the extra I paid was for this very controversy.

Everyone was confused about the difference in price versus the small difference in specs between the two cards, but I think now we know why it was so.
 
Is freesync limited to AMD cards?
Yes and no.

FreeSync is AMD's implementation of Adaptive Sync.
But a leaked beta driver enabled "module-free G-Sync" on Adaptive Sync-capable laptops.
As for Adaptive Sync support for desktop monitors, Nvidia "hasn't decided yet".

Will a dedicated module be required for G-Sync on future products? Interestingly, NVIDIA says that the answer hasn't been 100% decided yet - the future could include this same module, a new module or perhaps even no module at all. There are features and technologies in the G-Sync module that haven't been revealed, at least according to NVIDIA, and with that line of thinking I don't see a time when NVIDIA relieves the desktop market of that requirement in the near term.
http://www.pcper.com/reviews/Graphi...ed-Leaked-Alpha-Driver/Drawbacks-and-Conclusi
 
Everyone was confused about the difference in price versus the small difference in specs between the two cards, but I think now we know why it was so.

You pay a premium for the 980 no matter what the REAL difference in power is. The price/performance curve is anything but linear.
 
UK retailers are probably realising that under the Sale of Goods Act they wouldn't have much choice if people started pushing for refunds. And the kind of people who would find out about this issue and ask for a refund are the kind of people who would easily be able to find out about such legislation.
Yes, that and the Unfair Contract Terms Act mean the retailer can't escape liability. Anyone in the UK should be able to get a refund.
 
Thanks, Jase, for that. The CSR showed that to me yesterday, so I started getting more aggressive with her about the return, saying it was total BS given this information, since what we bought does not work as advertised. She took my info and I had to send her the UPC, serial number, etc., and it's being escalated to their help desk, where I've seen nothing happen so far.

I was considering just selling this card and maybe getting a 290X till all the newer cards release later this year. I have the NZXT G10 on my MSI 970 right now to water-cool it, and the G10 would work on a 290X too, so I'm not worried about heat output and noise since I'm water-cooling. I was thinking of getting the 980, but why reward Nvidia by spending another $200+? And on top of that, what if something else comes out soon and the 980 turns out to have an issue too?

My thing is, I just bought a 4K TV that my gaming PC is hooked up to, and I can't take advantage of 4K with an even more gimped card, whereas the 290Xs are at least pretty decent at 4K. It just really sucks.

Unless your 4K TV has a DisplayPort, the 980 is really the only other option. One thing the 290/290X lacks is HDMI 2.0 support, which is required to get 60 Hz on nearly all 4K TVs; otherwise you're limited to 30 Hz.
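A quick back-of-the-envelope check shows why HDMI 2.0 matters here. The data-rate figures below are nominal usable rates from the spec (real links also carry blanking intervals), so treat this as a rough sketch rather than exact cable math:

```python
# Rough bandwidth check: why most 4K TVs need HDMI 2.0 for 60 Hz.
# Spec figures are nominal usable data rates; real links add blanking
# intervals, so this is only a back-of-the-envelope sketch.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second (no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4_DATA_GBPS = 8.16   # usable video data rate after 8b/10b encoding
HDMI_2_0_DATA_GBPS = 14.4

rate_60 = pixel_rate_gbps(3840, 2160, 60)   # ~11.9 Gbps
rate_30 = pixel_rate_gbps(3840, 2160, 30)   # ~6.0 Gbps

print(f"4K @ 60 Hz needs ~{rate_60:.1f} Gbps -> fits HDMI 1.4? {rate_60 <= HDMI_1_4_DATA_GBPS}")
print(f"4K @ 30 Hz needs ~{rate_30:.1f} Gbps -> fits HDMI 1.4? {rate_30 <= HDMI_1_4_DATA_GBPS}")
```

4K at 60 Hz blows past what HDMI 1.4 can carry but fits comfortably in HDMI 2.0, which is the whole 30 Hz vs 60 Hz story.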
 
Put through a refund request. Made a few references to the consumer rights guarantees on the ACCC site so hopefully I don't get any kind of resistance.
 
Because there haven't been many games that push past 4GB of VRAM. As games in the very, very near future push 4GB and beyond, the card is going to start showing its age.

Games already push the 4GB limit though, if gaming at above 1080 resolutions, which is why I don't understand people paying out a lot more money for a 980. Far Cry 4 runs out of VRAM at 4K resolutions, on systems with a 980, as evidenced by Nvidia on their own tweaking page for the game. You already have to lower settings to accommodate high resolutions, on a current game, which is indicative of 4GB not being enough sooner rather than later for high resolution gaming.

The 970 will have as 'long a life' as the 980 if only gaming at 1080p.
 
This is from r/pcgaming on Reddit.
Boris Vorontsov of ENBSeries claims that the GTX 970 actually only has 3.5GB of VRAM and that the 'slow' 0.5GB segment is actually system RAM being used. He also claims this can easily be proven by allocating blocks into VRAM and then dumping system RAM: by searching the dump, you can find the contents of those last 0.5GB of 'VRAM' blocks.

http://www.reddit.com/r/pcgaming/comments/2uco43/boris_vorontsov_from_enbseries_has_a_theory_on/

I don't know how true this is, I can't really test it right now but if true... nVidia have a lot to answer for.
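The claimed test boils down to planting a recognizable pattern and then searching a memory dump for it. Here is a minimal, hypothetical sketch of just the search step (the actual VRAM allocation through the driver is omitted; an ordinary in-memory buffer stands in for the system-RAM dump, and the marker value is made up):

```python
import os

# Hypothetical sketch of the dump-search step in the claimed test.
# A real test would allocate the last 0.5GB as "VRAM" via the driver;
# here a plain bytes buffer stands in for a system-RAM dump.

MARKER = b"\xDE\xAD\xBE\xEF" * 4  # distinctive pattern "written to VRAM"

def find_marker(dump: bytes, marker: bytes = MARKER):
    """Return every offset where the marker appears in the dump."""
    hits, start = [], 0
    while (pos := dump.find(marker, start)) != -1:
        hits.append(pos)
        start = pos + 1
    return hits

# Simulated dump: 1 MB of random noise with the marker planted at two offsets.
dump = bytearray(os.urandom(1 << 20))
for offset in (4096, 500_000):
    dump[offset:offset + len(MARKER)] = MARKER

print(find_marker(bytes(dump)))
```

If patterns written to the card's "VRAM" showed up in a dump of host memory like this, that would support the claim; no hits would point the other way.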
 
This is from r/pcgaming on Reddit.


http://www.reddit.com/r/pcgaming/comments/2uco43/boris_vorontsov_from_enbseries_has_a_theory_on/

I don't know how true this is, I can't really test it right now but if true... nVidia have a lot to answer for.
Okay, now people are getting stupid.

Of course the memory is in main RAM. How do you think the computer sends stuff to the GPU? Pixie dust? It's first loaded into system memory and then uploaded to the GPU from there.
 
Okay, now people are getting stupid.

Of course the memory is in main RAM. How do you think the computer sends stuff to the GPU? Pixie dust? It's first loaded into system memory and then uploaded to the GPU from there.

Yeah, in a sense, data is put in system RAM first and then "shadowed" to VRAM before it's used. The theory in that thread is that the last 0.5GB of VRAM is programmed never to be used until system RAM is full (rendering it completely useless and, for all intents and purposes, making it a 3.5GB card), or that it doesn't exist at all.
 
I was loving my Gigabyte GTX 970 G1 Gaming card until I heard this news...luckily I bought my card from Amazon and they agreed to give me a refund...but I'm wondering if I'm better off keeping my current 970, paying $200 more and upgrading to a 980 or, as a last resort, keeping my old GTX 580 and waiting for the next-gen cards from Nvidia...will the issue affect me since I currently game at 1920 x 1200?
 
I was loving my Gigabyte GTX 970 G1 Gaming card until I heard this news...luckily I bought my card from Amazon and they agreed to give me a refund...but I'm wondering if I'm better off keeping my current 970, paying $200 more and upgrading to a 980 or as a last resort keeping my old GTX 580 and waiting for the next-gen cards from Nvidia...I game at 1920 x 1200

Next gen cards from Nvidia are Pascal in 2016. They will require a mobo change and a drastic upgrade.

In my opinion, if you haven't encountered any problems and are a single-GPU user at a standard resolution, you will be fine.

If a refund or upgrade makes you feel better, of course a 980 will be the better card.
 
This is from r/pcgaming on Reddit.


http://www.reddit.com/r/pcgaming/comments/2uco43/boris_vorontsov_from_enbseries_has_a_theory_on/

I don't know how true this is, I can't really test it right now but if true... nVidia have a lot to answer for.
Pretty sure tests confirm that the full 4GB of VRAM is present and available; it's just that the last 0.5GB performs so poorly it's virtually unusable.
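Nvidia's own disclosure puts rough numbers on how slow that last segment is: the 3.5GB partition sits on a 224-bit slice of the memory bus, the 0.5GB partition on the remaining 32-bit slice, and the two can't be read at the same time. A sketch of the peak-bandwidth arithmetic, assuming the reference card's 7 Gbps GDDR5:

```python
# Back-of-the-envelope numbers for the GTX 970's two memory segments,
# per Nvidia's disclosure (256-bit bus split 224 + 32, 7 Gbps GDDR5 assumed).

GDDR5_GBPS_PER_PIN = 7.0  # effective data rate per pin

def segment_bw(bus_width_bits):
    """Peak bandwidth in GB/s for a segment of the given bus width."""
    return bus_width_bits / 8 * GDDR5_GBPS_PER_PIN

fast = segment_bw(224)  # 3.5GB segment
slow = segment_bw(32)   # 0.5GB segment

print(f"3.5GB segment: {fast:.0f} GB/s, 0.5GB segment: {slow:.0f} GB/s "
      f"({slow / (fast + slow):.0%} of the advertised total)")
```

So the last half-gigabyte has roughly an eighth of the advertised bandwidth, which matches the "present but virtually unusable" description.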

I was loving my Gigabyte GTX 970 G1 Gaming card until I heard this news...luckily I bought my card from Amazon and they agreed to give me a refund...but I'm wondering if I'm better off keeping my current 970, paying $200 more and upgrading to a 980 or, as a last resort, keeping my old GTX 580 and waiting for the next-gen cards from Nvidia...will the issue affect me since I currently game at 1920 x 1200?
I would say this depends a lot on how long you were planning on having that 970 in the first place. Or alternatively, how long do you plan on staying at 1920x1200?

I would guess Nvidia's next cards won't come out till later this year. Perhaps they'll come out with a full GM200 Titan II or something beforehand, but it's likely to be *very* expensive.

Is AMD really not an option?

Next gen cards from Nvidia are Pascal in 2016. They will require a mobo change and a drastic upgrade.
That's when they'll come out with a whole new architecture, but just like the 600 and 700 series were both Kepler, the next line-up of Nvidia cards before Pascal will likely still be Maxwell and they can probably do that this year.
 
Next gen cards from Nvidia are Pascal in 2016. They will require a mobo change and a drastic upgrade.

In my opinion, if you haven't encountered any problems and are a single-GPU user at a standard resolution, you will be fine.

If a refund or upgrade makes you feel better, of course a 980 will be the better card.

I'm more concerned about future games that might require 3.5GB+ of VRAM ('The Division', for example)...I don't like to upgrade video cards every year...my last card, a GTX 580, lasted me 3 years and I was hoping for the same with the 970...the 970 might be fine now at 1920 x 1200, but will that still be the case in mid-2016?...by next-gen I was thinking of the rumored 8GB GTX 970/980 cards coming in April
 
I'm more concerned about future games that might require 3.5GB+ of VRAM ('The Division', for example)...I don't like to upgrade video cards every year...my last card, a GTX 580, lasted me 3 years and I was hoping for the same with the 970...the 970 might be fine now at 1920 x 1200, but will that still be the case in mid-2016?...by next-gen I was thinking of the rumored 8GB GTX 970/980 cards coming in April



Bear in mind that a real 8GB GTX 970/980 would need a 512-bit bus to run at full speed. It would cost much more than the current cards.
 
Is AMD really not an option?

I've used both over the years and I just prefer Nvidia at this point...could change in the future with the AMD 300 series...most games that have come out in the past few months have heavily favored Nvidia on the performance side (AC: Unity, Dying Light, Far Cry 4, etc.)...upcoming games like Evolve and Witcher 3 are also Nvidia sponsored...so I just prefer sticking with them until AMD comes out with another 9800 Pro type of card
 
Yeah, as somebody who's felt held back by his 2GB cards since Watch Dogs/Wolf14, the configuration of the 970 has thrown me for a bit of a loop. One store here in Aus. is temporarily selling Galaxy 980s for $649, which is actually a decent price (roughly USD$500), but the price difference between a given 970 and said 980, even in light of the true nature of the former, is one very bitter pill. I know I should wait until Nvidia responds to AMD by repositioning the 980 to fill the slot that the 970 has slipped from, but, man, I was really looking forward to smashing through AssCreed Unity before Uni resumes and it wouldn't be wise to buy a 970 for that purpose seeing as SLI is the end game (one card now, another later, once I've sold my 670s).

(I've been quite vocal about SLI support -- or rather the patchy nature/lack thereof -- over the past 6 or so months, but as I only play a few "high-end" releases a year at best, the problem doesn't really affect me.)

Edit: Bah, stupid "compact" laptop keyboard. Fixed.

Yeah, I saw that card on PCCG, which, like you say, is likely only the start of the 980 price drops. I'm sure I can get a refund for my 970 from MSY, but giving Nvidia any more money at this point doesn't send the right message, given their woeful public relations throughout this episode. The only real way to force change is to hit their bottom line.

Am now thinking of pushing for the refund, getting a 290X, and upgrading again when the higher-VRAM cards are available, whenever that is. Not ideal, but at least Nvidia don't get any of my hard-earned this round. Either way it feels like I'm cutting off my nose to spite my face, given I wasn't provided the opportunity to make a fully informed decision upfront. Dagnappit.
 

Well.

AnandTech said:
Multiple blocks in turn can be teamed together to provide additional bandwidth between two devices, or those blocks can be used to connect to additional devices, with the number of bricks depending on the SKU. The actual bus is purely point-to-point – no root complex has been discussed – so we’d be looking at processors directly wired to each other instead of going through a discrete PCIe switch or the root complex built into a CPU. This makes NVLink very similar to AMD’s Hypertransport, or Intel’s Quick Path Interconnect (QPI). This includes the NUMA aspects of not necessarily having every processor connected to every other processor.

But the rabbit hole goes deeper. To pull off the kind of transfer rates NVIDIA wants to accomplish, the traditional PCI/PCIe style edge connector is no good; if nothing else the lengths that can be supported by such a fast bus are too short. So NVLink will be ditching the slot in favor of what NVIDIA is labeling a mezzanine connector, the type of connector typically used to sandwich multiple PCBs together (think GTX 295). We haven’t seen the connector yet, but it goes without saying that this requires a major change in motherboard designs for the boards that will support NVLink. The upside of this however is that with this change and the use of a true point-to-point bus, what NVIDIA is proposing is for all practical purposes a socketed GPU, just with the memory and power delivery circuitry on the GPU instead of on the motherboard.


http://www.anandtech.com/show/7900/nvidia-updates-gpu-roadmap-unveils-pascal-architecture-for-2016
 

[attached screenshot of the tweet]


The driver update was also a lie.

Source: https://twitter.com/NVIDIAGeForce/status/560878957554569216
 
Christ, I was about to buy this card. Nvidia legitimately deserves to get sued for this.



I'd say this is like Intel or AMD selling a "quad-core" CPU where one of the cores has dramatically lower performance than the other three and not disclosing that information.

When you say something has four cores, you make a pretty strong implication that they all have the same functionality because that's the standard.
Like AMD selling things as 6/8 cores when they share the floating-point unit and cache? Yeah, AMD did so much better. 😉

Corporations scam you; you just have to keep them from doing it. Never be an early adopter. Always read before you buy. And don't expect anything nice. Be ready to return it if you are not happy. Also, don't accept store credit. Be an angry consumer all the time, every time. 😈
 
I was loving my Gigabyte GTX 970 G1 Gaming card until I heard this news...luckily I bought my card from Amazon and they agreed to give me a refund...but I'm wondering if I'm better off keeping my current 970, paying $200 more and upgrading to a 980 or, as a last resort, keeping my old GTX 580 and waiting for the next-gen cards from Nvidia...will the issue affect me since I currently game at 1920 x 1200?

Keep the 970 or keep your 580. I wouldn't reward Nvidia with upgrade sales.
 
[attached screenshot]


I'm getting a good 30fps when using 3.7-3.8GB of VRAM. To those saying Nvidia should hard-lock this card to 3.5GB: no thanks. If some want an option then fair enough, but I want to use mine past 3.5GB, and I have done hundreds of times in the last few months. If they do hard-lock it I'll be furious.
 
[attached screenshot]


I'm getting a good 30fps when using 3.7-3.8GB of VRAM. To those saying Nvidia should hard-lock this card to 3.5GB: no thanks. If some want an option then fair enough, but I want to use mine past 3.5GB, and I have done hundreds of times in the last few months. If they do hard-lock it I'll be furious.

You shouldn't be happy with 30fps on PC...
 
I wouldn't necessarily call it a lie.

The engineer who made the forum post probably spoke out of turn and was most likely silenced by the PR department shortly after. Nvidia doesn't want to promise a fix that isn't possible, so to me it looks like they're just playing it safe for now and not making any promises.

A fix isn't possible in any way. That 0.5GB will always be bad for anything in real time.
It's a frame time test; have you read the thread? Also, many tests so far are at 11-20fps in order to push the VRAM higher than 3.5GB.

For the last time: SLI rigs and setups at 1440p and higher are disproportionately affected. They will be hitting this VRAM wall while still at playable framerates.
 
It's a frame time test; have you read the thread? Also, many tests so far are at 11-20fps in order to push the VRAM higher than 3.5GB.

Then say "a good 30ms frametime" instead of "30fps". And yes, I have read the thread. I've been following it very closely since I've been planning to buy a 970 for a couple months before this came out.
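The fps-vs-frametime distinction matters because an average frame rate hides spikes. A small sketch with a made-up frame-time trace (the numbers are purely illustrative):

```python
# Frame time vs frame rate: a steady 30 fps is ~33.3 ms per frame, but an
# *average* of ~30 fps can hide much worse individual frame times.

def fps_to_ms(fps):
    """Frame time in milliseconds for a given steady frame rate."""
    return 1000.0 / fps

def ms_to_fps(ms):
    """Instantaneous frame rate implied by a single frame time."""
    return 1000.0 / ms

# Hypothetical trace: mostly smooth 60 fps frames with two big stutters,
# the kind of spikes a VRAM wall would produce.
frame_times_ms = [16.7, 16.7, 16.7, 16.7, 90.0, 16.7, 16.7, 85.0]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
print(f"average: {avg_fps:.0f} fps, worst frame: {max(frame_times_ms):.0f} ms "
      f"(~{ms_to_fps(max(frame_times_ms)):.0f} fps momentarily)")
```

An fps counter would report this trace as roughly 29 fps average, while the two 85-90 ms frames are visible stutter; that is exactly what frame-time plots expose and averages conceal.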
 
Like AMD selling things as 6/8 cores when they share the floating-point unit and cache? Yeah, AMD did so much better. 😉

Corporations scam you; you just have to keep them from doing it. Never be an early adopter. Always read before you buy. And don't expect anything nice. Be ready to return it if you are not happy. Also, don't accept store credit. Be an angry consumer all the time, every time. 😈

AMD never lied about that spec, though. Where do people pull this argument from?
 
I was loving my Gigabyte GTX 970 G1 Gaming card until I heard this news...luckily I bought my card from Amazon and they agreed to give me a refund...but I'm wondering if I'm better off keeping my current 970, paying $200 more and upgrading to a 980 or, as a last resort, keeping my old GTX 580 and waiting for the next-gen cards from Nvidia...will the issue affect me since I currently game at 1920 x 1200?

As a guy who planned to upgrade for the Oculus Rift CV1 even before I bought my 970 (thus I'm as unbiased as I can be...), I have to say it's puzzling to me how complicated you want to make life for yourself. Continue to enjoy your 970; it's exactly as good as it was yesterday. Find an alternative way to tell Nvidia it's not OK, rather than worrying yourself over something that is mostly on paper anyway.
 
Does an SLI 970 setup make less sense now with all these memory problems? I have two right now and I'm wondering if I should just return one of my 970s, because I wanted to downsample and it seems like that 3.5GB barrier would cause more problems.
 