Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

Quick test with my MSI 970 Blue Tiger, AC: U | Maxed Out - 4K res - 4xMSAA

iAWW6RI1OdVUb.jpg

From what people are posting, the card can go above 3.5GB usage, but anything over 3.5GB causes the frame rate to take a huge hit. People have also posted that their 970s are failing to hit 4GB @ 1080p in the exact places where a 980 hits that number.
 
From what people are posting, the card can go above 3.5GB usage, but anything over 3.5GB causes the frame rate to take a huge hit. People have also posted that their 970s are failing to hit 4GB @ 1080p in the exact places where a 980 hits that number.

This is happening to me as well. I can go above 3.6GB, but with massive framerate drops and stuttering; it doesn't happen below 3.6GB.
 
From what people are posting, the card can go above 3.5GB usage, but anything over 3.5GB causes the frame rate to take a huge hit. People have also posted that their 970s are failing to hit 4GB @ 1080p in the exact places where a 980 hits that number.

What a weird bug/issue/whatever. Jeez, hope that gets sorted out soon.
 
What games have you tried? I wish I could test but I don't have any real big VRAM hitters.

I get massive stutter as soon as COD: AW hits 3.6GB when framerates are above 80fps. The same happens in Far Cry 4; as soon as it starts to consume 3.8GB, it starts to stutter like hell.
 
I get massive stutter as soon as COD: AW hits 3.6GB when framerates are above 80fps. The same happens in Far Cry 4; as soon as it starts to consume 3.8GB, it starts to stutter like hell.

Hrmm, curious... I think my brother's cards may have had the same problem when I visited him.
 
Hmmm, these VRAM issues are really interesting; I'll test my card when I get home. Wouldn't these issues be grounds for a recall of sorts if this turns out to be a hardware problem?
 
I just did a quick test with Shadow of Mordor. All settings maxed out except AO at High. Resolutions ramped up through DSR. The card is a 970 G1 (at stock settings).
Results:
- 1920x1200 -> 3560 MB / 96.4 fps
- 2880x1800 -> 3753 MB / 54.4 fps
- 3840x2400 -> 4035 MB / 25.2 fps

Results are as expected.
 
The picture I posted was the one posted by the user claiming Tomb Raider at 4K looks like CG; I didn't cherry-pick anything. The screen in the post following it doesn't look any closer to CG. Whether or not Tomb Raider looks ugly might be subjective, but comparisons to CG are not.

Eh, a bit late to respond, but I just wanted to point out that I wasn't saying that image in particular looks like CG. I see how that could be drawn from my comment though. Oops.

I was just commenting on how clean it can look, while playable at 30fps, on the 980. (4k down-sampled to 1080p)
 
Would Nvidia's Shader Cache feature be the main culprit behind this? Isn't there a certain amount of VRAM that is sectioned off for Shader Caching when the feature is enabled?

No, not at all. Shader cache saves compiled shaders to a disk to reduce loading times. It has nothing to do with VRAM whatsoever.
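The idea is essentially a content-addressed build cache on disk. A minimal Python sketch of the concept (the hashing scheme and the `compile_shader` stand-in are illustrative assumptions, not Nvidia's actual implementation):

```python
import hashlib
import os
import tempfile

# Illustrative stand-in for an expensive driver-side shader compilation.
def compile_shader(source: str) -> bytes:
    return ("COMPILED:" + source).encode()

def cached_compile(source: str, cache_dir: str) -> bytes:
    """Return compiled shader bytes, reusing an on-disk copy when present.

    The cache lives on disk, not in VRAM: a hit skips recompilation
    (a CPU cost at load time), which is why the feature affects loading
    times rather than video memory usage.
    """
    key = hashlib.sha256(source.encode()).hexdigest()
    path = os.path.join(cache_dir, key + ".bin")
    if os.path.exists(path):            # cache hit: read back from disk
        with open(path, "rb") as f:
            return f.read()
    blob = compile_shader(source)       # cache miss: compile once...
    with open(path, "wb") as f:         # ...and persist for the next launch
        f.write(blob)
    return blob

cache = tempfile.mkdtemp()
first = cached_compile("void main() {}", cache)   # compiles and stores
second = cached_compile("void main() {}", cache)  # served from disk
```

Either way, the cached blobs sit in system storage, so enabling the feature shouldn't carve anything out of the card's 4GB.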
 
A buddy just got an ASUS Strix 970. It only came with the card and a driver CD. Did he get an open-box unit, or is that all that is included?
 
I just did a quick test with Shadow of Mordor. All settings maxed out except AO at High. Resolutions ramped up through DSR. The card is a 970 G1 (at stock settings).
Results:
- 1920x1200 -> 3560 MB / 96.4 fps
- 2880x1800 -> 3753 MB / 54.4 fps
- 3840x2400 -> 4035 MB / 25.2 fps

Results are as expected.

The game should be using 4GB of VRAM at 1920x1200 (as it does on a GTX 980). For some reason the same isn't happening on the GTX 970 without using some absurd settings like 5K etc.
 
Got a Gigabyte GTX 970 G1 today and so far everything has been great. No problem installing it, no coil whine and temperatures are good. The performance is amazing and I am really happy with my purchase.

edit: nvm, figured it out.
 
Does anyone know if the PhysX in Alice: Madness Returns is meant to be pretty heavy going when set to High? I thought a GTX 970 and an i5 3570K @ 4.2GHz would crush this game at 1080p/60fps, but I've noticed drops into the 40s. I just didn't think the PhysX in this game would be that taxing on such a new card. Most of the time it is 60fps (unlocked framerate), but there are moments where it drops.
 
Does anyone know if the PhysX in Alice: Madness Returns is meant to be pretty heavy going when set to High? I thought a GTX 970 and an i5 3570K @ 4.2GHz would crush this game at 1080p/60fps, but I've noticed drops into the 40s. I just didn't think the PhysX in this game would be that taxing on such a new card. Most of the time it is 60fps (unlocked framerate), but there are moments where it drops.

Alice: MR doesn't use PhysX; it's their own physics engine. I can't comment on the performance though.

Besides, does anyone have any update on the issue of 970s not being able to use past 3.5GB?
 
For 770 owners is it worth upgrading to 970/980? Or just wait for the next card?

Depends on a few factors: Do you have a 2GB or 4GB 770? Is it OC'd? What resolution are you playing at? And are you asking about the 970 or the 980, as (OC'd) there can be a decent difference?

This page from the AnandTech review does a pretty good job of breaking it down:

http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22

For example, I went from a 2GB 770 to a 970, and then did the EVGA Step-Up to a 980. Part of it was because it was a Christmas treat for myself, the other was that I really enjoy what DSR does for pretty much every game, and the 980 really starts to pull away at higher resolutions.
 
For 770 owners is it worth upgrading to 970/980? Or just wait for the next card?

It depends. If you can sell it for ~$200, a 970 is a good upgrade.

I had a GTX 770 and found its performance to be kinda bleh. I sold mine for $225 just before the 970/980 came out. I was going to get a 970, but since I had a credit at Micro Center and they had a decent deal on a 980, I got that.

This was my first PC (starting with the 770) so I have not gone through previous upgrade cycles... with that said I found the performance increase to be quite noticeable. Not a huge leap, but definitely noticeable. The reduced noise, heat and power draw was also appreciated.

The new AMD cards sound very promising, but as always if you wait for the "next big thing" you will be waiting forever.
 
Alice: MR doesn't use PhysX; it's their own physics engine. I can't comment on the performance though.

Besides, does anyone have any update on the issue of 970s not being able to use past 3.5GB?

Alice: MR does use advanced PhysX for smoke, debris and particles (PC only). The hair is custom tech.
 
Should I RMA my G1 970 if coil whine is very noticeable in practically all my games? Can they get it fixed?

Have you tried overclocking/overvolting it and benchmarking it? That has helped me get my G1's coil whine down to manageable levels.

Too bad mine seems to be on the losing end of the silicon/BIOS lottery. I can only add 60MHz to the core clock and 299MHz to the VRAM, but that might be due to the voltage scaling being terrible. If anyone is a member of Overclock.net, there's a nice set of custom BIOSes on there that might help, but they're totally unavailable to non-members.
 
Can't believe EVGA is releasing an FTW+ edition when I just got the FTW edition from Amazon in December last year (for the global warranty, since I'm from Australia).

Wish I had known about this beforehand, but I'm happy either way; I upgraded from an 8800 GT 512MB to a GTX 970.
 
Nvidia’s Monstrous 12GB Quadro M6000 Flagship GM200 GPU Confirmed Via Driver Update – Launching Soon

Well well well, the upcoming Nvidia launch on the 22nd just got potentially more exciting. We have received independent confirmation of the Nvidia Quadro M6000 via a highly reliable source: driver updates. This particular card was first spotted some time ago in the TechPowerUp GPU database, and its presence in the drivers now more or less confirms it is nearing its launch date.

Big Daddy Maxwell paper launching on the 22nd? Quadro M6000 (GM200GL) and K1200 Confirmed

We have been waiting for the big daddy Maxwell, aka GM200, card to arrive for quite some time now, and it looks like that is going to happen pretty soon. Nvidia has its January 22nd event coming up and, unless I am very much mistaken, it will unveil its Quadro flagship equipped with the GM200GL core. The INF update in question can be found over at LaptopVideo2Go; here is an extract of the relevant portion:

NVIDIA_DEV.13BC = "NVIDIA Quadro K1200"
NVIDIA_DEV.17F0 = "NVIDIA Quadro M6000"


Now you might recall the specifications that appeared on TechPowerUp a while back. If you remember, the device ID was 10DE-17F0, the same device ID now shown in this update. This pretty much confirms the card's existence and indicates that consumer samples are being prepared (if not ready already). If the GPU-Z tool was reading the core correctly, then the following are the specifications of the GM200 GPU:

5IXeFGZ.jpg


The GM200GL core rocks 3072 CUDA cores in total with a nice 384-bit bus width. Seeing Nvidia's past trends, that comes as a welcome relief. Since the card has a 384-bit bus width, it will have 96 ROPs. The Maxwell architecture is laid out differently from Kepler, and 96 ROPs equate to 192 TMUs on the die. For those wondering, we can now safely name the core of the Quadro M6000 because of the ID verification present at PCI Devices.
The specifications do look pretty epic; however, at first glance it's hard to tell whether this is the fully enabled version of the GM200 core, although I am pretty confident that it is. The x000 nomenclature in Nvidia's Quadro line has usually been reserved for the full cores (the K6000 and K5000 are obvious examples), which leads us to believe that the M6000 is no exception.

http://wccftech.com/quadro-m6000-flagship-professional-gpu-spotted-gm200-finally/
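The article's ROP and TMU figures follow from Maxwell's known per-unit ratios as seen on GM204 (16 ROPs per 64-bit memory partition, 128 CUDA cores and 8 texture units per SMM). A quick sanity check of the quoted GM200 numbers:

```python
# Sanity-check the quoted GM200 specs against Maxwell's layout ratios,
# taken from GM204: 16 ROPs per 64-bit memory partition,
# 128 CUDA cores and 8 TMUs per SMM.
BUS_WIDTH_BITS = 384
CUDA_CORES = 3072

partitions = BUS_WIDTH_BITS // 64   # 64-bit memory partitions -> 6
rops = partitions * 16              # 16 ROPs per partition -> 96
smms = CUDA_CORES // 128            # 128 CUDA cores per SMM -> 24
tmus = smms * 8                     # 8 texture units per SMM -> 192

print(partitions, rops, smms, tmus)  # 6 96 24 192
```

So the 96 ROP / 192 TMU figures are exactly what a fully enabled 3072-core, 384-bit Maxwell part should have.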

With Windows 10 and DX12 / Direct3D12 being a huge inflection point for new products for both Nvidia and AMD, I'd be shocked if we don't see at least one consumer graphics card based on the Big Maxwell GPU this year. Even if, at worst, it is a GTX Titan II on 28nm.

A reasonably priced 8 GB GTX 1080 in Q3 or Q4 would be so sweet. Would be sweeter if it was on 20nm.

If it happens and I got one, then for sure I would skip the Pascal generation entirely and not upgrade again until Volta, which I would imagine would be roughly around the same timeframe as PS5 and the next Xbox.
 
I don't believe for a second Nvidia will retrofit big Maxwell with 12GB in the consumer space.

Certainly not the consumer space (GTX 1080 ?)

Maybe the Prosumer space (re: the Titan line), or maybe not.

Maybe 10 GB for Titans and 8 GB for regular GTX line.
 
No way any Nvidia card launches in the consumer space with anything over 8GB in the near future. There won't be any use for it in games for a long time.
 
No way any Nvidia card launches in the consumer space with anything over 8GB in the near future. There won't be any use for it in games for a long time.

I think the 11xx series (which is almost assuredly not what the follow-up to the 9xx series will be named, but I digress) is the earliest point at which we'll see 8GB as a default configuration.
 
Looks like this guy found a way to test this problem quickly and easily, and it looks like my card doesn't have the problem!
http://imgur.com/KBLMbla
http://www.techpowerup.com/gpuz/details.php?id=6bp5e
At least with this test; either way, I think I should be OK.
Here is the video:
https://www.youtube.com/watch?v=bDCqYO-6HQ0&ab_channel=TonciJukic
(the link for the benchmark software he used is in the description).

Some 970s can hit 4GB usage, but it affects performance significantly; can this test detect that issue, though?
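A toy model of why crossing ~3.5GB could tank frame rates even though allocation succeeds: assume (as the reports in this thread suggest) that the card's last 0.5GB is served much more slowly than the first 3.5GB. The bandwidth figures below are purely illustrative assumptions, not measurements:

```python
# Toy model of a segmented VRAM pool: average bandwidth when an
# allocation spills past the fast segment. All figures are assumptions
# for illustration, not measured values for any real card.
FAST_SEG_GB = 3.5     # assumed full-speed region
SLOW_SEG_GB = 0.5     # assumed slow region
FAST_BW = 196.0       # GB/s, assumed bandwidth of the fast segment
SLOW_BW = 28.0        # GB/s, assumed bandwidth of the slow segment

def effective_bandwidth(alloc_gb: float) -> float:
    """Average bandwidth for streaming once through alloc_gb of VRAM."""
    fast = min(alloc_gb, FAST_SEG_GB)
    slow = max(0.0, alloc_gb - FAST_SEG_GB)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return alloc_gb / seconds

below = effective_bandwidth(3.4)   # entirely in the fast segment -> 196.0
above = effective_bandwidth(3.8)   # spills 0.3 GB into the slow segment
```

Under these assumptions, spilling just 0.3GB into the slow segment drags the average from 196 GB/s down to roughly 133 GB/s, which matches the pattern people describe: fine below ~3.5GB, heavy stutter above it. A chunked allocate-and-time benchmark like the one in that video would expose exactly this kind of per-segment bandwidth cliff.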
 
I think the 11xx series (which is almost assuredly not what the follow-up to the 9xx series will be named, but I digress) is the earliest point at which we'll see 8GB as a default configuration.

I'm thinking nothing over 8GB until Volta. I could see an 8GB Titan-type Maxwell card as possible though, at an extreme price for sure.
 
I'm thinking nothing over 8GB until Volta. I could see an 8GB Titan-type Maxwell card as possible though, at an extreme price for sure.

8GB+ for the next Titan card is practically a given, but I wouldn't exactly say it's intended for consumers.
 