Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

My GPU (970) is using ~800MB of memory just sitting in Windows (8.1), is this abnormally high?

I'm hitting my memory limits in both Watch Dogs (High) and Mordor (High).
Using Windows 7, I'm at 285MB, I don't know how much that helps.

Does *seem* high.

And using all 4GB in Watch Dogs and Mordor without Ultra settings also points to something not being right.
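For anyone wanting to check their own idle VRAM usage, here's a minimal sketch that parses the CSV output of `nvidia-smi` (the `--query-gpu=memory.used --format=csv` flags are standard; the function names and the sample values in the comments are mine, not from any particular post):

```python
import subprocess

def parse_memory_used(csv_output: str) -> list[int]:
    """Parse `nvidia-smi --query-gpu=memory.used --format=csv`
    output into MiB values, one entry per GPU."""
    rows = csv_output.strip().splitlines()[1:]  # skip the "memory.used [MiB]" header
    return [int(row.strip().split()[0]) for row in rows]

def query_memory_used() -> list[int]:
    """Ask the driver how much VRAM is currently in use."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_memory_used(out)
```

On an idle Windows desktop like the one described above, `query_memory_used()` would report a single value per installed GPU, so you can poll it before and after launching a game to see how close you are to the 4GB ceiling.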
 
Man, it doesn't stop. The MSI GTX 970 is back at 432 EUR on amazon.de.
Hopefully we'll get decent prices before summer.

Go for another retailer :) Although it's out of stock, hardwareversand.de currently has it listed at 380€. You can also look up deals here; prices are as low as 369,89€ in stock. Amazon is not always the best for online shopping :D
 
EVGA GTX 970 FTW+ shown @ 0:35. The video also shows off the GTX 980 Classified Kingpin and an AIO watercooled GTX 980. It looks like the FTW+ is the same as the new SSC, with slightly higher clocks and a backplate.

I've decided to return my Gigabyte GTX 970 G1 for the FTW+ once it comes in stock.
 
HOLY CRAP!!

The EVGA GTX 980 ACX 2.0 runs cool....I've never had a card run this cool at stock with me pushing resolutions through the roof.

My 780 when playing NFS:HP at 3840x2160 (4k) would hit 74-80 degrees C.

This freaking 980 does it at a locked 60fps with an average of 44 degrees C.

I'm blown away by how cool this card runs. I'm going to try it in what I would expect are much more demanding games next. Time for some 4k modded Skyrim, Sleeping Dogs, Witcher 2, Crysis 3, etc. Woo


Edit* This is using DSR, which I'm really liking as an easy to use setting on these cards.
 
Eh, given the price of the FTW compared to the G1, I don't think returning the G1 would be worth it unless the FTW+ was only $10-15 more expensive than the G1 and/or you have space constraints.
 
Do manufacturers take RMAs for coil whine? I have been thinking about getting a 970 for a while but I've been put off by the prospect of coil whine.

My 6950 has awful whine in Linux (for some reason) but is fine in Windows. If there was some way to mitigate the chance of being stuck with a card with whine I would probably buy a GTX 970 soon.
 
Well, theoretically, you could reduce coil whine by tweaking voltages and such, since coil whine comes from the inductor coils on the card vibrating at an audible frequency.
 
Just played Tomb Raider at 4k on the EVGA 980 with everything cranked. This card keeps impressing me. Got an average of 30fps with TressFX on...I wonder how it runs without it, should have tried before calling it a night.

Who needs AA??

I sit about 10-12' from my TV...and it looked like CG at times, so purty.

gvfkew.jpg
 
I was mucking around with sound card drivers last week, and I don't know what I did, but my FPS halved. Rather than roll back to a backup, I did a long overdue reinstall. Yikes. I thought I was past loading my PC with junk, but my 970 is performing better now than when I first got it. Random CTDs have all but stopped too. Good times.
 
Yup, TR can look really good at times for a game terribly held back by the 360/PS3:
Y7tFWpE.jpg

Not my PC shot.
 
The PC version has dramatically higher graphics settings than what the 360/PS3 versions used. They basically ported and tweaked the PC version to PS4/Bone to create the Definitive Edition. There's nothing being held back on the PC version of TR2013, even today it's one of the best looking games around.
 
No, the assets/scope/lighting were made with the 360/PS3 in mind. The game looks like a last-gen game even with all the bells and whistles. There's no way it could have been otherwise short of the PC version undergoing a very substantial rework.

I'm convinced the PC hardware of the time could handle much more, as evidenced by the fact that a 680/7970 can run better-looking games (AC Unity, for example).
 
Gigabyte or EVGA if you can afford it, and if the Gigabyte cards fit in your case. Gainward and Palit have less effective coolers, IIRC from the reviews I've seen of them.
 
so higher screen res = cg now?

I think it's the image quality that people generally associate with CG, namely the lack of aliasing, which heavy downsampling provides. It's about the only consistent quality of "CG" across all generations.

I've seen someone say that "downsampling" is the wrong name for it, since it typically means just discarding all information above a certain threshold, whereas here the extra detail is rendered first and then filtered down. Perhaps supersampling is the better term? That's the term devs seem to use when their games support it too.

Edit: Getting geeky about it now, it looks like the correct technical term for it is decimation.

Wikipedia said:
In digital signal processing, decimation is the process of reducing the sampling rate of a signal. Complementary to interpolation, which increases sampling rate, it is a specific case of sample rate conversion in a multi-rate digital signal processing system. Decimation utilises filtering to mitigate aliasing distortion, which can occur when simply downsampling a signal. A system component that performs decimation is called a decimator.

GeDoSaTo / DSR: we be decimators, decimating jaggies.
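To make the decimation point concrete, here's a minimal NumPy sketch (function name and test pattern are mine) of what DSR-style downsampling does: box-filter each N×N block of the high-res render (the low-pass step) before dropping to the target resolution, versus naive point sampling, which aliases:

```python
import numpy as np

def supersample_downscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block, then decimate:
    render high, low-pass filter, sample down."""
    h, w = img.shape[:2]
    assert h % factor == 0 and w % factor == 0
    blocks = img.reshape(h // factor, factor, w // factor, factor, *img.shape[2:])
    return blocks.mean(axis=(1, 3))

# A 1px checkerboard is the classic aliasing torture test.
checker = np.indices((4, 4)).sum(axis=0) % 2  # alternating 0/1 pattern

filtered = supersample_downscale(checker, 2)  # every block averages to 0.5
naive = checker[::2, ::2]                     # point sampling hits only 0s: aliasing
```

The filtered result is a uniform mid-grey, which is what the fine pattern should look like at the lower resolution; the naive point-sampled version collapses to solid black, the aliasing that decimation's filtering step is there to prevent.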
 
I just bought an MSI GTX 970 Gaming yesterday. So good. Surprisingly quiet too. I got it for 360€; there were only 2 left in the store.
 
gvfkew.jpg


There is absolutely nothing about this image or game that is reminiscent of CG.
 
Like I said, I think the game looks ugly, though I can't say that isn't a subjective opinion. I haven't seen enough screens/videos of the game to make a fair judgement, but nothing I've seen or played of it has appealed to me. The screen you posted has clearly been cherry-picked for its ugliness (it even looks like there's an error with the settings/renderer), conversely to the previous one on the page, which was picked for its positive qualities and which I'd say is reminiscent of "CG", albeit still not appealing to me.
 
A little anecdote on the 970s. I had to RMA my Zotac GTX 970. First card or piece of hardware I ever had to RMA. Also the first time I bought Zotac. Just sayin'.
 
The picture I posted was the picture posted by the user claiming Tomb Raider at 4k looks like CG; I didn't cherry-pick anything. The screen in the post following it doesn't look any closer to CG. Whether or not Tomb Raider looks ugly might be subjective, but comparisons to CG are not.
 
I didn't realize that screen was posted before in this page, my bad.
 
Damn, that FTW+ looks so tempting. I might just jump on it after taxes instead of waiting for the 8GB versions, because I might be waiting for a while.

Very likely. As I've said previously in this thread, Nvidia has no need to make 8GB models when they can barely satisfy demand for the 4GB ones.
 
Installed my GTX 980 today, upgrading from a Radeon 7970 GHz Edition. I have the EVGA SC edition; this card is a fucking monster. Watching it do its thing on Metro: Last Light is amazing.
 
I think I have seen mine hit 3800MB in Shadow of Mordor. I am going to run it tonight to see.

I bought the card but haven't been playing (all that) graphically demanding games; staying under 2GB of VRAM is normal for me, so getting to and above 3.5GB is tough going.

Edit: I just tested Mordor and the highest I got was 3538MB, albeit in a very quick test. Didn't notice any frame rate drops.

Does anyone have The Evil Within to test? What is the VRAM usage for that game?
 
My mate brought one of his 970s round to try in my Core 2 Quad Q8300 system in place of my 9800 GT (lol). It worked surprisingly well considering the rest of the PC is old and/or crap, which may mean I can just buy a few parts of my gaming PC at a time instead of forking out a grand in one go.
 