Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

I messed around with DSR last night for the first time. It was my first attempt at any sort of downsampling. I selected all the resolution scales and booted up Tomb Raider. I tried 4K at first and it was sub-30fps, so that was a no go lol. I bumped it down to 1440p and it ran a little below 60fps, but it looked 'worse' than it does at 1080p. I was still using the in-game FXAA that I use at 1080p, but it didn't look clean at all. Is it that FXAA just doesn't work well at higher resolutions? I'm usually satisfied with it most of the time.
 

FXAA should only ever be used as a last resort; in some games it looks alright, but in most it blurs everything noticeably. Also, you should play around with the DSR Smoothness in the driver settings; the default (33%) is a tad too high imo.
 

I turned the smoothness off (or to 0%) first thing. I never tried turning it back up any. The next step up in Tomb Raider for AA is 2xSSAA and it's too taxing.
 
I have questions related to coil whine.

1. People have had success by enabling v-sync to keep frames at 60fps. That's all well and good, but if you're playing a game where you get dips below 60, what then - suffer jumps between 30 and 60? Can you set v-sync on a per-game basis, or perhaps set a limit of 60fps instead of v-sync?

2. I've never followed a GPU this closely. Is this a vocal minority thing, or is it a widespread issue? Are other GPUs like this? I had barely heard of it before the 970.

what then - suffer jumps between 30 and 60

This does not happen with all games. You'll see this behaviour when a developer has implemented double-buffered Vsync in their title. An option in this situation is to disable the game's Vsync feature and force NVIDIA's Vsync feature in the control panel. Can be set per game.
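To put rough numbers on it: with double-buffered vsync a finished frame can only be shown at the next vblank, so on a 60Hz screen any frame that takes longer than ~16.7ms gets held for a second interval and the rate snaps to 30fps (then 20, 15, and so on). A tiny sketch of that arithmetic, purely as an illustrative model rather than any driver API:

```python
import math

REFRESH_HZ = 60.0
VBLANK_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms on a 60Hz display

def effective_fps_double_buffered(render_time_s):
    """With double-buffered vsync the swap waits for the next vblank, so the
    effective frame time is the render time rounded up to whole vblank
    intervals (illustrative model only)."""
    vblanks = math.ceil(render_time_s / VBLANK_INTERVAL)
    return 1.0 / (vblanks * VBLANK_INTERVAL)

print(effective_fps_double_buffered(0.015))  # 15 ms/frame -> 60.0 fps
print(effective_fps_double_buffered(0.018))  # 18 ms/frame -> 30.0 fps (missed the vblank)
print(effective_fps_double_buffered(0.036))  # 36 ms/frame -> 20.0 fps
```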

Can you set v-sync on a per-game basis

Yes, very easily, using the NVIDIA control panel.

or perhaps instead of v-sync setting a limit of 60fps?

For this you will need a third-party application. NVIDIA Inspector will do the trick, allowing per-game frame-rate caps. Out of the box you can set frame-rate limits to the most common rates, but you can add your own values (eg, you might need 57fps) with only a few minutes of tinkering.
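For what it's worth, the idea behind any frame cap is just "render, then wait out the rest of the frame budget." A minimal sketch of that concept (purely illustrative; this is not how NVIDIA Inspector actually hooks a game, and `render_frame` is a hypothetical stand-in):

```python
import time

TARGET_FPS = 57.0                # example custom cap, like the 57fps mentioned above
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def run_capped(render_frame, num_frames=600):
    """Render a frame, then sleep off whatever remains of the frame budget."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
```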
 
lower power consumption = lower temps = greater room for overclocks

Not always true.

High power consumption = good cooling = lower temps.

Most OEMs are using their existing 770/780-style coolers, like Gigabyte putting a WindForce 3X cooler on their 970 despite the card only using 145W of power. When you put it under extreme load you're barely seeing them top 60°C because of the stupidly excessive amount of cooling available.

Overclocks? It really comes down to the chip. The 970 is severely TDP-limited but is also ridiculously underclocked at stock. A 1450MHz boost isn't outside the realm of possibility, which is utterly ridiculous compared to most 770s, which would top out just before hitting 1300MHz.
 

It depends entirely on the OEM. Some cards come with cheaper, stock-style coolers, and those have no problem hitting 80°C in certain scenarios. I ordered the small Zotac an hour ago and users are reporting completely different temperatures as well.

 

The Gigabyte is kind of a special case because it's ridiculously over-engineered even for a 770. But Asus are still using their Direct CU II and MSI still have Twin Frozr so the quality OEMs aren't exactly throwing their existing cooling solutions out the window. It's not like they needed to engineer new ones with the cards dropping 75W in TDP across the board.
 
Good on them, but things like this are why I'm waiting a couple of months. Factory overclock, ACX 2.0 cooling, and a backplate (like the one AnandTech got for review).

Just a heads up: if you want the best version of the ACX 2.0 cooling, buy the FTW version of the 970. (If you want a 970, that is, but it should apply to the 980 too.)

The FTW features the ACX 2.0 with 4 long heatpipes. All the other versions with the ACX 2.0 (SSC, SC...) use a different design with 3 heatpipes.
 
My EVGA SC 970 ACX 2.0 has some coil whine; it's not extreme, but it's there and I'm not sure what to do. How long would it take, and would it cost anything, to get it replaced by Newegg or EVGA?
 
My MSI 970 got shipped. CAN'T WAIT. I don't know why, because my 660 Ti ran everything at 60FPS... but CAN'T WAIT. Hope it doesn't have coil whine; my 660 Ti was relatively fine, just the typical GTX "buzzing" when under load.
 
Think I'm going to be working overtime out the wazoo this month and next to pick up a later batch MSI 970 around when Dragon Age comes out.
 
I just installed my MSI GTX 970, in the second PCI-E slot because the chassis is too small to fit it in the first slot (Bit.Phenom).

The motherboard I have states it has 2 slots at x16 v3.0, but my card is running at x8 v1.1 according to GPU-Z. I changed this in the BIOS from Auto to Gen3 to see if it made any difference, but GPU-Z still reports the same.

Anyone have any ideas on how to solve this?
 

When it's not under load it reports lower speeds. There is a question mark button next to it that puts the GPU under load, and then it should show correctly.
 

Have you tried launching the render test (click on the ? next to the Bus Interface field in GPU-Z) to see if it makes any difference? PCIe "downclocks" itself while idle.
 

I did try this, and it went from 8x v1.1 to 8x v3.0.

Still no 16x though :/

Edit: Found the answer while searching, and realized I don't understand motherboard marketing anymore. *getting old* Heh.

2 x PCI Express 3.0 x16 slots (PCIE1/PCIE3: single at x16 (PCIE1) / x8 (PCIE3) or dual at x8/x8 mode)
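If anyone wants to confirm the negotiated link without GPU-Z, NVIDIA's NVML library can report it too. A quick sketch assuming the pynvml Python bindings are installed (and remember the link still drops to a lower generation at idle, so check it while the card is under load):

```python
# pip install nvidia-ml-py  (provides the pynvml module)
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

cur_gen   = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(gpu)
max_gen   = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(gpu)
cur_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(gpu)
max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(gpu)

print(f"Current link: PCIe gen {cur_gen} x{cur_width}")
print(f"Maximum link: PCIe gen {max_gen} x{max_width}")
pynvml.nvmlShutdown()
```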
 
What's the best benchmark for checking stability during overclocking? I know people use 3DMark for actually getting a benchmark score, but it seems kinda long for just checking if my GPU is holding up at a specific clock speed.
 
It's better than regular downsampling because it doesn't change your refresh rate, and there's no input lag. Also it looks sharper from what I've heard.

So there IS input lag with downsampling! I ALWAYS thought that my downsampled games felt a bit... weird, but I could never pin it down, and since framerate and refresh rate were basically the same I thought it was just in my head. So if DSR doesn't add input lag, it could be a game changer.
 
I'm trying to use GeForce Experience to mess with my settings on account of my desire to try out Shadowplay, but I can't, like, click on anything in the Shadowplay options, which makes it somewhat hard to do anything in regards to settings.

Any idea what's up?

EDIT: NVM. That's stupid, but I fixed it.
 

Click the ShadowPlay button in the upper right, click the switch to turn it on.
 
Seriously, I've been trying this for days. How do you enable DSR?

I can only enable it for 3 of my 300 installed games, via GeForce Experience. Can't you just enable it via the control panel or, even better, Nvidia Inspector?
 

I believe it's done with the Nvidia control panel.
 

It should be in the panel, yeah.
 

Enable it as shown in the screenshot above, and then it will just show up as a new resolution in all your games.
 
Got a second Asus 970 to pick up tonight - it's not for me, since I'm trading it for my brother-in-law's PS4/games. But I won't be seeing him until the weekend... maybe I'll try out SLI for a few days. I'll be stuck at PCIe 2.0 x8 though, since I've got a 2500K on a Z68 motherboard.

No idea what I'll even test it out on. Maybe buy Crysis 3 if it's still on sale? I can easily max every other game I own with one 970, many even downsampled from 4K.
 

8x should not limit you at all IIRC.
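Rough numbers, for anyone curious: PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, so an x8 link gives about 4 GB/s each way, versus roughly 8 GB/s for 3.0 x8 and 16 GB/s for 3.0 x16. PCIe scaling tests from around this time generally showed only a few percent difference for a single card between those links, which is why x8 2.0 isn't considered a real bottleneck; SLI may be a bit more sensitive, but it's still usually minor.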
 
Thanks guys, finally! GeForce Experience drives me mad each time I try to use it; it's completely useless.

Regarding DSR Smoothness - is there any recommended value for 4K, or 2x/2.5x?

That's up to your personal preference. I just left it at the default 33%; looks good to my eyes.
 

If I recall correctly, the DSR smoothness is a Gaussian filter that's only really useful for eliminating artifacts when downsampling from a resolution that is not an integer multiple of the native output resolution (for example 2560x1440 to 1920x1080). In the case of 3840x2160 to 1920x1080, however, the former is exactly twice the width and height (and exactly 4x the pixel count), meaning such artifacts should not be present and the filter is unnecessary. You can test this yourself, but when going from 4K down to 1080p I think the image looks best with DSR smoothness set to 0; downsampling from other resolutions makes the filter useful again.
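A tiny numpy sketch of why the exact 2x case needs no smoothing: with an integer factor every output pixel maps cleanly onto a block of source pixels, so a plain average works, while a non-integer factor has no clean block mapping and needs a resampling filter. The data here is just random values to show the shapes, not an actual render:

```python
import numpy as np

# Hypothetical 4K "render" (2160x3840, RGB), downsampled to 1080p.
hi = np.random.rand(2160, 3840, 3)

# Integer 2x factor: average each 2x2 block -- no extra smoothing filter needed.
lo = hi.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))
print(lo.shape)  # (1080, 1920, 3)

# 2560x1440 -> 1920x1080 is a 1.33x factor: output pixels straddle source
# pixels, so a resampling filter (what the DSR smoothness slider controls)
# is needed to avoid shimmering artifacts.
```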
 
Can't seem to find any 970 in stock in Canada... I've sold my old card faster than I could find a new one.

Ouch, that sucks. I would have at least waited until you were sure you could order one. I'm going to play BF4 on my 5850s right up until the switch next week and then sell them.

Restocking in Canada has been hella slow.
 

Well, Newegg.ca just got a restock of Gigabyte 970. Perfect timing!
 
Well, the second 970 improved my 3DMark score a bit. Surprised how easy SLI was to get up and running (after a frantic treasure hunt for an SLI bridge).

[3DMark results screenshot]

The 10200 score is a single card at 1400/1475 boost, and the other two are stock and at the same overclock respectively.

Nice that the second card holds the same overclock - before I give it to my brother I might put it in alone and see if I can push it harder than my original card.

But yeah, it's nice, but not worth $430. I'll maybe keep it long enough to finish Crysis 3 and Evil Within.

I'm particularly impressed by the temperatures - still hovering around 60 once I turned the fans up a little. Not like I can hear them anyways. Top card runs a little hotter though, around 6-8 degrees during benchmarks.

Also surprised what a hog Crysis 3 is...I can play it with MSAA but I don't like sub-60 framerates. I'd imagine my CPU is holding me back here.
 