Nvidia GTX 980/970 3DMark Scores Leaked - from Videocardz.com

The claims of microstutter in this thread are waaaaaay overblown.

Take a look at SLI reviews from PCPer and TechReport. They use frame time data, and the variance between single cards and dual cards is extremely marginal. These are the two websites that are responsible for bringing the hard data on uneven frame pacing (microstutter) to light.
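If you want a feel for what that frame time data actually shows, here's a rough sketch of the kind of analysis those sites run (99th-percentile frame times and frame-to-frame swings); the sample numbers are made up purely to illustrate:

```python
# Rough sketch of PCPer/TechReport-style frame time analysis.
# Frame times are in milliseconds; the sample values are made up.
import statistics

frame_times_ms = [16.4, 16.9, 17.1, 16.6, 24.8, 16.5, 16.7, 33.1, 16.8, 16.6]

avg = statistics.mean(frame_times_ms)
p99 = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

# Microstutter shows up as large swings between consecutive frames,
# even when the average frame rate looks fine.
deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

print(f"avg frame time: {avg:.1f} ms (~{1000 / avg:.0f} fps)")
print(f"99th percentile frame time: {p99:.1f} ms")
print(f"largest frame-to-frame swing: {max(deltas):.1f} ms")
```

A single card and a well-paced SLI/Crossfire setup produce very similar numbers on those last two lines; that's the point those reviews make.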

As I said, frame pacing has been a huge focus of both AMD and NVIDIA over the last few years. Drivers have matured very nicely on this aspect.

This is, of course, assuming there's SLI/Crossfire profiles for the game in question.

For the folks saying they see it all of the time, they most likely are confusing it with something else (tearing perhaps) or have hardware that predates Kepler and GCN.
 
Yeah, Nvidia users have found that rolling back to 340.43 fixes the frequent crashing (not sure about AMD GPUs since I have two 670s myself).

I'm kinda wondering why it's taking so long really, they haven't released any type of driver in almost 2 months. They don't normally go that long without at least a beta.
 
Ugh!

That is lame. Why even bother releasing a new line if it isn't a real upgrade?

Will this new line of cards at least be released at a cheaper price point to make up for the negligible upgrade?

This is a real upgrade for the people who didn't get the GK110 chip (780 and up). This is the upgrade for people with 680/770 and below since those were the mid-range chips from the Kepler cards, just like these are the mid-range chips for Maxwell.

For the real upgrade to 780 you have to wait for the GTX1080 (or whatever they will call it) that will most likely be made with 16nm instead of 28.

And even if the performance gains over 780Ti won't be huge for 980 (roughly 10%, in benchmarks at least), it's still pretty impressive that they can achieve that performance at a much lower TDP. This will benefit laptops more than a desktop PC though.
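Back-of-the-envelope on that efficiency point, assuming the rumored 165W TDP for the 980 against the 780 Ti's 250W (treat the TDPs and the ~10% figure as rough numbers):

```python
# Rough perf-per-watt comparison; the ~10% gain and the 165W TDP are
# assumptions based on the leaks, the 250W is the 780 Ti's official rating.
perf_gain = 1.10      # ~10% faster than a 780 Ti in benchmarks
tdp_980 = 165         # watts (rumored)
tdp_780ti = 250       # watts

perf_per_watt_ratio = perf_gain * (tdp_780ti / tdp_980)
print(f"~{perf_per_watt_ratio:.2f}x the performance per watt")  # ~1.67x
```

Which is exactly the kind of gain that matters far more in a laptop than on a desktop.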
 
So how much are we thinking this 970 will be? I've seen most sites saying $399. I was looking at the 770 for my new PC I just built; if this is $399, is it worth the extra $120 over what I can get a 770 for?
 
So how much are we thinking this 970 will be? I've seen most sites saying $399. I was looking at the 770 for my new PC I just built; if this is $399, is it worth the extra $120 over what I can get a 770 for?

Check out some benchmarks for 780 (not Ti) and see if you think the performance difference is worth it. And yeah, it seems like it will be $399.
 
I wonder if I should sell my 780 and just get a 970 since it would literally cost me $100 after you factor in the money I paid originally and the used price I can get. Might be worth it for the lower power draw and heat.
 
The claims of microstutter in this thread are waaaaaay overblown.

Take a look at SLI reviews from PCPer and TechReport. They use frame time data, and the variance between single cards and dual cards is extremely marginal. These are the two websites that are responsible for bringing the hard data on uneven frame pacing (microstutter) to light.

As I said, frame pacing has been a huge focus of both AMD and NVIDIA over the last few years. Drivers have matured very nicely on this aspect.

This is, of course, assuming there's SLI/Crossfire profiles for the game in question.

For the folks saying they see it all of the time, they most likely are confusing it with something else (tearing perhaps) or have hardware that predates Kepler and GCN.
The problem with frame pacing is that, from what I've seen, it works almost perfectly for those games which are generally used to test frame pacing, moderately well for most other mainstream games, and totally breaks apart on niche titles.

Much like multi-GPU rendering of applications that aren't multi-GPU aware themselves, really. It's a hack.
 
Yeah, hence my profile caveat. TechReport does a decent job of throwing in a random game or two to keep them on their toes.

Most niche titles shouldn't have an issue on a single high end card with SLI/Crossfire disabled. This goes back to the standard GAF advice on the subject, which is that it should always be avoided unless you are trying to achieve performance that would otherwise be impossible on a single GPU.
I wonder if I should sell my 780 and just get a 970 since it would literally cost me $100 after you factor in the money I paid originally and the used price I can get. Might be worth it for the lower power draw and heat.
I find this is a nice way of keeping up with the high end without breaking the bank. Also, capitalizing on people who do this . . .

So. . .

Looking to sell your 780, eh? :P
 
If you're not vendor-loyal, the R9 290 is still the best bang for your high-end buck.
Has there been any talk about improving the fan noise?

Nvidia needs to look to the owners of the 670/680 or 760/770 to upgrade, so it's doubtful that those people will be willing to pay more than $500.
I'm using a GTX260 so...

In an ideal situation I would wait for Rift CV1 to come out before buying a new GPU, but my lovely OC 1GB 560 Ti is starting to show its age even when only gaming at 1080p. I don't need 60FPS in all games I play, but once newer games require you to tweak graphical settings to get a stable 30 an upgrade becomes inevitable.

If the price of a 970 gets around 300 euro at release or soon after I'll probably bite. Near-Titan performance is more than enough for someone who doesn't give a crap about extreme resolutions (yet), so it should last me about two years at the very least.
I'm still using 4:3 and, last year, discovered how good CRT actually is (I always had suspicions about LCD, but looking at the data confirmed them).

http://www.techpowerup.com/gpudb/2621/geforce-gtx-980.html
http://www.techpowerup.com/gpudb/2620/geforce-gtx-970.html

If these numbers are remotely true, the 970 should be a decent upgrade over my 670. But I may wait until Witcher 3 hits.
That's the thing...PS4 or Gaming Card. If I don't get a new card before Witcher 3, then it'll have to be for Killing Floor 2.

You missed out. When everyone was dumping 290s due to Litecoin mining inefficiency, you could nab barely used ones for like $170. It was insane. You can still find them on sale (new) for pretty cheap, as low as ~$300.

For what it's worth, the RELATIVE performance gain might be pretty low coming from, let's say, a 760 or 280X. However, what you get is a card that can run 98% of games at 1080p/120Hz. You'll probably want to be on the lookout for AMD dropping prices, or perhaps an eventual 960 Ti.
WHAT!? GOD DAMN IT, WHY DIDN'T I KEEP PAYING ATTENTION AFTER THE 290'S RELEASE!?

For the most part, the GTX260 will run most games at max at 60FPS even now but since Battlefield 3 (2011), there have been a few games that just can't handle the pressure at max settings (Battlefield 4 Beta barely got along despite putting everything at low).

I think AC Unity and GTA5 are going to push things. And if not, Witcher 3 will scream. Also the announcements of the Next Total War title should flex the muscles of the upgraders.
If Aliens: Isolation is good... sure. For those only interested in Total War... wait and see whether they make another Broken Piece of Shit.

That is why I want this: https://www.youtube.com/watch?v=KnrxNfxRK_4


You get two 27'' screens in one with no bezel and only a 20% performance penalty over 2560x1440. I like the cost of this compared to going dual-screen 4K at 27/30''. And the bezel... grrr...
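For what it's worth, assuming the monitor in that video is a 3440x1440 ultrawide, the raw pixel math behind that 20% figure looks like this (frame rate rarely drops in direct proportion to pixel count, which is why the real-world hit ends up lower than the pixel increase):

```python
# Pixel count comparison; assumes the linked monitor is a 3440x1440 ultrawide.
ultrawide = 3440 * 1440
standard = 2560 * 1440

extra = ultrawide / standard - 1
print(f"{extra:.0%} more pixels than 2560x1440")  # ~34% more
```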
So far, the only PC game that is making that resolution a big deal is The Evil Within. Still, that seems like the first monitor I've been truly interested in and actually excited about... just probably not at that price (or with the desk I'm using).
Also, isn't AMD making super-synced monitors as well? I mean, I lost interest in G-Sync since I was looking super hard into the recent R series of cards, and it would defeat the purpose of getting one. Combined with my willingness to keep using this 17-year-old monitor... man, I've been thinking about this for 3 years, and yet it really does seem like next year will finally be the time to make the last upgrade I should have decided on back in 2012.
 
I don't think it will be graphene, and not for another 15-20 years anyway. We will get to 7nm, fighting and screaming, but we will get there. Whether it's worth the cost will be the thing foundries will need to figure out.

TSMC claims volume production of 16nm (20nm w/ FinFET) will start in Q1 2015, which means late next year or early 2016 we may finally get a real video card upgrade. That depends on whether NVIDIA is ready to implement stacked DRAM and what AMD's plans are.

NVIDIA won't implement stacked DRAM until Pascal or Volta. AMD's HBM, which is stacked DRAM, has been ready for production since earlier this summer; it was developed with SK Hynix.
 
Yeah, hence my profile caveat. TechReport does a decent job of throwing in a random game or two to keep them on their toes.

Most niche titles shouldn't have an issue on a single high end card with SLI/Crossfire disabled. This goes back to the standard GAF advice on the subject, which is that it should always be avoided unless you are trying to achieve performance that would otherwise be impossible on a single GPU.
I find this is a nice way of keeping up with the high end without breaking the bank. Also, capitalizing on people who do this . . .

So. . .

Looking to sell your 780, eh? :P

LOL, I might. Once I find a solid release date and a price, hit me up.
 
This is a real upgrade for the people who didn't get the GK110 chip (780 and up). This is the upgrade for people with 680/770 and below since those were the mid-range chips from the Kepler cards, just like these are the mid-range chips for Maxwell.

For the real upgrade to 780 you have to wait for the GTX1080 (or whatever they will call it) that will most likely be made with 16nm instead of 28.

And even if the performance gains over 780Ti won't be huge for 980 (roughly 10%, in benchmarks at least), it's still pretty impressive that they can achieve that performance at a much lower TDP. This will benefit laptops more than a desktop PC though.

And for me with a 5770. :-) Can't wait, really!
 
What's better for someone who doesn't have a GPU yet, invest in the 970/980 or buy a cheaper card that will do the job for now and then upgrade when a real leap occurs?
 
What's better for someone who doesn't have a GPU yet, invest in the 970/980 or buy a cheaper card that will do the job for now and then upgrade when a real leap occurs?
Wait for the 970/980 to come out and see how their prices stack up to older cards that might get price cuts.
 
I'm assuming my 500w power supply would be fine with a 980 if it's alright with a factory OC'd 670?

Don't the 970/980 use a lot fewer watts than the previous "high-end" cards? I think it should be fine if it's true that the 970 uses 70W less than the 780.
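Rough napkin math for the 500W question, using the rumored 980 TDP and typical figures for the rest of a system (all of these are ballpark assumptions, not measurements):

```python
# Rough PSU headroom estimate; all wattages are ballpark assumptions.
gpu = 165     # rumored GTX 980 TDP, watts
cpu = 100     # typical quad-core under load
rest = 75     # motherboard, RAM, drives, fans

total = gpu + cpu + rest
psu = 500
print(f"estimated load: {total} W of {psu} W ({total / psu:.0%})")  # ~68%
```

Even allowing some margin for a factory OC and load spikes, a decent 500W unit should have headroom to spare.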
 
I don't suppose there's any chance the cards will be priced as replacements to the 7xx series if they aren't going for a real performance increase? Aka, ~$300-$350 970 to replace the 770 and ~$400-$450 980 to replace the 780?
 
The problem with frame pacing is that, from what I've seen, it works almost perfectly for those games which are generally used to test frame pacing, moderately well for most other mainstream games, and totally breaks apart on niche titles.

Much like multi-GPU rendering of applications that aren't multi-GPU aware themselves, really. It's a hack.

I have ~480 games installed on my PC. Using tri-SLI GTX 680s, I don't have framerate problems in any of them. SLI is disabled only for:

Rage (limited to 60 fps anyway)
Wolfenstein TNO (same as rage)
Watch Dogs (framerate scales but stutters appear)
Dead Rising 3 (stutters and shadow flickering)
Fifa 15 Demo (runs at 120 fps anyway)
Dolphin emulator

I can't remember more games with framerate problems using sli. It's really a small list, right?
 
Well, they don't make a 6GB 780 Ti, so you won't be able to game reliably at resolutions higher than 1080p.

For 1080p, the 780 Ti will last you a while.

For super VRAM-intensive games in the future? Maybe, but the 780 Ti still has MASSIVE memory bandwidth, and judging from the leaked 970/980 specs, MORE bandwidth than either of those.

The 780 Ti currently can play tons of games at extreme resolutions, and I don't see this changing very much any time soon. Games like Watch Dogs are an exception because the game is typical ubicrap that is made poorly. At its default ultra settings, only a card with 6GB of VRAM can use decent-looking textures without tons of stuttering? OK, SURE.


I doubt we will see many VRAM-intensive games that actually make good use of VRAM. So far, on so-called next-gen consoles with 8GB of RAM, I really haven't seen textures that even remotely seem worthy of that much RAM (and in Watch Dogs' case, the textures are equivalent to PC high). Most of it has barely been current PC quality, let alone the quality of texture mods for Skyrim or Crysis 2, for example.
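To put rough numbers on how textures eat VRAM (generic math, not tied to any specific game):

```python
# Rough texture memory math; sizes are generic, not from any specific game.
def texture_mb(width, height, bytes_per_pixel, mipmaps=True):
    size = width * height * bytes_per_pixel
    if mipmaps:
        size *= 4 / 3  # a full mip chain adds roughly one third
    return size / (1024 ** 2)

print(f"4096x4096 uncompressed RGBA8: {texture_mb(4096, 4096, 4):.0f} MB")   # ~85 MB
print(f"4096x4096 BC3/DXT5 compressed: {texture_mb(4096, 4096, 1):.0f} MB")  # ~21 MB
```

A few hundred compressed textures at that size, plus render targets, is how a game gets into multi-GB territory; most games ship far smaller assets than that.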
 
What's better for someone who doesn't have a GPU yet, invest in the 970/980 or buy a cheaper card that will do the job for now and then upgrade when a real leap occurs?
Do you want to play now or wait for another year for the "real leap"?

I think that even 28nm first-gen Maxwells will handle games for a couple of years without much problem. Why wait?
 
Do you want to play now or wait for another year for the "real leap"?

I think that even 28nm first-gen Maxwells will handle games for a couple of years without much problem. Why wait?

Waiting with a cheaper GPU is what I asked though. I'd imagine a cheaper card like a 670 or 770 would handle games for the next couple of years without much problem. Am I wrong?

I also thought about getting a 970/980 now and then SLI when they really start to show their age in like 3-4 years. Is that a good strategy, will it keep a PC going for the entire generation? And by going I mean running everything on high and 1080@60.
 
Waiting with a cheaper GPU is what I asked though. I'd imagine a cheaper card like a 670 or 770 would handle games for the next couple of years without much problem. Am I wrong?

I also thought about getting a 970/980 now and then SLI when they really start to show their age in like 3-4 years. Is that a good strategy, will it keep a PC going for the entire generation? And by going I mean running everything on high and 1080@60.

I'm still running current gen games on some ancient 560 ti's just fine.

It all depends on what you are willing to sacrifice. I always run at 1080p/60 at the minimum, generally have to tone down some of the spare effects. You should be able to coast through the entire generation on a 770.

If you are planning on going high-end, wait until the next generation of cards. That should be the big jump.
 
So how much are we thinking this 970 will be? I've seen most sites saying $399. I was looking at the 770 for my new PC I just built; if this is $399, is it worth the extra $120 over what I can get a 770 for?

2GB GTX 770s simply don't have enough VRAM. You should be comparing 4GB GTX 770s, not the 2GB models.

I'm still running current gen games on some ancient 560 ti's just fine.

It all depends on what you are willing to sacrifice. I always run at 1080p/60 at the minimum, generally have to tone down some of the spare effects. You should be able to coast through the entire generation on a 770.

If you are planning on going high-end, wait until the next generation of cards. That should be the big jump.

Not with 2GB of VRAM he won't. It's already a bottleneck here and now, and it will only prove more crippling as time goes by. If you want to get through this generation comfortably at 1080p, then I feel the GTX 970 is the first midrange card that will allow you to do that.
 
As someone who may end up trucking 2-3 weeks at a time starting next year or late this year, the idea of a beefy laptop has become more appealing to me.

I'll be waiting for the 970M, at least, before making a purchase. These laptop GPU performance jumps are impressive. I don't need Ultra everything, so if I can keep temps down while maintaining a mix of Medium-High settings at 60+ FPS, I'll be content.

I need to be able to play, record and render without heat or other issues, and I've been extremely hesitant in the past to go hard on a laptop purchase. These 900-series mobile GPUs and a change in lifestyle are changing my tune, though.
 
So far, the only PC game that is making that resolution a big deal is The Evil Within. Still, that seems like the first monitor I've been truly interested in and actually excited about... just probably not at that price (or with the desk I'm using).
Also, isn't AMD making super-synced monitors as well? I mean, I lost interest in G-Sync since I was looking super hard into the recent R series of cards, and it would defeat the purpose of getting one. Combined with my willingness to keep using this 17-year-old monitor... man, I've been thinking about this for 3 years, and yet it really does seem like next year will finally be the time to make the last upgrade I should have decided on back in 2012.

wait... you are still using a monitor from 1997?



:O
 
As someone who may end up trucking 2-3 weeks at a time starting next year or late this year, the idea of a beefy laptop has become more appealing to me.

I'll be waiting for the 970M, at least, before making a purchase. These laptop GPU performance jumps are impressive. I don't need Ultra everything, so if I can keep temps down while maintaining a mix of Medium-High settings at 60+ FPS, I'll be content.

I need to be able to play, record and render without heat or other issues, and I've been extremely hesitant in the past to go hard on a laptop purchase. These 900-series mobile GPUs and a change in lifestyle are changing my tune, though.

Same for me. A 970m or single 980m is right up my alley.
 
Excellent, my 780 Ti is still good for a long time. But still, I always want graphics cards to improve; they've been getting around 1.4x better with each flagship single card, but this is ridiculous. A laughable few percent upgrade at most.
 
In certain ways, yes. But in this day and age the negatives of a huge-ass CRT outweigh the positives.

What negatives are you talking about? That it's big? Why? Do you have a slave holding up the monitor for you while you play it and you want to spare his back? Or do you have a desk or table like a normal person?


There is nothing an LCD monitor does better than a good CRT, and even a bad CRT does almost everything better than a 'good' LCD.
 
What negatives are you talking about? That it's big? Why? Do you have a slave holding up the monitor for you while you play it and you want to spare his back? Or do you have a desk or table like a normal person?


There is nothing an LCD monitor does better than a good CRT, and even a bad CRT does almost everything better than a 'good' LCD.

I'd have to be a fucking idiot to do photo editing on a CRT. Too unwieldy for dual monitor use too.
 
What negatives are you talking about? That it's big? Why? Do you have a slave holding up the monitor for you while you play it and you want to spare his back? Or do you have a desk or table like a normal person?


There is nothing an LCD monitor does better than a good CRT, and even a bad CRT does almost everything better than a 'good' LCD.


-Massive size, weight and desk space
-Higher power draw
-Audible whine (though very small)
-No multi monitor
-Lower resolutions
-4:3
-CRT Drift over time
-Analog signal (VGA)


While CRTs have superior color reproduction and lower input lag, there is no reason to own one in this day and age. It's outdated technology.

And this is coming from someone who has owned nearly every monitor in existence and used the holy grail of CRTs, the Sony FW900. I would take my Dell 32'' 4K or Asus ROG over the Sony FW900 any day.
 
-Massive size, weight and desk space
-Higher power draw
-Audible whine (though very small)
-No multi monitor
-Lower resolutions
-4:3
-CRT Drift over time
-Analog signal (VGA)


While CRTs have superior color reproduction and lower input lag, there is no reason to own one in this day and age. It's outdated technology.

And this is coming from someone who has owned nearly every monitor in existence and used the holy grail of CRTs, the Sony FW900. I would take my Dell 32'' 4K or Asus ROG over the Sony FW900 any day.
There is one reason, and that is awesome scaling.
 
On techpowerup the 780ti is listed as having 5 TFLOPS of power while the 980 is only 4 TFLOPS yet the benchmark scores are really similar. Any reason for this or is it due to errors in estimation on techpowerup's website?

The 2 cards:
980 http://www.techpowerup.com/gpudb/2621/geforce-gtx-980.html
780ti http://www.techpowerup.com/gpudb/2512/geforce-gtx-780-ti.html

New architecture. The HD 5870 had 2.7 TFLOPS and the HD 7850 only 1.73, yet the 7850 is like 60 percent faster in both games and benchmarks.
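For context, the TFLOPS figure is just shader count x clock x 2 FLOPs per clock, so it says nothing about how efficiently an architecture actually uses those shaders. A quick sketch with the listed specs (the 980 numbers come from the leaks, so treat them as assumptions):

```python
# Theoretical single-precision TFLOPS = shader cores * clock (GHz) * 2 FLOPs.
# The 780 Ti figures are official; the 980 figures come from the leaks.
def tflops(cores, clock_ghz):
    return cores * clock_ghz * 2 / 1000

print(f"GTX 780 Ti: {tflops(2880, 0.875):.1f} TFLOPS")  # ~5.0 at base clock
print(f"GTX 980:    {tflops(2048, 1.126):.1f} TFLOPS")  # ~4.6 at base clock
```

Maxwell simply gets more real-world performance out of each theoretical FLOP, which is why the benchmark scores end up so close.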
 