Nvidia GTX 980/970 3DMark Scores Leaked - from Videocardz.com

It depends on who you ask. I'm sure someone has had a bad experience with even the best brands around. As for myself, I've owned Zotac, MSI, ASUS, and Gigabyte Nvidia cards, and I never had any issues except when my Zotac 470 overheated.
I really love MSI's Twin Frozr cooling, it's very quiet and efficient.

If I had to buy a 970 I'd go for MSI.

I'll probably go MSI too with my next card. I want to support them, because I have been using their excellent software with my Gigabyte card for quite some time now. The Gigabyte tools are kinda crap IMO. (The GPU is good though.)
 
According to the article, the power draws of the 970 and 980 go way beyond the 148W and 165-175W that seem to be the final set numbers.

Is this because of the stress tests and benchmarks they are running? Will regular gaming draw that much power?

[Image: NVIDIA GeForce GTX 980 and GTX 970 power consumption, total system]

The 970 matching the 780 in performance while consuming ~100W less is huge, especially given it's on the same 28nm process.
 
Yeah, I think so. The 980 is the reference model and the temps on that are pretty high. Apparently the 970s are non-reference models and the temps on those are insanely low at full load, which makes sense with the lower TDP.

Better cooler means lower temps, lower temps means less fan noise.

Ok, thanks. But is that why people are saying to get the reference 970 and add an aftermarket cooler? What would be the big benefit? Just saving a few bucks?
 
I wonder how much L2 cache these GPUs have, since the maxwell in the 750ti (GM107) vastly increased the cache compared to the 650ti. It might explain why it doesn't suffer for having less bandwidth, at least at typical resolutions.

That also might explain some of the power reductions, since the cache hits avoid accessing memory as often. I doubt it could account for the bulk of it though, so I'm pretty curious what has changed in the SMs to provide such huge power savings. Roll on the proper reviews!
 
970 SLI seems to be the way to go... I have to be strong now. I could probably pick them up tomorrow morning; one of the biggest retailers in the country is very close.
 
Future proofing for max settings is always an impossibility.

As it should be, indeed, but I see far too many people (perhaps not here) who want their purchase to be able to perform at that level for years just because "games have the PS4/XBO as a common denominator".
 

That's AMD's high end lineup completely wiped out with a single card.

Better than 290X performance for $330 while consuming ~150W less. I couldn't even recommend an R9 290 @ $250; AMD really have nowhere to go.

It's honestly worrying how far ahead Nvidia are now. Seeing even the cut down version of Nvidia's midrange 256 bit card beat out AMD's cream of the crop 512 bit monster is an embarrassing state of affairs.

If Nvidia really went all out they could realistically offer a card with almost twice the performance of the 290x with a similar power and silicon budget. I can't remember a time when the gap was that large.
 
While these are clearly very efficient cards, I can't help wondering how much they could do if they were using twice that wattage. You can never have enough power.

Might be tempted to switch from my 7950. Even though it is powerful enough for me, it is bloody noisy at full load
 
I wonder how much L2 cache these GPUs have, since the maxwell in the 750ti (GM107) vastly increased the cache compared to the 650ti. It might explain why it doesn't suffer for having less bandwidth, at least at typical resolutions.

That also might explain some of the power reductions, since the cache hits avoid accessing memory as often. I doubt it could account for the bulk of it though, so I'm pretty curious what has changed in the SMs to provide such huge power savings. Roll on the proper reviews!

The cards have 2MB of L2 cache. For reference, GK104 had 512KB and GK110 had 1.5MB; GM107 had 2MB as well.
 
That's AMD's high end lineup completely wiped out with a single card.

Better than 290X performance for $330 while consuming ~150W less. I couldn't even recommend an R9 290 @ $250; AMD really have nowhere to go.

It's honestly worrying how far ahead Nvidia are now. Seeing even the cut down version of Nvidia's midrange 256 bit card beat out AMD's cream of the crop 512 bit monster is an embarrassing state of affairs.

From the benchmarks the 970 does not actually beat the 290X everywhere. But it's indeed impressive that it manages to get that close. It does, however, assuredly beat the 290.
 
I wonder how much L2 cache these GPUs have, since the maxwell in the 750ti (GM107) vastly increased the cache compared to the 650ti. It might explain why it doesn't suffer for having less bandwidth, at least at typical resolutions.

That also might explain some of the power reductions, since the cache hits avoid accessing memory as often. I doubt it could account for the bulk of it though, so I'm pretty curious what has changed in the SMs to provide such huge power savings. Roll on the proper reviews!

Rumored is 2MB... the 780 Ti has 1.5MB.
 
While these are clearly very efficient cards, I can't help wondering how much they could do if they were using twice that wattage. You can never have enough power.

Might be tempted to switch from my 7950. Even though it is powerful enough for me, it is bloody noisy at full load

We won't see that happen until AMD decide to get their act in gear. Unfortunately I worry that we're now in the same situation as in the x86 space. AMD are so far behind that I wonder if they'll ever be competitive again. They may acknowledge this and give up on the high end altogether, as they have already done for CPUs.
 
Decided to wait for the next wave of cards before upgrading. The 770 will have to do for now.

I decided to wait for a game that makes me want to upgrade. It was Skyrim last time, it'll probably be another Bethesda game that makes me do it this time.

* * *

Sure hope AMD has an answer for this, or I guess the gpu market will be as exciting and value packed as the cpu market.
 
That's AMD's high end lineup completely wiped out with a single card.

Better than 290X performance for $330 while consuming ~150W less. I couldn't even recommend an R9 290 @ $250; AMD really have nowhere to go.

It's honestly worrying how far ahead Nvidia are now. Seeing even the cut down version of Nvidia's midrange 256 bit card beat out AMD's cream of the crop 512 bit monster is an embarrassing state of affairs.

If Nvidia really went all out they could realistically offer a card with almost twice the performance of the 290x with a similar power and silicon budget. I can't remember a time when the gap was that large.
Well, it's a new architecture and a new gen of cards... it is supposed to make the last ones obsolete (just like the 7970 made the 580 obsolete in the 6 months before Nvidia had Kepler ready).
Nvidia doing it on the same 28nm process is impressive though.

When (if at all) AMD have any kind of answer to these new cards is the question though... they'd better hurry the fuck up and not go all AMD Bulldozer with their GPUs :\

Noob question: Why is less power consumption so exciting?
It's not just power consumption; they have better performance on a smaller die with a smaller memory bus than Kepler (and especially AMD's GCN).
This means there is the possibility of a way bigger and way faster GPU that still consumes a reasonable amount of power.
Lower power consumption is also nice for everyone from low-end to midrange to high-end GPU users (at the right price), as it means less heat, less noise, and lower power bills (100 watts less power consumption over 3-4 years of use will save you some extra money, which is always welcome).

Under equal circumstances (price, performance), the GPU with much lower power consumption is always more appealing.

Here in Germany electricity is not super cheap. 100W less for 3 hours each day would be around 30€ less a year. After 3 years (edit: the time between new GPUs for me) it would be nearly 100€.
Nah, it's not quite that much.
3 hours a day at 100 watts for 3 years is only 109 kWh; at 35 cents per kWh (which is btw holy shit expensive :o wow, you Germans have it pretty bad, here in Belgium it's 22 cents) that means you save 38 euros over the lifespan of the card.
It still counts though; I'd definitely count it in the purchase price of my GPU. Hidden costs are no less real than upfront costs.
 
From the benchmarks the 970 does not actually beat the 290X everywhere. But it's indeed impressive that it manages to get that close. It does, however, assuredly beat the 290.

It beats it in the majority of those tests at 1080p. Performance at resolutions above that simply doesn't matter at this price point when they account for less than 3% of the market.
 
Well, it's a new architecture and a new gen of cards... it is supposed to make the last ones obsolete (just like the 7970 made the 580 obsolete in the 6 months before Nvidia had Kepler ready).
Nvidia doing it on the same 28nm process is impressive though.

When (if at all) AMD have any kind of answer to these new cards is the question though... they'd better hurry the fuck up and not go all AMD Bulldozer with their GPUs :\

The point is that Nvidia were miles ahead already. They've now gone and achieved the single biggest increase in efficiency on top of that while staying on the same process node without any hint of a reply from AMD.

AMD will need both their biggest ever increase in efficiency and a die shrink just to get on level terms. I have absolutely zero faith that they can pull that off before Nvidia get to 20nm.
 
The power draw of these cards is insane. Ideal for SLI setups.

Now to decide if I just go for the 970 or the 980? The 970 seems a little weaker than I was hoping for... Might wait for some factory overclocked versions before deciding.
 
Nah, it's not quite that much.
3 hours a day at 100 watts for 3 years is only 109 kWh; at 35 cents per kWh (which is btw holy shit expensive :o wow, you Germans have it pretty bad, here in Belgium it's 22 cents) that means you save 38 euros over the lifespan of the card.
It still counts though; I'd definitely count it in the purchase price of my GPU. Hidden costs are no less real than upfront costs.

It's 109 kWh a year; you missed one ×3.

edit: it was 100W less for each hour, for 3 hours a day, not 100W less spread over the 3 full hours. That 100W is the difference between the 290X and the 970/980.
 
It beats it in the majority of those tests at 1080p. Performance at resolutions above that simply doesn't matter at this price point when they account for less than 3% of the market.

We need a wider range of games tested though. I'd like to see how well the 970/980 fare against the 780/780ti and R9 290/290X in Crysis 3, Battlefield 4, Watch Dogs, AC4, Metro Last Light.
 
The point is that Nvidia were miles ahead already. They've now gone and achieved the single biggest increase in efficiency on top of that while staying on the same process node without any hint of a reply from AMD.

AMD will need both their biggest ever increase in efficiency and a die shrink just to get on level terms. I have absolutely zero faith that they can pull that off before Nvidia get to 20nm.

No, I agree, they are super far behind; it's depressing.
It's not fair to compare it 1:1 to their last-gen cards though (unless they don't have anything for another year, which is depressingly likely); their next cards will obviously catch up, just not enough most likely, and so the gap will keep on growing.

I wish the AMD acquisition had never happened; it's all been downhill since then... they sold their fabs, they became less competitive, and I assume they are partly being dragged down into the abyss with the CPU division.

How much of an engineer/talent exodus has the GPU division seen compared to the CPU one? I know that for the CPU one, pretty much anyone who was a big name left.

It's 109 kWh a year; you missed one ×3.
Oops, you are right.
Man, you Germans have it rough with your energy prices :\ That's frigging 114 euros over 3 years, which is huge.
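For what it's worth, the arithmetic the thread settled on can be checked in a few lines (the numbers are the ones quoted above; the 0.35 €/kWh rate is the German price mentioned by the poster, not an official figure):

```python
# Back-of-the-envelope energy savings, using the thread's numbers:
# ~100 W lower draw (roughly 290X vs. 970/980), 3 hours of gaming a day,
# a 3-year ownership period, and 0.35 EUR/kWh.
POWER_DIFF_KW = 0.1       # 100 W expressed in kW
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365
YEARS = 3
PRICE_EUR_PER_KWH = 0.35  # German price quoted in the thread

kwh_per_year = POWER_DIFF_KW * HOURS_PER_DAY * DAYS_PER_YEAR  # ~109.5 kWh
savings_eur = kwh_per_year * YEARS * PRICE_EUR_PER_KWH        # ~115 EUR

print(f"{kwh_per_year:.1f} kWh/year -> {savings_eur:.0f} EUR over {YEARS} years")
```

At the Belgian rate of 0.22 €/kWh the same maths gives roughly 72 €, which is why the figure swings so much with the local price.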
 
It beats it in the majority of those tests at 1080p. Performance at resolutions above that simply doesn't matter at this price point when they account for less than 3% of the market.
What do you mean by "at this price point"? This is the top tier of the GPU hierarchy and pricing; if these cards aren't for the above-1080p crowd, then which cards are? The new artificial $1000 pricing bracket? Yes, above-1080p performance does matter. $400-500 is itself a niche price bracket for GPUs too; your point?
 
The power draw of these cards is insane. Ideal for SLI setups.

Now to decide if I just go for the 970 or the 980? The 970 seems a little weaker than I was hoping for... Might wait for some factory overclocked versions before deciding.
Curious, how powerful were you expecting the card to be?
 
While these are clearly very efficient cards, I can't help wondering how much they could do if they were using twice that wattage. You can never have enough power.
The exact same question popped in my head when I saw the power draw of these new cards.
 
Curious, how powerful were you expecting the card to be?

You normally expect a new high-end card to be way more than a 10 percent performance jump...
People asking these questions is normal; give a product a misleading name and you will cause confusion.

It's not a high-end card and it doesn't have a high-end price (well, the 970 at least :p, thank fucking god for that), but people who see "980" and don't follow GPU news won't know that. And in the case of the 500+ euro price of the 980, I can't blame people for expecting a lot more.

It beats it in the majority of those tests at 1080p. Performance at resolutions above that simply doesn't matter at this price point when they account for less than 3% of the market.
Talk about a self-fulfilling prophecy.
In monitor threads you can't recommend a 4K monitor or express a desire for a 120Hz 1600p one because there are no GPUs that'll run games well at those settings, and here you can't ask for a more powerful GPU (and even worse: call performance at higher res irrelevant) because the market that has those monitors is too small.
Fine, let's just halt progress indefinitely with this chicken-or-egg "problem"; who doesn't love paying 500 euros for a GPU in 2014 and still dealing with the same aliasing they dealt with back in 2007.

Btw, in case you weren't aware (you are aware), downsampling is a pretty big thing these days. People don't buy a GTX 780 Ti to play at 1080p 60fps; they buy it for 120Hz or heavy downsampling.

I can only guess what your opinion on 1080p TVs was in 2006 *roll eyes*
 
We need a wider range of games tested though. I'd like to see how well the 970/980 fare against the 780/780ti and R9 290/290X in Crysis 3, Battlefield 4, Watch Dogs, AC4, Metro Last Light.

Oh absolutely, and we need some frame time tests as well, but they will come. Either way the picture is starting to become clear, and the 970 looks like a fantastic card.
 
I'm going for an MSI GTX 970, without a doubt. I finally managed to save some money to get a good GPU that was worth paying for, so that my current one, an ASUS GTX 260 Core 216, can rest after 6 years of service. Hope I'll be able to get it as soon as this Friday/Saturday at a local retailer.

And that performance per watt really is something. Glad I waited a few months for Maxwell.
 
In my opinion the most interesting comparison is 970 vs 780 Ti, a comparison that is missing from those charts but you can easily compare the numbers yourself. The comparison between 970 and GHz edition is kind of useful, but GHz edition isn't that well known. In their tests GHz is clearly faster than the Ti. If you compare 970 to 780 Ti, the 970 wins most of the time. If the 970 launches anywhere near 300 euros, I'll be getting one of those as soon as possible (and assuming actual reviews have similar numbers for the 970).

My 2GB 770 was a mistake even at 1080p. I mean, the 2GB rarely affects performance because games, engines and drivers are working their magic, but it's an annoyance seeing your GPU memory 100% full all the time.
 
Gotta say that $329 970 seems great.

Waiting for more benchmarks and reviews on the 980 though. Is it worth buying if I already have a 780Ti? Gaming at 1440p 144hz, haven't really noticed the need to upgrade, especially with G-Sync.
 
In my opinion the most interesting comparison is 970 vs 780 Ti, a comparison that is missing from those charts but you can easily compare the numbers yourself. The comparison between 970 and GHz edition is kind of useful, but GHz edition isn't that well known. In their tests GHz is clearly faster than the Ti. If you compare 970 to 780 Ti, the 970 wins most of the time. If the 970 launches anywhere near 300 euros, I'll be getting one of those as soon as possible (and assuming actual reviews have similar numbers for the 970).

The price point is rumored to be anywhere between 329 and 399 bucks for non-reference cards, so I kind of expect to pay around 350 euros.
 
I know the Wccftech article says the cards have 64 ROPs, but doesn't the bandwidth say otherwise? Bandwidth is equal to ROPs × effective memory clock (GHz), so 32 × 7.0 GHz = 224 GB/s bandwidth.
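A sketch of that arithmetic (the function name is mine). Strictly, peak bandwidth comes from the bus width, not the ROP count; the ROP shortcut above only works when the ROP count happens to equal the bus width in bytes, as on Kepler-era chips:

```python
# Peak memory bandwidth follows from bus width and effective (data) clock:
#   bandwidth (GB/s) = (bus width in bits / 8) * effective clock (GT/s)
# The ROP-based shortcut coincides with this when ROPs == bus width in bytes,
# e.g. GK104: 32 ROPs, 256-bit bus (256 / 8 = 32).

def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_gtps: float) -> float:
    """Bytes moved per transfer times transfers per second, in GB/s."""
    return bus_width_bits / 8 * effective_clock_gtps

print(peak_bandwidth_gbs(256, 7.0))  # 224.0 -> the leaked 970/980 figure
print(peak_bandwidth_gbs(384, 7.0))  # 336.0 -> the 780 Ti, for comparison
```

So a 224 GB/s figure pins down a 256-bit bus at 7 GT/s, but on its own it doesn't strictly rule out a ROP count that is decoupled from the bus width.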
 
Gotta say that $329 970 seems great.

Waiting for more benchmarks and reviews on the 980 though. Is it worth buying if I already have a 780Ti? Gaming at 1440p 144hz, haven't really noticed the need to upgrade, especially with G-Sync.
If you can sell your 780 Ti for near the cost of a 980, so that you're only paying like $50-100 for it out of pocket, it's probably worth upgrading, sure. It sounds like you've got the ROG Swift and can use every bit of performance you can get. The extra VRAM will probably come in handy at times as well.
 
These marginal performance gains are a joke, but we have become complacent in this regard.

Not a joke considering the TDP and the fact that this is still a 28nm process.
I'd agree with you if we were talking about 250W GPUs on a smaller node beating the previous lineup by 25%.
 