Nvidia GTX 980/970 3DMark Scores Leaked - from Videocardz.com

Why are you purposely ignoring the obvious large gains they have made at 28nm with Maxwell? Your "Moore's law has hit a brick wall" excuse holds no water.
That's based on one 750 Ti part. Do we know if it scales well from 640 cores up to the high 2000s? You need to wait until their cards arrive at Tom's/Anand to see.

Again, Nvidia went down this same path with the move from 40 to 28nm; what's your excuse for that?
We would need to be into chip design to know for sure. Here people are taking layman's terms like TDP and 28nm and projecting everything from them, including sinister predatory pricing. Anand did mention the design turnaround in their 680 review, including the refocus on reducing the high TDP of the 580. Maybe it was for mobility, since laptops are the biggest beneficiaries (correct me if I'm wrong).
 
No, I'm just commenting on you acting like multiplatform games being designed for AMD console hardware will somehow make up for AMD falling really far behind performance-wise (once GM200 is out) and being far behind in performance per watt.

AMD cards won't be irrelevant if they don't have a proper successor to catch up to Maxwell, no; it'll be like the CPU market, where they get the crumbs at the low end (similarly with underpowered yet power-hungry hardware) while the consumer gets fucked up the ass because Nvidia no longer has any competition at the midrange or high end.
The CPU market is an ugly thing right now; I'd hate to see the GPU market barrel fully down that abyss.
This is just a what-if scenario that you are proposing (preemptive damage control with that AMD GPU optimisation nonsense, hence why I replied); I don't know what AMD has coming next (but they'd better have something good up their sleeves, for OUR sake).

As for the midrange 960 Ti, that is what the 980 already is... it's the successor to the 560 Ti.
Anything Nvidia would call a 960 Ti right now would probably be some low-end 192-bit bus abomination like the 2GB 660 Ti was, and would be no faster than a GTX 680.


Anyhow, that's enough down-to-earth realistic negativity from me for today; here's to hoping Nvidia sells the 980 for $400 or less and that the 970 can become the next 560 Ti that we all want.
(so you're saying there's a chance.gif:p)

Well, my problem is I don't have a 4K monitor, nor any plans to get one any time soon.

Although 120fps gaming does interest me.

So whatever card I get next to surpass my HD 7950 should at least be able to feasibly handle 1080p/120fps, or perhaps downsample a 4K resolution onto my 1080p screen for higher texture detail.
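Quick napkin math on what the downsampling idea costs, assuming frame cost scales roughly with pixel count (it doesn't exactly, so treat the 4x as a ceiling):

```python
# Rough pixel-count arithmetic for rendering at 4K and downsampling to 1080p.
# Frame cost isn't perfectly proportional to pixel count, so the 4x figure is
# an upper-bound estimate, not a measurement.

native = (1920, 1080)   # display resolution
render = (3840, 2160)   # internal render resolution before downsampling

native_pixels = native[0] * native[1]   # 2,073,600
render_pixels = render[0] * render[1]   # 8,294,400

ratio = render_pixels / native_pixels
print(f"Downsampled 4K shades {ratio:.0f}x the pixels of native 1080p")
# So a card that manages 120 fps at native 1080p would land nearer 30 fps if
# the workload scaled linearly with resolution.
```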

I generally only buy hardware to meet my needs. I mean, I could buy a GTX 780 Ti to have the best of the best, but at 1080p it would be totally inefficient, as that card can easily push higher resolutions.

It's like having a Lamborghini Aventador when the only roads I can drive it on have a speed limit of 45 mph: a waste of energy and power, not being able to use it to its full advantage.
 
Anyhow, that's enough down-to-earth realistic negativity from me for today; here's to hoping Nvidia sells the 980 for $400 or less and that the 970 can become the next 560 Ti that we all want.

$400 is the price I expect for the 970. Right now Nvidia "overprices" their cards to keep AMD from falling off the map. They're probably fed up with waiting for AMD to do something - and the 285 was AMD's announcement that they don't want to move forward. So screw it, maybe Nvidia should price the 980 at $400 and deal with the consequences. They can always argue they don't have a monopoly, thanks to Intel's presence.
 
That's based on one 750 Ti part. Do we know if it scales well from 640 cores up to the high 2000s? You need to wait until their cards arrive at Tom's/Anand to see.


We would need to be into chip design to know for sure. Here people are taking layman's terms like TDP and 28nm and projecting everything from them, including sinister predatory pricing. Anand did mention the design turnaround in their 680 review, including the refocus on reducing the high TDP of the 580. Maybe it was for mobility, since laptops are the biggest beneficiaries (correct me if I'm wrong).

We are discussing the things listed in the OP and the data it provides.
If it's wrong, it's wrong, but that's all there is to discuss at the moment, and the benchmarks and numbers in the OP align with the rumors from a few days back.



As for the underlined, what does that even mean?
Of course the jump from 40 to 28nm is going to come with lower power consumption per unit of performance (just like every single GPU transition in that list you were mocking!).

And what do laptop GPUs have to do with desktops? Nvidia released the Titan and the 780 Ti, did it not? Both cards with a 250W TDP.

They didn't refocus shit; they just released a small-die midrange GPU (the 680) and released the high-end one six months later.

Anandtech are so annoying, always shilling.

Their verdict for the Haswell quad cores was the same, despite a 5 percent performance increase over Ivy Bridge alongside a 5 percent price increase, and despite it still having the same problem with the IHS/heatsink interface being worthless and the thing being uncoolable (as in, better cooling does little to nothing because the heat doesn't get properly transferred to the cooler in the first place unless you delid it).

The Intel dies get smaller with each iteration (and more die space gets allotted to their shitty iGPU) while IPC goes nowhere; you're making a smaller, lower-end chip on a new architecture and process, of course it's going to frigging use less power.

And then they also spun it as "a success on desktop" while praising the reduced power consumption.
Meanwhile, everyone's reaction to Haswell was "ugh, what is this shit", until they fixed the IHS issue with the 4690 revision a year later.

You know what else was more power efficient than its predecessor? The HD 4670. Perhaps AMD would have done well to sell it for 500 bucks; AnandTech would have praised them for their focus on power efficiency over performance in the desktop space, you know, in light of how it would benefit mobile GPUs.

Well, my problem is I don't have a 4K monitor, nor any plans to get one any time soon.

Although 120fps gaming does interest me.

So whatever card I get next to surpass my HD 7950 should at least be able to feasibly handle 1080p/120fps...

snip.

Right, except every GPU down the line also got a price increase after the AMD 7900 series and the GTX 680 happened...
You could be getting what you need (that 960 Ti) for 150 euros instead of 300 or whatever; the only difference is that it would be called a 950.
When I bought my last midrange GPU that was good enough for 1080p/75 Hz (my monitor) I paid 160 euros (HD 6870), and before that I paid 130 for the 4870 (both about six months after they were released).
Now, after all this time, if I want a midrange 680 or 7970 it's still going to cost me over 300 euros... (these things are two years old now! In that timespan I previously upgraded my 4870 to the 6870, and the 4870 was worth 30 euros second-hand at that point!)
I'm not a 4K or 1440p-at-144 Hz guy either, because I always buy midrange stuff; only now midrange stuff costs twice as much for me... so I'll be keeping my 6870 for yet another year.

The price of a midrange PC went from 500 to 800 euros over the past four years; that is some fucking inflation, isn't it?
 
They didn't refocus shit; they just released a small-die midrange GPU (the 680) and released the high-end one six months later.
That's based purely on die area and TDP; again, we are using two layman measurements to project the world onto these parts. We're not accounting for the actual complexity of the chip and the associated defect rates if they decide to pack more into a larger die. Sure, Maxwell looks extremely promising now, but how realistic is it to expect Nvidia to reach that promised land without the learning curve of Kepler plus a first stab at a larger 28nm die back in 2012?
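For anyone curious why "pack more into a larger die" hurts so much, the simplest textbook (Poisson) yield model already shows it. The die areas below are roughly GK104- and GK110-sized; the defect density is an assumed, purely illustrative number:

```python
# Simple Poisson yield model: the fraction of defect-free dies falls
# exponentially with die area for a given defect density.
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of defect-free dies under a Poisson defect model."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

defect_density = 0.002   # defects per mm^2 - assumed for illustration only

for area in (294, 561):  # roughly GK104-sized vs GK110-sized dies
    print(f"{area} mm^2 die -> {poisson_yield(area, defect_density):.0%} yield")
# The bigger die yields noticeably worse from the same wafer, before even
# counting that fewer of them fit on it - the learning-curve point above.
```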
 
That's based purely on die area and TDP; again, we are using two layman measurements to project the world onto these parts. We're not accounting for the actual complexity of the chip and the associated defect rates if they decide to pack more into a larger die. Sure, Maxwell looks extremely promising now, but how realistic is it to expect Nvidia to reach that promised land without the learning curve of Kepler plus a first stab at a larger 28nm die back in 2012?

I've covered all of this earlier in the thread, so if you want to talk about it with me just read the previous pages; I don't want to type it all again :p

One thing I forgot to add earlier: Nvidia did have big Kepler lined up and on track before the 7970 released. All the sources at the time about GK104 were accurate, and those same sources said GK110 was on the way at the same time.

Then the 7970 happened, was shit (not shit compared to the ancient 40nm and shitty 6000-series architecture, but obviously shite compared to Kepler) and sold well at a very high price (again, for the reasons I covered before), so Nvidia said "we'll have some of that!" and postponed the GK110 launch since they didn't need it to compete.
I doubt you'll find many people who disagree on this (the GK110 release) by now.

If I don't reply any more it's because I passed out; it's 4 AM :p
 
So basically, even though I could get one in January, I should save up and wait for GM200?

Well, do you need the performance for anything in January? If so, you can buy this now and then sell it and upgrade when the new cards arrive. That's what I'm planning on doing. I need the extra performance now, but I really want the real deal when it's released.
 
What's the point of this though? Were you guys actually expecting Moore's Law to be eternal? This is like shouting at the clouds.

The gains in that list came with pretty significant die shrinks and clock-speed increases, because chipmakers weren't yet hitting the electrical and thermal limitations they are currently facing at the 22/28nm generation. You knew the writing was on the wall when Intel Israel told HQ that physics is getting tired of their shit :D haha
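For reference, the textbook dynamic-power relation is why chasing clocks and voltage stopped being free; the capacitance and clock numbers below are made up just to show the scaling:

```python
# Back-of-the-envelope dynamic-power estimate using the classic CMOS relation
# P_dynamic ~ activity * capacitance * voltage^2 * frequency (leakage ignored).
# The absolute numbers are invented; only the V^2 * f scaling is the point.

def dynamic_power(activity, capacitance_f, voltage_v, frequency_hz):
    """Classic CMOS dynamic power approximation."""
    return activity * capacitance_f * voltage_v ** 2 * frequency_hz

baseline = dynamic_power(0.2, 1e-9, 1.00, 1.0e9)
pushed   = dynamic_power(0.2, 1e-9, 1.15, 1.3e9)   # +15% voltage for +30% clock

print(f"Pushing clocks/voltage costs {pushed / baseline:.2f}x the power")
# ~1.72x the power for ~1.3x the clock - which is why the frequency race
# stalled once thermal limits started to bite, as the post above argues.
```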

No, I expect the prices to reflect the performance. Everyone knows that 22nm is delayed, but the prices of the cards in that post are only relatively justified. Anything other than a further sinking price for the 900 series is bull.
 
I've covered all of this earlier in the thread, so if you want to talk about it with me just read the previous pages; I don't want to type it all again :p

One thing I forgot to add earlier: Nvidia did have big Kepler lined up and on track before the 7970 released. All the sources at the time about GK104 were accurate, and those same sources said GK110 was on the way at the same time.

Then the 7970 happened, was shit (not shit compared to the ancient 40nm and shitty 6000-series architecture, but obviously shite compared to Kepler) and sold well at a very high price (again, for the reasons I covered before), so Nvidia said "we'll have some of that!" and postponed the GK110 launch since they didn't need it to compete.
I doubt you'll find many people who disagree on this (the GK110 release) by now.

If I don't reply any more it's because I passed out; it's 4 AM :p

You are just trying to find a conspiracy theory where there is a much simpler explanation.

Yields for GK110 were so bad that big corporate customers had to wait in line to get Tesla cards.

Could Nvidia have sold the 680 for $400? Probably yes, but it wasn't them who set the performance and price expectations for this generation, so why would any sane corporation leave money on the table?
 
I'd be disappointed if GM200 doesn't have a 512-bit bus.

Why? You would need a thicker PCB, the additional VRAM bandwidth isn't really necessary given current GPU specs and techniques, and it would just make the cards unnecessarily expensive.
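For the bandwidth side of it, the napkin math is simple; the 7 Gbps figure is just the usual high-end GDDR5 data rate of the moment, used here as an example:

```python
# Quick GDDR5 bandwidth arithmetic: total bandwidth in GB/s is
# bus width (bits) * per-pin data rate (Gbps) / 8.

def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

for bus in (256, 384, 512):
    print(f"{bus}-bit @ 7 Gbps -> {gddr5_bandwidth_gbs(bus, 7):.0f} GB/s")
# 256-bit -> 224 GB/s, 384-bit -> 336 GB/s, 512-bit -> 448 GB/s.
# The jump to 512-bit buys bandwidth, but also more memory controllers and
# more PCB routing - the cost argument made above.
```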
 
My GTX 670 is almost dead, so I really, really need the 980 released as soon as possible.

It had better be more powerful than a 780 Ti. Hopefully 6GB, too.
Don't get your hopes up.

I'm still pretty much a layman when it comes to computer technology, but it looks to me like Nvidia is laying the foundations for a product (i.e. the next iteration of the big Maxwell cards) that will outstrip the 780 Ti, rather than beating it out of the gate. What we're getting is something on the same level or marginally more powerful, with slightly more VRAM and lower power consumption. It's not a total waste of a card, but the price point is probably going to make it very unattractive.
 
Don't get your hopes up.

I'm still pretty much a layman when it comes to computer technology, but it looks to me like Nvidia is laying the foundations for a product (i.e. the next iteration of the big Maxwell cards) that will outstrip the 780 Ti, rather than beating it out of the gate. What we're getting is something on the same level or marginally more powerful, with slightly more VRAM and lower power consumption. It's not a total waste of a card, but the price point is probably going to make it very unattractive.

Man, I really hope it's priced decently. It better be under $600 AUD.
 
I want to believe the GPUs will be monsters at overclocking, at least, considering the 970 has a higher boost clock than the 770 while having an 80W lower TDP (230W vs ~150W).
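Rough napkin math on what those TDP numbers would imply, keeping in mind TDP is a board rating rather than measured draw, and the 970 figure is still a rumor:

```python
# Rough perf-per-watt comparison using the TDP figures quoted above.
# TDP is a board power rating, not measured consumption, so this is only a
# ballpark - and the 970 number is rumoured at this point.

gtx770_tdp_w = 230
gtx970_tdp_w = 150   # rumoured

# Even if the 970 were merely equal in performance, the efficiency gain is:
min_gain = gtx770_tdp_w / gtx970_tdp_w
print(f"At equal performance: {min_gain:.2f}x perf/watt")        # ~1.53x

# If it also ends up, say, 20% faster:
print(f"At +20% performance: {1.2 * min_gain:.2f}x perf/watt")   # ~1.84x
```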
 
Radeon is tempting. Tempting, but I've had tons of ATI problems in the past... plus it's a dual-GPU card as well. Trying to get away from that.
I was just joking. From what you wrote, I knew you wanted to get away from dual GPU on a single card.

To answer your question, I would be surprised if even next year's single-GPU cards are faster than a 690.
 
I want to believe the GPUs will be monsters at overclocking, at least, considering the 970 has a higher boost clock than the 770 while having an 80W lower TDP (230W vs ~150W).

Yeah, I'm expecting to get my first experience replacing a GPU heatsink this time around. I had a lot of fun overclocking my 580, but it came with a three-slot monster of a cooler.
 
All these people saying they NEED the 980 right now because their 600 series is outdated... I'm on a GTX 470. Feel my first-world pain. I feel like a homeless drunkard to your droogs.

But as soon as EVGA or ASUS bring out their version with 6GB or more, they will be singing in the rain - of my money.
 
Man, all these sites with leaks keep hyping a 400 USD price for the 970. Considering it performs about the same as a 780 (which goes for about 430 USD at best)... ehh... that 30-dollar advantage seems a wee bit lame.
 
Pure performance gains may have stagnated recently at the top end, but I still find it absolutely fascinating that, if these rumors are true, the new 970 has the same power draw as a 7850/660. That's absolutely mind-blowing to me.
 
Well now, count me in the good old 560 Ti group!

Yeah, I guess that card is about three years old now. It has actually done well by me, although, truth be told, I have not updated the drivers in ages because the most recent ones were causing it to crash.

I figure I need to upgrade, though, in order to play a game like Wolfenstein properly, which I want to do.

So has it already been said in here when people think these new ones will be released?

Of course, then I have to decide whether to bite the bullet on a new one or just buy a 780 at a hopefully reduced price from what it is now...

Right now the information is kind of making it difficult to decide.
 
All these people saying they NEED the 980 right now because their 600 series is outdated... I'm on a GTX 470. Feel my first-world pain. I feel like a homeless drunkard to your droogs.

But as soon as EVGA or ASUS bring out their version with 6GB or more, they will be singing in the rain - of my money.

My man, I'm on a Radeon 5850, I know the pain.
 
In sort-of-related news, someone took pics of Galaxy's 970 card. They also got a 3DMark score with an i3 CPU, for some reason.
[Image: Galaxy GeForce GTX 970 GC 4GB]


Also, count me in the 560 Ti group, although I have the 448-core version.
 
Yeah, I guess that card is about three years old now. It has actually done well by me, although, truth be told, I have not updated the drivers in ages because the most recent ones were causing it to crash.
Yeah, I haven't updated my 560 Ti drivers since Nvidia went through that stretch where every single update would either crash your system or run the card so hot it died.
 
I just need a variant that is quiet; I recently upgraded my case fans, and my GPUs are now incredibly noisy by comparison.

Gigabyte Windforces are known to be relatively quiet. I have a Windforce 3X GTX 670, and it's only noticeable under heavy load. It also runs cool (~34°C idle, 55°C under load). Not an OC beast like EVGA's though, IIRC, nor as SLI-friendly, since it's not a blower.
 
Did I take a wrong turn and end up in some console vs. PC thread? You believe in your optimisation fairy tales, but in the real world the 980-in-name, 960-in-specs is outperforming the 290X by 20 percent while using less than half the power.
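For what it's worth, here's the perf/watt ratio that claim implies, taking the leak-based numbers at face value:

```python
# Implied perf/watt advantage from the claim above: ~20% faster at less than
# half the power. Both inputs come from the leak, so treat them as rough.

perf_ratio  = 1.20   # 980 vs 290X performance, per the leaked scores
power_ratio = 0.50   # "less than half the power" taken as exactly half

perf_per_watt_gain = perf_ratio / power_ratio
print(f"Implied perf/watt advantage: >= {perf_per_watt_gain:.1f}x")  # >= 2.4x
```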

Which reminds me, does anyone know what AMD actually has coming next?
This Tonka-toy 285 can't be it...
I haven't heard anything about their next architecture.

You haven't heard anything about AMD's new GPUs because you aren't really looking. Their new line of GPUs will be called Pirate Islands. The 390X is rumored to have over 4000 stream processors, 96 ROPs, and a 512-bit memory bus. AMD also developed stacked DRAM with SK Hynix, which is available for production right now.
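Napkin math on what those rumored specs would mean on paper; the 4096 SP count and the 1 GHz clock are my own assumptions layered on top of the rumor:

```python
# What the rumoured 390X specs would look like on paper. Everything here is
# rumour plus an assumed clock, so it is strictly napkin math.

stream_processors = 4096      # "over 4000" per the rumour; 4096 assumed
core_clock_ghz    = 1.0       # assumed, not part of the rumour

# GCN does one fused multiply-add (2 FLOPs) per stream processor per clock.
fp32_tflops = 2 * stream_processors * core_clock_ghz / 1000
print(f"Theoretical FP32 throughput: ~{fp32_tflops:.1f} TFLOPS")   # ~8.2

# For reference, the 290X (2816 SPs @ ~1 GHz) comes out around 5.6 TFLOPS by
# the same arithmetic, so the rumoured part would be roughly a 45% step up.
```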
 
You haven't heard anything about AMD's new GPUs because you aren't really looking. Their new line of GPUs will be called Pirate Islands. The 390X is rumored to have over 4000 stream processors, 96 ROPs, and a 512-bit memory bus. AMD also developed stacked DRAM with SK Hynix, which is available for production right now.

Isn't that the Pirate Islands “Bermuda” chip? That one might be awfully tempting if Nvidia doesn't bring out Big Maxwell cards until Q4 2015.
 