Nvidia Kepler - GeForce GTX 680 Thread - Now with reviews

I'm guessing there is absolutely no reason to get one of these if I have a GTX580?

That 4GB 680 EVGA card does look hot, but it would be a vanity purchase seeing as my card is only a year and a half old :S

I love vanity purchases though >:)
 
"GTX 680 is even more impressive at future graphics workloads than the review benchmarks really let on. Part of this has to do with having 2x the geometry performance/cluster as Fermi, the bind-less texture support, etc. Given that the game development mind share is basically backwards compatible to the console boat anchor, in all honesty, it is going to take a while before people realize what is possible on GTX 680 when they really start to push the card with the same vigor as they push the Xbox360 or PS3. However if this does happen, you can expect a fundamentally different experience than even the best current PC games."

http://timothylottes.blogspot.pt/2012/03/gtx-680-gk104.html
That is because GK104 has half the number of PolyMorph engines compared to Fermi. Bindless texture support sounds promising but depends on developer adoption. Kind of like AMD's PRT, which John Carmack confirmed will be implemented in the next-gen Doom title.
 
"GTX 680 is even more impressive at future graphics workloads than the review benchmarks really let on. Part of this has to do with having 2x the geometry performance/cluster as Fermi, the bind-less texture support, etc. Given that the game development mind share is basically backwards compatible to the console boat anchor, in all honesty, it is going to take a while before people realize what is possible on GTX 680 when they really start to push the card with the same vigor as they push the Xbox360 or PS3. However if this does happen, you can expect a fundamentally different experience than even the best current PC games."

http://timothylottes.blogspot.pt/2012/03/gtx-680-gk104.html

Good luck with that... it hasn't happened with any prior card, so why would they start now?
 
Regarding the Lottes quote, see the Tessmark (X64) results here:
http://techreport.com/articles.x/22653/6

I noticed that earlier when reading the review. I think it's the single largest improvement in any benchmark (synthetic or otherwise) that 680 shows over 580. There must be some deeper architecture changes at work here, otherwise I don't see how an improvement of more than 2x could be possible.
 
I'm guessing there is absolutely no reason to get one of these if I have a GTX580?

That 4GB 680 EVGA card does look hot, but it would be a vanity purchase seeing as my card is only a year and a half old :S

I love vanity purchases though >:)

Well, most reviews are reporting an average performance boost of around 36% for the 680 over a stock 580. To me that seems like a pretty big jump. Depending on how much you can sell your 580 for, you'll have to decide whether the money you need to kick in after selling it is worth a 36% speedup to you.
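The upgrade math above is easy to sketch in a few lines of Python. The prices below are hypothetical examples (the resale value especially is a guess), only the $499 MSRP and the ~36% uplift come from the thread:

```python
# Rough upgrade-value math: net cash outlay per percent of performance gained.
# Resale value is a made-up example; adjust for your own market.
def upgrade_cost_per_percent(new_price, resale_value, perf_gain_pct):
    """Net cost of the upgrade divided by the performance gained, in $ per %."""
    net_cost = new_price - resale_value
    return net_cost / perf_gain_pct

# e.g. a $499 GTX 680, selling the old GTX 580 for a hypothetical $300,
# against the ~36% average uplift reviews reported:
cost = upgrade_cost_per_percent(499, 300, 36)
print(f"${cost:.2f} per percent of extra performance")  # prints "$5.53 per percent of extra performance"
```

Whether roughly $5.50 per percent is worth it is, of course, the subjective part.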
 
My winnings from gambling the last two nights could cover the difference between selling my 580 and getting the 680 D: . This might speed up my decision process, heh. Though I'll still wait a little to find out when the custom-cooler models (DirectCU or TwinFrozr) will be out.

How long did it take for the 580 custom models to come out? I forget.
 
I take it that there are no games that are currently programmed to activate TXAA?
 
It was a damn good card for a long time though. Mine is retired. Sort of wish I would've waited for this instead of splurging on the 580.

I can still pull 60fps in Diablo 3 with almost everything on high at 1920x1200. She's been good to me, that's for sure. I think I'll hold out for a budget level 680 when it drops.
 
Regarding the Lottes quote, see the Tessmark (X64) results here:
http://techreport.com/articles.x/22653/6

I noticed that earlier when reading the review. I think it's the single largest improvement in any benchmark (synthetic or otherwise) that 680 shows over 580. There must be some deeper architecture changes at work here, otherwise I don't see how an improvement of more than 2x could be possible.
And this is only going to be $550? I just might have to save my pennies for this!
 
So the 680 is the replacement for the 580, right? Does that mean the mid-range cards will be released a little later? I'm waiting it out at the moment with a 7600GT, in hopes that the new cards will either push prices down on the 500 series or that a good mid-range 600 card will come out soon.

Or have I got this completely wrong?
 
Regarding the Lottes quote, see the Tessmark (X64) results here:
http://techreport.com/articles.x/22653/6

I noticed that earlier when reading the review. I think it's the single largest improvement in any benchmark (synthetic or otherwise) that 680 shows over 580. There must be some deeper architecture changes at work here, otherwise I don't see how an improvement of more than 2x could be possible.


+ increased clocks compared to Fermi. I also suspect the faster L2 cache might be one of the reasons.
 
I mean, it makes sense for them. It's faster than the GTX 580, and faster than the 7970. They want this to seem like the flagship card at the moment to make suckers like me buy into it.

I'm fighting hard though...I'm going to hold off until GK110.
 
So why do you need a $500 GPU and a new CPU to game at 60fps at 1080p with a controller with noticeable input lag?
Well, because I don't upgrade that much anymore, every few years or so, and I'm on a 5850 at the moment.

Yes, and I'll add that depending on what CPU The Dutch Slayer is upgrading from and what games he typically plays, he could still get slight fps increases because of the better CPU architecture. Also, Dutch, try to get a GPU with 3GB; even if you game at 1080p, I gave that Skyrim example where 2GB gets maxed out depending on your settings.

Maybe some months from now there will be a 680 3GB edition at the same price.
Upgrading from a first-gen i7 920, no Sandy Bridge or anything.

Thanks for all the tips.
 
So basically the 680 is Nvidia's mid-range card at a high-end price.

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

They must be making a fortune atm.

I remember not too long ago when people were laughing at them. "They were gonna be late to the market and ATI was going to be releasing Sea Islands (to blow them out of the water again) not long after Kepler finally launched." That narrative certainly has taken a hit.

Hopefully Sea Islands is actually a quality product this time, because allowing Nvidia to downward revise their entire line sucks big time.
 
http://img651.imageshack.us/img651/3986/roadmapw.png


Hadn't seen that before. The 680 is GK104, but they've reduced its bus width by 128 bits.

So it's actually not quite as fast as Nvidia's intended mid-range top dog, and it costs a lot more than it really should. And of course they have a lot more cards to come (that will be a lot faster to boot).
 
http://img651.imageshack.us/img651/3986/roadmapw.png

Hadn't seen that before. The 680 is GK104, but they've reduced its bus width by 128 bits.

So it's actually not quite as fast as Nvidia's intended mid-range top dog, and it costs a lot more than it really should. And of course they have a lot more cards to come (that will be a lot faster to boot).
Nope, GK104 has four 64-bit memory channels (256-bit total). Nothing was reduced there. The chart gets quite a few things wrong.
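The four 64-bit channels add up to the 680's 256-bit bus, which is also where its memory bandwidth comes from. A quick back-of-envelope in Python (the ~6 GT/s GDDR5 data rate is the launch spec; treat it as an assumption):

```python
# Memory bus and bandwidth back-of-envelope for GK104 (GTX 680).
channels = 4                           # four independent 64-bit memory controllers
channel_width = 64                     # bits per channel
bus_width = channels * channel_width   # 256-bit total, as the post says

data_rate = 6008e6                     # GDDR5 effective transfer rate, ~6 GT/s (launch spec)
bandwidth_gb_s = bus_width / 8 * data_rate / 1e9   # bytes per transfer x transfers per second

print(bus_width, round(bandwidth_gb_s, 1))  # prints "256 192.3"
```

That ~192 GB/s figure matches what launch reviews listed for the card, so the "four 64-bit channels" reading checks out.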
 
Saw a few in stock at a local online retailer, even at a nice price. I would have pulled the trigger on two if it weren't for the damn stacked power connectors :/
 
Freaking Newegg is sold out. Eh, either way I'm RMA'ing the 7950 I just got, and for $100 more I'll snag a GTX 680... whenever they get back in stock.
 
Freaking Newegg is sold out. Eh, either way I'm RMA'ing the 7950 I just got, and for $100 more I'll snag a GTX 680... whenever they get back in stock.

They aren't sold out; they basically never had any. (Well, I saw one pop into stock, only to be gone when I added it to my cart out of curiosity, not that I was going to buy it.)
 
Yeah. That's why we need competition. AMD needs to figure their shit out. They are surely getting destroyed by Nvidia in GPU sales and Intel in CPU sales. I would hate for them to go away and leave Intel and Nvidia to rule unchallenged.

Seriously. Not to mention good competition helps to keep prices in check too.
 
So how well would the 680 run The Witcher 2 with ubersampling? Not sure if that's one of those features reserved for a future where hardware is way faster than it is now, or what...
 
So how well would the 680 run The Witcher 2 with ubersampling? Not sure if that's one of those features reserved for a future where hardware is way faster than it is now, or what...

Poorly.

Remember, though, that ubersampling isn't an on/off thing. You can ramp that stuff up: the default is off or 2, but you can write in 16 and it will render at something like 10 seconds per frame.
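Ubersampling is essentially full-scene supersampling: render at a multiple of the output resolution, then filter down. A minimal NumPy sketch (the toy "renderer" here is made up purely to show the shapes) of why the cost blows up with the factor squared:

```python
import numpy as np

def supersample(render, width, height, factor):
    """Render at factor x the target resolution, then box-filter down.
    Shading cost scales with factor**2, which is why high factors crawl."""
    hi = render(width * factor, height * factor)  # the expensive part
    # Average each factor-by-factor block down to one output pixel.
    return hi.reshape(height, factor, width, factor).mean(axis=(1, 3))

# Toy "renderer": a horizontal gradient, just to exercise the shapes.
img = supersample(lambda w, h: np.fromfunction(lambda y, x: x / w, (h, w)),
                  width=4, height=2, factor=4)
print(img.shape)  # prints "(2, 4)": output resolution, but 16x the shading work
```

So at a factor of 16 the GPU is shading 256x the pixels of a plain 1x render, which lines up with the "seconds per frame" numbers mentioned above.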


Interestingly Computerbase.de's results for The Witcher 2 are completely different:

http://www.computerbase.de/artikel/...a-geforce-gtx-680/37/#abschnitt_the_witcher_2

(computerbase is very trustworthy in my experience)

They look similar to me... red in the graph here means minimum framerate, I think, with green being average. Compare the MLAA + 16x AF result on computerbase with this.
 
For the first time, Nvidia did a great job on power consumption...
197W at full load? NICE JOB!

And only $499?

Best card at the moment! And I'm an ATI fan...
 
I remember not too long ago when people were laughing at them. "They were gonna be late to the market and ATI was going to be releasing Sea Islands (to blow them out of the water again) not long after Kepler finally launched." That narrative certainly has taken a hit.

Hopefully Sea Islands is actually a quality product this time, because allowing Nvidia to downward revise their entire line sucks big time.

Yeah, and there was also a point where "Kepler handily beats 7970 for 299". How quickly we forget.

I don't see the need for too much doom and gloom for AMD on the GPU front. They're in the same spot they were last gen performance-wise (580 > 6970, 680 > 7970), if not a little better off, as the gap seems a little smaller.

Their pricing looks horrible right now, but they can fix that.

The only way I can see Nvidia hurting them gravely is if Nvidia launches a price war, something like a 680 for $299, and I don't see that happening.

Some doom and gloom is warranted though. Just keep it in perspective imo.

Hell, if you look at overall GPU market share, AMD is rising while Nvidia is falling. Why? Because of Fusion-type products, and increasingly capable integrated GPUs are creeping up on and obsoleting the lower reaches of the discrete market.
 
Poorly.

Remember, though, that ubersampling isn't an on/off thing. You can ramp that stuff up: the default is off or 2, but you can write in 16 and it will render at something like 10 seconds per frame.

Huh? It's an on/off option in The Witcher 2. Unless there's some ini edit you can make or something.
 