Nvidia Kepler - GeForce GTX 680 Thread - Now with reviews

426 euro excluding VAT = around a $400-450 launch price in the US, typically for a pre-order:
Don't think this is true.
500 euro after VAT = $500 (NON-PREORDER).

Yes, early prices that pop are usually more, but I don't expect this to be at $400 or $450 at all unless they plan on releasing GK110 at $550+ to blow away everything in the coming months.
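For what it's worth, here's the rough arithmetic behind those euro-to-dollar figures; a quick Python sketch where the 19% VAT rate and the ~1.31 EUR/USD exchange rate are my own assumptions, not anything from the listing:

```python
# Back-of-the-envelope price conversion. All rates here are assumptions, not confirmed figures.
eur_ex_vat = 426.0   # listed pre-order price, excluding VAT
vat_rate   = 0.19    # assumed EU VAT rate (varies roughly 19-25% by country)
eur_usd    = 1.31    # assumed EUR/USD exchange rate around March 2012

eur_inc_vat = eur_ex_vat * (1 + vat_rate)
print(f"incl. VAT: {eur_inc_vat:.0f} EUR")                           # ~507 EUR
print(f"ex-VAT converted to dollars: ${eur_ex_vat * eur_usd:.0f}")   # ~$558

# Common rule of thumb: the euro price including VAT tends to match the US dollar
# price excluding sales tax, which is where "500 after VAT = $500" comes from.
print(f"rule-of-thumb US price: ${eur_inc_vat:.0f}")                 # ~$507
```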
 
BTW, my "question" had nothing to do with that post.

Really interested in seeing how people respond to the next two GPU generations. Strange days.


Damn, are those 3 monitors attached to 1 stand?
You can have quite a few. For instance:

http://www.samsung.com/hk_en/consum...MURHB/XK-features?subsubtype=md-and-vc-series

[product photos of the multi-monitor stand from the Samsung page]
 
Don't think this is true.
500 euro after VAT = $500 (NON-PREORDER).

Yes, early prices that pop are usually more, but I don't expect this to be at $400 or $450 at all unless they plan on releasing GK110 at $550+ to blow away everything in the coming months.

I hope this is somewhat true. Love to get my hands on the GK110 at a somewhat reasonable price.
 
I hope this is somewhat true. Love to get my hands on the GK110 at a somewhat reasonable price.
I hope it is true also. $450 would be the absolute lowest I'd expect if it is equal to or faster than the 7970.

From GK104's wattage draw (which is insane) it really does look like it's just INSANELY capable. It doesn't look like it was pushed just to match 7970 performance (if it had been, the arch would have been specced lower and the wattage draw would be way up). This just confirms that the 7970 is a marginal improvement and that Kepler is actually what we should have expected from both parties given both a new arch and a new process. If those benches are true and not biased.

Which means... the GTX 660 Ti and GTX 670 are going to be even more ridiculous from a value standpoint. Even if they cut the clocks down on those, as long as the card has 2GB of RAM I'll pick one up because it'll be better value.
 
From GK104's wattage draw (which is insane) it really does look like it's just INSANELY capable. Which means... the GTX 660 Ti and GTX 670 are going to be even more ridiculous from a value standpoint. Even if they cut the clocks down on those, as long as the card has 2GB of RAM I'll pick one up because it'll be better value.

Idle power draw actually doesn't look good. AMD's ZeroCore is winning there.

[attached power-draw chart]


If the 680 is $550 as I expect, and basically similar value to the 7970, I'm not sure I'd expect great things out of the rest of their lineup further down either.

I might just grab a 7870 already; I'm tired of waiting. Why wait another indeterminate amount of time for the rest of Nvidia's lineup to trickle out?

Kepler might be a nice fit for a console with these power numbers. Looks like Sony jumped off the Nvidia bandwagon at exactly the wrong time lol.
 
this thread is a goddamn rollercoaster:

It's better than the 7970 - that's good
No it's not - that's bad
It's on par, but cheaper and uses less power - that's good
Actually it's quite expensive - that's bad
but it might be cheaper...
 
Idle power draw actually doesn't look good. AMD's ZeroCore is winning there.

[attached power-draw chart]


If the 680 is $550 as I expect, and basically similar value to the 7970, I'm not sure I'd expect great things out of the rest of their lineup further down either.

I might just grab a 7870 already; I'm tired of waiting. Why wait another indeterminate amount of time for the rest of Nvidia's lineup to trickle out?

Kepler might be a nice fit for a console with these power numbers. Looks like Sony jumped off the Nvidia bandwagon at exactly the wrong time lol.
I'd imagine it's pretty hard to get close to those idle numbers. Those are low. I don't think that's ZeroCore though.

It's just everything I said in the post you quoted, combined. It's not meant to be as high-end, but it scales great with its load power.
 
My guess is that Epic Games is using a GTX 690, which is said to possibly be launching in mid-May, to power the Unreal Engine 4 technology that was shown off behind closed doors at GDC this year. Sounds like a nice upgrade over my aging GTX 590. Bring it on!

Uhhhh, they said it was 200 watts so.... 680.
 
What's with the 300MHz difference between the default clock and the GPU clock there in GPU-Z?

Factory overclocked?

Maybe, but I am also wondering if this is that Turbo OC that was rumored? Perhaps all GTX 680s have built-in turbo OC like Intel CPUs. That'd be pretty neat. Perhaps it can be enabled or disabled via the control panel.
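If anyone who gets a card early wants to check whether the clock really ramps above the advertised default under load, here's a minimal sketch, assuming the nvidia-ml-py (pynvml) bindings and a driver that exposes clock counters over NVML:

```python
# Minimal clock-watcher sketch using NVML via pynvml (nvidia-ml-py).
# Run a 3D load in another window and watch whether the reported graphics clock
# climbs above the advertised default clock.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):           # older bindings return bytes
        name = name.decode()
    max_clock = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"{name}: max graphics clock {max_clock} MHz")
    for _ in range(30):                   # sample once a second for 30 seconds
        current = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"current graphics clock: {current} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

If the current clock sits well above the default under load and drops back at idle, that would line up with the turbo/boost theory.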
 
*looks at lightning xtreme 580s in machine

*looks at benches

*looks back at 580s

*eyes squinting looking at 680 specs

*looks at wallet...looks back at screen

*CLICK X TO CLOSE BROWSER
 
*looks at lightning xtreme 580s in machine

*looks at benches

*looks back at 580s

*eyes squinting looking at 680 specs

*looks at wallet...looks back at screen

*CLICK X TO CLOSE BROWSER

lol Smokey....I am glad I am not the only one...as insane as it sounds...looking at this thread and reading other threads has me eyeing my 3GB Classified 580s and thinking 2 680s would be really nice....lol
 
At the risk of sounding stupid, aren't there anti-cartel laws in place for just these things?

Implicit price fixing is very hard to catch. Signalling to the competition is not illegal, and a defense of "it's just economics, and this is when we were done making it" is rock solid.
 
We need some 1 to 1 GPU comparisons of the 680 vs 7970. I want to see how both perform at the same clock since both should be able to OC well, and we all know the stock frequency on the 7970 is too low.
 
We need some 1 to 1 GPU comparisons of the 680 vs 7970. I want to see how both perform at the same clock since both should be able to OC well, and we all know the stock frequency on the 7970 is too low.
As far as I know they're even.
 
We need some 1 to 1 GPU comparisons of the 680 vs 7970. I want to see how both perform at the same clock since both should be able to OC well, and we all know the stock frequency on the 7970 is too low.

I don't understand why people want this. They don't use the same architecture, and therefore a direct comparison between clock speeds just doesn't work. A 1-to-1 comparison would be with them both at stock speed and then at their max overclock.
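Just to spell out what a per-clock comparison would and wouldn't tell you, here's a quick sketch; the clocks are the reference clocks and the fps figures are made-up placeholders, not benchmark results:

```python
# "Per-clock" comparison sketch with placeholder fps numbers (not real data).
# Dividing fps by core clock only measures per-clock throughput, which mostly reflects
# architectural differences (shader count, IPC), not overclocking headroom or value.
cards = {
    # name: (average fps in some benchmark, reference core clock in MHz)
    "GTX 680": (60.0, 1006),
    "HD 7970": (57.0, 925),
}

for name, (fps, mhz) in cards.items():
    print(f"{name}: {fps / mhz * 1000:.1f} fps per GHz")
```

Whichever card comes out ahead on that metric, it still doesn't tell you which one clocks higher in practice or which is the better buy at its actual price.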
 
If they were going to sell it for $400, I think they probably would have cut the 580's MSRP to something lower than...$400.
The 580 isn't in production anymore. Whatever cut they may or may not make to it won't last longer than a couple of months (until GK106 hits the market).
 
Some minor new info from the HardOCP forums:

8 days and 10 hours, I will spill some serious beans...


all i can say is NVIDA surround will not run on one card, heaven benchmark with everything maxed above 29fps in surround, thats all i can say, believe or not


wow i don't know why my phone auto corrected to NOT, its obviously wrong

[attached screenshot]


Source: http://hardforum.com/showpost.php?p=1038505114&postcount=730

Sadface...I don't want to wait 8+ days for the official reveal...that means it's probably two weeks until we can order two. :)
 
Shit imagine the results if he plugged in the power cables!!

I can only hope that Nvidia does a "Here's our new card and it's in stores right now"....I don't want to wait 2 weeks+
 
Shit imagine the results if he plugged in the power cables!!

lol...I always do that type of shit when I first get a card. I get excited to take a pic of it in my case and I'll either forget to plug it in (for the pics) or forget the SLI bridge or something important like that...none of my friends would notice, but I still have to fix it. :)
 
I wonder if it supports two monitors. I have two projectors, and I would LOVE to try out multi-projector gaming. I've messed around with some software solutions, but it's not really feasible. Plus, 3840x1080 (or 2560x720) at least has some chance of running at 60fps, unlike single-card 5760x1080. I also happen to have two 1080p monitors, but there I think the bezel could cause some issues. I wonder if they could do something like having only an offset display (i.e. extending the field of view to the right only) instead of centering it across two displays. Please nVidia, make it happen.

Edit

TXAA looks intriguing; I wonder if that's what makes it possible to go head to head with a 384-bit card. Normally, I would expect the card with more memory bandwidth to spank the 256-bit card at a decent res with 8xAA. I really hope nVidia reveals what the GK110 is capable of at launch, although business-wise I guess it doesn't make much sense.
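For the bandwidth angle, the raw numbers are easy to work out from bus width and memory data rate. Here's a quick sketch using the commonly reported reference specs (256-bit at 6 Gbps effective for the 680, 384-bit at 5.5 Gbps for the 7970; treat those as assumptions), plus the pixel counts for the resolutions above:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective memory data rate.
# The bus widths and data rates below are the commonly reported reference specs,
# not figures confirmed anywhere in this thread.
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print("GTX 680 (256-bit, 6.0 Gbps):", bandwidth_gb_s(256, 6.0), "GB/s")  # 192 GB/s
print("HD 7970 (384-bit, 5.5 Gbps):", bandwidth_gb_s(384, 5.5), "GB/s")  # 264 GB/s

# Pixel counts for the surround resolutions mentioned above:
for w, h in [(2560, 720), (3840, 1080), (5760, 1080)]:
    print(f"{w}x{h}: {w * h / 1e6:.1f} megapixels")
```

So the 7970 has roughly a third more raw bandwidth on paper, and 5760x1080 is about 50% more pixels than 3840x1080, which is why the high-AA surround numbers are the ones worth watching.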
 