Nvidia Kepler - Geforce GTX680 Thread - Now with reviews

Are there any benchmarks for the first Crysis? Wasn't that game way more demanding on the GPU than Crysis 2?

That game just wasn't well optimized; look at how much better Warhead ran compared to the original.
Honestly, Crysis 2 and CryEngine 3 outdo Crysis and CryEngine 2 in almost every way.
The only benchmarks we need from Crytek are Crysis 2 DX11 with the high-res textures... CryEngine 2 is old news.

Wait... isn't that multi-screen setup in this video from somebody at this site???
Call me when nVidia allows multi-monitor setups to run on one GPU. I drive this with 1x6950:
B0nQP.jpg

Indeed it is, I love that he has jumped camp, this was all I needed to know that Kepler is god.

Any upcoming games that could make use of this card at 1920x1200? I want it because it is sweet, but I'm having a hard time justifying it.

Resident Evil 6, Far Cry 3, BioShock Infinite, GTA V, GRFS, maybe Borderlands 2, maybe Max Payne 3.
This is assuming all of them are decently made and not just weak ports.

P.S. This is also assuming you have a 120Hz monitor and/or will play in 3D.
 
roadmapw.png


Hadn't seen that before. The 680 is GK104, but they've cut its bus width by 128 bits.

So it is actually not quite as fast as nvidia's intended mid range top dog and it costs a lot more than it really should. And of course they have a lot more cards to come (that will be a lot faster to boot).

Hm, wonder how far off the GK107 and GK106 are since according to this they were supposed to launch before the GK104. Hopefully in time for the Ivy Bridge launch.
 
I need to know who has the GTX 675M in notebooks. All I see at the moment is Origin, so I'm assuming that Sager and the like will get it soon.
 
FYI for those who bought or are considering buying a 680.

TechPowerUp
W1zzard said:
One thing that I noticed is that the cooler emits a strong smell of solvents when the card is loaded. This is not the typical "new graphics card" smell, but something more like glue. Even after a week of testing, it has not completely gone away, but gotten much less intense.

XBitLabs
Sergey Lepilov said:
When we replaced the default GPU thermal interface with Arctic MX-4, we saw the GPU temperature drop by 4°C at peak load. The top speed of the fan decreased by 120 RPM in the automatic regulation mode at that.

Something to keep an eye on.
 

So the stock cooler sucks ass along with their thermal paste, just like every other reference stock cooler and paste ever on GPUs?

Colour me surprised :P
 
Guru3D just posted their SLI review of the 680. Still reading it, so I'll report back on any highlights from the review. I know some will be interested in it.
 
Weren't some people here making a big deal out of those 99th-percentile frame time tests by Tech Report back when Nvidia was winning them?

Well they're back, and now the 7970 is beating the 680, and funny, I haven't heard a peep LOL
This is called "projecting". Some of us actually are unbiased :P.

The one I was waiting for.

Some surprising frame spikes on the 680, they speculate it may have to do with the turbo boost. This is one of those things that can't really be seen with a traditional GPU reviewing methodology.
(That's me quoting the techreport review much earlier in the thread)

Frame time analysis is simply a lot more useful than frame rate analysis.
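To make the distinction concrete, here is a minimal Python sketch of a 99th-percentile frame-time metric like the one Tech Report reports (the frame times below are invented for illustration, not review data): average FPS hides occasional spikes, while the percentile exposes them.

```python
# Hypothetical illustration of a 99th-percentile frame-time metric
# (nearest-rank method); the frame times below are invented, not review data.
def percentile_frame_time(frame_times_ms, pct=99):
    """Return the frame time (ms) that pct percent of frames stay under."""
    ordered = sorted(frame_times_ms)
    idx = max(0, round(pct / 100 * len(ordered)) - 1)  # nearest-rank index
    return ordered[idx]

# 95 smooth frames at ~60 fps plus 5 spikes at ~30 fps:
times = [16.7] * 95 + [33.3] * 5
print(sum(times) / len(times))       # average ~17.5 ms, looks fine
print(percentile_frame_time(times))  # 33.3 ms, the spikes dominate
```

A plain FPS average over the same run would barely register the five slow frames, which is exactly why the frame-time view is more useful.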
 
Good reference:

Single, Dual, Triple, and Quad GPU review of GTX 680 vs HD 7970.

Includes 1920x1080, and 5760x1080, with various settings.

MVir6.jpg


nVidia GeForce GTX 680 Quad SLI demo
http://www.youtube.com/watch?v=nSbDSwmvxjI


XGBff.jpg



nVidia GeForce GTX 680 Quad-SLI review [English version]
http://nl.hardware.info/reviews/2641/nvidia-geforce-gtx-680-quad-sli-review-english-version


Table Of Contents

01 Introduction
02 Benchmarks
03 3DMark Vantage
04 3DMark11
05 Aliens vs Predator
06 Batman: Arkham City
07 Battlefield 3
08 Crysis 2
09 Metro 2033
10 The Elder Scrolls V Skyrim
11 Energy consumption
12 SLI scaling
13 Video
14 Conclusion
3DMark Vantage

In 3DMark Vantage two cards score 46,842 points, only marginally faster than the HD 7970. 3DMark Vantage does not scale well to three or four cards.

3DMark11

In 3DMark 11 Performance the GTX 680 in SLI scores over 1,000 points more than two HD 7970s. The Extreme setting scales perfectly to four cards to a maximum of 10,850 points. Impressive!
Energy consumption

In the power usage test we once again witness the excellent work that AMD has done in terms of idle usage. A second card is almost completely powered down when idling on the Windows desktop, using less than 3 watts. The difference we measured in SLI is quite significant: at idle our system used 96.4 watts with two HD 7970s, and 114.5 watts with two GTX 680s.

The tables are turned in Metro 2033, however. With two HD 7970s we measured 511 watts, but with two GTX 680s the usage did not go above 475.7 watts.

Measurements for triple-SLI will be added soon. Since we had to test quad-SLI on a different motherboard, those results are not comparable.
SLI scaling

In the table below we listed all scores of a single GTX 680 and those of two cards in SLI, with the scaling in the third column.

We can see that SLI improves performance by an average of 62 percent. If we look solely at the 5760x1080 resolution, the scaling is 77 percent. At that resolution with the highest settings, scaling reaches 88 percent.

If we compare with AMD Radeon HD 7970 CrossFireX, we see that the standard SLI scaling is slightly inferior. With AMD we measured a 69 percent increase in performance on average, and 84 percent at 5760x1080. However, on the highest settings nVidia scored a little better: 88 percent compared to 84 percent.
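For reference, the scaling percentages quoted in the review are just the multi-GPU result expressed as a gain over a single card. A quick Python sketch (the FPS numbers here are invented for illustration):

```python
# How an SLI/CrossFireX "scaling" percentage is computed: the percentage
# gain of the multi-GPU result over one card. FPS numbers are invented.
def scaling_percent(single_fps, multi_fps):
    """Percentage performance gain of the multi-GPU setup over a single card."""
    return (multi_fps / single_fps - 1) * 100

# e.g. a single card at 40 fps and two cards in SLI at 64.8 fps:
print(round(scaling_percent(40.0, 64.8), 1))  # -> 62.0
```

So a "62 percent" average means the two-card setup delivered 1.62x the frame rate of one card across the test suite.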



Radeon HD 7970 Quad-CrossfireX Scaling (2 vs 1, 3 vs 1, 4 vs 1)
http://nl.hardware.info/reviews/2506/19/amd-radeon-hd-7970-quad-crossfirex-eyefinity-review-schaling
 
Hm, wonder how far off the GK107 and GK106 are since according to this they were supposed to launch before the GK104. Hopefully in time for the Ivy Bridge launch.
GK107 is shipping in notebooks right now (as the GF 640-660M). It was technically the first Kepler-based GPU that was ready, and it was the main reason for all the notebook design wins NV got in this cycle.
GK106 should come May-ish.
 
http://forums.overclockers.co.uk/showpost.php?p=21532597&postcount=29

Gibbo@OcUK said:
Hi there

I've had access to roadmaps for sometime and have access to the latest.

GTX 680 was intended for March/April, and it is now here. Fact is NVIDIA were always intending to release a GTX 680 class card now; yes, the spec may have changed, but this has always been the scheduled launch for their top-end GTX 680 product.

Cards that will come next shall be 670Ti and 670, expect them around May time, maybe end of April, these shall both be slower than GTX 680 obviously.

GTX 680 2GB, aimed at 7970.
GTX 670ti replaces GTX 580 and shall also be 2GB at £320ish range, it will take on 7950 3GB.
GTX 670 replaces GTX 570, no doubt 2GB also; expect £239.99 and, well, slightly faster than GTX 570.

GTX 560Ti and 560 are not due to be replaced until much later in the year.

All low-end, 520/550 etc. shall be re-branded into 6xx series, same cards just re-boxed as 6 series with slightly bumped clock speeds.

A dual GPU based card could and can be released when NVIDIA desire to do so, most likely called GTX 690.

Again, GTX 680 is flagged on the roadmap as the fastest single-GPU card. Will a faster single-GPU card come this year? Well, I guess that depends on whether NVIDIA feel they need one, and if they do, I suspect an October-December timeframe.


What we can expect in April/May is AIBs making much faster and higher-TDP variants of GTX 680.

Don't be surprised to see cards like an EVGA GTX 680 Superclocked 4096MB soon, with twice the memory and higher clock speeds, in the £500-£600 region.

I guess GK106 will be the base for GTX 660 and GTX 660Ti.
 
I'm super impressed with GTX 680 (GK104) however I will wait and buy it when it is inevitably sold as GTX 760 alongside GTX 780 (GK110) later this year or next year. Pure speculation on my part but I'm willing to take a gamble, 294mm^2 is just too small for an Nvidia top end chip and I won't pay top end prices for it. I do most definitely want it though :D
 
Weren't some people here making a big deal out of those 99th-percentile frame time tests by Tech Report back when Nvidia was winning them?

Well they're back, and now the 7970 is beating the 680, and funny, I haven't heard a peep LOL

http://techreport.com/articles.x/22653/11

scatter-value-99th.gif

You clearly skipped over my post. I mentioned this and the 7970 having 3GB of VRAM.

Although to be fair, it's possible it only lost that because of driver issues with Batman. Nvidia claims it'll be addressed in the next driver. We'll see.

Even with performance increases of only ~10% over SB? Last I checked anyway.

I can see his thinking. 10 percent is 10 percent. Plus it's another 20 percent more energy efficient. And if you coupled it with a new motherboard with Lucid Virtu Universal MVP (if the Virtual V-Sync actually works as claimed), that's a combo that's well worth waiting for if you can.
 
I wonder how much wiggle room AMD has with pricing now. It would be a swift kick in the ass to early adopters if they dropped the prices by $100 across the board. It would certainly make the 7950 a very attractive card at $349 though. :o
 
I wonder how much wiggle room AMD has with pricing now. It would be a swift kick in the ass to early adopters if they dropped the prices by $100 across the board. It would certainly make the 7950 a very attractive card at $349 though. :o

With the 7870 priced at ~$349, I doubt they'll do that. My expectation is that this will happen in the next month or so:

7970: $449
7950: $399
7870/50: $349/$249 (unchanged)

Once the GTX 650/660/670 (May?) are released, I think we'll see price movement on the other cards, until then I think we're in a holding pattern as far as price goes.
 
I always see 4xAA and stuff like that in benchmarks. Why isn't it set to 16xAA etc..??
The problem with higher quality AA modes beyond 4xMSAA is that they are not directly comparable between vendors. (Well, 8xMSAA is to an extent, but on NV you'd almost always choose 16xCSAA instead, which looks better and is faster)
 
I always see 4xAA and stuff like that in benchmarks. Why isn't it set to 16xAA etc..??
Currently no GPU does pure 16xMSAA.

Nvidia has CSAA modes, which add extra coverage samples on top of 8xMSAA.
They also have mixed modes where ordered-grid SSAA is combined with MSAA; these have proper subsamples up to 32xAA, but are quite slow compared to what true 32xMSAA would be.
AMD has MSAA modes up to 8xMSAA, and the remaining modes are just different kinds of tent filters applied to the subsamples of neighbouring pixels.

This means it is not feasible to compare AA modes with sample counts higher than 8 across vendors.

What reviewers could do is test SSAA modes, and we have seen that in a couple of reviews.
 
No... a single GPU (albeit huge, with around 6 billion transistors).

GK110 is a dual-GPU model according to the roadmap.

http://img651.imageshack.us/img651/3986/roadmapw.png

If the rumors are true, the ones saying that this "680" was initially designed to be a mid-range 660 ti-ish card, do you still think this monster 6 billion transistor gpu will be on schedule for the 6xx gen of cards?
 
Like I said earlier, that roadmap is wrong. It was a guesstimate by Goto (the author) at PC Watch Impress.

GK110 = Single GPU, 512bit, Q3
Dual-GK104 has no GPU codename for obvious reasons, April
GK104 is not 384-bit obviously
GK106 is not 256-bit, May

There are more errors but you get the point.
 
Like I said earlier, that roadmap is wrong. It was a guesstimate by Goto (the author) at PC Watch Impress.

GK110 = Single GPU, 512bit, Q3
Dual-GK104 has no GPU codename for obvious reasons, April
GK104 is not 384-bit obviously
GK106 is not 256-bit, May

There are more errors but you get the point.
Damn.. missed that.
If the rumors are true, the ones saying that this "680" was initially designed to be a mid-range 660 ti-ish card, do you still think this monster 6 billion transistor gpu will be on schedule for the 6xx gen of cards?
It is the same Kepler generation, so unless Nvidia's marketing department goes bananas again it should be named 6xx..
Who knows though... they named the slightly fixed 480 a 5xx series..
 
Pleasantly surprised by the Dutch prices. Cheapest GTX680 is €450 right now. AMD will definitely have to react.

It is the same Kepler generation, so unless Nvidia's marketing department goes bananas again it should be named 6xx..
Who knows though... they named the slightly fixed 480 a 5xx series..
I doubt it. That thing will be a monster. Naming it 'GTX 685' or something wouldn't do it justice I think.
 
Pleasantly surprised by the Dutch prices. Cheapest GTX680 is €450 right now. AMD will definitely have to react.

I doubt it. That thing will be a monster. Naming it 'GTX 685' or something wouldn't do it justice I think.


Yeah, they're either going to sit on it for the next gen of cards, or name it something badass like 699SS.


And God help me, I might just buy one at €450 (the 680, that is). Must be strong.
 
Hmmm, Arctic's 680-compatible (among others) VGA cooler was removed from their site, and it required a side-by-side power connector layout rather than the reference stacked one. Plus it was slated for an April 19th release... The 7970 cooler was available right off the bat.

Would I be crazy in thinking we'll see another Kepler card around that time, and Arctic weren't supposed to have posted that just yet? Seems weird to claim support for a top-end card when the reference design isn't even compatible.
 