Nvidia Kepler - GeForce GTX 680 Thread - Now with reviews

There was pretty much no way Nvidia wasn't going to be faster, since the 7970 was only about 20% faster than the GTX 580; Nvidia would have had to introduce a generation only 20% faster, which would have been a massive fail as well as unlikely. AMD, for their part, needs to start aiming higher, though I'm sure their strategy is ultimately a dual-die 7970-based card to take the top spot. Then again, if this mid-high-range Nvidia part is as good as it seems, Nvidia will be able to trump that as well, since they can fit two of them on one card. Only the really big single Nvidia GPUs are unfit for dual-GPU duty.

I see everybody's going for Nvidia in the PS4 again now that they see this thing performing well. That's funny, because up until now every comment I've seen has stated as fact that there'd be an AMD GPU in both the PS4 and the next Xbox... how quickly the winds change. I knew that would happen.
 
I know GK110 was supposed to be the top dog but there is also a GK112.
There is no GK112. At least not in the line which is supposed to be released soon. Next year -- who knows?

Latest rumors (not posted anywhere) are that the Kepler launching in March/April is GK110 and there is no GK104. Charlie's saying GK104 is launching at that time and will handily win against the 7970. Nvidia shills are also saying GK104 exists. So yeah, I'm thinking GK104 and GK110 are one and the same - Nvidia probably referred to the same GPU with two names to weed out the leaks.
GK104 is a mainstream GPU, GK110 is an enthusiast GPU. These aren't only different chips but they're probably based on slightly different versions of the Kepler architecture (GK110 is supposedly more GPU compute oriented while GK104 -- and lower -- are more graphics oriented).
What Charlie says is that NV's next mainstream GPU -- GK104 -- beats AMD's newest and fastest Tahiti/GCN. Charlie doesn't know anything yet about GK110 because that chip will launch some time later than GK104 and because of this it's kept under wraps for now.

Good, I hope this drives the prices down on the 570
The 570, 580, 560 Ti 448, and all other cards based on GF110 aren't in production anymore.
 
You all should realize that Charlie was being sarcastic on that one. He didn't say anything about in what sense GK104 is a "win".

GK104 competes with the HD 7800 (which is also small and 256-bit), not the 7900.
 
But multi-monitor setups aside, where are the games to stress these next-gen cards?

Nonetheless, I'm excited, because maybe these will be the cards they stuff in the 720 and PS4.
 
BF3 on ultra is already here

I guess the emphasis is on the plural. But don't 580s run BF3 at 60 fps on ultra at 1080p? I thought they did.

Edit: they get 45 fps? You don't say.

Edit edit: the norm across the benchmarks seems to be around 50-60ish, but yeah, sub-60.
 
Can we finally put Crysis on Very High at 1080p and have it run a rock-solid 60 fps?!?!?!

Seems like this is something that's still hard to pull off. Sure, the tech sites show 60 fps in benchmarks, but look closer and they have anti-aliasing turned off and the overall settings left on High instead of Very High, and even then at 1080p it seems like you need two $500 video cards to run a game that came out in 2007.

Not saying it's not perfectly playable with some custom settings approximating the best parts of High and Very High at 30 fps or so; it just seems unreal to me that almost 5 years later it's still a worthwhile question for a GPU: "but can it run Crysis?"
 
Call me when nVidia allows multi-monitor setups to run on one GPU. I drive this with a single 6950:

[image: B0nQP.jpg]

All that just for porn bro? Geeezz
 
Can we finally put Crysis on Very High at 1080p and have it run a rock-solid 60 fps?!?!?!

Seems like this is something that's still hard to pull off. Sure, the tech sites show 60 fps in benchmarks, but look closer and they have anti-aliasing turned off and the overall settings left on High instead of Very High, and even then at 1080p it seems like you need two $500 video cards to run a game that came out in 2007.

Not saying it's not perfectly playable with some custom settings approximating the best parts of High and Very High at 30 fps or so; it just seems unreal to me that almost 5 years later it's still a worthwhile question for a GPU: "but can it run Crysis?"

out of curiosity, would you replay the game if they did?
 
Can't wait for the new Nvidia cards to come out. Currently rocking my GTX 480; it runs hotter than hell and it's a solid card, but it's just not enough for some games to run at 1080p on max settings. I'd recommend anyone looking at a GTX 580 simply not buy it; it's too expensive and the new ATI card is simply better. WAIT for Kepler if you are thinking of upgrading to a GTX 580!! (Or wait for the 580 to drop in price!)
 
Can we finally put Crysis on Very High at 1080p and have it run a rock-solid 60 fps?!?!?!

Seems like this is something that's still hard to pull off. Sure, the tech sites show 60 fps in benchmarks, but look closer and they have anti-aliasing turned off and the overall settings left on High instead of Very High, and even then at 1080p it seems like you need two $500 video cards to run a game that came out in 2007.

Not saying it's not perfectly playable with some custom settings approximating the best parts of High and Very High at 30 fps or so; it just seems unreal to me that almost 5 years later it's still a worthwhile question for a GPU: "but can it run Crysis?"

Don't worry, thanks to internet cry-babies we won't have to worry about games pushing our hardware anymore.
 
Don't worry, thanks to internet cry-babies we won't have to worry about games pushing our hardware anymore.

I laughed. I don't mind graphics being pushed at all, personally; the idea that graphics and gameplay cannot be pushed at the same time has always bothered me. New gameplay ideas and improved graphics are always welcome :)
 
My guess is that by "handily" he means like every generation since the R700/GT200: Nvidia will have the most powerful card, but also the biggest and most expensive. Which you go for will depend on performance at your price point.
 
Man, I did not know that existed. Anyway, too bad it's only 1.5GB of VRAM; games at 5760x1200 are really VRAM hungry.
[image: PS-GTX460-Crysis3.jpg]
[image: DM4Jr.gif]

Hahahahaha, this deserves to be in the pics-that-make-you-laugh thread.
Oh man, haven't laughed like that in a while... I think it's time to abandon the thread till real news comes out; right now it's clearly gone to shit.
 
There is no GK112. At least not in the line which is supposed to be released soon. Next year -- who knows?
Just because we haven't heard much about it yet doesn't mean it doesn't exist.

GK104 is a mainstream GPU, GK110 is an enthusiast GPU. These aren't only different chips but they're probably based on slightly different versions of the Kepler architecture (GK110 is supposedly more GPU compute oriented while GK104 -- and lower -- are more graphics oriented).
What Charlie says is that NV's next mainstream GPU -- GK104 -- beats AMD's newest and fastest Tahiti/GCN. Charlie doesn't know anything yet about GK110 because that chip will launch some time later than GK104 and because of this it's kept under wraps for now.
I never said anything contrary to any of that. You probably skipped the part of my post where I said that there is conflicting information being spread inside the rumor mill - one is that there is GK104 launching in March/April and the other is that there is no GK104 and the Kepler launching in March/April is GK110.
 
But multi-monitor setups aside, where are the games to stress these next-gen cards?

Nonetheless, I'm excited, because maybe these will be the cards they stuff in the 720 and PS4.

Have you tried running Crysis 2 with DX11 at high resolutions? There are real-time reflections on anything with the slightest hint of moist specularity.
 
nVidia means better drivers though. It's more than likely that Kepler will be the better choice.

I just hope the proverbial sweet spot cards come out sooner than later.

Yup, awesome drivers.

http://www.legitreviews.com/news/10352/

Aside from that, both groups have problems with drivers.

This is from the 290.53 beta drivers, dated December:

Key Bug Fixes

•Fixes some random instances of triangular artifacts when playing Battlefield 3 (fix is now enabled for GeForce 400 and 500 series GPUs).
•Fixes a default panel resolution/ timing bug in 290.36.

3D Vision

•Fixes issues with Call of Duty Modern Warfare 3 not launching into 3D Vision mode

290.36 beta drivers, dated November:

Some Key Bug Fixes

•Fixes random flickering as Windows boot logo is loading or fading away.
•Fixes corruption in Crysis 2 with SLI and lower quality shadow settings.
•Fixes ability to set Surround resolutions to 5760x1080 using custom resolutions.
•Fixes some random instances of triangular artifacts when playing Battlefield 3.
•Fixes corruption seen in Settlers 7 with 275.33 drivers.


Now, obviously both companies have bugs; it happens. The grass isn't always greener on the other side.


Nvidia didn't have official support for Dead Island till October.
 
Yup, awesome drivers.

http://www.legitreviews.com/news/10352/

Aside from that, both groups have problems with drivers.

This is from the 290.53 beta drivers, dated December:



290.36 beta drivers, dated November:




Now, obviously both companies have bugs; it happens. The grass isn't always greener on the other side.


Nvidia didn't have official support for Dead Island till October.

He didn't say "perfect", he said "better", which just indicates they are superior to AMD's drivers, which they very much are.

Sure, Nvidia has faulty drivers and suffers from problems from time to time, everyone knows that, but most people also know AMD's drivers are usually lacking a lot.
 
He didn't say "perfect", he said "better", which just indicates they are superior to AMD's drivers, which they very much are.

Sure, Nvidia has faulty drivers and suffers from problems from time to time, everyone knows that, but most people also know AMD's drivers are usually lacking a lot.

A lot of what? Rage was the only game in recent memory I couldn't play on day 1, and I bought almost all of the "AAA" PC releases day 1 this year (BF3, Witcher 2, Skyrim...).

I know AMD has problems; I've been putting up with screen flickering when I overclock while using multiple monitors for almost 2 years (the GPU idles down too far when overclocked), but I also have a friend whose Nvidia display driver crashes a couple of times a day while browsing the internet.
 
So how are you guys measuring driver stability between AMD and Nvidia? Are there any articles where people spend a lot of time with both?
 
So how are you guys measuring driver stability between AMD and Nvidia? Are there any articles where people spend a lot of time with both?

Who knows?

Usually people just voice their personal experiences, and usually they've had problems with drivers a couple of times. As someone who has built PCs for over 15 years, I've never really noticed any big difference. I always just buy the card with the best performance for the money. If there are problems with drivers, they get fixed, regardless of what brand of card you have.

But I guess the most recent driver problem was Rage with AMD/ATi, so that's the one people remember. These things go in waves, though, and I've never had a driver problem that rendered a game unplayable and wasn't fixed fast.
 
I think the driver issue isn't just (or even primarily) about bugs, it's about features. To me, NV - at this point in time - is superior to AMD in 3 major areas that are often not adequately accounted for in reviews:
  • Speed of support for new games. New games are usually supported on release or very close to it with profiles/fixes/optimization, etc. For AMD this can take longer.
  • Minimum framerates and number of stutters. The Techreport has introduced a new measurement system that moves away from average FPS (which isn't very useful really) to try and capture the "smoothness" of gameplay instead, taking into account effects such as microstutter and framerate distribution over time. The results were generally slightly better -- compared to FPS measurements -- for NV, which confirms a feeling I had for a while.
  • Abilities to enhance IQ. This should be what owning a high-end GPU is all about, and NV offers more options here. Between downsampling, SGSSAA, combined MS/SSAA, CSAA and the different transparency AA modes it's possible to force or improve AA in basically every single game ever. Regarding AF, high quality mode is still unmatched by any mode on AMD (not in terms of angle dependence, but in terms of samples taken and thus flickering). And on top of that you get the option to force SSAO in many games, which often doesn't work that well but is really nice in some.
These are all not really related to driver stability, but driver capability. And they are the reasons why an AMD card would have to be at least 30% or so better in terms of pure framerate/$ for me to consider it at this point in time. (Note that all of this disregards specialty features on both sides, like 3d Vision, Eyefinity, Cuda or Physx)

FWIW, I've used both brands for a long time, but haven't used AMD in my personal gaming desktop since getting a GTX260. I use at least 3 systems at any point in time (home desktop, work desktop, laptop) so I have more data points available.
 
Usually people just voice their personal experiences, and usually they've had problems with drivers a couple of times. As someone who has built PCs for over 15 years, I've never really noticed any big difference. I always just buy the card with the best performance for the money. If there are problems with drivers, they get fixed, regardless of what brand of card you have.

I agree with this. Am I the only one whose cards from both vendors over the years have all worked fine?

If anything, it seems to me there are more problems with Nvidia cards these days, microstuttering reports and such. That could just be because I haven't owned an Nvidia card in a while, but AMDs have been smooth as silk for me for a couple of gens.

I just think there are a lot of Nvidia fans around here, so they go around saying "AMD drivers suck" all the time until everybody treats it as fact.

GK104 is a mainstream GPU, GK110 is an enthusiast GPU. These aren't only different chips but they're probably based on slightly different versions of the Kepler architecture (GK110 is supposedly more GPU compute oriented while GK104 -- and lower -- are more graphics oriented).
What Charlie says is that NV's next mainstream GPU -- GK104 -- beats AMD's newest and fastest Tahiti/GCN. Charlie doesn't know anything yet about GK110 because that chip will launch some time later than GK104 and because of this it's kept under wraps for now.

This is not the gist of the internet rumors to me. What they say is that GK104 is the mid-high-end chip. Basically it's going to be the 660 Ti, for reference, so just think where the 560 Ti slots now for an idea. GK110 is just supposed to be two GK104s glued together for the new high-end dual-GPU card. So nothing very exciting there.

The alleged super-powered, 512-bit-bus Nvidia chip, GK112, isn't slated for a long time then, perhaps not until early 2013.

Long story short, GK104 should be the most powerful single Nvidia GPU for a long time then. Somewhere far down the road the big beast GK112 would come, and by that time AMD should be readying something new as well.

Current rumors say GK104 might have 768 CUDA cores. That's 50% more than the GTX 580's 512; the HD 7970, for reference, has 33% more SPs than the HD 6970. So indeed, on the surface you might expect GK104 to be somewhat faster than the HD 7970, but nothing explosive. You also have the whole 256-bit bus thing from Charlie, which is probably going to put some kind of cap on GK104 performance, so again it seems unlikely it'll be blowing anything away, but I do bet it's faster than the HD 7970, particularly at lower resolutions.
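
For reference, here's the rough arithmetic behind those percentages as a quick sketch. The 768 figure for GK104 is only the rumor above; the GTX 580, HD 6970, and HD 7970 shader counts are the known shipping specs:

# Rough shader-count arithmetic behind the percentages above.
# The GK104 figure is the rumored spec; the rest are shipping parts.
gtx580_cores = 512    # GTX 580 CUDA cores
gk104_cores  = 768    # rumored GK104 CUDA cores
hd6970_sps   = 1536   # HD 6970 stream processors
hd7970_sps   = 2048   # HD 7970 stream processors

def uplift(new, old):
    """Percentage increase of new over old."""
    return (new / old - 1) * 100

print(f"GK104 vs GTX 580:   {uplift(gk104_cores, gtx580_cores):.0f}% more shaders")  # 50%
print(f"HD 7970 vs HD 6970: {uplift(hd7970_sps, hd6970_sps):.0f}% more shaders")     # 33%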

But this is all just rumors, to be sure, even though GK104 does seem to be shaping up.

Hell, just to confuse things more, I've even heard "GK104 and GK110 are the same chip; Nvidia just gave it two names to weed out leakers."
 
I think the driver issue isn't just (or even primarily) about bugs, it's about features. To me, NV - at this point in time - is superior to AMD in 3 major areas that are often not adequately accounted for in reviews:
  • Speed of support for new games. New games are usually supported on release or very close to it with profiles/fixes/optimization, etc. For AMD this can take longer.
  • Minimum framerates and number of stutters. The Techreport has introduced a new measurement system that moves away from average FPS (which isn't very useful really) to try and capture the "smoothness" of gameplay instead, taking into account effects such as microstutter and framerate distribution over time. The results were generally slightly better -- compared to FPS measurements -- for NV, which confirms a feeling I had for a while.
  • Abilities to enhance IQ. This should be what owning a high-end GPU is all about, and NV offers more options here. Between downsampling, SGSSAA, combined MS/SSAA, CSAA and the different transparency AA modes it's possible to force or improve AA in basically every single game ever. Regarding AF, high quality mode is still unmatched by any mode on AMD (not in terms of angle dependence, but in terms of samples taken and thus flickering). And on top of that you get the option to force SSAO in many games, which often doesn't work that well but is really nice in some.
These are all not really related to driver stability, but driver capability. And they are the reasons why an AMD card would have to be at least 30% or so better in terms of pure framerate/$ for me to consider it at this point in time. (Note that all of this disregards specialty features on both sides, like 3d Vision, Eyefinity, Cuda or Physx)

FWIW, I've used both brands for a long time, but haven't used AMD in my personal gaming desktop since getting a GTX260. I use at least 3 systems at any point in time (home desktop, work desktop, laptop) so I have more data points available.

Excellent post, and I very much agree with it; it's also one of the primary reasons Nvidia's drivers are touted as better, because the above points are very important. I have, however, been on both sides of the fence in the last two years, and the graphical issues with AMD's drivers where I've had to downgrade or just stick with old drivers to avoid problems were numerous. It all went tits-up when I got my 5970, which was the worst gaming experience I ever had. Besides the poor game support and the microstuttering, the slow driver support cycle and waiting around for CF profiles made it downright unbearable.

I've even seen posts here outlining numerous problems with AMD drivers just over the past year or so, so it's just weak to brush aside all arguments by saying it's because RAGE is fresh in memory or people are Nvidia fanboys. AMD drivers do have a lot of problems for a lot of people. Again, nobody is saying Nvidia's drivers are perfect, far from it actually, but their launch support for games and especially their "out of game" support for a wide range of graphical options is great. Just as a heads-up, I have had a lot more ATI/AMD cards in my time than Nvidia cards, so I've had years of exposure to AMD's drivers.

But overall, I think they became really infamous for all the Crossfire issues: the multiple releases in one month, waiting for profiles, etc.
 
Minimum framerates and number of stutters. The Techreport has introduced a new measurement system that moves away from average FPS (which isn't very useful really) to try and capture the "smoothness" of gameplay instead, taking into account effects such as microstutter and framerate distribution over time. The results were generally slightly better -- compared to FPS measurements -- for NV, which confirms a feeling I had for a while.

Those numbers are pretty controversial and nobody can even agree on what to measure; I wouldn't put much stock in them.

For example, they focus a lot on the 99th percentile, but if one GPU is faster 99% of the time, isn't that arguably more important/relevant to the gameplay experience than the slowest 1% of frames? Would you rather be on a GPU that's faster in the slowest 1% or faster in the other 99%? You're going to spend a lot more time in the 99% of frames than in the 1%.
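
To make it concrete, here's a toy sketch of what the two numbers actually measure; the frame times below are made up for illustration, not real benchmark data:

# Toy comparison of average FPS vs a 99th-percentile frame time.
# Made-up capture: mostly smooth ~16.7 ms frames with a few 60 ms hitches.
frame_times_ms = [16.7] * 97 + [60.0] * 3

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# Simple nearest-rank 99th percentile: the frame time that 99% of frames beat.
sorted_times = sorted(frame_times_ms)
p99_ms = sorted_times[int(0.99 * (len(sorted_times) - 1))]

print(f"average FPS:           {avg_fps:.1f}")   # ~55.6, looks fine
print(f"99th percentile frame: {p99_ms:.1f} ms") # 60.0, exposes the hitches

The point of the metric is that two cards can post nearly the same average while one of them spends noticeably more time in those slow frames; whether that matters more than the other 99% is exactly the argument here.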

Anyway, I've seen a lot of microstuttering issues with Nvidia drivers as well (http://www.youtube.com/watch?v=q9s3jhqY6pY), and I think there was a huge issue in BF3.

Generally, from forum anecdotes I get the feeling there are more problems with Nvidia cards (seems like the BF3 PC thread was 70% Nvidia problems), but I bet if I owned one I would have no problems with it either. Though I'm kinda scared to switch, because I KNOW there are no problems with my AMD. But if an Nvidia card ever dominates price/performance again (hopefully GK104!), I'm sure I'd buy it.
 
Speed of support for new games. New games are usually supported on release or very close to it with profiles/fixes/optimization, etc. For AMD this can take longer.

Isn't this because games are developed on PCs with NVidia GPUs most of the time?

Anyway, I hope this is true, if only because it should be good for pricing. That should carry over to the next series of GPUs, at least to some degree, which is when I'll probably be upgrading. I have my doubts it's going to be the 760-level GPUs, though; then again, didn't somebody from NVidia mention a while before launch that their cards were way beyond what the AMD rumors were estimating?
 
All we can go on is this PR from Nvidia themselves: http://www.nvidia.com/object/IO_17342.html

Why would they need to lie about this?

Because it's PR; they were never going to tell the market the truth, which is that they were chosen as a last-minute option because Sony's internal GPU based on Cell (it was a Cell with ROPs and embedded RAM, with the SPEs acting as shaders) was scrapped because the die was huge and the performance sucked (less than a 6800).
So they said that they had been working on the RSX since 2002, which is not completely false, because nVidia had been working on the GF7 generation since 2002. But the deal was made in 2004, RSX development started in summer 2004, and it was delivered in its final form at the end of 2005.
 
Because it's PR; they were never going to tell the market the truth, which is that they were chosen as a last-minute option because Sony's internal GPU based on Cell (it was a Cell with ROPs and embedded RAM, with the SPEs acting as shaders) was scrapped because the die was huge and the performance sucked (less than a 6800).
So they said that they had been working on the RSX since 2002, which is not completely false, because nVidia had been working on the GF7 generation since 2002. But the deal was made in 2004, RSX development started in summer 2004, and it was delivered in its final form at the end of 2005.

You may very well be right about what you say; I guess we'll never know for sure.

I just thought it'd be easier all round not to mention anything in the PR about how long they've been working together than to flat-out lie about it...

As an aside, what do you think the chances of Sony going Nvidia again are?
 
Just because we haven't heard much about it yet doesn't mean it doesn't exist.
Well, I haven't heard much about extraterrestrial life either. What I'm saying is that GK112 isn't part of NV's plans for the Kepler launch. Whoever made up that story about them launching a top-end SKU (GK112) at the end of 2012 was wrong. That's all.

I never said anything contrary to any of that. You probably skipped the part of my post where I said that there is conflicting information being spread inside the rumor mill - one is that there is GK104 launching in March/April and the other is that there is no GK104 and the Kepler launching in March/April is GK110.
GK110 will probably launch in May during the next GTC conference. GK104 will launch some time before the Ivy Bridge launch, because lots of IB notebooks will have Keplers in them. Exactly when depends on market conditions and TSMC's ability to produce enough chips for the market.
 
Well, I haven't heard much about extraterrestrial life either. What I'm saying is that GK112 isn't part of NV's plans for the Kepler launch. Whoever made up that story about them launching a top-end SKU (GK112) at the end of 2012 was wrong. That's all.
Well, you can put yourself under a rock and pretty much believe that <insert anything here> doesn't exist either.

GK110 will probably launch in May during the next GTC conference. GK104 will launch some time before the Ivy Bridge launch, because lots of IB notebooks will have Keplers in them. Exactly when depends on market conditions and TSMC's ability to produce enough chips for the market.
The Kepler launch is not tied to Ivy Bridge (contrary to the rumors and what you are saying); Nvidia will launch as soon as their requirements for launch are fulfilled. They won't sit laying eggs on their brand-new GPU just for fucking Intel, which in another context is still their competitor.
 