
Nvidia announces the GTX 980 Ti | $650 £550 €605

Nvidia's stance has always been that boost clocks are a bonus, not a guarantee. This is one reason why overclocking in the Kepler/Maxwell era is so infuriating: the card is hard-coded to throttle back on its own once it hits certain Nvidia-imposed TDP and thermal limits. The only thing you're guaranteed is the base 3D clocks, and the reference cooler has no problem maintaining those.
I suppose that's my point. Non-reference cooling solutions typically ensure the boost clocks remain stable, considering they keep operating temps below the thermal limit.
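If you want to actually watch that throttling happen, you can poll the driver while a game runs. A minimal sketch using the pynvml bindings (this assumes the pynvml package is installed and that your driver exposes throttle reasons through NVML; support on older Maxwell-era drivers may vary):

import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)    # MHz
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # deg C
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
        # Bitmask of currently active throttle reasons (power cap, thermal slowdown, ...)
        mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
        print(f"{clock} MHz | {temp} C | {watts:.0f} W | throttle mask {mask:#x}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

If the mask goes nonzero right as the card reaches its power or temperature target, that's the hard-coded limiter kicking in.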
 

mkenyon

Banned
Nvidia's stance has always been that boost clocks are a bonus, not a guarantee. This is one reason why overclocking in the Kepler/Maxwell era is so infuriating: the card is hard-coded to throttle back on its own once it hits certain Nvidia-imposed TDP and thermal limits. The only thing you're guaranteed is the base 3D clocks, and the reference cooler has no problem maintaining those.
Unless you just flash with a new BIOS.....
 
Unless you just flash with a new BIOS.....

I was able to hack my 970's BIOS to lift the TDP limit, and I messed with the fan profile to reduce the impact of the thermal limit. But I did not find any way to lift the maximum core voltage, and eventually the card would still throttle back from the boost clock I set: I set a boost clock of 1450 or so and it throttled back to 1360 or something.

But the vast majority of people aren't going to hack their video card's BIOS to overclock; that is another level entirely, so my point stands pretty well.
 

thelastword

Banned
"The entire industry is running up against the laws of physics, but, like...what if it weren't? That'd be cool."

Also, watching you raise the price of the Titan X in successive comments from $1k to $1350 to $1500 is a riot.

Also also, talking about AMD innovation as they put out the 300 series as yet *another* rebrand of past chipsets is pretty great. Fiji/Fury will be fun to see, though!
Strides are being made, 14nm is on the horizon mainly due to Samsung (a great hardware company btw), yet you maintain we're kicking against the pricks. Technology never stops; you know what stops sometimes? Innovation and thinking outside the box. All I ask is that companies concentrate on giving me the next revolutionary GPU instead of nickel-and-diming me on old, brute-force tech.

As for the prices, you do realize that the Titan Z had an MSRP of $3000.00 at launch? Do you also realize that its performance against much lower-priced cards was a crying shame? How do you think the Titan Z compares to the Titan X, or even this new 980 Ti? You want some perspective? Read this up. That's where I'm coming from; btw, look up current prices of the Z at least a year later.


A better example: let's take an old game, like Half-Life 1, port it to PS4, and have it running at 4K. Say it runs at 30 fps. So the PS4 can't even run a game from 1999 at more than 30 fps?

It's not because the PS4 is weak; it's because 4K is a huuuge resolution.

If you think the card is too expensive that's fine, it's just the "can't even run a game from 2 years ago" logic that bothers me. Settings matter, a lot.
That's extreme, but please understand that the PS4 can only output at 1080p for video games; in that respect, there isn't any game from 1999 that won't run at 60fps at that resolution on the PS4. Due to the large VRAM pool, I'm also inclined to say that running 1999 games at 4K is indeed possible on the PS4. Don't forget that the PS3 was able to render Okami at 4K internally and then downsample; this is just to put things into perspective, btw.
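To put the "huuuge resolution" point in numbers, here's a quick sketch of the raw pixel counts (plain Python; nothing assumed beyond the standard resolutions):

# Pixels per frame at common resolutions, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.1f}x 1080p)")
# 4K means shading exactly 4x the pixels of 1080p every single frame,
# before any AA or post-processing costs are even counted.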

Why are you surprised that a game that was made for future GPUs even at 1080p does not run at 4K 60fps with MSAA?

The answer is obvious; your acting obtuse about "how could this be?" is really transparent.
Surprised? So the games I buy now were made for future GPUs and not for me to enjoy to the fullest on my current hardware (not even the best hardware available when said game is released)? This is absurd. So how many generations of GPUs must I wait, and how much money must I spend in between, before I am able to play a game in the best way the developer deemed? People have been waiting over two years for Crysis 3, yet not even $3000.00 cards can get them close. My my my, people have no expectations from these cards after all; is it just a money pit to say you've spent that much? Such logic baffles me to no end.
 
2560x1600 strikes me as an odd resolution, but those are still pretty good benchmarks. The only one of those I may play is DA:I (finishing it), and when I build my PC around Oct/Nov I want to get a G-Sync monitor... so if I'm at 1440p-ish and don't tend to play the Crysises of the world, I'd be pretty golden?

There's that devil that wants to have only the latest stuff saying, well, we'd be a few months away from the 1000 series, but I don't think they're likely to be leagues better.
 
Surprised? So the games I buy now were made for future GPUs and not for me to enjoy to the fullest on my current hardware (not even the best hardware available when said game is released)? This is absurd. So how many generations of GPUs must I wait, and how much money must I spend in between, before I am able to play a game in the best way the developer deemed? People have been waiting over two years for Crysis 3, yet not even $3000.00 cards can get them close. My my my, people have no expectations from these cards after all; is it just a money pit to say you've spent that much? Such logic baffles me to no end.
Welcome to ultra-high-end PC gaming. 4K monitors are massive in terms of pixels, games like Crysis 3 are made for future tech, MSAA is super expensive, and 60FPS under these conditions is incredible (in the CryEngine documentation, they speculate about 15TF GPUs for CryEngine at 4K). Keep your expectations in check.

This is basically how it has always been for power-hungry games. Back in the day, games scaled into the future... not all do now, though, of course.
 

thelastword

Banned
Welcome to ultra-high-end PC gaming. 4K monitors are massive in terms of pixels, games like Crysis 3 are made for future tech, MSAA is super expensive, and 60FPS under these conditions is incredible (in the CryEngine documentation, they speculate about 15TF GPUs for CryEngine at 4K). Keep your expectations in check.

This is basically how it has always been for power-hungry games. Back in the day, games scaled into the future... not all do now, though, of course.
I will keep my expectations in check when they no longer ask $500-$3000 from me while no card within that range of expenditure can give me the returns I desire. Absolutely none.
 

Caayn

Member
So who is going to SLI these puppies?
Raises hand. I caved in :( Ordered two EVGA SC with the reference cooler.

Now all I need to do is wait for them to be delivered (only god knows when that will be) and wait for the EVGA 2-way long V2 SLI bridge to become available.

Reading some posts makes me think that I should've bought a PS4 instead.
 
I will keep my expectations in check when they no longer ask $500-$3000 from me while no card within that range of expenditure can give me the returns I desire. Absolutely none.

In short, you want a card that maxes out all current games (regardless of whether they're built to scale with future hardware), at a resolution ranging from 1080p to 4k (depending on who you're arguing with), for less than $500.

I want a car that does 0-60 in 3 seconds, gets 50 MPG, and is $20k. Clearly, Honda needs to stop nickel-and-diming me, and work on their software optimizations.
 

Jin

Member
Surprised? So the games I buy now were made for future GPUs and not for me to enjoy to the fullest on my current hardware (not even the best hardware available when said game is released)? This is absurd. So how many generations of GPUs must I wait, and how much money must I spend in between, before I am able to play a game in the best way the developer deemed? People have been waiting over two years for Crysis 3, yet not even $3000.00 cards can get them close. My my my, people have no expectations from these cards after all; is it just a money pit to say you've spent that much? Such logic baffles me to no end.

Not every game is built the same. Just because the 980 can't max out ONE game doesn't mean it can't max out other games; performance across different games varies. Even if you can't max everything out, the game still looks damn good. And I have no idea what you mean by expectation: the 980 Ti has been shown to increase FPS by up to 20 compared to the 980. That's a pretty big jump.
 
I will keep my expectations in check when they no longer ask $500-$3000 from me while no card within that range of expenditure can give me the returns I desire. Absolutely none.

You seem to be in dire need of some perspective on your expectations:

Here is how Doom 3 (a game that isn't nearly as future-scaling as Crysis 1 or 3) ran in 2004:
[Doom 3 benchmark chart]


The 6800 Ultra @ ~54 GFLOPs runs it at the common resolution at release @ 37 FPS.

To stay consistent with your expectations: getting that to run at 4x the pixels (3200x2400) @ 60 fps average would require A LOT more GFLOPs, and a card quite a bit more powerful than an 8800 GT (which came out 3 years later and could only play the game at 16:10 1600p @ 87 fps).

The situation you are asking for has never ever existed for ultra high end graphical games. Never.
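To make that extrapolation explicit, here is the back-of-envelope arithmetic, treating required throughput as roughly linear in pixels times framerate. This is a crude first-order model (not how GPUs really scale), and it assumes the chart's baseline resolution was 1600x1200, since 3200x2400 is called 4x; it's just to show the order of magnitude:

# Baseline from the Doom 3 chart above: ~54 GFLOPs for 37 FPS (assumed 1600x1200).
baseline_gflops = 54.0
baseline_fps = 37.0
pixel_factor = (3200 * 2400) / (1600 * 1200)  # 4x the pixels
target_fps = 60.0

required = baseline_gflops * pixel_factor * (target_fps / baseline_fps)
print(f"~{required:.0f} GFLOPs needed")  # ~350 GFLOPs, roughly 6.5x the 6800 Ultra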
 

thelastword

Banned
In short, you want a card that maxes out all current games (regardless of whether they're built to scale with future hardware), at a resolution ranging from 1080p to 4k (depending on who you're arguing with), for less than $500.

I want a car that does 0-60 in 3 seconds, gets 50 MPG, and is $20k. Clearly, Honda needs to stop nickel-and-diming me, and work on their software optimizations.
Ok, let's put it this way: if I'm spending $500, I should be able to get 1080p 60fps maxed out. If I'm spending $650-$750, I should be able to get games maxed out at 1440p and 2560x1600. If I'm spending $1000 and above, I should be able to get 4K maxed out.

The problem is that I can't get 1440/1600 maxed out in certain games whilst paying up to $3000.00 on the high end, much less at 4K; just forget about it. In essence, $3000 won't get me to play Crysis 3 maxed at 60fps at 4K. $500 won't get me to play Crysis 3 maxed out at 60fps at 1080p. My question is, why am I paying all that money on the high end then? Why should I buy a $3000 card to play Crysis 3 at 4K 60fps at medium/low settings or a mix?
 

HooYaH

Member
Ok, let's put it this way: if I'm spending $500, I should be able to get 1080p 60fps maxed out. If I'm spending $650-$750, I should be able to get games maxed out at 1440p and 2560x1600. If I'm spending $1000 and above, I should be able to get 4K maxed out.

The problem is that I can't get 1440/1600 maxed out in certain games whilst paying up to $3000.00 on the high end, much less at 4K; just forget about it. In essence, $3000 won't get me to play Crysis 3 maxed at 60fps at 4K. $500 won't get me to play Crysis 3 maxed out at 60fps at 1080p. My question is, why am I paying all that money on the high end then? Why should I buy a $3000 card to play Crysis 3 at 4K 60fps at medium/low settings or a mix?

I don't think we'll get 4K at a steady 60fps in today's games with highest settings on a single GPU until the 2017 GPU lineup.
 

Skyzard

Banned
Lol oh no, I can't get Crysis 3 maxed out at 4K 60fps on a single card that costs £550. FML.

Asking for like 3x too much, really. It's extremely good price/performance... and compare it to what alternative? The latest consoles? They'd melt if they could even attempt it... tbh it doesn't seem all that worth it for Crysis 3 anyway; it was mostly done for consoles, wasn't it, cross-gen at that? Nope, last gen.
 
Ok, let's put it this way: if I'm spending $500, I should be able to get 1080p 60fps maxed out. If I'm spending $650-$750, I should be able to get games maxed out at 1440p and 2560x1600. If I'm spending $1000 and above, I should be able to get 4K maxed out.

The problem is that I can't get 1440/1600 maxed out in certain games whilst paying up to $3000.00 on the high end, much less at 4K; just forget about it. In essence, $3000 won't get me to play Crysis 3 maxed at 60fps at 4K. $500 won't get me to play Crysis 3 maxed out at 60fps at 1080p. My question is, why am I paying all that money on the high end then? Why should I buy a $3000 card to play Crysis 3 at 4K 60fps at medium/low settings or a mix?
You're arguing price/performance in the high-end space; that's why. It's never going to be in your favor. 4K is the benchmark for high-end cards because 4K is high end. At this level you're going to be bleeding money no matter what.
 

bodine1231

Member


Just fired up Witcher 3, and with EVERYTHING including foliage and Hairworks maxed at 1440p I'm getting a solid 60fps in the middle of the largest city and 80+ in the field, smooth as butter with G-Sync as well. With my single Titan X I was getting drops into the 40s in the large city with Hairworks off and 60ish in the field. The one issue so far is that it seems both cards aren't being used at 100%: the top card's usage is in the 60+% range and the bottom's is in the 50s. Anyone know why this is?
 


Just fired up Witcher 3, and with EVERYTHING including foliage and Hairworks maxed at 1440p I'm getting a solid 60fps in the middle of the largest city and 80+ in the field, smooth as butter with G-Sync as well. The one issue so far is that it seems both cards aren't being used at 100%: the top card's usage is in the 60+% range and the bottom's is in the 50s. Anyone know why this is?

SLI scaling can be an odd beast, but if you're seeing 60% and 50%, you're basically getting only a little better than single-card performance.

What is your CPU?

I bet this is the problem though: for the love of all that is holy, MAKE SURE THE GAME IS SET TO FULLSCREEN AND NOT WINDOWED FULLSCREEN OR WINDOWED. (Even with G-Sync, SLI breaks in windowed modes.)
 
Just fired up Witcher 3, and with EVERYTHING including foliage and Hairworks maxed at 1440p I'm getting a solid 60fps in the middle of the largest city and 80+ in the field, smooth as butter with G-Sync as well. With my single Titan X I was getting drops into the 40s in the large city with Hairworks off and 60ish in the field. The one issue so far is that it seems both cards aren't being used at 100%: the top card's usage is in the 60+% range and the bottom's is in the 50s. Anyone know why this is?
Are your drivers up to date?
 


Just fired up Witcher 3, and with EVERYTHING including foliage and Hairworks maxed at 1440p I'm getting a solid 60fps in the middle of the largest city and 80+ in the field, smooth as butter with G-Sync as well. With my single Titan X I was getting drops into the 40s in the large city with Hairworks off and 60ish in the field. The one issue so far is that it seems both cards aren't being used at 100%: the top card's usage is in the 60+% range and the bottom's is in the 50s. Anyone know why this is?

Nice - any issues with SLI? Smooth and glitchless?

Has anyone got Witcher 3 at 4K 60 fps on some insane Ti/TitanX SLI setup?
 
Ok, let's put it this way: if I'm spending $500, I should be able to get 1080p 60fps maxed out. If I'm spending $650-$750, I should be able to get games maxed out at 1440p and 2560x1600. If I'm spending $1000 and above, I should be able to get 4K maxed out.

The problem is that I can't get 1440/1600 maxed out in certain games whilst paying up to $3000.00 on the high end, much less at 4K; just forget about it. In essence, $3000 won't get me to play Crysis 3 maxed at 60fps at 4K. $500 won't get me to play Crysis 3 maxed out at 60fps at 1080p. My question is, why am I paying all that money on the high end then? Why should I buy a $3000 card to play Crysis 3 at 4K 60fps at medium/low settings or a mix?

First off, your arbitrary "this is what I think the price/performance situation should be, and if it isn't the graphics cards makers are RIPPING YOU OFF" is silly. Clearly, there is a market for these high-end cards at their current price/performance ratio.

Secondly, bringing up the "$3k Titan Z" doesn't make sense, as it's A) currently ~$1500, B) old tech, and C) was known as *terrible* value for gaming when it released. Just because Nvidia sold a halo card for way too much doesn't mean the whole market is suspect.

Thirdly, anyone trying to do what you are asking for isn't looking for a single card; they're going SLI. A pair of 980 Tis will get close to Crysis 4K@60 maxed for $1300 (especially if overclocked).

Lastly, at least one site believes that your $650 for Crysis 3 1440p@60 maxed is doable:
[Crysis 3 1440p benchmark chart]
 

Smokey

Member


The one issue so far is that it seems both cards aren't being used at 100%: the top card's usage is in the 60+% range and the bottom's is in the 50s. Anyone know why this is?

SLI works pretty well with W3.

  • Make sure you don't have "GSYNC in Full Screen and Windowed" checked in the NVIDIA Control Panel.
  • Make sure VSYNC is off in game
  • Make sure SLI is enabled in NVCP

The last one is important especially if you are coming from a single GPU. SLI is not automatically enabled just because you added an additional card.
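If you'd rather verify than trust the checkbox, here's a quick sketch that polls per-GPU load with pynvml (assumes the pynvml package is installed; run it while the game is rendering):

import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    gpu = pynvml.nvmlDeviceGetHandleByIndex(i)
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
    print(f"GPU {i}: core {util.gpu}%, memory controller {util.memory}%")
pynvml.nvmlShutdown()
# With SLI actually engaged, both GPUs should report similar, high core load.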
 
Witcher 3 is running very smooth: no micro-stutter or hitching at all. I'm going to try GTA V in a minute once it's finished updating and will report on performance.

That's really impressive given that the game just came out.

Good timing too - W3 and GTA V PC are amazing games and look so good...
 

Jin

Member


Just fired up Witcher 3 and with EVERYTHING including foliage and Hairworks maxed at 1440p I'm getting 60fps solid in the middle of the largest city and 80+ in the field,smooth as butter with Gsync as well. With my single Titan X I was getting drops into the 40's in the large city with Hairworks off and 60ish in the field.The one issue so far is that it seems that both cards aren't being used at 100%. The top card usage is in the 60+% and the bottom is in the 50's,anyone know why this is?

Strange. I'm getting 95+% usage on both cards.
 

mephixto

Banned
Strange. I'm getting 95+% usage on both cards.

Same here on my 2x 970. Also, download GPU-Z and check if both cards are on PCIe 3.0 x16. I was getting some weird performance issues because my mobo only detected the cards at 2.0 x8, and sometimes worse.
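You can also query the negotiated link from code instead of GPU-Z; a sketch with pynvml (current vs. maximum PCIe generation and width, per the NVML API):

import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    gpu = pynvml.nvmlDeviceGetHandleByIndex(i)
    cur_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(gpu)
    cur_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(gpu)
    max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(gpu)
    max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(gpu)
    print(f"GPU {i}: PCIe {cur_gen}.0 x{cur_width} (card max {max_gen}.0 x{max_width})")
pynvml.nvmlShutdown()

One caveat: cards train the link down at idle to save power, so check under load before blaming the mobo.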
 

Skyzard

Banned
Had a Zotac 680 AMP something or other.

The noise was horrendous.

This EVGA 980 Ti HYBRID seems solid. I'm going with that. I just hope it'll perform well enough to get 60 frames in SW Battlefront at 1440p.

Thanks, I think I'll wait on a review or two of the Zotac; otherwise I'll play it safe. Not going water cooling after a previous experience with a Corsair product, though...
 

thelastword

Banned
Not every game is built the same. Just because the 980 can't max out ONE game doesn't mean it can't max out other games; performance across different games varies. Even if you can't max everything out, the game still looks damn good. And I have no idea what you mean by expectation: the 980 Ti has been shown to increase FPS by up to 20 compared to the 980. That's a pretty big jump.
I know, but let's not forget that all these tests are likely done on i7 CPUs with at least 16GB of the latest and best RAM, so even if the game is CPU-limited or close to it, these are still best-case scenarios for the cards. Here's my point though: a $650 card can't play Crysis 3 at max settings at 2560x1600 60fps, and neither can the $3000 card at the very same 2560x1600. We won't even touch the 4K res that the latter is being sold on... it can't even do it at a lower res.

You seem to be in dire need of some perspective on your expectations:

Here is how Doom 3 (a game that isn't nearly as future-scaling as Crysis 1 or 3) ran in 2004:
[Doom 3 benchmark chart]


The 6800 Ultra @ ~54 GFLOPs runs it at the common resolution at release @ 37 FPS.

To stay consistent with your expectations: getting that to run at 4x the pixels (3200x2400) @ 60 fps average would require A LOT more GFLOPs, and a card quite a bit more powerful than an 8800 GT (which came out 3 years later and could only play the game at 16:10 1600p @ 87 fps).

The situation you are asking for has never ever existed for ultra high end graphical games. Never.
Come on Dictator, if there was one game that scaled well to the hardware that existed at the time, it was DOOM 3; back then most people had 1024x768 monitors anyway. I know I did. In any case, here are some DOOM 3 stats from just one year later. Back then I had the 6800 Ultra, but my 8800 GTX Ultra ran a train on Doom just about 2.5 years later, not 3 as you implied, as the GTX and GTS performed just as well and made it to market way before the Ultra, closer to about 2 years after Doom 3. In any case, the 7800 GTX numbers linked speak for themselves just 1 year later.

I don't think we'll get 4K at a steady 60fps in today's games with highest settings on a single GPU until the 2017 GPU lineup.
Cool, if that be the case no GPU should be over $750 at max and that's being slightly too fair imo.
 

Culex

Banned
Damn this card is amazingly well priced.

I really wonder if AMD's new card is going to be like the 9700 from years ago and just humiliate Nvidia.
 
Come on Dictator, if there was one game that scaled well to the hardware that existed at the time, it was DOOM 3; back then most people had 1024x768 monitors anyway. I know I did. In any case, here are some DOOM 3 stats from just one year later. Back then I had the 6800 Ultra, but my 8800 GTX Ultra ran a train on Doom just about 2.5 years later, not 3 as you implied, as the GTX and GTS performed just as well and made it to market way before the Ultra, closer to about 2 years after Doom 3. In any case, the 7800 GTX numbers linked speak for themselves just 1 year later.

Um, I had a 1600x1200 monitor in like 2001.

Secondly, what I am saying is true. Your anecdotal evidence that "an 8800 Ultra ran a train on it" doesn't mean diddly squat. You want Crysis 3 to run at 4K with MSAA at 60fps, when the highest-end card at release couldn't even manage 1080p 60 with MSAA off. I just extrapolated your wishes onto a previous historical example.

I am sorry reality does not align with your "concern" in a PC GPU thread.
Holy shit. The gibberish nonsense is killing me!

Coming to terms with reality induces gibberish at times.
 

lmbotiva

Junior Member
I'm fucked right now: I managed to return the 980 I got about a week before the 980 Ti was announced, but now I have no card and The Witcher 3 is here, looking at me, telling me "what you doing boy". I'm really, really close to buying the reference card, but I'm gonna hold right up until Batman comes out; if by that time no MSI or EVGA C versions have been released, then the reference it is.
 
Any sites comparing the 980 Ti vs a 970 SLI setup, with runs at higher resolutions?

http://www.maximumpc.com/nvidia-geforce-gtx-980-ti-review/

Raise your hand if you were expecting the GTX 970 SLI setup to crush the GTX 980 Ti. Sure, there are definitely cases where the SLI 970 cards win, but there are also times when that single high-end GPU is the better approach. Batman and Witcher don’t appear to scale as well with multiple GPUs, but even in the best scenarios, the 970 SLI configuration is only about 20 percent faster than a 980 Ti. Overall, the average performance advantage of 970 SLI is a paltry 4–8 percent, depending on resolution, with higher resolutions scaling slightly better with SLI.

...

However, looking at average frame rates only tells half the story; the minimum frame rate can be just as important. Minimum frame rates are also why we generally prefer a single fast GPU over two slower GPUs running in SLI/CF. There’s CPU and system overhead associated with SLI/CF, so while it can provide clear benefits to average frame rates, minimum frame rates will often drop.

Case in point: We just mentioned that on average 970 SLI is roughly six percent faster than the 980 Ti and 290X CF is up to 30 percent faster. Switch over the minimum frame rates and the story changes: On average 970 SLI is almost 10 percent slower than a single 980 Ti—16 percent slower at 4K. 290X CF likewise shows an overall decrease in minimum frames per second of 10 percent, and over 20 percent at 1080p, where CPU overhead is creating a bigger bottleneck.
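To see why minimums and averages diverge like that, here's a toy sketch computing both from per-frame render times (the numbers are made up purely to illustrate the effect):

# Hypothetical frame times in ms: the SLI pair has better throughput but worse spikes.
single = [16.7] * 95 + [20.0] * 5
sli = [14.0] * 95 + [40.0] * 5

def avg_fps(frame_ms):
    return 1000.0 * len(frame_ms) / sum(frame_ms)

def min_fps(frame_ms):
    return 1000.0 / max(frame_ms)  # the slowest frame sets the minimum

for name, data in (("single GPU", single), ("SLI pair", sli)):
    print(f"{name}: avg {avg_fps(data):.1f} fps, min {min_fps(data):.1f} fps")
# single GPU: avg ~59 fps, min 50 fps
# SLI pair:   avg ~65 fps, min 25 fps -> higher average, much worse minimum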
 
I'm fucked right now: I managed to return the 980 I got about a week before the 980 Ti was announced, but now I have no card and The Witcher 3 is here, looking at me, telling me "what you doing boy".

Run it on your CPU's integrated graphics.

Leave Hairworks on, watch the world burn
 
I'm fucked right now: I managed to return the 980 I got about a week before the 980 Ti was announced, but now I have no card and The Witcher 3 is here, looking at me, telling me "what you doing boy". I'm really, really close to buying the reference card, but I'm gonna hold right up until Batman comes out; if by that time no MSI or EVGA C versions have been released, then the reference it is.

Relax, you're not playing TW3 while they are patching out all the worst bugs. By the time you get to play it, it will be (not really) bug-free!
 