
Nvidia announces the GTX 980 Ti | $650 £550 €605

Watto

Neo Member
The only downside I can see to the ACX is that you will be chucking a lot of hot air back inside the case. I'm sure there are people here who know a lot more about this than I do who will know whether that's an issue or not.
 
If the reviews are accurate, then the 980Ti is the i5-2500k of videocards.

For what? Eight months? The 2500K of GPUs is most certainly, unarguably going to be some kind of Pascal variant, probably the 1070. This card is still very expensive. I'd argue that right now the Gigabyte 970, when overclocked (and with 50 off from Amazon lol), is still the best deal around for 1080p AAA gaming plus 4K elsewhere.

I have gone Gigabyte for my last two cards, and their coolers and designs are the best out there IMHO. Fucking love the Windforce 3 designs.
 
Dumb installation question here: I had two 8-pin power cables going from my PSU to my 780, but the 980 Ti only requires one 8-pin and one 6-pin... Can I just let the spare cable hang loose in the case and plug the remaining power cables into the 980 Ti willy-nilly? Or do I have to get different cables, and do I have to make sure which pin is which before I plug them in?
 

Watto

Neo Member
Dumb installation question here: I had two 8-pin power cables going from my PSU to my 780, but the 980 Ti only requires one 8-pin and one 6-pin... Can I just let the spare cable hang loose in the case and plug the remaining power cables into the 980 Ti willy-nilly? Or do I have to get different cables, and do I have to make sure which pin is which before I plug them in?

I have two pins unused in my current setup and it doesn't seem to be causing any issues.
 
Dumb installation question here: I had two 8-pin power cables going from my PSU to my 780, but the 980 Ti only requires one 8-pin and one 6-pin... Can I just let the spare cable hang loose in the case and plug the remaining power cables into the 980 Ti willy-nilly? Or do I have to get different cables, and do I have to make sure which pin is which before I plug them in?

You can just let one row hang loose. This is what I had to do on my original GTX 980. No problems.
 

jiggles

Banned
It looks like today is not my day. Damn. It wouldn't be so bad, but I paid up front. Now I'm over a grand lighter and nothing but buyer's remorse to show for it.
 

dmr87

Member
Now that the Gigabyte G1 exists I want that. I need that.

Luckily I didn't pull the trigger on a GTX 980 Ti yet.

Just wait for more reviews, no need to rush. Still need to see Asus, MSI and Zotac's cards.

 

Trojita

Rapid Response Threadmaker
So do I keep the stock EVGA GTX 980 Ti I got from Amazon for $588 after tax (plus I was able to use $200 worth of gift cards), or should I get the G1 GTX 980 Ti from Newegg?

Decisions, decisions...

I could always use that gift card balance for something else...
 
So do I keep the stock EVGA GTX 980 Ti I got from Amazon for $588 after tax (plus I was able to use $200 worth of gift cards), or should I get the G1 GTX 980 Ti from Newegg?

Decisions, decisions...

I could always use that gift card balance for something else...

Honestly, there's so little reason to go and spend more money. I mean, fine if it's just lying around, but then again, why did you get the other one knowing that non-ref cards were coming out? Does not compute, really.
 

Theonik

Member
So do I keep the EVGA GTX 980 TI Stock I got from Amazon for $588 after tax plus I was able to use $200 worth of giftcards or should I get the G1 GTX 980 TI from Newegg?

Decisions decisions.................

I could always use that giftcard balance for something else.......
You could sell it for a profit right now, could you not? The step up to the G1 should be small then.

Honestly, there's so little reason to go and spend more money. I mean, fine if it's just lying around, but then again, why did you get the other one knowing that non-ref cards were coming out? Does not compute, really.
Because he got it $70 under RRP? Then got $200 as a discount.
 

cackhyena

Member
Honestly, there's so little reason to go and spend more money. I mean, fine if it's just lying around, but then again, why did you get the other one knowing that non-ref cards were coming out? Does not compute, really.

Non-ref? Sometimes I think I know a decent amount about computers and their inner workings, and then I visit a forum. Is that just another term for souped up?
 
Non-ref? Sometimes I think I know a decent amount about computers and their inner workings, and then I visit a forum. Is that just another term for souped up?

Non-reference. The reference cards are the ones built to the reference spec with the reference cooler. Non-reference cards are custom designs from the different manufacturers with custom cooling solutions. There is typically a gap of a few weeks between the launch of the reference cards and the launch of the non-reference cards.
 

Rolfgang

Member
Non-ref? Sometimes I think I know a decent amount about computers and their inner workings, and then I visit a forum. Is that just another term for souped up?

Reference cards are the most basic form, and they are all the same no matter which manufacturer they come from (MSI, EVGA, ASUS, etc.). Non-ref(erence) cards are changed in one or more ways, such as a factory overclock, custom coolers, no voltage limit, etc., and there can be a lot of difference between them.
 
I think everyone needs to cool down a little bit on that G1 card... I mean, if you REALLY don't want to OC your card yourself, then I can see the appeal. But really, nearly ANY card with an aftermarket cooler should be hitting the same boost clocks, and therefore the same performance, as the G1. I personally wouldn't pay the $50 extra for it.

G1 vs reference cooler, however, is a no-brainer.
 

Trojita

Rapid Response Threadmaker
You could sell it for a profit right now, could you not? The step up to the G1 should be small then.


Because he got it $70 under RRP? Then got $200 as a discount.

Maybe. I'm not great at selling things, though.

And yeah, I got it because it was $70 under and I didn't think the non-refs were going to be THAT much better. But then again, they were only comparing them to a stock card, so who knows.
 
I really want the Gigabyte G1, but I will
with the help of some pills
wait for the Fury cards. That will drop the price and increase availability for sure.
 

Zexen

Member
Just ordered one from EVGA; currently waiting for new stock to get another one. Do we know if the ACX's PCB is the same as the Titan X/980 Ti? I suppose it is, but I'd prefer to be sure about it.
 

datamage

Member
As a first-time EVGA owner, I'm pretty happy with my card. Managed to get it to a boost of 1444 MHz without adding any voltage. (Appears to be stable, as I played GTA for several hours and ran some benches for quite some time without a hiccup.) Load temp was about 75ish.


 
How hot is acceptable under GPU-Z?

A short in-game run at standard 1080p peaked at 70 degrees Celsius (158F).

I've never measured temps before and have no idea about standards and limits. Is it a good idea to benchmark the card just to take it to the max and see if everything is running OK?
 

Rolfgang

Member
How hot is acceptable under GPU-Z?

A short in-game run at standard 1080p peaked at 70 degrees Celsius (158F).

I've never measured temps before and have no idea about standards and limits. Is it a good idea to benchmark the card just to take it to the max and see if everything is running OK?

Fans usually start working at 60 degrees Celsius, and the card throttles at around 85-90 degrees Celsius. At least, that's what I thought it was; I might be completely wrong.

Benchmarking is always a good idea, just to see what maximum temperature it can handle.
 
Fans usually start working at 60 degrees Celsius, and the card throttles at around 85-90 degrees Celsius. At least, that's what I thought it was; I might be completely wrong.

Benchmarking is always a good idea, just to see what maximum temperature it can handle.

How do you know what's the max temp it can handle? Will the benchmark tell you?
 

Rolfgang

Member
How do you know what's the max temp it can handle? Will the benchmark tell you?

You have to check the performance. When it hits a certain temperature, performance drops back hard to prevent the GPU from melting. When you see that happening, you will know at which temperature to stop.

The best way to check this is to overclock it step by step, I think. Maybe someone knows a piece of software that can check it.
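
If you'd rather watch the numbers than eyeball the framerate, here is a minimal logging sketch, assuming an Nvidia card with nvidia-smi on the PATH; the one-second polling interval is arbitrary. A temperature that plateaus while the graphics clock starts falling is the usual sign the card has begun to throttle.

import subprocess
import time

# Fields to poll from nvidia-smi once per second while a benchmark runs
# in another window. Assumes a single GPU.
QUERY = "temperature.gpu,clocks.current.graphics,utilization.gpu"

def sample():
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    temp_c, clock_mhz, util_pct = (int(x) for x in out.split(", "))
    return temp_c, clock_mhz, util_pct

if __name__ == "__main__":
    while True:
        temp_c, clock_mhz, util_pct = sample()
        print(f"{temp_c} C | {clock_mhz} MHz | {util_pct}% GPU load")
        time.sleep(1)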
 

dr_rus

Member
What worries me about that Gigabyte OC card is the lack of vents on the back due to the extra ports. Also, one more fan on the card is one more thing that could possibly fail. I'll take the Classified with one less port, a vent, and slightly lower clocks.

This is likely to be my third WindForce 3X card, and I've had zero problems with the previous two (a 770 4GB and a 970 G1). Then again, I have a Fractal Design full tower with six or so case fans.
 

spicy cho

Member
So do I keep the stock EVGA GTX 980 Ti I got from Amazon for $588 after tax (plus I was able to use $200 worth of gift cards), or should I get the G1 GTX 980 Ti from Newegg?

Decisions, decisions...

I could always use that gift card balance for something else...
I don't understand the reasoning here. Just keep it stock, or drop an EVGA hybrid kit on it for $100, which will beat any air cooler. Sell/return a 980 Ti to buy a 980 Ti? wat
 

Trojita

Rapid Response Threadmaker
Now that I've had a chance to read the Guru3D article, I know why the numbers don't seem to make sense.

That isn't the stock overclocked card being represented on those graphs; Guru3D further overclocked the card via Gigabyte's provided OC software.

I love to see these devices being pushed to their performance limits, but leave that for one part of the article; don't include those numbers in a comparison against the other GPUs that you are leaving at stock.

Those graphs are just going to be posted without any context.
 
Is it feasible to SLI a 980 Ti with a Titan X? I have a Titan X and want more GPU power, but I definitely don't want to spend another $1200 for a second Titan.
 

dr_rus

Member
Now that I've had a chance to read the Guru3D article, I know why the numbers don't seem to make sense.

That isn't the stock overclocked card being represented on those graphs; Guru3D further overclocked the card via Gigabyte's provided OC software.

I love to see these devices being pushed to their performance limits, but leave that for one part of the article; don't include those numbers in a comparison against the other GPUs that you are leaving at stock.

Those graphs are just going to be posted without any context.

The 980 Ti G1 has two stock OC levels. Both reviews of the card are showing the highest level, which is understandable since the card runs perfectly fine on it. I kinda don't know why there is a second, slower "Gaming" level, actually; the boost clock gap between the two is only about 4%.

OC Mode – GPU Boost Clock: 1291 MHz, GPU Base Clock: 1190 MHz
Gaming Mode – GPU Boost Clock: 1241 MHz, GPU Base Clock: 1152 MHz
 

Everdred

Member
The easiest way is to run a hardware monitoring program like Afterburner/RivaTuner and see what usage you're getting from your CPU/GPU. You're ideally looking for as close to 100% GPU usage as possible, assuming you're not framerate limited (e.g. by VSync). If you're significantly under, you might be CPU bottlenecked. CPU usage close to 100% will be the clue ;)

Now, to answer your specific question about the 2500K: it depends on the resolution and refresh rate.

At 1080p and 60Hz, neither the 2500K nor the 980 Ti is likely to be maxed by games in the near future (you're better off stopping at a 980 here). At 1080p and 120/144Hz, though, your CPU could become a limiting factor in certain CPU-heavy games, particularly once you start pushing triple-figure framerates.

The best way to guarantee taking your CPU out of the equation is to run at 1440p+ or downsample, and make that 980 Ti work for its living. Remember: the GPU draws the pages (graphics), while the CPU turns the pages (framerate). If your GPU is drawing each page faster than the CPU can turn them, then you are CPU limited. At higher resolutions the GPU has a heck of a lot more to draw, meaning the CPU can keep up!
Great explanation, thank you.
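
For anyone who wants to sanity-check the usage numbers described above outside of Afterburner's on-screen overlay, here is a minimal sketch, assuming nvidia-smi is available and the third-party psutil package is installed; the 95% thresholds are arbitrary.

import subprocess

import psutil  # third-party: pip install psutil

def gpu_utilization():
    """Read GPU utilization (percent) from nvidia-smi; assumes a single GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return int(out)

if __name__ == "__main__":
    # Sample for about 30 seconds while the game runs: near-100% GPU usage means
    # the GPU is the limit, while low GPU usage with a pegged CPU core suggests
    # a CPU bottleneck.
    for _ in range(30):
        gpu = gpu_utilization()
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        busiest_core = max(per_core)
        if gpu >= 95:
            verdict = "GPU-bound"
        elif busiest_core >= 95:
            verdict = "likely CPU-bound"
        else:
            verdict = "neither maxed (frame cap / VSync?)"
        print(f"GPU {gpu:3d}% | busiest CPU core {busiest_core:5.1f}% | {verdict}")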
 