
Nvidia announces the GTX 980 Ti | $650 £550 €605

Wag

Member
Dammit! I don't know what's going on, but since I got my second 980 Ti I've been having all sorts of problems. Now my PC is shutting down with only one card in it during desktop use. I think it's dead. 😢
 
Also... Amazon is just plain dirty.



List Price: $1,325.24
You Save: $625.25

What a joke.

Oh that's news to me. Last I checked, this was sold out. Is the G1 considered the "best" 980 Ti out there from those benchmarks or are there better ones?
The various brands all have their pros and cons. The Gigabyte does seem to have one of the best coolers, but it's apparently so heavy that it sags on the motherboard, and it gets a ton of complaints about coil whine.
 

mario_O

Member
In the Netherlands the shops are filled with it... But that might have to do with the fact that MSI has an office here.

So would running games at 1440p/144Hz be a reasonable expectation for this card?



Amazon UK has it for a good price, only £10 more though.

Still waiting for them to get the Gigabyte G1 version though; the MSI one doesn't fit in an Air 240 :(.

I finally got it from Alternate (thanks for the tip-off, sindrom101); they're selling them for 799 euros. Pricey, but I couldn't wait any longer. It will be mine on Monday. :)))))))))))
 
Yeeeaahhh...

Well, the Gigabyte G1 is in stock, but the EVGA SC+ is not in stock.

I want another 980 Ti for SLI. Thinking about getting the Gigabyte. Not a great idea aesthetically, but it would still work fine, right? I mean, there's no reason why it wouldn't. And the Gigabyte would run cooler, so I'd use it in whichever slot normally runs hotter in SLI. Throw the same custom BIOS on both cards to make it easier to hit the same clock speeds....

Someone should talk me out of this fast cuz I'm about to do it :p
 

Rolfgang

Member
Yeeeaahhh...

Well, the Gigabyte G1 is in stock, but the EVGA SC+ is not in stock.

I want another 980 Ti for SLI. Thinking about getting the Gigabyte. Not a great idea aesthetically, but it would still work fine, right? I mean, there's no reason why it wouldn't. And the Gigabyte would run cooler, so I'd use it in whichever slot normally runs hotter in SLI. Throw the same custom BIOS on both cards to make it easier to hit the same clock speeds....

Someone should talk me out of this fast cuz I'm about to do it :p

I assume you have the EVGA SC+ at the moment? That's not really Fashion Souls, man. Get your shit together!
 

Skyzard

Banned
How much do you expect to lose on 2 cards when upgrading to Pascal next year?

Or is it best not to work it out or even approximate it :p
 
I assume you have the EVGA SC+ at the moment? That's not really Fashion Souls, man. Get your shit together!

I never look in my tower unless I have to fix something. It's out of view. I gave up on fashion a looooong time ago. Maybe I'll color coordinate my next build. And maybe by then I won't be so afraid of water cooling and will finally do it.

How much do you expect to lose on 2 cards when upgrading to Pascal next year?

1. I don't expect the first Pascal card to be the 100% performance boost that people seem to think it will be. But I'll be pleasantly surprised if it is and will upgrade to it.

2. I don't really care all that much about how much I'm going to lose. There are probably a lot of better things I could do with $700, but I'm spending it on a single GPU instead...

What about the year after? How much will you lose upgrading to next-gen Pascal/whatever? You can do that forever. If that were the case, why ever buy anything when something new will eventually be better?

Pretty much
 

baphomet

Member
How much do you expect to lose on 2 cards when upgrading to Pascal next year?

Or is it best not to work it out or even approximate it :p

What about the year after? How much will you lose upgrading to next-gen Pascal/whatever? You can do that forever. If that were the case, why ever buy anything when something new will eventually be better?
 
At near-max settings? 144 fps? Hell no. No single card on the market can get anywhere near that goal. You'd have to have dual or even triple SLI to approach that with modern games.

Oh, then what's the fuss about 4K if we can't even do 1440p at 144Hz yet? I don't really get it. What's the big hotness: 4K? 1440p? 144Hz?
 

Rolfgang

Member
I never look in my tower unless I have to fix something. It's out of view. I gave up on fashion a looooong time ago. Maybe I'll color coordinate my next build. And maybe by then I won't be so afraid of water cooling and will finally do it.

Oh, I also want to color coordinate my next PC; currently I'm mixing so many colors it could be the official PC of the LGBT community. But SLI-ing GPUs with two different aftermarket coolers? Nah, that's just a sin according to the Book of Gaben. In the Old and the New Testament!

Oh, then what's the fuss about 4K if we can't even do 1440p at 144Hz yet? I don't really get it. What's the big hotness: 4K? 1440p? 144Hz?

4K is the new hotness, primarily because 1440p is not a TV resolution, so most people aren't even aware of 1440p. But 1440p is the sweet spot for a single 980 Ti at 60+ fps, and paired with a 144Hz G-Sync monitor for that whole smooth range, it's just pure gold.
 

cackhyena

Member
Oh, then what's the fuss about 4K if we can't even do 1440p at 144Hz yet? I don't really get it. What's the big hotness: 4K? 1440p? 144Hz?
I'm curious about this as well. I plan on getting a G1 soon and waiting till the holidays for a G-Sync 1440p monitor. At that resolution, am I going to have to dial stuff back to make it work?
 

Rolfgang

Member
I'm curious about this as well. I plan on getting a G1 soon and waiting till the holidays for a G-Sync 1440p monitor. At that resolution, am I going to have to dial stuff back to make it work?

No, 1440p is the sweet spot for a single 980 Ti. 1080p is old news by now for high-end PC gaming, and 4K is just a step too big (it's 4x the pixels of 1080p).
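To put rough numbers on that (standard 16:9 resolutions assumed):

```python
# Pixel counts relative to 1080p (standard 16:9 resolutions assumed)
base = 1920 * 1080
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```

So 1440p pushes roughly 78% more pixels per frame than 1080p, while 4K pushes four times as many.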
 

baphomet

Member
I'm curious about this as well. I plan on getting a G1 soon and waiting till the holidays for a G-Sync 1440p monitor. At that resolution, am I going to have to dial stuff back to make it work?

Why the need to hit 144 fps? Anything above 90 looks about the same, really. You get a 144Hz G-Sync display so you DON'T have to hit 144 fps and can still have extremely smooth video.
 

cackhyena

Member
No, 1440p is the sweet spot for a single 980 Ti. 1080p is old news by now for high-end PC gaming, and 4K is just a step too big (it's 4x the pixels of 1080p).
Well then, what's digitalrelic talking about? Is it the 144Hz that can't be done?

Why the need to hit 144 fps? Anything above 90 looks about the same, really. You get a 144Hz G-Sync display so you DON'T have to hit 144 fps and can still have extremely smooth video.

Ah, I'm not well versed at all with this stuff. I'm still on a plain old 1080p monitor.
 

Rolfgang

Member
Well then, what's digitalrelic talking about? Is it the 144Hz that can't be done?

Yeah, it's about the refresh rate and fps. In a recent game you aren't going to hit 144 fps at 1440p. With a 144Hz G-Sync monitor you don't have to hit 144 fps to be free of screen tearing; you get a smooth range from 30 to 144 fps. Non-G-Sync 144Hz monitors, however, are meant more for games such as Counter-Strike at a professional level, where you can easily hit 144 fps. But to be honest, those people aren't going for a 1440p panel; they just stick with 1080p.
 

Skyzard

Banned
1. I don't expect the first Pascal card to be the 100% performance boost that people seem to think it will be. But I'll be pleasantly surprised if it is and will upgrade to it.

2. I don't really care all that much about how much I'm going to lose. There are probably a lot of better things I could do with $700, but I'm spending it on a single GPU instead...

Why not 100%? Think they'll hold it back on purpose? Then the SLI guys with more cash won't be as likely to upgrade. Or did they really just rock it with Maxwell...
It's gonna have to be a big benefit if it requires new PC parts too.


I thought you were asking for someone to talk you out of it :p

That's my go-to reason for not getting SLI. That, and "my extra" £600 being disabled in an SLI fix. Plus, who needs more issues? The graphics would be insane when it's all working though; I'm waiting a year for it :p

What about the year after? How much will you lose upgrading to next-gen Pascal/whatever? You can do that forever. If that were the case, why ever buy anything when something new will eventually be better?

About 50% less of a loss with 1 than with 2 :p
 

Rolfgang

Member
Amazon doubles the price on nearly everything and then acts as if you are saving half.
Could depend on what kind of shopper you are though.

I see retail shops do it often when they have stuff on sale. A retailer in the Netherlands acts as if standard editions of games normally sell for 90 euros whenever it throws stuff on sale, banking on the ignorance of people who don't know better; its primary audience is kids and their mothers and fathers. I think it's a scumbag move and plain lying (it's not 50% off, it's only 15% off, for example), and I don't buy anything from them. Maybe only if something is really cheap, but well... that never happens anyway.
 

Skyzard

Banned
Not SLI. Two different brands of 980 Ti. I'm definitely getting two of these cards; it's just that I'm tempted to run a Gigabyte with my EVGA.

I read comments years ago about that not being the best thing to do, but people replied saying it just uses the lowest clocks or whatever anyway, and that was about different cards...

If it's Amazon, they should let you take it back anyway.

Surely SLI would match up all the timings these days, even for voltage or other timings I don't know about - if that makes a difference for syncing?
Does Nvidia have an official stance on it?
 

spicy cho

Member
Anybody have a good guide for overclocking Maxwell cards? I checked my ASIC quality: 78.6% on an EVGA ACX 2.0 SC. I won't go for an OC for a while though, since this thing pounds everything I throw at it at 2560x1600... Might do it in the winter though.
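In the meantime, a minimal sketch of how I keep an eye on boost stability while a stress test runs: poll nvidia-smi (it ships with the driver) and watch whether the core clock holds steady or drops when temps climb. Python just because it's handy; the query fields are standard nvidia-smi ones.

```python
# Poll GPU clocks, temperature and power while a stress test runs.
# Assumes nvidia-smi is on PATH (it installs with the Nvidia driver).
# Stop with Ctrl+C.
import subprocess
import time

QUERY = "clocks.gr,clocks.mem,temperature.gpu,power.draw"

while True:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())  # e.g. "1392 MHz, 3505 MHz, 68, 245.30 W"
    time.sleep(2)
```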
 

Rolfgang

Member
Believe me, I've considered it. I could probably get a decent bit of money for it too, considering it's still very new and has 79.6% ASIC quality.

If you can prove the ASIC quality and stable boost clocks, then you can surely get quite a bit of money for it. I know quite a lot of people who hunt for second-hand GPUs with a proven OC rather than playing the silicon lottery with a new GPU.
 

Skyzard

Banned
Nvidia's stance on it:

Can I mix and match graphics cards from different manufacturers?

Using 180 or later graphics drivers, NVIDIA graphics cards from different manufacturers can be used together in an SLI configuration. For example, a GeForce XXXGT from manufacturer ABC can be matched with a GeForce XXXGT from manufacturer XYZ.

Reading forum posts elsewhere, people are saying it's okay too, but you might want to dig deeper if you're still concerned.


Thing is though, you don't want to get a shitty card that can't keep up. I'd buy from Amazon, in case it's pretty ugly (for you :p ... 1450? ;0). Or maybe that won't matter so much once you're SLI-ing anyway.
 
If you can prove the ASIC quality and stable boost clocks, then you can surely get quite a bit of money for it. I know quite a lot of people who hunt for second-hand GPUs with a proven OC rather than playing the silicon lottery with a new GPU.

How would you prove something like that?

From what I'm told, I wouldn't be able to use the same custom BIOS I'm using on my EVGA with a Gigabyte.

I should really learn how to edit those myself....
 

Qassim

Member
No clue, I just read something about that, but I'm not sure at all.

Won't there be new processors to buy too? A new CPU generation usually means a new motherboard, no?

It's highly unlikely that Pascal will require a new motherboard or any particular new hardware, unless you have a very old system right now (in which case the same would apply to the current lot of cards).
 

Skyzard

Banned
It's highly unlikely that Pascal will require a new motherboard or any particular new hardware, unless you have a very old system right now (in which case the same would apply to the current lot of cards).

What I'm getting from this about NVLink:
http://www.techenablement.com/the-m...-or-hello-pascal-bye-bye-pci-bus-limitations/

Is that we don't know for sure what it will need, but also that it doesn't have benefits (Pascal by itself probably still will, sure) unless you're in SLI, from what I'm reading. It needs a new CPU, which has yet to be made, which will definitely require new mobos, right? To fully take advantage of it all, anyway.
 
What I'm getting from this about NVLink:
http://www.techenablement.com/the-m...-or-hello-pascal-bye-bye-pci-bus-limitations/

Is that we don't know for sure what it will need, but also that it doesn't have benefits unless you're in SLI, from what I'm reading. It needs a new CPU, which has yet to be made, which will definitely require new mobos, right? To fully take advantage of it all...

No. Like a lot of things that Nvidia have announced lately, NVLink is for high-performance supercomputing. Specifically, it's an interconnect for when you're going to put hundreds or thousands of GPUs together, which require communication faster than the PCIe bus. NVLink wouldn't have a real effect on the performance of the GPUs gamers use, because it's not fast enough for real-time rendering, and PCIe isn't really stressed and is never the bottleneck in the vast majority of games.
 

Skyzard

Banned
"Vast majority" doesn't sound good enough tbh (going forward) :p
Also it says it's not just for gpu to gpu though:

interface for CPU to GPU and GPU to GPU point-to-point
Special CPUs with proprietary silicon on-chip interfaces will be able to communicate via NVlink to entirely bypass the PCI bus.

Will that have no benefits for gaming, including non-SLI... is that direct CPU-to-GPU communication purely to sort out the multi-GPU stuff?

If they're going to try and merge the two initially... maybe they can make super-beefy cards once people have switched mobos, or started to.


Wait, by "not fast enough for real-time rendering"... you mean the thing we do now? Not a future technique or anything...
 

paskowitz

Member
If you can prove the ASIC quality and stable boost clocks, then you can surely get quite a bit of money for it. I know quite a lot of people who hunt for second-hand GPUs with a proven OC rather than playing the silicon lottery with a new GPU.

It depends on the customer, though. From what I have been told, a lower ASIC (70-75%) is better for water cooling and LN2. On air, a higher ASIC is better.
 
Anyone else have problems with Gigabyte OC Guru? This thing will not launch no matter what I do. It shows up in Task Manager for a second after launch and then immediately removes itself with no warning whatsoever.
 

Rolfgang

Member
It depends on the customer, though. From what I have been told, a lower ASIC (70-75%) is better for water cooling and LN2. On air, a higher ASIC is better.

True. Although I wouldn't consider 75% a lower ASIC rating; more like <70%.

Anyone else have problems with Gigabyte OC Guru? This thing will not launch no matter what I do. It shows up in Task Manager for a second after launch and then immediately removes itself with no warning whatsoever.

Nope, it just starts as normal, no problems. Are there maybe multiple instances of it in the Task Manager? Sometimes one is already running in the background without starting properly, and then a second one won't start either. If so, close it and start OC Guru again. If not... have you tried turning it off and on again?
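If Task Manager is too quick to catch it, here's a minimal sketch to find and kill a stuck instance; note the process-name match is a guess on my part, so check what the executable is actually called on your machine:

```python
# Find and terminate lingering OC Guru processes by (assumed) name match.
# Requires: pip install psutil
import psutil

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"] or ""
    if "guru" in name.lower():  # assumed match, e.g. "OCGuruII.exe"; verify the real name
        print(f"Terminating {name} (PID {proc.pid})")
        proc.terminate()
```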
 
"Vast majority" doesn't sound good enough tbh :p

It's the polite way of saying "if you're not a goddamned moron". Because technically, if you're filling up a small amount of VRAM running at ultra settings and your VRAM is thrashing like crazy, then yes, having an 80 GB/s bus instead of a 15 GB/s bus would make the game laggy instead of a slideshow.

Also, it says it's not just for GPU-to-GPU though:

Will that have no benefits for gaming, including non-SLI... is that direct CPU-to-GPU communication purely to sort out the multi-GPU stuff?

Well, the bulk of the data in your VRAM is going to be texture data, which needs to be in VRAM before the frame starts to be rendered. The bulk of the data that's streamed to the GPU each frame is a command list, which can be measured in maybe hundreds of megs a second at most. Nvidia GPUs already have a bridge for SLI, so that's not hitting the bus.

So we've really got nothing to stuff down the pipe that's overwhelming x16 3.0, and high-end graphics cards are already stuffed with VRAM, allowing gobs of high-resolution textures.
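Rough numbers, just to put that in perspective (the command-stream figure is my own ballpark, not a measurement):

```python
# How much of a PCIe 3.0 x16 link would a per-frame command stream actually use?
pcie3_x16_gb_s = 15.75        # ~15.75 GB/s per direction for PCIe 3.0 x16
cmd_stream_mb_s = 300         # assumed "hundreds of megs a second" of command traffic
utilisation = cmd_stream_mb_s / (pcie3_x16_gb_s * 1000)
print(f"Command stream uses about {utilisation:.1%} of the link")  # ~1.9%
```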

So I suppose the question is: how would a medium-bandwidth interconnect help in the rendering process? Short answer: it probably wouldn't. I mean, if you want to believe we're all going to be forced to buy new motherboards for the sake of having some controversy, go nuts. I can't really stop you. But when Nvidia says "GPU-Based Server":

[Nvidia slide: "GPU-Based Server" NVLink performance figures]

when they're talking about NVLink performance, I tend to believe this isn't for us mere mortals.
 

Skyzard

Banned
Only a small amount of VRAM? Why is that the limitation for NVLink... it doesn't allow for larger data? I don't get what I'm missing... is there a trade-off to using NVLink?

Can't we get, like, ultra-mega textures with that technology... coupled with future cards and games that might be more demanding?
 
Only a small amount of VRAM? Why is that the limitation for NVLink... it doesn't allow for larger data? I don't get what I'm missing... is there a trade-off to using NVLink?

Can't we get, like, ultra-mega textures with that technology... coupled with future cards and games that might be more demanding?

You gotta keep in mind that the assets used in a scene for a frame remain, for most practical purposes, relatively constant. That's why we have large VRAM pools with ultra-high throughput. Keep in mind that when you're gunning for 60 fps you need to pull the entire scene together in 16.667 ms. So we make our memory bandwidth as high as possible (hundreds of gigabytes per second) so that we can do this repetitively, and we make our frame buffers big enough to hold all the assets we might need for a scene at the resolution and frame rates we're targeting. VRAM might as well literally be one giant L3 cache for the GPU.
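The arithmetic is easy to sanity-check; the ~336 GB/s figure below is the 980 Ti's spec memory bandwidth, and the PCIe number is the usual x16 3.0 rating:

```python
# Per-frame budgets at 60 fps for a 980 Ti-class card (spec numbers, rounded)
target_fps = 60
vram_gb_s = 336               # 980 Ti memory bandwidth, ~336 GB/s
pcie3_x16_gb_s = 15.75        # PCIe 3.0 x16, per direction

frame_budget_ms = 1000 / target_fps
vram_per_frame_gb = vram_gb_s / target_fps

print(f"Frame budget: {frame_budget_ms:.3f} ms")                       # 16.667 ms
print(f"VRAM traffic available per frame: {vram_per_frame_gb:.1f} GB") # 5.6 GB
print(f"VRAM vs PCIe 3.0 x16: {vram_gb_s / pcie3_x16_gb_s:.0f}x")      # ~21x
```

Which is the whole point: the working set has to already live in VRAM, because no external link comes remotely close to feeding the GPU fresh data every frame.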

HPC compute workloads involve both streaming large amounts (terabytes, maybe petabytes) of ever-changing data through the vector units as fast as possible and getting the results of a calculation to possibly another unit to be worked on, as fast as possible. In those situations a higher-bandwidth interconnect is invaluable.
 

Skyzard

Banned
I was thinking of like really open-world games with fast vehicles... maybe more PhysX stuff?

Obviously not something your average game would use, but when does that stop them from adding new stuff if they can advertise it... I guess if it costs a lot for each one to add extra to it... I get that they're advertising it for server stuff now... still, though. Maybe with the next Pascal series they'll make their own game for it and bring in a big crowd to PC gaming :p.
 
I was thinking of like really open-world games with vehicles... maybe more PhysX stuff?

Obviously not something your average game would use, but when does that stop them from adding new stuff... I guess if it costs a lot for each one to add extra to it... I get that they're advertising it for server stuff now... still, though.

Again, stuff like that is GPU-performance-limited rather than interconnect-limited. Maybe if Geralt's hair were saturating the PCIe bus it might help, but it doesn't.
 

Skyzard

Banned
Well, that sucks. Would have been nice to have a solution for HairWorks ;). Nah, I'm not buying a PhysX card :p. They'll never put that many hairs in...

Nvidia needs to start a game dev team. We don't need theatre, just gameplay and graphics :) I guess they could sponsor some stuff, but people would probably still go for other settings with proper SLI if it was great. Even with their own game... nvm.

I was thinking more of a game with shitloads of textures that you need at a fast rate, and maybe more PhysX stuff being possible, even if they were small calculations...?

I dunno, anyway. I would thank you for bearing with me, but you did flip out a bit :p
 
Anyone on these boards having issues with coil whine and the G1? I ordered one but haven't gotten all the parts for my build yet, so I don't know if I'm affected or not.
 