Nvidia GTX 980/970 3DMark Scores Leaked- from Videocardz.com



Gotta have that VRAM.

That is why I want this: https://www.youtube.com/watch?v=KnrxNfxRK_4


You get two 27" screens in one with no bezel and only a 20% performance penalty over 2560x1440. I like the cost of this compared to going dual-screen 4K at 27/30". And the bezel... grrr...
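For a rough sense of the extra rendering load, here's a quick sketch. The post doesn't name the panel, so the 3440x1440 resolution is an assumption (it's the common 34" 21:9 ultrawide of that era):

```python
# Rough pixel-count comparison. The 3440x1440 figure is an assumption;
# the post doesn't name the exact monitor model.
std_1440p = 2560 * 1440   # standard 27" 1440p panel
ultrawide = 3440 * 1440   # 21:9 "two screens in one" panel
extra = ultrawide / std_1440p - 1
print(f"{extra:.0%} more pixels")  # prints "34% more pixels"
```

GPU load doesn't scale perfectly linearly with pixel count, which is consistent with the quoted penalty (~20%) being smaller than the ~34% pixel increase.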
 

That's sort of what you're seeing in that picture: the center monitor is a 21:9 29" screen, the outside monitors are 16:10, and the total resolution is 6400x1080. I don't like bezels much either, but with the 21:9 screen there, the bezels are much farther apart than before, when my center screen was 16:10.
 
Sooooo, is SLI a good or bad idea these days? I really need a new card ASAP, so I doubt I'll be able to wait for the reference 980 Ti/Titan 2. In all likelihood I'll grab a 980 when it's available and then maybe another one around when The Witcher 3 comes out.

Does it effectively double my power (I know VRAM doesn't get doubled), or do games have to have proper SLI profiles patched in to take advantage of running dual cards?
 
I have SLI'd 670s and it looks like I may still be waiting on an upgrade: the 980 looks to beat them out, but not by a large enough margin. More VRAM would be nice, but I only play at 1080p currently. Hopefully I'm pleasantly surprised on the 17th.
 
Sooooo, is SLI a good or bad idea these days? I really need a new card ASAP, so I doubt I'll be able to wait for the reference 980 Ti/Titan 2. In all likelihood I'll grab a 980 when it's available and then maybe another one around when The Witcher 3 comes out.

Does it effectively double my power (I know VRAM doesn't get doubled), or do games have to have proper SLI profiles patched in to take advantage of running dual cards?

It's never a full 100% doubling in performance that I recall seeing. Usually 80-90% better, I think.
 
Sooooo, is SLI a good or bad idea these days? I really need a new card ASAP, so I doubt I'll be able to wait for the reference 980 Ti/Titan 2. In all likelihood I'll grab a 980 when it's available and then maybe another one around when The Witcher 3 comes out.

Does it effectively double my power (I know VRAM doesn't get doubled), or do games have to have proper SLI profiles patched in to take advantage of running dual cards?

I've not looked into SLI for years...since I had dual Radeon 4890s, actually. I am extremely sensitive to micro-hitching, so the increased performance wasn't worth it for me. I'd be curious to know if that kind of stuff exists anymore.
 
Well, games haven't really needed more horsepower thanks to consoles, aside from a few titles and effects. VRAM is more of an issue honestly.

That doesn't have anything to do with it.
Last gen, when GPUs were already 4x more powerful than the consoles, AMD and Nvidia still delivered 50-100 percent performance improvements with each new line of cards, and the gen before that, and the one before that.

There's no point damage controlling or making up excuses for what is happening.

Question is when.


And so they spend less, progress slows even more, perhaps AMD even gets out of the business altogether and then I can just justify it all by saying 'that's capitalism', right? ;)
Well yeah, that's why I said self-fulfilling prophecy :p
In the end they'll try to guilt-trip the consumer for it too, as if we were the ones responsible.
 
It's never a full 100% doubling in performance that I recall seeing. Usually 80-90% better, I think.

I've not looked into SLI for years...since I had dual Radeon 4890s, actually. I am extremely sensitive to micro-hitching, so the increased performance wasn't worth it for me. I'd be curious to know if that kind of stuff exists anymore.

Hmmm, guess I'll wait on reviews. Cheers.
 
Shouldn't the 980 and 970 be compared to the 680 and 670, since they are all mid-range cards for their respective architectures?

I think it's rather impressive that a mid-range card from the new architecture gives the same performance as the old architecture's high-end cards.
 
Shouldn't the 980 and 970 be compared to the 680 and 670, since they are all mid-range cards for their respective architectures?

I think it's rather impressive that a mid-range card from the new architecture gives the same performance as the old architecture's high-end cards.

I guess it all depends on what price bracket they are sold in.
 
Sooooo, is SLI a good or bad idea these days? I really need a new card ASAP, so I doubt I'll be able to wait for the reference 980 Ti/Titan 2. In all likelihood I'll grab a 980 when it's available and then maybe another one around when The Witcher 3 comes out.

Does it effectively double my power (I know VRAM doesn't get doubled), or do games have to have proper SLI profiles patched in to take advantage of running dual cards?
It's probably not so bad with G-Sync. I know most of my games stutter like a mo when G-Sync is off, for whatever reason. Anyone want to chime in?
 
I've not looked into SLI for years...since I had dual Radeon 4890s, actually. I am extremely sensitive to micro-hitching, so the increased performance wasn't worth it for me. I'd be curious to know if that kind of stuff exists anymore.

I have a 690 and I haven't noticed any stuttering in games that have proper Nvidia driver support. Some sites have done reports showing Nvidia's drivers are very good at reducing noticeable stuttering. But if you're extremely sensitive, you might still notice it.

The one thing I hate about SLI is waiting for Nvidia to release drivers to support new games. There is currently no SLI support for Dead Rising 3, so I'm twiddling my thumbs waiting. When a game does have SLI support it runs great, so it's a trade-off.

For my next video card I'm sticking with single cards. Fewer problems.
 
The one thing I hate about SLI is waiting for Nvidia to release drivers to support new games. There is currently no SLI support for Dead Rising 3, so I'm twiddling my thumbs waiting. When a game does have SLI support it runs great, so it's a trade-off.

For my next video card I'm sticking with single cards. Fewer problems.

Yeah, I'm in the same boat myself. The situation with DR3 is weird: there are AMD/Nvidia build branches, but the game's now a week old and neither party has released drivers. I don't think Capcom actually gave them access to early builds of the game so they could prepare drivers ahead of time; instead, they've been scrambling to put something together since it released.
 
I miss when the 8800 GTX came out and blew the fuck out of everyone. Same with the Radeon 9800 (9700?) Pro.

We've had less than 10% performance jumps for like 3 years now. Oh well
The 7970 is one of the best cards released in recent times. It was an absolute beast when overclocked and came at an attractive price.
 
I've not looked into SLI for years...since I had dual Radeon 4890s, actually. I am extremely sensitive to micro-hitching, so the increased performance wasn't worth it for me. I'd be curious to know if that kind of stuff exists anymore.
Frame pacing has been a major focus for AMD and Nvidia with dual-card solutions over the last two years, and it has mostly been addressed. There are still a number of drawbacks, though.
 
The 7970 is one of the best cards released in recent times. It was an absolute beast when overclocked and came at an attractive price.
I think the best sweet-spot card in that generation was the 7870 XT; not sure why it was more limited than the other cards.
 
Yearly 10-20% increases do have pretty unsettling implications of course. Over 5 years, a 15% yearly increase means that performance has only doubled. In five years.
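The compounding claim above checks out; a quick sketch of the arithmetic:

```python
# Compounding a 15% yearly performance gain over five years
yearly_gain = 1.15
total = yearly_gain ** 5
print(f"{total:.2f}x")  # prints "2.01x" -- performance only doubles
```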
 
Can't be that bad. Supposedly 16nm will deliver 35%+ speed increase over 28nm. Plus TSMC's own 16FF+ line is hyped to be 15% more on top of that.
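Reading that 15% as multiplicative on top of the 35% (an assumption; the post doesn't say how the figures combine), the compound gain would be:

```python
# Compounding the two claimed gains: 35% from 16nm, 15% more from 16FF+.
# Treating the 15% as multiplicative is an assumption.
base_16nm = 1.35
ff_plus = 1.15
speedup = base_16nm * ff_plus
print(f"{speedup:.2f}x over 28nm")  # prints "1.55x over 28nm"
```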
 
Yearly 10-20% increases do have pretty unsettling implications of course. Over 5 years, a 15% yearly increase means that performance has only doubled. In five years.

I'm surprised this isn't bigger news in the tech media than it is (I know it's been talked about, but...). In the big picture, even though no one is to blame per se, it's pretty damn close to being a full-blown crisis, both performance-wise and price-wise. The delays are sad in themselves, and in some ways this also has an impact on the future of VR. For me, having followed the scene for decades, it feels kind of funny to be "forced to accept" this relatively weird situation.
 
In sort of related news, someone took pics of Galaxy's 970 card. They also got a CPU 3DMark score on an i3 for some reason.


Also, count me in the 560 Ti group, although I have the 448 core version.

Another one in the 560 Ti 448 group. I want something affordable, with a reasonable TDP and enough memory and performance to last me through this console generation at 1080p. The price has to be right.

£250 or less and I'm in. I'm expecting £300-£350.

Sooooo, is SLI a good or bad idea these days? I really need a new card ASAP, so I doubt I'll be able to wait for the reference 980 Ti/Titan 2. In all likelihood I'll grab a 980 when it's available and then maybe another one around when The Witcher 3 comes out.

Does it effectively double my power (I know VRAM doesn't get doubled), or do games have to have proper SLI profiles patched in to take advantage of running dual cards?

If you have relatively good motion perception, then SLI is never worth it. If you want your FRAPS counter to show a high number and want better performance in benchmarks, then knock yourself out.
 
Every time I consider a multicard setup, I run into warnings of microstutter. Now I may be back to considering a single card, but 2560x1440 @120-144hz seems like a tall order for any single card.
 
Every time I consider a multicard setup, I run into warnings of microstutter. Now I may be back to considering a single card, but 2560x1440 @ 120-144Hz seems like a tall order for any single card.

Not just micro-stutter: bad, or even worse, missing multi-GPU profiles will render your setup pretty much useless.

When it works, the advantages can be awesome and it will make your game run a hell of a lot better, but be prepared for disappointment when you find out Nvidia or AMD (and sometimes the devs) don't have their shit together the first two or three weeks a game is out.

Edit: Will say that at that resolution and framerate, yeah, no single card is gonna max out anything coming out soon; you'll have to tweak settings.
 
Hadoken said:
I have a 690 and I haven't noticed any stuttering in games that have proper Nvidia driver support. Some sites have done reports showing Nvidia's drivers are very good at reducing noticeable stuttering. But if you're extremely sensitive, you might still notice it.

The one thing I hate about SLI is waiting for Nvidia to release drivers to support new games. There is currently no SLI support for Dead Rising 3, so I'm twiddling my thumbs waiting. When a game does have SLI support it runs great, so it's a trade-off.

For my next video card I'm sticking with single cards. Fewer problems.

Yep, I just came from an SLI 770 setup; I sold both of them and got a 780 Ti. I got tired of constantly waiting on driver updates. Also, there were several new games where you actually had to disable SLI so the game wouldn't run like shit.
 
Question is when.


And so they spend less, progress slows even more, perhaps AMD even gets out of the business altogether and then I can just justify it all by saying 'that's capitalism', right? ;)

When TSMC and GlobalFoundries get their shit together. This was all on them: AMD and Nvidia could have had their new GPUs ready on 20nm, but TSMC has had issues with its high-performance silicon.
 
Every time I consider a multicard setup, I run into warnings of microstutter. Now I may be back to considering a single card, but 2560x1440 @120-144hz seems like a tall order for any single card.
If you're running at 120-144fps you're probably not gonna notice microstutter, but it's really bad below 60fps. I got rid of my Crossfire setup because my CPU was going out of date. The games that ran at ~120fps looked fine and dandy.
 
Every time I consider a multicard setup, I run into warnings of microstutter. Now I may be back to considering a single card, but 2560x1440 @120-144hz seems like a tall order for any single card.

Put that extra money you were going to spend on a second card towards a G-Sync monitor.

The difference will be far more noticeable than the extra grunt another card would bring.
 
If you're running at 120-144fps you're probably not gonna notice microstutter, but it's really bad below 60fps. I got rid of my Crossfire setup because my CPU was going out of date. The games that ran at ~120fps looked fine and dandy.

Hmmmmm... I'm not quite completely ruling it out but it's such a big investment. It would be crushing if it was more hassle than help.

Put that extra money you were going to spend on a second card towards a G-Sync monitor.

The difference will be far more noticeable than the extra grunt another card would bring.

Ha, I was planning on doing both. I have always resisted the urge to go multicard in the past. Right as I start to convince myself that the benefits outweigh the drawbacks, SLI users warn me that the problems are still there.
 
Yeah, I'm in the same boat myself. The situation with DR3 is weird: there are AMD/Nvidia build branches but the game's now a week old and neither party has released drivers. I don't think Capcom actually gave them access to early builds of the game so they could prepare drivers ahead of time and instead they've been scrambling to put together something since it released.

That's likely why the game is a crashfest.
 
Can someone please explain why the scores are practically the same as a 780 Ti's?

The hell is that about? I expected it to blow the last iteration of cards out of the water.
 
Heh. We've always got commercial graphene to look forward to :D

I don't think it will be graphene, and not for another 15-20 years. We will get to 7nm, fighting and screaming, but we will get there. Whether it's worth the cost is the thing the foundries will need to figure out.

TSMC claims volume production of 16nm (20nm with FinFET) will start in Q1 2015, which means late next year or early 2016 we may finally get a video card upgrade. That depends on whether Nvidia is ready to implement stacked DRAM and what AMD's plans are.
 
It's on 28nm because of chip production issues at AMD and Nvidia's primary supplier.

Ugh!

That is lame. Why even bother releasing a new line if it isn't a real upgrade?

Will this new line of cards at least be released at a cheaper price point to make up for the negligible upgrade?
 