Nvidia GTX 980/970 3DMark Scores Leaked- from Videocardz.com

Man, I am still running a lowly single GTX560Ti and honestly it runs everything just dandy for me. I think when they double the number of what I have now I'll buy a GTX1120Ti to replace it, lol.


Maybe it's soon time I upgrade....
 
Man, I am still running a lowly single GTX560Ti and honestly it runs everything just dandy for me. I think when they double the number of what I have now I'll buy a GTX1120Ti to replace it, lol.


Maybe it's soon time I upgrade....

Even my upgrade from 6950 (roughly same performance as your card) to 680 was a big fucking step up for me. Can't even imagine how good it will feel when you upgrade. :P
 
Man, I am still running a lowly single GTX560Ti and honestly it runs everything just dandy for me. I think when they double the number of what I have now I'll buy a GTX1120Ti to replace it, lol.


Maybe it's soon time I upgrade....

The rub is we really don't need to upgrade at all.

560 ti runs current-gen games just fine. But I want to turn all those bells and whistles on.
 
I actually had two 560 Tis in SLI, and yeah, for the most part games ran fine. I ended up selling them though, as I was starting to chafe with only 1GB of VRAM and figured I'd sell them while they were still worth something. (I bought each for $200, had them for 3 years, and sold them each for $99; not bad imo.)

At this point I've been running on just my i7's integrated graphics and I'm getting sick of it. Waiting for Nvidia's new card, but I'm not paying that kind of money. A factory OC'ed 290 for $350 is sounding better and better ($150 out of pocket).
 
The rub is we really don't need to upgrade at all.

560 ti runs current-gen games just fine. But I want to turn all those bells and whistles on.

Yeah, I feel like I'm in the same boat. My 660 Ti can run most games on high, but without the extras. And I fear for the "next gen" Witcher 3, GTA V, Star Citizen sort of games.
 
The rub is we really don't need to upgrade at all.

560 ti runs current-gen games just fine. But I want to turn all those bells and whistles on.

What concerns me is that games like The Witcher 3 might not even run on these new cards that well. A GTX980 to run the game on 1080p and high details (not max) and consistent 30 fps would be disappointing. I'm hoping that Ryse might give us a glimpse of the performance we can expect from other more demanding games in the future.
 
Why do you think TW3 might not run well on a GTX 980?

It's just a personal assumption based on nothing. TW3 feels like one of the first real "next-gen" games we're getting on PC and perhaps that's the reason I consider it as more demanding than what we already have on the platform. I don't remember TW2 running that well at launch either, so my judgment is probably a bit clouded.

If the consoles can do it.....

I expect my GTX670 will be able to handle the game perfectly well at 30fps.

It's getting it to 60fps that makes me nervous.

That's a good one as well and it's not only 60 fps. As someone who's excited about VR, the thought that I would need to run games at 2560x1440@min 75hz to get a good experience, gives me a bit of a headache. But that's at least another year or two for me away, so I try not to worry about it now.
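To put rough numbers on that headache, here's a quick back-of-the-envelope comparison of raw pixel throughput between the VR target mentioned above and a typical 1080p/60 monitor. This is only a sketch; real GPU load doesn't scale purely with pixels per second.

```python
# Back-of-the-envelope pixel throughput: VR target vs. a common 1080p/60 setup.
# Only a rough proxy for GPU load, not a real performance model.

def pixels_per_second(width, height, hz):
    """Raw pixels the GPU must deliver per second at a given resolution/refresh."""
    return width * height * hz

monitor = pixels_per_second(1920, 1080, 60)   # typical 1080p gaming target
vr      = pixels_per_second(2560, 1440, 75)   # the VR target mentioned above

print(f"1080p@60: {monitor / 1e6:.0f} Mpix/s")
print(f"1440p@75: {vr / 1e6:.0f} Mpix/s")
print(f"ratio:    {vr / monitor:.2f}x")
```

So the VR target means pushing a bit over twice the pixels per second of a 1080p/60 monitor, before any per-pixel cost differences.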
 
Why do you think TW3 might not run well on a GTX 980?


That in 2560x1080, 4xMSAA and maxed out with a constant 60fps? I'm not sure a single GPU could handle it.
 
Pretty sure someone from CDP has said that a 780Ti won't run the TW3 maxed out at 1080p and achieve a stable 60FPS.

But that most likely includes übersampling too.
 
980 @450€ (definitely not 600) would tickle me, but then again, there is nothing coming out that really warrants a new card at the moment. I'm not hyped for the Ubisoft stuff and I'll get GTAV on PS4 first... Still, new card releases are always exciting. :D
 
That's a good one as well and it's not only 60 fps. As someone who's excited about VR, the thought that I would need to run games at 2560x1440@min 75hz to get a good experience, gives me a bit of a headache. But that's at least another year or two for me away, so I try not to worry about it now.
VR is a whole nother ballpark. CV1 is gonna make The Witcher 3 seem like child's play.
 
http://www.guru3d.com/news-story/the-witcher-3-gtx-780ti-could-push-35-45-fps-at-max.html

If Gametech's claims hold up, then The Witcher 3: Wild Hunt will be a harsh-on-the-GPU title, so much so that one $550 GTX 780 Ti already has a hard time keeping up with the game engine when higher levels of AA are enabled. The website mentions that the PC version runs HQ and 8xMSAA at 35-45fps on a GTX 780 Ti at 1080p.
A 980 won't be enough, even if you turn down the MSAA option.
 
Trying to keep up with maxing out the newest games is an exercise in insanity.

Trying to max out Witcher 3 is another tier above that.
 
A 980 won't be enough, even if you turn down the MSAA option.
For 30fps, it obviously will be. It says so right there. Even with 8xMSAA, it's getting over 30fps.

And that build will have been over 6 months old by the time the game comes out, so there's a lot of time for optimization still as well.

EDIT: Oh, you're talking about 60fps now. Yea, I'm pretty sure a single 980 will be able to do it with reasonable settings. Simply turning it down to 4xMSAA will give a fair bit of performance.
 
Pretty sure someone from CDP has said that a 780Ti won't run the TW3 maxed out at 1080p and achieve a stable 60FPS.

But that most likely includes übersampling too.
Highly unlikely that he meant with übersampling. I don't think a single 780 Ti would get anything above 20fps on ultra with übersampling.
 
For 30fps, it obviously will be. It says so right there. Even with 8xMSAA, it's getting over 30fps.

And that build will have been over 6 months old by the time the game comes out, so there's a lot of time for optimization still as well.

EDIT: Oh, you're talking about 60fps now. Yea, I'm pretty sure a single 980 will be able to do it with reasonable settings.
30fps gaming on a PC? ^^

One of many reasons why I'm playing mainly on my PC is because of the higher framerate ;)
 
A 980 won't be enough, even if you turn down the MSAA option.

Don't know how demanding MSAA is in this game, but 8xMSAA is usually really demanding in deferred engines. Use 2xMSAA and add some SMAA, and it will probably run fine at 1080p/60FPS.
 
Trying to keep up with maxing out the newest games is an exercise in insanity.

Trying to max out Witcher 3 is another tier above that.

And 8xMSAA, lol.

I'm running a single GTX 780 (1150 MHz) and a 2560x1440 panel. Though I don't believe I will be getting a consistent 60 FPS with TW3, I have faith my 780 will perform well enough (not talking 30 fps, that crap makes me sick; more like 45-60 fps with occasional/brief drops) with 2xMSAA/SMAA and high-ish settings. It will look glorious too, but if Nvidia or AMD actually released something notably better than my 780 I would love to purchase one. Unfortunately the 980 doesn't seem to be that product based on leaks, but maybe it'll clock like a monster.
 
With these rumored prices, I'm leaning more towards an R9 290X. I know that card does well at 4K; is it just as good at 1080p?
 
Since this was brought up here and there's really nowhere else to ask:

Does some of the system's 176 GB/s of bandwidth get used elsewhere besides the GPU and CPU (or by non-gaming applications)?

OT: If the 970 ends up being ~$350, I might have to get that for my first desktop build, unless maybe the 960 is quite good too and can overclock to near a 780 for ~$250 (although maybe that's hoping for too much).
 
Well, they don't make a 6GB 780 Ti, so you won't be able to game reliably at anything higher than 1080p.

For 1080p 780ti will last you a while.

1440p with 780ti and G-sync monitor and I couldn't be happier.

I'll upgrade next year if there is a significant improvement.
 
It's slightly preferable to a 780 Ti, sure, but it still leaves AMD with the better bang-for-buck, and I'd be amazed if they really let AMD have that.
Why would you be amazed? They've "let AMD have that" for well over 2 years, and still sell significantly more dedicated graphics cards the last time I checked.
(And, of course, I'll add my obligatory objection of defining "bang-for-buck" solely as FPS/$)

That in 2560x1080, 4xMSAA and maxed out with a constant 60fps? I'm not sure a single GPU could handle it.
If a single GPU can handle "maxed out" settings of anything at release then there aren't enough settings.

A 980 won't be enough, even if you turn down the MSAA option.
Are you really aware of the performance impact of 8xMSAA?
 
Man, I am still running a lowly single GTX560Ti and honestly it runs everything just dandy for me. I think when they double the number of what I have now I'll buy a GTX1120Ti to replace it, lol.


Maybe it's soon time I upgrade....

Same; I got this card when Skyrim came out and it's done everything perfectly to my eye on a 1080p 21" monitor. I play a ton of Payday 2 and it never has a problem.

I said to myself last year that I would buy the best <$300 card I could get when GTA5 hits the PC so up to this point I've been looking at a 760 2GB.
 
Just so everyone knows, TW3 is completely forward rendered. MSAA won't be nearly as expensive as in, say, Crysis 3.
Absolutely, but 8xMSAA still usually has a >2x performance impact even in a forward renderer. So if it does that at 35+ FPS and you remove the MSAA, well...
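Spelling out the arithmetic behind that "well...": the 2x factor is the poster's estimate, not a measured number, but it makes the implied framerate easy to compute.

```python
# Frame-time arithmetic for the claim above: if 8xMSAA roughly doubles frame
# time and the game runs at 35 FPS with it on, what does removing it imply?
# The 2x cost factor is an estimate from the thread, not a benchmark result.

def fps_to_ms(fps):
    """Frame time in milliseconds at a given framerate."""
    return 1000.0 / fps

def fps_without_msaa(fps_with, msaa_cost_factor):
    """Estimated FPS after removing MSAA, assuming MSAA multiplies frame time."""
    return fps_with * msaa_cost_factor

print(fps_to_ms(35))               # ~28.6 ms per frame with 8xMSAA on
print(fps_without_msaa(35, 2.0))   # ~70 FPS if MSAA really is a 2x hit
```

In other words, if the 2x estimate holds, the same card that does 35 FPS with 8xMSAA would land around 70 FPS with it off.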
 
Just so everyone knows, TW3 is completely forward rendered. MSAA won't be nearly as expensive as in, say, Crysis 3.

Is TW3 completely forward rendered? My understanding is it uses a combination of deferred and forward+ rendering. For example here's one of the devs talking about the difficulties of making the deferred pipeline work with the XB1's limited ESRAM:

http://gamingbolt.com/targeting-ful...nge-constantly-optimizing-usage-witcher-3-dev

Anyhow even with normal forward rendering (non-plus) the cost of 8xMSAA compared to 4x isn't negligible. You're right it scales better than deferred MSAA samples though.

There's probably an argument to be made that putting out a game that can only stress the latest single card setup when you use incredibly high AA settings isn't putting the hardware to the best use to begin with.
 
what is this forward render stuff? if you don't mind me asking.
It's a new name for what we just used to call a "renderer" for 2 decades, until deferred shading came along. It's basically not doing that.

I don't really have time to explain it in detail, but you can read about it at the link. The idea is that with deferred shading, you render material, screen-space normal vectors and other data to a set of buffers, and then perform the lighting and other material/shading calculations in screen space. The traditional "forward" way is to perform those calculations immediately when rendering each polygon. The advantage of the deferred approach is that you don't shade multiple times per pixel in case of overdraw, and that it's less effort to add a large number of lights, but one disadvantage is that it breaks hardware MSAA.
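The overdraw point is easy to show with a toy counter. This is a minimal sketch, nothing like real GPU code: "shading" is reduced to incrementing a number, and depth values stand in for triangle fragments.

```python
# Toy illustration of the overdraw difference described above. Forward
# rendering runs the expensive lighting math for every fragment that passes
# the depth test as triangles arrive; deferred shading resolves visibility
# into a G-buffer first and then lights each covered pixel exactly once.

def forward_shades(depths_per_pixel):
    """Shading invocations when lighting runs immediately, in submission order."""
    shades = 0
    for depths in depths_per_pixel:
        nearest = float("inf")
        for d in depths:          # triangle fragments in the order they were drawn
            if d < nearest:       # passes the depth test...
                nearest = d
                shades += 1       # ...so the expensive lighting runs right now
    return shades

def deferred_shades(depths_per_pixel):
    """G-buffer pass resolves visibility; lighting then runs once per covered pixel."""
    return sum(1 for depths in depths_per_pixel if depths)

# One pixel covered back-to-front by three overlapping triangles:
scene = [[0.9, 0.5, 0.1]]
print(forward_shades(scene))    # 3 (every fragment got lit)
print(deferred_shades(scene))   # 1 (only the visible surface got lit)
```

Drawing front-to-back instead would let the forward path reject the hidden fragments too, which is why engines sort opaque geometry; deferred shading just makes the worst case impossible.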
 
Is TW3 completely forward rendered? My understanding is it uses a combination of deferred and forward+ rendering. For example here's one of the devs talking about the difficulties of making the deferred pipeline work with the XB1's limited ESRAM:

http://gamingbolt.com/targeting-ful...nge-constantly-optimizing-usage-witcher-3-dev

Anyhow even with normal forward rendering (non-plus) the cost of 8xMSAA compared to 4x isn't negligible. You're right it scales better than deferred MSAA samples though.

There's probably an argument to be made that putting out a game that can only stress the latest single card setup when you use incredibly high AA settings isn't putting the hardware to the best use to begin with.
I think he was making a generic statement about deferred rendering on the XB1 and not referring to the TW3 engine. Everything I have read about the TW3 engine points to it being completely forward.
"A flexible renderer prepared for deferred or forward+ rendering pipelines:"
It apparently can do both!
 
I think he was making a generic statement about deferred rendering on the XB1 and not referring to the TW3 engine. Everything I have read about the TW3 engine points to it being completely forward.

It apparently can do both!

I was meaning the section where he says:

“I would say that targeting Full HD with a complex rendering pipeline is a challenge in terms of ESRAM usage,” Balazs said to GamingBolt. “Especially when this pipeline is built on deferred rendering."

If i had to guess it's using tiled deferred for generic materials, and re-using that tile light information for forward+ to do exotic materials and non-opaque geometry. If you look up RedEngine 3 it has support for both pipelines like you say.

Anyhow sorry for the thread derail, I just wanted to say that I hope newer games aren't only being made to stress the 980 at very high MSAA or SSAA levels. I'm one of these 560ti owners looking at finally upgrading again, and I'm looking forward to more details on the architectural changes when the big reveal comes at the end of the week.
 
Looks more and more like I will be getting a 970 instead of 980. Will still be more than enough performance to keep me busy until the full maxwell card arrives next year.

I don't want to stop believing, but it's almost hopeless by now. :(
 
The discussion over the last few pages brings up a concept that I'm really not fond of.

"Can't run X game with it."

What the person is actually saying is, "Can't run X game with every setting maxed out at (insert >1080p resolution) with 60 FPS".

Those are two very different things. Crysis, on High, was much more pretty than any other game out there at the time.

I wish that sort of sentiment wasn't as vague as it often is.
 
Why would you be amazed? They've "let AMD have that" for well over 2 years, and still sell significantly more dedicated graphics cards the last time I checked.
(And, of course, I'll add my obligatory objection of defining "bang-for-buck" solely as FPS/$)

If a single GPU can handle "maxed out" settings of anything at release then there aren't enough settings.

Are you really aware of the performance impact of 8xMSAA?

Preach it

Also, a forward renderer for The Witcher 3 would be nice... MSAA actually working, instead of still needing 7 different types of post-process and temporal AA to get rid of jaggies, especially in motion.

I prefer my games to look clean without having a dozen glorified Photoshop filters doing this to it. :p
 
How do you go about comparing graphics cards from their specs? If those GTX 980 specs are correct, is the GTX 980 better or worse than the 780 Ti (both the stock 780 Ti and the ~1GHz versions from Gigabyte etc.)? It's listed as having a lot fewer cores, lower TFLOPs, less bandwidth etc., but a much higher clock speed and pixel fillrate. Which of those stats makes the biggest difference? Sorry if this is a pretty basic question; I'm quite new to this and looking to buy my first graphics card once the 980 has been officially announced (either the 780 Ti or the 980, whichever is better, assuming there's not a huge price difference).

Is it likely that there will be full reviews and performance comparisons as soon as the cards are announced, or will that take longer?
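On reading specs: the usual first-order number is single-precision throughput, counting 2 FLOPs per CUDA core per clock (one fused multiply-add). The figures below use the 780 Ti's public boost clock and the leaked 980 numbers from this thread, so treat them as illustrative, not gospel; raw TFLOPs also won't settle the question on its own, since Maxwell is reported to get more real work done per FLOP than Kepler.

```python
# First-order spec comparison: theoretical single-precision throughput.
# FLOPS = 2 ops per core per clock (fused multiply-add) x cores x clock.
# Core counts and clocks below are the 780 Ti's published specs and the
# leaked GTX 980 figures discussed in this thread (assumed, not confirmed).

def tflops(cuda_cores, clock_mhz):
    """Theoretical single-precision TFLOPs from core count and clock speed."""
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

gtx_780ti = tflops(2880, 928)    # 780 Ti at its rated boost clock
gtx_980   = tflops(2048, 1216)   # rumored 980 at its leaked boost clock

print(f"780 Ti: {gtx_780ti:.2f} TFLOPs")
print(f"980:    {gtx_980:.2f} TFLOPs")
```

By this crude measure the 780 Ti still comes out ahead, which is exactly why actual game benchmarks in the launch reviews are the only numbers worth buying on.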
 
So what games at the unveiling event?

Witcher 3? Doom? Ryse? Homefront? Project Cars? Star Citizen?

What do you guys think?

Experience never-before-seen PC gaming technologies and be the first to hear about incredible new products and game announcements.
Watch exclusive partner interviews with Epic, Ubisoft, Boss Key, NCSoft, Blizzard, Wargaming, SOE, Deep Silver, 2K Games, Warner Brothers, Respawn, Red Bull, and many others.

Games from those people.
 