Nvidia GTX 980/970 3DMark Scores Leaked - from Videocardz.com

What do you mean by "at this price point"? This is the top tier of the GPU hierarchy and pricing; if these cards aren't for the above-1080p crowd, then which ones are, exactly? The new artificial $1000 pricing bracket? Yes, above-1080p performance does matter, and $400-500 is itself a niche price bracket for GPUs; your point?

I'm specifically talking about the 970. The vast majority of buyers will be using it for 1080p and below. If you've ever looked at the Steam hardware stats you'd see that gaming monitors above 1080p resolution are a very small niche. $250-$350 GPUs are a relatively high volume segment.
 
A bit above the 780 to be honest. Normally the x70 card will be marginally better than the x80 card of the previous generation.

But as it was explained before, this kinda changed with Kepler and their midrange chip line (670/680/760/770) and Big Kepler (Titan/780) line releases, right? This isn't the full Maxwell chip. The 780 doesn't have a Maxwell equivalent yet.
 
If you can sell your 780Ti for near the cost of a 980 so that you're only paying like $50-100 for it out of pocket, it's probably worth upgrading, sure. It sounds like you've got the ROG Swift and can use every bit of performance you can get. The extra VRAM will probably come in handy at times as well.

Yeah, that seems like a good idea as long as the performance numbers turn out to be true. I'm having some trouble believing them when the 980 has 800 fewer CUDA cores and 112GB/s less memory bandwidth than the 780Ti. Is the new architecture THAT efficient?
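For what it's worth, those gaps are consistent with the commonly quoted specs (assuming 2880 cores and a 384-bit bus at 7 Gbps effective for the 780Ti, and 2048 cores with a 256-bit bus at 7 Gbps for the 980; the 980 figures are still leaked numbers). A quick back-of-the-envelope check:

# Rough spec comparison; the 980 numbers here are leaked/assumed, not confirmed
def gddr5_bandwidth_gb_s(bus_width_bits, effective_gbps):
    # bandwidth = bus width in bytes * effective data rate
    return bus_width_bits / 8 * effective_gbps

gtx_780_ti = {"cuda_cores": 2880, "bw": gddr5_bandwidth_gb_s(384, 7.0)}  # ~336 GB/s
gtx_980    = {"cuda_cores": 2048, "bw": gddr5_bandwidth_gb_s(256, 7.0)}  # ~224 GB/s

print(gtx_780_ti["cuda_cores"] - gtx_980["cuda_cores"])  # 832 fewer cores
print(gtx_780_ti["bw"] - gtx_980["bw"])                  # 112.0 GB/s less bandwidth

Of course, raw core counts and bandwidth aren't the whole story if the architecture uses them more efficiently.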
 
While these are clearly very efficient cards, I can't help wondering how much they could do if they were using twice that wattage. You can never have enough power.

Might be tempted to switch from my 7950. Even though it is powerful enough for me, it is bloody noisy at full load

You'll see that in the new Titan-class cards ;) These are only midrange now.
 
Talk about a self-fulfilling prophecy.
In monitor threads you can't recommend a 4K monitor or express a desire for a 120 Hz 1600p one because there are no GPUs that'll run games well at those settings, and here you can't ask for a more powerful GPU (or, even worse, performance at higher res gets called irrelevant) because the market that has those monitors is too small.
Fine, let's just halt progress indefinitely with this chicken-or-egg "problem"; who doesn't love paying 500 euros for a GPU in 2014 and still dealing with the same aliasing they dealt with back in 2007.

I can only guess what your opinion on 1080p tvs was in 2006 *roll eyes*

My personal opinion is irrelevant. 1080p and below is the resolution of choice for the vast majority of people in the market for a $300-$350 GPU.
 
LOL at all the AMD hate/fear-mongering/doom. They haven't released their next gen stuff but they are already dead. Geeze guys.
 
LOL at all the AMD hate/fear-mongering/doom. They haven't released their next gen stuff but they are already dead. Geeze guys.

Seriously this. AMD absolutely have what it takes to stay competitive in the GPU market (CPUs are another depressing story however).

I thought the 200-series cards were their second generation of 28nm cards, as the 900s are for Nvidia. Are they going to have a third?

The 200 series were just rehashed cards with increased memory clock speeds.
 
Not a joke considering the TDP and the fact that this is still a 28nm process.
I'd agree with you if we were talking about 250w GPUs on a smaller node beating the previous line-up by 25%.
The die is relatively small, the price of at least the 980 is still pretty high, and the performance gains are still pretty small and conditional. The efficiency is impressive for Maxwell as an architecture, but these products specifically do not leverage it in a truly meaningful way, and they are a joke for what people actually care about in high-end cards (which is performance). We've got Gsync 1440p panels, 4K panels, TW3, and other such behemoth games approaching, and this is the best we get when Nvidia can clearly do better even on 28nm with Maxwell? For $500? Barely better than what we already had as performance products? It's a horrible joke; I only wish the people not giving a crap and defending it were a joke as well.
I'm specifically talking about the 970. The vast majority of buyers will be using it for 1080p and below. If you've ever looked at the Steam hardware stats you'd see that gaming monitors above 1080p resolution are a very small niche. $250-$350 GPUs are a relatively high volume segment.
The 980 will not scale better for higher resolutions in any case, it's the same chip with the same bandwidth and pixel fillrate characteristics, so...

P.S. When did the 970 become $250-$350?
 
The 200 series were just rehashed cards with increased memory clock speeds.
Some were, but cards like the 290/290X were entirely new.

The 980 will not scale better for higher resolutions in any case, it's the same chip with the same bandwidth and pixel fillrate characteristics, so...
It doesn't necessarily need to scale perfectly well in order to be a better card for higher resolutions. Raw horsepower still counts for a lot.
 
My personal opinion is irrelevant. 1080p and below is the resolution of choice for the vast majority of people on the market for a $300-$350 GPU.

And it has been since 2007.
Now GPUs in that price range are becoming powerful enough to render fancy modern games at 1440-1600p, and monitors will follow.
The performance of a midrange GPU at higher resolutions definitely matters.

It's time to move on from 1080p, for everyone. It's long overdue. The midrange is no longer about 1080p from here on out.

If you're in the market for a 970 that is comparable to the previous enthusiast 780Ti, then you're able to start looking at higher-resolution monitors. The midrange consumer just graduated to what will become the new midrange resolution, just like the midrange moved on from 800x600 back in the '90s.

(you know I'm baiting you into saying it's not comparable to a 780 or 780ti at higher resolutions right? So choose where you go from here with your argument:p)

I'm specifically talking about the 970. The vast majority of buyers will be using it for 1080p and below. If you've ever looked at the Steam hardware stats you'd see that gaming monitors above 1080p resolution are a very small niche. $250-$350 GPUs are a relatively high volume segment.

If you look at the Steam stats, the majority of people still use some integrated turd instead of a real GPU (let alone a midrange one); Steam stats are not representative of midrange GPU buyers.
And again, midrange GPUs keep becoming more powerful. If a midrange GPU now matches the old high-end GPUs meant for high resolutions, how does your argument hold up?
 
The 970 is very compelling at that price point. Glad I bought a holdover card (750ti) to last me a few months when I built my rig earlier this year, rather than investing in something else.
 
Some were, but cards like the 290/290X were entirely new.


It doesn't necessarily need to scale perfectly well in order to be a better card for higher resolutions. Raw horsepower still counts for a lot.

Naturally it doesn't, but when it's only a 10% or so improvement at around 1920x1080, what is it at 4K if it doesn't scale as well? I don't think the variance will be so big that the 980 becomes a downgrade at such resolutions, but it's not really an improvement either (except for VRAM, conveniently for Nvidia), and there's nothing better.
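Just counting pixels gives a sense of how much harder each step up pushes the same chip (a crude proxy; real scaling also depends on bandwidth, ROPs and VRAM, which is exactly the concern here):

# Pixel counts relative to 1080p; a rough illustration, not a scaling prediction
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1600p": (2560, 1600),
    "4K":    (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} px ({w * h / base:.2f}x 1080p)")
# 1080p: 2,073,600 px (1.00x), 1440p: 1.78x, 1600p: 1.98x, 4K: 4.00x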
 
Any hope for a $499 980 and how does it compare to the 780? Looks like I can get $550 out of my 6GB GTX 780 ACX card and really can't see myself using any more than 3-4GB in the next couple of years.
 
(you know I'm baiting you into saying it's not comparable to a 780 or 780ti at higher resolutions right? So choose where you go from here with your argument:p)
This is ridiculous considering performance reviews aren't even out yet.

Stephan, it's like you're just digging for things to be mad at :P
Any hope for a $499 980 and how does it compare to the 780? Looks like I can get $550 out of my 6GB GTX 780 ACX card and really can't see myself using any more than 3-4GB in the next couple of years.
Wait for reviews.
 
And it has been since 2007.
Now GPUs in that price range are becoming powerful enough to render fancy modern games at 1440-1600p, and monitors will follow.
The performance of a midrange GPU at higher resolutions definitely matters.

Honestly, increased resolution is very low down on my list of desires from a PC monitor. Monitors with blacks that aren't light greys would be a start. I wouldn't trade playing on my 1080p Panasonic plasma for a 4K TN panel. Contrast, game performance and motion resolution matter just as much as static resolution.

Give me a 120hz VA 2560x1080 Gsync monitor and I'll be happy. I'd rather have decent contrast and smooth gameplay before increasing resolution any further and killing my frame rates in the process.

We can start talking about higher resolutions when the displays sporting these increased resolutions aren't complete shit by every other metric.
 
It's time to move on from 1080p, for everyone. It's long overdue. The midrange is no longer about 1080p from here on out.

Even the fastest GPUs aren't ready for resolutions above 1080p (ie. 2560x1600) unless you go SLI/Crossfire and most people won't even consider dual GPUs. Sure, some games run at 60fps or above, but the games you want to stress your GPU with won't run at 60fps, in fact, even half of that is (too) often a struggle. You can turn settings down and keep the resolution, but those are compromises much like staying at 1080p.
 
Any hope for a $499 980 and how does it compare to the 780? Looks like I can get $550 out of my 6GB GTX 780 ACX card and really can't see myself using any more than 3-4GB in the next couple of years.

A leaked benchmark says 15-20% faster than a 780Ti, but I'm not so sure it's legit.
 
Honestly, increased resolution is very low down on my list of desires from a PC monitor. Monitors with blacks that aren't light greys would be a start. I wouldn't trade playing on my 1080p Panasonic plasma for a 4K TN panel. Contrast, game performance and motion resolution matter just as much as static resolution.

Give me a 120hz VA 2560x1080 Gsync monitor and I'll be happy. I'd rather have decent contrast and smooth gameplay before increasing resolution any further and killing my frame rates in the process.

We can start talking about higher resolutions when the displays sporting these increased resolutions aren't complete shit by every other metric.
Though you did say 2560x1080, there is an Eizo Foris FG2421 that is 1080p, VA, 120Hz with native strobing.
Even the fastest GPUs aren't ready for resolutions above 1080p (ie. 2560x1600) unless you go SLI/Crossfire and most people won't even consider dual GPUs. Sure, some games run at 60fps or above, but the games you want to stress your GPU with won't run at 60fps, in fact, even half of that is (too) often a struggle. You can turn settings down and keep the resolution, but those are compromises much like staying at 1080p.
Say what?

[IMG]http://techreport.com/r.x/radeon-r9-290x/c3-99th.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/c3-50ms.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/bd-99th.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/bd-16ms.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/grid2-99th.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/grid2-16ms.gif[/IMG]


Those are all at 1440p, without mature drivers for either NVIDIA or AMD.
 
Even the fastest GPUs aren't ready for resolutions above 1080p (ie. 2560x1600) unless you go SLI/Crossfire and most people won't even consider dual GPUs. Sure, some games run at 60fps or above, but the games you want to stress your GPU with won't run at 60fps, in fact, even half of that is (too) often a struggle. You can turn settings down and keep the resolution, but those are compromises much like staying at 1080p.

The "fastest" GPUs are more than fine at 1080p. 1080p puts the fastest GPUs to sleep they are bored.
 
Though you did say 2560x1080, there is an Eizo Foris FG2421 that is 1080p, VA, 120Hz with native strobing.

No Gsync, no sale, though admittedly the Eizo is probably the best non-Gsync monitor on the market.

23.5" would feel a little too small after playing on 37" and 42" HDTVs for the past 5 years as well.
 
Even the fastest GPUs aren't ready for resolutions above 1080p (ie. 2560x1600) unless you go SLI/Crossfire and most people won't even consider dual GPUs. Sure, some games run at 60fps or above, but the games you want to stress your GPU with won't run at 60fps, in fact, even half of that is (too) often a struggle. You can turn settings down and keep the resolution, but those are compromises much like staying at 1080p.
That's why GSync (and hopefully, before too long, Freesync) is such a big deal. Something like the ROG Swift, capable of 1440p at 144 Hz, would be almost inconceivable outside of multi-GPU setups if it needed you to render frames at that refresh rate, but Gsync makes a display like that desirable even with just one GPU.

It's not just appealing technology; it also helps make up for the stagnation we've had in the GPU market and makes bigger resolutions more desirable by making 40 fps a workable framerate.
 
I'm not sure what your GPU situation is, but after playing with G-Sync at 1080p, I'd have to say it's highly overrated if you're already in the high end for graphics cards.

I pretty much leave my VG248QE in ULMB almost always. I'd actually much rather sell it and get the Eizo Foris now that I've experienced G-Sync.
 
What!?

At 1080p/120Hz?

Am I suddenly in some bizarro land based on an unfounded reality?

If you're not using Gsync then your refresh rate should be the target minimum frame rate to prevent stuttering or tearing, and no single-GPU machine can achieve that. SLI systems can't even achieve that in high-end games due to CPU constraints.

It's why 120Hz only makes sense to me if you marry it with Gsync. It's just too demanding otherwise.
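The frame-time budget is simply one refresh interval, so the numbers get brutal fast (nothing assumed here beyond 1000 ms divided by the refresh rate):

# Per-frame time budget at a given refresh rate; miss it without Gsync and the
# frame either tears (Vsync off) or waits for the next refresh (Vsync on).
for hz in (60, 120, 144):
    budget_ms = 1000.0 / hz
    print(f"{hz} Hz -> {budget_ms:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms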
 
Honestly, increased resolution is very low down on my list of desires from a PC monitor. Monitors with blacks that aren't light greys would be a start. I wouldn't trade playing on my 1080p Panasonic plasma for a 4K TN panel. Contrast, game performance and motion resolution matter just as much as static resolution.

Give me a 120hz VA 2560x1080 Gsync monitor and I'll be happy. I'd rather have decent contrast and smooth gameplay before increasing resolution any further and killing my frame rates in the process.

We can start talking about higher resolutions when the displays sporting these increased resolutions aren't complete shit by every other metric.

I fully agree; all of those things are a much higher priority (especially motion resolution) than more pixels. We're not going to get them, though. (You know how I hate the shit out of LCDs :p)
They have nothing to do with GPU power, though... they are just symptoms of display tech limitations. Nvidia can't do anything about those :\

Resolution itself is still important.
If I'm going to have to deal with blurry motion, awful contrast, crushed blacks or grey blacks, etc., at least let me put that new GPU to use to clean up the graphics a bit and get rid of aliasing.
Every time I see a Star Citizen video I cringe at the amount of aliasing in it.

@mkeynon, I'm not mad at anything, I'm just disputing that higher resolutions don't matter to midrange GPU buyers. Modern GPUs should definitely also be tested at higher resolutions (as well as at 1080p; it's annoying when someone benchmarks only at high res, as it makes it impossible to compare performance to older GPUs that were tested only at 1080p).

Though you did say 2560x1080, there is an Eizo Foris FG2421 that is 1080p, VA, 120Hz with native strobing.

Say what?

[IMG]http://techreport.com/r.x/radeon-r9-290x/c3-99th.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/c3-50ms.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/bd-99th.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/bd-16ms.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/grid2-99th.gif[/IMG]
[IMG]http://techreport.com/r.x/radeon-r9-290x/grid2-16ms.gif[/IMG]

Those are all at 1440p, without mature drivers for either NVIDIA or AMD.
Yep, gpus are ready for higher resolutions, so let's please test them at more than just 1080p as well.

@brainstew, one last thing: if I wanted to worry about and restrict myself to what the majority of people use, I'd buy a console :p
 
Minimum frame rate or minimum frame time?

What kind of settings?

What games?

Because that's patently untrue for an overwhelmingly large percentage of games. Hell, look at this data from a 7970 (the frequency numbers are the clock speed of the 3570K, but these tests I ran seem to be the only frame-time data targeted at 1080p/120Hz on the net).

I've yet to encounter a game that can't be run with at least 98% of frames rendered faster than 9-10ms on either my 290 or 780 Ti.

At that kind of frame rate with a 120-144Hz panel, tearing is unnoticeable to almost anyone outside of folks who are trained to look for such things (Durante).
 
If you're not using Gsync then your refresh rate should be the target minimum frame rate to prevent stuttering or tearing, and no single-GPU machine can achieve that. SLI systems can't even achieve that in high-end games due to CPU constraints.

It's why 120Hz only makes sense to me if you marry it with Gsync. It's just too demanding otherwise.

Tearing is barely noticeable on a 120/144Hz monitor from my experience after owning one for over a year now.

The only game I can remember where the tearing bothered me was The Stanley Parable for some reason. So I had to enable Vsync and the mouse lag felt like shit.

I turn off Vsync in all my games now and with my 680 I don't hit 120FPS if it's more demanding. I hate tearing, but I just don't notice it anymore.

And then you can have drops down to about 90FPS before you start to feel the difference. But it certainly doesn't feel stuttery, that only happens when I approach 60FPS.
 
Tearing is barely noticeable on a 120/144Hz monitor from my experience after owning one for over a year now.

The only game I can remember where the tearing bothered me was The Stanley Parable for some reason. So I had to enable Vsync and the mouse lag felt like shit.

I turn off Vsync in all my games now and with my 680 I don't hit 120FPS if it's more demanding. I hate tearing, but I just don't notice it anymore.

And then you can have drops down to about 90FPS before you start to feel the difference. But it certainly doesn't feel stuttery, that only happens when I approach 60FPS.
Yeap, this anecdote is the one repeated in my entire gaming circle.

4.5ish GHz Intel proc + $300 GPU = 120Hz stable.
Yep, gpus are ready for higher resolutions, so let's please test them at more than just 1080p as well.
TechReport and PCPer run all their tests at 1440p for the high end cards. They do 1080p for mid-low range cards.
 
I've been doing a lot of thinking. I was initially going to buy an Asus ROG Swift PG278Q but now I'm considering skipping it and just waiting for a GSync 1080p@144hz/120hz monitor instead. The combination of high res and high framerate might be more than I'm ready to commit to.

This would make me feel more comfortable with a single 980 and I would choose a 5820K over a 5930K without hesitation.
 
Say what?

Crysis 3 was only run at high settings, and Grid 2 doesn't belong on a list of games you use to stress your GPU. While Codemasters games look good in motion, their engine doesn't stress the CPU/GPU at all and lacks many high-end features. As I said, you can compromise on details/features to get games running at 60+ with single GPUs. 1080p is also a compromise, but it's still the compromise most people choose to go for. Blood Dragon seems to run particularly well in their (TechReport) tests though, compared to results measured by other websites.
 
I'm not sure what your GPU situation is, but after playing with G-Sync at 1080p, I'd have to say it's highly overrated if you're already in the high end for graphics cards.

I pretty much leave my VG248QE in ULMB almost always. I'd actually much rather sell it and get the Eizo Foris now that I've experienced G-Sync.

I'm very sensitive to motion inconsistency, but can absolutely enjoy certain titles at a true locked 30 fps so long as there aren't frame pacing issues.

I absolutely intend to use ULMB on all the games I have that can hit the appropriate frame rates for the viable ULMB refresh rates (unless they're flawless in 3D, which few titles are), but for anything in the 40 - 60 fps range I will absolutely be using Gsync.
 
Crysis 3 was only run at high settings, and Grid 2 doesn't belong on a list of games you use to stress your GPU. While Codemasters games look good in motion, their engine doesn't stress the CPU/GPU at all and lacks many high-end features. As I said, you can compromise on details/features to get games running at 60+ with single GPUs. 1080p is also a compromise, but it's still the compromise most people choose to go for. Blood Dragon seems to run particularly well in their (TechReport) tests though, compared to results measured by other websites.
Those tests are to show that 1440p is absolutely fine with a single card. And saying a game doesn't belong on a list of games for a benchmark is absolutely ludicrous. You don't measure whether a thing is capable at a certain resolution by only using the top .5% of games, in terms of stressing frame rate, especially when those are a tiny portion of what people actually play.
 
Depends on your standards. There are games that can't hold a steady 60 fps at 1080p on a 780 Ti (at least not with proper AA). And those won't remain exceptions forever.
Example:

http://3.bp.blogspot.com/-by5oj_jws34/TonUNPZaNEI/AAAAAAAACOw/LCcwS4mECPY/s1600/NETS.jpg

I'll dignify your goalpost moving for a second because I'm nice :p
You cherry-picked benchmarks with 4xMSAA (and 4x SMAA, which is a combination of MSAA and post-process AA) in games where MSAA has a massive performance impact (more than halving the framerate).

Btw, most people will prefer downsampling or a higher resolution over MSAA because MSAA is ineffective in games that use deferred rendering.
The best way to do AA in modern games is throwing more pixels at it; second to that is settling for an inferior mishmash of limited-coverage MSAA, image-blurring temporal AA and edge-blurring post-process AA, maybe with some sharpening filter to try to compensate for the blur, which then shits on the image even harder.
It's kind of like a medication cocktail that helps people live with HIV, rather than curing it.
 
Just talked to my guy and, totally surprisingly, they'll have cards available tomorrow morning. Still having a hard time justifying it though. The performance doesn't seem to be a big enough step up from my 680. The only two games remotely giving me trouble are Metro Redux and Crysis 3 ... I think I'll wait for GTA. By then there should be decent aftermarket coolers available as well. I also don't care for Witcher 3, so that makes things kinda easy.
 
Depends on your standards. There are games that can't hold a steady 60 fps at 1080p on a 780 Ti (at least not with proper AA). And those won't remain exceptions forever.
Example:
Your examples are two of the most demanding games on the market at ultra or max settings with AA.

That's like saying that welfare is wasteful because Andrew Carnegie created a vast sum of wealth out of nothing through hard work.
 
Btw, most people will prefer downsampling or a higher resolution over MSAA because MSAA is ineffective in games that use deferred rendering.
The best way to do AA is throwing more pixels at it.

Even with just FXAA there can be dips:
http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-7.html


edit:

Your examples are two of the most demanding games on the market at ultra or max settings with AA.

That's like saying that welfare is wasteful because Andrew Carnegie created a vast sum of wealth out of nothing through hard work.

I was responding to a post which claimed that the fastest single GPU graphics cards were "put to sleep" at 1080p.
I agree that a 780 Ti is enough for the majority of cases, but certainly not by leaps and bounds.
 
Just talked to my guy and, totally surprisingly, they'll have cards available tomorrow morning. Still having a hard time justifying it though. The performance doesn't seem to be a big enough step up from my 680. The only two games remotely giving me trouble are Metro Redux and Crysis 3 ... I think I'll wait for GTA. By then there should be decent aftermarket coolers available as well. I also don't care for Witcher 3, so that makes things kinda easy.

Memory is gonna be a big problem next year if you are in the same position as me, with a pair of 680s with only 2GB.
 
Tearing is barely noticeable on a 120/144Hz monitor from my experience after owning one for over a year now.

It's cool that it works for you but sorry, removing Vsync should never be classed as a serious suggestion. Tearing completely removes any sense of immersion or image consistency and is never an acceptable option.
 
All I want is to play my games @ 1080p/High/60fps. I don't need to max a game out or have everything set to ultra, but my GTX660 can't give me those three things, all together, all the time. Admittedly, the GTX970 probably won't manage it 100% of the time either, but it'll do a better job, I feel.

1080p isn't going anywhere for a while yet. Maybe in 2-3 years I'll be ready to move up, but for now, 1080p feels right to me. 1440p, 120/144Hz, and some of these newer display technologies are still too young for me to buy into.
 
It's cool that it works for you but sorry, removing Vsync should never be classed as a serious suggestion. Tearing completely removes any sense of immersion or image consistency and is never an acceptable option.
Given your penchant for larger panels, I am curious, have you played on a 120Hz monitor?
 