Nvidia GTX 980/970 3DMark Scores Leaked - from Videocardz.com

That's at 2560x1080 with 4xMSAA and maxed-out settings at a constant 60 FPS? I'm not sure a single GPU could handle it.

I guess I should have left the entire quote. He said high, not max, and 1080p (which I take to mean 1920x1080), and he said 30 FPS.

I know the game looks good, but people blow it out of proportion.
 
Unless you're doing heavy multithreaded tasks like video editing / encoding, CPU upgrades just aren't as necessary anymore, I expect that to continue.

My (almost six-year-old) X58 / i7 920 @ 4 GHz setup was still enough to run nearly any game at over 60 FPS. I just finally upgraded to a six-core X99 setup, but mostly for the additional platform features (USB 3.0, increased SATA speeds, etc.).

Well I do that for a living :P

Still, it's funny how much life that chip's gotten. It's been fine for video editing, too, but I assume it's shit by today's standards.
 
How about an i5-750 at 3.6 GHz? I'd like to get away with not upgrading for as long as I can, and so far I've never felt CPU-bottlenecked.

That CPU will almost certainly bottleneck. The Nehalem i7s still have some life left in them, but anything below that will bottleneck, especially in games that love CPUs (Valve games, Blizzard games, RTS games, MMOs).
 
Honest question: Wouldn't 28nm get cheaper over time, especially for smaller die sizes? We've been on it so long that I would imagine the cost of mid-size chip production should have gone down dramatically.

Yes. Which is why a 980 will probably be roughly the same price as a 680 was at launch, even though it's significantly larger.
 
That CPU will almost certainly bottleneck. The Nehalem i7s still have some life left in them, but anything below that will bottleneck, especially in games that love CPUs (Valve games, Blizzard games, RTS games, MMOs).

It depends on the game whether a bottleneck will appear, and the only games where the i7 will outperform the i5s of that generation are those that use more than 4 threads.
 
Uh, speaking of CPUs, any recommendations on the new stuff from Intel? I know fewer, faster cores are generally better for gaming.
 
I wonder how well an OC'd 980 will do in The Witcher 3... I'm willing to turn down some settings to get 60 FPS at 1080p, but not too many.

If you stick with maybe 2xMSAA and turn down one of the more demanding settings it will probably be fine.

But there's really no way of knowing until the game is out.

All these price leaks are toying with my heart. If $299 for the 970 is true (which I have some serious doubts about), I would even start considering getting two of them.
 
Uh, speaking of CPUs, any recommendations on the new stuff from Intel? I know fewer, faster cores are generally better for gaming.

Only if you are not overclocking. If you are overclocking, 4 cores will have a minimal advantage over 6 cores, but it's only in the 100-200 MHz range per core.

But anyway, until DA, TW3, Batman, etc. are out, it's hard to say if investing in 6 cores will be worthwhile this gen.
 
The site also has pictures of the GTX 970 reference model, and suddenly it doesn't look all that midrange anymore. $399 seems likely again.

 
I've got an i7 2600 and a GTX 580 (stock model): would you buy one of these new cards if you were me? Or should I wait for the new 20nm cards next year? Will my CPU bottleneck that GPU?
 
I've got an i7 2600 and a GTX 580 (stock model): would you buy one of these new cards if you were me? Or should I wait for the new 20nm cards next year? Will my CPU bottleneck that GPU?

No, I don't think your CPU will bottleneck you, and either card will be a very good upgrade from your 580.
 
Not at all. A 2500K won't even bottleneck 2x GTX 970s...

That's not true at all. Even with a heavily overclocked modern i7 like a 4790K or 5820K, the CPU will bottleneck an SLI/Crossfire setup in quite a few games at 1080p. In fact, a lot of games will already be bottlenecked with one GPU. Some examples where CPUs are the bottleneck: Battlefield 4 MP (if you're not using Mantle), Star Citizen (at about 50 FPS once other ships are around, down to 35 in New Horizon Racing), StarCraft 2 (even in 4K), most MMORPGs, most open-world games like Watch Dogs (50-70 FPS), GRID Autosport (about 80-100 FPS), and a lot more.

SLI/Crossfire will only be an advantage if you raise the resolution, or once DX12 is finally here and hopefully removes the CPU overhead.

Regards,
blaidd
 
That's not true at all. Even with a heavily overclocked modern i7 like a 4790K or 5820K, the CPU will bottleneck an SLI/Crossfire setup in quite a few games at 1080p. In fact, a lot of games will already be bottlenecked with one GPU. Some examples where CPUs are the bottleneck: Battlefield 4 MP (if you're not using Mantle), Star Citizen (at about 50 FPS once other ships are around, down to 35 in New Horizon Racing), StarCraft 2 (even in 4K), most MMORPGs, most open-world games like Watch Dogs (50-70 FPS), GRID Autosport (about 80-100 FPS), and a lot more.

SLI/Crossfire will only be an advantage if you raise the resolution, or once DX12 is finally here and hopefully removes the CPU overhead.

Regards,
blaidd

Yep - unfortunately that also means those games will be bottlenecked by any CPU in existence, since there's only a 10-15% IPC difference from Sandy Bridge to Haswell.
 
That's not true at all. Even with a heavily overclocked modern i7 like a 4790K or 5820K, the CPU will bottleneck an SLI/Crossfire setup in quite a few games at 1080p. In fact, a lot of games will already be bottlenecked with one GPU. Some examples where CPUs are the bottleneck: Battlefield 4 MP (if you're not using Mantle), Star Citizen (at about 50 FPS once other ships are around, down to 35 in New Horizon Racing), StarCraft 2 (even in 4K), most MMORPGs, most open-world games like Watch Dogs (50-70 FPS), GRID Autosport (about 80-100 FPS), and a lot more.

SLI/Crossfire will only be an advantage if you raise the resolution, or once DX12 is finally here and hopefully removes the CPU overhead.

Regards,
blaidd

Anybody using SLI/Xfire and running at 1080p is clueless. Hence, my point still stands.

Regards,
NoLoveForFailWheelDrive
 
As far as I know they are live streaming from the event for 24 hours, and have yet to announce what is happening when in that 24-hour stream. So we could conceivably not get the GTX 970/980 announcements until well into the next day... but I expect they'll start with them.

I mean, I really hope that.

I would expect they'd start with it as well: "Here are the new GPUs, and over the next 24 hours we'll show you how well they play and how much devs love them."
 
As far as I know they are live streaming from the event for 24 hours, and have yet to announce what is happening when in that 24-hour stream. So we could conceivably not get the GTX 970/980 announcements until well into the next day... but I expect they'll start with them.

I mean, I really hope that.

We're definitely getting something during that 24h event. When Nvidia does stage shows they usually leave the cards towards the end, right? For a 24h event where they have dev interviews and showcases, maybe it makes more sense to start with the reveals. Dunno.
 
If I upgrade from my GTX 560 Ti, will I need to install a motherboard BIOS thingy update... thingy? I can manage GPU drivers just fine, but every time I looked at updating my BIOS I ran away with my tail between my legs.

Hoping I can just plug in a new card, update the drivers, and go.
 
Anybody using SLI/Xfire and running at 1080p is clueless. Hence, my point still stands.

Regards,
NoLoveForFailWheelDrive

If you want 120 Hz, SLI or Crossfire is needed for higher presets in a lot of games.

If I upgrade from my GTX 560 Ti, will I need to install a motherboard BIOS thingy update... thingy? I can manage GPU drivers just fine, but every time I looked at updating my BIOS I ran away with my tail between my legs.

Hoping I can just plug in a new card, update the drivers, and go.

GPUs are pretty much plug and play. Just pop it in and do a fresh driver install. No BIOS update needed.
 
If you want 120 Hz, SLI or Crossfire is needed for higher presets in a lot of games.


GPUs are pretty much plug and play. Just pop it in and do a fresh driver install.

I'd recommend you uninstall the old drivers before you put in the new card, too; I've very occasionally run into issues caused by not doing that.
 
Anybody using SLI/Xfire and running at 1080p is clueless. Hence, my point still stands.

Regards,
NoLoveForFailWheelDrive

Wrong, wrong, wrong.

Some people run SLI at 1080p to turn all the extra bells and whistles up to high or very high (AA, AF, shadows, etc.) and run at 120 Hz.

When I had SLI 770s I could run Assassin's Creed IV with everything cranked up at 1080p, smooth as butter. Got rid of them and got a single 780 Ti, and had to turn some stuff down to keep it at a steady 60.
 
Man I don't know how I missed this thread for so long.

I really hope all these rumors turn out to be true.

I've been wanting to upgrade my i7 660 X51 for some time now. It looks like the 970 will meet the wattage requirements no problem, but I really hope the 980 ekes in there as well.

...now I just have to hope one or both of them fit.
 
My wishlist for tomorrow is that the 980 is $500, the 970 is better than the 780, and that the 780's price drops to $300.

Pipe dreamin'
 
Hoping the price for the 980 is $499! I need to build a new PC this month, but if the 980 is $600+, I might as well go 295X for $400 more, or even buy one used for around $800.
 
I've never experienced an SLI system. I have a 770 4GB @ 1.2 GHz; should I consider buying a second one, or should I upgrade to something like the 780 Ti (hopefully a 980 Ti) if I need more power?
 
A non-reference 970 clocked at 1140 MHz (probably the MSI Twin Frozr) scores just below what the original Titan did (4600 GPU score) in 3DMark 11 Extreme. A stock 780 Ti scores roughly 4750. Numbers were taken from the Sweclockers review of the 780 Ti.

If it turns out to be priced at $299, that is one hell of a deal. $399 seems more likely.
 
Can't wait to start seeing benchmarks. I'm seriously considering dumping my dual 7970 GEs for a single card and joining Team Green. Having dual cards hasn't made any discernible or measurable difference in the games I've benched; it only boosted my 3DMark/Unigine scores.

I typically shoot for 1080p60, but the power to downsample 4K would be pretty awesome.
 