Nvidia GTX 980/970 3DMark Scores Leaked- from Videocardz.com

If the 970 is at $350, then it's instant buy. Or else I look at AMD and see what they do with current line.

AMD Gold is nice too with Alien: Isolation and Star Citizen.
 
Yes, expectations are obviously lowered because of fabrication realities. We're still at 28nm.

Honest question: Wouldn't 28nm get cheaper over time, especially for smaller die sizes? We have been at it so long that I would imagine the cost of mid-size chip production should have gone down dramatically.
 
If the 970 is at $350, then it's instant buy. Or else I look at AMD and see what they do with current line.

AMD Gold is nice too with Alien: Isolation and Star Citizen.
Word of warning - the Star Citizen offer is not the full game. It's just a starter pack with access to the Commander module.

It is a nice deal, though. I'm hoping Nvidia at least offers Borderlands: The Pre-Sequel with a 980/970. It would help justify a purchase a bit more for me.

Honest question: Wouldn't 28nm get cheaper over time, especially for smaller die sizes? We have been at it so long that I would imagine the cost of mid-size chip production should have gone down dramatically.
That may be the case, but expenditures for new architecture might well offset any savings in node costs.
 
$299 would be insane value. At that price I wouldn't even feel the slightest bit of guilt for upgrading from a card that still runs most of what I regularly play.

Even more interesting, though, would be what in the world AMD is gonna do when Nvidia prices its cards like that. The performance/watt and performance/dollar ratios would both be hard to trump, especially since Nvidia is generally regarded as the 'premium' brand anyway.
 
It probably won't happen. How the actual chips relate to the numbered GTX cards is a bit confusing architecture-wise, but more or less expect a repeat of the 600 series.

Basically, there is no chip that would work as a 980 Ti. The GM204 chip being used in the 980 and 970 is the full thing in the 980. There is no "more full" version of the GM204 that could be used in a 980 Ti.

With the 700 series, the GK110 was the Titan, with a cut down version released as the 780. Then they released the 780 Ti which was something in between the Titan and 780.

The GM200, which would be the Maxwell equivalent of the GK110, will probably be first released as a Titan 2.0 or something. Then probably as the 1080/1080 Ti a year or two from now.

Ah thanks, might just jump on that 980 then.
 
I'm a total noob, what does "non reference" mean for GPUs and why is it more expensive?

Reference = Blower style cooler. (Better for a closed case because the exhaust heat goes out the back)
Non-Reference = Aftermarket style cooler. Usually a dual fan unit and a better heatsink.

Here's a Linus video to explain a little better - Link
 
I'm a total noob, what does "non reference" mean for GPUs and why is it more expensive?
Non-reference GPUs come with custom coolers, overclocking tweaks, and other extras.

Take the Asus DirectCU II line of GPUs: Asus claims its cooler is better than Nvidia's standard one, keeping the card at lower temperatures with less noise. They also claim to tweak the board design and power-delivery circuitry to give a better chance of higher overclocks. Some board partners go as far as cherry-picking chips for better overclocking potential.

As noise was important to me, I picked a non-reference card with low noise levels.
 
Take the Asus DirectCU II line of GPUs: Asus claims its cooler is better than Nvidia's standard one, keeping the card at lower temperatures with less noise. They also claim to tweak the board design and power-delivery circuitry to give a better chance of higher overclocks.

It's not really a claim once it's established fact. Nvidia does a better job designing their reference parts than AMD has in recent years, but they are still bound by cost. The simple fact is reference designs are as conservative as they can get--a cooler just good enough to acceptably cool that GPU at stock settings. When you can adjust the price point and overclock without worrying, you can afford better cooling and such.
 
Non-reference GPUs come with custom coolers, overclocking tweaks, and other extras.

Take the Asus DirectCU II line of GPUs: Asus claims its cooler is better than Nvidia's standard one, keeping the card at lower temperatures with less noise. They also claim to tweak the board design and power-delivery circuitry to give a better chance of higher overclocks. Some board partners go as far as cherry-picking chips for better overclocking potential.

As noise was important to me, I picked a non-reference card with low noise levels.

So for someone like me who doesn't know squat about overclocking, a reference model would be the best choice? Or should I get a non-reference and learn how to mess around with this stuff?

Reference = Blower style cooler. (Better for a closed case because the exhaust heat goes out the back)
Non-Reference = Aftermarket style cooler. Usually a dual fan unit and a better heatsink.

Here's a Linus video to explain a little better - Link

Thank you for that, will check it out.
 
So for someone like me who doesn't know squat about overclocking, a reference model would be the best choice? Or should I get a non-reference and learn how to mess around with this stuff?



Thank you for that, will check it out.

You'd probably be better off getting a non-reference design if you are going to overclock. Most, if not all, of them come factory overclocked to a degree.

That video I linked shows the most popular non-reference designs. Personally, I'm partial to EVGA but they all are pretty damn good.
 
So for someone like me who doesn't know squat about overclocking, a reference model would be the best choice? Or should I get a non-reference and learn how to mess around with this stuff?



Thank you for that, will check it out.

Non-reference cards usually have better cooling. I would recommend non-reference even if you don't plan to overclock.
 
So for someone like me who doesn't know squat about overclocking, a reference model would be the best choice? Or should I get a non-reference and learn how to mess around with this stuff?



Thank you for that, will check it out.

If you care about noise at all, it's almost always worth spending an extra $20 on an aftermarket version that has a nice big heatsink and quiet low-RPM fans.

The AMD stock blower coolers are jet engines; the Nvidia ones are better (as in, they look good by comparison just because of how loud the AMD ones are).

A blower cooler is only good if you have either a tiny case (I'd say for a small HTPC build, except an HTPC needs to be quiet, so a blower would be out of the question) or when you go SLI and two or more cards might simply dump too much heat into the case for the case fans to handle.


Blower coolers have small fans that spin really fast and funnel the air out of the back of your case (think of how a hairdryer functions: it sucks up air inside the case, directs it all through the fully enclosed heatsink, and pushes it out the back of the card next to the HDMI port).
[Image: Radeon 290X reference blower cooler]

Aftermarket coolers have larger fans that run slowly (and big heatsinks) and make very little noise. For example, you have this huge radiator-like heatsink and an enormous fan that only needs to run at like 300 RPM, and it will still cool far better than the stock blower (which has a much smaller heatsink and a little leaf-blower 1500+ RPM 60mm fan).

The only downside is that part of the heat gets blown into the case instead of straight out the back, but a 500 RPM (also quiet) 120mm exhaust fan will take care of it.
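To put rough numbers on "dumping heat into the case", here's a back-of-envelope sketch (my own arithmetic, not from the thread) of the case airflow needed to carry a GPU's heat away at a given air temperature rise, using P = ṁ·cp·ΔT with standard sea-level air properties; the 250 W / 10 K figures are illustrative assumptions:

```python
# Back-of-envelope: case airflow needed to remove GPU heat.
# Uses P = m_dot * cp * dT, where m_dot = rho * V_dot is the air mass flow.

RHO_AIR = 1.2          # air density, kg/m^3 (sea level, room temp)
CP_AIR = 1005.0        # specific heat of air, J/(kg*K)
M3S_TO_CFM = 2118.88   # cubic metres/second -> cubic feet/minute

def required_cfm(watts: float, delta_t_c: float) -> float:
    """Volumetric airflow (CFM) needed to carry `watts` of heat
    out of the case at an air temperature rise of `delta_t_c` kelvin."""
    v_dot = watts / (RHO_AIR * CP_AIR * delta_t_c)  # m^3/s
    return v_dot * M3S_TO_CFM

if __name__ == "__main__":
    # One ~250 W card, case air allowed to warm by 10 K:
    print(f"one card:  {required_cfm(250, 10):.1f} CFM")
    # Two such cards in SLI double the requirement:
    print(f"SLI pair:  {required_cfm(500, 10):.1f} CFM")
```

A single ~250 W card at a 10 K rise works out to roughly one quiet 120mm fan's worth of airflow, while an SLI pair can genuinely outrun modest case airflow - which is exactly where blowers earn their keep.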

A lot of people spend a bit extra to get a nice cooler so their PC stays quiet. You get what you pay for; it's all optional.

For Nvidia cards, the high-end reference coolers (on the 780/Titan, and apparently also the 980/970) are not too bad noise-wise, but I would never recommend anyone buy a reference AMD blower card...

The reference blower coolers are a way for AMD and Nvidia to save costs (lovely, isn't it, getting nickel-and-dimed on a 50-cent plastic fan and a tiny piece of aluminium when you buy that $600 R9 290X :p) while also idiot-proofing their GPUs: if someone for some reason has a terrible tiny case with two broken fans, then at least the blower cooler will ensure the card still works.
 
Oh yeah, I'm still concerned about non-reference since I'm still using an FT02; so the way the pipes are aligned matters :|
 
Basically, there is no chip that would work as a 980 Ti. The GM204 chip being used in the 980 and 970 is the full thing in the 980. There is no "more full" version of the GM204 that could be used in a 980 Ti.

With the 700 series, the GK110 was the Titan, with a cut down version released as the 780. Then they released the 780 Ti which was something in between the Titan and 780.

The GM200, which would be the Maxwell equivalent of the GK110, will probably be first released as a Titan 2.0 or something. Then probably as the 1080/1080 Ti a year or two from now.
I believe Nvidia has to release the hypothetical GM200 Quadro first before we get the gamer version?

The K6000 preceded the 780 Ti, if I'm not mistaken.

Honest question: Wouldn't 28nm get cheaper over time, especially for smaller die sizes? We have been at it so long that I would imagine the cost of mid-size chip production should have gone down dramatically.
Hmm, you have a point. AMD did recently mention 28nm having more legs in the form of improving costs.
 
So what's the probability that I could run Witcher 3 with a 970 at max settings, 2x MSAA, capped at 30fps with a resolution roughly 1600x900 or more depending on which experiences the least drop in framerate? I'm big on graphics, though I don't mind a lowered resolution and fps. I'll even settle for 720p.
 
So what's the probability that I could run Witcher 3 with a 970 at max settings, 2x MSAA, capped at 30fps with a resolution roughly 1600x900 or more depending on which experiences the least drop in framerate? I'm big on graphics, though I don't mind a lowered resolution and fps. I'll even settle for 720p.

I'd say that's pretty much guaranteed to be possible, minus ubersampling if they include that again. But if it's like TW2, no card will handle ubersampling well for a while.
 
So what's the probability that I could run Witcher 3 with a 970 at max settings, 2x MSAA, capped at 30fps with a resolution roughly 1600x900 or more depending on which experiences the least drop in framerate? I'm big on graphics, though I don't mind a lowered resolution and fps. I'll even settle for 720p.

Are you still on a CRT or something? Who drops resolution like that? Maybe you would be better off with the PS4 version if you don't care about resolution and are happy with 30fps. Performance-wise, people can only guess right now.
 
So what's the probability that I could run Witcher 3 with a 970 at max settings, 2x MSAA, capped at 30fps with a resolution roughly 1600x900 or more depending on which experiences the least drop in framerate? I'm big on graphics, though I don't mind a lowered resolution and fps. I'll even settle for 720p.

I'd expect a 970 could run maxed Witcher 3 at 1080p and constant 60fps. I'll be very disappointed if it can't.
 
Oh yeah, I'm still concerned about non-reference since I'm still using an FT02; so the way the pipes are aligned matters :|

I don't think I've encountered a heatsink that has any problem in my FT02. For reference, I've used a modded XFX R9 290 Double Dissipation and an EVGA ACX cooler in mine.
 
I'd expect a 970 could run maxed Witcher 3 at 1080p and constant 60fps. I'll be very disappointed if it can't.
If W2 and the W3 previews are any indication, you will be disappointed.

We had a recent GAF discussion about "maxed" settings. Don't do "maxed", instead go for "near max but turning off effects with questionable quality/cost ratio"
 
If W2 and the W3 previews are any indication, you will be disappointed.

We had a recent GAF discussion about "maxed" settings. Don't do "maxed", instead go for "near max but turning off effects with questionable quality/cost ratio"

I recently bought The Witcher 2 and found this performance thread.

Looks like a GTX 570 was able to run it on Ultra settings with Ubersampling disabled at 1080p/40-60fps. Hopefully we get a similar situation with The Witcher 3 and it performs well on a 970... and my 780 :/
 
Really? Holy shit, I've had mine for over 2 years now, what a great buy that chip was. The sad thing is I'm sure Intel will make sure they never release something as future-proofed again lol

The 2500K and 560 Ti combo has been amazing for me these past three years, but I want to play MGS5 and possibly Star Wars Battlefront on high, lol.
 
Really? Holy shit, I've had mine for over 2 years now, what a great buy that chip was. The sad thing is I'm sure Intel will make sure they never release something as future-proofed again lol

Unless you're doing heavy multithreaded tasks like video editing / encoding, CPU upgrades just aren't as necessary anymore, I expect that to continue.

My (almost six years old) X58 / i7 920 @ 4 GHz setup was still enough to run nearly any game at over 60 FPS. I just finally upgraded to a six core X99 setup, but mostly for the additional platform features (USB 3.0, increased SATA speeds etc).
 