"$299 = 2x970"
This. At that price I am in for 2 in SLI and a move from team red to green.
But no way in hell is Nvidia dropping this at $299; add another $50/75, I reckon.
"talk about lowered expectations."
Yes, expectations are obviously lowered because of fabrication realities. We're still at 28nm.
"If the 970 is at $350, then it's instant buy. Or else I'll look at AMD and see what they do with the current line."
Given that all the other x70 cards are $350, this is extremely likely. But Nvidia could drop a bomb on us all by shaving off that $50. It IS a possibility, however remote.

Word of warning: Star Citizen is not the full game. It's just a starter pack with access to the Commander module.

AMD Gold is nice too, with Alien and Star Citizen.

"Honest question: Wouldn't 28nm get cheaper over time, especially for smaller die sizes? We have been at it so long that I would imagine the cost of mid-size chip production should have gone down dramatically."
That may be the case, but expenditures for the new architecture might well offset any savings in node costs.
USD 299 for the 970 would make too much of their lineup devalue too quickly. I don't see it happening.
It probably won't happen. It's a bit confusing how the actual chips relate to the numbered GTX cards in terms of architecture, but more or less expect a repeat of the 600 series.
Basically, there is no chip that would work as a 980 Ti. The GM204 chip being used in the 980 and 970 is the full thing in the 980. There is no "more full" version of the GM204 that could be used in a 980 Ti.
With the 700 series, the GK110 was the Titan, with a cut down version released as the 780. Then they released the 780 Ti which was something in between the Titan and 780.
The GM200, which would be the Maxwell equivalent of the GK110, will probably be first released as a Titan 2.0 or something. Then probably as the 1080/1080 Ti a year or two from now.
I imagine it would just destroy AMD's GPU sales for a year unless they come up with something spectacular.
I'm a total noob, what does "non-reference" mean for GPUs, and why is it more expensive?

Non-reference GPUs come with custom coolers, overclocking tweaks, and other extras. Take the Asus DirectCU II line: Asus claims their cooler is better than Nvidia's standard one, keeping the card at lower temperatures with less noise. They also claim to tweak the board design and power delivery circuitry to give a higher chance of better overclocks. Some board partners go as far as cherry-picking chips for better overclocking potential. As noise was important to me, I picked a non-reference card with low noise levels.
Reference = Blower style cooler. (Better for a closed case because the exhaust heat goes out the back)
Non-Reference = Aftermarket style cooler. Usually a dual fan unit and a better heatsink.
Here's a Linus video to explain a little better - Link
So for someone like me who doesn't know squat about overclocking, would a reference model be the best choice? Or should I get a non-reference card and learn how to mess around with this stuff?
Thank you for that, will check it out.
You have this huge radiator-like heatsink and an enormous fan that only needs to run at like 300 rpm, and it will still cool way better than the stock blower (which has a much smaller heatsink and a little leafblower of a 60mm fan at 1500+ rpm).
"Basically, there is no chip that would work as a 980 Ti. The GM204 chip being used in the 980 and 970 is the full thing in the 980."
I believe Nvidia has to release the hypothetical Quadro GM110 first before we get the gamer version?
"Honest question: Wouldn't 28nm get cheaper over time, especially for smaller die sizes?"
Hmm, you have a point. AMD did recently mention 28nm having more legs in the form of improving costs.
So what's the probability that I could run Witcher 3 with a 970 at max settings, 2x MSAA, capped at 30fps with a resolution roughly 1600x900 or more depending on which experiences the least drop in framerate? I'm big on graphics, though I don't mind a lowered resolution and fps. I'll even settle for 720p.
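For a rough sense of how much headroom dropping from 1080p60 to 900p30 or 720p30 buys, pixels-per-second is a crude first-order proxy for GPU load. This is just a back-of-the-envelope sketch: real cost also depends on geometry, MSAA resolve, and CPU limits, so treat the ratios as loose upper bounds.

```python
# Back-of-the-envelope comparison of the resolutions and framerates
# mentioned above. Pixel throughput is only a rough proxy for GPU load.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "900p": (1600, 900),
    "720p": (1280, 720),
}

def pixels_per_second(width, height, fps):
    """Pixels the GPU must shade per second at a given resolution and framerate."""
    return width * height * fps

# The "maxed 1080p at a constant 60fps" expectation as the baseline.
baseline = pixels_per_second(1920, 1080, 60)

for name, (w, h) in RESOLUTIONS.items():
    for fps in (30, 60):
        load = pixels_per_second(w, h, fps)
        print(f"{name}@{fps}fps: {load / baseline:.2f}x the pixel load of 1080p60")
```

By this measure, 900p capped at 30fps asks for only about a third of the pixel throughput of 1080p60, and 720p30 for well under a quarter, which is why dropping resolution and capping framerate frees up so much headroom for maxed settings.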
Oh yeah, I'm still wary of non-reference cards since I'm still using an FT02, so the way the heatpipes are oriented matters :|
If that price rumor for the 970 is true, how badly is my 2500k going to bottleneck it?
"I'd expect a 970 could run maxed Witcher 3 at 1080p and a constant 60fps. I'll be very disappointed if it can't."
If W2 and the W3 previews are any indication, you will be disappointed.
Not at all. 2500k won't even bottleneck 2x GTX 970's...
People linking WCCFTech and believing their bullshit really make me cringe. It really needs to be a blacklisted site.
We had a recent GAF discussion about "maxed" settings. Don't do "maxed"; instead go for "near max, but turning off effects with a questionable quality/cost ratio".
Ah, that's really good news
Really? Holy shit, I've had mine for over 2 years now, what a great buy that chip was. The sad thing is I'm sure Intel will make sure they never release something as future-proofed again lol
"I'm getting so thirsty for legit info. It's still like 3 days away probably."
Isn't there an Nvidia conference Thursday?

"Source?"
I'll be at it, so it exists.
"I'd expect a 970 could run maxed Witcher 3 at 1080p and a constant 60fps."
I hope you handle disappointment well.
"Are they known to lie for clicks?"
They normally don't make up their own stuff, but they will perpetuate any rumor from chiphell, videocardz, swedeoverclockers, fudzilla, semiaccurate, etc.