They're going from 40 nm (5000/6000 series) straight to 28 nm? That's an awfully big jump.
With the laziness of developers, being a day-1 buyer can be horrible. Pay full price, receive patches later.
That would change if Nintendo entered the PC market. Not gonna happen.
My move from an AMD 5870 -> Nvidia GTX 570 this year was incredible. I try not to be a fanboy, but I really don't see myself jumping back to AMD anytime soon.
Uh, what?
What? Aren't the 5870 and 570 pretty similar in terms of performance?
Or are you talking about drivers?
He's saying that Nintendo's stance isn't "Shove it out the door and patch all the bugs later", but instead "Release a finished product that's been fully tested". But really, it's apples to mangos. Nintendo deals with a static platform.
GTX 570 is like 50-80% faster than the 5870.
Actually it's ~20-25%.
HD5870 --> +17% HD6970 = GTX480 --> 0-5% GTX570
The 570 is not 50-80% faster than the HD5870 AFAIK. Both HD6000/GTX500 are tweaked versions of the previous gen, nothing to be surprised about.
I think dev support is just as important as post release drivers, if not more so. AMD has been working with DICE on Frostbite 2 and it shows. On the other hand, Saints Row: The Third and RAGE were completely broken at launch.
Here, have some numbers:
http://www.anandtech.com/bench/Product/294?vs=306
Seems like the biggest differences are at super super high resolutions, which indicates to me a memory limitation on the card.
I bet the fan on this new card will be loud as fuck.
Just like the 6990.
Then what the fuck are you playing?
Witcher 2 broken crossfire on launch.
Skyrim broken crossfire on launch.
Batman Arkham City broken crossfire on launch.
Rage broken everything on launch.
According to these guys, 29% difference between 5850 and 570.
But yeah, we can cherry-pick benches to show the difference is 80%!
Is 1080p at 60fps on ultra settings in Crysis 1 possible on a single card?
As a 6870 owner, would I be better off waiting for the revision series/8000s, with regards to better performance for the money?
From everything I've read, and from what's been posted in the last several replies, I could expect a 20-30% increase in FPS.
Well, from my experience at 1920x1200, paired with an i7 930, the performance bump is an almost universal 30%.
Which may not sound like much. But it really, really is. Games that hovered in the mid 40s fps, are now a silky smooth 60fps. And that really makes a pretty massive difference.
And it helps keep my lows up, which is most important. Whereas BF3 would sometimes dip to the high 20s (28-29fps) in the most explosive action moments, my 570 hasn't dropped below 30 even once. And while it may not sound like much, that's a pretty big deal.
Factor in better drivers with more frequent updates, PhysX, 3D Vision (for when I upgrade), and better scaling in SLI than CF, and it's just a spectacular card. The fact that it OC's like a CHAMP doesn't hurt either. I'm rocking basically GTX 580 speeds. Not bad for a card I spent less than $300 on (Microcenter sale).
My only complaint is that it runs HOT. My 5870 would be around 65c under load, and my 570 can hit 97-98c pretty regularly. Apparently it's all within spec, and I haven't noticed any graphical distortion even once... but in the interest of full disclosure... yeah.
Either way, at this point I'm really looking forward to the next Nvidia series. Nevertheless, if AMD knocks the 7xxx series out of the park, I'll be looking into them for sure.
Uh...you may want to check your setup. What kind of 570 do you have? That's not normal at all...
I was worried, too. It's a ZOTAC AMP! card. I agree, the numbers sounded ridiculous. I asked around and did a lot of research, and most said that it wasn't as abnormal as it sounded.
To soothe my worries, I exchanged it for another card and the temps were the same. I changed the cooling in my case and got the temps down to the high 80s/low 90s... but then OC'd it quite a bit ;p
For the most part it seems to hover in the 88-89 range, but it'll hit the 96-97 range if I play for three or four hours when my ambient temps are in the low 70s in my room.
A buddy online ended up getting the same card a month later, and even with his really nice, big case with lots of fans, he's still hovering in the high 80s.
If I learned anything, it's that I'll never get a ZOTAC card again ;p I'll probably just stick with eVGA. But I just couldn't pass on the deal.
Something is wrong with either your card or your cooling setup. I have tested five GTX 570s and all have stayed under 80c in the most intense of games.
I went from a GeForce 6800 -> X1950GT -> 3850 -> 4850 -> 5850 -> 2x 5850. I rarely had major problems with AMD cards, and I've owned nearly every generation of them.
9800 > X800 Pro > 1950 Pro > 3870 > 4870 > 6870.
Switching between Nvidia and ATI has led me to one startling conclusion.
Both sets of drivers have their own sets of bugs and flaws and compatibility issues.
On the other hand, only ATI has triple monitor on one card support.
Maybe, provided you ignore OpenGL support.
Interesting. I wonder how much of this tech will make it into consoles.
Nono, I can totally understand that you turn your back on a company when you get constantly burned by them. But it's like the Saints Row 3 example. The game runs very smoothly on my machine, then Volition releases a patch (DX11 performance enhancements) and performance gets worse on AMD cards, and AMD still gets blamed, which is wrong in my eyes. This is just one example; I'm sure people have worse issues, but it's not always that easy to point the finger at AMD.

So you think the entire world is just biased against them? There are so many people who still post that they have problems with AMD cards, and comparatively fewer with NV. My own personal experience dictates that I've never had major driver-level issues with an NV card. I owned mostly ATI cards back in the day, starting with a 9800 Pro, then an X1900XTX, a 4850, and most recently a 4890. I had an 8800GT at some point in there as my only NV card, and now I'm on a 560 Ti. I had no problems at all with the NV cards but constant headaches with the AMD ones.
It's like the people who defend Razer because they own a Deathadder that works fine while you can find a thousand posts on the internet complaining about Razer's build quality, quality control and support being complete shit. Just because you personally don't have issues with it doesn't mean you should ignore the relevant masses of people who are posting about issues with the hardware.