AMD Radeon 7000 series to be unveiled Dec 5 - first with 28nm again

I completely agree that AMD needs to stay competitive in the GPU market. Nothing would be worse than letting Nvidia have the market all to themselves.
 
I got burnt pretty badly by ATi. I bought a 4870 at launch and was really expecting great things. In most games it would work well enough; sure, you could cook a ham on it, but Nvidia cards weren't dramatically cooler at the time.

But there were that dozen or so games per year that inexplicably ran significantly worse than on the Nvidia counterpart. It just added up over time. Buying a game at launch, having it run like shit, going to forums to find out it was an "ATi-exclusive feature."

I even gave them a second chance on a 4770 when my 4870 died. It was just a placeholder card until I found my DX11 card, and when you paired it with a game that had good ATi compatibility, it was easily the greatest GPU value on the market. But, again, there were those few games that came out and just blew up with ATi drivers.

Overall it was a small percentage of games, but it was enough for me to side-grade to a GTX 275 instead of getting a 5000 series card to tide me over. I've got a GTX 580 now and am more than happy with it. :)

To get me back, it would take a pretty dramatic (if not earth-shattering) change of direction. Like people have pointed out, even games with AMD branding still, hilariously, perform inexplicably worse on ATi hardware today. Look no further than Saints Row 3.
 
With the laziness of developers, being a day-one buyer can be horrible. Pay full price, receive patches later.

That would change if Nintendo entered the PC market. Not gonna happen.
 
They're going from 40nm (5000/6000 series) straight to 28nm? That's an awfully big jump.

They were supposed to go to 32nm long ago, but it got delayed and delayed and delayed until they just skipped it. It should hopefully be a big jump in performance.
 
The original HD6970 at 32nm had 1920 VLIW4 SPs (the 40nm one has 1536).

squidyj
It wasn't delayed; TSMC canned the process literally overnight. Both the HD6000 and GTX500 were the emergency plan. Nvidia had planned a 32nm shrink of the GTX400 from the beginning; AMD wanted more than that, so they got the bigger screw-up.
 
My move from an AMD 5870 to an Nvidia GTX 570 this year was incredible. I try not to be a fanboy, but I really don't see myself jumping back to AMD anytime soon.
 
What? Aren't the 5870 and 570 pretty similar in terms of performance?

Or are you talking about drivers?
 
GTX 570 is like 50-80% faster than the 5870.


He's saying that Nintendo's stance isn't "Shove it out the door and patch all the bugs later", but instead "Release a finished product that's been fully tested". But really, it's apples to mangos. Nintendo deals with a static platform.

Yeah, seriously. Nintendo would have no idea how to handle themselves in the PC gaming arena.
 
I think dev support is just as important as post-release drivers, if not more so. AMD has been working with DICE on Frostbite 2 and it shows. On the other hand, Saints Row: The Third and RAGE were completely broken at launch.

With Rage they had good drivers ready but they uploaded the wrong ones lol.
 
Is something wrong with Saints Row 3? I've been playing it in DX11 maxed with 8xMSAA just fine, but I limit it to 30fps, so I wouldn't know if it was supposed to be running considerably higher than that and wasn't.
 
Actually it's ~20-25%.

HD5870 --> +17% --> HD6970 = GTX480 --> +0-5% --> GTX570



The 570 is not 50-80% faster than the HD5870 AFAIK. Both HD6000/GTX500 are tweaked versions of the previous gen; nothing to be surprised about.
According to these guys, 29% difference between 5850 and 570.

But yeah, we can cherrypick benches to show the difference is 80% ;)
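For what it's worth, the chained deltas a couple of posts up can just be compounded directly. A quick sketch using only the figures quoted in this thread (+17% for HD5870 to HD6970, HD6970 roughly equal to GTX480, +0-5% for GTX480 to GTX570):

```python
def compound(*gains):
    """Compound a chain of relative gains, e.g. 0.17 for +17%."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# HD5870 -> HD6970 is +17%; HD6970 ~ GTX480; GTX480 -> GTX570 is +0-5%.
low = compound(0.17, 0.00)    # best case for the 5870
high = compound(0.17, 0.05)   # best case for the 570
print(f"GTX 570 over HD 5870: roughly +{low:.0%} to +{high:.0%}")
```

Compounding lands in the high teens to low 20s, which lines up with the ~20-25% figure above and nowhere near 50-80%.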
 
Here, have some numbers:
http://www.anandtech.com/bench/Product/294?vs=306

Seems like the biggest differences are at super super high resolutions, which indicates to me a memory limitation on the card.

My numbers are from average gamer performance. Same can be said with HD6970 2GB vs GTX570/580 in multidisplay.

I bet the fan on this new card will be loud as fuck.
Just like the 6990.

The GCN 7950/7970 are supposed to use liquid chambers (instead of vapor chambers) for the reference design.
http://www.techpowerup.com/155043/A...amber-Tech-On-Upcoming-7900-Series-Cards.html
 
Then what the fuck are you playing?

Witcher 2: broken CrossFire on launch.

Skyrim: broken CrossFire on launch.

Batman Arkham City: broken CrossFire on launch.

Rage: broken everything on launch.

Don't you think developers should take some of the blame here?

But yeah, we can cherrypick benches to show the difference is 80% ;)

Specifically HAWX (ridiculous tessellation) and Civ 5 (optional DX11 feature AMD didn't implement).
 
Switching between Nvidia and ATI has led me to one startling conclusion.

Both sets of drivers have their own sets of bugs and flaws and compatibility issues.

On the other hand, only ATI has triple-monitor support on a single card.
 
AMD video cards are very hard to love, especially considering just how exceptional Nvidia is at shipping outstanding drivers for their tech. I'm using a 6950 unlocked to a 6970, but it might be the last card I purchase from them for quite some time. Frankly, I actually enjoy gaming more on my GTX 480. Just a smoother experience.

Even if we go back to the mid-to-late 90s, ATI's drivers were a joke. It seems to be a real, long-lasting problem with this company. I remember the days of enthusiasts writing better drivers for the Rage Fury Maxx. It would be hilarious if it weren't so sad.
 
I'm thinking of switching from a 5850 to a 580 to see how it goes.
I wouldn't sell or trash the 5850 though, since I have to build another computer anyway.
 
Hey, if anyone here doesn't want their AMD card anymore, I'll gladly take it off your hands as long as it's faster than my GTX 260 ;-)

Anyway, I'm keeping a close eye on the new set of cards coming out. I have the itch to upgrade, but it's got to be worthwhile.
 
Am I the only one who had zero issues with ATI?

I've had a lot of driver issues with my 9600GT, particularly the "driver has stopped responding and has successfully recovered" error, which occurs randomly.

Never even had an issue in BF3, and I've played since the beta.
 
Haven't had a problem with my 5850 aside from a heat issue which was funnily enough caused by the dust filters on my case being clogged.

Gonna go nvidia next round so I can use NVidia® Physx™
 
From everything I read, and from what's been posted in the last several replies, I could expect a 20-30% increase in FPS.

Well, from my experience at 1920x1200, paired with an i7 930, the bump is an almost universal 30%.

Which may not sound like much, but it really, really is. Games that hovered in the mid-40s fps are now a silky smooth 60fps, and that makes a pretty massive difference.

And it helps keep my lows up, which is most important. Whereas BF3 would sometimes dip to the high 20s (28-29fps) in the most explosive action moments, my 570 hasn't dropped below 30 even once. It may not sound like much, but that's a pretty big deal.

Factor in better drivers, with more recent updates; Physx; and 3D Vision (for when I upgrade); and better scalability in SLI than CF; well, it's just a spectacular card. And the fact that it OC's like a CHAMP doesn't hurt. I'm rocking basically GTX580 speeds. Not bad for a card I spent less than $300 on (Microcenter sale).

My only complaint is that it runs HOT. My 5870 would be around 65c under load, and my 570 can hit 97-98c pretty regularly. Apparently it's all within spec, and I haven't noticed any graphics distortions once... but in the interest of full disclosure... yeah.

Either way, at this point I'm really looking forward to the next Nvidia series. Nevertheless, if AMD knocks the 7xxx series out of the park, I'll be looking into them for sure.
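The "mid-40s to 60fps" jump described above is easier to appreciate in frame times. A small sketch of that arithmetic, using the fps figures from this post (the 1.3 factor is the ~30% bump):

```python
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

# Average case: mid-40s before, ~30% more after.
# Worst case: BF3 dips to the high 20s before, above 30 after.
for fps in (45.0, 45.0 * 1.3, 28.0, 28.0 * 1.3):
    print(f"{fps:5.1f} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

A 30% bump shaves roughly 5ms off each mid-40s frame and pulls the worst dips back over the 30fps line, which is part of why the subjective difference feels bigger than the percentage suggests.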
 
While I'd like performance gains I'm much more interested in 28nm cards being hopefully smaller (funking GPU cards have gotten redonkulously large) that pull less wattage while running cooler. Give me all of that with another small performance bump, and I'd personally be ecstatic.
 
Uh...you may want to check your setup. What kind of 570 do you have? That's not normal at all...
 
I was worried, too. It's a ZOTAC AMP! card. I agree, the numbers sounded ridiculous. I asked around and did a lot of research, and most said that it wasn't as abnormal as it sounded.

To soothe my worries, I exchanged it for another card, and the temps were the same. I changed the cooling in my case and got the temps down to the high 80s/low 90s... but then OC'd it quite a bit ;p

For the most part it seems to hover in the 88-89c range, but it'll hit 96-97c if I play for three or four hours when the ambient temps in my room are in the low 70s.

A buddy online ended up getting the same card a month later, and even with his really nice, big case with lots of fans, he's still hovering in the high 80s.

If I learned anything, it's that I'll never get a ZOTAC card again ;p I'll probably just stick with eVGA. But I just couldn't pass on the deal.
 
Something is either wrong with your card or your cooling setup. I have tested 5 GTX 570's and all have stayed under 80c under the most intense of games.
 
My stock cooled EVGA GTX 570 will hit the upper 80s in a HAF X case. It can depend a lot on the fan profile for the card. Upper 90s doesn't sound right, though.
 
I've rarely had major problems with AMD cards, and I've owned nearly every generation of them.

8500 > 9800 > X800 Pro > x1950 Pro > hd3870 > hd4870 > hd6870.
 
I agree with Booby. Almost 100c is not right. Something is going on with your setup.
 
I went from a Geforce 6800 -> x1950GT -> 3850 -> 4850 -> 5850 -> 2x 5850
The only game that gave me a vendor specific performance issue was Saints Row the goddamned Third. I should have known better after Saints Row 2.
 
I have both (a 6950 and a GTX 570). AMD tends not to have the best day-one support, while Nvidia is constantly breaking older games: it took them six months to fix a CTD in King's Bounty; Bad Company 2 stuttered for many months; Fallout 3 would break, with shitty stuttering and frame rate issues, for almost a year of driver releases (I had to go back to a very old driver to play it, which meant I couldn't play some newer games); Valve games like TF2 would run like shit for months of driver releases while people with ATI cards half as powerful ran them perfectly smooth; etc.

Luckily I have both, so I just switch computers.
 
Just bought a 6850, and it was quite a jump from an ATI 9800. I'm skipping the upcoming gen for sure (and probably the next one too).
 
I'm interested in seeing this. Due to drivers, the card will have to be something special for me not to stick with Nvidia for my next upgrade.
 
So you think the entire world is just biased against them? There are so many people who still post that they have problems with AMD cards and comparatively fewer with NV. My own personal experience dictates that I've never had major driver level issues with an NV card. I owned mostly ATI cards back in the day, starting with a 9800 Pro, then an X1900XTX, a 4850 and most recently a 4890. I had an 8800GT at some point in there as my only NV card and now I'm on a 560 Ti. I had no problems at all with the NV cards but constant headaches with the AMD ones.

It's like the people who defend Razer because they own a Deathadder that works fine while you can find a thousand posts on the internet complaining about Razer's build quality, quality control and support being complete shit. Just because you personally don't have issues with it doesn't mean you should ignore the relevant masses of people who are posting about issues with the hardware.
No no, I can totally understand turning your back on a company when you get constantly burned by them. But take the Saints Row 3 example: the game runs very smoothly on my machine, then Volition releases a patch (DX11 performance enhancements) and performance gets worse on AMD cards, yet AMD still gets blamed, which is wrong in my eyes. That's just one example, and I'm sure people have worse issues, but it's not always that easy to point the finger at AMD.

And hey, I never had any problems with my Razer products ;)
 