AMD Radeon 7000 series to be unveiled Dec 5 - first with 28nm again

I've got a Phenom II X6 1075T @ 3.0GHz and a 6970 2GB.

Would one of these cards be a worthwhile upgrade, or is my processor acting as a bottleneck here?

I don't think so. I would feel a little more inadequate if I didn't have my 2500k at 4.5GHz, though. :P

If your CPU is already becoming obsolete with the hardware that's coming up, then something is wrong and tech is just moving too fast. Either way, we have a rock-solid line of CPUs from Intel right now, and even your hardware won't necessarily be a bottleneck unless a particular game makes it so.
 
Ryoma-Echizen posted some rumored power consumption numbers. Not sure where he got them, though. If what those rumors say is true, no freaking way am I going to buy Nvidia (I'm an Nvidia owner, BTW). Driver problems can be fixed through updates, but power consumption can't, unless you buy a new card.

That suspect FUD from yesterday aside, this was supposed to be one of the points of emphasis for Nvidia's new cards. Under the currently proposed scenario, AMD is obviously the clear-cut winner (and I'll go back if I upgrade next spring). But let's not draw any conclusions yet. Lots of people have agendas, and I wouldn't trust anything until we get out of the speculative stage.
 
We probably won't know until we know the actual specs of the card.

OK, I had figured that by comparing my current CPU with my current GPU, someone might be able to say whether I should build a Sandy Bridge/Ivy Bridge system before I spend more on graphics.
 
If these launch for real in early 2012 (as opposed to a paper launch), I guess I'll be using one for a new PC build.

It seems like AMD cards are the only ones that have DisplayPort as standard. I can never seem to find it on Nvidia cards. Maybe I'm not looking hard enough (or not willing to pay for the top-line cards that may have a DP output).
 
OK, I had figured that by comparing my current CPU with my current GPU, someone might be able to say whether I should build a Sandy Bridge/Ivy Bridge system before I spend more on graphics.

Also don't ignore what LiquidMetal14 just said. He has a very good point.

That suspect FUD from yesterday aside, this was supposed to be one of the points of emphasis for Nvidia's new cards. Under the currently proposed scenario, AMD is obviously the clear-cut winner (and I'll go back if I upgrade next spring). But let's not draw any conclusions yet. Lots of people have agendas, and I wouldn't trust anything until we get out of the speculative stage.

Oh I am well aware of that. It's just that that's all we have to go on for now. Plus, AMD has been delivering in the power consumption department so at least that's something. The numbers do sound too good to be true. I want to BELIEVE.
 
Ryoma-Echizen posted some rumored power consumption numbers. Not sure where he got them, though. If what those rumors say is true, no freaking way am I going to buy Nvidia (I'm an Nvidia owner, BTW). Driver problems can be fixed through updates, but power consumption can't, unless you buy a new card.


Yeah, if the rumored power consumption for the Nvidia cards ends up being accurate, I'll have to rethink my plans. It's going to be a long wait before I have the information I need to make any kind of decision.
 
I don't think so. I would feel a little more inadequate if I didn't have my 2500k at 4.5GHz, though. :P

If your CPU is already becoming obsolete with the hardware that's coming up, then something is wrong and tech is just moving too fast. Either way, we have a rock-solid line of CPUs from Intel right now, and even your hardware won't necessarily be a bottleneck unless a particular game makes it so.
Also don't ignore what LiquidMetal14 just said. He has a very good point.

Thanks guys.

From what I can gather, you're saying that my proc shouldn't be a bottleneck unless a particular app is over-reliant on the CPU without taking advantage of the multiple cores. It's funny you bring that up, because my disappointment with Skyrim's performance on my machine is probably why I'm thinking about this.

I just remembered that I promised myself I'd switch to Nvidia next time around. I think an upgrade will be worthwhile when I can get a single card that will allow me to run ubersampling on TW2 at 60fps.
 
These rumours have been proven false over and over and over again. Please stop reposting them. This has been their original source time and time again since way back in September, and it's crap.
Link1: A poster on B3D confirms that the info is incorrect. For all we know, the XDR bit could be wrong, or the memory configuration, etc.
Link2: Same as above; a redundant and useless post.
Link3: An irrelevant post on B3D that contains guesstimated specs.

lol?

The only thing we're certain of at this point is that XDR will not be used.
 
Link1: A poster on B3D confirms that the info is incorrect. For all we know, the XDR bit could be wrong, or the memory configuration, etc.
Link2: Same as above; a redundant and useless post.
Link3: An irrelevant post on B3D that contains guesstimated specs.

lol?

The only thing we're certain of at this point is that XDR will not be used.

Search "neliz" post history on B3D. They're fake.
 
I've got a Phenom II X6 1075T @ 3.0GHz and a 6970 2GB.

Would one of these cards be a worthwhile upgrade, or is my processor acting as a bottleneck here?

Worth the upgrade if you want more fps/better minimums. There are no unplayable games with a 6970/580, and if you're thinking about gaming above 1080p, then dual cards are always the better option.


=====
HD4870 55nm --> 175W
HD4890 55nm --> 190W
HD5870 40nm (70%+ perf over HD4870) --> 175W

It's totally possible that the new midrange (7850/7870 @ 28nm) will come with really low consumption. My personal estimate is 100-125W (same as Barts 6850/6870).

Remember that they need to squeeze 28nm all the way to the Radeon 8000s in 2013.
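That scaling history can be sketched as a quick back-of-the-envelope perf-per-watt calculation. This is a toy sketch, not measured data: the HD5870's 1.7x multiplier is the "70%+ over HD4870" figure from above, and the HD4890's 1.1x is my own assumption for illustration.

```python
# Back-of-the-envelope perf-per-watt from the TDP figures above.
# Performance is normalized to the HD4870 = 1.0.
cards = [
    ("HD4870 55nm", 175, 1.0),   # (name, TDP in watts, relative perf)
    ("HD4890 55nm", 190, 1.1),   # perf multiplier is an assumption
    ("HD5870 40nm", 175, 1.7),   # "70%+ over HD4870" from this thread
]

baseline_perf_per_watt = 1.0 / 175  # HD4870 as the reference point

for name, tdp_w, perf in cards:
    gain = (perf / tdp_w) / baseline_perf_per_watt
    print(f"{name}: {gain:.2f}x the HD4870's perf/watt")
```

If the 40nm jump repeated anything like that at 28nm, a 100-125W Barts-class TDP for the 7850/7870 wouldn't be far-fetched.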
 
I trust that guy, but he hasn't refuted it specifically. And we certainly haven't had it refuted over and over...

He seemed pretty clear to me. And I was one of the people who talked about those numbers pretty often before. Hopefully this will all be resolved Monday.
 
I really can't wait for this to come out. I've held back from upgrading my HD5850 since it runs everything at max anyway. Easily the best AMD card since the 9700 Pro, in my book. They should have some nice tech demos as well. :)
 
Hi

Quite a few people spend that on cards...just not on GAF.

Actually, if you look at some of the past "Post your PC" braggin' threads, you'd find that GAFers spend A LOT of money on their PCs. Tons of GTX580s, dual and even tri-fire configs. Pretty impressive to say the least.
 
Actually, if you look at some of the past "Post your PC" braggin' threads, you'd find that GAFers spend A LOT of money on their PCs. Tons of GTX580s, dual and even tri-fire configs. Pretty impressive to say the least.

I mean compared to other forums. I spent $1,200 combined on my 2 Lightning 3GB 580s. That's what I was referring to. $700 card is the highest of high-end for a single card solution. Hell at one point you could SLI traditional 580s for under $750 if you got the Zotac brand from Microcenter.
 
Actually, if you look at some of the past "Post your PC" braggin' threads, you'd find that GAFers spend A LOT of money on their PCs. Tons of GTX580s, dual and even tri-fire configs. Pretty impressive to say the least.
That's like 6 people?
 
Skimming the benchmark thread, more like 20-25 or so people with dual/triple GPU setups.
Somehow I read that in my head as 'SLi 580s'.

Yeah, it isn't that surprising considering the types of people who would run the bench and post their results + how big GAF is imo.
Still pissed at my score. I deserve that spot
 
AMD drivers have been solid for years now. The one spot where they fail is Crossfire drivers.

AHAHAHAHAHAHAH

No. :(

5850 here; I can't switch to Nvidia unless they finally come out with something comparable to AMD for high-end HD HTPC use: proper RGB and YCbCr pixel-format color output, HD audio support, and true 24p refresh rate out of the box.

I'll definitely upgrade anyway.
 
AHAHAHAHAHAHAH

No. :(

5850 here; I can't switch to Nvidia unless they finally come out with something comparable to AMD for high-end HD HTPC use: proper RGB and YCbCr pixel-format color output, HD audio support, and true 24p refresh rate out of the box.

I'll definitely upgrade anyway.

My GTX 570 does YCbCr + RGB color format, HD audio, and I believe it does 24p refresh rate as well.
 
AHAHAHAHAHAHAH

No. :(

5850 here; I can't switch to Nvidia unless they finally come out with something comparable to AMD for high-end HD HTPC use: proper RGB and YCbCr pixel-format color output, HD audio support, and true 24p refresh rate out of the box.

I'll definitely upgrade anyway.

Can you go dual card? A nice low-power, passively cooled AMD card for the HTPC (quiet, too) and a powerful Nvidia card for gaming?
 
Dual GPU is the shit.
2x GTX 570.
Too bad I can't switch to AMD; they don't have a good 3D solution compared to Nvidia 3D Vision.
 
7970 looks to be my next card.

If the rumors are true that Nvidia's high-end single-chip solution won't arrive until late 2012, that majorly sucks.

http://www.xbitlabs.com/news/mobile...r_Graphics_Chips_to_Support_DirectX_11_1.html

Unless they shrink the current Fermi and re-name it.

What is a one chip solution?

If these launch for real in early 2012 (as opposed to a paper launch), I guess I'll be using one for a new PC build.

It seems like AMD cards are the only ones that have DisplayPort as standard. I can never seem to find it on Nvidia cards. Maybe I'm not looking hard enough (or not willing to pay for the top-line cards that may have a DP output).

What's the need for DisplayPort?


I am just curious.
 
My GTX 570 does YCbCr + RGB color format, HD audio, and I believe it does 24p refresh rate as well.

It doesn't, sorry to bring you the bad news.
HD audio gets downsampled, there's no proper bitstreaming, 24p support is iffy (there are workarounds to get proper 23.976Hz instead of 24Hz, but the clock is very unstable), and YCbCr color output is fucked up and off-standard.

AMD cards from the HD 5xxx series onwards give you full bitstreaming of PAVP (protected audio/video path) lossless audio (Blu-ray's DTS-HD Master Audio and Dolby TrueHD; the one you mention, DD Plus, is a lossy codec that was used mainly on HD DVD) without downsampling (so, for example, 24-bit/96kHz audio won't be downsampled to 16-bit/48kHz).
There are a couple of other ways to get full bitstreaming, but they're not worth looking at: a few selected Intel H55/H57 boards can bitstream through the onboard video out (but not all models; ASRock boards, for example, can't), or a couple of costly and hard-to-find audio cards like the Asus Xonar HDAV and the Auzen X-Fi HT 1.3a.
Nvidia cards officially support DTS-HD MA and Dolby TrueHD as of the 460, via a post-release beta driver (260.63), but it's much more difficult to find it supported in players, and it's still very buggy.

As of today, among Nvidia cards, only the 430, 460 and 560 can properly bitstream HD audio under PAVP, and not with every driver; support is wonky.
In case you were going to ask: no, the GTX 580 does not support full HD audio bitstreaming either.
 
Can you go dual card? A nice low-power, passively cooled AMD card for the HTPC (quiet, too) and a powerful Nvidia card for gaming?

Unfortunately, going dual card would mean I'd need an HD 6xxx anyway for 3D Blu-ray playback and proper PAVP support for true, non-downsampled HD audio bitstreaming... Not really worth it to run that plus an Nvidia card for gaming on top (cost + noise + power consumption + all the annoying calibration and custom work needed for a dual-GPU setup).
 
AMD's bullshit Crossfire support is really annoying for me.
It takes a month or two until games with broken Crossfire get fixed.

I'm considering SLI for my next upgrade.
 
As of today, among Nvidia cards, only the 430, 460 and 560 can properly bitstream HD audio under PAVP, and not with every driver; support is wonky.
In case you were going to ask: no, the GTX 580 does not support full HD audio bitstreaming either.
As usual with Nvidia, the multimedia features of the lower-range chips will only show up in the next generation's high-end chips.
 
If irfan or a mod wants to update the OP, the same site also had an English version of the article in the OP.

http://www.nordichardware.com/news/...o-be-presented-in-london-on-december-5th.html



Also Fudzilla has a rumor saying that Tahiti will be $449/$549.

http://www.fudzilla.com/graphics/item/25089-amd-radeon-hd-7000-series-priced

We hear whispers of the final launch price of AMD's upcoming Radeon HD 7000 series cards, and it looks as though the faster variant of the Tahiti chip will hit a US $549 price tag, while the slower one should be priced at US $449.

Of course, bear in mind that nothing is carved in stone here, as AMD still has the option and ample time to change the price, or as they call it, adjust it, even a few days before the actual launch. The same goes for the names of the cards, as these are only finalized when partners' boxes are sent for printing. But for now, HD 7950 and HD 7970 are the names most partners are using, and US $449/549 are the prices that came from AMD.

As you already know, AMD planned to ship the cards in the first week of December, as holiday shopping is always a nice timeframe for a launch. Unfortunately, TSMC is the one to blame, as we are hearing that even the yields of the 40nm HD 6900 series are still not great, and TSMC is also the reason why the HD 7000 got pushed to an early-January shipping date, or in time for CES 2012.

In any case, we'll keep our eyes and ears ready in case AMD decides to change these anytime soon.
 
If irfan or a mod wants to update the OP, the same site also had an English version of the article in the OP.

http://www.nordichardware.com/news/...o-be-presented-in-london-on-december-5th.html



Also Fudzilla has a rumor saying that Tahiti will be $449/$549.

http://www.fudzilla.com/graphics/item/25089-amd-radeon-hd-7000-series-priced

I really, really don't see that happening.

AMD has made it a point over the last few generations to stress performance-per-dollar value. They have kept the 6970 priced well below the GTX 580 while offering similar performance. Also, I think (and I assume AMD has a similar line of thinking) that they'll sell far more 7950s/7970s at $350/$450 price points, respectively. $450/$550 would mean a lot fewer sales and would also give up one of their key advantages vs. Nvidia. I'm pretty damn sure AMD would LOVE to have the 7950 at $350 being 50%+ faster than a $450-500 GTX 580. That'd also encourage more people to go Crossfire.

AMD may not be getting great yields with 28nm yet, but I'm almost positive they will push hard to keep their excellent pricing. Besides, their whole 7xxx pricing line-up would have huge gaps in it:

7850 - $200-250
7870 - $260-300
7950 etc $450+?

Nah. I just don't think they'll pull an Nvidia on pricing.

Because that'll really suck if I have to fork out $450 for a 7950... :P
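The perf-per-dollar argument in that post can be made concrete with a toy sketch. Everything here is a placeholder: performance is normalized to the GTX 580 = 1.0, and the 1.5x for the 7950 is just the "50%+ faster" claim above applied to both price scenarios.

```python
# Toy perf-per-dollar comparison using the figures from this thread.
cards = {
    "GTX 580 @ $475": (475, 1.0),   # midpoint of the $450-500 range
    "7950 @ $350":    (350, 1.5),   # the hoped-for AMD pricing
    "7950 @ $449":    (449, 1.5),   # Fudzilla's rumored pricing
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 100:.3f} perf per $100")
```

Even at the rumored $449, the 7950 would still lead on perf/$, but the value gap vs. Nvidia narrows a lot, which is exactly the advantage the post says AMD would be giving up.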
 
I really, really don't see that happening.

AMD has made it a point over the last few generations to stress performance-per-dollar value. They have kept the 6970 priced well below the GTX 580 while offering similar performance. Also, I think (and I assume AMD has a similar line of thinking) that they'll sell far more 7950s/7970s at $350/$450 price points, respectively. $450/$550 would mean a lot fewer sales and would also give up one of their key advantages vs. Nvidia. I'm pretty damn sure AMD would LOVE to have the 7950 at $350 being 50%+ faster than a $450-500 GTX 580. That'd also encourage more people to go Crossfire.

AMD may not be getting great yields with 28nm yet, but I'm almost positive they will push hard to keep their excellent pricing. Besides, their whole 7xxx pricing line-up would have huge gaps in it:

7850 - $200-250
7870 - $260-300
7950 etc $450+?

Nah. I just don't think they'll pull an Nvidia on pricing.

Because that'll really suck if I have to fork out $450 for a 7950... :P


+1

AMD/ATI hasn't been the more expensive option for years (if not ever?). Unless the 7970 is a monster, there's no reason for them to price the card at $550+.
 
I was really hoping to upgrade my 4850 to an Nvidia card this time, but if the rumors that their high-end single cards won't be available until late 2012 are true, I don't think I can hold out that long. I might just snag a 7970 if it's ~$450.
 
This is true, but it depends how much you value your own time. I'd rather just get on with gaming rather than spend countless hours beta-testing AMD drivers.
As probably one of the few guys who has two gaming rigs with cards from both sides, this kind of fanboy rhetoric irks me. If anything, I've had more issues with instability in the two months I've had my 560Ti's than I ever did with the 5870. NVIDIA has better relationships with a lot of devs, which generally leads to better drivers for games, but they certainly aren't without faults. Remember the whole Rage issue from just a few months ago?
I've got a Phenom II X6 1075T @ 3.0GHz and a 6970 2GB.

Would one of these cards be a worthwhile upgrade, or is my processor acting as a bottleneck here?
You could get more out of your processor with a solid OC on it. In this thread, #7 and #16 have pretty much identical clocks on their 580s, but there's a significant jump going from the 1100 to even Nehalem. So you won't be gaining as much as you could with a recent Sandy Bridge proc. However, as #19 in that thread (putting me below the 1100T rig), I can tell you the guy with the 1100T has an amazing gaming experience right now. If anything, the only reason I would question the upgrade is that going from one gen's top-of-the-line card to the next is rarely a worthwhile upgrade. We'll see once benches are released.
It seems like AMD cards are the only ones that have DisplayPort as standard. I can never seem to find it on Nvidia cards. Maybe I'm not looking hard enough (or not willing to pay for the top-line cards that may have a DP output).
Apple sources GPUs for their systems from AMD, and Apple is more or less heading the move to DisplayPort. NVIDIA doesn't gain very much (currently) from going out of its way to pay for the tech and integrate it into their designs.
Come on ATI, make me upgrade the GTX480. MAKE MEEEEEEE
(It's AMD, ATI doesn't exist anymore). 480 is a pretty amazing card. The fact that you get an awesome GPU AND a space heater in one package is a killer deal. :P
Yeah, it isn't that surprising considering the types of people who would run the bench and post their results + how big GAF is imo.
Still pissed at my score. I deserve that spot
Well, with the 7xxx just over the horizon, maybe you'll grow a pair and throw some more voltage on your 6950s. If you brick them, you'll have an awesome replacement card.
What is a one chip solution? What the need for the display port?
Some cards, like the 6990 and GTX 590, are actually dual-chip cards. They essentially take the parts of two 6970s or 580s (respectively), downclock them, and put them on the same PCB instead of having two separate cards.

A lot of new high end 120Hz monitors require displayport to function properly. Same with the Apple Cinema Display.
AMD's bullshit Crossfire support is really annoying for me.
It takes a month or two until games with broken Crossfire get fixed.

I'm considering SLI for my next upgrade.
Yep. I went 5870 Crossfire myself. I was playing APB: Reloaded, Brink, and Heroes of Newerth at the time. Neither APB nor HoN ever received Crossfire profiles, and it took about two weeks after the launch of Brink to finally get one. Considering I had to disable one of my two cards for the games I was playing, I ended up giving it away to a friend. Other than obvious AAA games, it seems like the programmers throw a dart at a board to pick which games get Crossfire support. That experience is what pushed me to go green for my recent build.
 
I really, really don't see that happening.

AMD has made it a point over the last few generations to stress performance-per-dollar value. They have kept the 6970 priced well below the GTX 580 while offering similar performance. Also, I think (and I assume AMD has a similar line of thinking) that they'll sell far more 7950s/7970s at $350/$450 price points, respectively. $450/$550 would mean a lot fewer sales and would also give up one of their key advantages vs. Nvidia. I'm pretty damn sure AMD would LOVE to have the 7950 at $350 being 50%+ faster than a $450-500 GTX 580. That'd also encourage more people to go Crossfire.

AMD may not be getting great yields with 28nm yet, but I'm almost positive they will push hard to keep their excellent pricing. Besides, their whole 7xxx pricing line-up would have huge gaps in it:

7850 - $200-250
7870 - $260-300
7950 etc $450+?

Nah. I just don't think they'll pull an Nvidia on pricing.

Because that'll really suck if I have to fork out $450 for a 7950... :P

I don't see why they wouldn't charge high prices.

-Presumably no competition on the high end
-Likely shortages based on manufacturing trouble
-AMD not doing great financially right now
 