AMD Radeon 7000 series to be unveiled Dec 5 - first with 28nm again

Yes and no. The mid-range 7000 series is essentially going to be 6950s and 6970s on 28nm, so they'll be much cooler, overclock better, use far less power, and be smaller/cheaper. Does that make sense as to why it's yes/no?
I'm not quite following you. The high end 6000 series will be slightly improved and released as the mid 7000 series, while the high end 7000 series will be 'new' cards.
 
By the rumors:

HD7850 (~HD6950)--> 90w
HD7870 (~HD6970)--> 105w

Hopefully this news gets Nvidia talking about their new cards. I want to know how much longer I have to wait; no way in hell am I buying a Radeon. I want the most trouble-free computing experience possible.

AMD and Nvidia both have driver problems from time to time. I've used AMD GPUs since 2007, and most of my close friends are happy with their Radeon GPUs. It seems some people are born with some kind of hardware curse where they face every hardware problem a human can face.

I know a friend who bought ten 250GB HDDs, and eight of them died within a week.
 
AMD driver problems are not overblown, not bullshit, not fanboyism. It was always a talking point back in the day during the height of the Nvidia vs ATI war, and I treated it as such and always just went for the best value in any given hardware cycle, but my personal experiences with AMD in the past year have completely soured me on their products.

Unstable, crashing drivers, broken day-one game performance, wonky issues with HDMI audio that change with every driver update but refuse to actually work properly. I am not going by what other people are saying, I'm going by my own experiences. AMD will need 1-2 years of a clean track record (and a price:performance advantage) before I give them another shot.
While some cards have been better than others, for the most part ATI/AMD drivers have taken longer to mature and stabilize on new chipsets for a long while now.

Back when I did hardware testing full-time for games, there were almost always more driver issues found on new ATI cards we'd receive compared to Nvidia and when we reported issues we'd usually get a new beta driver with a fix quicker from Nvidia than from ATI. I wouldn't be surprised if this is still the case today, as I said earlier Nvidia's developer support is top-notch.
 
That would be GODLY! What was the power draw on the 6000 equivalent again?

Almost 2x as much.

[image: power_peak.gif — peak power draw comparison chart]
 
Are they going back to the 5000-style numbers? The way they named their cards was a pain in the ass with the 6000 series. I know it isn't too confusing, but every time I see their numbers, I have to stop and rethink whether it's the class of cards I'm thinking of.
 
While some cards have been better than others, for the most part ATI/AMD drivers have taken longer to mature and stabilize on new chipsets for a long while now.

Back when I did hardware testing full-time for games, there were almost always more driver issues found on new ATI cards we'd receive compared to Nvidia and when we reported issues we'd usually get a new beta driver with a fix quicker from Nvidia than from ATI. I wouldn't be surprised if this is still the case today, as I said earlier Nvidia's developer support is top-notch.
Support is still an issue for AMD, but they tend to do pretty well on high profile releases after the post launch drivers get released.

Nvidia is typically more proactive. But, on the other hand, Nvidia has released drivers that damaged hardware before.
 
The whole ATI driver thing may not be a problem of the past, but they do at least try to fix problems games have running on their cards. Nvidia has deep pockets, so they'll send an engineer to a developer to squeeze every bit of performance out of a particular game. Cross-promotion happens far more with their stuff because they pay for it. Not saying this is wrong; I think ATI should be doing this more, perhaps spend a little money to cross-promote more games on ATI setups, as well as getting driver fixes out the same day or earlier for big-name games. The last ATI-promoted game I can think of was DiRT 2 with the 5XXX series cards, back in the fall of 2009. Nvidia has done like a half dozen major releases since then and tends to get their logo pasted into every other PC game. I suppose this is partially due to their PhysX acquisition, but still.

I kind of prefer supporting ATI, even though it usually feels like the scruffy underdog, and I'm not even sure why. The only time I bought an Nvidia product it happened to be the 8800 GTS, which was one helluva card. So I'm not opposed to giving money to Nvidia; it's just that last product cycle I was under the impression they didn't do anything that special, especially for the price.
 
I don't know if it's about the drivers; I always thought the games were just better optimized for Nvidia cards (first because of the support Nvidia gave via "The Way It's Meant to Be Played", and then it just kinda became standard, I guess).
Add LOTR: War in the North & Saints Row 3 to recent releases completely banjaxed on ATI cards.
Saints Row 3 was one of those ATI-optimised (or authorised? or endorsed?) games. It still runs like shit on my PC and, after several profile updates, still runs better with Crossfire disabled. Fuck 'em. Going nVidia next year, purely based on ATI/AMD's pathetic driver support.
Unless you count the driver problems they've had just this year, with Dead Island, and Crysis 2, and Batman Arkham City, and Battlefield 3, and RAGE...want me to keep going?
Saints Row 3, F1 2010/2011 (though admittedly there's some generally shonky coding there), Skyrim, TDU2...

Seriously, waiting to see what the best single GPU nVidia card will be and if it's close to or on par with AMD's equivalent, I'll be getting that. Shitty software is worse than shitty hardware.
 
AMD driver problems are not overblown, not bullshit, not fanboyism. It was always a talking point back in the day during the height of the Nvidia vs ATI war, and I treated it as such and always just went for the best value in any given hardware cycle, but my personal experiences with AMD in the past year have completely soured me on their products.

Unstable, crashing drivers, broken day-one game performance, wonky issues with HDMI audio that change with every driver update but refuse to actually work properly. I am not going by what other people are saying, I'm going by my own experiences. AMD will need 1-2 years of a clean track record (and a price:performance advantage) before I give them another shot.

Things were a lot worse back in the day... I feel ATI/AMD drivers have dramatically improved, as has day-one support for top games.

There was a time you could wait a month, if ever, for ATI to fix something.
 
AMD driver problems are not overblown, not bullshit, not fanboyism. It was always a talking point back in the day during the height of the Nvidia vs ATI war, and I treated it as such and always just went for the best value in any given hardware cycle, but my personal experiences with AMD in the past year have completely soured me on their products.

Unstable, crashing drivers, broken day-one game performance, wonky issues with HDMI audio that change with every driver update but refuse to actually work properly. I am not going by what other people are saying, I'm going by my own experiences. AMD will need 1-2 years of a clean track record (and a price:performance advantage) before I give them another shot.

What kind of issues are you having with HDMI audio? Are you going through a receiver as well?
 
What kind of issues are you having with HDMI audio? Are you going through a receiver as well?

Started out, no problems.

Eventually, for several months I'd need to disable the driver and re-enable every time I started windows or it wouldn't work. Sometimes if I muted sound the driver would just shit itself and stay muted, and I wouldn't get sound back again until I restarted my computer.

That changed in an eventual driver update. Now, occasionally I need to go to Sound --> Playback and *click* on AMD HDMI Output to get sound. Not enable or disable, just click on it.


The disable/reenable scenario persisted through a receiver brand change from Onkyo to Denon. Gonna blame AMD here.
 
That changed in an eventual driver update. Now, occasionally I need to go to Sound --> Playback and *click* on AMD HDMI Output to get sound. Not enable or disable, just click on it.


I'm having the same issue. Whenever I switch receiver inputs and then back to the PC HDMI input, I have to right-click the sound icon and select Playback Devices for the card to recognize that HDMI audio is available. It's been annoying. I was under the impression that it was a Windows 7 issue, but apparently it's my 5850.
 
Haven't had an Nvidia card since the GF1 DDR. I've had a few issues with drivers over the years, but seeing as how I like tinkering with computers, most haven't bugged me.

I think AMD will eventually get their drivers straightened out, but I can easily understand how people would rather buy a product where the games just work as they should at release. I'll keep a close eye on this new series, and let's hope it is significantly more powerful/feature-rich/cooler than what we have now, to really drive the competition.
 
This guy has a terrible track record so take this info FWIW

Radeon HD 7000 Revealed: AMD to Mix GCN with VLIW4 & VLIW5 Architectures

Starting next week, AMD is going to hold Tech Days in several destinations around the globe, such as London and Paris, during which the company is going to present the 28nm Radeon HD 7000 series.

There are a lot of rumors flying around the web, some of which are spun by AMD themselves to raise confusion, as the Radeon HD 7000 series is going to mix the existing VLIW4 and VLIW5 architectures with the "Graphics Core Next" (GCN) architecture, introduced during June's Fusion Development Summit held in Bellevue, WA.

Radeon HD 7000 Series with the old VLIW4 and VLIW5 Architectures
A couple of years ago, AMD and favorable media were all over NVIDIA for mixing different GPU architectures within the same product line. Then, with the Radeon HD 6000 series, all of a sudden nobody questioned why AMD mixed two distinct GPU architectures within a single series (the new VLIW4 architecture only powered the three high-end parts). With the Radeon HD 7000 series, the situation is set to become even more complicated, with AMD mixing no fewer than three distinct GPU architectures within a single generation of products.

Given the recent cancellation of the 28nm Krishna and Wichita APUs, AMD will rebrand the Brazos 2.0 APU platform as the Radeon HD 7200 and 7300 series; for instance, a rebranded AMD E-Series APU will be powered by a Radeon HD 7200 or 7300 series GPU (all based on the Evergreen VLIW5 architecture).

The higher-end Trinity APU, heir to the successful Llano A-Series APU, will be powered by a Devastator GPU core based on the contemporary "Northern Islands" VLIW4 architecture, with product names such as Radeon HD 7450(D), 7550(D), and so on.

When it comes to discrete parts, those codenamed Cape Verde (HD 7500, 7600, and 7700) and Pitcairn (HD 7800) are all based on the VLIW4 architecture. The "Graphics Core Next" architecture is reserved for the 7900 series alone. Desktop parts are codenamed after Southern Islands, while mobile parts are codenamed after parts of London (read: Cape Verde becomes Lombok, Pitcairn becomes Thames, etc.).

If you compare the VLIW4-based HD 6900 and the upcoming HD 7800 series, there isn't much difference between the two. According to our sources, the HD 7800 "Pitcairn" is a 28nm die shrink of the popular HD 6900 "Cayman" GPU with minor performance adjustments. This will bring quite a bit of compute power into the price-sensitive $199-$249 bracket, and we expect a lot of headaches for NVIDIA in that respect.

Welcome Graphics Core Next: Powering the Tahiti and New Zealand (HD 7900)
AMD spoke about Graphics Core Next (GCN) quite openly, a move we can only commend them for. During his keynote session in June, Eric Demers of AMD explained the reasoning behind the move to GCN: compute is graphics, graphics is compute. There is no doubt that the future of GPUs lies in enhanced compute capabilities, and we already hear from game developers who are using the computational power of the GPU to create detail inside games instead of shipping gigabytes and gigabytes of textures.

The new GCN architecture brings numerous innovations to GPU design, of which we see x86 virtual memory as perhaps the most important. While GPU manufacturers have promised functional virtual memory for ages, this is the first time we're seeing a working implementation. This is not a marketing gimmick: the IOMMU is a fully functional GPU feature, supporting page faults and over-allocation, and even accepting 64-bit x86 memory pointers for 100% compatibility with 64-bit CPUs. Virtual memory is going to be a large part of next-gen Fusion APUs (2013) and FireStream GPGPU cards (2012), and we can only commend the effort made in making this possible.

All of this required expanding the GPU memory controller by two additional 64-bit channels for a grand total of 384 bits, identical to the GeForce GTX 580, for example. However, AMD's memory timings are much more aggressive than conservative NVIDIA's, so expect memory clocks to remain higher on AMD GPUs.

A rumor recently exploded that the HD 7900 series would come with Rambus XDR2 memory. Given that AMD has its own memory development team and was the driving force behind the creation of the GDDR3, GDDR4, and GDDR5 memory standards, we were skeptical of those rumors.

Bear in mind that going with Rambus is not an easy decision, as a lot of engineers inside AMD flat out refuse to even consider using Rambus products due to the company's litigious behavior. However, our sources tell us that AMD is frustrated that the DRAM industry didn't make good on AMD's very large investment in creating two GDDR5 memory standards: single-ended (S.E.) GDDR5 and differential GDDR5. Thus, the company applied pressure on the memory industry by bridging GDDR5 and the future memory standard with XDR2. The production Tahiti part will utilize GDDR5 memory, though.

Is AMD going to continue investing in future memory standards? We would say yes, but with all the changes that have happened, it just might take the executive route of utilizing available market technologies rather than spending time and money on future iterations of GDDR memory. After all, AMD recently reshuffled its memory design task force. In any case, differential GDDR5 promises very interesting bandwidth figures, and those are something AMD wants to utilize "as soon as possible".

AMD is pushing forward with its Fusion System Architecture (FSA), and the goals of that architecture will take some time to implement; we won't see a full implementation before 2014. However, Southern Islands brings several key features which AMD lacked compared to NVIDIA's Fermi and the upcoming Kepler architecture.

The GPU itself replaces the SIMD array with MIMD-capable Compute Units (CUs), which bring support for C++ in the same way NVIDIA's Fermi did, but AMD went beyond Fermi's capabilities with the aforementioned IOMMU. There is also a link between CPU and GPU power management, which should reduce power consumption (currently, a single action the GPU takes will wake up the CPU, even something as simple as a screen refresh).

As you can see in the image above, a single CU block consists of a single scalar unit and 64 vector units, which are fed through multiple layers of cache. Overall, each Compute Unit comes with 16KB of L1 data cache and 64KB of LDS memory (i.e. scratch memory), with an additional 48KB of L1 cache shared between every four CU blocks. Each CU also connects to 64KB of dedicated L2 cache.

With Tahiti packing 32 Compute Units in its maximum configuration, a 32-CU GPU with 2048 processing cores features almost 5MB of on-die memory: 512KB of L1 data cache, 384KB of shared L1 cache, 2MB of LDS, and 2MB of L2 cache. This is a record amount of cache for a GPU so far, and you can expect the trend to continue.
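Those totals can be sanity-checked from the per-CU figures given in the paragraph above (16KB L1 data and 64KB LDS per CU, 48KB L1 shared per group of four CUs, 64KB L2 per CU). A quick arithmetic sketch:

```python
# Sanity-checking the rumored Tahiti on-die memory totals (32 CUs),
# using only the per-CU figures quoted in the article.
cus = 32
l1_data = cus * 16          # KB: 16KB L1 data cache per CU
lds     = cus * 64          # KB: 64KB Local Data Share per CU
shared  = (cus // 4) * 48   # KB: 48KB L1 shared per group of four CUs
l2      = cus * 64          # KB: 64KB L2 per CU

print(l1_data, lds, shared, l2)              # 512 2048 384 2048 (KB)
print((l1_data + lds + shared + l2) / 1024)  # 4.875 MB -- "almost 5MB"
```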

AMD adopted a smart compute approach. Graphics Core Next is a true MIMD (Multiple Instruction, Multiple Data) architecture. With the new design, the company opted for "fat and rich" processing cores that occupy more die space but can handle more data. AMD cites loading the CU with multiple command streams, instead of the conventional GPU model of "fire a billion instructions off, wait until they all complete". A single Compute Unit can handle 64 FMAD (Fused Multiply-Add) operations or 40 SMT (Simultaneous Multi-Threading) waves. Wondering how many MIMD instructions GCN can take? Four threads. Four-thread MIMD or 64 SIMD instructions, your call. As Eric explained, Southern Islands is a "MIMD architecture with a SIMD array".

These Compute Units are paired with conventional fixed-function hardware. AMD tried the non-fixed-function route with the R600 in 2007 (Radeon HD 2000 series), and after that experiment the company saw no value in avoiding fixed-function hardware. Thus, Southern Islands will continue to have up to 64 fixed raster operations units (ROPs), Z units, up to 128 texture memory units, FSAA logic, etc.

Tahiti becomes HD 7950 and 7970, New Zealand becomes HD 7990
Now that we're properly introduced to the GPU core, the time has come to pay more attention to the lineup itself. Given that the memory bus was extended to 384 bits, i.e. the same as the GeForce GTX 580, 3GB of GDDR5 is used across the board, and we would not exclude a 1.5GB or even 896MB "7930" part appearing as the number of partially functional GPUs increases.

AMD kept the unified clock concept, and given that the Radeon HD 7970 is based on the fully configured "Tahiti XT" GPU, its 2048 cores (32 Compute Units) operate at a 1GHz clock. The 3GB of GDDR5 memory operates at 1.37GHz in quad-data-rate mode ("effective 5.5GHz"). This results in record video memory bandwidth for a single GPU: 264GB/s.

The HD 7950 is based on "Tahiti Pro" and packs 30 Compute Units for 1920 cores operating at 900MHz. The number of ROPs decreases to 60, while texture units naturally drop to 120 (as every CU connects to 2 ROPs and 4 TMUs). Our sources did not disclose whether the memory controller is still 384-bit or cut down to 256-bit, but the memory clock was decreased to 1.25GHz, i.e. the same clock as previous-gen models. Should the 384-bit controller stay, that clock would be good for 240GB/s of bandwidth.
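For reference, both bandwidth figures fall straight out of bus width times effective data rate. A minimal sketch of the arithmetic (peak GB/s = bus bits ÷ 8 × effective GT/s), using the rumored numbers above:

```python
def gddr5_bandwidth_gbs(bus_bits, effective_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
    return bus_bits / 8 * effective_gtps

print(gddr5_bandwidth_gbs(384, 5.5))  # 264.0 -> rumored HD 7970
print(gddr5_bandwidth_gbs(384, 5.0))  # 240.0 -> rumored HD 7950, if 384-bit
print(gddr5_bandwidth_gbs(256, 5.0))  # 160.0 -> HD 6970, for comparison
```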

Both products are expected to be released at CES 2012 in Las Vegas, NV, occupying the $349-449 price bracket. Those additional gigabytes of memory (and processing cores) will certainly cost a lot of $$$.

As for the dual-GPU "New Zealand", its 6GB of GDDR5 is expected to be clocked at the same level as on the HD 6990/7970, meaning you will be getting full performance out of the dual-GPU part.

Unlike the HD 7950 and HD 7970, the Radeon HD 7990 will debut in March 2012, and the target price is the same as its predecessor's original price: $699.
 
Started out, no problems.

Eventually, for several months I'd need to disable the driver and re-enable every time I started windows or it wouldn't work. Sometimes if I muted sound the driver would just shit itself and stay muted, and I wouldn't get sound back again until I restarted my computer.

That changed in an eventual driver update. Now, occasionally I need to go to Sound --> Playback and *click* on AMD HDMI Output to get sound. Not enable or disable, just click on it.


The disable/reenable scenario persisted through a receiver brand change from Onkyo to Denon. Gonna blame AMD here.

I use the HDMI for both audio and video from both my GTX 470 SLI in my gaming box and my GTS 450 from my HTPC. It works flawlessly from both sources, through my Denon receiver, to my speakers and my HDTV. I never need to fuck with the Nvidia HD Audio, ever. It just works. I have successfully outputted uncompressed LPCM 5.1 over HDMI in addition to bitstreaming DTS-HD Master Audio and Dolby TrueHD over HDMI from my PCs.

Like most receivers, the Denon doesn't properly report when it's set to RGB Full Range as opposed to Limited Range but creating a custom resolution on my HTPC fixes the black levels since custom resolutions in the Nvidia Control Panel always treat the output device as a computer monitor. This is something that consistently fucks up output from both Nvidia and AMD but at least Nvidia lets you work around it this way.

I can also create a custom 24Hz refresh rate that is closer to 23.976Hz, since NONE OF THE BIG 3 (Nvidia, AMD, and Intel) are capable of making a video card which correctly outputs 23.976Hz. This is so fucking annoying with any HTPC; the closest Nvidia can get with a custom resolution is 23.978Hz. That's still closer than anything AMD and Intel can achieve.
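For context, the exact film-over-NTSC rate is 24000/1001 Hz, so the consequence of only reaching 23.978Hz is easy to quantify. This is just the arithmetic, not measured card behavior:

```python
from fractions import Fraction

target = Fraction(24000, 1001)   # the exact NTSC film rate, ~23.976 Hz
actual = 23.978                  # closest the poster says a custom res gets

print(float(target))             # 23.976023976023978

# One full frame of drift accumulates every 1/|actual - target| seconds,
# at which point the player must drop or repeat a frame (visible judder).
drift_seconds = 1 / abs(actual - float(target))
print(round(drift_seconds))      # ~506 -> a judder hitch roughly every 8.4 minutes
```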
 
7990
Release in March '12
Uses 2 Tahiti (7970) cores
6GB GDDR5
$699

7970
GCN (new arch)
2048 cores @ 1GHz
384-bit
3GB GDDR5 @ 5.5GHz for 264GB/s
Release @ CES (Jan '12)
Price $449

7950
GCN (new arch)
1920 cores @ 900MHz
384-bit or 256-bit?
Release @ CES (Jan '12)
Price $349

7800 series
VLIW4 arch (28nm die shrink of 6970)
$199-$249
 
Who would pay $700 for a video card? The guy that has to play everything maxed out at insane resolutions already has like three 580s anyway.
 
Who would pay $700 for a video card? The guy that has to play everything maxed out at insane resolutions already has like three 580s anyway.

The highly impractical dual-GPU, single-card products aren't really meant for people to buy anyway. It's why production runs are limited to a few thousand at most. They exist to win benchmarks in reviews and make Nvidia or AMD look good.
 
Started out, no problems.

Eventually, for several months I'd need to disable the driver and re-enable every time I started windows or it wouldn't work. Sometimes if I muted sound the driver would just shit itself and stay muted, and I wouldn't get sound back again until I restarted my computer.

That changed in an eventual driver update. Now, occasionally I need to go to Sound --> Playback and *click* on AMD HDMI Output to get sound. Not enable or disable, just click on it.


The disable/reenable scenario persisted through a receiver brand change from Onkyo to Denon. Gonna blame AMD here.


I'm going through a Denon receiver as well. The most annoying issue I had was that when I powered on the receiver, it would make my flat panel the main display even though it wasn't turned on. The issue seems to be resolved with the latest drivers, though.

I'd like to switch to NVIDIA as well, but I don't think their cards would work well with my setup. I have two monitors connected via DVI and DisplayPort. I usually like to turn on my receiver and listen to music while working. I don't think that's possible with NVIDIA unless you have a sound card. The latest NVIDIA cards don't seem to be able to bitstream HD audio either. Just thought I'd let you know the issues you might run into if you have a similar setup.

Edit: It looks like Unknown Solider was able to bitstream DTS-HD Master Audio and Dolby TrueHD. I was looking into getting a 570 and read that it was not capable of doing so. Maybe it depends on the manufacturer?
 
I'd like to switch to NVIDIA as well, but I don't think their cards would work well with my setup. I have two monitors connected via DVI and DisplayPort. I usually like to turn on my receiver and listen to music while working. I don't think that's possible with NVIDIA unless you have a sound card. The latest NVIDIA cards don't seem to be able to bitstream HD audio either. Just thought I'd let you know the issues you might run into if you have a similar setup.

Ummm, the only Nvidia Fermi cards which can't bitstream HD audio are the original GTX 470 and 480, all subsequent cards can do it. My GTS 450 is HDMI 1.4 compliant and bitstreams HD audio in addition to supporting Blu-ray 3D.

You can output just audio over Nvidia's HDMI output: simply set the receiver as your second monitor (even though the HDTV connected to it is off) and use the Nvidia HD Audio as the default audio device. You have a phantom monitor being output to, but that doesn't hurt anything. I used to do this a lot before I moved all my PC gaming over to my comfy couch and HDTV.

edit: This is kinda complicated. The high-end Fermis don't support it: the 470/480 and 570/580 can't bitstream. The low-end Fermis can: GT 4xx/GTS 4xx and GTS 5xx. This is because Nvidia's lower-end Fermis use an entirely different GPU design. However, you can still decode DTS-HD MA and Dolby TrueHD to uncompressed multi-channel LPCM and output it that way over HDMI. This is the same problem the original phat PS3s have: they can't bitstream DTS-HD MA and TrueHD, but they can decode to LPCM and output the raw LPCM instead.
 
This guy has a terrible track record so take this info FWIW

I don't see any of that being outlandish considering that it all lines up with recent leaks, rumors, and roadmaps. We'll know more in 5 days, though.

I would be surprised if AMD doesn't launch 7970/7950 at CES. And if 7950-7970 launch at $350/$450, resisting a purchase will be very hard. 7950 for $350, even if it's only 50% faster than a GTX 580, would be a steal. I don't see the 7970 being worth the extra $100 price premium.
 
They already have the price:performance advantage over Nvidia.

This is true, but it depends how much you value your own time. I'd rather just get on with gaming rather than spend countless hours beta testing AMD drivers.
 
What's with the bad driver talk on AMD's side?
I still have my HD 4870 and no problems whatsoever.
If you had two of them, you wouldn't be here saying the same thing. (GTX 460 SLI here in one machine, 4850s in Crossfire in another, and the latter has had Crossfire or just general driver-related problems with half the big releases this fall.)

AMD really has to step up their game with regards to on-time, solid driver releases to match the big game launches.
 
Started out, no problems.

Eventually, for several months I'd need to disable the driver and re-enable every time I started windows or it wouldn't work. Sometimes if I muted sound the driver would just shit itself and stay muted, and I wouldn't get sound back again until I restarted my computer.

That changed in an eventual driver update. Now, occasionally I need to go to Sound --> Playback and *click* on AMD HDMI Output to get sound. Not enable or disable, just click on it.


The disable/reenable scenario persisted through a receiver brand change from Onkyo to Denon. Gonna blame AMD here.

This kind of shit I really hate.

It works flawlessly, then BAM, it stops working for no reason.
 
Started out, no problems.

Eventually, for several months I'd need to disable the driver and re-enable every time I started windows or it wouldn't work. Sometimes if I muted sound the driver would just shit itself and stay muted, and I wouldn't get sound back again until I restarted my computer.

That changed in an eventual driver update. Now, occasionally I need to go to Sound --> Playback and *click* on AMD HDMI Output to get sound. Not enable or disable, just click on it.


The disable/reenable scenario persisted through a receiver brand change from Onkyo to Denon. Gonna blame AMD here.

I have this issue and it drives me mad. HD 5850 into an Onkyo receiver, and it seems like every time I leave it alone for more than ten minutes, I have to follow the steps you describe to get sound back.

Really wanted to get on board with a 7XXX card, but I'm seriously thinking of switching to Nvidia instead. It doesn't seem like a big issue, but it happens so often that it's winding me up.
 
I have this issue and it drives me mad. HD 5850 into an Onkyo receiver, and it seems like every time I leave it alone for more than ten minutes, I have to follow the steps you describe to get sound back.

Really wanted to get on board with a 7XXX card, but I'm seriously thinking of switching to Nvidia instead. It doesn't seem like a big issue, but it happens so often that it's winding me up.

I've got a GTX 460. One HDMI goes to my TV or projector via a switch. The other goes into my Onkyo amp for some lovely 5.1 goodness.
Once set, I've never had to change it. It just works.
 
I've got a GTX 460. One HDMI goes to my TV or projector via a switch. The other goes into my Onkyo amp for some lovely 5.1 goodness.
Once set, I've never had to change it. It just works.

Mine's just a single HDMI into the receiver, and when it works, it's golden.

When it works. I'm trying to figure out what the icon means when the sound is 'suspended'. It's not disabled and no other audio device is selected; it's just sort of suspended.

I'm joining your camp, I reckon. I'll miss updating my drivers through Steam though :(
 
Mine's just a single HDMI into the receiver, and when it works, it's golden.

When it works. I'm trying to figure out what the icon means when the sound is 'suspended'. It's not disabled and no other audio device is selected; it's just sort of suspended.

I'm joining your camp, I reckon. I'll miss updating my drivers through Steam though :(

I like the AMD hardware, but the driver side leaves a lot to be desired. The constant issues Radeon users seem to face every time a big game comes out tell me that AMD doesn't have a good handle on driver development.

With nVidia, usually when a big new game comes out, the drivers are updated and out the same day.
 
New info on 7970 XT

Did some digging, found this:

Sentential at XtremeSystems said:
Originally Posted by Sentential @ XS summary
Radeon HD 7970 XT:
2048 processing cores @ 1GHz
128 Texture Memory Units
3GB GDDR5 (384-bit) @ 5.5GHz
MSRP $450

Radeon HD 7970 PRO:
1920 processing cores @ 900MHz
120 Texture Memory Units
1.5GB GDDR5 (384/256-bit) @ 5GHz
MSRP $350

For Comparison:

HD 6970
1536 Stream Processors @ 880MHz
96 Texture Units
1GB GDDR5 (256-bit) @ 5GHz

Also

7970 XT against 6970 would be:
33% more shaders
33% more texture units
13.6% higher clock speed
65% higher memory bandwidth
51% more GFLOPS

Add the somewhat higher efficiency of GCN and we're at a normal ~70% jump, like from 4870 to 5870.

Looks like I was pretty close: One 7970 should be almost as fast as 2x 6970's in Crossfire. Damn.
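The deltas in the quote above can be reproduced with quick arithmetic from the listed specs (memory bandwidth taken as bus bits ÷ 8 × effective clock, peak GFLOPS as cores × 2 FMA ops × GHz):

```python
# Rumored HD 7970 XT vs. HD 6970, using the figures quoted above.
def pct_more(new, old):
    return round((new / old - 1) * 100, 1)

print(pct_more(2048, 1536))                       # 33.3 -> shaders
print(pct_more(128, 96))                          # 33.3 -> texture units
print(pct_more(1000, 880))                        # 13.6 -> core clock (MHz)
print(pct_more(384 / 8 * 5.5, 256 / 8 * 5.0))     # 65.0 -> memory bandwidth
print(pct_more(2048 * 2 * 1.0, 1536 * 2 * 0.88))  # 51.5 -> peak GFLOPS
```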
 
New info on 7970 XT

Did some digging, found this:



Also



Looks like I was pretty close: One 7970 should be almost as fast as 2x 6970's in Crossfire. Damn.
Man, fuck this console-style artificial SKU grading of RAM. There is zero reason for any kind of performance card in 2012 to have less than 2GB.
 
Modified your quote slightly with my calculations ..

7970XT against 6970 would be:

127% More Pixel Fillrate
51% More Texture Fillrate
65% Higher Memory Bandwidth
51% more processing power

Add a bit higher Efficiency of GCN to it and we are at a normal 70% jump like from 4870 to 5870.
Just based on this alone, it seems easy to pull off the 70% jump. However, I still can't put too much faith in these specs yet.

The other concerning thing about these rumors is that we have heard nothing about power consumption. Usually AMD rumors do mention consumption numbers, since AMD has much better efficiency than Nvidia.
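For what it's worth, the fillrate numbers in the modified quote also check out arithmetically, assuming 64 ROPs for the 7970 (the article's "up to 64" figure) against the HD 6970's 32 ROPs:

```python
# Reproducing the modified deltas; the 64-vs-32 ROP counts are an assumption
# taken from the article text above, not confirmed specs.
def pct_more(new, old):
    return round((new / old - 1) * 100, 1)

print(pct_more(64 * 1.0, 32 * 0.88))           # 127.3 -> pixel fillrate (ROPs x GHz)
print(pct_more(128 * 1.0, 96 * 0.88))          # 51.5  -> texture fillrate (TMUs x GHz)
print(pct_more(384 / 8 * 5.5, 256 / 8 * 5.0))  # 65.0  -> memory bandwidth
```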
 
Modified your quote slightly with my calculations ..


Just based on this alone, it seems easy to pull off the 70% jump. However I still cant put too much faith in these specs yet.

The other concerning thing about these rumors is that we have heard nothing about the power consumption. Usually AMD rumors do mention the consumption numbers since AMD has much better efficiency compared to Nvidia.

Ryoma-Echizen posted some rumored power consumption numbers. Not sure where he got them, though. If those rumors are true, no freaking way am I going to buy Nvidia (I'm an Nvidia owner, BTW). Driver problems can be fixed through updates, but power consumption can't, unless you buy a new card.
 
I'm liking this. I will stick with my 6970 for at least the next year, as my main income tax spending will go to Vita stuff. Maybe by the time 2013 hits there will be a $400-500 upgrade possibility for me. I will have to see how it goes, but I'm more than happy with what my current GPU is doing for me.

I like the idea of 2x 6970 performance on a single GPU, but maybe by 2013 there will be even better GPUs. I don't know if my 600W PSU will be able to handle it, though.

Nothing is guaranteed, though, as I'm not biased; I'm looking at NVIDIA GPUs too and am open to a change.
 
I use a Mobility HD 5850, and there hasn't been a single game I can't run flawlessly. The only game that gives me trouble is BF3 if I play too long on high settings, and that's probably due to my weak-ass CPU.
 
I've got a Phenom II X6 1075T @ 3.0GHz and a 6970 2GB.

Would one of these cards be a worthwhile upgrade, or is my processor acting as a bottleneck here?
 