I've got a Phenom II X6 1075T @ 3.0GHz and a 6970 2GB.
Would one of these cards be a worthwhile upgrade, or is my processor acting as a bottleneck here?
We probably won't know until we see the card's actual specs.
Ryoma-Echizen posted some rumored power consumption numbers. Not sure where he got them, though. If what those rumors say is true, there's no freaking way I'm going to buy Nvidia (I'm an Nvidia owner, BTW). Driver problems can be fixed through updates, but power consumption can't unless you buy a new card.
We probably won't know until we see the card's actual specs.
OK, I had figured that by comparing my current CPU with my current GPU, someone might be able to say whether I should build a Sandy Bridge/Ivy Bridge system before I spend more on graphics.
That suspect FUD from yesterday aside, this was supposed to be one of the points of emphasis with Nvidia's new cards. Under the currently proposed scenario, AMD is obviously the clear-cut winner (and I'll go back if I upgrade next spring). But let's not draw any conclusions yet. Lots of people have agendas, and I wouldn't trust anything until we get out of the speculative stage.
I don't think so. I would feel a little more inadequate if I didn't have my 2500K at 4.5GHz, though.
If your CPU becomes obsolete with upcoming hardware, then something is wrong and tech is just moving too fast. Either way, we have a rock-solid line of CPUs from Intel right now, and even your hardware won't necessarily be a bottleneck unless a certain game makes it so.
Also don't ignore what LiquidMetal14 just said. He has a very good point.
lol since when is tinypic banned?
When one of the mods found out an older tinypic link changed to show an NSFW image.
Link1: A poster on B3D confirms that the info is incorrect. For all we know, the XDR bit could be wrong, or the memory configuration, etc.
Link2: Same as above, redundant and useless post.
Link3: Irrelevant post on B3D that contains guesstimated specs.
lol?
The only thing we're certain of at this point is that XDR will not be used.
Search "neliz"'s post history on B3D. They're fake.
I trust that guy, but he hasn't refuted it specifically. And we certainly haven't had it refuted over and over...
Who would pay $700 for a video card?
Hi
Quite a few people spend that on cards...just not on GAF.
The 7970 will be exactly the performance of two 6870s. No more, no less.
Actually, if you look at some of the past "Post your PC" braggin' threads, you'd find that GAFers spend A LOT of money on their PCs. Tons of GTX580s, dual and even tri-fire configs. Pretty impressive to say the least.
That's like 6 people?
Skimming the benchmark thread, more like 20-25 or so people with dual/triple GPU setups.
Somehow I read that in my head as 'SLi 580s'.
AMD drivers have been solid for years now. The one spot where they fail is Crossfire drivers.
AHAHAHAHAHAHAH
No.
5850 here; I can't switch to Nvidia unless they finally come out with something comparable to AMD for high-end HD HTPCs, with proper RGB and YCbCr pixel-format color output, HD audio support, and true 24p refresh rate out of the box.
I'll definitely upgrade anyway.
7970 looks to be my next card.
If the rumors are true that Nvidia's high-end single-chip solution won't arrive until late 2012, that majorly sucks.
http://www.xbitlabs.com/news/mobile...r_Graphics_Chips_to_Support_DirectX_11_1.html
Unless they shrink the current Fermi and re-name it.
If these launch for real in early 2012 (as opposed to a paper launch), I guess I'll be using one for a new PC build.
It seems like AMD cards are the only ones that have DisplayPort as standard. I can never seem to find it on Nvidia cards. Maybe I'm not looking hard enough (or not willing to pay for the top-line cards that may have a DP output).
My GTX 570 does YCbCr + RGB color format, HD audio, and I believe it does 24p refresh rate as well.
Can you go dual card? A nice low-power, passively cooled AMD card for the HTPC (quiet, too) and a powerful Nvidia card for gaming?
As of today, for Nvidia cards, only the 430, 460, and 560 can properly bitstream HD audio under PAP, and not with every driver; support is wonky.
If you wanted to ask: no, the GTX 580 does not support full HD audio bitstreaming either.
As usual with Nvidia, the multimedia features of the lower-range chips will show up in the next-generation high-end chips.
We hear whispers of the final launch price of AMD's upcoming Radeon HD 7000 series cards, and it looks as though the faster variation of the Tahiti chip will hit a US $549 price tag while the slower one should be priced at US $449.
Of course, bear in mind that nothing is carved in stone here, as AMD still has the option and ample time to change the price, or as they call it, adjust it, even a few days before the actual launch. The same goes for the name of the card, as these are only finalized when partners' boxes are sent for printing. But for now, HD 7950 and HD 7970 are the names most partners are using, and US $449/549 are the prices that came from AMD.
As you already know, AMD planned to ship the cards in the first week of December, as holiday shopping is always a nice timeframe for a launch. Unfortunately, TSMC is the one to blame, as we are hearing that even the yields of the 40nm HD 6900 series are still not great, and TSMC is also the reason why the HD 7000 got pushed to an early-January shipping date, in time for CES 2012.
In any case, we'll keep our eyes and ears ready in case AMD decides to change these anytime soon.
Wow. That is a bit expensive, though it's not final. Hope the cards turn out really good.
If irfan or a mod wants to update the OP, the same site also has an English version of the article that's in the OP.
http://www.nordichardware.com/news/...o-be-presented-in-london-on-december-5th.html
Also Fudzilla has a rumor saying that Tahiti will be $449/$549.
http://www.fudzilla.com/graphics/item/25089-amd-radeon-hd-7000-series-priced
I really, really don't see that happening.
AMD has made it a point over the last few generations to stress performance-per-dollar value. They have kept the 6970 priced well below the GTX 580 while offering similar performance. Also, I think (and I assume AMD has a similar line of thinking) that they'll sell far more 7950/7970s at $350/$450 price points, respectively. $450/$550 would mean a lot fewer sales and would also give up one of their key advantages vs. Nvidia. I'm pretty damn sure AMD would LOVE to have the 7950 at $350 being 50%+ faster than a $450-500 GTX 580 (rough math sketched after this post). That'd also encourage more people to go Crossfire.
AMD may not be getting great yields with 28nm yet, but I'm almost positive they will push hard to keep their excellent pricing. Besides, their whole 7xxx pricing line-up would have huge gaps in it:
7850 - $200-250
7870 - $260-300
7950 etc. - $450+?
Nah. I just don't think they'll pull an Nvidia on pricing.
Because that'll really suck if I have to fork out $450 for a 7950...
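For what it's worth, here's a minimal sketch of the performance-per-dollar math from the post above. Every number is the post's own hypothetical, not a real benchmark: treat the GTX 580 as 100 "perf units" at an assumed ~$475 street price, and assume the 7950 really comes in 50% faster.

```python
# Rough performance-per-dollar comparison. All figures are the post's
# hypotheticals, not real benchmarks: GTX 580 = 100 "perf units" at an
# assumed ~$475 street price; HD 7950 assumed to be 50% faster.
cards = {
    "GTX 580 (assumed ~$475)": (475.0, 100.0),   # (price USD, perf units)
    "HD 7950 at $350 (hoped-for)": (350.0, 150.0),
    "HD 7950 at $450 (rumored)": (450.0, 150.0),
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price:.3f} perf units per dollar")
```

Even at the rumored $450 the 7950 would still lead on value (~0.33 vs ~0.21 perf units per dollar), but at $350 it would roughly double the 580's value proposition, which is exactly the advantage the poster says $450/$550 pricing would give up.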
I use a GTS 450 in my media PC but yeah, it does all those things.
This is true, but it depends how much you value your own time. I'd rather just get on with gaming rather than spending countless hours beta-testing AMD drivers.
As probably one of the few guys who has two gaming rigs with cards from both sides, this kind of fanboy rhetoric irks me. If anything, I've had more issues with instability in the two months I've had my 560 Tis than I ever did with the 5870. Nvidia has better relationships with a lot of devs, which generally leads to better drivers for games, but they certainly aren't without faults. Remember the whole Rage issue from just a few months ago?
I've got a Phenom II X6 1075T @ 3.0GHz and a 6970 2GB. Would one of these cards be a worthwhile upgrade, or is my processor acting as a bottleneck here?
You could get more out of your processor with a solid OC on it. In the benchmark thread, #7 and #16 have pretty much identical clocks on their 580s, but there's a significant jump going from the 1100T to even the Nehalem. So you won't be gaining as much as you could with a recent Sandy Bridge proc. However, as #19 on that thread (putting me below the 1100T rig), I can tell you the guy with the 1100T has an amazing gaming experience right now. If anything, the only reason I would question the upgrade is that going from one generation's top-of-the-line card to the next is rarely a worthwhile upgrade. We'll see once benches are released (a rough way to check the bottleneck yourself is sketched below).
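Side note for anyone wanting to test the bottleneck question themselves: a common rough check is to benchmark the same scene at a low resolution and at your native one and compare FPS. A minimal Python sketch of that logic follows; the 10% threshold and all FPS figures are made up purely for illustration.

```python
# Crude CPU-vs-GPU bottleneck heuristic: benchmark the same scene at a
# low resolution and at your native one. If FPS barely rises at the low
# resolution, the GPU wasn't the limiter -- the CPU likely is.
def bottleneck_hint(fps_low_res: float, fps_native: float,
                    threshold: float = 0.10) -> str:
    if fps_low_res <= 0 or fps_native <= 0:
        raise ValueError("FPS values must be positive")
    gain = (fps_low_res - fps_native) / fps_native
    if gain < threshold:
        return "likely CPU-bound: lowering resolution barely helped"
    return "likely GPU-bound: a faster card should raise your FPS"

# Made-up example numbers:
print(bottleneck_hint(62.0, 60.0))   # -> likely CPU-bound
print(bottleneck_hint(95.0, 60.0))   # -> likely GPU-bound
```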
It seems like AMD cards are the only ones that have DisplayPort as standard. I can never seem to find it on Nvidia cards. Maybe I'm not looking hard enough (or not willing to pay for the top-line cards that may have a DP output).
Apple sources GPUs for their systems from AMD, and Apple is more or less heading the move to DisplayPort. Nvidia doesn't gain very much (currently) from going out of its way to pay for the tech and integrate it into its designs.
Come on ATI, make me upgrade the GTX 480. MAKE MEEEEEEE
(It's AMD; ATI doesn't exist anymore.) The 480 is a pretty amazing card. The fact that you get an awesome GPU AND a space heater in one package is a killer deal.
Yeah, it isn't that surprising considering the types of people who would run the bench and post their results, plus how big GAF is, imo.
Still pissed at my score. I deserve that spot.
Well, with the 7xxx just over the horizon, maybe you'll grow a pair and throw some more voltage on your 6950s. If you brick them, you'll have an awesome replacement card.
What is a one-chip solution? What's the need for the DisplayPort?
Some cards, like the 6990 and GTX 590, are actually dual-chip cards. They essentially take the parts of two 6970s or 580s (respectively), downclock them, and put them on the same PCB instead of shipping two separate cards.
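To put rough numbers on that explanation, here's a back-of-the-envelope sketch of why a downclocked dual-chip card can still land well above a single full-speed card. The clock figures and the scaling factor are illustrative assumptions, not real specs.

```python
# Toy model of a dual-chip card per the explanation above: two GPUs,
# downclocked, sharing one PCB. Every number here is an illustrative
# assumption, not a real spec.
full_clock_mhz = 880    # assumed clock of the full-speed single-GPU card
dual_clock_mhz = 830    # assumed downclocked speed on the dual board
scaling = 0.85          # assumed multi-GPU driver scaling efficiency

relative_perf = 2 * (dual_clock_mhz / full_clock_mhz) * scaling
print(f"dual-chip card ~= {relative_perf:.2f}x one full-speed card")
```

With those made-up inputs you get roughly 1.6x a single card, which is why dual-chip boards exist despite the downclock, and also why their real-world value hinges on the driver scaling factor (see the Crossfire complaints below).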
AMD's bullshit Crossfire support is really annoying for me. It takes a month or two until games with broken Crossfire get fixed. I'm considering SLI for my next upgrade.
Yep, I went 5870 Crossfire myself. I was playing APB: Reloaded, Brink, and Heroes of Newerth at the time. Neither APB nor HoN ever received Crossfire profiles, and it took about two weeks after the launch of Brink to finally get one. Considering I had to disable one of my two cards for the games I was playing, I ended up giving it away to a friend. Other than obvious AAA games, it seems like the programmers throw darts at a board to pick which games get Crossfire support. That experience is what pushed me to go green for my recent build.