Haha, yeah nice job, Krizz! Forget the PS4 launch! How dare our GPU banter be interrupted!
You can't compare TDPs to actual power draw. TDP is the top wattage the chip is designed to handle, and it almost never uses that much; real-world draw is almost always 2/3 to 3/4 of that number, meaning this card's GDDR5 version would run between 26 and 30 watts on average. Stress tools like FurMark can overheat a card and push it beyond its TDP, but that doesn't happen in normal use like you'd see on a console, especially one targeting a specific low clock, which usually brings an efficiency gain: lower power draw for the same performance. The same is true for an MCM, which could shave off another 1-2 watts thanks to sharing the board with a CPU that is also drawing less than 8 watts (they would consume some of that power together). The E6760, for instance, uses 35 watts, comes with 1GB of GDDR5, and has 480 shaders clocked at 600MHz. I love how everyone says this is a binned part, yet it's used in millions of devices worldwide, from PoS units to casino machines.
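Just to put rough numbers on that rule of thumb, here's a quick sketch. The only hard input is a ~39W TDP for the GDDR5 card (which is what the 26-30W range above implies); the 2/3-3/4 fractions and the ~1-2W MCM saving are the post's assumptions, not measurements:

```python
# Rough sketch of the "typical draw is 2/3 to 3/4 of TDP" rule of thumb.
# The ~39 W TDP is what the 26-30 W range above implies for the GDDR5 HD 5550;
# the fractions and the MCM saving are assumptions from the post, not measurements.

def typical_draw(tdp_watts, low=2/3, high=3/4, mcm_saving=0.0):
    """Return an (estimated low, estimated high) range of average gameplay draw."""
    return (tdp_watts * low - mcm_saving, tdp_watts * high - mcm_saving)

tdp = 39.0  # W, assumed max TDP of the GDDR5 HD 5550
lo, hi = typical_draw(tdp)
print(f"Discrete card estimate: {lo:.0f}-{hi:.0f} W")        # ~26-29 W
lo_m, hi_m = typical_draw(tdp, mcm_saving=1.5)
print(f"With a ~1-2 W MCM saving: {lo_m:.0f}-{hi_m:.0f} W")  # a watt or two lower
```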
I am aware that max TDP is not necessarily indicative of gameplay power draw, but to claim that it is completely irrelevant is disingenuous, given the data we have. If a TDP describes a worst-case scenario (or rather, how much heat the card's cooler would be required to dissipate under max load), we must ask how close this is to the real-life peaks in AMD's GPUs. The facts are these: in gameplay tests run by TechPowerUp (linked to previously), the Radeon HD 5550 peaked at 37 watts – that's within 2 watts of the figure AMD listed as max TDP.
AMD's embedded lines are not relevant for comparison. People keep saying they are binned parts because they are binned parts. They are identical to the chips on laptop cards, only stuck on a BGA package with some RAM.
AnandTech said:
As we mentioned previously, the E6760 is based on the Turks GPU. Specifically, AMD’s embedded video cards are based on their mobile products, so the E6760 is derived from the 6700M series both in performance and naming.
http://www.anandtech.com/show/4307/amd-launches-radeon-e6760
That they are widely used is beside the point – laptops are a pretty big market these days too!
I actually think we could be looking at the E6460, which is a 160-shader part. Nintendo would likely want to use an existing chip and modify it; in 2009 this chip wasn't ready, so they built their kits around the (at the time) high-end AMD HD 4800 series GPU and simply downclocked it until it was usable. I'm almost positive there are no TEVs in the shader blocks, thanks to the Wii U Iwata Asks where the engineers who built Wii U said they integrated GameCube's design into the AMD GPU they used. This could be an interesting aspect, though, because they may have had to add ALUs to the shader blocks to emulate TEV freely. Also, something that has been bugging me: GameCube had 3 rasterizers. I'm not an expert at this, so maybe someone like blu could answer this. Does that cause a problem with the idea that Latte only has 1, or could it be hiding a couple more without the entire 5-block duality that the HD 6970 and HD 7000 series display?
I really don’t think there’s any point in positing that Nintendo used a stock HD 6000 series as a base. If that were the case, we wouldn’t be seeing things like “stripped down from DirectX 11” in developer notes, DirectX 10 on the Unity slide, or “based on R700 series” in the leaked specs sheet (which numerous independent sources on this board and elsewhere have confirmed as the final specs, but many in this thread want to discount as an unreliable rumor).
And I've long since abandoned the idea of any TEVs being on Latte. According to Marcan, BC runs via shim, so there are likely some extra transistors to do the translation, but not full units.
Fourth Storm, while you have answered some questions definitively, the truth is you are not an expert on AMD designs either, and you could simply be seeing what you want to see. That is why I keep bringing up the idea that you could be wrong, because, to be frank, you probably are about quite a few of the things you think you know. I'm sure plenty of people who might have insider information have helped you through PMs, but you need to realize that most of those people are probably fake, and the ones who do have an idea might not actually know anything. I've been fine with the idea of 160 shaders in Latte for a long time; it really doesn't make any difference whether it is 160, 256, or 320, but it is quite possible that we have this wrong and it is a bit different.
Also, just because the chip is VLIW (if it is), that doesn't mean it has to be VLIW5 or VLIW4. VLIW has been around a lot longer than the HD 2000 series; it's not even ATI's idea afaik, it's just another instruction format. Nintendo could have worked with AMD and their tech team on something a bit different, which could explain why the blocks are 90% bigger than 20-ALU blocks, which is a huge difference. And while you are sure that Renesas is going to make it bigger, there are certainly things they should be able to make smaller, especially if these are hand layouts, which usually improve on density over machine layouts like we see in Brazos. You are just too sure to be taken seriously, and maybe that is because you know some things you can't divulge, but since I don't know those things, it just looks silly that you are so sure.
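For what it's worth, here's the kind of back-of-the-envelope check that argument boils down to. The only number taken from this thread is the ~90% size difference versus a standard 20-ALU block; the density factors are purely illustrative guesses, not die measurements:

```python
# Back-of-the-envelope: if a Latte shader block is ~1.9x the area of a
# standard 20-ALU block, how many ALUs could it hold under different
# (hypothetical) layout-density assumptions?

BASELINE_ALUS = 20   # ALUs in a standard block, per the comparison above
AREA_RATIO = 1.9     # the "90% bigger" observation

def implied_alus(area_ratio, density_factor):
    """density_factor > 1.0 means the layout packs ALUs tighter than the baseline."""
    return BASELINE_ALUS * area_ratio * density_factor

for label, density in [("same density as the baseline", 1.0),
                       ("10% looser layout (different fab/cell libraries)", 0.9),
                       ("20% denser layout (hand layout)", 1.2)]:
    print(f"{label}: ~{implied_alus(AREA_RATIO, density):.0f} ALUs per block")
```

The point of the sketch is just that the implied ALU count swings from the mid-30s to the mid-40s depending on which density assumption you pick, so block area alone doesn't settle the question.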
Wow z0m, that certainly was a full frontal assault. I know I may have questioned your level of expertise in the past, and I’m sorry if it came off as a personal attack. I’d like to keep it civil from here on out, though. Attack the argument, not the poster. No, I don’t have a degree in AMD graphics cards, but I have committed myself to researching them since about the time of the Wii U’s announcement and I’m learning more every day.
Regardless, I am not asking anyone to take my word for it on these matters. All the info I have found is freely available for people who take the time to research. I've merely shared my opinions and then stated my reasoning behind them, presenting evidence where applicable. Yes, I'm pretty damn sure of certain aspects, because I've seen enough evidence to satisfy me.
I have not been reliant on any inside sources, except for the 45nm tidbit, and that came from a reputable channel. Even so, a 5nm difference isn’t going to factor in as much as differing fab houses will. A process node does not set a mandatory size for the hardware blocks – we can only look at what’s typical for one product from one manufacturer. Other than that, all bets are off.
To suggest that there's some type of custom VLIW architecture at work is a bit wild, imo. We have dev comments stating that it's a pretty standard design. The posts talking about how Nintendo/AMD/Renesas worked their asses off on this chip, citing R&D numbers, underestimate how much it takes to design a new memory subsystem, integrate all the components on an SoC, get BC running, put everything on an MCM, and so on. It's not just slapping the blocks together and calling it a day. Btw, who said it's a hand layout?
PS: let's throw out the notion that Wii U games are even using all of the GPU's resources and thus pushing its limits as far as TDP goes. I would guess we will see 37-38 watts becoming the norm in the future, which could give 3+ extra watts to the GPU. I really don't think 360 ports are going to push an AMD GPU, even one with only 160 shaders, to its full power potential.
In a series of gameplay tests run by Digital Foundry, Wii U peaked at just over 33 watts. Mind you, just because the graphics in those launch games left something to be desired, it doesn’t follow that the GPU itself was slacking off.
Digital Foundry said:
One thing that did stand out from our Wii U power consumption testing - the uniformity of the results. No matter which retail games we tried, we still saw the same 32w result and only some occasional jumps higher to 33w. Those hoping for developers to "unlock" more Wii U processing power resulting in a bump higher are most likely going to be disappointed, as there's only a certain amount of variance in a console's "under load" power consumption.
http://www.eurogamer.net/articles/digitalfoundry-wii-u-is-the-green-console
Let's not forget, games like BLOPS2, ACIII, MEIII are no slouches when it comes to taxing a system. I'm sure that future games will be able to display nicer eye candy and more efficiently utilize resources, but I see no indicator that Wii U will ever consume over 34 watts.
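As a rough sanity check on the numbers in this exchange: the 32-33 watt figures are Digital Foundry's measurements, while the 37-38 watt "future norm" is the earlier post's speculation. A quick sketch of what each implies:

```python
# All wattages below come from this thread: DF measured 32-33 W across retail
# games; 37-38 W is the earlier post's speculated "future norm", not a measurement.

MEASURED_TYPICAL = 32.0          # W, DF's steady under-load reading
MEASURED_PEAK = 33.0             # W, occasional jumps DF observed
SPECULATED_NORM = (37.0, 38.0)   # W, the earlier post's guess for future games

observed_variance = MEASURED_PEAK - MEASURED_TYPICAL
implied_headroom = [w - MEASURED_PEAK for w in SPECULATED_NORM]

print(f"Variance DF actually observed under load: {observed_variance:.0f} W")
print(f"Extra draw the 37-38 W guess would require: "
      f"{implied_headroom[0]:.0f}-{implied_headroom[1]:.0f} W")
```

In other words, the speculation asks for 4-5 watts of extra draw where DF only ever saw about 1 watt of variance, which is why I don't expect the console to climb past 34 watts.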