Rumor: Wii U final specs

If there's no dedicated video ram pool (even if not GDDR5) it makes that 1GB not sound like much at all, doesn't it?

Or are console boxes alright with that? But iirc most previous consoles have had dedicated vram, even if very little in comparison.
 
The problem with this kind of thinking is that we don't really know which info came from old dev kits, who gave us second-hand information, and what information we have now.
Yes, that is a priority if we want to know more prior to launch.

About the e6760: if the original dev kits used HD4850s to approximate the performance of a chip, the e6760's performance falls perfectly in line with this. I'm not saying that's what it is, but it also really doesn't matter if it's not; we are getting a grasp on the performance, and this fits the power draw and the targeted performance.

It works with all of our rumors too.
Yes, it shares enough resemblance with our culprits. It remains to be seen if it can easily be mass-produced over the long term, but I don't think we can get that info.

My guess is that Wii U [CPU] is 3 custom 476FPs built on PPC7 tech (the SOI and embedded RAM), sharing the 32MB eDRAM with the custom embedded GPU7, which is 5000-series or 6000-series shaders combined with 1GB of RAM and some fixed functions to emulate Flipper more accurately. I do think we are talking about something beyond the 4000 series as a base.
Matt stated that the CPU is not a 476FP, but Matt has also been half in the shadows so far. bgassassin trusts him, though, which is not really helpful given how easily one of them could spill the beans completely if they worded their thoughts carelessly.

GDDR5 makes no sense; there would be no point in having 32MB of eDRAM if the main pool were GDDR5, as the very high bandwidth of GDDR5 would make the eDRAM's speed largely redundant.
Talking about that, what is the role of GPRs in the context of the type of VRAM used in the GPU?
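
To put illustrative numbers on that bandwidth argument (a sketch only; the bus widths and data rates below are assumptions for comparison, not confirmed Wii U figures):

```python
# Peak bandwidth = bus width in bytes x per-pin data rate.
# All configurations below are hypothetical, for comparison only.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(64, 1.6))    # 12.8 GB/s - a modest GDDR3 main pool
print(bandwidth_gbs(128, 3.2))   # 51.2 GB/s - an E6760-style GDDR5 pool
# Next to a slow GDDR3 pool, a fast on-die 32MB eDRAM buffer earns its
# keep; next to 51.2 GB/s of GDDR5, much of its advantage evaporates.
```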
 
So, it looks like we've got a decent spec estimate going on in here:

GPU: modified AMD e6760
CPU: out-of-order design with three enhanced Broadway cores and an increased amount of cache.
RAM: 32MB eDRAM and 2GB of GDDR3 general RAM (of which 1GB is currently available to developers)
MISC: dedicated DSP.
 
If there's no dedicated video ram pool (even if not GDDR5) it makes that 1GB not sound like much at all, doesn't it?

Or are console boxes alright with that? But iirc most previous consoles have had dedicated vram, even if very little in comparison.

It's the same way as the PS3's and 360's RAM is divided.

PS3 is divided into two pools (256MB XDR and 256MB GDDR3), but both are effectively usable as VRAM.

360 is one unified pool of 512MB, usable as VRAM.

Same for Wii U: one pool, usable either way.
 
Given the latest rumours, would it be relatively meaty? Obviously not as powerful as the PS4/720, but certainly not another Wii situation, no?
 
So, it looks like we've got a decent spec estimate going on in here:

GPU: modified AMD e6760
CPU: out-of-order design with three enhanced Broadway cores and an increased amount of cache.
RAM: 32MB eDRAM and 2GB of GDDR3 general RAM (of which 1GB is currently available to developers)
MISC: dedicated DSP.

GPU is more likely a modified R700 resembling a cut-back e6760.
 
Given the latest rumours, would it be pretty meaty? Certainly not another Wii situation, no?
I really don't know what metrics we should use to compare last gen's situation with this one, but you can assume the Wii U is 2x the X360, whereas the Wii was roughly on par with the Xbox.

So, there is some progress :P
 
Given the latest rumours, would it be relatively meaty? Obviously not as powerful as the PS4/720, but certainly not another Wii situation, no?
As far as I can tell, it is likely another Wii situation when comparing it to its competitors, but it may not make as much of a difference in practice, since architectures and technologies won't change as dramatically this time, so it won't be left as far behind. It will still have normal mapping, pretty lighting, reflections, god rays, shadows and so on; it will just maybe (probably) lack the latest things like tessellation and soft shadows (with good performance, at least). Of course that still makes a big difference, and things will still need to be scaled back in resolution, polycounts, physics, etc. if the other consoles are much more powerful. That's just a guess though; we don't really have much to go on just yet. And we don't know shit about the others.
 
GPU is more likely a modified R700 resembling a cut-back e6760.

Why cut back? I think it will resemble it, with some added features specifically tailored to WiiU and some features cut out (less interesting for a dedicated game console?). Overall in the same class, but likely more suitable for a gaming platform. That would be my guess.

I really don't know what metrics we should use to compare last gen's situation with this one, but you can assume the Wii U is 2x the X360, whereas the Wii was roughly on par with the Xbox.

So, there is some progress :P

Wii was severely lacking in the shader department. I think that, other than the GPU, it was a couple of steps ahead of the Xbox. WiiU will likely be ahead of the 360 everywhere it counts.
 
Why cut back? I think it will resemble it, with some added features specifically tailored to WiiU and some features cut out (less interesting for a dedicated game console?). Overall in the same class, but likely more suitable for a gaming platform. That would be my guess.

Most rumors indicate the DX11 feature set isn't completely supported on Wii U, but the e6760 has full support.
 
Why cut back? I think it will resemble it, with some added features specifically tailored to WiiU and some features cut out (less interesting for a dedicated game console?). Overall in the same class, but likely more suitable for a gaming platform. That would be my guess.



Wii was severely lacking in the shader department. I think that, other than the GPU, it was a couple of steps ahead of the Xbox. WiiU will likely be ahead of the 360 everywhere it counts.

I agree that cutting back would be pointless; the e6760 as it is would be a great chip for the Wii U (low cost, low power consumption and surprisingly high performance). If there have been cutbacks, I'd expect improvements in other ways that overall make it a better chip.
 
AMD Radeon™ E6760

Package Dimensions - GPU + memory, 37.5 mm x 37.5 mm BGA
Thermal Design Power (TDP) - 35W

Graphics Processing Unit

Process Technology - 40 nm
Graphics Engine Operating Frequency (max) - 600 MHz
CPU Interface - PCI Express® 2.1 (x1, x2, x4, x8, x16)
Shader Processing Units - 6 SIMD engines x 80 processing elements = 480 shaders
Floating Point Performance (single precision, peak) - 576 GFLOPs
3DMark™ Vantage P Score - 5870
Display Engine - AMD APP, AMD Eyefinity & AMD HD3D technologies
DirectX™ capability - DirectX® 11
Shader Model - 5.0
OpenGL - OpenGL 4.1
Compute - AMD APP technology, OpenCL™ 1.1, DirectCompute 11
Unified Video Decoder (UVD) - UVD3 for H.264, VC-1, MPEG-2, MPEG-4 part 2 decode
Internal Thermal Sensor - yes

Memory

Operating Frequency (max) - 800 MHz / 3.2 Gbps
Configuration type - 128-bit wide, 1 GB, GDDR5, 51.2 GB/s

Display Interfaces

Analog RGB - 1x Triple 10-bit DAC, 400 MHz
Analog TV - NA
Single / Dual-Link DVI - 4x Single-Link DVI / 1x Dual-Link DVI
DisplayPort 1.1a - 2x
DisplayPort 1.2 - 4x
Single / Dual-Link LVDS - 1 x Single-Link / Dual-Link
HDMI™ - 1x HDMI™ 1.4a
Number Independent Displays (max) - Up to 2 display outputs from VGA, Single / Dual-Link DVI, Single / Dual-Link LVDS, HDMI™ 1.4a, DisplayPort 1.1a / 1.2, + up to 4 display outputs from DisplayPort 1.1a / 1.2
HD Audio Controller (Azalia) - 1x
HDCP Keys - 6x
DVO - 12-bit DDR or 24-bit SDR / DDR
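
A quick sanity check on two of the headline numbers in that sheet (a minimal sketch; it assumes each shader does one fused multiply-add, i.e. 2 FLOPs, per clock, which is how AMD's peak figures are usually derived):

```python
# Peak single-precision FLOPS: shaders x 2 FLOPs (one FMA) x clock.
shaders, clock_mhz = 480, 600
print(shaders * 2 * clock_mhz / 1000)  # 576.0 GFLOPS, matching the sheet

# Peak memory bandwidth: bus width in bytes x per-pin data rate.
bus_bits, gbps_per_pin = 128, 3.2
print(bus_bits / 8 * gbps_per_pin)     # 51.2 GB/s, matching the sheet
```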
 
Most rumors indicate the DX11 feature set isn't completely supported on Wii U, but the e6760 has full support.

Like I said, maybe not all DX11 features are relevant. The investigative article from a while back (I forget the source) spoke of DX11-level features but DX9-level performance.

The two most interesting DX11 features were tessellation and the newer GPGPU features, if I remember correctly?
 
I'm wondering if it's a license thing, or a feature thing.
In the case of the former, not having it would just mean they don't want to support Microsoft's license.
 
What does DX11 support mean?

Technically nothing, as it's a proprietary Microsoft API for use on Windows and Xbox, but it can be useful for approximating the features of the graphics chips.

However, hasn't there been talk that AMD DX10 chips are technically capable of all DX11 effects and features anyway, but did them in a different way to Microsoft's standard, so they aren't certified for it (and I guess unusable on PC)?
 
Technically nothing, as it's a proprietary Microsoft API for use on Windows and Xbox, but it can be useful for approximating the features of the graphics chips.

However, hasn't there been talk that AMD DX10 chips are technically capable of all DX11 effects and features anyway, but did them in a different way to Microsoft's standard, so they aren't certified for it (and I guess unusable on PC)?

I'm unclear on this as well. But for example, even though the GPUs in the PS3/360 would be DX9 in PC hardware terms, they can do some things that can't be done in DX9, because they're not restricted by an API and can be programmed to the metal in a console.

That's why I think the Wii U GPU is probably DX10 (R700), and similar to the above it can do some DX11-like things naturally.
 
Like I said, maybe not all DX11 features are relevant. The investigative article from a while back (I forget the source) spoke of DX11-level features but DX9-level performance.

The two most interesting DX11 features were tessellation and the newer GPGPU features, if I remember correctly?

There's more than that, but I don't know what's useful on a console and what isn't. But why would the rumors say DX10+ rather than console-relevant DX11?
 
I'm unclear on this as well. But for example, even though the GPUs in the PS3/360 would be DX9 in PC hardware terms, they can do some things that can't be done in DX9, because they're not restricted by an API and can be programmed to the metal in a console.

That's why I think the Wii U GPU is probably DX10 (R700), and similar to the above it can do some DX11-like things naturally.

It's getting difficult to hold back the tide on this e6760 circle jerk.
 
Given the latest rumours, would it be relatively meaty? Obviously not as powerful as the PS4/720, but certainly not another Wii situation, no?

It was never going to be another Wii situation. The problem the Wii faced is that, between when its original GPU base was developed and when the GPUs for the PS360 were developed, a shift happened in the GPU world. The way things were done in hardware changed big time, and the Wii's GPU didn't follow suit. Thus any engines for it, or games ported from the other two, had to be reprogrammed from the ground up. You could get certain similar effects and such, it just meant doing it a completely different way.

Unlike that situation, it looks like the Wii-U is going to support compute shaders, tessellation, and other modern GPU feature sets. It just won't have the same level of oomph, if you will, that the PS4/720 will have. What that means is that instead of having to rewrite engines from scratch going from, say, the PS4 to the Wii-U, they'll just have to be toned down. It's a very different situation than the current gen.
 
It's getting difficult to hold back the tide on this e6760 circle jerk.
It seems to be a mix of overexcited people who've now settled on it being the E6760 because AMD Tech Support said so - it must be true.

And cautiously optimistic people hoping that the bespoke solution in the Wii U is similar. The latter doesn't seem entirely unfounded, if, as noted, optimistic.

Either way, the internet has caught hold of it. Reason and logic were thrown out the window pages ago.
 
Guys, are we reading the same thread?

First, there is no such thing as a circle of jerk-offs, and second, the assumption that the Wii U contains a modified E6760 GPGPU is based on the fact that this chip features all the functionality the Wii U requires (made by AMD, Eyefinity for the uPad support, GPGPU capability, power consumption in line with the console's constraints, etc.). It is the closest off-the-shelf component that fits the puzzle.

Why do we deserve such a comment for our investigation work? The E6760 was mentioned in this thread before someone sent a mail to AMD, and back then we already thought it was the culprit. So please, I like to discuss, but not to be insulted.
 
It was never going to be another Wii situation. The problem the Wii faced is that, between when its original GPU base was developed and when the GPUs for the PS360 were developed, a shift happened in the GPU world. The way things were done in hardware changed big time, and the Wii's GPU didn't follow suit. Thus any engines for it, or games ported from the other two, had to be reprogrammed from the ground up. You could get certain similar effects and such, it just meant doing it a completely different way.

Unlike that situation, it looks like the Wii-U is going to support compute shaders, tessellation, and other modern GPU feature sets. It just won't have the same level of oomph, if you will, that the PS4/720 will have. What that means is that instead of having to rewrite engines from scratch going from, say, the PS4 to the Wii-U, they'll just have to be toned down. It's a very different situation than the current gen.
That's what I guessed.

So the Wii U will probably be in between current gen and next? I'm ok with that.
 
Guys, are we reading the same thread?

First, there is no such thing as a circle of jerk-offs, and second, the assumption that the Wii U contains a modified E6760 GPGPU is based on the fact that this chip features all the functionality the Wii U requires (made by AMD, Eyefinity for the uPad support, GPGPU capability, power consumption in line with the console's constraints, etc.). It is the closest off-the-shelf component that fits the puzzle.

Why do we deserve such a comment for our investigation work? The E6760 was mentioned in this thread before someone sent a mail to AMD, and back then we already thought it was the culprit. So please, I like to discuss, but not to be insulted.


Yeah, for some people it seems easier to overreact to someone else having a different opinion than to discuss rationally why they disagree.
 
That's what I guessed.

So the Wii U will probably be in between current gen and next? I'm ok with that.

Yeah I think that's where we're at right now and most potential Wii U owners are ok with that too.

I think the reason for all this speculation and this thread alone being over 100 pages long is because people are afraid of another Wii situation where they don't get the big third party titles Sony and MS are getting. We're trying to come up with an answer to that question but I don't think one exists and if it does it has more to do with simply how much power the Wii U has under the hood.

As Shin said above it isn't going to be another Wii situation where it's extremely difficult to port next gen games to it, but that still doesn't mean devs will bother with it if they don't think the time and money they put into porting it over will be worth the effort. Fortunately, it looks like ports of PS4/720 games to Wii U will require significantly less time/money than PS3/360 ports to Wii, so there is a smaller barrier to getting ports, but that still doesn't mean they're going to happen.
 
128 GPRs is what an ALU clause in a single R700 thread can address (a clause is a schedulable sub-routine, put loosely). An R700 SIMD engine runs 64 threads. A GPU has multiple SIMD engines. You could say that a R700 has 'a tad more' than 128 GPRs.

Here's something educational: http://gpgpu.org/wp/wp-content/uploads/2009/09/E1-OpenCL-Architecture.pdf (pay particular attention to pg. 10)

If Matt was referring to the GPRs that an ALU clause in a single GPU thread can address, what number does the e6760 have for that?
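
For scale, some illustrative arithmetic on what that per-clause limit implies (a sketch assuming 128-bit vec4 GPRs, as on R700-era VLIW hardware; the real register-file layout is more involved than this):

```python
# Illustrative only: sizing the register storage implied by
# "128 GPRs per ALU clause" on R700-class hardware. Each GPR is
# assumed to be a 128-bit vec4 of 32-bit floats (4 x 4 bytes).
GPR_COUNT = 128        # max GPRs addressable by one ALU clause
GPR_BYTES = 4 * 4      # vec4 of 32-bit components
WAVEFRONT = 64         # threads an R700 SIMD engine runs together

per_thread = GPR_COUNT * GPR_BYTES       # 2048 bytes = 2 KB per thread
per_wavefront = per_thread * WAVEFRONT   # 131072 bytes = 128 KB

print(f"{per_thread} B per thread, {per_wavefront // 1024} KB per wavefront")
```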
 
Yeah, for some people it seems easier to overreact to someone else having a different opinion than to discuss rationally why they disagree.

It's hard to have a rational discussion with people who don't understand what they're talking about and ignore any evidence that disagrees with their conclusions, no matter the source. We spent a dozen pages in this very thread with people in denial about the PSU and power usage of the WiiU as people attempted to deny the facts given to us directly by Iwata. See also the Power7 discussion, which was always an argument between people with the technical expertise to know anything actually resembling the Power7 would never work in the WiiU and the overeager fanboys who latched on to a couple ambiguous and/or ill-informed tweets to argue for its inclusion.

There are pages and pages of rational arguments in this very thread for why it can't be e6760, but too many are willing to ignore that based on easily faked emails from a source we've been told (and should have known intuitively) would have no actual knowledge in any case.
 
If Matt was referring to the GPRs that an ALU clause in a single GPU thread can address, what number does the e6760 have for that?
AFAIK, that number did not change for the duration of the VLIW era, i.e. a Cayman ALU clause was still limited to 128 GPRs.
 
It's hard to have a rational discussion with people who don't understand what they're talking about and ignore any evidence that disagrees with their conclusions, no matter the source. We spent a dozen pages in this very thread with people in denial about the PSU and power usage of the WiiU as people attempted to deny the facts given to us directly by Iwata. See also the Power7 discussion, which was always an argument between people with the technical expertise to know anything actually resembling the Power7 would never work in the WiiU and the overeager fanboys who latched on to a couple ambiguous and/or ill-informed tweets to argue for its inclusion.

There are pages and pages of rational arguments in this very thread for why it can't be e6760, but too many are willing to ignore that based on easily faked emails from a source we've been told (and should have known intuitively) would have no actual knowledge in any case.
Brad Grenz, this is the way out.

Thank you.
 
However, hasn't there been talk that AMD DX10 chips are technically capable of all DX11 effects and features anyway, but did them in a different way to Microsoft's standard, so they aren't certified for it (and I guess unusable on PC)?

Just off the top of my head:

On the tessellator side of things, pre-dx11 HW was much less programmable (not just API limitation) and would require more steps/ render passes to do certain things in order to achieve similar results - so yes, "possible". There were some tweaks for r7xx for geometry shader interaction so that's "getting there".

On compute side, there's a pretty clear distinction in HW capability between SM4.x and SM5.0. It's less about exposing functionality (unlike DX9 console vs DX9 PC) - the HW just can't do certain things (accessing resource types, # of threads/shared mem etc). For example, nVidia actually did have atomics in G80, but that functionality wasn't made part of the standard until SM5.0 because AMD lacked the HW capability up until Cypress.


There are some other things like texture formats that would need to be added in hardware, but that ought to be rather trivial by comparison to wholly upgrading the compute capabilities & tessellator stages.
 
AFAIK, that number did not change for the duration of the VLIW era, i.e. a Cayman ALU clause was still limited to 128 GPRs.

Hmm... seems like Matt was looking at a different number then. Any logical guess on what he may be referring to? (r700 >> Wii U GPU < e6760) I'm starting to wonder if he makes these types of mistakes on purpose to stay off the radar :D

Just off the top of my head:

On the tessellator side of things, pre-dx11 HW was much less programmable (not just API limitation) and would require more steps/ render passes to do certain things in order to achieve similar results - so yes, "possible". There were some tweaks for r7xx for geometry shader interaction so that's "getting there".

On compute side, there's a pretty clear distinction in HW capability between SM4.x and SM5.0. It's less about exposing functionality (unlike DX9 console vs DX9 PC) - the HW just can't do certain things (accessing resource types, # of threads/shared mem etc). For example, nVidia actually did have atomics in G80, but that functionality wasn't made part of the standard until SM5.0 because AMD lacked the HW capability up until Cypress.


There are some other things like texture formats that would need to be added in hardware, but that ought to be rather trivial by comparison to wholly upgrading the compute capabilities & tessellator stages.

Thank you for the clarification, Alstrong. If Nintendo did decide to enhance the GPU with some features beyond DX10.1 due to third-party devs' feedback, is it safe to assume that enhancing the GPGPU capabilities and revamping the tessellation stages would be the major ones to fix?
 
It's hard to have a rational discussion with people who don't understand what they're talking about and ignore any evidence that disagrees with their conclusions, no matter the source. We spent a dozen pages in this very thread with people in denial about the PSU and power usage of the WiiU as people attempted to deny the facts given to us directly by Iwata. See also the Power7 discussion, which was always an argument between people with the technical expertise to know anything actually resembling the Power7 would never work in the WiiU and the overeager fanboys who latched on to a couple ambiguous and/or ill-informed tweets to argue for its inclusion.

There are pages and pages of rational arguments in this very thread for why it can't be e6760, but too many are willing to ignore that based on easily faked emails from a source we've been told (and should have known intuitively) would have no actual knowledge in any case.


I'll keep it brief:

- We're not saying it's an e6760.
- There's a lot of rational discussion as to why the final Wii U GPU could be very similar to the e6760 (which is all we're discussing here).
- I don't think any sane person actually thought it was a Power7 CPU. In fact, I don't think anyone ever said that; they were merely discussing the tweets which said it used a Power7 chip. Most of that discussion ended in the conclusion "the tweets mean nothing; maybe it's something loosely based on P7, but probably not".
- I seem to recall you were one of the people who ended up wrong about the PSU, saying it was rated for 75W and therefore could only output around 60% of that. That's false. I shouldn't have to explain why it's false again, but in a nutshell: it outputs up to 75W (which is what the pre-E3 photo of the PSU confirmed), and as Iwata himself said in the Nintendo Direct, the console will consume up to 75W of power at times, with normal use around 40W. blu and others have backed that up, afaik. That's why there was a lot of discussion: people were incorrectly claiming as fact that it would only output ~45W (60% of 75W) max, because they confused PSU efficiency, output and rating (see the sketch below).

So in short, there's no need to come in and make an offhand comment about "circle jerking" just because we dared mention "e6760" in a discussion again. Read the rest of the posts before making an assumption about what we're talking about.
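
A minimal sketch of the distinction being argued, assuming an illustrative 80% conversion efficiency (the real figure isn't public). The rating caps the DC output; efficiency only determines what gets pulled from the wall:

```python
# Hedged sketch of the PSU arithmetic. Assumptions, not official
# figures: a 75 W rated DC output and 80% AC->DC efficiency.
RATED_OUTPUT_W = 75.0      # max DC watts the PSU can deliver
EFFICIENCY = 0.80          # assumed conversion efficiency

def wall_draw(dc_output_w: float) -> float:
    """AC watts pulled from the outlet to deliver a given DC load."""
    return dc_output_w / EFFICIENCY

print(wall_draw(40.0))            # ~50 W at the wall for a 40 W typical load
print(wall_draw(RATED_OUTPUT_W))  # ~93.75 W at the wall at full rated load
```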
 
Wow, a 6760 is only a couple steps down from my 6850. My rig feels old..
No worries mate, your HD Radeon 6850 is well above this thing. Based on AMD metrics, the E6760 develops a bit less than 600 GFLOPS whereas yours outputs a healthy 1500 GFLOPS. It is not the same world.
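
For reference, the back-of-envelope math behind those figures (assuming the usual 2 FLOPs per shader per clock; shader counts and clocks are the public desktop specs):

```python
# Peak single-precision FLOPS = shaders x 2 FLOPs (one FMA) x clock.
def peak_gflops(shaders: int, clock_mhz: int) -> float:
    return shaders * 2 * clock_mhz / 1000.0

print(peak_gflops(480, 600))   # E6760          ->  576.0 GFLOPS
print(peak_gflops(960, 775))   # Radeon HD 6850 -> 1488.0 GFLOPS
print(peak_gflops(800, 625))   # Radeon HD 4850 -> 1000.0 GFLOPS
```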
 
Thank you for the clarification, Alstrong. If Nintendo did decide to enhance the GPU with some features beyond DX10.1 due to third-party devs' feedback, is it safe to assume that enhancing the GPGPU capabilities and revamping the tessellation stages would be the major ones to fix?

I suppose, though I'm not sure what the context of the feedback was. (link?)
 
No worries mate, your HD Radeon 6850 is well above this thing. Based on AMD metrics, the E6760 develops a bit less than 600 GFLOPS whereas yours outputs a healthy 1500 GFLOPS. It is not the same world.


GFLOPS aren't everything though; the e6760 also outperforms the HD 4850, which has about 1 TFLOP...
 
No worries mate, your HD Radeon 6850 is well above this thing. Based on AMD metrics, the E6760 develops a bit less than 600 GFLOPS whereas yours outputs a healthy 1500 GFLOPS. It is not the same world.

Oh that sounds encouraging. I wasn't ready to spend another couple hundred on an upgrade anytime soon...
 
Guys, are we reading the same thread?

First, there is no such thing as a circle of jerk-offs, and second, the assumption that the Wii U contains a modified E6760 GPGPU is based on the fact that this chip features all the functionality the Wii U requires (made by AMD, Eyefinity for the uPad support, GPGPU capability, power consumption in line with the console's constraints, etc.). It is the closest off-the-shelf component that fits the puzzle.

Why do we deserve such a comment for our investigation work? The E6760 was mentioned in this thread before someone sent a mail to AMD, and back then we already thought it was the culprit. So please, I like to discuss, but not to be insulted.

I completely agree. I do not understand the seemingly angered attitudes in regards to others posting ideas about the E6760. Maybe we have some paid disinformation specialists to get us off the scent? Serious.
 
Bah. The "CJ" comment appears again.

The problem with this kind of thinking is that we don't really know which info came from old dev kits, who gave us second-hand information, and what information we have now.

That is why I strongly think we should try to figure out what we actually know before we declare anything as fact. For instance, unified memory doesn't really make complete sense: given that the system and game resources are completely separate right now, who is to say they aren't simply two different pools?

About the e6760: if the original dev kits used HD4850s to approximate the performance of a chip, the e6760's performance falls perfectly in line with this. I'm not saying that's what it is, but it also really doesn't matter if it's not; we are getting a grasp on the performance, and this fits the power draw and the targeted performance.

It works with all of our rumors too. Also, while the e6760 is part of the "Turks" family, it's using "Evergreen" shaders, which is the HD5000 series. Embedded graphics cards are also perfect for a low-powered console, and looking at the software side of it, the e6760 fits the bill quite well. You could call it the full-featured GPU lacking teeth (even if its performance can match a 2008 1TFLOPs card, that doesn't mean it does it with raw power).

I also want to point out that the 40nm process was likely shrunk. What exactly does 1GB of GDDR5 cost? 5 watts? I am not sure, but let's use that number so we can move forward with our speculation: 30 watts is then what the e6760 core runs at. Shrunk to 32nm or 28nm (there was the rumor about problems in manufacturing the GPU, which certainly isn't something you'd expect at 40nm), at 28nm it's easy to push the GPU to 20 watts or less, but let's be realistic and say 25 watts at 32nm. 8-10 watts for the CPU and another 10 watts for the rest of the system would reach that 45 watts fairly easily.
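
Writing that speculative budget out (every figure below is an assumption from this post, not a confirmed spec):

```python
# The speculative Wii U power budget from the post above, written out.
GPU_40NM_TDP = 35      # E6760 TDP incl. 1 GB GDDR5, per AMD's sheet
GDDR5_EST = 5          # assumed draw of the 1 GB GDDR5, in watts
gpu_core = GPU_40NM_TDP - GDDR5_EST   # ~30 W for the GPU core alone
gpu_shrunk = 25        # guessed GPU draw after a 32 nm shrink
cpu = 10               # guessed CPU draw (post says 8-10 W)
system = 10            # guessed draw for everything else

print(gpu_shrunk + cpu + system)   # 45 W, matching the rumored figure
```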

My guess is that Wii U is 3 custom 476FPs built on PPC7 tech (the SOI and embedded RAM), sharing the 32MB eDRAM with the custom embedded GPU7, which is 5000-series or 6000-series shaders combined with 1GB of RAM and some fixed functions to emulate Flipper more accurately. I do think we are talking about something beyond the 4000 series as a base; in fact, I haven't seen a single semi-reliable rumor confirming that the Wii U is still using something based off the R770 (which is what the HD4850 is). Instead, all I see is a lot of people who don't want to get hyped, so they downplay whatever the specs could be and hold onto rumors like the OP's, which comes at best from second-hand info from someone who knows very little about what that info means.

Arkam has been great trying to give us this info, even when he was attacked, but just because he was told something about the system a year ago doesn't mean that is what the system is now.

I really hope this got through to some of you, I understand that you have a lot of insider info being thrown around at you, but I think you are mixing info from old sources, new sources and unreliable sources in such a way, that it becomes useless information.

Arkam only started posting this year and even indicated in this thread it's not a "full DX11 equivalent". Also Nintendo's not going to scrap years of R&D just like that.
 
It's hard to have a rational discussion with people who don't understand what they're talking about and ignore any evidence that disagrees with their conclusions, no matter the source. We spent a dozen pages in this very thread with people in denial about the PSU and power usage of the WiiU as people attempted to deny the facts given to us directly by Iwata. See also the Power7 discussion, which was always an argument between people with the technical expertise to know anything actually resembling the Power7 would never work in the WiiU and the overeager fanboys who latched on to a couple ambiguous and/or ill-informed tweets to argue for its inclusion.

There are pages and pages of rational arguments in this very thread for why it can't be e6760, but too many are willing to ignore that based on easily faked emails from a source we've been told (and should have known intuitively) would have no actual knowledge in any case.

Put the AMD email(s) aside. There are other sources, specifically the PR releases from May, that together point right at the e6760.

If by rational arguments you mean getting your ideas called 'foolishness', with no other reasoning than that, then, well, you are mistaken.

The fact is, we don't know how much the GPU has been modified, and with this unknown variable, not one person on the outside can say it is not a modified e6760.
 
Bah. The "CJ" comment appears again.

Arkam only started posting this year and even indicated in this thread it's not a "full DX11 equivalent". Also Nintendo's not going to scrap years of R&D just like that.
CJ? What is that?

Also, we can't exclude the possibility that the E6760 is the mass-market, AMD-proprietary version of the GPGPU that is inside the Wii U, can we? By that I mean: the GPGPU in the Wii U could predate, or be the reason why we have, the E6760 as a whole.
 