Yes, that is a priority if we want to know more prior to launch.

The problem with this kind of thinking is that we don't really know which info we got about old dev kits, who gave us second-hand information, and what information we have now.
Yes, it shares enough resemblance with our culprits. It remains to be seen whether it can easily be mass-produced in the long term, but I don't think we can get this info.

About the e6760: if the original dev kits used HD4850s to approximate the performance of a chip, the e6760's performance does fall perfectly in line with this. I'm not saying it is one, but it also really doesn't matter if it's not; we are getting a grasp on the performance, and this fits the power draw and the targeted performance.
It works with all of our rumors too.
Matt stated that the CPU is not a 476fp, but Matt has also been half in the shadows so far. bgassassin trusts him though, which is not really helpful given how easy it would be for one of them to completely spill the beans if they don't word their thoughts carefully.

My guess is that the Wii U [CPU] is 3 custom 476fp cores built on Power7 tech (the SOI and embedded RAM), sharing the 32MB eDRAM with the custom embedded GPU7, which is 5000-series or 6000-series shaders combined with 1GB of RAM and some fixed functions to emulate Flipper more accurately. I do think we are talking about something beyond the 4000 series as a base.
Talking about that, what is the role of GPRs in the context of the type of VRAM used in the GPU?

GDDR5 makes no sense; there would be no point in having 32MB of eDRAM if the main pool were GDDR5, as the speed of the eDRAM would be wasted due to the very high bandwidth of GDDR5.
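The bandwidth argument above is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming an illustrative GDDR5 configuration (the 3200 MT/s effective rate and 128-bit bus are example figures, not rumoured Wii U specs):

```python
def gddr5_bandwidth_gbs(effective_mts, bus_width_bits):
    """Peak bandwidth in GB/s: effective transfer rate x bus width / 8 bits per byte."""
    return effective_mts * 1e6 * bus_width_bits / 8 / 1e9

# Example: a modest GDDR5 setup, 3200 MT/s effective on a 128-bit bus.
main_pool = gddr5_bandwidth_gbs(3200, 128)
print(f"GDDR5 main pool: ~{main_pool:.1f} GB/s")  # ~51.2 GB/s
```

Even a modest GDDR5 configuration lands in the tens of GB/s, which is the point being made: a small eDRAM pool only earns its keep when the main pool is much slower, e.g. GDDR3.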
Apologies. Fixed now.

Link seems busted!
If there's no dedicated video RAM pool (even if not GDDR5), it makes that 1GB not sound like much at all, doesn't it?
Or are console boxes alright with that? But IIRC most previous consoles have had dedicated VRAM, even if very little in comparison.
So, it looks like we got a decent spec estimate going on in here:
GPU: AMD modified e6760
CPU: 3-core out-of-order CPU consisting of three enhanced Broadway cores, with an increased amount of CPU cache.
RAM: 32MB eDRAM and 2GB of GDDR3 general RAM (of which 1GB is currently available for developers)
MISC: DSP processor.
I really don't know what metrics we should use to compare the situation last gen with this one, but you can assume the Wii U is 2x the X360, whereas the Wii was roughly on par with Xbox.

Given the latest rumours, would it be pretty meaty? Certainly not another Wii situation, no?
As far as I can tell, it is likely another Wii situation when comparing it to the predecessors, but it may not make that much of a difference in practice, since the architecture and technologies in use won't change as dramatically this time, so it won't be left too far behind. It will still have normal mapping and pretty lighting and reflections and god rays and shadows and whatever, just maybe (probably) lacking the latest things like tessellation and soft shadows (with good performance, at least). Of course, that still makes a big difference, and things will still need to be scaled back in terms of resolution, polycounts, physics, etc., if the other consoles are much more powerful. That's just a guess though; we don't really have much to go on just yet. And we don't know shit about the others.
GPU more likely to be a modified R700 resembling a cut-back e6760.
So, there is some progress!
Why cut back? I think it will resemble it, with some added features specifically tailored to WiiU and some features cut out (less interesting for a dedicated game console?). Overall in the same class, but likely more suitable for a gaming platform. That would be my guess.
Wii was severely lacking in the shader department. I think that, other than the GPU, it was a couple of steps ahead of Xbox. The Wii U will likely be ahead of the 360 everywhere it counts.
Most rumors indicate the DX11 feature set isn't completely supported on Wii U, but the e6760 has full support.
Early dev kits used a 4850 (which is only DX10.1).
What does DX11 support mean?
Technically nothing, as it's a proprietary Microsoft API for use on Windows and Xbox, but it can be useful for approximating the features of the graphics chips.
However, hasn't there been talk that AMD DX10 chips are technically capable of all DX11 effects and features anyway, but did them in a different way from Microsoft's standard, so they aren't certified for it (and I guess unusable on PC)?
Like I said, maybe not all DX11 features are relevant. The investigative article from a while back (I forget the source) spoke of DX11-level features but DX9-level performance.
The two most interesting DX11 features were tessellation and the newer GPGPU features, if I remember correctly?
I'm unclear on this as well. But for example, even though the GPUs in the PS3/360 would be DX9 in PC hardware terms, they can do some things that can't be done in DX9, because they're not restricted by an API and can be programmed to the metal on a console.
That's why I think the Wii U GPU is probably DX10 (R700), and similar to the above it can do some DX11-like things naturally.
It seems to be a mix of overexcited people who've now settled on it being the E6760 because AMD Tech Support said so, so it must be true.

It's getting difficult to hold back the tide on this e6760 circle jerk.
That's what I guessed.

It was never going to be another Wii situation. The problem the Wii faced is that between when the original GPU base for it was developed and when the GPUs for the PS360 were developed, a shift happened in the GPU world. The way things were done in hardware changed big time, and the Wii's GPU didn't follow suit. Thus any engines for it, or games ported from the other two, had to be reprogrammed from the ground up. You could get certain similar effects and such; it just meant doing it a completely different way.
Unlike that situation, it looks like the Wii-U is going to support compute shaders, tessellation, and other modern GPU feature sets. It just won't have the same level of oomph, if you will, that the PS4/720 will have. What that means is that instead of having to rewrite engines from scratch going from, say, the PS4 to the Wii-U, they'll just have to be toned down. It's a very different situation from the current gen.
Guys, are we reading the same thread?
First, there is no such thing as a circle of jerk-offs, and secondly, the assumption that the Wii U's insides contain a modified E6760 GPGPU is based on the fact that this chip features all the functionalities required by the Wii U (made by AMD, Eyefinity for the uPad support, is a GPGPU, power consumption in line with the console's constraints, etc.). It is the closest off-the-shelf component that fits the puzzle.
Why do we deserve such a mention for our investigation work? The E6760 was mentioned in this thread before someone sent a mail to AMD, and back then we already thought it was the culprit. So please, I like to discuss, but not to be insulted.
So the Wii U will probably be in between current gen and next? I'm OK with that.
128 GPRs is what an ALU clause in a single R700 thread can address (a clause is a schedulable sub-routine, put loosely). An R700 SIMD engine runs 64 threads. A GPU has multiple SIMD engines. You could say that a R700 has 'a tad more' than 128 GPRs.
Here's something educational: http://gpgpu.org/wp/wp-content/uploads/2009/09/E1-OpenCL-Architecture.pdf (pay particular attention to pg. 10)
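To make the "a tad more than 128 GPRs" remark concrete, here is a sketch of the arithmetic: the 128-GPR clause limit and 64 threads per SIMD engine come from the post above, while the SIMD engine count (10, an RV770-class figure) is an illustrative assumption that varies per chip.

```python
GPRS_PER_CLAUSE = 128   # max GPRs one ALU clause in a single thread can address
THREADS_PER_SIMD = 64   # threads an R700 SIMD engine runs
SIMD_ENGINES = 10       # assumption: RV770-class part; varies per chip

# Naive upper bound on architecturally addressable registers across the chip:
total_gprs = GPRS_PER_CLAUSE * THREADS_PER_SIMD * SIMD_ENGINES
print(total_gprs)  # 81920
```

The physical register file is allocated per wavefront, so this is a rough upper bound rather than the actual register file size, but it shows why a per-clause "128 GPRs" figure says very little about the chip as a whole.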
Yeah, for some people it seems easier to overreact to someone else having a different opinion than to just discuss rationally why they disagree.
AFAIK, that number did not change for the duration of the VLIW era, i.e. a Cayman ALU clause was still limited to 128 GPRs.

If Matt was referring to the GPRs that an ALU clause in a single GPU thread can address, what number does the e6760 have for that?
Brad Grenz, this is the way out.

It's hard to have a rational discussion with people who don't understand what they're talking about and ignore any evidence that disagrees with their conclusions, no matter the source. We spent a dozen pages in this very thread with people in denial about the PSU and power usage of the WiiU as people attempted to deny the facts given to us directly by Iwata. See also the Power7 discussion, which was always an argument between people with the technical expertise to know anything actually resembling the Power7 would never work in the WiiU and the overeager fanboys who latched on to a couple ambiguous and/or ill-informed tweets to argue for its inclusion.
There are pages and pages of rational arguments in this very thread for why it can't be e6760, but too many are willing to ignore that based on easily faked emails from a source we've been told (and should have known intuitively) would have no actual knowledge in any case.
However hasn't there been talk that amd dx10 chips are technically capable of all dx11 effects and features anyway but did them in a different way to Microsoft's standard so aren't certified for it (and I guess unusable on PC)?
Just off the top of my head:
On the tessellator side of things, pre-DX11 hardware was much less programmable (not just an API limitation) and would require more steps/render passes to achieve similar results - so yes, "possible". There were some tweaks in R7xx for geometry shader interaction, so that's "getting there".
On compute side, there's a pretty clear distinction in HW capability between SM4.x and SM5.0. It's less about exposing functionality (unlike DX9 console vs DX9 PC) - the HW just can't do certain things (accessing resource types, # of threads/shared mem etc). For example, nVidia actually did have atomics in G80, but that functionality wasn't made part of the standard until SM5.0 because AMD lacked the HW capability up until Cypress.
There are some other things like texture formats that would need to be added in hardware, but that ought to be rather trivial by comparison to wholly upgrading the compute capabilities & tessellator stages.
Wow, a 6760 is only a couple of steps down from my 6850. My rig feels old...

No worries mate, your Radeon HD 6850 is well above this thing. Based on AMD's metrics, the E6760 develops a bit less than 600 GFLOPS, whereas yours outputs a healthy 1500 GFLOPS. It is not the same world.
Thank you for the clarification, Alstrong. If Nintendo did decide to enhance the GPU with some features beyond DX10.1 due to third-party devs' feedback, is it safe to assume that enhancing the GPGPU capabilities and revamping the tessellation stages would be the major fixes?
The number codes aren't equivalent between embedded and PCI-E models.
The problem with this kind of thinking is that we don't really know which info we got about old dev kits, who gave us second-hand information, and what information we have now.
That is why I strongly think we should try to figure out what we actually know before we declare it as fact. For instance, unified memory doesn't really make complete sense: given that the system and game resources are completely separate right now, who is to say that they aren't simply two different pools?
About the e6760: if the original dev kits used HD4850s to approximate the performance of a chip, the e6760's performance does fall perfectly in line with this. I'm not saying it is one, but it also really doesn't matter if it's not; we are getting a grasp on the performance, and this fits the power draw and the targeted performance.
It works with all of our rumors too. Also, while the e6760 is part of the "Turks" family, it uses "Evergreen" shaders, which is the HD5000 series. Embedded graphics cards are also perfect for a low-powered console, and looking at the software side of it, the e6760 fits the bill quite well. You could call it a full-featured GPU lacking teeth (even if its performance can match a 2008 1TFLOP card, that doesn't mean it does it with raw power).
I also want to point out that the 40nm chip was likely shrunk. What exactly does 1GB of GDDR5 cost? 5 watts? I am not sure, but let's use that number so we can move forward with our speculation: 30 watts is what the e6760 runs at now. Shrunk to 32nm or 28nm (there was the rumor about problems manufacturing the GPU, which certainly isn't something you'd see at 40nm), at 28nm it's easy to push the GPU to 20 watts or less, but let's be realistic and say 25 watts for 32nm. 8-10 watts for the CPU and another 10 watts for the rest of the system would reach that 45 watts fairly easily.
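The power budget being sketched above, written out explicitly. Every figure here is the post's own guesswork (die-shrink savings included), not a confirmed spec:

```python
# All values in watts; taken from the speculation above, not from any spec sheet.
budget = {
    "GPU (e6760-class, assumed shrunk to 32 nm)": 25,
    "CPU (three small PowerPC cores)": 10,       # post estimates 8-10 W
    "RAM + rest of the system": 10,
}
total = sum(budget.values())
print(f"Total: {total} W")  # Total: 45 W
```

The point of laying it out like this is that the rumoured ~45 W envelope is reachable with an e6760-class GPU only if the die shrink actually materialises; at the stock 35 W TDP the budget would be much tighter.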
My guess is that the Wii U is 3 custom 476fp cores built on Power7 tech (the SOI and embedded RAM), sharing the 32MB eDRAM with the custom embedded GPU7, which is 5000-series or 6000-series shaders combined with 1GB of RAM and some fixed functions to emulate Flipper more accurately. I do think we are talking about something beyond the 4000 series as a base; in fact, I haven't seen one single semi-reliable rumor confirming that the Wii U is still using something based off the R770 (which is what the HD4850 is). Instead, all I see is a lot of people who don't want to get hyped, so they downplay whatever the specs could be and hold onto rumors like the OP's, which comes at best from second-hand info from someone who knows very little about what that info means.
Arkam has been great in trying to give us this info, even when he was attacked, but just because he was told something about the system a year ago doesn't mean that is what the system is now.
I really hope this gets through to some of you. I understand that you have a lot of insider info being thrown at you, but I think you are mixing info from old sources, new sources, and unreliable sources in such a way that it becomes useless information.
A stock E6760 scores 5870 in 3DMark Vantage, which is higher than the HD 4850.
...
http://www.em.avnet.com/en-us/desig...60-Embedded-Discrete-Graphics-Processors.aspx
I'm somewhat confused here. According to the document, it's 5870 in 3DMark Vantage Performance (P), but I'm finding systems with 4850s scoring well above that (blue bars).

Maybe it's the average, or the High preset on AMD's page? Idk.
CJ? What is this?

Bah. The "CJ" comment appears again.
Arkam only started posting this year and even indicated in this thread that it's not a "full DX11 equivalent". Also, Nintendo's not going to scrap years of R&D just like that.