DonMigs85
Member
guys, the WII U is capable of hiding its true power level.
just wait for it to turn super saiyan
Its power brick is big so that it can open and suck up the 360 and PS3 and become Perfect Wuu.
Can someone please explain if this is a good thing or not? The eDRAM is 14nm? Is this related to the CPU in some way? Sorry for the multiple questions, I just want to be clear on what we are talking about.
In IBM System z9 (and successor) mainframes, the System z Integrated Information Processor (zIIP) is a special purpose processor. It was initially introduced to relieve the general mainframe central processors (CPs) of specific DB2 processing loads, but currently is used to offload other z/OS workloads as described below.
IBM publicly disclosed information about zIIP technology on January 24, 2006. The zIIP hardware (i.e. microcode, as the processor's hardware does not currently differ from general purpose CPUs) became generally available in May 2006. The z/OS and DB2 PTFs to take advantage of the zIIP hardware became generally available in late June 2006.
Commercial software developers, subject to certain qualification rules, may obtain technical details from IBM on how to take advantage of zIIP under a Non-Disclosure Agreement.
Depending on the capacity model, a PU (processing unit) can be characterized as a Central Processor (CP), Integrated Facility for Linux (IFL) processor, z Application Assist Processor (zAAP), z10 Integrated Information Processor (zIIP), or Internal Coupling Facility (ICF) processor. (The specialty processors are all identical and IBM locks out certain functions based on what the processor is characterized as.) It is also possible to configure additional System Assist Processors (SAPs), but most customers find the mandatory minimum SAP allocation sufficient.
The z196 microprocessor is a chip made by IBM for their zEnterprise 196 mainframe computers, announced on July 22, 2010. The processor was developed over a three-year time span by IBM engineers from Poughkeepsie, New York; Austin, Texas; and Böblingen, Germany at a cost of US$1.5 billion. Manufactured at IBM's Fishkill, New York fabrication plant, the processor began shipping on September 10, 2010. IBM stated that it is the world's fastest microprocessor.
It is fabricated in IBM's 45 nm CMOS silicon-on-insulator process.
It has four cores, each with a private 64 KB L1 instruction cache, a private 128 KB L1 data cache and a private 1.5 MB L2 cache. In addition, there is a 24 MB shared L3 cache implemented in eDRAM and controlled by two on-chip L3 cache controllers. There's also an additional shared L1 cache used for compression and cryptography operations.[1]
Each core has six RISC-like execution units.
Since the GameCube era, Nintendo has been rather conservative with its numbers. Maybe we'll see some better performance with the CPU just because it's not the awful Cell PPE architecture (BTW, Xenon could achieve a theoretical peak performance of 115GFLOPS. It probably can't reach half of that [hell, I've heard as low as 30-40GFLOPS]. John Carmack has even said that the Xbox 360 is only 10X more powerful than the Wii [and Broadway has an "actual" peak rating of 2.9GFLOPS]).
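For what it's worth, those theoretical peaks are just cores x clock x FLOPs-per-cycle. Here's a quick Python sanity check; the per-cycle figures are the commonly assumed ones implied by the quoted totals, not anything official:

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    # theoretical peak = cores * clock * FLOPs issued per cycle per core
    return cores * clock_ghz * flops_per_cycle

# Xenon: 3 cores @ 3.2 GHz, ~12 FLOPs/cycle/core as implied by the 115 GFLOPS figure
print(peak_gflops(3, 3.2, 12))    # ~115.2
# Broadway: 1 core @ 729 MHz, 4 FLOPs/cycle (2-wide paired-single FMA)
print(peak_gflops(1, 0.729, 4))   # ~2.9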
OK, let me try this one last time and see how people react to these proposed specs:
CPU: IBM Tri-Core PowerPC "Enhanced Broadway" (2-way SMT, improved SIMD, etc.)
@1.2GHz, Approx. 25-35GFLOPS
Approx TDP: 7 Watts
GPU: Custom AMD Radeon HD GPU
300 Shader units
360 MHz clock speed
Approx 400 GFLOPS
Approx TDP: 20 watts
RAM: 1GB GDDR3 for games (clocked at 1100MHz)
1GB DDR3 for background applications (clocked again at 1100MHz)
DSP: commodity priced generic chip @ 120MHz
Sound reasonable? If not, add your thoughts.
I thought it was supposedly slightly under 600MHz (or was it 500MHz) according to... what's his face. Matt?
Yeah, Matt said that 600 was a little too high... hmm, now that I'm thinking about it, was he talking about Jigglyflops or MegaHertz?
Its power brick is big so that it can open and suck up the 360 and PS3 and become Perfect Wuu.
Then the Wii U absorbs the game tablet and becomes Super Wuu.
you guys are getting your DBZ metaphors mixed up, is it cell or is it buu?
oh shit, or is it super android 13??
By the way, do we also know if the Wii U's 1080p upscaling is comparable to or better than the 360's?
Wait, the eDRAM is only 13 MB? What happened to 32 MB? I thought that was (somehow) confirmed for months?
OK, let me try this one last time and see how people react to these proposed specs:
CPU: IBM Tri-Core PowerPC "Enhanced Broadway" (2-way SMT, improved SIMD, etc.)
@2.2GHz, Approx. 35-45GFLOPS
Approx TDP: 7 Watts
GPU: Custom AMD Radeon HD GPU
400 Shaders
550MHz clock speed
Approx 500GFLOPS
Approx TDP: 28 watts
RAM: 1GB GDDR5 for games (clocked at 1100MHz)
1GB DDR3 for background applications (clocked again at 1100MHz)
DSP: ARM11-based @ 225MHz
Sound reasonable? If not, add your thoughts.
By shaders they're referring to scalar ALUs. Wouldn't 400 shaders at 550MHz be more like 1.76 TF?
By shaders they're referring to scalar ALUs.
Yea, I was looking at the specs for Xenos and RSX, where the term shader is used to refer to vector5/vector4 ALUs, but noticed that on the wiki pages for modern GPUs it seems to refer to individual scalar ALUs, going by the flops. Not sure why...
Either way, people are doing some weird approximations with the flops in their speculations.
So FLOPS calculations are easy with AMD; the formula is # of individual shaders * 2 * (clockspeed/100)
This imaginary 400 shaders x 2 x .550 = 440GFLOPs. A bit low. 480 shaders, however, gives us 528GFLOPs; maybe a bit low still, but much closer to that ~600GFLOPs rumor (however, BGassassin believed it to exceed 600GFLOPs).
My imaginary shot in the dark is 640 shaders @ ~488MHz (DSP's clock speed x 4), giving us 624GFLOPs.
However soon we will likely know quite a bit more.
Looking at your maths, shouldn't that be "clockspeed/1000"? Or more accurately "clockspeed/1048576000"?
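To make the arithmetic concrete, here's that rule of thumb as a quick Python snippet (assuming the /100 above was indeed meant to be /1000, with the clock in MHz and 2 FLOPs per ALU per clock; the last line shows where the 1.76 TF figure comes from):

def amd_gflops(shaders, clock_mhz):
    # 2 FLOPs (multiply-add) per scalar ALU per clock
    return shaders * 2 * (clock_mhz / 1000.0)

print(amd_gflops(400, 550))   # 440.0
print(amd_gflops(480, 550))   # 528.0
print(amd_gflops(640, 488))   # ~624.6
# Treating each "shader" as a vec4 unit instead gives the 1.76 TF figure:
print(amd_gflops(400 * 4, 550) / 1000.0)   # ~1.76 TFLOPS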
Do you think there's a chance that Nintendo would allow developers in the future to unlock the WiiU full power? Aka not being forced to stream to the Gamepad? Just having a game run completely with the TV and classic controller/ or only on gamepad?
I don't think Nintendo is forcing developers to stream to the gamepad now.
Developers were never forced to use the gamepad at all.
OK, let me try this one last time and see how people react to these proposed specs:
CPU: IBM Tri-Core PowerPC "Enhanced Broadway" (single thread per core, 4-5 instructions per cycle, improved SIMD, etc.)
@1.6 GHz, Approx. ? GFLOPS
Approx TDP: 7 Watts
GPU: Custom AMD Radeon HD GPU
320 Shaders
532 MHz clock speed
Approx 340 GFLOPS
Approx TDP: 28 watts
RAM: MEM1: 32 MB Renesas eDRAM (UX8GD), 532 MHz, <2ns latency
MEM2: 2 GB DDR3 (1 GB available initially for games) clocked at 1064 MHz
128-bit bus: 1064 MHz (34 GB/s) connection to GPU, 532 MHz (17 GB/s) to CPU
I/O: Dual-core ARM Cortex-A5 at 532 MHz
DSP: 133 MHz, possibly Cortex-M3 based
Sound reasonable? If not, add your thoughts.
On the RAM front, I'd say you're right in that it's 2GB DDR3 at about 1GHz on a 128-bit bus. Regarding the split between CPU/GPU bandwidth, my guess is that there's a memory controller on the GPU die which handles RAM access for both chips (and eDRAM access for the CPU), and there's 34GB/s or so of bandwidth shared between the CPU and GPU. I don't see any reason for CPU to be limited in bandwidth compared to the GPU, considering they're on the same MCM.
Can it work that way? I based a lot of the above off what they did on GameCube, as it seems they kind of used that as a template and then started replacing units with new compatible ones piece by piece. I'm with you on the memory controller being on the GPU, but it seems most IBM cores use a 2:1 ratio of core:memory clock at the lowest. So maybe a 798 MHz connection to the CPU (keeping the multipliers I used intact), but, theoretically speaking, could it run that fast if the GPU is only 532 MHz and the memory controller is on the GPU?
Edit: Yes, I've been waiting for that particular information to emerge any day now. I hope I don't have to do it myself.
Edit 2: Actually, another way for the CPU to gain faster access to the RAM without upping the clocks is if they had a fatter bus between the CPU and GPU. We know little about that connection, but considering the proximity of the chips, perhaps it isn't that ludicrous. But then you start thinking about balance and wonder, "Does the CPU Nintendo used even require that much bandwidth?"
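For reference, the bandwidth numbers in that spec sketch fall out of simple bus math. Here's a tiny Python check; the two-transfers-per-clock assumption is mine (i.e. treating those clocks as DDR base clocks):

def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=2):
    # bytes per transfer * transfers per second, in GB/s
    return (bus_bits / 8) * clock_mhz * transfers_per_clock / 1000.0

print(bandwidth_gb_s(128, 1064))   # ~34.0 GB/s (proposed GPU connection)
print(bandwidth_gb_s(128, 532))    # ~17.0 GB/s (proposed CPU connection)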
That runs contrary to what we've been recently hearing, actually. Well, in regards to pre-orders: when it comes to non-preordered systems, then unless you're lucky with the waitlist or willing to wait in a line for a long time, you're probably screwed.
This may be old news, but some of you with pre-orders may end up disappointed this weekend. Nintendo's having manufacturing trouble with the gamepad, specifically. The amount of consoles going out is less than what was planned and they're going to have a very difficult time satisfying the holiday rush for the console.
StevieP
Doesn't actually understand technology or have insider info.
Seems reliable.
Made my own changes. As far as shader count goes, anything between 320 and 400 seems reasonable. The R700 chips seem to scale in groups of 40, though. I like 320 shaders, and thus 32 TMUs, for Wii BC purposes. A key point in getting that to work is replicating the Wii's 1 MB on-chip texture cache. The texture memory on Flipper was broken up into 512 banks, 32 of which could be accessed simultaneously. Each of those 32 banks also had its own address bus. In my design for BC, rather than add more L1 texture cache (expensive SRAM), they would bypass it and use 1 MB of the 32 MB eDRAM for this purpose. So the eDRAM on Wii U would have to be similarly split up into many banks (at least 512) and have 32 address buses.
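Purely to illustrate the banking idea (the numbers below are the Flipper TMEM figures from that post; the address-to-bank mapping is just a guess, not the actual hardware scheme):

TEXTURE_MEM_BYTES = 1 * 1024 * 1024          # 1 MB carved out of the 32 MB eDRAM
NUM_BANKS = 512                               # banks, as on Flipper's TMEM
BANK_BYTES = TEXTURE_MEM_BYTES // NUM_BANKS   # 2 KB per bank
PORTS = 32                                    # banks readable in the same cycle

def bank_of(address, line_bytes=32):
    # Interleave consecutive lines across banks so nearby texel fetches
    # tend to land in different banks and can be read in parallel.
    return (address // line_bytes) % NUM_BANKS

# 32 fetches from consecutive lines hit 32 distinct banks -> no port conflict
fetches = [i * 32 for i in range(PORTS)]
assert len({bank_of(a) for a in fetches}) == PORTS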
Quote my tag if you must; I'm just passing on a bit of information that I came across. Take it or leave it. The situation with pre-orders is information in regards to my own country (Canada), where there may not be enough consoles to fill every pre-order made across all chains. Frankly, I expected this (the first priority for NCL has always been USA). The gamepad manufacturing thing was new to me, though. I don't know what to make of that little piece of information, or how accurate it is. But I was told specifically "they're having an issue with the gamepad, not the console".
Edit: Plinko - see in regards to my "Canada" comment.
I think the DSP will be 121.5MHz like the Wii's, since that's what the first leak mentioned. Also, wow at the lowball GPU GFLOP prediction. That could be the lowest I've ever seen here, lower even than USC-fan :O
The issue with the 121.5MHz DSP rumour, although I'm not disputing the validity of it at the time, is that it came out before Nintendo would have done full thermal testing on the final hardware, which would mean before clock speeds were finalised. It's entirely possible that 121.5MHz was the target speed for the DSP at the time, but it got changed (either upwards or downwards) as they fine-tuned the various clock speeds in the system.
IGN are saying Ninja Gaiden has worse graphics on the Wii U than the other consoles.
Sloppy ports ahoy! Unless you're trying to flip this thing for a profit, or you really need Nintendo Land, I have no idea why anyone would buy this at launch. It might very well turn out to be a promising system, but the level of effort in some of these ports is disconcerting.