Rumor: Wii U final specs

I'd be really surprised if the CPU is clocked lower than 2 GHz considering the apparent quality of the multiplatform titles seen so far. Tekken Tag 2 is a good example - that game is already making the 360 and PS3 creak with its dynamic resolution drops to maintain framerate. Aside from possibly fewer fish in the boat stage, it appears comparable to the PS360 versions.

Resolution is GPU dependent only, not affected by CPU.

And compared to Xenon, a 3 core Broadway @ 1.5 GHz wouldn't be underpowered.
 
I'd be really surprised if the CPU is clocked lower than 2 GHz considering the apparent quality of the multiplatform titles seen so far. Tekken Tag 2 is a good example - that game is already making the 360 and PS3 creak with its dynamic resolution drops to maintain framerate. Aside from possibly fewer fish in the boat stage, it appears comparable to the PS360 versions.
Resolution drops are entirely unrelated to CPU performance. In fact, I'd be hard pressed to think about anything more unrelated ;)

CPU clock speed and CPU power often have little in common
True. But that didn't stop people from jumping at me when I dared suggest that the CPU might be clocked more than just slightly lower than Cell/Xenon.
 
That's not how CPUs work... -_-

My god what's with the sudden doom and gloom on the ultra low CPU specs? Developers were saying they felt the CPU was a LITTLE weak, not massively underpowered.
I believe that it was only one developer on record, who later clarified their remarks to say that they were still coming to grips with the hardware, although one of the spec insiders (I can't remember who) reported that devs they spoke to were complaining that the CPU wasn't on par with Xenon/Cell.

Those reports were from a while ago so it could have been an early dev kit issue.
 
No. You can't store many textures in 32 MB. ;)
The "1 GB for games" is meant to serve as main and VRAM, the eDRAM being a helpful addition nonetheless of course. It's just like what we have in Xbox 360.
Bad comparison, it seems closer to the Wii. Same MEM1/ MEM2 terminology, too. The eDRAM is "real", general purpose RAM.
 
I'm making the Tekken comparison in the context of Katsuhiro Harada personally observing that his team was having trouble equalizing performance on the Wii U version.

If his team actually did have difficulty, then I assume it wasn't that much considering how close the Wii U version turned out.
 
That's not how CPUs work... -_-

My god what's with the sudden doom and gloom on the ultra low CPU specs? Developers were saying they felt the CPU was a LITTLE weak, not massively underpowered.
I want to know why only Japanese developers complained about the CPU.

We haven't had a western dev go on record with similar statements. In retrospect, comments suggesting Wii U is actually very powerful have been mostly western.

I really suspect it's an issue with programming expertise. It could also explain why Ideaman's sources actually liked the CPU (assuming they're western).
 
No. You can't store many textures in 32 MB. ;)
The "1 GB for games" is meant to serve as main and VRAM, the eDRAM being a helpful addition nonetheless of course. It's just like what we have in Xbox 360.

Isn't VRAM a kind of ambiguous term? Technically it refers to a type of RAM that hasn't been used for many years, but it's often used to describe any memory used for graphics. In a system with unified memory plus a few MBs of embedded framebuffer memory, the eDRAM can be referred to as VRAM. That's what Nintendo does in the 3DS documentation, while most games put the bulk of their textures in main memory (though I'm able to squeeze almost all of the textures in Gunman Clive into VRAM).
 
Isn't VRAM a kind of ambiguous term? Technically it refers to a type of RAM that hasn't been used for many years, but it's often used to describe any memory used for graphics. In a system with unified memory plus a few MBs of embedded framebuffer memory, the eDRAM can be referred to as VRAM. That's what Nintendo does in the 3DS documentation, while most games put the bulk of their textures in main memory (though I'm able to squeeze almost all of the textures in Gunman Clive into VRAM).

Is it 6 MB on the 3DS?
 
I want to know why only Japanese developers complained about the CPU.

We haven't had a western dev go on record with similar statements. In retrospect, comments suggesting Wii U is actually very powerful have been mostly western.

I really suspect it's an issue with programming expertise. It could also explain why Ideaman's sources actually liked the CPU (assuming they're western).
Ideaman's sources are western (although cryptic, he has revealed that much) but I'm not sure if I'd go as far as saying this is an issue with Japanese developers. The CPU doesn't appear to be a continuation of the PS3 and 360's super CPUs, which might be what tripped up devs in the beginning. It could be functionally more powerful than Xenon/Cell but not in pure clock speed.

In fact, based on the rumors I'm not sure if either MS or Sony will use CPUs in their new systems as powerful as Xenon and Cell were when they were released.
 
I want to know why only Japanese developers complained about the CPU.

We haven't had a western dev go on record with similar statements. In retrospect, comments suggesting Wii U is actually very powerful have been mostly western.

I really suspect it's an issue with programming expertise. It could also explain why Ideaman's sources actually liked the CPU (assuming they're western).

Well, they didn't praise it, but they didn't complain about it once either, and I talked with them a lot and obviously asked about this specific component.

Ideaman's sources are western (although cryptic, he has revealed that much) but I'm not sure if I'd go as far as saying this is an issue with Japanese developers. The CPU doesn't appear to be a continuation of the PS3 and 360's super CPUs, which might be what tripped up devs in the beginning. It could be functionally more powerful than Xenon/Cell but not in pure clock speed.

In fact, based on the rumors I'm not sure if either MS or Sony will use CPUs in their new systems as powerful as Xenon and Cell were when they were released.

Yup, western. My Japanese sources have been totally tight-lipped for a year now. And for the CPU, it's really more a question of learning curve, of optimization. Although well designed, the Wii U CPU requires you to grow more and more familiar with it; it's not a "CPU so powerful it can run everything on the fly without even knowing its specificities". To illustrate that, some studios managed to self-gimp the performance of their engine by not using the Wii U CPU properly for months.
 
Bad comparison, it seems closer to the Wii. Same MEM1/ MEM2 terminology, too. The eDRAM is "real", general purpose RAM.

360's eDRAM is different? Can you elaborate?


Isn't VRAM a kind of ambiguous term? Technically it refers to a type of RAM that hasn't been used for many years, but it's often used to describe any memory used for graphics. In a system with unified memory plus a few MBs of embedded framebuffer memory, the eDRAM can be referred to as VRAM. That's what Nintendo does in the 3DS documentation, while most games put the bulk of their textures in main memory (though I'm able to squeeze almost all of the textures in Gunman Clive into VRAM).

I see your point. Yes, one could call the eDRAM VRAM. But still, the main memory is also used for what you'd call VRAM purposes.
 
360's eDRAM is different? Can you elaborate?
Xenos eDRAM is a dedicated framebuffer, comparable to the 2MB eFB in Flipper and Hollywood. The Wii U eDRAM seems to be a high speed, general purpose scratch pad - it could be used for anything, including, but not limited to, a framebuffer.
 
Xenos eDRAM is a dedicated framebuffer, comparable to the 2MB eFB in Flipper and Hollywood. The Wii U eDRAM seems to be a high speed, general purpose scratch pad - it could be used for anything, including, but not limited to, a framebuffer.

What are you basing this on? Also, isn't it 3 MB in Flipper/Hollywood, which I'm fairly sure can be used for textures as well?
 
What are you basing this on? Also, isn't it 3 MB in Flipper/Hollywood, which I'm fairly sure can be used for textures as well?

2MB framebuffer
1MB texture cache

they are split.

Also, I had no clue the 32MB of eDRAM could be used for anything; that alone really should change the discussion.
 
It's not in mass production yet, so no. I'd say there's a 90% chance it's DDR3, and a 10% chance that it's some obscure kind of RAM I've never heard of before.

The modules seem kinda small for DDR3. Maybe it is an obscure RAM? After all, this IS Nintendo (1T-SRAM, FCRAM).
 
If they use APUs, as the rumors suggest, then yes they will be. AMD's HSA should provide a healthy leap for gaming and compute applications.

I didn't mean that they wouldn't be a big leap but I don't expect them to be as fast in pure clocks or as power hungry as Xenon/Cell. I could be wrong though as I haven't started following the other new systems as closely.

This is kind of what I expect going from current gen to next gen processors.

If I give Xenon/Cell the equation 4+4+4+4+4+4=24 I'd expect both to complete that 1 million times per second (not real)

If I give the Wii U CPU the same equation, it performs it 800,000 times per second.

Now I give Xenon/Cell the equation 3x4+3x4=24. They don't understand this equation given their instruction set (again, not real).

I give WiiU CPU the same equation and it performs it 1.5 million times per second.

I don't know if any of that makes sense but it's just what I'd expect as CPUs get more efficient at handling instructions at reduced clock speeds as technology improves. I would imagine that the same applies to the other new systems CPUs as well.
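
To put rough numbers on that analogy, here's a quick Python sketch - the clocks and instruction counts are completely made up, purely to illustrate the idea that needing fewer instructions per result can beat a higher clock:

# Purely illustrative: invented clocks and instruction counts, not real console specs.
def results_per_second(clock_hz, instructions_per_result):
    # naive model: assume one instruction retired per cycle
    return clock_hz / instructions_per_result

# "old" CPU: only knows how to add, so 4+4+4+4+4+4 takes 6 instructions
old_cpu = results_per_second(3.2e9, 6)

# "new" CPU: has a fused multiply-add, so 3*4 + 3*4 takes 2 instructions
new_cpu = results_per_second(1.6e9, 2)

print(f"old CPU: {old_cpu:.2e} results/s")  # ~5.3e+08
print(f"new CPU: {new_cpu:.2e} results/s")  # ~8.0e+08 - ahead despite half the clock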
 
What happened to all those strange patents Nintendo secured ages ago (holographic storage, etc.)? Were any of those applicable to RAM modules?

Nintendo may have a technology they are interested in, but I figure after all those blogs on various products over the last few years, people would learn that something they have now might not be right to launch at this time. They had the Wii tech since the GC but couldn't use it for a variety of reasons.
 
What happened to all those strange patents Nintendo secured ages ago (holographic storage, etc.)? Were any of those applicable to RAM modules?

Edit: Just to be clear I'm not suggesting the RAM is some new holo-RAM.

Holographic patents were for a new form of disc based media that could hold hundreds to thousands of gigabytes
 
What are you basing this on? Also isn't it 3 MB in flipper/hollywood, which I'm fairly sure can be used for textures as well
I'm basing it on earlier rumors and the fact that Nintendo kept the MEM1/ MEM2 terminology from the Wii (assuming the info in the OP is indeed copied from warioworld.com). Flipper/ Hollywood eDRAM is split:

[Image: diagram of the Flipper/Hollywood eDRAM split]


eFB is the framebuffer, eTC is the texture cache.
 
Nintendo may have a technology they are interested in, but I figure after all those blogs on various products over the last few years, people would learn that something they have now might not be right to launch at this time. They had the Wii tech since the GC but couldn't use it for a variety of reasons.

Holographic patents were for a new form of disc based media that could hold hundreds to thousands of gigabytes
OK, I remembered there being a period where there was a slew of new patents registered, but I couldn't remember if any of them applied to components of the Wii U that hadn't previously been identified.
 
The CPU doesn't appear to be a continuation of the PS3 and 360's super CPUs

Nor should it be.

The Xenon and Cell CPUs by comparison to today's technology are archaic.

Also don't fall for the BS hype that Sony and Microsoft spouted about the Xbox 360 and PS3's CPU power; both CPUs had significant limitations. While both CPUs excelled at SIMD and high floating point calculations, they were basically shit for anything else.

To put it into perspective, sound and I/O processing combined can easily take up an entire thread on an Xbox 360 CPU, if not more. Kind of sad to think a 3.2 GHz core can have 1 of its 2 threads taken up entirely by sound and I/O processing. A dedicated DSP and I/O controllers of only a few hundred megahertz could do a better job than the Xenon processor at both tasks.


which might be what tripped up devs in the beginning. It could be functionally more powerful in than Xenon/Cell but not in pure clock speed.

Clock speed means very little.

The Xbox 360 and PS3's CPUs needed high clock speeds to perform. They were in-order CPUs with limited cache and a heavy focus on SIMD and high floating point calculations, so both Xenon and Cell were very dependent on clock speed for their performance.

These days SIMD and heavy floating point tasks can be offloaded to the GPU - think CUDA, GPGPU, etc. Modern CPUs really don't need to match the Xbox 360 or PS3 for SIMD or floating point calculations; GPUs can do these tasks significantly faster than Cell and Xenon ever could, at a lower cost and while consuming less power.

As for the Wii U's CPU, given it appears to be out-of-order, has large L1-L3 cache pools, doesn't have to do audio, I/O, and OS tasks like Xenon and Cell do (Nintendo has dedicated chips for these), and doesn't need that raw SIMD performance, its clock speed can be a lot lower while still delivering the same or better performance.

Also, Intel made Pentium 4 CPUs at 3.8 GHz with Hyper-Threading. Do you honestly believe a P4 CPU could keep up with even a single core of an equally or lower clocked i7? No, it would get smashed at any and every task. Again showing just how little clock speed means as architecture and technology evolve. Clock speeds for CPUs have barely changed even in the PC market for years; these days it's all about TDP, performance per watt, efficiency, multi-core, SMT, Hyper-Threading, etc. The days of clock speed races died in the early 2000s.
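
A quick sketch of that point - the IPC figures below are invented just to show the shape of the argument, not measured values:

# Hypothetical numbers only: effective throughput ~ instructions per cycle (IPC) x clock.
def effective_throughput(ipc, clock_hz):
    return ipc * clock_hz

p4_like = effective_throughput(ipc=0.7, clock_hz=3.8e9)  # high clock, low IPC (invented)
i7_like = effective_throughput(ipc=2.5, clock_hz=2.8e9)  # lower clock, high IPC (invented)

print(f"P4-like: {p4_like:.2e} instructions/s")
print(f"i7-like: {i7_like:.2e} instructions/s")  # well ahead despite the lower clock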

In fact, based on the rumors I'm not sure if either MS or Sony will use CPUs in their new systems as powerful as Xenon and Cell were when they were released.

They won't have to be; the focus now is on tech like GPGPU, programmable shaders, etc. The GPU is far more important than the CPU in modern consoles.
 
I didn't mean that they wouldn't be a big leap but I don't expect them to be as fast in pure clocks or as power hungry as Xenon/Cell. I could be wrong though as I haven't started following the other new systems as closely.

This is kind of what I expect going from current gen to next gen processors.

If I give Xenon/Cell the equation 4+4+4+4+4+4=24 I'd expect both to complete that 1 million times per second (not real)

If I give the Wii U CPU the same equation, it performs it 800,000 times per second.

Now I give Xenon/Cell the equation 3x4+3x4=24. They don't understand this equation given their instruction set (again, not real).

I give WiiU CPU the same equation and it performs it 1.5 million times per second.

I don't know if any of that makes sense but it's just what I'd expect as CPUs get more efficient at handling instructions at reduced clock speeds as technology improves. I would imagine that the same applies to the other new systems CPUs as well.

Erm... I'm not sure if there's some radically new instruction set in new PPC archs...
 
Nor should it be.
Clock speed means very little.

The Xbox 360 and PS3's CPUs needed high clock speeds to perform. They were in-order CPUs with limited cache and a heavy focus on SIMD and high floating point calculations, so both Xenon and Cell were very dependent on clock speed for their performance.

These days SIMD and heavy floating point tasks can be offloaded to the GPU - think CUDA, GPGPU, etc. Modern CPUs really don't need to match the Xbox 360 or PS3 for SIMD or floating point calculations; GPUs can do these tasks significantly faster than Cell and Xenon ever could, at a lower cost and while consuming less power.

As for the Wii U's CPU, given it appears to be out-of-order, has large L1-L3 cache pools, doesn't have to do audio, I/O, and OS tasks like Xenon and Cell do (Nintendo has dedicated chips for these), and doesn't need that raw SIMD performance, its clock speed can be a lot lower while still delivering the same or better performance.

Also, Intel made Pentium 4 CPUs at 3.8 GHz with Hyper-Threading. Do you honestly believe a P4 CPU could keep up with even a single core of an equally or lower clocked i7? No, it would get smashed at any and every task. Again showing just how little clock speed means as architecture and technology evolve. Clock speeds for CPUs have barely changed even in the PC market for years; these days it's all about TDP, performance per watt, efficiency, multi-core, SMT, Hyper-Threading, etc. The days of clock speed races died in the early 2000s.



They won't have to be; the focus now is on tech like GPGPU, programmable shaders, etc. The GPU is far more important than the CPU in modern consoles.

You might be overstating the importance of offloading Audio, and underestimating SIMD instructions, and forgetting that the GPUs in the 360 and PS3 were also used for such tasks.
 
Nor should it be. *Lots of interesting info removed*

So basically CPU clock speeds and GFLOP numbers mean very little; at the end of the day the console will be judged by its big budget first party games at E3 2013, but by then we will have probably seen PS4 / 720 'tech demos'.

The argument will go from 'on par with PS360' to 'much less powerful than PS4/720' after E3 2013 lol :).
 
Erm... I'm not sure if there's some radically new instruction set in new PPC archs...
I don't believe there's one either, but my example was just based on my expectations, not the reality of what may be applied in IBM's new CPUs. They have been touting a lot of advancements in processor efficiency the last few years, at least on the business application side.
*redacted*
We're on the same page but you stated it on a much better technical level than I could.
 
You might be overstating the importance of offloading Audio, and underestimating SIMD instructions, and forgetting that the GPUs in the 360 and PS3 were also used for such tasks.

Care to elaborate?

When you look at the Xbox 360 and PS3's audio capabilities - i.e. number of channels, bit rate, encoding, etc. - and factor in the load audio places on the CPU, a DSP of only a few hundred MHz could equal or surpass both. To consume and lock an entire Xenon thread just for sound is incredibly inefficient.

Xenon and Cell really are not ideal for audio tasks; using 10-20% of their resources just on audio is insane.

As for SIMD, how am I understating it? Most of the routines you'd require SIMD or heavy floating point for can be done equally well or better by a modern GPU. Also, Xenon and Cell placed an incredibly high importance on SIMD and floating point calculation abilities, to the detriment of other abilities.

With Nintendo going down the route of a dedicated DSP, I/O, and an ARM processor for the OS, as well as the GPU having GPGPU capabilities, I don't think the CPU needs to have incredibly strong SIMD/floating point capabilities like Cell and Xenon.

Also, as for the Xbox 360 and PS3's GPUs being capable of these tasks, indeed they were. But compared to modern GPU architecture, both the Nvidia and ATi GPUs in the Xbox 360 and PS3 are incredibly limited and inefficient at this.

Even a bargain basement Evergreen ATi GPU with 320-480 SPUs would mop the floor with the Xbox 360 and PS3's GPUs when it comes to GPGPU and SIMD-ish/floating point abilities.
 
Xenon and Cell really are not ideal for audio tasks; using 10-20% of their resources just on audio is insane.

As for SIMD, how am I understating it? Most of the routines you'd require SIMD or heavy floating point for can be done equally well or better by a modern GPU. Also, Xenon and Cell placed an incredibly high importance on SIMD and floating point calculation abilities, to the detriment of other abilities.

With Nintendo going down the route of a dedicated DSP, I/O, and an ARM processor for the OS, as well as the GPU having GPGPU capabilities, I don't think the CPU needs to have incredibly strong SIMD/floating point capabilities like Cell and Xenon.

Also, as for the Xbox 360 and PS3's GPUs being capable of these tasks, indeed they were. But compared to modern GPU architecture, both the Nvidia and ATi GPUs in the Xbox 360 and PS3 are incredibly limited and inefficient at this.

Even a bargain basement Evergreen ATi GPU with 320-480 SPUs would mop the floor with the Xbox 360 and PS3's GPUs when it comes to GPGPU and SIMD-ish/floating point abilities.

SIMD instructions are largely for either media decoding or use in 3D applications, both of which I doubt will make much use of the Wii U CPU. You're trying to make it seem like it's a bad thing?

I have NEVER heard of audio taking more than 10%, let alone 20%, consistently in any modern generation game, which is why both on PC and console they've been handled on the CPU. On any decent CPU a separate audio DSP wouldn't matter. I would hope its presence is more so to help backwards compatibility.
 
SIMD instructions are largely for either media decoding or use in 3D applications, both of which I doubt will make much use of the Wii U CPU. You're trying to make it seem like it's a bad thing?

I have NEVER heard of audio taking more than 10%, let alone 20%, in any modern generation game, which is why both on PC and console they've been handled on the CPU. On any decent CPU a separate audio DSP wouldn't matter. I would hope its presence is more so to help backwards compatibility.

There was some talk that some 360 games even used a whole core (not just a thread) for audio so that'd be 33%
 
There was some talk that some 360 games even used a whole core (not just a thread) for audio so that'd be 33%

Sounds hilariously inefficient. Link me.

Also, a thread is just a way of saying the number of instruction streams running at a single time. For a processor that does literally billions of instructions per second that's not much of a task. In other words, a thread can refresh over 300-400 million times a second on Xenon.
 
Keep in mind guys, 360 games use as much as a full CPU core just for handling audio. Wii U has a separate DSP for that.

EDIT: beaten, lol. I had this page open for too long before replying. Damn you Sandy and your internet-fucking-up ways.
 
One thing that hasn't been mentioned for a while is Ancel's comments regarding 'almost no memory limitations'. I was thinking that a chunk of that 8GB/32GB of flash could be reserved for swap space. SLC flash would be fast enough and less prone to errors compared to MLC flash. Even as little as 512MB would be a major boon for developers.

You'd then have 32MB of eDRAM to handle textures and AA, 3MB of eDRAM as cache for the CPU, 1GB of what's likely to be highly clocked DDR3, and 512MB of flash to use as swap space.

Ancel's wording of 'almost no memory limitations' doesn't sound like he was just talking about the difference between 512MB and 2GB, otherwise he would have said twice as much or four times as much.

Unless the eDRAM on-die for the GPU makes that much of a difference..?
 
Actually, after reading through what everyone has been saying on this page, it sounds as if what was tripping up Wii U devs initially is that there are functions that were included on Xenon/Cell that have been offloaded to sub-processors in the Wii U, not that the Wii U CPU has additional functionality that devs weren't applying to their code, as I initially thought.

Sorry Snowman.
 
Actually, after reading through what everyone has been saying on this page, it sounds as if what was tripping up Wii U devs initially is that there are functions that were included on Xenon/Cell that have been offloaded to sub-processors in the Wii U, not that the Wii U CPU has additional functionality that devs weren't applying to their code, as I initially thought.

Sorry Snowman.

You are correct. Nintendo has supplied what I like to call toys to play with that are beyond the usual. Using those toys greatly assists in overall system performance as the toys are meant to be used.

Small teams that can't use these toys, whether due to cost or just laziness, will suffer.

Harada got quoted a lot when he commented that the Wii U CPU was clocked lower than Xenon etc., and he is right. The problem is people took that and ran with it as "omg Wii U CPU sucks" etc., and he even came back and complained about the fact that people were twisting his words.
 
*Puts the Ideaman signal into the air and sees if he is around*.

You are obviously in contact with at least a few sources - do you know if the hardware NDA will be lifted when the console launches (can't believe I'm asking this less than 3 weeks before launch lol)?

Also, from the conversations you have had with these sources, are they impressed with the GPU, as in twice as powerful as PS360 (500+ GFLOPS), or is it just a small upgrade (300-400 GFLOPS)?

With all the previews in the past week of NL and ZombiU, it looks like the software NDAs are starting to lift, at least.

Thanks.
 
There was some talk that some 360 games even used a whole core (not just a thread) for audio so that'd be 33%

It's for I/O and audio, not just audio alone, and it was improved slightly over the years, but yeah, it is not the most efficient method. Kind of puts the whole "729 MHz Broadway core in the Wii being only 20% slower than a Xenon core" thing into perspective. A 900 MHz unenhanced tri-core Broadway would have probably given Xenon a run for its money back in the day. A tri-core enhanced Broadway (enhanced to run over 1 GHz) at under 2.2 GHz would run rings around Xenon, but I highly doubt the CPU is going to be over 2.2 GHz.
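
Taking that ~20% figure at face value (it's an estimate from the discussion, not a benchmark), a quick back-of-envelope in Python - naive linear clock scaling, ignoring Xenon's SMT and SIMD strengths:

# Back-of-envelope only: assumes "729 MHz Broadway core ~= 80% of a 3.2 GHz Xenon core"
# and scales linearly with clock speed, which is a big simplification.
def broadway_core_vs_xenon_core(clock_mhz):
    return 0.8 * clock_mhz / 729

print(round(broadway_core_vs_xenon_core(729), 2))   # 0.8   - the starting estimate
print(round(broadway_core_vs_xenon_core(900), 2))   # ~0.99 - roughly even, core for core
print(round(broadway_core_vs_xenon_core(2200), 2))  # ~2.41 - well ahead, core for core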
 
SIMD instructions are largely for either media decoding or use in 3D applications, both of which I doubt will make much use of the Wii U CPU. You're trying to make it seem like it's a bad thing?

I'm saying the complete opposite.

Tasks like media decoding or 3D app content will be handled by the Wii U's GPU, DSP, and I/O controller. ATi GPUs have long had hardware acceleration for most media codecs and even Adobe Flash. The GPU and DSP will no doubt be able to do these decoding/encoding tasks more efficiently than most if not all CPUs on the market in terms of performance per watt, load, resources consumed, etc.

Again highlighting why the Wii U's CPU likely won't need strong SIMD capabilities. Which is a good thing.

I have NEVER heard of audio taking more than 10%, let alone 20%, consistently in any modern generation game

It is the case with the Xbox 360: 3 cores with 2 SMT threads each, so a total of 6 SMT threads. Sound alone has been known to max out an entire thread, but commonly takes 1/3rd - 2/3rds of a thread; I/O plus audio together are known to max out an entire thread. So a 10-20% load is realistic just for sound and I/O operations. Not great for a 3.2 GHz tri-core CPU to be penalised that much with audio and input/output requests.
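
Rough arithmetic on that, treating each of Xenon's six hardware threads as an equal slice of the CPU (a simplification, since two SMT threads share a core's execution resources):

# Rough arithmetic only: 3 cores x 2 SMT threads = 6 hardware threads on Xenon,
# treating each thread as an equal 1/6 slice of the CPU.
cores, smt_per_core = 3, 2
threads = cores * smt_per_core                 # 6

third_of_thread = (1 / 3) / threads            # ~5.6%  - sound at 1/3 of a thread
full_thread     = 1 / threads                  # ~16.7% - sound + I/O maxing a thread

print(f"{third_of_thread:.1%} to {full_thread:.1%} of the CPU's thread slots")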

Also we have to factor in sound quality for the above. It's quite common for sound in Xbox 360 and PS3 games to use an incredibly low, compressed bit rate, limited sounds per audio channel, and a low sampling rate. So in-game audio on the Xbox 360 and PS3 tends to be of very poor quality, done intentionally to avoid taxing Cell and Xenon too hard.

A dedicated audio DSP of only 200 MHz could run rings around both Xenon and Cell when it comes to developers' ability to improve sound quality in their games.

which is why both on PC and console they've been handled on the CPU. On any decent CPU a separate audio DSP wouldn't matter. I would hope its presence is more so to help backwards compatibility.

You cannot compare PC CPUs to those within the Xbox 360 and PS3. AMD and Intel CPUs have been out-of-order for close to 20 years; Xenon and Cell aren't. Then you have completely different instruction sets, with Intel and AMD CPUs featuring instruction sets specifically for media encoding/decoding to improve performance - instruction sets the Xbox 360 and PS3 counterparts lack an equivalent of. Comparing PC to console is very, very bad.
 
You are correct. Nintendo has supplied what I like to call toys to play with that are beyond the usual. Using those toys greatly assists in overall system performance as the toys are meant to be used.

Small teams that can't use these toys, whether due to cost or just laziness, will suffer.

Harada got quoted a lot when he commented that the Wii U CPU was clocked lower than Xenon etc., and he is right. The problem is people took that and ran with it as "omg Wii U CPU sucks" etc., and he even came back and complained about the fact that people were twisting his words.

Are you referring to additional SIMD instructions by 'toys'?

I'm saying the complete opposite.

Tasks like media decoding or 3D app content will be handled by the Wii U's GPU. ATi GPUs have long had hardware acceleration for most media codecs and even Adobe Flash. The GPU and DSP will no doubt be able to do these decoding/encoding tasks more efficiently than most if not all CPUs on the market in terms of performance per watt, load, resources consumed, etc.

Again highlighting why the Wii U's CPU likely won't need strong SIMD capabilities. Which is a good thing.



It is the case with the Xbox 360: 3 cores with 2 SMT threads each, so a total of 6 SMT threads. Sound alone has been known to max out an entire thread, but commonly takes 1/3rd - 2/3rds of a thread. So a 10-20% load is realistic. Not great for a 3.2 GHz tri-core CPU to be penalised that much with audio alone.

Also we have to factor in sound quality for the above. It's quite common for sound in Xbox 360 and PS3 games to use an incredibly low, compressed bit rate, limited sounds per audio channel, and a low sampling rate. So in-game audio on the Xbox 360 and PS3 tends to be of very poor quality, done intentionally to avoid taxing Cell and Xenon too hard.

With the Cell and Xenon CPUs being in-order, they can't easily use that other 1/3rd - 2/3rds of a thread to process something else; they typically need to wait for a new clock cycle. That's why both CPUs are clocked so high: the higher clock helps them jump to the next routine ASAP and masks the inefficiencies of tasks like I/O and audio processing.



You cannot compare PC CPUs to those within the Xbox 360 and PS3. AMD and Intel CPUs have been out-of-order for close to 20 years; Xenon and Cell aren't. Then you have completely different instruction sets, with Intel and AMD CPUs featuring instruction sets specifically for media encoding/decoding to improve performance - instruction sets the Xbox 360 and PS3 counterparts lack an equivalent of.
The 360, PS3, PS2, Xbox and Gamecube all made use of media extensions on their GPU to some extent. It isn't a new thing in the console space. SIMD is essential in parallel processing, and isn't just something tacked onto CPUs. I'm not exactly sure what you're trying to prove...

Also, a thread doesn't equal a certain percentage over a second, because if so it would be literally impossible for single-core CPUs to multitask. You can't just use part of a thread...

And often the sound effects you hear are uncompressed, or the CPU does spend time uncompressing them, but that's a RAM issue. I've not seen actual proof of a game with only 256 or so channels taking up so much CPU time on Cell or Xenon. If you have a better example, link me, because honestly the audio DSP you are referring to is probably less than a tenth of the capability of the current gen CPUs...
 