Are you referring to additional SIMD instructions by 'toys'? lol.
It's quite clear what he's referring to.
The DSP, I/O processor, etc.
The Xenon and Cell processors lose a significant portion of their processing power and efficiency when they're tasked with audio, I/O, encoding/decoding, compression/decompression, etc.
Nintendo have gone down a route where, wherever possible, tasks like these get dedicated silicon to process them more efficiently.
DSP = audio
I/O = dedicated I/O processor
GPU = dedicated portions for encoding/decoding, decompression/compression, floating point, SIMD, and also programmable units allowing developers to utilize its power as efficiently as possible (see the short SIMD sketch after this list)
CPU = out-of-order execution and a large L1-L3 cache, both of which carry their own benefits for general-purpose processing
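For anyone unclear on what SIMD actually buys you, here's a minimal sketch using x86 SSE intrinsics (purely illustrative; Xenon and Cell use VMX/AltiVec-style units, not SSE): one instruction performs four float additions at once.

#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics */

int main(void)
{
    /* Two sets of four floats, packed into 128-bit registers. */
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
    __m128 sum = _mm_add_ps(a, b);   /* one ADDPS = four additions */

    float out[4];
    _mm_storeu_ps(out, sum);
    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}

Same principle whether it lives on a CPU or across a GPU's shader array: one instruction, many data elements.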
The Xenon and Cell did so much stuff that only a portion of their real power was free for developers to tap.
The 360, PS3, PS2, Xbox and GameCube all made use of media extensions on their GPUs to some extent.
Very limited compared to today's modern GPU architectures. I'm not sure why you keep going on about this. How many programmable units do the Xbox 360 and PS3 have compared to even the cheapest and nastiest GPUs on the market from AMD and Nvidia 2 years ago?
It isn't a new thing in the console space. SIMD is essential in parallel processing, and isn't just something tacked onto CPUs. I'm not exactly sure what you're trying to prove...
Again you go on about something that is irrelevant. We're not arguing or debating what SIMD is used for, only where it should be used. In modern architectures, routines like these have long been shifted onto GPUs.
Also, a thread doesn't equal some fixed percentage of a second; if it did, it would be literally impossible for single-core CPUs to multitask.
What do you think Hyperthreading and SMT are for?
You can't just use part of a thread...
If a thread finishes its work well before the end of its time slice, yes you can. Leaving the rest idle is waste and an inefficient use of the power.
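To illustrate the time-slicing point: a minimal sketch, assuming POSIX threads on Linux (the CPU pinning is Linux-specific and purely illustrative), where two threads share a single core and the scheduler interleaves them.

#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

/* Each thread does a little work; the OS scheduler time-slices
 * them on the one core they're both pinned to. */
static void *worker(void *arg)
{
    const char *name = arg;
    for (long i = 0; i < 3; i++) {
        printf("%s: slice %ld\n", name, i);
        sched_yield();  /* hand the core to the other thread */
    }
    return NULL;
}

int main(void)
{
    /* Pin the whole process to CPU 0 so both threads share one core. */
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(0, &set);
    sched_setaffinity(0, sizeof(set), &set);

    pthread_t a, b;
    pthread_create(&a, NULL, worker, "thread A");
    pthread_create(&b, NULL, worker, "thread B");
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    return 0;
}

Neither thread owns a fixed fraction of every second; the scheduler hands out slices as the work demands them.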
If you have a better example, link me, because honestly the audio DSP you are referring to is probably less than a tenth of the capability of current-gen CPUs...
Think about it this way:
Creative's X-Fi DSP runs at 400MHz, give or take. Look at what it can do with that 400MHz compared to the best any Xbox 360 or PS3 game has ever output.
A 400MHz dedicated Creative DSP vs Xenon's 3.2GHz tri-core CPU.
The DSP wins hands down. Everything from voices per channel, number of channels, 3D audio positioning, 192kHz/24-bit sampling, to audio compression and decompression quality and quantity, etc. So this little 400MHz chip mops the floor with Xenon and Cell's best.
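To make the cost concrete, here's a back-of-envelope sketch. All the numbers are my own illustrative assumptions, not measured Xenon or X-Fi figures, just to show the scale of what software mixing eats on a general-purpose CPU:

#include <stdio.h>

/* Rough cost of mixing N voices in software at a given sample rate.
 * Assumes ~4 ops per voice per sample (fetch, gain, pan, accumulate);
 * all constants are illustrative, not measured figures. */
int main(void)
{
    const long voices      = 128;     /* simultaneous voices  */
    const long sample_rate = 48000;   /* samples per second   */
    const long ops_per_vs  = 4;       /* ops per voice-sample */

    long ops_per_sec = voices * sample_rate * ops_per_vs;
    printf("~%ld million ops/sec just to mix, before any effects,\n",
           ops_per_sec / 1000000);
    printf("resampling, 3D positioning, or codec work.\n");
    return 0;
}

Even this toy estimate lands at tens of millions of operations per second for plain mixing alone, which is exactly the kind of steady, repetitive load a dedicated DSP takes off the CPU's back.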
Dedicated silicon is always more efficient than tacking the task onto a general-purpose CPU. That's why DSPs, storage controllers, north bridges and other buses, memory controllers, etc. tend to be, or have, dedicated bits of silicon. They offer the absolute best performance per watt, run cooler, are cheaper, and are more efficient.
But I'm not going to continue arguing with you about it. You've clearly shown that you have next to no real knowledge of the Xenon, Cell, computer architecture, etc. Comparing x86/x64 to Xenon and Cell was a joke, as was your inability to understand the difference between the GPU architecture in the Xbox 360 and PS3 and modern architectures with their implementation of programmable units, GPGPU, etc.