Erm, I'm talking bog-standard desktop tech circa the beginning of the century, and the corresponding know-how. Gamedevs don't live in caves. Their office workstations and/or home desktops were already hosting SMP and shader tech before 2005. Devs' non-cube/ps2 projects were most likely requiring them to know this stuff. The notion that the 360 (will return to ps3 shortly) introduced a sizable amount of tech the likes of which nobody had ever touched is as detached from reality as they come. That does not mean that every gamedev coder was brilliant at SMP code and was writing shaders day-in and day-out in 2005. But guess what - most devs aren't even today, and that will remain so for the foreseeable future - there's a sound division of labor and responsibilities on every sizable game project. BTW, I'm talking from the POV of somebody who used to do PC-based game engines in that timeframe, and whose teams (on multiple projects) included gamedevs of various backgrounds - consoles, pc, handhelds, university graduates. I'm not speaking hypothetically, I'm telling you how things were back then for a representative sample of the industry.
Let me guess, 'many console devs' in your eyes means ps2-only devs? Because even if we entirely disregard any likely PC exposure of the gamedevs from that timeframe, cube's TEV was proto-PS1.3 tech, and the Xbox was dx8 through and through. Basically, your entire argument rests on the premise that there was this exotic tribe in the Amazon rainforest known as 'console devs', whose exposure to technology was limited to whatever their console vendor of choice dropped on them via parachute once per hw generation. Which, again, is still not sufficient for you, because ps3's 'ultra exotic' CPU tech was precisely targeting seasoned ps2 devs, who were already versed in widely-asymmetric architectures where autonomous streaming processors were meant to (pre-)chew graphics workloads. Yes, the RSX was a castrated desktop part, and that did turn into quite a bump in the learning/porting curve for quite a few, but not because there was anything exotic about the tech.
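Since that 'streaming processors pre-chewing graphics workloads' bit is the crux of the 'exotic' claim, here's a minimal sketch of the double-buffered streaming idiom both PS2 VU setups and Cell SPU jobs revolve around: DMA a batch into local memory, transform it, DMA it back out, and prefetch the next batch so the transfer hides behind the compute. All names here (dma_get/dma_put/dma_wait, stream_transform) are illustrative stand-ins, not real SDK calls.

[code]
#include <cstring>   // memcpy for the stand-in "DMA"

// Illustrative stand-ins: on real hw these would be asynchronous transfers
// (mfc_get/mfc_put on an SPU, VIF/scratchpad DMA on the PS2); here they're
// synchronous memcpy so the sketch compiles and runs anywhere.
static void dma_get(void* local_dst, const void* main_src, unsigned bytes, int /*tag*/)
{ std::memcpy(local_dst, main_src, bytes); }
static void dma_put(void* main_dst, const void* local_src, unsigned bytes, int /*tag*/)
{ std::memcpy(main_dst, local_src, bytes); }
static void dma_wait(int /*tag*/) {}          // would block on the tag group on real hw

struct Vertex { float x, y, z, w; };

enum { BATCH = 256 };

static Vertex local_buf[2][BATCH];            // ping-pong buffers in "local store"

static void transform_vertices(Vertex* v, int n)   // the actual "pre-chewing" (toy example)
{
    for (int i = 0; i < n; ++i) { v[i].x *= 2.0f; v[i].y *= 2.0f; v[i].z *= 2.0f; }
}

// Assumes count > 0 and a multiple of BATCH, to keep the sketch short.
void stream_transform(const Vertex* in, Vertex* out, int count)
{
    int cur = 0;
    dma_get(local_buf[cur], &in[0], unsigned(BATCH * sizeof(Vertex)), cur);

    for (int base = 0; base < count; base += BATCH)
    {
        int next = cur ^ 1;
        if (base + BATCH < count)
        {
            dma_wait(next);                   // previous put-out of that buffer is done
            dma_get(local_buf[next], &in[base + BATCH],
                    unsigned(BATCH * sizeof(Vertex)), next);
        }

        dma_wait(cur);                        // current batch has landed in local memory
        transform_vertices(local_buf[cur], BATCH);
        dma_put(&out[base], local_buf[cur], unsigned(BATCH * sizeof(Vertex)), cur);

        cur = next;
    }

    dma_wait(0);                              // drain outstanding puts
    dma_wait(1);
}
[/code]

The shape is the same on both machines; what Cell changed was mostly how many of those units you got and how general-purpose they were - which is exactly why it was aimed at devs who had already lived with the ps2.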
No, it does not go 'hand in hand' - most graphics know-how in use today either originates from the academic/CGI/PC space, or finds its way there shortly after, and from there out into the wild. The exception is Cell-targeted sw tech, which, let's face it, was a dead end. And how does 'learning how to properly use the new hw' imply that devs had no experience with SMP or shaders per se?
You're confused, so let me help. You are mixing up high production values (which is normally an asset thing) with the use of advanced graphics algorithms (which is normally an R&D thing) - those are different things. Yes, high-production-value projects do tend to also have strong R&D, but a small indie title can be high-tech just as well.
And yet we get anecdotal accounts of devs who managed to increase the performance of their WiiU pipelines multi-fold over the span of the last devkit cycle. Again, let me restate: how much learning WiiU devs have to do to get the hang of the platform is something you can only find out first-hand.
Which devs? PDZ and Halo4 are both first-party. Are you suggesting Nintendo will neglect their own platform?