The Strix draws pretty much the same power as a stock 970.
I'd wager the same goes for the Vapor-X 290X.
It seems like a 290X on its own draws about a hundred watts more than a 980 on average.
The graph in the OP encompasses all discrete graphics cards, i.e. everything that is a physical graphics card, not just a chip on a motherboard.
The information text with the model names just gives you an idea of what exactly launched at a given point in time.
Those are the kinds of comparisons I meant: readings taken directly from the isolated card instead of the full system. Keep in mind that a faster card will also make the rest of the system work harder.
You can clearly see how the 290X draws a massive 64% more power than the 970.
That graph kinda says the opposite: their best times as a whole were when they were releasing guff GPUs, whether in the form of the FX series, cards that died by the hundreds of thousands because of shoddy manufacturing, re-badged products, or hot 'n loud GPUs (ironically something ATi gets hammered for).
What it says to me is that Nvidia's marketing has been absolutely on the ball whenever ATi has a competitive product.
I think we are reading the graph in different ways.
G80 and G92 were the Conroe-like holy grail for Nvidia. Then we have to consider that even though they were superior to the HD 2000/HD 3000 series at launch, the subsequent driver updates widened the gap substantially, and that took root in the consumer mindset. Remember those 'Big Bang' drivers? That's it: increasing the value of already-sold products surprisingly led to consumer loyalty. On the other side, ATI refused to offer Windows Vista support for some very recent series, leaving them unable to perform even basic functions on that OS and later ones.
Then Nvidia started to bleed market share with the acromegalic GT200 derivative series. They were pushing CUDA and compute hard, which made the chip less efficient for pure gaming. ATI kept their old VLIW architecture, much less capable but far more efficient for gaming, and moved to GDDR5 and narrower buses, which helped produce cheap beasts such as the HD 4770. That card surely paid a lot of bills for them. You can clearly see the spike there.
And the last spike you see favouring them is the failed Fermi release. It was enough for the conservative HD 5000 series to print money. Many newer cards ran far hotter than the GTX 470 and GTX 480, but the damage was already done.
Then the quick Fermi refresh, with the star GTX 560 Ti leading the way, established slow but firm growth for Nvidia, with a few interruptions from scattered models such as the HD 7970 GHz Edition or the R9 280.
In my head, it makes sense.