I gave Raja the benefit of the doubt before, but now I'm just pissed off at his incompetence. If he doesn't turn things around for the next launch, he's got to go.
I'd say it's hard to blame all this on the guy.
Judging from their newer integrated graphics chips (Intel Xe), they don't seem to have increased the floating-point performance of their graphics considerably since Broadwell in 2014 (comparing equal numbers of shading units and execution units). It's crazy.
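The per-EU point can be sketched with back-of-envelope arithmetic. This is a hedged sketch: the FLOPs-per-EU figures assume one FMA per ALU lane per clock, and the EU counts and clocks are illustrative guesses, not measured numbers.

```python
# Theoretical FP32 throughput of an Intel iGPU, in GFLOPS.
# Assumption: each EU sustains one FMA (2 FLOPs) per ALU lane per clock.
def gflops(eus, flops_per_eu_per_clock, clock_ghz):
    return eus * flops_per_eu_per_clock * clock_ghz

# Gen8 (Broadwell-era) EU: 2 x SIMD-4 FPUs with FMA -> ~16 FP32 FLOPs/clock.
# Xe-LP EU: 8-wide FP pipe with FMA -> also ~16 FP32 FLOPs/clock.
# Same per-EU number per clock, which is the complaint above.
print(gflops(24, 16, 1.15))  # hypothetical 24-EU Broadwell iGPU @ ~1.15 GHz
print(gflops(96, 16, 1.30))  # hypothetical 96-EU Xe-LP iGPU @ ~1.30 GHz
```

So the gains between those generations come mostly from more EUs and higher clocks, not from each EU doing more floating-point work per cycle.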
I don't know if Arc has the same efficiency, but Intel Xe was supposed to be a step towards "it". It's been quite evident that the improvement in performance per clock has been quite shit, and they're building these GPUs the way they built Larrabee all those years ago: basically making huge clusters of something they already have and hoping for the best.
This "Pentium D x100" approach seems ingrained in Intel at this point. Perhaps they wanted to ship as fast as possible and decided on this, and then it backfired for quite a few years in a row. I also doubt GPU focus was ever that high at Intel, seeing as they didn't even want to allocate proper production capacity from their foundries to it. Thus the GPU division was making omelettes without eggs for the longest time, and perhaps still is.
Intel is obviously able to develop a good GPU if they want to and invest enough; if they fail, I doubt they really tried. Apple started out by stealing PowerVR tech and employees after buying GPUs from them for years (then had to settle on licensing stuff from them), and look at them now.
Vega was a compute beast, and I thought if that didn't translate well into video cards, it could still be a great result if focused on the server side of things. I was thinking they hired him for server processor design.
Yes, but he clearly didn't take that tech/architecture with him.
Working at Intel, his team is basing their design on the existing Intel graphics processing pipeline, which was never "good". Intel GPUs in fact remind me a lot of the Raspberry Pi's VideoCore GPUs.
Floating point is a plus when their focus, generation on generation, is on decoding and encoding video "better" (and by "better" I mean supporting more formats through fixed-function hardware, which translates into less workload for the CPU). The video block on Intel integrated chips has improved steadily gen-on-gen, unlike the graphics core's performance per clock.
But Sapphire Rapids is being delayed again and again, so..
I doubt it's all on him.
At this point he's like a guy working at Burger King who's in charge of making a vegan burrito.
Burger King sells zero burritos; they're just musing over whether they could sell them. But he has to use Whopper patties and burger bits for it, and fry them somewhere else.
He might even deserve it, but he's been given a tall order.