It was quite a substantial business effort on their part, with dramatic consequences both for the product roadmap and in internal political terms. The reason they are indeed 'trying to fit it somewhere' today, as you say, is the internal political fallout from Larrabee - the GPU killer that did not deliver. But rest assured you have not seen the last of this branch of Intel's endeavors. In fact, Intel recently (as in this year) acquired even more IP and manpower to throw at the problem.
I think it's healthy to keep some ideas on the table, and this might be just that, but "many Pentium 1 cores" is certainly a novel idea to run with. I'm sure they looked through their catalog and found the most evolved simple-pipeline CPU in their backlog, but since they're investing so heavily into it, they'll also have to evolve it quite a bit further - to the point where you really can't call it a slightly modified Pentium 1 anymore. That's the feeling I get. I mean, we don't call Haswell something along the lines of a souped-up Pentium 3.
Plus, I don't know how it's actually done, but I remember Intel apologizing a few years ago for Pentium D being just two Pentium 4 cores slotted together; things were duplicated on the die that didn't have to be. This implementation might be similar, I dunno, but it sounds inefficient. I certainly remember the Pentium 1 not being SMP-capable (no boards with dual CPU sockets). On top of it all, repeating full x86 units like that also seems inefficient to me, as they carry all sorts of legacy baggage that could and should be stripped; it could still be x86, but certainly not a Pentium 1 which you could theoretically pull out and still use to run Windows 98.
I could see a many core architecture using some variant of Atom, which in itself unshelved the Pentium 1 and went from there, but not a Pentium 1, no.
That ties in directly with what we've been saying for months about the PPC750, actually: it still has merit despite being such an old implementation, since it's pretty minimalist for what it is, but it's certainly not modern. The P54C is older, and not as minimalist in my book. (By minimalist I mean: reduced to the essential yet effective.)
This card is sitting in more than a few desktops around the globe at the time of this post ;) Stay tuned is all I can say on this subject. I mean, it could all crash and burn tomorrow, but my gut feeling today tells me otherwise.
CELLs also were, at some point. The question really is whether Intel can make it genuinely useful for something more than folding - something that current GPUs can't catch up on in a few months - and through that disseminate it into the world.
For the time being, I remember even John Carmack was confused about it a few months ago.
We haven't even seen the tip of the iceberg that is GPGPU, and Intel is aiming right at the heart of that market with the MIC.
Is there really a market there, though? We live in the era of convergence, yet MIC is not convergence: it's being used as a stand-alone FLOPS board, not meant for graphics, and not meant for commanding a system either.
If it fails as a CPU and it fails as a GPU, I really can't see it getting off the ground - hence the CELL comparison.
They're certainly not crazy for trying, but it's like a dish that looks edible if need be yet seems to be missing something pretty instrumental at this point. Perhaps they need a more conventional CPU serving as a front-end, or perhaps they need to address the fact that it's not a normal GPU implementation in any way, shape, or form.
In Portuguese we have a saying that goes along the lines of "it's neither meat nor fish", used when something is really undefined - similar to "jack of all trades, master of none". I feel it describes Intel MIC's existential crisis quite well.