I apologize if I have been a bit touchy lately in this thread. GPU speculation is nothing more than a fun hobby, after all. You could even call it a somewhat bizarre (nerdy) offshoot of gaming in general. Admittedly, it is slightly frustrating that the possibility of a 160-shader Latte is dismissed so hastily (without anyone proposing some type of massive architectural overhaul to explain it). I, personally, would love for there to be more going on there, but the deeper I have dug, the less likely it seems.
The bottom line is that all of us are in way over our heads with this technical banter.
None of us really know how these game engines run or where exactly the bottlenecks are. I am taking an educated guess, based on what I have read from developers, that RAM has been a major bottleneck this generation. The number one request devs have made to the console manufacturers, and for a long time now, is more RAM; it makes all the difference. Let's look back at some of function's posts on Beyond3D:
http://forum.beyond3d.com/showthread.php?t=60501&page=190
As he states, in the CoD4 comparisons (1024x768 at highest settings, btw), the GDDR5 version of the 6450 gets 40%+ better performance than the DDR3 version despite having only a 20% faster GPU core clock. In HAWX, there is a ridiculously eye-opening 76% difference. If anything goes to show that shader count alone does not an efficient GPU make, this is it!
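To put rough numbers on that, here is a quick back-of-envelope sketch. I'm assuming the commonly listed reference specs for the two 6450 variants (625 vs. 750 MHz core, 64-bit bus, 1600 MT/s DDR3 vs. 3600 MT/s GDDR5); actual retail cards varied, so treat these as illustrative rather than exact:

```python
# Back-of-envelope: HD 6450 DDR3 vs. GDDR5 variants.
# Numbers below are the commonly listed reference specs (both cards
# use a 64-bit memory bus); assumptions, not gospel.

def bandwidth_gbs(effective_mts, bus_bits=64):
    """Peak memory bandwidth in GB/s from effective transfer rate (MT/s)."""
    return effective_mts * (bus_bits / 8) / 1000

ddr3_core, gddr5_core = 625, 750      # core clocks in MHz
ddr3_bw  = bandwidth_gbs(1600)        # DDR3 @ 1600 MT/s  -> 12.8 GB/s
gddr5_bw = bandwidth_gbs(3600)        # GDDR5 @ 3600 MT/s -> 28.8 GB/s

print(f"Core clock advantage: {gddr5_core / ddr3_core - 1:.0%}")  # ~20%
print(f"Bandwidth advantage:  {gddr5_bw / ddr3_bw - 1:.0%}")      # ~125%
```

The real-world gains (40% to 76%) land well above the 20% clock delta and toward the ~125% bandwidth delta, which is exactly what you would expect from a bandwidth-starved part.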
No, unfortunately, there is no perfect test, because no retail card matches Latte's precise clock speed along with the proposed core config and its mixture of eDRAM + DDR3. But that does not mean we cannot make some general observations and say that it might be, nay, probably is possible for a 160:8:8 graphics chip to pull off the results we see in the wild.
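For reference, here is what a 160:8:8 part would put out on paper at the ~550 MHz clock that has been widely reported for Latte. A rough sketch, not a spec sheet; the 160:8:8 (ALU:TMU:ROP) layout is exactly the speculation under discussion:

```python
# Paper throughput for the speculated 160:8:8 Latte config.
# 550 MHz is the widely reported Latte clock; the ALU:TMU:ROP
# counts are the speculation being discussed, not confirmed specs.

clock_mhz = 550
alus, tmus, rops = 160, 8, 8

gflops     = alus * 2 * clock_mhz / 1000  # 2 FLOPs/ALU/clock (MADD) -> 176 GFLOPS
texel_fill = tmus * clock_mhz / 1000      # 4.4 Gtexels/s
pixel_fill = rops * clock_mhz / 1000      # 4.4 Gpixels/s

print(f"{gflops:.0f} GFLOPS, {texel_fill:.1f} GT/s, {pixel_fill:.1f} GP/s")
```

The argument, then, is that paper FLOPS understate what the chip can actually do once the eDRAM keeps it fed, for the same reason the GDDR5 6450 embarrasses its DDR3 twin.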
I have a few more observations about the chip layout and bg's proposed scenario to write up shortly, but I just needed to get that off my chest.