This is the latest version.
I'm pretty much done, too much Photoshop for my brain :lol
I will continue tomorrow if there are any crazy bastards who want to DIE. MAYBE we will reach the legendary Apple asteroid fields..
Which one was that, again?
Also, I thought they stopped allowing ban bets.
I leave to do some achievement hunting in Resident Evil 5 Gold and 3 new sections of the train get added?
Man, we'll need to think of something to top this at E3.
I promise I'll make a fully animated 3D version of the hype train pic. I swear.
Hey, I'm holding you to that!
Wtf...
Sigh, I won't even bother. I should have known posting the original pic on gaf would make it public domain to over-the-top photoshops.
It's lost its original subtle brilliance! If everything was kept in the carriage it might be alright. Start by getting that gross thing out of the sun!
Sorry to derail the Hype Train for a moment but I'm currently thinking about the U's eDRAM and RAM.
Firstly, the eDRAM. IBM confirmed that the CPU would have eDRAM, and as far as I remember lherre has also stated that the GPU will have eDRAM to play with. Do you think we're looking at a shared pool of eDRAM or two sets, one for the CPU and one for the GPU..? Correct me if I'm wrong, but wouldn't the former option involve some sort of latency or conflicts when processing..? And has the GPU actually been definitively confirmed to have eDRAM rather than 1T-SRAM..?
And lastly, the RAM. Several sources have stated that the RAM is 'slow'. So could we be looking at GDDR3 with a high clock speed and larger bus rather than GDDR5 with a lower clock speed and smaller bus..? And if it is GDDR3, hasn't the density recently been doubled, or am I misremembering..? If the density has been doubled, does that mean that more than 2GB of RAM is feasible..?
And if Nintendo have gone for slower GDDR3 will the eDRAM in both the CPU and GPU prevent bottlenecking..?
Apologies in advance for a shedload of questions!
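For what it's worth, the GDDR3-vs-GDDR5 trade-off described above is just arithmetic: peak bandwidth is the effective transfer rate times the bus width. A quick sketch (all clock and bus figures here are made-up illustrations, not actual Wii U specs):

```python
# Peak memory bandwidth = effective transfer rate (MT/s) x bus width (bytes).
# All figures below are hypothetical examples, not leaked hardware specs.

def peak_bandwidth_gb_s(transfers_mt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return transfers_mt_s * (bus_width_bits / 8) / 1000

# A 'slow' GDDR3 setup on a wide bus...
gddr3 = peak_bandwidth_gb_s(1600, 128)  # 1600 MT/s on a 128-bit bus -> 25.6 GB/s

# ...matches a faster GDDR5 setup on a narrow bus.
gddr5 = peak_bandwidth_gb_s(3200, 64)   # 3200 MT/s on a 64-bit bus  -> 25.6 GB/s

print(gddr3, gddr5)
```

So a 'slow RAM' rumour alone doesn't pin down the memory type; the bus width matters just as much.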
yeah we should kick out all the latecomers
cept for boney to make him mad
Where is the line drawn?
The rumored amount of L2 cache on the CPU is 3MB. That might be SRAM or it might be eDRAM - IBM have stated an interest in using their eDRAM as L2. It would probably be at some disadvantage to SRAM (any help?), but it would explain the 3x 360 amount, as 3x is roughly the density advantage IBM's eDRAM achieves over SRAM.
Regardless, the true "eDRAM" I think we're all talking about here is the proposed 32 MB frame buffer. That would likely be local to the GPU, but also accessible by the CPU in some way, shape, or form. Possibly by a 3.2 GB/s bus for Wii BC purposes, but possibly larger if they got crazy. Their memory controller is likely a very custom job.
If they end up using GDDR3 (as I think they will), I would still put the limit at 2 GB.
There is no point in putting much more RAM into a machine if the other components can't keep up. That's why I suspect that if they go with a 128-bit bus to the GPU, what they lack in bandwidth to main memory will be padded by the likely 256GB/s or more bandwidth of the frame buffer.
1T-SRAM hasn't been ruled out, but there would be no reason to hide it, and I would think that MoSys would have announced a new deal by now for the benefit of their company.
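To put the proposed 32 MB figure in perspective, here's a back-of-envelope check of what render targets would fit in a frame buffer that size. The formats are assumptions for illustration (4 bytes of colour plus 4 bytes of depth/stencil per pixel, no MSAA):

```python
# Rough frame buffer sizing: does 32 MB of eDRAM hold common render targets?
# Formats are illustrative assumptions: 4 bytes colour + 4 bytes depth/stencil.

EDRAM_MB = 32

def target_mb(width: int, height: int, bytes_per_pixel: int = 8) -> float:
    """Size in MB of one render target (colour + depth) at the given resolution."""
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(1280, 720), (1920, 1080)]:
    size = target_mb(w, h)
    print(f"{w}x{h}: {size:.1f} MB -> fits: {size <= EDRAM_MB}")

# 1280x720:  ~7.0 MB  -> plenty of headroom
# 1920x1080: ~15.8 MB -> still fits, unlike the 360's 10 MB of eDRAM
```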
The last 5 or so pages of this thread have been terrible and you should all feel bad.
this is bad and you should feel bad.
I must say, I do prefer this one:
I know IBM is responsible for the CPU and AMD is responsible for the GPU, but Intel? Maybe you got it mixed up, or have I missed something?
At the same time, Nintendo has been keeping who's making anything under wraps.
I mean, outside of knowing that they'll use something from Intel and something from AMD, we know nothing about partnerships.
And after the Pica200 incident, I bet they have tighter NDAs on that stuff.
Oddly enough, I feel better than ever because of the last 5 pages.
Rösti said: I know IBM is responsible for the CPU and AMD is responsible for the GPU, but Intel? Maybe you got it mixed up, or have I missed something?
:b fine ill do it myself
BY2K, your avatar and mine look like the only ones concerned about impending incineration.. hehe..
More like B2YK is having the time of his life doing Ace in the ass.
...did i miss the hype train?
More like B2YK is having the time of his life doing Ace in the ass.
(err.. I hit 'quote' instead of edit.. I guess the "hype" made my finger slip?)
Wow. Can't unsee. Ace likes it.
Yeah, while I love the work Nibel is doing, a cleaner look is preferred.
Why do you people keep screwing up my username?
Thank you, this should be the final and definitive version.
I must say, I do prefer this one:
If anything, this one could have more people stacked in the back... but to be honest I agree.
Thank you, this should be the final and definitive version.
Not that I don't appreciate any further efforts! I had a good laugh. I just think the above version has the perfect amount of edits without becoming stupid.
If anything, this one could have more people stacked in the back... but to be honest I agree.
The gravitational field of the Sun will pull you in anyway. There is no escape.
I'm small and I couldn't even get in the cart that has GAF Members on it.
*jumps off train*
The gravitational field of the Sun will pull you in anyway. There is no escape.