I threw that out earlier in the thread as a joke... yeah, I doubt that's happening.
The whole U thing just needs to go.
U need to go!
Getting back to the last rumor from Wiiudaily.
768 MB of DRAM “embedded” with the CPU, and shared between CPU and GPU
This blogger makes a good observation:
http://nerdorgeek.com/rumored-wii-spec-sheet-true-false/
He goes on to say:
Whether you think these specs are true or false is entirely up to you. People are disappointed in how powerful the graphics are, but come on, we're talking about Nintendo here; they're not known for having graphics comparable to Sony and Microsoft.
Just drop Wii altogether.
Name it: DestrUctotron!
I recall talk that the Wii was supposed to be an HD machine, not as powerful as the 360 and PS3, and that they decided to gimp it at the last minute. Just like DVD playback was taken out.
We already have enough indicating that it's over 1GB to throw this out. Also, I'm pretty sure that stacked RAM isn't cheap, either.
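Some napkin math on why 768 MB "embedded" with the CPU is a stretch (my own back-of-the-envelope, not insider info; the ~0.067 um^2/bit cell size is the figure commonly cited for IBM's 45 nm eDRAM, and the sketch ignores all periphery overhead):

```python
# Back-of-the-envelope: raw cell area for 768 MB of embedded DRAM.
# Assumption: ~0.067 um^2 per bit, the cell size commonly cited for
# IBM's 45 nm eDRAM. Real arrays also need sense amps, decoders, and
# redundancy, so actual silicon area would be noticeably larger.

CELL_AREA_UM2 = 0.067              # assumed area per bit cell, in um^2
CAPACITY_BITS = 768 * 2**20 * 8    # 768 MB (binary) expressed in bits

raw_area_mm2 = CAPACITY_BITS * CELL_AREA_UM2 / 1e6  # um^2 -> mm^2
print(f"Raw cell area: {raw_area_mm2:.0f} mm^2")    # ~432 mm^2
```

Over 400 mm^2 of cells alone, a die bigger than most GPUs of the day, before you even count the CPU logic it's supposedly embedded with. Which fits the "throw this out" read above.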
For those wondering what direction the system might take from a purely geometrical calculation standpoint: the new team members at Nintendo Technology Development include one of the head engineers responsible for the PS3's Cell CPU and RSX GPU.
Stacked DRAM not happening? Why is that?
Stacking DRAM is happening ATM. PoP, widely used in mobile SoCs, is a rudimentary form of stacking (not using TSVs, apparently, but based on the traditional peripheral off-die wiring). What is *NOT* happening currently is stacking with high-performance silicon: if you have an already hot die, stacking more silicon around it only makes things worse, for the hot die as well as for its neighbors in the stack. So if you already had a hot die at a given fab node, you need to go down a few nodes before you can consider stacking. There's a simple reason why IBM make their monstrous 567 mm^2 POWER7 dies as a 2D IC, and it's that thermodynamics does not allow them to turn that into a much more compact and efficient 3D IC (it's on their roadmaps, though).
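To put a crude number on that thermal argument (my own sketch; the ~200 W per-die power is an assumed round figure, not an official POWER7 spec):

```python
# Crude sketch of why stacking hot dies hurts: every layer's heat must
# exit through roughly the same footprint and heatsink, so power
# density at the cooling interface scales with the layer count.
# Assumed numbers: 567 mm^2 die (from the post above), ~200 W per die
# (a hypothetical round figure, not a spec).

DIE_AREA_MM2 = 567.0      # POWER7 die area mentioned above
POWER_PER_DIE_W = 200.0   # assumed per-die dissipation

for layers in (1, 2, 4):
    flux_w_per_cm2 = layers * POWER_PER_DIE_W / (DIE_AREA_MM2 / 100.0)
    print(f"{layers} stacked die(s): ~{flux_w_per_cm2:.0f} W/cm^2")
```

A single hot die is already near what conventional cooling handles comfortably; doubling or quadrupling that through the same footprint is the thermodynamics problem described above, and why dropping a few nodes first (so each die runs much cooler) is a precondition for stacking.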
Throw out? Why?
It could very well have been the specs of the first dev kits, or the reason for the "50% increase of power" over current gen statements.
No one is saying these are the current specs.
However, remember this rumor?
Wii Tuu dammit
lherre said that the specs haven't changed, though.
Marketing ownage right there.
As much as lherre seems reliable, I can't help but feel a bit unconvinced about this bit.
I don't doubt lherre's position, and I certainly don't think he's lying... but the specs must have changed since E3, especially if they say we're on the 4th devkit.
Yeah, for it not to have changed in several revisions seems strange, but we don't have any other insider info really to go on.
I see... could it be used for the GPU?
As long as the GPU is of the tablet/smartphone magnitude; otherwise your best 'stacking' bet is MCM.
Nintendo Ultra.
True, and we have to consider how buggy and glitchy some of the earlier dev kits were. lherre is probably looking at the documentation of the dev kits, but it most likely didn't state, for example, how hard you can push the hardware before the GPU causes the system to overheat and crash. The performance can get significantly better with improvements in efficiency alone.
Nintex said: I like how every Wii U leak/tidbit/source always ends up being a dead end or of no use at all.
The current name is fine. Remember the hubbub over the first Wii's name? Many claimed it was the worst name of all time for an electronics product.
Only Nintendo can unintentionally troll like this...
Fuck it. I learned this robotics stuff for a reason! *goes to build Miyamoto-bot*
Also, we never found out how significant DownWithTheShip's info was (if he found anything out).
Or at least, we never found out why his company had a devkit (because he claimed they were never previously involved with Nintendo).
Maybe the console specs haven't changed, just that the devkit is getting close to final console hardware?
v1 - PC based simulation of expected performance
v2 - update with wiiU controller support
v3 - Final CPU but still using general PC GPU
v4 - final CPU and GPU
etc. (hypothetical versions btw)
Wasn't there information from Vigil that they had to change code with the newer devkit because some things worked differently?
iirc, he said he works in the same building as a devkit, not that he's working on it himself.
256 MB of RAM and a dual-core 1.8 GHz CPU were the target for Revolution developers, resulting in the now (in)famous Red Steel bullshots. Except those really weren't bullshots at all, just graphics that the Wii would never be able to display in its final iteration.