Rumor: Wii U final specs

Yeah, I think it's widely accepted that it's not an E6760, nor is it based on one.

We've just been having fun with AMD CS reps. It's just funny to see how these end up in all corners of the internet :)

Well, they started with an R700 - that's all I'm saying. The original dev documents match that.
What they end up with in the final console is not the same as an R700, though.
 
I definitely believe that. Remember when Valve completely bashed Sony and the PS3 because of how different the CPU was, and then when they finally knew how to use it they did well?
 
Since it's come up a lot in the CPU talk, could someone elaborate in relatively lay terms the difference between out-of-order and in-order execution, and why one should expect code written for the latter to perform poorly on the former?
 
Since it's come up a lot in the CPU talk, could someone elaborate in relatively lay terms the difference between out-of-order and in-order execution, and why one should expect code written for the latter to perform poorly on the former?

In order: instructions are processed in the order the program issues them. They queue; it's linear. If the instruction at the front of the queue isn't ready (say it's waiting on data from memory), clock cycles are wasted.

Out of order: the processor executes instructions in an order decided by the availability of their input data, rather than by program order. Far fewer cycles are wasted, because whatever is ready gets processed next.
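
To make that concrete, here's a minimal C sketch (the struct and values are invented purely for illustration):

```c
/* Hypothetical example: suppose the load of p->x misses the cache
   and takes hundreds of cycles to arrive from memory. */
struct particle { float x, y; };

float update(const struct particle *p, float b, float c)
{
    float slow = p->x * 2.0f;  /* depends on a load that may stall  */
    float fast = b + c;        /* needs nothing from the load above */
    return slow + fast;
}
```

An in-order core works through those lines strictly top to bottom: if the load of p->x stalls, the independent b + c sits behind it in the queue and cycles are wasted. An out-of-order core sees that b + c already has all its inputs and executes it while the load is still in flight.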
 
Since it's come up a lot in the CPU talk, could someone elaborate in relatively lay terms the difference between out-of-order and in-order execution, and why one should expect code written for the latter to perform poorly on the former?

The analogy once popped up that "in order" could be compared to going to the supermarket and having to go through all the aisles just to get one item. If you want a second item, you need to go through all the aisles again. "Out of order", you can just take your shopping cart and grab all the items you want in one go. Or at least that's how I remember it.
 
*Paging bgassassin* As I understand it, yes.



I don't see any info on the Wikipedia page about clock speeds. That's an improvement, since I recall they had the GPU clock speed at 800 MHz and the CPU at 400 MHz. I thought I read they were 266 MHz, though; I could of course be very wrong.

the table on the right side of the page

says 268 MHz for graphics, and... yeah, now also the CPU. It was 200 MHz for the CPU the last time I looked.
 
The analogy once popped up that "in order" could be compared to going to the supermarket and having to go through all the aisles just to get one item. If you want a second item, you need to go through all the aisles again. "Out of order", you can just take your shopping cart and grab all the items you want in one go. Or at least that's how I remember it.
In order: instructions are processed in the order the program issues them. They queue; it's linear. If the instruction at the front of the queue isn't ready (say it's waiting on data from memory), clock cycles are wasted.

Out of order: the processor executes instructions in an order decided by the availability of their input data, rather than by program order. Far fewer cycles are wasted, because whatever is ready gets processed next.
So, and anyone please correct me if I'm wrong, the implication is that at the same clock speed OoOE is more efficient than in-order, and that even at a lower clock speed a CPU that uses OoOE, running code optimized for that paradigm, can potentially achieve the same performance as code written for a CPU that uses in-order execution.

So is it that while a CPU that uses OoOE can run in-order code, the lower clock speed may hamper its performance when running code written for a higher-clocked in-order CPU?

Does this imply that the Wii U CPU may be better, the same, or worse depending upon its clock speed relative to, for example, Xenon's 3.2 GHz?
 
So, and anyone please correct me if I'm wrong, the implication is that at the same clock speed OoOE is more efficient than in-order, and that even at a lower clock speed a CPU that uses OoOE, running code optimized for that paradigm, can potentially achieve the same performance as code written for a CPU that uses in-order execution.

So is it that while a CPU that uses OoOE can run in-order code, the lower clock speed may hamper its performance when running code written for a higher-clocked in-order CPU?

Does this imply that the Wii U CPU may be better, the same, or worse depending upon its clock speed relative to, for example, Xenon's 3.2 GHz?

OoO is more efficient, so it could potentially not need as high clock speeds. Whether code optimized for in-order can hamper performance when run on an out-of-order CPU... I don't know, but I think I read in this thread that OoO should have no problem running in-order (optimized) code, but that an in-order CPU could not run OoO (optimized) code. Or maybe I'm mistaken. What could be the culprit of bad performance of the Wii U CPU (other than not using the DSP and such)... maybe Wsippel & co can better answer that.
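
As a back-of-the-envelope way to see the clock speed trade-off: performance is roughly IPC times clock. A tiny C sketch, with completely made-up numbers (NOT real Wii U or Xenon figures):

```c
#include <stdio.h>

/* Throughput ~ IPC x clock. All figures below are invented for
   illustration only. */
int main(void)
{
    double inorder_ghz = 3.2, inorder_ipc = 0.5;  /* hypothetical in-order core */
    double ooo_ghz     = 1.6, ooo_ipc     = 1.0;  /* hypothetical OoO core      */

    printf("in-order: %.2f billion instructions/s\n", inorder_ghz * inorder_ipc);
    printf("OoO:      %.2f billion instructions/s\n", ooo_ghz * ooo_ipc);
    return 0;
}
```

Both work out to 1.6 billion instructions per second: a lower-clocked OoO core can match a higher-clocked in-order one if it sustains proportionally more instructions per cycle. Whether the Wii U CPU's IPC advantage is big enough is exactly the open question.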
 
I don't see what in that article came to be true?

There are a couple things.

OoO is more efficient, so it could potentially not need as high clock speeds. Whether code optimized for in-order can hamper performance when run on an out-of-order CPU... I don't know, but I think I read in this thread that OoO should have no problem running in-order code, but that an in-order CPU could not run OoO code. Or maybe I'm mistaken. What could be the culprit of bad performance of the Wii U CPU (other than not using the DSP and such)... maybe Wsippel & co can better answer that.

Well, wsippel is still more qualified to answer this than I am (as is blu, etc.). But I do not believe any next-gen console CPU will outperform the floating-point performance of the PPE in Cell/Xenon. The Wii U CPU certainly won't. However, in general computing the ancient P4 has a higher IPC than the PPE, from what I remember reading. There is a reason wsippel leaked some "comparable or better" results for the Wii U CPU versus Xenon on some benchmarks.

It's the same or better in some ways, worse in others.
 
I may very well be late (ignore if already posted, may be ancient), but I only read the rumor today that Retro is in charge of "game engines" for Wii U, and would be responsible for changing Epic's mind about bringing UE4 to Wii U...? lol? Or not lol?
 
I may very well be late (ignore if already posted, may be ancient), but I only read the rumor today that Retro is in charge of "game engines" for Wii U, and would be responsible for changing Epic's mind about bringing UE4 to Wii U...? lol? Or not lol?

lol... Nintendo doesn't make money licensing engines, especially since it would be limited to the Wii U (no one will want a proprietary engine that won't work on the PS4/720), so why would Nintendo be putting its top Western studio on such a task?

Retro is working on a game; everything we've heard from Rare since the Wii U's announcement has been about them hiring staff to work on games.
 
lol... Nintendo doesn't make money licensing engines, especially since it would be limited to the Wii U (no one will want a proprietary engine that won't work on the PS4/720), so why would Nintendo be putting its top Western studio on such a task?

Retro is working on a game; everything we've heard from Rare since the Wii U's announcement has been about them hiring staff to work on games.

Well, they wouldn't need to license the engine(s); it could be for Nintendo studios (plenty of those) alone. Maybe Retro wanted to work on a new engine for their upcoming game, and maybe they were asked to provide it for other Nintendo studios... So, while I agree the article could be far-fetched, I don't necessarily agree with your reasoning.
 
I don't see what in that article came to be true?

The most striking thing I noticed was the part where it mentioned the lack of analog triggers. I doubt that was an educated guess, and I don't think anyone could have known unless they were "in the know". I'm not saying that I believe everything on that rumor list. After all, it lists the Vitality Sensor as having something to do with the NFC button. But someone with knowledge must have spilled some of it.
 
Well, they wouldn't need to license the engine(s); it could be for Nintendo studios (plenty of those) alone. Maybe Retro wanted to work on a new engine for their upcoming game, and maybe they were asked to provide it for other Nintendo studios... So, while I agree the article could be far-fetched, I don't necessarily agree with your reasoning.

Or working on engines for/with others to get them Wii U compatible: Unity and UE4, for instance.
 
OoO is more efficient, so it could potentially not need as high clock speeds. Whether code optimized for in-order can hamper performance when run on an out-of-order CPU... I don't know, but I think I read in this thread that OoO should have no problem running in-order code, but that an in-order CPU could not run OoO code. Or maybe I'm mistaken. What could be the culprit of bad performance of the Wii U CPU (other than not using the DSP and such)... maybe Wsippel & co can better answer that.

If your CPU is built around running OoO, and has its clock etc. factored around the fact that stuff is -expected- to run OoO rather than in order, then porting in-order code over without considering the scheduling is going to mean the code runs much worse.

So you're left with having to refactor your code to consider OoO operation, which I'm not sure is -that- trivial. Personally, I see it as: if the CPU is on paper a weaker OoO CPU that requires in-order code to be refactored, then that's going to hamper the whole "easy to port to!" idea that Nintendo has been taking to firms.

What it will mean is that the CPU needs OoO code to get the best out of it, but even then there are no indications as of yet as to how its performance will be relative to other platforms. Given that the whole point of porting is to make it as simple as possible, this -might- become a problem for games that make heavy use of the CPU.

OoO being more efficient isn't a silver bullet; ideally it would be OoO but still clocked fast enough that in-order code with no OoO-specific optimization would -still- match the X360/PS3. If it doesn't, then it's introducing work into the porting pipeline, and that's not ideal.
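
For what it's worth, here's a hypothetical sketch of what "optimized for in order" tends to look like in practice: hand-interleaved independent work, so one dependency chain never stalls the pipeline.

```c
/* Hypothetical in-order-friendly code: four independent accumulators,
   manually unrolled so the core always has ready work. */
float sum_inorder_style(const float *v, int n)
{
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    int i;
    for (i = 0; i + 3 < n; i += 4) {
        s0 += v[i];
        s1 += v[i + 1];
        s2 += v[i + 2];
        s3 += v[i + 3];
    }
    for (; i < n; i++)  /* pick up any leftover elements */
        s0 += v[i];
    return s0 + s1 + s2 + s3;
}
```

An OoO core finds that same parallelism in the naive single-accumulator loop by itself, so the hand-scheduling buys little there; and if the OoO chip is also clocked much lower, ported code can still come out slower unless it's reworked around whatever the new CPU is actually good at.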
 
lol... Nintendo doesn't make money licensing engines, especially since it would be limited to the Wii U (no one will want a proprietary engine that won't work on the PS4/720), so why would Nintendo be putting its top Western studio on such a task?

Retro is working on a game; everything we've heard from Rare since the Wii U's announcement has been about them hiring staff to work on games.

nice typo... "Rare"

But think about it for just a moment. Could it not be possible that, you know, Retro might have made an engine before actually making a game for Wii U? Sound good? I'm not aware of many modern games that do not need an engine, you know... That engine, if it's any good, might even be reused by other Nintendo studios; maybe some third parties would want to use it as well, who knows. Anyhow, the idea of someone from Epic seeing the engine at work and saying "hey, this looks like something you would make using our UE4, tell us how you did that" doesn't sound all that strange to me. Which of course does not mean it's true, just that your "lol" seems a little bit too confident.
 
Doing a bit more research into how Nintendo might modify an R700... does it really only have 16 KB of local data share for each SIMD (a cluster of 80 shaders)? Compared to GCN with its 64 KB here, 64 KB there, that's quite low. Perhaps bg's theory of using eDRAM to boost the registers isn't so preposterous after all...
 
But think about it for just a moment. Could it not be possible that, you know, Retro might have made an engine before actually making a game for Wii U? Sound good? I'm not aware of many modern games that do not need an engine, you know... That engine, if it's any good, might even be reused by other Nintendo studios; maybe some third parties would want to use it as well, who knows. Anyhow, the idea of someone from Epic seeing the engine at work and saying "hey, this looks like something you would make using our UE4, tell us how you did that" doesn't sound all that strange to me. Which of course does not mean it's true, just that your "lol" seems a little bit too confident.

.... so your reaction to an overconfident lol is some fantasy situation where Retro has built an on-par-or-above engine that's UE4+ class? And they're going to "share" that with all and sundry, including 3rd parties?

I sometimes wonder what you cats are huffing. (Yes, I saw it; yes, I saw the word "rumour" too.)
 
.... so your reaction to an overconfident lol is some fantasy situation where Retro has built an on-par-or-above engine that's UE4+ class? And they're going to "share" that with all and sundry, including 3rd parties?

I sometimes wonder what you cats are huffing. (Yes, I saw it; yes, I saw the word "rumour" too.)

Regardless of the validity of the rumour (which I'm taking with a fairly large grain of salt), it is in Nintendo's interest to share any and all techniques they have for getting good performance out of the Wii U, as it would mean more and better-looking third-party games, which would help system sales.

Doing a bit more research into how Nintendo might modify an R700... does it really only have 16 KB of local data share for each SIMD (a cluster of 80 shaders)? Compared to GCN with its 64 KB here, 64 KB there, that's quite low. Perhaps bg's theory of using eDRAM to boost the registers isn't so preposterous after all...

Oh, so now it's bg's theory? :P

Matt mentioned a few pages back that there's a "significant" increase in register memory on the GPU, and I can't imagine them being able to do so without switching over to eDRAM. I don't think it'll be GCN-size, but a doubling is possible with eDRAM.
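
Just to put the quoted post's numbers side by side, a rough sketch; the SIMD/CU counts below are for a stock RV770 and a generic GCN part, not for whatever the Wii U GPU actually is:

```c
#include <stdio.h>

/* Rough LDS capacity comparison; part configurations are assumptions. */
int main(void)
{
    int r700_simds = 10, r700_lds_kb = 16;  /* RV770: 16 KB LDS per SIMD */
    int gcn_cus    = 10, gcn_lds_kb  = 64;  /* GCN: 64 KB LDS per CU     */

    printf("R700-class, %d SIMDs: %d KB LDS total\n",
           r700_simds, r700_simds * r700_lds_kb);
    printf("GCN-class,  %d CUs:   %d KB LDS total\n",
           gcn_cus, gcn_cus * gcn_lds_kb);
    return 0;
}
```

That's a 4x gap per cluster (16 KB vs 64 KB), which is why even a doubling via eDRAM would be a meaningful bump without getting anywhere near GCN.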
 
- Assassin's Creed 3 will run at 60fps at 1080p, compared to 720p on PS3 and Xbox 360
- Crytek, EA and Valve are all working on Wii U

Lol.

How is that part lol? Crytek confirmed they are working on something for Wii U, EA is working on Wii U, and Valve said at E3 2011 that they want a Wii U dev kit, so I can believe they are also working on Wii U.
 
Regarding the "leaked" information from Ubisoft: Cherry-picking.

Maybe, but other than the Vitality Sensor point, everything else has turned out relatively true/accurate or is a prediction for the future. For all you know, Ubisoft could be targeting full HD at 60 fps internally for next year, especially if Sony/MS are dropping consoles in Q4 2013.

The RAM especially sticks out to me, as prior to the leak I didn't see many people saying the Wii U would have more than 1 GB, let alone 1.5 GB; and knowing what we know now, that it's 2 GB, it seems generous even now. It just seems very un-Nintendo to be so bold, so if someone were to cherry-pick something like a RAM prediction, they would have gone lower.
 
nice typo... "Rare"

But think about it for just a moment. Could it not be possible that, you know, Retro might have made an engine before actually making a game for Wii U? Sound good? I'm not aware of many modern games that do not need an engine, you know... That engine, if it's any good, might even be reused by other Nintendo studios; maybe some third parties would want to use it as well, who knows. Anyhow, the idea of someone from Epic seeing the engine at work and saying "hey, this looks like something you would make using our UE4, tell us how you did that" doesn't sound all that strange to me. Which of course does not mean it's true, just that your "lol" seems a little bit too confident.
I'm pretty confident that Epic has already run Wii U benchmarks to see what it's capable of. I doubt they need another developer to show them how to code for the Wii U architecture, as they're quite capable with technology.
 
Maybe, but other than the Vitality Sensor point, everything else has turned out relatively true/accurate or is a prediction for the future. For all you know, Ubisoft could be targeting full HD at 60 fps internally for next year, especially if Sony/MS are dropping consoles in Q4 2013.

The RAM especially sticks out to me, as prior to the leak I didn't see many people saying the Wii U would have more than 1 GB, let alone 1.5 GB; and knowing what we know now, that it's 2 GB, it seems generous even now. It just seems very un-Nintendo to be so bold, so if someone were to cherry-pick something like a RAM prediction, they would have gone lower.

They chose 2 GB and said the OS would use about 560 MB. That's quite a big difference from what happened.
 
Maybe, but other than the Vitality Sensor point, everything else has turned out relatively true/accurate or is a prediction for the future.

What? Nearly everything on that list is either wrong, likely wrong, currently unproven and/or incredibly obvious/announced prior.

And if you click through to the source forum, the topic has "Confirmato Fake" in the subject.
 
They chose 2 GB and said the OS would use about 560 MB. That's quite a big difference from what happened.

RAM amounts change throughout development. See Epic getting MS to increase it for the 360 late on; the PS3's specs also changed late on, for various reasons.
 
How is that part lol? Crytek confirmed they are working on something for Wii U, EA is working on Wii U, and Valve said at E3 2011 that they want a Wii U dev kit, so I can believe they are also working on Wii U.

Crytek is working with Nintendo, but that doesn't mean they are working on a game. Aside from EA, the rest is speculation.
 
How is that part lol? Crytek confirmed they are working on something for Wii U, EA is working on Wii U, and Valve said at E3 2011 that they want a Wii U dev kit, so I can believe they are also working on Wii U.


I could be wrong, and I'm being pedantic, but didn't Crytek only confirm there was a game in development using CryEngine 3? Not that they were developing one.
 
I believe it was stated that Crytek was able to successfully port over CryEngine 3 to the Wii U. Aside from that, I don't think anything else was stated. The reps at E3 were hesitant to mention anything Wii U when asked during their Crysis 3 demo, but that's a long shot.
 
I could be wrong, and I'm being pedantic, but didn't Crytek only confirm there was a game in development using CryEngine 3? Not that they were developing one.
Crytek is primarily a middleware provider. They ported the engine over, so they do in fact develop for Wii U.
 
I could be wrong, and I'm being pedantic, but didn't Crytek only confirm there was a game in development using CryEngine 3? Not that they were developing one.

Yes, but they also said the old TimeSplitters studio in England has a kit. I think Nintendo did say, though, that in order to have a kit you need to show you are working on a game.

Maybe they make some exceptions.
 