Rumor: Wii U final specs

The original Wii power situation was strange, looking back on it. The first anyone heard of the Revolution being "less than powerful" was when Nintendo's own VP of Corporate Affairs, Perrin Kaplan, said it would be "2 to 3 times" as powerful as a Gamecube ("It's not about turbo power, it's what you do with it"). Even though it was Nintendo themselves who said it, tons of people refused to believe it; the GAF thread on that was amazing. The next day, however, news started spreading around the internet that Kaplan's statement had been wrong... but it wasn't Nintendo who took it back, it was IGN, who only said "The information was later determined to be false" without attributing a single source. Yet everybody totally believed that, because they wanted the Revolution to be powerful so badly and didn't think Nintendo would release something that wasn't.

However, a few months later, IGN posted a news story with rumored specs they got from developers, and those specs matched up with what Kaplan had said (and there was even some thought that it may be less than 2 times as powerful). And at that point, everyone totally believed them. They wailed and gnashed their teeth and said Nintendo were morons, but they didn't disbelieve the low specs.

Note, this was all long before the system was released, before it even got its "Wii" name. So it was quite a different circumstance than we are seeing these days with the Wii U, where even after the system has launched, people are arguing about system power.

Yeah, I agree it was a little different to this situation.

It was obvious from the footage shown of games, from the announcement up to release, that the Wii was not on the same level as the other two consoles, which were its points of comparison at release.

It wasn't obvious in the lead-up to this release that the Wii U would be only on par with, or very marginally above, the 360/PS3. You can't judge from YouTube videos of multiplat trailers; in 2011, footage for Wii U multiplats was apparently even the 360 versions being shown. People in this thread are still bringing up the bird and Zelda demos, and even as late as a few weeks ago, when the demo units were heading out, people were wondering if the games would be 1080p or 60fps (or both).
 
Unless the GPU and eDRAM are packing an army of bandwidth-ignoring purple Pikmin, I'm pretty sure the dream of the Wii U easily handling ports from the 360/PS3 is out the window (even if possible... would any publisher dedicate the money?), along with the dream of receiving anything more than the occasional pity port from Durango/Orbis. You're talking about not only a lacking CPU, but a GPU that has to fight for very limited memory bandwidth. There's only so much that 32MB of eDRAM is going to be able to do to help in a memory sense.

And because of that, the GPU will be cockblocked. If it had any true potential for general-purpose code... this didn't help. So: a limited CPU, and a limited GPU because of low-bandwidth DDR3. An esoteric design by any stretch, and probably not suitable for much more than Nintendo's greatest hits.

Most devs aren't going to have the time or money to tailor their games around the WiiU's strengths and faults.

The thing I'm struggling with here is this.

Given what we know (which really is only the bandwidth and size of the RAM, along with the asymmetrical CPU caches; we don't know the CPU clock or performance per clock), I haven't seen anyone pop in and go "32MB of eDRAM, if clocked at H and paired with a GPU of G GFLOPS, should give us a result that's Y."

Until we get something like that, it's like we're all just talking out our arses. I can look at a piece of information like "Wii U RAM has 50% of the 360's bandwidth" and go "Shit, what a pile of crap," but then people talk about bandwidth not being that big of an issue, max RAM usage per frame, the eDRAM and its full read/write access, the asymmetrical cache, and I go "I have no fucking clue really."

It all seems that either

1) Someone knowledgeable here, like blu for instance, puts together a theory about the system based on his knowledge.
2) Someone in the know leaks everything.
3) We wait for a game that should be a showcase. Iwata at E3 2013: "Here is Retro's new Metroid spinoff."

Of those, 3 is the only realistic one and if that's the case, we need to stop drawing insane conclusions either way and wait it out. That doesn't mean we stop talking about it, but just stop saying that it's a piece of shit just yet.
 
Given what we know (which really is only the bandwidth and size of the RAM, along with the asymmetrical CPU caches; we don't know the CPU clock or performance per clock), I haven't seen anyone pop in and go "32MB of eDRAM, if clocked at H and paired with a GPU of G GFLOPS, should give us a result that's Y."
The problem is that the eDRAM in itself is a complete mystery. It could be clocked at twice the GPU speed and have a 4096-bit interface, offering a bandwidth far beyond any top-of-the-line graphics card. Or it could run at half the GPU speed and sit on a 128-bit bus. The only thing we can probably say for sure is that it needs to offer at least 15GB/s of bandwidth and a sustained latency of 2ns or less for BC purposes.
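To put rough numbers on those two extremes, here is a back-of-the-envelope sketch. The 550 MHz GPU clock is purely a placeholder assumption (the real clock was unknown at this point in the thread), so only the orders of magnitude matter:

```python
# Peak bandwidth = clock (Hz) * bus width (bytes), for the two eDRAM
# scenarios described above. GPU_CLOCK_MHZ is an assumption, not a
# leaked figure.

def bandwidth_gb_s(clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s."""
    return clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

GPU_CLOCK_MHZ = 550  # placeholder assumption

# "Twice the GPU speed on a 4096-bit interface"
best = bandwidth_gb_s(2 * GPU_CLOCK_MHZ, 4096)
# "Half the GPU speed on a 128-bit bus"
worst = bandwidth_gb_s(GPU_CLOCK_MHZ / 2, 128)

print(f"best case:  {best:.0f} GB/s")   # hundreds of GB/s
print(f"worst case: {worst:.1f} GB/s")  # single digits
```

The spread between the two scenarios is over two orders of magnitude, which is the point: without the eDRAM clock and bus width, the bandwidth picture is unknowable.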
 
Yeah, I agree it was a little different to this situation.

It was obvious from the footage shown of games, from the announcement up to release, that the Wii was not on the same level as the other two consoles, which were its points of comparison at release.

It wasn't obvious in the lead-up to this release that the Wii U would be only on par with, or very marginally above, the 360/PS3. You can't judge from YouTube videos of multiplat trailers; in 2011, footage for Wii U multiplats was apparently even the 360 versions being shown. People in this thread are still bringing up the bird and Zelda demos, and even as late as a few weeks ago, when the demo units were heading out, people were wondering if the games would be 1080p or 60fps (or both).

You responded before seeing my edit :) Those statements about the Wii were made *before* any footage at all was shown. The most people saw was the little Metroid 3 teaser showing Samus jumping out of her ship, and everyone assumed that was running on Gamecube hardware still, as it was a full 2 years before the Revolution's release.
 
The thing I'm struggling with here is this.

Given what we know (which really is only the bandwidth and size of the RAM, along with the asymmetrical CPU caches; we don't know the CPU clock or performance per clock), I haven't seen anyone pop in and go "32MB of eDRAM, if clocked at H and paired with a GPU of G GFLOPS, should give us a result that's Y."

Until we get something like that, it's like we're all just talking out our arses. I can look at a piece of information like "Wii U RAM has 50% of the 360's bandwidth" and go "Shit, what a pile of crap," but then people talk about bandwidth not being that big of an issue, max RAM usage per frame, the eDRAM and its full read/write access, the asymmetrical cache, and I go "I have no fucking clue really."

It all seems that either

1) Someone knowledgeable here, like blu for instance, puts together a theory about the system based on his knowledge.
2) Someone in the know leaks everything.
3) We wait for a game that should be a showcase. Iwata at E3 2013: "Here is Retro's new Metroid spinoff."

Of those, 3 is the only realistic one and if that's the case, we need to stop drawing insane conclusions either way and wait it out. That doesn't mean we stop talking about it, but just stop saying that it's a piece of shit just yet.

Or we can look at what has been released for the system to date, especially what Nintendo has made for it, and draw reasonable conclusions. The best-looking and best-playing games are Nintendo Land and NSMBWiiU. Leaving aside fun, magic, and art direction, because this is a tech thread, we can see what the machine can do right now.

Yes, if a company spent 5 years working with the hardware and had a $100,000,000 budget for a game, it would look better and play better, but not that much better. Shooting cartoony space marines and/or aliens in Halo 4 ain't much different than in Halo 3, despite all the magic Dorito dust that's been spread around.

The fact is, Nintendo's showcase games look current-gen and show no signs of the Wii U having much over and above the PS360 in terms of hardware grunt. And yes, I believe that Wii Sports showed what the Wii was capable of.
 
The original Wii power situation was strange, looking back on it. The first anyone heard of the Revolution being "less than powerful" was when Nintendo's own VP of Corporate Affairs, Perrin Kaplan, said it would be "2 to 3 times" as powerful as a Gamecube ("It's not about turbo power, it's what you do with it"). Even though it was Nintendo themselves who said it, tons of people refused to believe it; the GAF thread on that was amazing. The next day, however, news started spreading around the internet that Kaplan's statement had been wrong... but it wasn't Nintendo who took it back, it was IGN, who only said "The information was later determined to be false" without attributing a single source. Yet everybody totally believed that, because they wanted the Revolution to be powerful so badly and didn't think Nintendo would release something that wasn't.

However, a few months later, IGN posted a news story with rumored specs they got from developers, and those specs matched up with what Kaplan had said (and there was even some thought that it may be less than 2 times as powerful). And at that point, everyone totally believed them. They wailed and gnashed their teeth and said Nintendo were morons, but they didn't disbelieve the low specs.

Note, this was all long before the system was released, before it even got its "Wii" name, before any game footage had been released. So it was quite a different circumstance than we are seeing these days with the Wii U, where even after the system has launched, people are arguing about system power.

Yeah, I think once everything settled down everyone was relieved when Nintendo finally revealed the final name of the "Revolution". Absolutely no controversy there.
 
There are apparently insiders who have said that the GPU is "not even close" to 600 GFLOPS. Combine that with how the R700 architecture scales, the 1.5x-360 rumors, and what the overall balance would be given the known components, and it is very safe to say that we are looking at 320-400 SPUs. It sucks that there have been so many BS rumors out there and that they got people's hopes up (including my own), but it's still a cool little system with a lot of potential for great games and graphics.

When and where did this happen? Not that I couldn't believe it, but "not even close" reads like "barely half". My guess has been +/-460 GFLOPS for quite some time, but even that would be too optimistic going by that statement.
 
I don't feel like getting into the whole Ideaman debate. I've talked to him on chat and he came off as a nice guy. I believe he was proven to have solid info. Unfortunately, this was when everyone was breaking things down into 2x, 3x, etc. As we know, those multipliers can be quite misleading. Yeah, it's 2x to 3x in some areas (cache, eDRAM, usable system RAM), but 1/2 in others (system RAM bandwidth, likely CPU clock speed and threads).

Thanks :p

Well, on the multipliers: from my very first message, I said it was closer to 2x than 5x. And one thing I'm happy to see a lot of people remember is that I always included the GamePad screen in this multiplier (flawed, we know, but everyone is using that kind of measurement). So even at launch, with some games, we're already witnessing at the very least a 1.5x Xbox 360 level of performance: 360-ish content on the TV plus different 480p 3D content on the GamePad during certain sequences. Very close, and we'll reach the 2x-2.5x figure I gave back in February, it's a sure thing. It's the same for the multipliers pertaining to components: I said it was oscillating between 2x and 4x depending on the hardware piece, I think. The 4x, plus all the back-and-forth discussion in WUST 2, meant people understood well that the 4x was for the memory (4x 512MB). As for the GPU, I haven't participated in your heated debates because I don't have GFLOP or ROP numbers or other parameters. Just that, from what I've heard, it's relatively safe to assume the GPU is capable of at least around 2x what the Xbox 360 GPU can do. Now, does this 2x come from 2x more pure "raw power", or from, say, 1.2x more power plus 0.8x "faked" by the more modern architecture, features and effects that the Xbox 360 and PS3 GPUs can't handle? I don't know.

For all this info and these guesses (on GPU power in this case), I was reasonable and nothing was proven wrong; on the contrary, I stand proudly in my shoes when I see, for example, what happened with Miiverse. However, I must say that I was pretty "public-relationized" by all that info around E3 2011 praising the system for its accessibility, its easiness, how quick it is to port current-gen HD titles to it, etc. It seems that's not the case, and the Wii U requires quite an amount of work and optimization to take advantage of its characteristics.
 
It kind of hurts to see them go through it. Especially those that can't just go "Wow... WTF Nintendo?"

Yeah. There are numerous factors that need to be considered when it comes to what the Wii U is theoretically capable of at its most optimised and efficient rendering performance, but the writing is on the wall at this point, if it hasn't been for some time.
 
Let's make some kind of realistic worst case scenario up:

DSP: 121.5MHz (system base clock)
CPU: 1458MHz (12 x 121.5)
GPU: 486MHz (4 x 121.5)
MEM1: 486MHz (4 x 121.5)
MEM2: 729MHz (6 x 121.5)

Everything in sync, just the way Nintendo likes it. To switch to legacy mode, the system only needs to lower the multipliers. Base clock would be the same as the Wii's.

Next up: Performance:

CPU: ~17.5GFLOPS (1458MHz x 3 cores x 4 floating point instructions per cycle)
GPU: ~310GFLOPS (assuming 320 shader units)
MEM1: ~57.9GB/s (486MHz x 1024bit)
MEM2: ~10.9GB/s (729MHz x 2 x 64bit)
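The arithmetic behind those "worst case" numbers can be reproduced directly. All inputs come straight from the scenario above; the one extra assumption is the standard 2 FLOPs (a fused multiply-add) per shader unit per cycle, which is what makes the ~310 GFLOPS figure work out, and the bandwidth figures land on the quoted values when expressed in binary (GiB/s) units:

```python
# Reproducing the clock/width arithmetic from the scenario above.

GIB = 1024 ** 3  # the quoted GB/s figures are binary (GiB/s)

def gflops(clock_hz, units, ops_per_cycle):
    return clock_hz * units * ops_per_cycle / 1e9

def gib_per_s(clock_hz, bus_bits, pumps=1):
    # pumps=2 for DDR (two transfers per clock edge pair)
    return clock_hz * pumps * (bus_bits / 8) / GIB

cpu  = gflops(1458e6, units=3, ops_per_cycle=4)    # ~17.5 GFLOPS
gpu  = gflops(486e6, units=320, ops_per_cycle=2)   # ~311 GFLOPS
mem1 = gib_per_s(486e6, bus_bits=1024)             # ~57.9 GiB/s
mem2 = gib_per_s(729e6, bus_bits=64, pumps=2)      # ~10.9 GiB/s

print(f"CPU  {cpu:5.1f} GFLOPS")
print(f"GPU  {gpu:5.1f} GFLOPS")
print(f"MEM1 {mem1:5.1f} GiB/s")
print(f"MEM2 {mem2:5.1f} GiB/s")
```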
 
Let's make some kind of realistic worst case scenario up:

DSP: 121.5MHz (system base clock)
CPU: 1458MHz (12 x 121.5)
GPU: 486MHz (4 x 121.5)
MEM1: 486MHz (4 x 121.5)
MEM2: 729MHz (6 x 121.5)

Everything in sync, just the way Nintendo likes it. To switch to legacy mode, the system only needs to lower the multipliers. Base clock would be the same as the Wii's.

Next up: Performance:

CPU: ~17.5GFLOPS (1458MHz x 3 cores x 4 floating point instructions per cycle)
GPU: ~310GFLOPS (assuming 320 shader units)
MEM1: ~57.9GB/s (486MHz x 1024bit)
MEM2: ~10.9GB/s (729MHz x 2 x 64bit)

Even this would be fine. It's not the level of hardware I wanted, because developers will scale their games down to the "lowest" settings with something like this, but considering that it wouldn't actually stop 3rd parties from being able to port down to the Wii U, it's fine.

If there were just two things I could have changed about the Wii, they would have been its single-core CPU (make it dual-core) and a GPU with programmable shaders running at 40+ GFLOPS. That is something that would have gotten huge third-party support and probably would have let me avoid buying the other consoles last gen, since I'm a PC gamer, especially when it comes to graphics.
 
Developers will scale down a game to whatever publishers tell them to, and publishers will decide based on the perceived market for a title, and consequently on whether they deem the potential return on investment worth the opportunity cost of devoting resources to a downport.

The Wii U has yet to prove itself in that regard.

Also, this thread needs less meltdowns and more new information - although the latter seems to be the cause of the former. Someone X-ray something already.
 
Yeah. There are numerous factors that need to be considered when it comes to what the Wii U is theoretically capable of at its most optimised and efficient rendering performance, but the writing is on the wall at this point, if it hasn't been for some time.



Well, as I said, when you look at the Wii U's rumoured specs, or at least what we can expect, it looks decent. But the low power consumption ruins everything.
 
Developers will scale down a game to whatever publishers tell them to, and publishers will decide based on the perceived market for a title, and consequently on whether they deem the potential return on investment worth the opportunity cost of devoting resources to a downport.

The Wii U has yet to prove itself in that regard.

Right, which is why the performance doesn't matter much if it's not fast enough to just dump in PS3/360 code without optimizing. It could be anywhere between 360 level and that dump-in-code level and it wouldn't matter much at all.

A lot of people think a "weak" Wii U means Nintendo is in the same place as they were last gen with the Wii, but that's not the case. The Wii U can use 360 code, where the Wii could not, and the same is true of XB3 or PS4 code: the Wii U can understand it, and from there it's about scaling, exactly as you say.

If we get today's GPU effects with the PS3/360's general level of performance, I'm OK with that. 720p doesn't suddenly mean blurry, horrible graphics when these other consoles come out, though there will be a difference, just like there is a difference today between the highest-end PCs and the PS3/360. (I know the counterargument is that PCs don't use the graphics power they have because games are built around consoles, but it's more that the effects in these engines are not going to give you the "next gen graphics" YouTube results you keep finding; those are specially designed engines built for the fastest PC GPUs on today's market. There is a huge difference there.)
 
Publishers need to take into account that devoting development resources to an SKU may mean they can't put those resources into DLC, a sequel, or a different game. Performance comes into play in SKU decisions when publishers deem it not worth the hassle; and if things like the slow RAM and the "horrible, slow CPU" are too much of a hindrance, and third parties don't see success with their target demographics on the platform, then I don't see how history wouldn't repeat itself.

People seem to talk about game engines as if there's a slider that you can drag down from Durango to Wii U. Or a button you push that converts to Wii U. I'm not a techhead but that doesn't seem particularly realistic.

Sony and Microsoft's market/demographic overlap meant that despite the effort involved in porting between systems that were very different technically, the return was worthwhile. But early on, it often wasn't worth the effort to port to the Cell.
 
Let's make some kind of realistic worst case scenario up:

DSP: 121.5MHz (system base clock)
CPU: 1458MHz (12 x 121.5)
GPU: 486MHz (4 x 121.5)
MEM1: 486MHz (4 x 121.5)
MEM2: 729MHz (6 x 121.5)

Everything in sync, just the way Nintendo likes it. To switch to legacy mode, the system only needs to lower the multipliers. Base clock would be the same as the Wii's.

Next up: Performance:

CPU: ~17.5GFLOPS (1458MHz x 3 cores x 4 floating point instructions per cycle)
GPU: ~310GFLOPS (assuming 320 shader units)
MEM1: ~57.9GB/s (486MHz x 1024bit)
MEM2: ~10.9GB/s (729MHz x 2 x 64bit)
I think that's still fairly optimistic for a "worst-case" scenario.
In my old calcudiction I ended up at 240 SUs. But that's still before taking the now known die size into account. The eDRAM is in there, so it's not actually a 156mm² GPU. Probably closer to a 110mm² GPU and 46mm² eDRAM.

It could well be 160 SUs.
 
I think that's still fairly optimistic for a "worst-case" scenario.
In my old calcudiction I ended up at 240 SUs. But that's still before taking the now known die size into account. The eDRAM is in there, so it's not actually a 156mm² GPU. Probably closer to a 110mm² GPU and 46mm² eDRAM.

It could well be 160 SUs.
Then the GPU would be weaker than Xenos, and we know it isn't. Your assumption doesn't take the relationship between clock speed, transistor count and TDP into account.
 
Publishers need to take into account that devoting development resources to an SKU may mean they can't put those resources into DLC, a sequel, or a different game. Performance comes into play in SKU decisions when publishers deem it not worth the hassle; and if things like the slow RAM and the "horrible, slow CPU" are too much of a hindrance, and third parties don't see success with their target demographics on the platform, then I don't see how history wouldn't repeat itself.

People seem to talk about game engines as if there's a slider that you can drag down from Durango to Wii U. Or a button you push that converts to Wii U. I'm not a techhead but that doesn't seem particularly realistic.

Sony and Microsoft's market/demographic overlap meant that despite the effort involved in porting between systems that were very different technically, the return was worthwhile. But early on, it often wasn't worth the effort to port to the Cell.

You do realize this is basically exactly how graphics options work on a PC, right?

Then the GPU would be weaker than Xenos, and we know it isn't. Your assumption doesn't take the relationship between clock speed, transistor count and TDP into account.

Or Ubisoft's possibly optimistic 1.5x GPU number, which taken literally would be 360 GFLOPS; but the newer architecture has to be taken into account to take it truly literally, so 310 GFLOPS isn't unrealistic for that 1.5x BS multiplier.
 
Then the GPU would be weaker than Xenos, and we know it isn't. Your assumption doesn't take the relationship between clock speed, transistor count and TDP into account.
I thought Xenos counts as 80 SUs, if you "translate" the counting styles (16 x 5-way SIMD units)?

e: Anyone got the die size for the Valhalla version of Xenos? It's the same process node, the eDRAM is on the die. So it would make an interesting reference point for comparison to the Wii U GPU. I've been googling up and down and didn't find anything.
 
You do realize this is basically exactly how graphics options work on a PC, right?



Or Ubisoft's possibly optimistic 1.5x GPU number, which taken literally would be 360 GFLOPS; but the newer architecture has to be taken into account to take it truly literally, so 310 GFLOPS isn't unrealistic for that 1.5x BS multiplier.
I think you missed his point.
 
Yes. I wasn't aware adjusting options in a PC game was equivalent to creating an additional SKU.

That wasn't what your point was. Obviously, "scaling down" an engine and coding for a new platform are two completely different things. PCs scale a game engine's graphics to fit the performance level of the hardware, which is exactly what you're talking about when people assume developers could do the same between the Wii U and Durango.

However, coding for the Wii U is obviously a completely different topic, as it's about architecture, not hardware performance.
 
I think that's still fairly optimistic for a "worst-case" scenario.
In my old calcudiction I ended up at 240 SUs. But that's still before taking the now known die size into account. The eDRAM is in there, so it's not actually a 156mm² GPU. Probably closer to a 110mm² GPU and 46mm² eDRAM.

It could well be 160 SUs.


RV740 had 640 SUs at a die size of 137mm². Why would the Wii U GPU have less than half that with a ~110mm² die size for the GPU portion?
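For what it's worth, naive area scaling makes the same point numerically. This sketch assumes shader density comparable to RV740's and uses the ~110mm² GPU-logic figure floated earlier in the thread; it ignores that the two dies devote different fractions of area to non-shader blocks (TMUs, ROPs, I/O), so it's only a rough plausibility argument:

```python
# Naive area-scaling estimate: if shader density matched RV740's
# (640 SUs in a 137 mm^2 die), how many SUs would fit in ~110 mm^2
# of GPU logic? This is an upper-bound style argument only, since
# the area split between shaders and other blocks differs per die.

RV740_SUS = 640
RV740_AREA_MM2 = 137.0
WIIU_GPU_LOGIC_MM2 = 110.0  # die minus the ~46 mm^2 eDRAM estimate

sus_by_area = RV740_SUS * WIIU_GPU_LOGIC_MM2 / RV740_AREA_MM2
print(f"~{sus_by_area:.0f} SUs by pure area scaling")
```

Even after generous deductions for eDRAM interfaces and fixed-function logic, pure area scaling lands far above 160 SUs, which is the point of the comparison.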
 
I think you missed his point.

The bolded part I responded to was about scaling an engine down to the Wii U's performance level from the other next-gen consoles. In this way it's very similar to what a PC does, especially since all of the same engines will be supported on the Wii U: turning off various effects (like PC video options) and "sliding a bar" down to hit the Wii U's lower performance is generally how the engine would work. After that, it's about optimizing code for the Wii U. That's generally how Darksiders 2 was ported to the Wii U in just 5 short weeks by a few people at Vigil; you can basically dump 360 code onto the Wii U, it just isn't going to run as well, because the Wii U is designed to do things differently.
 
You do realize this is basically exactly how graphics options work on a PC, right?



Or Ubisoft's possibly optimistic 1.5x GPU number, which taken literally would be 360 GFLOPS; but the newer architecture has to be taken into account to take it truly literally, so 310 GFLOPS isn't unrealistic for that 1.5x BS multiplier.

Can we expect iPad versions of Durango/Orbis games? Since it understands all the code, devs just need to push the slider back and hit the port button.

The bolded part I responded to was about scaling an engine down to the Wii U's performance level from the other next-gen consoles. In this way it's very similar to what a PC does, especially since all of the same engines will be supported on the Wii U: turning off various effects (like PC video options) and "sliding a bar" down to hit the Wii U's lower performance is generally how the engine would work. After that, it's about optimizing code for the Wii U. That's generally how Darksiders 2 was ported to the Wii U in just 5 short weeks by a few people at Vigil; you can basically dump 360 code onto the Wii U, it just isn't going to run as well, because the Wii U is designed to do things differently.

How do you know every engine is running on the Wii U? Is SE's new Luminous engine going to run on it?
 
That wasn't what your point was
Why are you telling me what my point was? :/

My point was that people are making it out as if, simply because things are "scalable", churning out SKUs is a touch-of-a-button affair. And that doesn't seem remotely realistic.

Investment must be made, and as with any investment return is expected.

If one can't foresee a return, then investment won't be made, regardless of how small that initial effort may seem.
 
Thanks :p

Well, on the multipliers: from my very first message, I said it was closer to 2x than 5x. And one thing I'm happy to see a lot of people remember is that I always included the GamePad screen in this multiplier (flawed, we know, but everyone is using that kind of measurement). So even at launch, with some games, we're already witnessing at the very least a 1.5x Xbox 360 level of performance: 360-ish content on the TV plus different 480p 3D content on the GamePad during certain sequences. Very close, and we'll reach the 2x-2.5x figure I gave back in February, it's a sure thing. It's the same for the multipliers pertaining to components: I said it was oscillating between 2x and 4x depending on the hardware piece, I think. The 4x, plus all the back-and-forth discussion in WUST 2, meant people understood well that the 4x was for the memory (4x 512MB). As for the GPU, I haven't participated in your heated debates because I don't have GFLOP or ROP numbers or other parameters. Just that, from what I've heard, it's relatively safe to assume the GPU is capable of at least around 2x what the Xbox 360 GPU can do. Now, does this 2x come from 2x more pure "raw power", or from, say, 1.2x more power plus 0.8x "faked" by the more modern architecture, features and effects that the Xbox 360 and PS3 GPUs can't handle? I don't know.

For all this info and these guesses (on GPU power in this case), I was reasonable and nothing was proven wrong; on the contrary, I stand proudly in my shoes when I see, for example, what happened with Miiverse. However, I must say that I was pretty "public-relationized" by all that info around E3 2011 praising the system for its accessibility, its easiness, how quick it is to port current-gen HD titles to it, etc. It seems that's not the case, and the Wii U requires quite an amount of work and optimization to take advantage of its characteristics.


I don't remember where I read it now, but I remember reading that the Wii U would be the hardest to develop for out of the Wii U, PS4 & Xbox Next, with the Xbox Next being the easiest.
 
Why are you telling me what my point was? :/

My point was that people are making it out as if, simply because things are "scalable", churning out SKUs is a touch-of-a-button affair. And that doesn't seem remotely realistic.

Investment must be made, and as with any investment return is expected.

If one can't foresee a return, then investment won't be made, regardless of how small that initial effort may seem.

Do you even read my posts? I explained that the code has to be optimized. But if you're going to say it's not about moving a slider and turning certain effects on and off, then I think you need to become a bit more of a "tech head" before you talk on these points.

These exact things have basically been confirmed by different studios as what they are doing, and it's nothing new. The Gamecube didn't get very good ports of PS2 games even though it was more capable, because the code that was moved over had been written to the PS2's strengths and didn't run as fast. The Xbox could generally port those games over perfectly because it was so much faster than the PS2.

So yes, I'm telling you what your point was, because I only bolded the part I was responding to, which wasn't the investment or anything else. It was the graphics slider.

Can we expect iPad versions of Durango/Orbis games? Since it understands all the code, devs just need to push the slider back and hit the port button.
The ones that make sense? Sure... UE4, for instance, can scale down that far, so it's technically possible. It isn't as simple as just throwing the code on the console and hitting a switch, but after some optimization there's little reason it couldn't be done.

How do you know every engine is running on the Wii U? Is SE's new Luminous engine going to run on it?
Rumors of UE4; obviously UE3, Unity, the IW engine, AnvilNext and Frostbite have already been confirmed. SE's new Luminous engine could be ported to it, but I did miss that one. Square signed a deal to use UE4, right? So who knows what Luminous will really accomplish within their dev cycles.
 
I don't remember where I read it now, but I remember reading that the Wii U would be the hardest to develop for out of the Wii U, PS4 & Xbox Next, with the Xbox Next being the easiest.

I hope for their sake it's not. If it is, it would be a letdown compared to the statements we heard around E3 2011 about the accessibility of the system. Well, it's already a disappointment for me (again, I'm strictly talking about ports here; I'm as confident as I ever was, so reasonably optimistic and enthusiastic, about games tailored for the console).
 
Unless the GPU and eDRAM are packing an army of bandwidth-ignoring purple Pikmin, I'm pretty sure the dream of the Wii U easily handling ports from the 360/PS3 is out the window (even if possible... would any publisher dedicate the money?), along with the dream of receiving anything more than the occasional pity port from Durango/Orbis. You're talking about not only a lacking CPU, but a GPU that has to fight for very limited memory bandwidth. There's only so much that 32MB of eDRAM is going to be able to do to help in a memory sense.

And because of that, the GPU will be cockblocked. If it had any true potential for general-purpose code... this didn't help. So: a limited CPU, and a limited GPU because of low-bandwidth DDR3. An esoteric design by any stretch, and probably not suitable for much more than Nintendo's greatest hits.

Most devs aren't going to have the time or money to tailor their games around the WiiU's strengths and faults.


Very well said, and something I think everyone needs to accept as the likeliest reality of the situation.

In fact, with this gen winding down and publishers shifting over to next-gen development, I wouldn't be surprised if this launch effort is the best output from them that we see for a while. I mean, what incentive has Nintendo given them with hardware specced the way it is?
 
Very well said, and something I think everyone needs to accept as what will be the likeliest reality of the situation.

In fact, I wouldn't be surprised if, with this gen winding down and publishers shifting over to next-gen development, this launch effort is the best output from them that we see for a while. I mean, what incentive has Nintendo given them with hardware specced the way it is?

The Gamecube's easier (to code for) and more powerful hardware (compared to the PS2) didn't win over 3rd parties.

If the Wii U sells 20M before the second half of 2014 while the other new consoles are hitting 5-10M apiece, I think you will see every effort made to keep development on PS360 or to port titles across XB3/PS4/Wii U, because hitting a 30-40M console base is far more likely to return investments than 10-20M.

It's exactly why the PS3's architecture was worked on so hard after its release: the 360's market wasn't big enough at the time to support the larger budgets that HD games were seeing.
 
The Gamecube's easier (to code for) and more powerful hardware (compared to the PS2) didn't win over 3rd parties.

If the Wii U sells 20M before the second half of 2014 while the other new consoles are hitting 5-10M apiece, I think you will see every effort made to keep development on PS360 or to port titles across XB3/PS4/Wii U, because hitting a 30-40M console base is far more likely to return investments than 10-20M.

It's exactly why the PS3's architecture was worked on so hard after its release: the 360's market wasn't big enough at the time to support the larger budgets that HD games were seeing.

I don't see the Wii U doing 20M over the next 18 months.
 
The Gamecube's easier (to code for) and more powerful hardware (compared to the PS2) didn't win over 3rd parties.

If the Wii U sells 20M before the second half of 2014 while the other new consoles are hitting 5-10M apiece, I think you will see every effort made to keep development on PS360 or to port titles across XB3/PS4/Wii U, because hitting a 30-40M console base is far more likely to return investments than 10-20M.

It's exactly why the PS3's architecture was worked on so hard after its release: the 360's market wasn't big enough at the time to support the larger budgets that HD games were seeing.

This is totally irrelevant. Do you think publishers are holding off on next-gen development to see how the consoles sell? They're already designing games on PS4/720 dev kits, not Wii U kits, so that battle has already been decided. It's kind of like how the PS3 had all this support before it even launched. Its sales numbers didn't change anything.

The Wii U is just a port machine. For the devs sticking with PS360, the Wii U is an additional platform to port to, if it makes business sense. For the devs already moving on to PS4/720, the Wii U is an additional platform to port to, if it is technically possible and makes business sense. The Wii U won't get much more support than that, and this is all Nintendo's doing.
 
I think I should post a best case scenario as well. I think my clocks are correct so this part stays:

DSP: 121.5MHz (system base clock)
CPU: 1458MHz (12 x 121.5)
GPU: 486MHz (4 x 121.5)
MEM1: 486MHz (4 x 121.5)
MEM2: 729MHz (6 x 121.5)

Other things could be different though:

CPU: ~35GFLOPS (1458MHz x 3 cores x 8 floating point instructions per cycle)
GPU: ~620GFLOPS (assuming 640 shader units)
MEM1: ~463.5GB/s (486MHz x 8192bit)
MEM2: ~10.9GB/s (729MHz x 2 x 64bit)

Pretty sure it's much closer to the worst case scenario though...
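For what it's worth, the arithmetic in that post checks out; a quick script reproduces every figure. Note that the shader-unit count, the 8-FLOPs-per-cycle CPU, and the 8192-bit eDRAM bus are all the poster's assumptions, and the memory figures only match if "GB/s" is read as GiB/s:

```python
# Sanity-check the rumored "best case" figures above.
BASE_HZ = 121.5e6  # rumored system base clock

cpu_hz  = 12 * BASE_HZ  # 1458 MHz
gpu_hz  = 4 * BASE_HZ   # 486 MHz
mem1_hz = 4 * BASE_HZ   # 486 MHz (eDRAM)
mem2_hz = 6 * BASE_HZ   # 729 MHz (DDR3)

# CPU: 3 cores, assumed 8 floating-point ops per cycle per core
cpu_gflops = cpu_hz * 3 * 8 / 1e9        # ~35.0

# GPU: assumed 640 shader units, 2 ops (multiply-add) per cycle each
gpu_gflops = gpu_hz * 640 * 2 / 1e9      # ~622.1

# MEM1: assumed 8192-bit bus, one transfer per cycle, in GiB/s
mem1_gibs = mem1_hz * 8192 / 8 / 2**30   # ~463.5

# MEM2: DDR3 (2 transfers per cycle) on a 64-bit bus, in GiB/s
mem2_gibs = mem2_hz * 2 * 64 / 8 / 2**30  # ~10.9

print(cpu_gflops, gpu_gflops, mem1_gibs, mem2_gibs)
```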
 
I don't see the Wii U doing 20M over the next 18 months.

Me neither, to be honest.

I expect it to sell better than the GC, but worse than the N64 when it's all said and done.

The WiiU is no Wii when it comes to hype and interest. That was obvious weeks ago.
 
The Gamecube's easier (to code for) and more powerful hardware (compared to the PS2) didn't win over 3rd parties.

If the Wii U sells 20M before the second half of 2014 while the other new consoles are hitting 5-10M apiece, I think you will see every effort made to keep development on PS360 or to port titles across XB3/PS4/Wii U, because hitting a 30-40M console base is far more likely to return investments than 10-20M.

It's exactly why the PS3's architecture was worked on so hard after its release: the 360's market wasn't big enough at the time to support the larger budgets that HD games were seeing.

The PS3 was worked on because developers could do (and needed to do) 360/PS3/PC. Next gen they'll have 720/PS4/PC and might ignore the Wii U if the power gap is large enough. I think the Wii U will have it quite a bit better than the Wii did, but I expect the Wii U port to be the exception rather than the rule.
 
I think I should post a best case scenario as well. I think my clocks are correct so this part stays:

DSP: 121.5MHz (system base clock)
CPU: 1458MHz (12 x 121.5)
GPU: 486MHz (4 x 121.5)
MEM1: 486MHz (4 x 121.5)
MEM2: 729MHz (6 x 121.5)

Other things could be different though:

CPU: ~35GFLOPS (1458MHz x 3 cores x 8 floating point instructions per cycle)
GPU: ~620GFLOPS (assuming 640 shader units)
MEM1: ~463.5GB/s (486MHz x 8192bit)
MEM2: ~10.9GB/s (729MHz x 2 x 64bit)

Pretty sure it's much closer to the worst case scenario though...



I can't see the Wii U GPU being 620GFLOPS. That would mean the GPU drawing ~50W alone, based on the RV740 design.
 
I can't see the Wii U GPU being 620GFLOPS. That would mean the GPU drawing ~50W alone, based on the RV740 design.

Not only that, but I think 640 SUs is outside the best-case scenario and into fantasy land, given the die size and the fact that you also have to make room on it for the 32MB of eDRAM and the ARM core.

I think 400ish is the best case, and 320 SUs is the most likely case.
 
Not only that, but I think 640 SUs is outside the best-case scenario and into fantasy land, given the die size and the fact that you also have to make room on it for the 32MB of eDRAM and the ARM core.

I think 400ish is the best case, and 320 SUs is the most likely case.


The ARM core takes up next to nothing. I think 640 SUs would be possible, even with the eDRAM included.
320 SUs seems like a really small amount considering the die size.
I think it could be 640, but 480 is more likely.
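To put the competing guesses in perspective, here's what each shader-unit count floated in this thread would imply at the rumored 486 MHz clock, assuming 2 FLOPs (a multiply-add) per SU per cycle as in the best-case estimate above:

```python
GPU_HZ = 486e6  # rumored GPU clock (4 x 121.5 MHz base)

# Shader-unit counts suggested in this thread -> implied peak GFLOPS
implied = {sus: sus * 2 * GPU_HZ / 1e9 for sus in (320, 400, 480, 640)}

for sus, gflops in implied.items():
    print(f"{sus} SUs -> {gflops:.1f} GFLOPS")
```

So only the 640-SU guess reaches the ~620 GFLOPS figure; 320 SUs lands nearer 311 GFLOPS, roughly the 360/PS3 ballpark.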
 
This is totally irrelevant. Do you think publishers are holding off on next-gen development to see how the consoles sell? They're already designing games on PS4/720 dev kits, not Wii U kits, so that battle has already been decided. It's kind of like how the PS3 had all this support before it even launched. Its sales numbers didn't change anything.

The Wii U is just a port machine. For the devs sticking with PS360, the Wii U is an additional platform to port to, if it makes business sense. For the devs already moving on to PS4/720, the Wii U is an additional platform to port to, if it is technically possible and makes business sense. The Wii U won't get much more support than that, and this is all Nintendo's doing.

Tell that to the Vita. Sales numbers will control this gen far more than the previous one.

Also, the 3DS had sold 22M in the same amount of time, with only 1 holiday season.

So while I'm not claiming that the Wii U will sell 20M in the next 18 months, I am saying it's possible that it could.

BTW, I love that everyone here knows exactly how this generation is going to turn out. Do you guys realize that the majority of gamers won't even notice a graphical difference between last gen and next gen? They might assume one is there, but they generally won't see it. Just look at all the people who thought the PS2 was super powerful for its generation.
 
Me neither, to be honest.

I expect it to sell better than the GC, but worse than the N64 when it's all said and done.

The WiiU is no Wii when it comes to hype and interest. That was obvious weeks ago.

I really don't see any situation where this could end up happening. I don't expect it to sell as well as the Wii, but there is a big gap between that and selling less than the N64.
 