Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

That should not happen. Are you connecting it via HDMI? If so, go to your graphics settings and look for Overscan; you can adjust a slider there so that the picture fits perfectly.

Bloody hell, thanks for that mate! I was having the same problem but have managed to get rid of the annoying black border that's been getting on my nerves for ages! Ta!

I just assumed it was my telly being cheap, only cost me £249.99 from Sainsburys two years ago and it's a 37" Full HD telly with Freeview. No input lag, no motion blur, no ghosting, vibrant colours and the black is nice and black. It even has a neat little feature that automatically adjusts the volume when the adverts come on so you don't get your ears blasted during commercial breaks. Best thing I've ever bought I think, definitely a bargain.
 
Very cool. Thanks Rösti!
I hope that the DD roster of Wii U will have some surprises for us. :)

And hey, if we give the GPU a German name, then we pick Project Salatgurke!

That'll work, too, haha! I just want a German-named something associated with the system so that Conan O'Brien can catch wind of it and go crazy into his 'fake-German' routine one night, hehe..
 
*tries to recall what he meant to say earlier today.. fails. Starts thought process anew*

Ok, first things first: on the subject of WiiU's upscaling 360 games to 1080p (or lack thereof).

The general statement that 'if the GPU is sufficiently more powerful, one should be able to trivially upres a workload that was originally running at a given fb res on an inferior GPU' is generally correct. Yet (there's always a catch, isn't there?) that assumes that 'sufficiently' here meets a certain definition: whatever the original GPU's bottleneck at each frame of the original workload might be, the new GPU should be able to meet it at the higher res. In particular, if there's a frame of content which is entirely fillrate-limited (in the general case: fragment ALUs + texel fetch + ROPs), the new GPU should have enough of those resources (i.e. count * speed) to meet new_res:old_res times the original fillrate requirements.

'But we did not upres the texture assets - why more texel fetching?' would be a good question to ask at this point. The answer is really simple: due to the way common texture filtering works (hint: it uses texture LODs), unless a texture was always projected 1:1 on the screen (say, for UI elements), chances are that most of the time the GPU does not need all the texels from that texture at the original res, thus most of the time it does not use the top-most LOD for a given texture. Now, when we upres the fb, we also upres whatever primitives use that texture, so we implicitly tell the GPU to use a higher-res LOD of the same texture*. So unless the GPU was already using the top-most LOD of the texture (unlikely) at the lower-res fb, at the higher-res fb it will now start using a higher-res LOD. Ergo more tex fetching. So as you see, FLOPs (i.e. shader ALU resources) don't give the full x-times-more-powerful picture (if somebody actually thought they did ; )
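To put some rough numbers on the above, here's a purely illustrative Python sketch (made-up resolutions, and a simplified mip-selection model; real GPUs and real filtering are messier than this):

```python
import math

def upres_cost_sketch(old_res, new_res):
    """Toy estimate of how fill-related work scales when only the
    framebuffer resolution changes and the texture assets stay untouched."""
    old_px = old_res[0] * old_res[1]
    new_px = new_res[0] * new_res[1]
    pixel_scale = new_px / old_px            # more fragments to shade, blend and write out

    # Mip selection is driven by the screen-space footprint of a texel
    # (roughly lod = log2(texels per pixel)). Scaling the framebuffer by 's'
    # per axis shrinks that footprint, dropping the selected LOD by ~log2(s),
    # i.e. the GPU starts fetching from higher-res mip levels.
    lod_shift = math.log2(math.sqrt(pixel_scale))
    return pixel_scale, lod_shift

scale, lod = upres_cost_sketch((1280, 720), (1920, 1080))
print(f"~{scale:.2f}x the fragment/ROP work, mips pulled in by ~{lod:.2f} levels")
# ~2.25x the fragment/ROP work, mips pulled in by ~0.58 levels
```

So even a straight 720p-to-1080p bump asks for well over twice the fill-side work, plus the extra texel traffic from the finer mips.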

But let's assume the new GPU has all the grunt it takes to upres the original content. Does that mean we can flip that res switch and enjoy the show? The alert reader should already be guessing the answer: nope. We have to make sure one more tiny detail is satisfied: that the GPU is made (by the game logic) to never idle at any given moment. IOW, that the GPU is always busy with work, and never waiting for the game logic to feed it. Or IYOW (in yet other words), we have to make sure that our game's performance is entirely GPU-limited. Then, and only then, can we say that if the new GPU has enough grunt to face the old GPU's 'worst nightmares' (read: bottlenecks) at the new res, we can flip that res switch and get a framerate at least as good as the original. Now, most well-optimised game code does not let the GPU idle much, or at all. Unfortunately, shit happens, particularly with new platforms featuring early software: HALs, middleware, compilers, etc.
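A minimal way to picture that last condition (the millisecond figures below are hypothetical, just to show where the headroom can silently disappear):

```python
def frame_ms(cpu_ms, gpu_ms):
    # Toy model: CPU and GPU work overlap, so whichever side is slower sets the frame time.
    return max(cpu_ms, gpu_ms)

# GPU-bound game: a 3x-faster GPU pushing 2.25x the pixels still comes out ahead.
print(frame_ms(cpu_ms=14.0, gpu_ms=16.0))               # 16.0 ms on the old hardware
print(frame_ms(cpu_ms=14.0, gpu_ms=16.0 * 2.25 / 3.0))  # 14.0 ms after the res bump

# CPU-bound (or GPU left idling by the game logic): the res bump buys nothing visible.
print(frame_ms(cpu_ms=18.0, gpu_ms=16.0))                # 18.0 ms before...
print(frame_ms(cpu_ms=18.0, gpu_ms=16.0 * 2.25 / 3.0))   # ...and 18.0 ms after
```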


Now, onto the next thing from the thought process. .. Oops. Buffer underrun. Sorry, going down for maintenance.


* Actually it's more complicated than that, but for the sake of this post that explanation should do.
 
Who said learning isn't fun?
 
Mitsurugi said:
Correct me if I'm wrong: in a local multi-player game like COD, the Wii U would have to process two separate and potentially very dissimilar-looking 720p/60fps video streams (in this instance, the first player could be on the ground in a tank and the second player could be in a chopper providing air support), send one directly to the HDTV, downscale the other's res to 480p and send it to the controller. Right?
There should be no reason the pad view would have to be rendered in 720p.
 
Thanks for the reply blu! I guess I implicitly assumed Nintendo would make a console that was a superset of the Xbox 360 in nearly every way at least graphically, which of course may not be the case. I sure hope for Nintendo's sake that the GPU outperforms Xenos in every way, be it by 50% or by 250%.
 


Looks like the next Sony system (Orbis) is also using some kind of iPad/iPhone screen to communicate with the TV. I'm still wondering which direction all the next-gen systems will choose: whether the weight tips more towards casual, more towards the core, or stays balanced. I hope for the Wii U that the balance leans more hardcore. But if they choose the same direction they did for the 3DS, then it will be ok.
 

Great post, blu, thanks. I think even I, not a true techie, can understand it.
 
It's a 480p LCD. That's pretty much all it'll do.

I can't complain about that one bit; the screens I've seen look just fine.

~~~~~

And as far as console audience maneuvering goes.. the thing about Nintendo is that it has inherently casual-friendly IPs that will always attract some portion of the casual audience. Mario is so friendly and approachable - he's probably the closest thing that Nintendo has to a trump card when competing with the other companies.

So yeah, I hope that Nintendo pursues the core much more aggressively this time around. As long as Mario, Luigi, Toad, & the Princess are around to bring in casuals and female gamers (for some reason, all of the girl gamers I know adore the Italian plumber.. go figure), they'll be able to shore up their casual appeal.
 
The only thing I think people need to accept is that a lot of devs won't up-res or update their games for the Wii U, at least not initially. Granted, it would be great to see higher texture quality at the very least, but whether or not companies do this is the real question. Don't expect it.
Just so long as the system is flexible enough that plenty of 3rd parties are willing to port, period, without making endless Kojima-style excuses - that's all I care about.

This thread is impossible to keep up with, and we're still over two months away from E3, yet I keep coming back. What have you people done to me?!
 
How about: it takes years to design a GPU. If they started in 2009, that means it took them a little under 3 years to finish and have chips ready for fall 2012.
Yeah, but you don't need to design a GPU for three years unless you plan to go in a completely non-standard direction. Bolting an I/O and an audio processor on top of an off-the-shelf GPU is a matter of months, not years. Whatever they have now should be very different from a PC GPU.
 
No. None of the demos we've actually seen do anything meaningful with the Upad.

We did. The bird demo was showing two distinct views of the same scene.

I'd say these days, as long as Epic and maybe Crytek are cool with Nintendo's custom extensions, they'd be fine. It's even possible companies like Epic had input there as well - instead of designing an engine around a GPU, they'd design a GPU around an engine. The audio algorithms embedded in the Gamecube GPU were in part developed by Factor 5 for example, and they also did the MusyX audio middleware for the system.

I can see Nintendo taking this train of thought but I think that would be stupid of them to do. Many other developers use their own engines, like EA for instance, and if they have to do a tonne of work to get their stuff optimised on Wii U's hardware, they might just take the simple approach of using the standard shader functionality with the slight bump in clock speeds and call it a day. They could technically then say the Wii U version performs better.
 
We did. The bird demo was showing two distinct views of the same scene.
Yeah, I checked it again. Can't believe I missed that.

I don't see Nintendo doing their own heavily customized shaders. First of all it's unclear how it would be advantageous at all, when they can spend the silicon on more 'regular' shaders. Furthermore, messing with the standard shader pipeline means rewrites are necessary for every third party graphics engine. Nintendo should not want to go through that *** again. I don't see any reason to think it's likely except for the fact that the Wii had weird shaders too. But there's nothing in the Wii that Nintendo should try to bring back in the Wii U.
 
Just so long as the system is flexible enough that plenty of 3rd parties are willing to port, period, without making endless Kojima-style excuses - that's all I care about.

This. This is all I ask. Don't use the screen, make it a simple status or inventory screen.. I don't care. Just spare us the flimsy "we need to do a game specifically for that system, and we're tossing around ideas right now" excuses. I'd have much more respect for a developer that came forward and said, "we don't care for Nintendo, so no" instead of "not now, but maaaaybe sometime in the future. I'm not sure."

Developers: don't be chickenshit in your statements - come out and admit that you're really not considering the machine, so that we don't have to give any more thought to you for the rest of the generation. Thanks a bunch.
 

This so much.
 

The Upad excuses are also especially ridiculous when you consider that the DS touch screen never caused 3rd parties to hesitate to make DS games out of some imaginary pressure to fully utilize the touch screen, and the 3D effects on 3DS sure aren't stopping them either (and neither factor apparently ever had a bearing on Kojima porting Snake Eater; hmmm, isn't that interesting? Hypocrite).
 
Yeah, I checked it again. Can't believe I missed that.

There was a discussion about this earlier (either in this thread or the last). The general consensus was that it wasn't a clear-cut case whether the images on the uPad were indeed real-time in-game content or a video feed of the environment (maybe with a couple of elements overlaid). I posted a YouTube video where it did seem to be real-time content, but it was hard to say for sure.
 
Man, fixed function shaders in a 2012 console GPU? This is getting to be too much! What possible advantage could that serve outside of Wii BC?
The only thing I can think of is simpler abstraction, but even that is handled very well in modern systems and suites. So unless a bunch of people who have only worked with, let's say, the Sega Pico want to create games for the Wii U, I don't see why fixed function (shaders) would play any significant role outside of, as you mentioned, backwards compatibility.
 
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300-400 GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.

If this turns out to be the case, then the WiiU ends up being to the 360 what the Wii was to the GCN. I would find that a slap in the face tbh. GPU plus RAM is where the WiiU could set itself apart from the 360, and rather cheaply, 7 years later. And would such a custom chip really be that much cheaper than an existing chip, taking into account the R&D that goes into it?

PS: doesn't it go against even the most pessimistic rumors? Could such a chip do 360 graphics on both screens?

ffs. How many times does this have to be said? It's possible for the Wii U to be "on par" and still be quite a bit more powerful than the 360. "On par" and "more powerful" ARE NOT MUTUALLY EXCLUSIVE.
The Xbox had almost twice the amount of RAM of the PS2, yet I don't think anyone would say the PS2 isn't on par with the Xbox. None of these Vigil statements contradict each other.

Basically what i've been saying. http://www.neogaf.com/forum/showpost.php?p=36352530&postcount=9301
 
I for one do not believe Nintendo & AMD have created Flipper 2; I think any fixed-function hardware from Flipper will be there for BC only. All Wii U games will use programmable shader tech, which has been around since at or before the original Xbox launch. Edit: BEFORE (NV20 / GeForce 3, early 2001).
 
So 1x Xbox 360 + 0.5x Xbox 360 + 0.5x Xbox 360 = 2x Xbox 360, at the very least, considering the two screens. The Wii U isn't "on par" technologically, period, and this reasoning has always assumed the worst that could have happened.

ffs. How many times does this have to be said?
It's possible for the Wii U to be "on par" and still be quite a bit more powerful than the 360.
"On par" and "more powerful" ARE NOT MUTUALLY EXCLUSIVE.
The Xbox had almost twice the amount of RAM of the PS2, yet I don't think anyone would say the PS2 isn't on par with the Xbox. None of these Vigil statements contradict each other.
 
By the way, is anyone else a little less interested in Darksiders II after seeing that boffo cgi trailer of the game? (Darksiders II: Death Strikes) I want a game that looks like THAT! With the moody coloring and surreal rendering.
 
No. None of the demos we've actually seen do anything meaningful with the Upad. Drawing a map on the Upad or mirroring the buffer that was already rendered on the main screen is a very small operation in the grand scheme of things. The E3 demos did not really showcase the Wii U power at all (and they couldn't have since it was the old problematic devkit).

Furthermore, rendering a different scene on the Upad takes roughly 0% extra CPU power, but almost 50% more pixel fillrate, and possibly 100% more vertex processing in an unoptimized scenario. If we assume the Xbox 360 is the baseline here, then that's very unimpressive. Of course, this is above being on par with the Xbox 360, but we're not arguing that. Whatever the case, if the Wii U turns out to be only able to run Xbox 360 games on the main screen and a smaller different viewpoint on the Upad (it likely won't, but let's pretend for now) then that is seriously unimpressive. That's why I consider these statements bad news.
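For what it's worth, the "almost 50%" figure falls straight out of the pixel counts, assuming the commonly reported 854x480 panel on the pad (an assumption, not a confirmed spec):

```python
tv  = 1280 * 720   # main screen at 720p
pad = 854 * 480    # GamePad panel resolution as commonly reported (assumption)
print(f"pad view adds {pad / tv:.0%} more pixels to fill per frame")
# pad view adds 44% more pixels to fill per frame -> 'almost 50% more pixel fillrate'
```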

Don't get me wrong, i understand that having, for some titles (ports mainly), roughly the same things displayed on the main screen between the Wii U version and the Xbox 360 one can be disappointing. But "for some titles" and "roughly" are important variables here: you can expect a guaranteed 720p resolution, with a solid framerate, better overall IQ (AA/texture filtering), at least a small improvement in texture quality, and better/additional effects not handled by the 360 GPU, for the average 3rd party port. At a quick glance, in a crowded showroom, you'll not spot huge differences, but after some time, the contrast will be quite noticeable.

For all your doubts concerning the 0.5x Xbox 360 for the padlet, well, some others already answered you (a matter of resolution + calculating another scene on the same hardware).

And i can add something: imagine a TPS/FPS/BTA/A-RPG with a close-to-the-character view on the main screen, and a bird's-eye view akin to an RTS on the padlet. Not only must the system handle two different angles of the same content, but it has to manage two UIs and two ways of using the engine, more action-centered on the TV and more strategic on the Upad, with additional AI, physics, even some specific sounds, etc. What you see on the main screen is a limited first-person field of view, picturing part of a street. On the other hand, the padlet displays your character from above, in the middle of a town block, with many more units/characters, doing things that you can't see on the TV, and with gameplay and strategic features specific to this screen. Titles in development use the Wii U like that, with, at the very least, Xbox 360+ visuals on the main screen and current-gen complex content at 480p on the padlet. Color me rather impressed, and irritated when i read "the Wii U is on par technically".

And all this was in a v4 dev kit, 3rd party, yadda yadda (i already introduced my information enough) context. So it will get better, and we can surely expect even more from first-party titles.
 
By the way, is anyone else a little less interested in Darksiders II after seeing that boffo cgi trailer of the game? (Darksiders II: Death Strikes) I want a game that looks like THAT! With the moody coloring and surreal rendering.

I'm not interested in DSII at all yet, anyway. I need to see more gameplay footage before I decide.
 
I’m sure these GAF-quotes we’re providing are a source of great entertainment at the Vigil Games morning meetings. :P So much out of so little..
 
ffs. How many times does this have to be said?
It's possible for the Wii U to be "on par" and still be quite a bit more powerful than the 360.
"On par" and "more powerful" ARE NOT MUTUALLY EXCLUSIVE.
The Xbox had almost twice the amount of RAM of the PS2, yet I don't think anyone would say the PS2 isn't on par with the Xbox. None of these Vigil statements contradict each other.

I was one of the first to put the Vigil director's statement in perspective, saying that it could fit the picture of the Wii U's power that we draw from all the gathered information.

But don't nit-pick the "on par" thing. When people hear "on par", as you saw in this thread + the dedicated ones for these Vigil interviews, for 90% of them it means EQUAL (or at best ROUGHLY EQUAL, meaning 10, 20, 30% better than the Xbox 360). And from what i know, the Wii U ISN'T matching this "on par" AT ALL, technically.

The only scenario where the Wii U is "on par" is VISUALLY, for certain titles. And i posted a message one week before the Vigil director's intervention, announcing that in certain circumstances, in a specific context, what is displayed in Wii U versions won't be dramatically different from (a ton better than) their current-gen counterparts. (Still better though, and still in development, in the case of the two titles my sources are working on.)
 
The 3DS isn't using unified shaders; there are no pixel shaders at all. Maestro is fixed function.

Pixel shaders don't have to be unified or fully programmable to be called pixel shaders. The 3DS is capable of fixed-function pixel shaders just like the GeForce 3 and 4 (they were in early GPUs too, just not exposed by DirectX) or the GPU in the original Xbox.
 
I’m sure these GAF-quotes we’re providing are a source of great entertainment at the Vigil Games morning meetings. :P So much out of so little..

Not only Vigil, but also the person(s) that Nintendo has monitoring this thread. They must think we're all mental cases.

I kinda admire the continued silence. The whole world could be chattering about the U, even most saying horrible/false things, and Nintendo stays strongly silent. It's gotta be confidence.. right?
 
Why do some people still expect a 700-1000 GFLOP GPU in the WiiU? I mean, why? We already saw what the Wii U looks like; its volume is a bit bigger than the Wii's. You just physically CAN'T fit that kind of power in it without it catching fire, especially not with the kind of cooler the Wii U has. You could go for something like what one guy on the last page said: 640 stream processors at 300-350 MHz. I think around 400 GFLOPS is something to expect.
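For reference, the napkin math behind figures like that is just peak shader throughput = stream processors x 2 FLOPs per clock (multiply-add) x clock speed. The configurations below are only the ones being floated in this thread, not anything confirmed:

```python
# Peak ALU throughput for an AMD-style part: SPs * 2 FLOPs per clock (MAD) * clock in GHz.
# Hypothetical configurations pulled from the speculation above, not a confirmed spec.
for sps, mhz in [(640, 300), (640, 350)]:
    gflops = sps * 2 * mhz / 1000
    print(f"{sps} SPs @ {mhz} MHz -> {gflops:.0f} GFLOPS peak")
# 640 SPs @ 300 MHz -> 384 GFLOPS peak
# 640 SPs @ 350 MHz -> 448 GFLOPS peak
```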
 
Why do some people still expect a 700-1000 GFLOP GPU in the WiiU? I mean, why? We already saw what the Wii U looks like; its volume is a bit bigger than the Wii's. You just physically CAN'T fit that kind of power in it. You could go for something like what one guy on the last page said: 640 stream processors at 300-350 MHz. I think around 400 GFLOPS is something to expect.

Guess what? It was stated (multiple times) that the Wii U WILL change. Design wise, not just what's inside the damn machine. So how is it not possible that the Wii U will get bigger in size, or is this another one of those "common sense" things?
 
Not only Vigil, but also the person(s) that Nintendo has monitoring this thread. They must think we're all mental cases.

I kinda admire the continued silence. The whole world could be chattering about the U, even most saying horrible/false things, and Nintendo stays strongly silent. It's gotta be confidence.. right?

Nintendo already knows what we want. They decide what we want. So they don't monitor gaming forums.
 
Not only Vigil, but also the person(s) that Nintendo has monitoring this thread. They must think we're all mental cases.

I kinda admire the continued silence. The whole world could be chattering about the U, even most saying horrible/false things, and Nintendo stays strongly silent. It's gotta be confidence.. right?
They just want to surprise people :]
 
Guess what? It was stated (multiple times) that the Wii U WILL change. Design wise, not just what's inside the damn machine. So how is it not possible that the Wii U will get bigger in size, or is this another one of those "common sense" things.
It would have to change more than twice in size to fulfill those wishes, and that won't happen. I mean, a system with a GPU around 400 GFLOPS and twice as much RAM as the PS3/360, even with that kind of controller, should give better visuals while still being smaller in size and not breaking the bank. If you expect a 1 TFLOP GPU in it, then you will be disappointed.
 
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300-400 GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.

What year was R700 released?
 
It would have to change more than twice in size to fulfill those wishes, and that won't happen. I mean, a system with a GPU around 400 GFLOPS and twice as much RAM as the PS3/360, even with that kind of controller, should give better visuals while still being smaller in size and not breaking the bank. If you expect a 1 TFLOP GPU in it, then you will be disappointed.

How can you be so sure it won't happen? Serious question.
 
Not only Vigil, but also the person(s) that Nintendo has monitoring this thread. They must think we're all mental cases.

I kinda admire the continued silence. The whole world could be chattering about the U, even most saying horrible/false things, and Nintendo stays strongly silent. It's gotta be confidence.. right?
I know how to get them to talk, but it involves rather dirty methods and I don't wanna anger Nintendo of America again (won't elaborate, so don't ask). But Nintendo doesn't really need to talk much, as the enthusiasts that care about the technical specifications aren't a large demographic, and those that need to know about the tech already know about it. Surely you could suggest that investors want to know what potential relation the Wii U, via its power, will have to third-party developers, especially as rumors/facts about Microsoft's and Sony's respective next-generation consoles start popping up; but I believe investors (and analysts) care more about what immediate effect(s) the Wii U will have on the market, and that's via the Wii U remote, software (both packaged goods and digital) and connected services such as Netflix. And these things are most appropriate for E3, where they can be shown on a big screen with not only enthusiasts but the general press following as well. And the mass market will learn about these things closer to launch, or if the console appears on, for example, The Today Show or The Ellen DeGeneres Show, or in popular magazines like Reader's Digest.

If they are going to talk while maintaining a stable stock level, it will be mainly for us enthusiasts. And even then not much information may be provided, but then we are extracting info from the FCC, SEC and Singaporean government sites to satisfy our needs, so what difference would a statement by Reggie actually make? I'm eager for something new, but I won't hold up anything other than E3 as the epitome of megaton events.
 
How can you be so sure it won't happen? Serious question.
Because Nintendo doesn't want a PS3-sized case and a controller the size of a tablet in people's living rooms, and going by the rumors, that won't happen. You won't see a 1 TFLOP GPU in it, and the dev kits aren't twice as fast; the sooner some of you get over it, the better.
 
ffs. How many times does this have to be said?
It's possible for the Wii U to be "on par" and still be quite a bit more powerful than the 360.
"On par" and "more powerful" ARE NOT MUTUALLY EXCLUSIVE.
The Xbox had almost twice the amount of RAM of the PS2, yet I don't think anyone would say the PS2 isn't on par with the Xbox. None of these Vigil statements contradict each other.

"On par" means equal to (though I can believe the Vigil guy might not have meant it that way). Also, the PS2 wasn't even close to the Xbox, let alone on par.
 