Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

I just went to the B3D messageboards and a long time poster wanted me to tell you maniacs that:



Get it together you freaking maniacs. :P
That much is obvious. Look at who and what they've been hiring. My money is on some kind of Metroid spinoff, but the revival of another old Adventure series isn't out of the question.
 
Yeah, we talked about that earlier. The only thing that could worry us is the absence, in the Wii U GPU, of some features that will be nearly required in future HD titles. Even a meh situation regarding the raw calculation/rendering capacity of that component would be less concerning. Look at how many entry-level graphics cards can still run demanding DX11 games on PC, given a solid CPU and a good amount of RAM, because they support the features. Sure, you have to lower the shadows, the AA, the draw distance, etc., but even with these sacrifices those titles run, and a lot of them still look nice.

I would be way more shocked at the GPU not being a "DX11-equivalent" than if the GPU ended up with fewer than 500 ALUs. Looking at Nintendo's hardware history of making the hardware do as much of the work as possible, I would definitely expect them to use a current tessellation unit. If they are listening to devs as much as it sounds, I doubt they would steer them wrong.

I hope to god this is true (unless it has a completely generic dudebro aesthetic), because I want to see Nintendo step beyond their historical competencies and actually try to compete with Sony and MS for that market.

My sentiments exactly. And it goes back to some of the early discussions, in that Nintendo has to establish what the Wii U will be. Whether intentional or not, they established the Wii as a non-gamer console, and the majority of the games released followed suit. Their challenge will be balancing that so it isn't totally skewed toward being a non-gamer console again.
 
The difference in the new generation will not be like Wii versus PS360; it might be similar to PS360 versus a modern PC, though.

However, look at the sales of PS360 games not being affected by their graphical inferiority. I wouldn't worry about the Wii U so much.
 
I really dunno if this gives us an answer about the GPU or not (probably not), but here's AMD's press release on the Wii U:

AMD and Nintendo Join Forces To Create A New Way To Enjoy Console Gaming Entertainment
AMD’s custom HD graphics processor enables immersive HD multimedia gaming entertainment for Nintendo’s new Wii U™ console



LOS ANGELES —6/7/2011
Today at E3, AMD (NYSE: AMD) announced its support for Nintendo’s newly-announced Wii U™ system, as a new way to enjoy HD console gaming entertainment. The custom AMD Radeon™ HD GPU reflects the best characteristics of AMD’s graphics technology solutions: high-definition graphics support; rich multimedia acceleration and playback; and multiple display support. As an industry leader, AMD has supplied the game console market with graphics expertise and ongoing support for more than 10 years.

“We greatly value our synergistic relationship with the AMD design team. The AMD custom graphics processor delivers the best of AMD’s world-class graphics expertise. AMD will support our vision of innovating play through unique entertainment experiences," said Genyo Takeda, senior managing director, Integrated Research & Development of Nintendo Co. Ltd.

“AMD shares Nintendo’s excitement for the new HD entertainment experience planned for the Wii U console,” said David Wang, corporate vice president of Silicon Engineering, AMD. “We’re proud to provide our leading-edge HD multimedia graphics engine to power the new entertainment features of the console. Nintendo is a highly-valued customer and we look forward to the launch in 2012.”

AMD custom graphics enable the new Nintendo system to provide exciting, immersive game play and interaction for consumers around the world. The AMD custom graphics processor features a modern and rich graphics processing core, allowing the new console to shine with new graphics capabilities.


http://www.amd.com/us/press-releases/Pages/amd-and-nintendo-join-2011june07.aspx
 
That much is obvious. Look at who and what they've been hiring. My money is on some kind of Metroid spinoff, but the revival of another old Adventure series isn't out of the question.

It is really not obvious; it's conjecture, not by any means certain. There is no certainty whether Retro Studios' next game is for the 3DS or the Wii U, or even when it will release. Anything else is subjective speculation or a fanboy dream at this point.
 
Meh, I'll be happy just with something like this...

[image]
:p
 
It is really not obvious; it's conjecture, not by any means certain. There is no certainty whether Retro Studios' next game is for the 3DS or the Wii U, or even when it will release. Anything else is subjective speculation or a fanboy dream at this point.
They've been hiring people with very "mature" artistic backgrounds. People who have worked on very intensive engines. People who create large worlds.
All that screams anything but a new Donkey Kong, and I'd be shocked if it was for the 3DS and not Wii U.
 
Regarding this Shader Model 5.0 (HLSL) talk, the greatest reason (in my opinion) for Nintendo to utilize something equivalent would be if they could get compute shaders running under GLSL or whatever API they are using. While compute shaders (also known as DirectCompute) are heavily bound to Microsoft, AMD and NVIDIA have worked much with it and I do not believe it's impossible for AMD to provide an API that can mimic what HLSL 5.0 does with compute shaders, or they could in conjunction with the Khronos Group supply OpenGL under a certain designation.

Here's a direct comparison of using compute shaders on Direct3D 10 and Direct3D 11 hardware:

Using Compute Shader on Direct3D 10.x Hardware

A compute shader on Microsoft Direct3D 10 is also known as DirectCompute 4.x.

If you use the Direct3D 11 API and updated drivers, feature level 10 and 10.1 Direct3D hardware can optionally support a limited form of DirectCompute that uses the cs_4_0 and cs_4_1 profiles. When you use DirectCompute on this hardware, keep the following limitations in mind:
The maximum number of threads is limited to D3D11_CS_4_X_THREAD_GROUP_MAX_THREADS_PER_GROUP (768) per group.
The X and Y dimension of numthreads is limited to D3D11_CS_4_X_THREAD_GROUP_MAX_X (768) and D3D11_CS_4_X_THREAD_GROUP_MAX_Y (768).
The Z dimension of numthreads is limited to 1.
The Z dimension of dispatch is limited to D3D11_CS_4_X_DISPATCH_MAX_THREAD_GROUPS_IN_Z_DIMENSION (1).
Only one unordered-access view can be bound to the shader (D3D11_CS_4_X_UAV_REGISTER_COUNT is 1).
Only RWStructuredBuffers and RWByteAddressBuffers are available as unordered-access views.
A thread can only access its own region in groupshared memory for writing, though it can read from any location.
SV_GroupIndex or SV_DispatchThreadID must be used when accessing groupshared memory for writing.
Groupshared memory is limited to 16KB per group.
A single thread is limited to a 256 byte region of groupshared memory for writing.
No atomic instructions are available.
No double-precision values are available.

Using Compute Shader on Direct3D 11.x Hardware

A compute shader on Direct3D 11 is also known as DirectCompute 5.0.

When you use DirectCompute with cs_5_0 profiles, keep the following items in mind:
The maximum number of threads is limited to D3D11_CS_THREAD_GROUP_MAX_THREADS_PER_GROUP (1024) per group.
The X and Y dimension of numthreads is limited to D3D11_CS_THREAD_GROUP_MAX_X (1024) and D3D11_CS_THREAD_GROUP_MAX_Y (1024).
The Z dimension of numthreads is limited to D3D11_CS_THREAD_GROUP_MAX_Z (64).
The maximum dimension of dispatch is limited to D3D11_CS_DISPATCH_MAX_THREAD_GROUPS_PER_DIMENSION (65535).
The maximum number of unordered-access views that can be bound to a shader is D3D11_PS_CS_UAV_REGISTER_COUNT (8).
Supports RWStructuredBuffers, RWByteAddressBuffers, and typed unordered-access views (RWTexture1D, RWTexture2D, RWTexture3D, and so on).
Atomic instructions are available.
Double-precision support might be available. For information about how to determine whether double-precision is available, see D3D11_FEATURE_DOUBLES.
Source: http://msdn.microsoft.com/en-us/library/windows/desktop/ff476331(v=vs.85).aspx

Now, the larger number of threads available when using compute shaders on Direct3D 11 hardware is easy to see, as is why Nintendo would want it, but more interestingly, atomic instructions are available under this mode. I'm not even going to try to explain atomic instructions, as that more or less requires several sheets of paper, but here's a common definition:

A concurrent object is a data object shared by concurrent processes. Linearizability is a correctness condition for concurrent objects that exploits the semantics of abstract data types. It permits a high degree of concurrency, yet it permits programmers to specify and reason about concurrent objects using known techniques from the sequential domain. Linearizability provides the illusion that each operation applied by concurrent processes takes effect instantaneously at some point between its invocation and its response, implying that the meaning of a concurrent object's operations can be given by pre- and post-conditions. This paper defines linearizability, compares it to other correctness conditions, presents and demonstrates a method for proving the correctness of implementations, and shows how to reason about concurrent objects, given they are linearizable.
Simply put, linearizability is more expansive and easier to use than general sequential consistency, and linearizability also provides slightly lower development costs over its "competitor". If I were a developer creating Wii U software, I'd be delighted to have something like this to work with (as well as POR and a good overview of possible primitive types). Other than this and some buffers, HLSL 5 doesn't excite me too much, though of course I see the immediate advantages over HLSL 4.1. I can see Nintendo using an HLSL 5 equivalent if they want to future-proof the Wii U, but if they are just aiming for a current-gen-compliant offering they would surely go with 4.1. But seeing as Epic has been vouching for Unreal Engine 4 lately, and given its more or less inevitable success, I think the first option is more likely, if not for effects and such then for administrative work and development time.

Here's also a presentation by AMD regarding this (from SIGGRAPH '08): http://s08.idav.ucdavis.edu/boyd-dx11-compute-shader.pdf
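
To give a rough idea of what those atomic instructions actually buy you, here's a quick histogram sketch. It's written in CUDA purely because that's what I can type from memory; the Wii U's real API and shading language obviously aren't public, and the file name, kernel and bin count are all made up for illustration. The point is just that with cs_5_0-class atomics, thousands of threads can bump shared counters safely in a single dispatch, whereas on cs_4_x hardware you'd be pushed toward per-thread buffers and an extra reduction pass.

// histogram_atomics.cu -- illustrative sketch only, not Wii U code.
// Rough analog of what DirectCompute 5.0 atomic instructions enable.
#include <cstdio>
#include <cuda_runtime.h>

#define NUM_BINS 16

__global__ void histogram(const unsigned char* data, int n, unsigned int* bins)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        // Atomic add: safe even when thousands of threads hit the same bin.
        atomicAdd(&bins[data[i] % NUM_BINS], 1u);
    }
}

int main()
{
    const int n = 1 << 20;
    unsigned char* h_data = new unsigned char[n];
    for (int i = 0; i < n; ++i) h_data[i] = (unsigned char)(i * 31);

    unsigned char* d_data;
    unsigned int* d_bins;
    cudaMalloc((void**)&d_data, n);
    cudaMalloc((void**)&d_bins, NUM_BINS * sizeof(unsigned int));
    cudaMemcpy(d_data, h_data, n, cudaMemcpyHostToDevice);
    cudaMemset(d_bins, 0, NUM_BINS * sizeof(unsigned int));

    // 256 threads per group, analogous to a numthreads()/Dispatch() pair.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    histogram<<<blocks, threads>>>(d_data, n, d_bins);

    unsigned int h_bins[NUM_BINS];
    cudaMemcpy(h_bins, d_bins, sizeof(h_bins), cudaMemcpyDeviceToHost);
    for (int b = 0; b < NUM_BINS; ++b)
        printf("bin %2d: %u\n", b, h_bins[b]);

    cudaFree(d_data);
    cudaFree(d_bins);
    delete[] h_data;
    return 0;
}

Compile it with nvcc and it prints sixteen roughly equal bins. The rough HLSL equivalent would be a RWStructuredBuffer plus InterlockedAdd inside a compute shader, but the idea is the same: one pass, no manual synchronization gymnastics.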
 
It is really not obvious; it's conjecture, not by any means certain. There is no certainty whether Retro Studios' next game is for the 3DS or the Wii U, or even when it will release. Anything else is subjective speculation or a fanboy dream at this point.

Pretty much, it's all up in the air really. I like the shenanigans about Retro Zelda, new-old hotness (StarTropics) and a new IP, but just from the main choices you can see that no one knows anything about where they're going with their current hiring or development. Which sucks since they're always the Nintendo branch that seems quietest for no reason.
 
They've been hiring people with very "mature" artistic backgrounds. People who have worked on very intensive engines. People who create large worlds.
All that screams anything but a new Donkey Kong, and I'd be shocked if it was for the 3DS and not Wii U.

But don't you see, they are making a Donkey Kong FPS.
 
Can everyone give a very detailed breakdown of what everything does and what it implies, for reference? ROPs, shaders, RAM types (especially in regard to latency and its relationship with bus width and the chip complexity of said bus), tessellators, etc. Would be nice :D

I second that. Can someone give some links to good introductory sources on modern graphics tech?

I know bg you gave me something a while ago, but I can't find it.

Hopefully these links can do a better job of fulfilling your requests than I could.

http://http.developer.nvidia.com/GPUGems/gpugems_ch28.html

http://duriansoftware.com/joe/An-intro-to-modern-OpenGL.-Chapter-1:-The-Graphics-Pipeline.html
 
Search works great, I've already backed it up.

Industry revenue this gen is higher than it's ever been by a massive margin, and the cycle is really long; it shouldn't be a surprise to anyone that the two leading consoles are going to end up the two most profitable consoles in history.

Your evasiveness notwithstanding, I'd like to see the numbers you base your assessment on.

Quick tip: Industry Revenue being up doesn't mean the 360 is the second most profitable home console ever.

I'm willing to believe you, but you have to provide proof to back up your claim. Telling me to search for it myself is like telling me to wave your own hand over your ass after you fart in an elevator.

Also, unless I'm mistaken, despite MS's recent gains in the last few years, they still incurred a $3-4 billion loss in the first two years of the 360, meaning it would have had to set industry records for the remaining four years up to now in order to beat the PS2 or the N64 or what have you. Nintendo was pulling down billion-dollar annual profits in the mid-nineties, and that's before inflation jacked up the dollar amounts in recent times.
 
Which sucks since they're always the Nintendo branch that seems quietest for no reason.

Can you name a boisterous branch of Nintendo? I certainly can not. The entire company operates in secrecy. It is just their tradition.

They've been hiring people with very "mature" artistic backgrounds. People who have worked on very intensive engines. People who create large worlds.
All that screams anything but a new Donkey Kong, and I'd be shocked if it was for the 3DS and not Wii U.

The graphic designers who worked on Donkey Kong Country Returns didn't work on Little Mermaid Pinball before. You cannot positively assume anything from two random hires.
 
Rösti;36237342 said:
Regarding this Shader Model 5.0 (HLSL) talk, the greatest reason (in my opinion) for Nintendo to utilize something equivalent would be if they could get compute shaders running under GLSL or whatever API they are using. While compute shaders (also known as DirectCompute) are heavily bound to Microsoft, AMD and NVIDIA have worked much with it and I do not believe it's impossible for AMD to provide an API that can mimic what HLSL 5.0 does with compute shaders, or they could in conjunction with the Khronos Group supply OpenGL under a certain designation.


Thanks for your reply, Rösti. I cannot pretend I understand 3/4 of that, though. Perhaps you could put it in more layman's terms, as much as possible, especially for those of us who lack the high level of knowledge that you clearly have.
 
Also, unless I'm mistaken, despite MS's recent gains in the last few years, they still incurred a $3-4 billion loss in the first two years of the 360, meaning it would have had to set industry records for the remaining four years up to now in order to beat the PS2 or the N64 or what have you. Nintendo was pulling down billion-dollar annual profits in the mid-nineties, and that's before inflation jacked up the dollar amounts in recent times.

That's why I'm not readily willing to believe that claim nor feel I need to verify it on my own.
 
It's nothing if not surprising: we had Autodesk in yesterday, and the presenter was showing us videos and what have you, and discussing Nintendo having licensed their products. The agreement to start using Autodesk software only came in February, when it was announced. I would've thought Nintendo had been using Autodesk for quite some time.
 
Rösti;36237342 said:
Regarding this Shader Model 5.0 (HLSL) talk, the greatest reason (in my opinion) for Nintendo to utilize something equivalent would be if they could get compute shaders running under GLSL or whatever API they are using. While compute shaders (also known as DirectCompute) are heavily bound to Microsoft, AMD and NVIDIA have worked much with it and I do not believe it's impossible for AMD to provide an API that can mimic what HLSL 5.0 does with compute shaders, or they could in conjunction with the Khronos Group supply OpenGL under a certain designation.
Perhaps OpenCL?
 
It's nothing if not surprising: we had Autodesk in yesterday, and the presenter was showing us videos and what have you, and discussing Nintendo having licensed their products. The agreement to start using Autodesk software only came in February, when it was announced. I would've thought Nintendo had been using Autodesk for quite some time.

Though the main thing about the agreement was that Nintendo will be paying for/giving those tools to all licensed Wii U developers. It was one of the "shocking" things during GDC.
 
I really dunno if this gives us an answer about the GPU or not (probably not), but here's AMD's press release on the Wii U:

http://www.amd.com/us/press-releases/Pages/amd-and-nintendo-join-2011june07.aspx
The press release is mostly interesting in that it not only doesn't contradict, but actually supports, what an AMD spokesperson said at E3: the Wii U GPU is brand new, leading edge, built on AMD's expertise, but still completely different from any existing AMD GPU. Why spend more than two years and millions of dollars if all you end up with is something any mid-range GPU from two years ago could do better?
 
Can you name a boisterous branch of Nintendo? I certainly can not. The entire company operates in secrecy. It is just their tradition.

Oh, it's not about them being outspoken; we all know Nintendo likes their workers quiet and efficient. It's more that you never really get interviews or comments from Retro like you do from the cavalry at NoJ: Miyamoto, Iwata, Sakurai and, if we're on a lucky streak, Koizumi or Hayashida (sp?).

It's almost like Retro has no voice.
 
But you would think they would have done this a lot sooner than February.
It's an incentive more than anything, probably to get more people on board. Until now, developers had to pay a few thousand for those tools, so Nintendo providing them "for free" was a nice move.

It also shows that many things have been changing around the Wii U, and who knows what else might be announced. Even stuff like NFC: some developers are excited about it and experimenting with it, but can't talk about it.
 
Oh, it's not about them being outspoken; we all know Nintendo likes their workers quiet and efficient. It's more that you never really get interviews or comments from Retro like you do from the cavalry at NoJ: Miyamoto, Iwata, Sakurai and, if we're on a lucky streak, Koizumi or Hayashida (sp?).

It's almost like Retro has no voice.

Retro's voice is Tanabe, isn't it?
 
Well, OpenCL provides some atomic functions, though not as consistently as Direct3D, and I suppose Nintendo could make good use of it. I can't see anything resembling a compute shader for it, though.

And herzogzwei1989, I tried to come up with a simple explanation of linearizability vs. sequential consistency, but I failed. It's not that interesting anyway and doesn't really tell us anything about whether we'll see Unreal Engine 4 or any other fancy engine on Wii U; it would simply mean simplifying work a bit, and if you really want to stretch it, it could mean some good use of memory processes. The compute shader, though, is described in intelligible language on Microsoft's page, which I linked, so have a look at that. And to note, I made an error earlier: linearizability is not cheaper than sequential consistency, at least in most systems.
 
The press release is mostly interesting in that it not only doesn't contradict, but actually supports, what an AMD spokesperson said at E3: the Wii U GPU is brand new, leading edge, built on AMD's expertise, but still completely different from any existing AMD GPU. Why spend more than two years and millions of dollars if all you end up with is something any mid-range GPU from two years ago could do better?

Is there any news I'm not following about the GPU?
 
One thing that we forget is how actively MS/Sony seek out and take care of third-party relations. I say this in light of Kojima's comments. Nintendo has a lot to prove this gen; if they don't get support from the start, they will repeat the Wii cycle again. If a specific developer is not sure about the Wii U (read: Kojima), I think it's Nintendo's job to try to convince them to get on board. Sony/MS didn't get this massive support by waiting for third parties; they go after them all the time.

I say Iwata should call Kojima personally and say "what's wrong with you?"
 
Is there any news I'm not following about the GPU?
Maybe?

The AMD press release says the GPU is modern and built on AMD's expertise, but doesn't say it's based on some existing Radeon chip. An AMD spokesperson said at E3 2011 that the GPU was custom and not based on any existing AMD GPU (as reported by German tech site Golem.de). And according to members of the team (via Linkedin), AMD started working on the GPU in June 2009 or earlier, and finished in June 2011 or later.
 
I just went to the B3D messageboards and a long time poster wanted me to tell you maniacs that:



Get it together you freaking maniacs. :P

Hype gone.

My guess is DKCR 2 is a holiday 2013 title.
 
That much is obvious. Look at who and what they've been hiring. My money is on some kind of Metroid spinoff, but the revival of another old Adventure series isn't out of the question.

Startropics??

:D
 
One thing that we forget is how actively MS/Sony seek out and take care of third-party relations. I say this in light of Kojima's comments. Nintendo has a lot to prove this gen; if they don't get support from the start, they will repeat the Wii cycle again. If a specific developer is not sure about the Wii U (read: Kojima), I think it's Nintendo's job to try to convince them to get on board. Sony/MS didn't get this massive support by waiting for third parties; they go after them all the time.

I say Iwata should call Kojima personally and say "what's wrong with you?"

I would love that.

Hell, I'd love Iwata to publicly call out shitty excuses, excuses that don't hold up particularly well under scrutiny. It'd be hysterical to witness... although I'm sure some media folks would portray Nintendo as the big bad bully in that kind of scenario.
 
I want a Fire Emblem U so bad, but I think the series will be portable-only from now on =(

wsippel said:
Maybe?

The AMD press release says the GPU is modern and built on AMD's expertise, but doesn't say it's based on some existing Radeon chip. An AMD spokesperson said at E3 2011 that the GPU was custom and not based on any existing AMD GPU (as reported by German tech site Golem.de). And according to members of the team (via Linkedin), AMD started working on the GPU in June 2009 or earlier, and finished in June 2011 or later.

Do want. I really like those little marvels; I miss the days of custom chips like Gekko and Flipper.
Didn't the Wii use something like a custom CPU/GPU?
 
The press release is mostly interesting in that it not only doesn't contradict, but actually supports, what an AMD spokesperson said at E3: the Wii U GPU is brand new, leading edge, built on AMD's expertise, but still completely different from any existing AMD GPU. Why spend more than two years and millions of dollars if all you end up with is something any mid-range GPU from two years ago could do better?

Man, as much as I want a custom RV770 / HD 4850/70, the other part of me wants a completely custom GPU that is as innovative as Flipper was in 1999-2000 (even at launch in 2001) in the GameCube. That was one of the best pieces of console GPU engineering I have ever seen.

I agree: why would Nintendo spend many, many millions of dollars on something a mid-range GPU from two years ago could do? I hope AMD/Nintendo surprise us with a delightful new marvel of engineering.
 
Hype gone.

My guess is DKCR 2 is a holiday 2013 title.


Please, for the love of God, let Retro's title NOT be DKCR 2 for 2012/2013. Please, if they must, revisit the game later, but I'd like something crazy to happen, like Retro collaborating with EAD on Zelda. Or at least something sensible like F-ZERO or Star Fox, or even, heaven forbid, Metroid.
 
I say Retro's next game is a sequel to DKCR for 3DS. And although that wouldn't necessarily be a bad thing, the meltdowns on here would be glorious.

Really though, a Zelda spinoff is likely because my intuitive guts tell me so.
 
All the Far Cry talk made me think about a Wii U port Ubisoft would probably never do.
The subscreen would make the level editor easier to use.
 
Maybe?

The AMD press release says the GPU is modern and built on AMD's expertise, but doesn't say it's based on some existing Radeon chip. An AMD spokesperson said at E3 2011 that the GPU was custom and not based on any existing AMD GPU (as reported by German tech site Golem.de). And according to members of the team (via Linkedin), AMD started working on the GPU in June 2009 or earlier, and finished in June 2011 or later.

But who said it was midrange?
 
Retro's next game will be an open-world, post-apocalyptic, survival-horror reboot of the Ice Climber franchise where Nana and Popo must beat radioactive seals to a bloody death with giant hammers.
 