Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

When Reggie says 1080P, I assume he means up-scaled or compatible.

Only the more graphically basic games will render 1080P native. Hell, I'm expecting a lot of PS720 games to render below 1080P.

knowing nintendo, a potential mario galaxy 3 would look pretty similar to 1 and 2 but would be running in 1080p at 60 fps.
they're weird when it comes to graphics.
 
1080p needs to be the standard with no exceptions. I know it won't happen, but it's pretty sad that people have these sets and play games that look like ass on them due to sub-HD resolutions. This and microtransactions are the worst things about gaming now imo. Pick a resolution (like how it used to be) and stick with it.
 
Only the more graphically basic games will render 1080P native. Hell, I'm expecting a lot of PS720 games to render below 1080P.
I'm thinking the Wii U will probably not suffer as much of a performance hit when rendering 1080p. This means that it's a more viable tradeoff to render games at 1080p if it fits the game. Pikmin 3 could really benefit from being rendered in 1080p. The same goes for Super Smash Bros. Lots of Nintendo games are not about visual fidelity per se, but could benefit from more pixels on the screen.
 
When Reggie says 1080P, I assume he means up-scaled or compatible.

Only the more graphically basic games will render 1080P native. Hell, I'm expecting a lot of PS720 games to render below 1080P.

Brain_stew mentioned on B3D that the eDRAM will be able to do "720p w/ MSAA or 1080p rendering in a single pass".
 
Brain_stew mentioned on B3D that the eDRAM will be able to do "720p w/ MSAA or 1080p rendering in a single pass".

So this means we could see every game in 1080p native w/o MSAA or displayed in 720p with it? If this is the case, would there be any reason not to display in 1080p?
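
For a rough sense of the sizes involved, here's a quick back-of-the-envelope sketch in C. The 32-bit color and 32-bit depth/stencil formats are my assumption, and the actual eDRAM size is unconfirmed, so treat this as illustrative only:

```c
#include <stdio.h>

/* Back-of-the-envelope framebuffer sizes, assuming a 32-bit color
 * buffer plus a 32-bit depth/stencil buffer per sample. The actual
 * eDRAM size and formats are unconfirmed. */
static double buffer_mb(int w, int h, int samples)
{
    const int bytes_per_sample = 4 /* color */ + 4 /* depth/stencil */;
    return (double)w * h * samples * bytes_per_sample / (1024.0 * 1024.0);
}

int main(void)
{
    printf("720p  no AA  : %5.1f MB\n", buffer_mb(1280, 720, 1));  /* ~ 7.0 */
    printf("720p  2xMSAA : %5.1f MB\n", buffer_mb(1280, 720, 2));  /* ~14.1 */
    printf("720p  4xMSAA : %5.1f MB\n", buffer_mb(1280, 720, 4));  /* ~28.1 */
    printf("1080p no AA  : %5.1f MB\n", buffer_mb(1920, 1080, 1)); /* ~15.8 */
    printf("1080p 4xMSAA : %5.1f MB\n", buffer_mb(1920, 1080, 4)); /* ~63.3 */
    return 0;
}
```

It shows why "720p w/ MSAA or 1080p" land in a similar ballpark. But note 1080p is ~2.07 million pixels against ~0.92 million at 720p, about 2.25x, so fill and shading costs scale the same way even when the buffers fit.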
 
Going back and watching the HD Zelda demo, I can't fathom how people think Uncharted 3 or any current gen game looks better. The lighting is far and away superior to anything on current consoles, as far as I've seen anyway.
 
Going back and watching the HD Zelda demo, I can't fathom how people think Uncharted 3 or any current gen game looks better. The lighting is far and away superior to anything on current consoles, as far as I've seen anyway.

History also shows that Nintendo has always made their Zeldas look even better than their tech demos, so we'll see.
 
knowing nintendo, a potential mario galaxy 3 would look pretty similar to 1 and 2 but would be running in 1080p at 60 fps.
they're weird when it comes to graphics.

And look stunning. Third party games will mostly do 720P/30fps and look 70-80% as good as PS720.

bgassassin said:
Brain_stew mentioned on B3D that the eDRAM will be able to do "720p w/ MSAA or 1080p rendering in a single pass".

Generally speaking, games like Mario Galaxy could use the eDRAM for 1080P and Call of Duty for example could use it to do 720P w/MSAA.

Until more solid specs leak out, it's all guesswork though.
 
Does anyone know if there is an IR blaster under the IR window on the tablet, or if it's just an IR camera sensor?

I'm wondering if it could be used as a universal remote of some sort.

Will the tablet be smart enough to do anything independently of the console? - probably not
 
So this means we could see every game in 1080p native w/o MSAA or displayed in 720p with it? If this is the case, would there be any reason not to display in 1080p?

More like it could be done without putting the burden on the main memory, which has less bandwidth than the eDRAM. So they can go beyond that (1080p with some version of AA); they just have to use the system memory. That in turn takes available memory away from other things it could be used on. That's why you'll see people saying console makers won't set a resolution standard: devs will probably prefer a lower resolution and put the extra resources into image quality (IQ).
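
To put very rough numbers on that burden, here's a sketch in C. Each covered pixel is assumed to cost a depth read, a depth write and a color write; the 3x overdraw and 60 fps figures are made up, and real traffic (blending, MSAA, texture fetches) would be higher still:

```c
#include <stdio.h>

/* Lower-bound framebuffer traffic per second. Assumes each covered
 * pixel costs a depth read, a depth write and a color write (4 bytes
 * each); the 3x overdraw and 60 fps are illustrative numbers only. */
int main(void)
{
    const double bytes_per_touch = 4 + 4 + 4;
    const double overdraw = 3.0, fps = 60.0;
    const int res[2][2] = { {1280, 720}, {1920, 1080} };

    for (int i = 0; i < 2; i++) {
        double gbs = (double)res[i][0] * res[i][1] * bytes_per_touch
                     * overdraw * fps / 1e9;
        printf("%4dx%-4d: ~%.1f GB/s of framebuffer traffic\n",
               res[i][0], res[i][1], gbs);
    }
    return 0;
}
```

Even these lower-bound numbers (~2 GB/s at 720p, ~4.5 GB/s at 1080p) would be a meaningful slice of a main memory bus that the CPU and texture streaming also share; keeping the framebuffer in eDRAM makes that slice essentially free.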

going back and watching the HD zelda demo, I can't fathom how people think uncharted 3 or any current gen game looks better. The lighting is so far and away superior to anything on current consoles, as far as I've seen anyway.

From what I've seen of U3, it's probably because Naughty Dog did a very good job with how they added pre-rendered scenes in with actual gameplay.
 
I've said it before in this thread and I'm going to state it again.

The thing that separates the consoles from each other next gen isn't going to be graphics...
Nintendo is going to be about their tablet controller, and who knows with regard to their online experience.

I think their online will be pretty decent, if only because publishers in Japan are pushing for online games going forward, so Nintendo will at least want to accommodate them.

Regarding their tablet, I think they will also introduce the Vitality Sensor built into the tablet. They claimed the technology wasn't finalized yet, but I think this is what they will be working on.
 
Brain_stew mentioned on B3D that the eDRAM will be able to do "720p w/ MSAA or 1080p rendering in a single pass".
I don't understand AA algorithms much. The last few years, people seem to have been getting very excited about FXAA and MLAA as AA algorithms that avoid the extra bandwidth and memory usage in exchange for some processing power. Could Nintendo have one such algorithm implemented on the Wii U's intelligent eDRAM for free AA? Because it seems MSAA requires extra space on the eDRAM, and that seems to be quite a limitation on a fixed-size framebuffer.
 
The question is how much will that difference translate into what we end up seeing.

In terms of software releases it won't matter at all for the first 2-3 years. After that, though? It could become problematic if it's towards the upper end and the Wii U hasn't established itself as a strong seller of third party software.
 
I don't understand AA algorithms much. The last few years, people seem to have been getting very excited about FXAA and MLAA as AA algorithms that avoid the extra bandwidth and memory usage in exchange for some processing power. Could Nintendo have one such algorithm implemented on the Wii U's intelligent eDRAM for free AA? Because it seems MSAA requires extra space on the eDRAM, and that seems to be quite a limitation on a fixed-size framebuffer.

Post-processing algorithms can't deal with sub-pixel aliasing problems properly, so there's still a place for MSAA, particularly if you can implement it with an utterly trivial performance cost.
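
Here's a toy C sketch of what sub-pixel aliasing means (the feature width and sample positions are made up): a feature thinner than a pixel can fall between single per-pixel samples entirely, so it never shows up in the image a post filter gets to work on, while MSAA's extra coverage samples still catch it.

```c
#include <stdio.h>

/* Coverage of a feature spanning [f0, f1) within one pixel covering
 * [0, 1), tested at n regularly spaced sample points. */
static double coverage(double f0, double f1, int n)
{
    int hits = 0;
    for (int i = 0; i < n; i++) {
        double s = (i + 0.5) / n;  /* sample position inside the pixel */
        if (s >= f0 && s < f1)
            hits++;
    }
    return (double)hits / n;
}

int main(void)
{
    /* A 0.3-pixel-wide feature sitting between sample points. */
    double f0 = 0.05, f1 = 0.35;
    printf("1 sample/pixel : coverage %.2f\n", coverage(f0, f1, 1));
    printf("4 samples/pixel: coverage %.2f\n", coverage(f0, f1, 4));
    return 0;
}
```

At one sample per pixel the feature registers 0% coverage, so it simply vanishes (and flickers as it moves); 4x sampling reports 25%, close to the true 30%. No post filter can recover what was never sampled.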

So this means we could see every game in 1080p native w/o MSAA or displayed in 720p with it? If this is the case, would there be any reason not to display in 1080p?

You're still doubling the number of pixels to fill and shade by rendering at 1080p. Being able to fit your whole framebuffer into super fast eDRAM will certainly help performance, but it's only one piece of the puzzle.

More like it could be done without putting the burden on the main memory, which has less bandwidth than the eDRAM. So they can go beyond that (1080p with some version of AA); they just have to use the system memory. That in turn takes available memory away from other things it could be used on. That's why you'll see people saying console makers won't set a resolution standard: devs will probably prefer a lower resolution and put the extra resources into image quality (IQ).

No, in such a situation, you'd simply employ tiling, like you do if you want MSAA in a HD X360 game. In reality, most developers will probably use a post-process AA for native 1080p games or even opt for something like 1280x1080p w/2xmsaa and a native 1080p HUD, as that would fit into the eDRAM just fine.
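
For rough numbers on that suggestion, assuming 32-bit color and 32-bit depth/stencil per sample (the formats are an assumption, and the final eDRAM size is unconfirmed):

```c
#include <stdio.h>

/* Sizing the suggested hybrid: a 1280x1080 scene with 2xMSAA plus a
 * color-only native 1080p HUD layer, vs. full 1080p with 4xMSAA.
 * Formats (32-bit color, 32-bit depth/stencil) are assumptions. */
int main(void)
{
    double scene = 1280.0 * 1080 * 2 /* samples */ * (4 + 4) / (1024 * 1024);
    double hud   = 1920.0 * 1080 * 4 /* color only */      / (1024 * 1024);
    double full  = 1920.0 * 1080 * 4 /* samples */ * (4 + 4) / (1024 * 1024);

    printf("1280x1080 2xMSAA scene : %4.1f MB\n", scene);       /* ~21.1 */
    printf("native 1080p HUD       : %4.1f MB\n", hud);         /* ~ 7.9 */
    printf("hybrid total           : %4.1f MB\n", scene + hud); /* ~29.0 */
    printf("full 1080p 4xMSAA      : %4.1f MB\n", full);        /* ~63.3 */
    return 0;
}
```

So the hybrid lands around 29 MB where full 1080p with 4xMSAA would want over 60 MB.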

I'm just repeating his post, which I'm assuming came directly from Nintendo for it to be that specific. So whatever components there are to the equation must have already been calculated by Nintendo to say that.

Just because you can fit a 1080p framebuffer in eDRAM doesn't say anything about its complexity. It gets rid of one of the biggest obstacles to 3D graphics (bandwidth), but you're still left with ~2 million pixels to fill and shade.

If the eDRAM is more a general pool of memory (as I suspect it is) rather than the 360's scratch pad, then many developers will probably want to use that space for multiple other buffers, and rendering at 1080p is going to ruin your chances of that.
 
Post processing algorithms can't deal with sub-pixel aliasing problems properly, there's still a place for MSAA, particularly if you can implement it with an utterly trivial performance cost.
Ok, I sort of understand that. The reason I asked was that if the Wii U can render 1080p frames on the eDRAM, then implementing FXAA or MLAA in the eDRAM would allow 'free' 1080p with 'free' jaggie removal, right? So do you think FXAA/MLAA could be hardwired in the Wii U? Or would a shader-based implementation be a viable choice as well?
 
Actually I could do with a couple more buttons, especially when playing FPS, e.g. for strafing or for crouching.

Let me make my point a little clearer.
The Nunchuk has only two buttons; if you set one for jumping and the other for running, while the A and B buttons on the Wii Remote are usually meant for the main and secondary weapons, then even considering the 1, 2, - and + buttons I still feel there's a lack of options.

Something like this?

cafe_remote.jpg


The slide pad and the directional buttons would all read as D-Pad in backwards compatibility mode.
 
Ok, I sort of understand that. The reason I asked was that if the Wii U can render 1080p frames on the eDRAM, then implementing FXAA or MLAA in the eDRAM would allow 'free' 1080p with 'free' jaggie removal, right? So do you think FXAA/MLAA could be hardwired in the Wii U? Or would a shader-based implementation be a viable choice as well?

There's absolutely no need to hardwire a post AA effect like that into fixed function hardware. Modern GPUs (like the Wii U's) can get good results in under a ms with general purpose hardware and the algorithms are evolving and improving all the time. If a vastly improved algorithm comes around that is easy and cheap to implement in a shader (like SMAA for example) then you've just gone and wasted a chunk of transistors.

There's no such thing as "free" 1080p rendering. The eDRAM would fix the issue of the increased bandwidth necessary for rendering in 1080p, nothing more, nothing less.
 
Ok, I sort of understand that. The reason I asked was that if the Wii U can render 1080p frames on the eDRAM, then implementing FXAA or MLAA in the eDRAM would allow 'free' 1080p with 'free' jaggie removal, right? So do you think FXAA/MLAA could be hardwired in the Wii U? Or would a shader-based implementation be a viable choice as well?


FXAA is just another shader post-process, so hardwiring it would make as much sense as hardwiring depth of field or bloom or motion blur, i.e. there is no single implementation.

MLAA can be done on the GPU, but it's not very well tailored to how GPUs operate. That's one of the reasons why FXAA was developed in the first place. It's also worth noting continued development of FXAA is making it much more viable with better quality than MLAA in motion. MLAA is kind of a dead end for performance considerations, but it depends on the hardware situation, with PS3 being somewhat unique in terms of SPU utilization.
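
For anyone wondering what "just another shader post-process" means in practice, here's a toy CPU sketch of the general idea. It's nothing like production FXAA, and the 0.1 contrast threshold and 0.5 blend strength are made-up tuning values; the point is just that it's ordinary code run over the finished image, so no fixed-function hardware is needed:

```c
#include <math.h>
#include <stdio.h>

typedef struct { float r, g, b; } Px;

/* Rec. 601-style luminance. */
static float luma(Px p) { return 0.299f * p.r + 0.587f * p.g + 0.114f * p.b; }

/* Estimate edges from luma contrast against the 4 neighbors, then
 * blend edge pixels toward the neighbor average. Borders are copied. */
static void toy_post_aa(const Px *src, Px *dst, int w, int h)
{
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            Px c = src[y * w + x];
            dst[y * w + x] = c;
            if (x == 0 || y == 0 || x == w - 1 || y == h - 1) continue;
            Px n = src[(y - 1) * w + x], s = src[(y + 1) * w + x];
            Px e = src[y * w + x + 1],  o = src[y * w + x - 1];
            float lc = luma(c);
            float contrast =
                fmaxf(fmaxf(fabsf(luma(n) - lc), fabsf(luma(s) - lc)),
                      fmaxf(fabsf(luma(e) - lc), fabsf(luma(o) - lc)));
            if (contrast < 0.1f) continue;  /* no visible edge here */
            /* Blend toward the neighbor average across the edge. */
            dst[y * w + x].r = c.r + ((n.r + s.r + e.r + o.r) * 0.25f - c.r) * 0.5f;
            dst[y * w + x].g = c.g + ((n.g + s.g + e.g + o.g) * 0.25f - c.g) * 0.5f;
            dst[y * w + x].b = c.b + ((n.b + s.b + e.b + o.b) * 0.25f - c.b) * 0.5f;
        }
}

int main(void)
{
    /* 3x3 test image: a hard white/black vertical edge. */
    Px img[9], out[9];
    for (int i = 0; i < 9; i++)
        img[i].r = img[i].g = img[i].b = (i % 3 == 0) ? 1.0f : 0.0f;
    toy_post_aa(img, out, 3, 3);
    printf("center pixel before: %.2f  after: %.2f\n",
           luma(img[4]), luma(out[4]));
    return 0;
}
```

Real implementations do much smarter edge-direction searches and run as a single full-screen shader pass, which is how they come in at around a millisecond on a modern GPU.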
 
Something like this?

cafe_remote.jpg


The slide pad and the directional buttons would all read as D-Pad in backwards compatibility mode.

Yes, quite, although the buttons could perhaps be shaped a little better. By the way, this reminds me of the GameCube's amazingly comfortable controller: X and Y were quite handy to reach near that A button, which was so huge it was practically impossible to miss.

The four arrow buttons in the image above could be used to switch weapons, to crouch and to jump, while the Nunchuk's Z and C buttons could be used to run and strafe.

Incidentally, one big fault in the Wii Remote's design, in my opinion, is that the +, - and especially the 1 and 2 buttons are far from handy when you're engaged in some frantic FPS action: you have to move your thumb all the way down when it's supposed to stay near the A button. Maybe another trigger on the back would have been handier.

Ah, I forgot to say: of course I agree a couple of analog triggers like those on the GC controller could be very useful for accelerating/braking or for other purposes as well.
 
No, in such a situation, you'd simply employ tiling, like you do if you want MSAA in a HD X360 game. In reality, most developers will probably use a post-process AA for native 1080p games or even opt for something like 1280x1080p w/2xmsaa and a native 1080p HUD, as that would fit into the eDRAM just fine.

Thanks. I never really understood how/when tiling would come into play, just that it does.

Just because you can fit a 1080p framebuffer in eDRAM doesn't say anything about its complexity. It gets rid of one of the biggest obstacles to 3D graphics (bandwidth), but you're still left with ~2 million pixels to fill and shade.

If the eDRAM is more a general pool of memory (as I suspect it is) rather than the 360's scratch pad, then many developers will probably want to use that space for multiple other buffers, and rendering at 1080p is going to ruin your chances of that.

Oh, I wasn't making any assumptions other than that Nintendo themselves had figured out what that "complexity" was for them to possibly make that claim. But this info helps me out some. Sounds like there are so many different things to know about buffers that even getting a basic understanding would take a while.

(like SMAA for example)

Speaking of this, I've seen it mentioned before, but I wasn't able to find much of anything on it. What is SMAA exactly, and what benefits (if any) are there compared to MSAA?
 
Speaking of this, I've seen it mentioned before, but I wasn't able to find much of anything on it. What is SMAA exactly, and what benefits (if any) are there compared to MSAA?

And I'll add another question - a serious one, because I'm interested to know. Does it really even matter? Is there really much of a significant, noticeable difference during realtime gaming between most of the methods?
 
Something like this?

cafe_remote.jpg


The slide pad and the directional buttons would all read as D-Pad in backwards compatibility mode.

Can you make a mockup of one placing a d-pad at the bottom where the "1 & 2" buttons are? Below is a pic of a prototype, and I think it would have worked a lot better had they gone that route. They could have raised some of the buttons surrounding the "A button" like they did with the GameCube. That way it would have worked in both a horizontal and a vertical position. Adding a d-pad at the bottom of your prototype would have been the perfect button layout imo. If they had done this, we wouldn't have had nearly the problems with waggle that we do now.

9A7sx.jpg


Edit - Didn't even see the slide pad. Seems kind of pointless to me since there's already a Nunchuk, but I still think it would work better where the "1 & 2" buttons are.
 
Post-processing algorithms can't deal with sub-pixel aliasing problems properly, so there's still a place for MSAA, particularly if you can implement it with an utterly trivial performance cost.



You're still doubling the number of pixels to fill and shade by rendering at 1080p. Being able to fit your whole framebuffer into super fast eDRAM will certainly help performance, but it's only one piece of the puzzle.



No, in such a situation, you'd simply employ tiling, like you do if you want MSAA in a HD X360 game. In reality, most developers will probably use a post-process AA for native 1080p games or even opt for something like 1280x1080p w/2xmsaa and a native 1080p HUD, as that would fit into the eDRAM just fine.



Just because you can fit a 1080p framebuffer in eDRAM doesn't say anything about its complexity. It gets rid of one of the biggest obstacles to 3D graphics (bandwidth), but you're still left with ~2 million pixels to fill and shade.

If the eDRAM is more a general pool of memory (as I suspect it is) rather than the 360's scratch pad, then many developers will probably want to use that space for multiple other buffers, and rendering at 1080p is going to ruin your chances of that.
Every time I get excited about this thing, I get let down again. Thanks for the reality check though.
 
Every time I get excited about this thing, I get let down again. Thanks for the reality check though.

There's nothing to get down about. All he said was that devs might prefer to use the eDRAM differently if it's general purpose. No need to take it as meaning anything beyond that. I don't see stew as the type to imply things.
 
There's nothing to get down about. All he said was that devs might prefer to use the eDRAM differently if it's general purpose. No need to take it as meaning anything beyond that. I don't see stew as the type to imply things.

It's just frustrating that technology keeps advancing while devs keep compromising resolution (which is image quality) for more effects. I realize this isn't important to most casuals because they can't see it, but once you do, you can't not see it. It doesn't help matters that PCs have been capable of this for years now. There's no reason devs shouldn't make it their own standard regardless of whether they're required to.

To me, the starting point next gen should be 1080p and 60 fps, adjusting and tweaking the graphics from there. I can see sacrificing framerate for better graphics in an adventure game or even in the single player campaign of a first person shooter, but it's a necessity for multiplayer in FPSs, racers and action games. Higher resolution equates to better gameplay opportunities as well. The one saving grace of the Wii U is that you can always play on the tablet controller and not have to worry about washed out visuals. I just don't want to have to do that.
 
<--- Still can't tell the difference in resolutions besides on PC games.

=/

For BluRays and Console games... it's all the same to me.
I mean, even side by side it's hard to tell sometimes.
 
<--- Still can't tell the difference in resolutions besides on PC games.

=/

For BluRays and Console games... it's all the same to me.
I mean, even side by side it's hard to tell sometimes.
480p vs 720 or 1080 doesn't look different? I'm actually curious. Going from Skyrim to SS, I immediately noticed it, mostly because of the textures.
 
Textures are one thing.
Just plain pixel counting is another.
I notice the improved field of view in PC games when I up the resolution, but I don't actually notice it looking better.
 
Textures are one thing.
Just plain pixel counting is another.
I notice the improved field of view in PC games when I up the resolution, but I don't actually notice it looking better.

I think it boils down to the distance of a PC monitor vs. a full TV set. With a PC you need to be close enough to read fine print, whereas with a TV you can be several feet away and still enjoy the full resolution. If you're already that far, the jump from 720 to 1080 isn't always too obvious.
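
Some rough numbers behind that (a C sketch; the ~46" screen, the distances, and the ~60 pixels-per-degree acuity figure are all just typical assumptions):

```c
#include <math.h>
#include <stdio.h>

/* Pixels per degree of visual angle at a few viewing distances.
 * ~60 px/degree corresponds to 20/20 acuity (one arcminute per
 * pixel); panel width and distances are example figures. */
int main(void)
{
    const double pi = acos(-1.0);
    const double screen_w_m = 1.02;          /* ~46" 16:9 panel width */
    const double dist[] = { 1.5, 2.5, 3.5 }; /* viewing distance, metres */
    const int    hres[] = { 1280, 1920 };

    for (int d = 0; d < 3; d++) {
        double fov = 2.0 * atan(screen_w_m / (2.0 * dist[d])) * 180.0 / pi;
        for (int r = 0; r < 2; r++)
            printf("%4d px wide @ %.1f m: %5.1f px/degree\n",
                   hres[r], dist[d], hres[r] / fov);
    }
    return 0;
}
```

On those assumptions, at 3.5 m even 720p is already past the ~60 px/degree acuity limit, so the extra 1080p detail is essentially invisible from the couch, while up close at 1.5 m the difference is plain.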
 
Since it's going to have an ATI GPU, it's going to have an excellent scaler and image quality. Wii/GameCube titles would actually look somewhat acceptable on HD sets.

Too bad Nintendo is so cheap that they removed GameCube support, even though it's the same fucking system as the Wii.

Yeah, just like the PS3 is the same system as the PS2...
 
Too bad Nintendo is so cheap that they removed GameCube support, even though it's the same fucking system as the Wii.

You want them to support two full generations of backwards compatibility, when Sony and MS can't even do one?

For me there are so many GCN games I don't own that I would prefer to see them on an eShop where I can download them, and I would almost guarantee that will come.
 
Really? The Wii is not emulating the Cube. The actual chips from the Cube are in the Wii unit.

Aren't the chips of the Cube actually the chips of the Wii? That's why the Wii downclocks itself into GC mode when you play GC games.

The new Wiis are capable of everything the old Wiis are, except that Nintendo removed the GC ports, memory card slots and the ability to insert mini discs.

If the Wii U can do Wii, it should be able to do GC too, since the chips are one and the same. The Wii U just won't support any GC controllers, memory cards or discs.
 
You want them to support two full generations of backwards compatibility, when Sony and MS can't even do one?

For me there are so many GCN games I don't own that I would prefer to see them on an eShop where I can download them, and I would almost guarantee that will come.

Nintendo's handhelds also have a history of just going back one generation for legacy support. DS dropped GB compatibility and 3DS dropped GBA compatibility.
 
Aren't the chips of the Cube actually the chips of the Wii? That's why the Wii downclocks itself into GC mode when you play GC games.

The new Wiis are capable of everything the old Wiis are, except that Nintendo removed the GC ports, memory card slots and the ability to insert mini discs.

If the Wii U can do Wii, it should be able to do GC too, since the chips are one and the same. The Wii U just won't support any GC controllers, memory cards or discs.

This assumes the Wii U is doing Wii BC in hardware. Even if the games still only run in 480p, that isn't an indicator of software or hardware emulation, since Nintendo has always been known for emulating their games as close to the original as possible, including output resolution. They're not real fans of applying smear-o-vision upscale filters to their previous generation games, and they haven't had a history of supporting more than the previous generation of BC even in their handhelds.

Implementing GameCube BC just has too much baggage to justify supporting a 10 year old system that was niche when it was active, let alone now.
 
Aren't the chips of the Cube actually the chips of the Wii? That's why the Wii downclocks itself into GC mode when you play GC games.

The new Wiis are capable of everything the old Wiis are, except that Nintendo removed the GC ports, memory card slots and the ability to insert mini discs.

If the Wii U can do Wii, it should be able to do GC too, since the chips are one and the same. The Wii U just won't support any GC controllers, memory cards or discs.

We have no confirmation on how the Wii U will handle its BC. The chips could be in there, or they could not.
 
Really? The Wii is not emulating the Cube. The actual chips from the Cube are in the Wii unit.

Actually, in many ways, the Wii is an overclocked and beefed up Cube - architecturally, they're almost exactly the same. If you can emulate the Wii (which I'm assuming the Wii U will do), you can emulate the GameCube with very little additional effort. If they somehow do decide to include Wii guts in the Wii U, then it should still be able to run GameCube code with very little work.

The point is that if it can play Wii games, it should be able to play GameCube games, with small exceptions like lack of compatibility with GameCube controllers.
 
besides, isn't that multi-size disc drive just another added cost?

I remember when Reggie mentioned in an interview that although he knew people wanted to know about the controller, he commented that "it's not the only thing about the Revolution that's revolutionary." The disc drive that accepts GameCube and full-sized discs was pretty revolutionary, too, he suggested. Classic Reggie.
 
Actually, in many ways, the Wii is an overclocked and beefed up Cube - architecturally, they're almost exactly the same. If you can emulate the Wii (which I'm assuming the Wii U will do), you can emulate the GameCube with very little additional effort. If they somehow do decide to include Wii guts in the Wii U, then it should still be able to run GameCube code with very little work.

The point is that if it can play Wii games, it should be able to play GameCube games, with small exceptions like lack of compatibility with GameCube controllers.

And it will be able to emulate gamecube games...

As $12 downloads on the E-Shop. ;)
 
besides, isn't that multi-size disc drive just another added cost?
I imagine so, and then they'd have to support memory cards and controller ports too. Even if those parts combined cost $1, by the end of the generation we're looking at $50+ million less profit if they added it. I'd prefer Nintendo put that dosh into games, online, buying exclusives, or even a nice holiday for Aonuma and family so he can take a break before getting into the next Zelda.
 