Project Flanker - even if it has nothing to do with Wii U - is a badass name.
I fully agree. I mean, Project Flanker can out-flank the competition!
That should not happen. Are you connecting it via HDMI? If so, go to your graphics settings and look for Overscan; there's a slider you can adjust so that the picture fits perfectly.
Does anyone know how good the speakers are on the U-pad? Would it be asking too much to expect at least cell phone speaker quality?

Should be above that if it's like the 3DS.
Very cool. Thanks Rösti!
I hope that the DD roster of Wii U will have some surprises for us.
And hey, if we give the GPU a German name, then we pick Project Salatgurke!
Wanker =I
Flanker rhymes with "wanker."
That's all I got.
How about the "Dunklehund?"
(and beaten.. like a wanker)
*tries to recall what he meant to say earlier today.. fails. Starts thought process anew*
Ok, first things first: on the subject of Wii U's upscaling 360 games to 1080p (or lack thereof).
The general statement that 'if the GPU is sufficiently more powerful, one should be able to trivially upres a workload that was originally running at a given fb res on an inferior GPU' is generally correct. Yet (there's always a catch, isn't there?), that assumes that 'sufficiently' here meets a certain definition: whatever the original GPU's bottleneck at each frame of the original workload might be, the new GPU should be able to meet that at the higher res. That includes fillrate: if there's a frame of content which is entirely fillrate-limited (in the general case: fragment ALUs + texel fetch + ROPs), the new GPU should have enough of those resources (i.e. count * speed) to meet new_res/old_res times the original fillrate requirements.

'But we did not upres the texture assets - why more texel fetching?' would be a good question to ask at this point. The answer is really simple: due to the way common texture filtering works (hint: it uses texture LODs), unless a texture was always projected 1:1 on the screen (say, for UI elements), chances are that most of the time the GPU does not need all the texels from that texture at the original res, thus most of the time it does not use the top-most LOD for a given texture. Now, when we upres the fb, we also upres whatever primitives use that texture, so we implicitly tell the GPU to use a higher-res LOD of the same texture*. So unless the GPU was already using the top-most LOD of the texture (unlikely) at the lower-res fb, at the higher-res fb it will start using a higher-res LOD. Ergo, more tex fetching. So as you see, FLOPs (i.e. shader ALU resources) don't give the full x-times-more-powerful picture (if somebody actually thought they did ; )
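To put rough numbers on that, here's a back-of-the-envelope sketch (my own illustration, using the textbook log2 mip-selection rule; the 4-texels-per-pixel starting point is made up):

import math

# Textbook trilinear mip selection: LOD = log2 of the texel footprint
# one pixel covers (clamped at the top-most mip, LOD 0).
def mip_level(texels_per_pixel):
    return max(0.0, math.log2(texels_per_pixel))

# Say a surface's texture covers 4 texels per pixel (per axis) at 720p:
print(mip_level(4.0))                 # LOD 2.0 -> a quarter-res mip is sampled

# At 1080p the same primitive spans 1.5x more pixels per axis, so each
# pixel covers fewer texels and the filter moves to sharper, bigger mips:
print(mip_level(4.0 * 720 / 1080))    # LOD ~1.42 -> higher-res LODs get fetched

# And the fillrate side: the pixel count itself scales by new_res/old_res:
print((1920 * 1080) / (1280 * 720))   # 2.25x the pixels to shade and blend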
But let's assume the new GPU has all the grunt it takes to upres the original content. Does that mean we can flip that res switch and enjoy the show? The alert reader should already be guessing the answer: nope. We have to make sure one more tiny detail is satisfied: that the game logic never lets the GPU idle at any given moment. IOW, that the GPU is always busy with work, and never waiting for the game logic to feed it. Or IYOW (in yet other words), we have to make sure that our game's performance is entirely GPU-limited. Then, and only then, can we say that if the new GPU has enough grunt to face the old GPU's 'worst nightmares' (read: bottlenecks) at the new res, we can flip that res switch and get a framerate at least as good as the original. Now, most well-optimised game code does not let the GPU idle, much or at all. Unfortunately, shit happens, particularly with new platforms featuring early software - HAL, middleware, compilers, etc.
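A crude model of that point (hypothetical frame times of my own, assuming the CPU and GPU stages are pipelined):

# With the CPU and GPU pipelined, the slower stage sets the frame time.
def frame_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

# GPU-limited game: 2.25x the pixels land on a GPU with, say, 3x the
# grunt, and the res switch survives with the framerate intact.
print(frame_ms(10.0, 16.0))               # 16 ms -> ~60 fps, GPU-bound
print(frame_ms(10.0, 16.0 * 2.25 / 3))    # 12 ms -> still 60 fps

# CPU-limited game: the GPU already idles part of each frame, so the
# extra grunt changes nothing - the game logic can't feed it fast enough.
print(frame_ms(16.0, 10.0))               # 16 ms, with the GPU idle ~6 ms
print(frame_ms(16.0, 10.0 * 2.25 / 3))    # still 16 ms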
Now, onto the next thing from the thought process. .. Oops. Buffer underrun. Sorry, going down for maintenance.
* Actually it's more complicated than that, but for the sake of this post that explanation should do.

Who said learning isn't fun?
Mitsurugi said:
Correct me if I'm wrong: in a local multi-player game like COD, Wii U would have to process two separate and potentially very dissimilar looking 720p/60fps video streams (in this instance, the first player could be on the ground in a tank and the second player could be in a chopper providing air support), send one directly to the HDTV, downscale the other's res to 480p and send it to the controller. Right?

There should be no reason the pad view would have to be rendered in 720p.
I'm wondering.. has Nintendo (or anyone else) said anything about a locked or capped resolution for the uPad?

It's a 480p LCD. That's pretty much all it'll do.
The only thing I think people need to accept (at least in the beginning) is that a lot of devs won't up-res or update their games for the Wii U. At least not initially. Granted, it would be great to see higher texture quality at the very least, but whether or not companies do this is the real question. Don't expect it.
How about: it takes years to design a GPU. If they started in 2009, that means it took them a little under 3 years to finish and have chips ready in fall 2012.

Yeah, but you don't need to design a GPU for three years unless you plan to go in a completely non-standard direction. Bolting an I/O and an audio processor on top of an off-the-shelf GPU is a matter of months, not years. Whatever they have now should be very different from a PC GPU.
I'd say these days, as long as Epic and maybe Crytek are cool with Nintendo's custom extensions, they'd be fine. It's even possible companies like Epic had input there as well - instead of designing an engine around a GPU, they'd design a GPU around an engine. The audio algorithms embedded in the Gamecube GPU were in part developed by Factor 5 for example, and they also did the MusyX audio middleware for the system.
We did. The bird demo was showing two distinct views of the same scene.

Yeah, I checked it again. Can't believe I missed that.
Just so long as the system is flexible enough that plenty of 3rd parties are willing to port, period, without making endless Kojima-style excuses, that's all I care about.
This. This is all I ask. Don't use the screen, make it a simple status or inventory screen.. I don't care. Just spare us the flimsy "we need to do a game specifically for that system, and we're tossing around ideas right now" excuses. I'd have much more respect for a developer that came forward and said, "we don't care for Nintendo, so no" instead of "not now, but maaaaybe sometime in the future. I'm not sure."
Developers: don't be chickenshit in your statements - come out and admit that you're really not considering the machine, so that we don't have to give any more thought to you for the rest of the generation. Thanks a bunch.
Man, fixed function shaders in a 2012 console GPU? This is getting to be too much! What possible advantage could that serve outside of Wii BC?

The only thing I can think of is simpler abstraction, but even that is handled very well in modern systems and suites. So unless a bunch of people who have only worked with, let's say, the Sega Pico want to create games for Wii U, I don't see why fixed function (shaders) would play any significant role outside of, as you mentioned, backwards compatibility.
The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.
I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300-400 GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.
So 1x Xbox 360 + 0.5x Xbox 360 + 0.5x Xbox 360 = 2x Xbox 360 at the very least, considering the two screens. The Wii U isn't just "on par" technologically, period, and this demonstration has always assumed the worst case.
No. None of the demos we've actually seen do anything meaningful with the Upad. Drawing a map on the Upad or mirroring a buffer that was already rendered for the main screen is a very small operation in the grand scheme of things. The E3 demos did not really showcase the Wii U's power at all (and they couldn't have, since it was the old, problematic devkit).
Furthermore, rendering a different scene on the Upad takes roughly 0% extra CPU power, but almost 50% more pixel fillrate, and possibly 100% more vertex processing in an unoptimized scenario. If we assume the Xbox 360 is the baseline here, then that's very unimpressive. Of course, this is above being on par with the Xbox 360, but we're not arguing that. Whatever the case, if the Wii U turns out to only be able to run Xbox 360 games on the main screen plus a smaller, different viewpoint on the Upad (it likely won't, but let's pretend for now), then that is seriously unimpressive. That's why I consider these statements bad news.
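For concreteness, here's where that fillrate figure comes from (my own arithmetic, assuming a 1280x720 main framebuffer and the pad's reported 854x480 panel):

# Extra pixels to fill when rendering a second, different viewpoint
# on the pad, next to a full 720p frame on the TV.
main_pixels = 1280 * 720          # 921,600 pixels on the HDTV
pad_pixels = 854 * 480            # 409,920 pixels on the Upad
print(pad_pixels / main_pixels)   # ~0.44 -> "almost 50% more pixel fillrate"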
By the way, is anyone else a little less interested in Darksiders II after seeing that boffo cgi trailer of the game? (Darksiders II: Death Strikes) I want a game that looks like THAT! With the moody coloring and surreal rendering.
ffs. How many times does this have to be said?
It's possible for the Wii U to be "on par" and still be quite more powerful than the 360.
"on par" and "more powerful" ARE NOT MUTUALLY EXCLUSIVE
The Xbox had almost twice as much RAM as the PS2. Yet I don't think anyone would say the PS2 isn't on par with the Xbox. None of these Vigil statements contradict each other.
So much use of bold on this page
#StrongThreadsForStrongMen
The 3DS isn't using unified shaders; there are no pixel shaders at all. Maestro is fixed function.
Did you just make a hashtag in the thread? 'Cause it looks like you just made a fucking hashtag in a thread.
I'm sure these GAF quotes we're providing are a source of great entertainment at the Vigil Games morning meetings. So much out of so little..
Why do some people still expect a 700-1000 GFLOP GPU in the Wii U? I mean, why? We already saw what the Wii U looks like; its volume is only a bit bigger than the Wii's. You just physically CAN'T fit that kind of power in it. You could go for something like what one guy on the last page said: 640 stream processors at 300-350 MHz. I think around 400 GFLOPS is what to expect.
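For reference, that estimate is just the standard peak-FLOPS arithmetic for AMD-style stream processors (two FLOPs per SP per clock via multiply-add):

# Peak GFLOPS = stream processors * 2 FLOPs per clock (multiply-add) * GHz
def peak_gflops(stream_processors, clock_mhz):
    return stream_processors * 2 * clock_mhz / 1000.0

print(peak_gflops(640, 300))   # 384.0 GFLOPS
print(peak_gflops(640, 350))   # 448.0 GFLOPS -> hence "around 400 GFLOPS"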
Not only Vigil, but also the person(s) that Nintendo has monitoring this thread. They must think we're all mental cases.
I kinda admire the continued silence. The whole world could be chattering about the U, even most saying horrible/false things, and Nintendo stays strongly silent. It's gotta be confidence.. right?
They just want to surprise people :]
Guess what? It was stated (multiple times) that the Wii U WILL change. Design wise, not just what's inside the damn machine. So how is it not possible that the Wii U will get bigger in size, or is this another one of those "common sense" things?

It would have to change to more than twice its size to fulfill those wishes, and that won't happen. I mean, a system with a GPU around 400 GFLOPS and twice as much RAM as the PS360, even with that kind of controller, should give better visuals while still being smaller and not breaking the bank. If you expect a 1 TFLOP GPU in it, then you will be disappointed.
I know how to get them to talk, but it involves rather dirty methods and I don't wanna anger Nintendo of America again (won't elaborate, so don't ask). But Nintendo doesn't really need to talk much, as the enthusiasts who care about the technical specifications aren't a large demographic, and those that need to know about the tech already know about it. Sure, you could argue investors want to know what relation the Wii U, via its power, will have to third party developers, especially as rumors/facts about Microsoft's and Sony's respective next generation consoles start popping up; but I believe investors (and analysts) care more about what immediate effect(s) the Wii U will have on the market, and that's via the Wii U remote, software (both packaged goods and digital) and connected services such as Netflix. And these things are most appropriate for E3, where they can be shown on a big screen with not only enthusiasts but general press following as well. And the mass market will learn about these things closer to launch, or if the console appears, for example, on The Today Show or The Ellen DeGeneres Show, or in popular magazines like Reader's Digest.
How can you be so sure it won't happen? Serious question.

Because Nintendo doesn't want a PS3-sized case and a tablet-sized controller in people's living rooms, and going by the rumors, that won't happen. You won't see a 1 TFLOP GPU in it, and the dev kits aren't twice as fast; the sooner some of you get over it, the better.
Rösti said:
I know how to get them to talk, but it involves rather dirty methods and I don't wanna anger Nintendo of America again (won't elaborate, so don't ask).