The Wii U Speculation Thread V: The Final Frontier

  • Thread starter: Rösti
iPYwCuRuliShf.gif

I was looking for this earlier.
 
As far as diminishing returns go, yeah, you double the poly count in some games and it doesn't make a difference to 99% of the people out there. I do think there are some genres that could heavily benefit from more polys, though, like sandbox-style games and huge open-world games.

It's more than that. I'm not just talking about diminishing returns in terms of how much people notice improvements. I'm also talking about the diminishing returns of technology. In order to make the expected huge jump in performance for the current generation, the Xbox 360 had to draw roughly twice as much power as the original Xbox did, and the PlayStation 3 used something like four times the power of the PlayStation 2. The problem is that they can't ship next-generation consoles that draw 400W or more, and beyond that it would get even sillier. So they'll have to choose between waiting longer and longer per generation or living with smaller and smaller boosts in performance.


* caveat: The hard drive makes up for some of the difference in power, though not most of it.
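To put that scaling argument into rough numbers, here's a quick back-of-the-envelope sketch. The starting wattage and the per-generation multiplier are purely illustrative assumptions pulled from the post above, not measured figures.

```python
# Rough sketch of the "power draw can't keep doubling" argument.
# Starting wattage and multiplier are illustrative assumptions only.
launch_draw_watts = 100      # ballpark draw for a current-gen console (assumed)
per_gen_multiplier = 2.0     # the post suggests roughly 2x-4x per generational leap

draw = launch_draw_watts
for gen in range(1, 4):
    draw *= per_gen_multiplier
    print(f"+{gen} generation(s): ~{draw:.0f} W")
# +1 generation(s): ~200 W
# +2 generation(s): ~400 W  <- already the level the post calls unworkable
# +3 generation(s): ~800 W
```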
 
Bgassassin's awkward non-response response to my post just confirms he is a Wii U insider. Thanks! ;)
 
http://youtu.be/Eht92AO-QII

Nice video to put diminishing returns into perspective. PS1 was very rough, PS2 put in a lot more detail, and then PS3 smoothed it out. For most genres I don't think next gen will look all that mind-blowingly better. Maybe for GTA-like games, but for the rest I doubt it; just look at PC games. Better looking, sure; mind-blowing, nope.

Maybe when you only look at racing games.

But when you look at FPS games, the leap this gen is probably the biggest:

Gone from this:
zDvao.png

To this:
EcQKj.jpg

To this:
T9xq6.jpg
 
I think what next gen can bring us, as far as improving graphics goes, is better lighting and texturing. Right now, IMHO, most of what holds back current-gen games is just that. Though I'm not as worried for the Wii U, since we've seen the bird demo demonstrating some really nice lighting effects.

As far as diminishing returns go, yeah, you double the poly count in some games and it doesn't make a difference to 99% of the people out there. I do think there are some genres that could heavily benefit from more polys, though, like sandbox-style games and huge open-world games.

Though again, lighting is one thing I feel has not kept pace with other aspects of GPUs and hardware.

If Nintendo has put some kind of fixed-function hardware focused on just lighting, etc. into their GPU, that is going to be a huge, huge deal in shrinking the hardware gap IMHO.

Which is why I personally believe the gap between Wii U and XB3/PS4 is going to be much smaller than PS2 vs. Xbox. That sort of difference was not only performance-based (the Xbox being ~2x the specs); the PS2 also couldn't do the lighting and texture effects that made Xbox games like Halo and Halo 2 stand out so well.

I really think not bringing GCN to the Wii U can also shrink the chip drastically, which is what I've been talking about with GCN being a heavy architecture. A smaller chip can run at higher clocks without taking a large hit to the TDP... something greater than 640 SPs @ 600 MHz should be completely possible.
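If you want to turn those shader counts and clocks into theoretical throughput, the usual rule of thumb is 2 FLOPs per ALU per clock (one multiply-add). A minimal sketch using only the figures floated in this exchange:

```python
# Theoretical shader throughput, using the common
# "2 FLOPs per ALU per clock" (multiply-add) rule of thumb.
def gflops(shader_count, clock_mhz):
    return shader_count * 2 * clock_mhz / 1000.0

for clock_mhz in (600, 650, 700, 800):   # clock speeds floated in this thread
    print(f"640 SPs @ {clock_mhz} MHz ~= {gflops(640, clock_mhz):.0f} GFLOPs")
# 640 SPs @ 600 MHz ~= 768 GFLOPs
# 640 SPs @ 800 MHz ~= 1024 GFLOPs
```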
 
What I mean by cutting out "GCN function" is simply not designing a chip for compute, as GCN so obviously is. If it was designed to ignore compute in favor of pure gaming performance (of course it would still compute), you would have a card closer to the GTX 680 (closer in philosophy, not performance). This means it would outperform GCN per shader, per Hz, per watt in games, while leaving the compute functionality of GCN on the sideline in Nintendo's version of a 28nm or even 32nm chip.

I guess for me it's the context of how you're saying it that's throwing me off. As I said before, I don't see AMD making a non-GCN GPU at 28nm. At the same time, I don't see a similar philosophy to CUDA coming from "stripping away GCN"; VLIW4 didn't show it. Also, by "stripping away GCN" the ALUs wouldn't/shouldn't change, meaning their performance should still be the same, so there would be no gain from taking that away. IMO this is where the flaw seems to be coming from, and it's working more off of assumption. I mean, VLIW5 sounds like what you are talking about, and it saw poor ALU utilization after the change to DX10. I guess we are saying the same thing in different words, as the GPU as I see it would have the "GCN cut out", but I don't see some kind of performance gain because of it.

I also don't care about PS4's and Xbox3's architecture, but if it is GCN, Nintendo can make up a lot of ground, as it is a heavy chip (size/performance is hurt in order to focus on compute units).

See above.

You yourself believe that Nintendo could be using a 640 SP chip. You give it a clock of 600 MHz for no reason other than you think it should be around there, and I've shown that it could be as high as 800 MHz if it were designed similarly to the E6760 chip. I've also conceded that it might not be as high as 800 MHz, but there is nothing locking it in at 600 MHz if it's a smaller chip like we both assume; they should at the very least be able to push out 650-700 MHz on a 32nm process.

No, I gave a reason you're clearly ignoring. As I said, I gave it that clock based on the fact that Nintendo doesn't use high clock speeds. That's not just a random assumption I made. And while it would be nice for them to push for a higher speed while using a smaller process, there's nothing there to say they will other than hope.

And as for the power gap of the PS2 to Xbox, yes, it will be smaller; you and I both expect the same functionality between Wii U and the other consoles (as far as graphical effects go). We also both know this:

1. His post wasn't just about cost; it also points to heat. A smaller chip produces less heat, and the Wii U chip will be much smaller than the Xbox3/PS4 chips without losing more than half the power, thanks to "losing" GCN in favor of performance.

I'm assuming you didn't mean to quote EC as he wasn't the one I responded to. That said I don't see how you got that from DC's post. He focused more on cost, heat, and size. Not graphical abilities.

Having fixed functionality would be absolutely great, and if so, there is no way that Skyrim video should be taken seriously at all... my E350, which has an ~80 GFLOPs GPU, can play Skyrim at those settings (~20fps), while the card needed to run Skyrim on ultra is a 6970, a 3 TFLOPs+ card. The difference between those graphics processing units is so much larger than where we'd put Wii U and Xbox3 that it comes off as a huge joke...

I think you have gathered a lot of information, but you simply don't know how to use it; you aren't comparing properly to the hardware you are expecting out of the boxes. Even if the boxes were 4x the performance and we tossed out fixed functionality, you'd end up with medium to ultra settings on Skyrim.

This suggests you didn't see some of my other posts while still assuming some kind of performance gain just from taking away GCN's compute functionality. Although my assumed max gap range would be 1080p on high settings vs 720p on low settings. But it seems you're trying to create a max gap that will be much smaller than what it will be and I'm not seeing the logic behind your support for that.
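For a sense of scale, the GFLOPs gap in the Skyrim comparison quoted above works out roughly like this (both figures are the ballpark numbers from the thread, not measured specs, and the ~4x multiplier is the one mentioned in the quote above):

```python
# Rough GFLOPs ratio behind the E350 vs. 6970 Skyrim comparison above.
# Both figures are the ballpark numbers quoted in the thread, not measured specs.
e350_gpu_gflops = 80      # AMD E-350's integrated GPU, as cited above
hd6970_gflops = 2700      # the "3 TFLOPs+ card"; ~2.7 TFLOPs theoretical

print(f"E-350 -> HD 6970 gap: ~{hd6970_gflops / e350_gpu_gflops:.0f}x")  # ~34x
# Compare that ~34x with the ~4x "boxes" figure being debated just above.
```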
 
http://youtu.be/Eht92AO-QII

Nice video to put diminishing returns into perspective. PS1 was very rough, PS2 put in a lot more detail, and then PS3 smoothed it out. For most genres I don't think next gen will look all that mind-blowingly better. Maybe for GTA-like games, but for the rest I doubt it; just look at PC games. Better looking, sure; mind-blowing, nope.

I agree. On PC, in terms of visuals, there is a noticeable difference, but nothing that looks major enough to warrant new expensive hardware; nothing that would seem like PS2 to PS3, even.

However, PC games are themselves limited by older hardware; most PC games are made to run on older hardware, especially if it's a console port/multiplatform game, and you can't hide low polygons. When a console generation leaps, PC usually follows and ups its lowest common denominator in hardware. Of course there will always be games which require the latest and greatest on PC, but these are never the norm, mostly exceptions.
 
Maybe when you only look at racing games.

Even if you only use racing games, you can still see huge leaps forward with visuals:

gtevolutionfinalx1inz.jpg


I also think Coolwhip's argument about PC games is off. Those devs rarely have the budget that console devs have, and they also work on constantly evolving hardware. Console devs are able to sit on the exact same hardware over a 5-6 year period, where they're really able to exploit what it's capable of doing.
 
Which is why I personally believe the gap between Wii U and XB3/PS4 is going to be much smaller than PS2 vs. Xbox. That sort of difference was not only performance-based (the Xbox being ~2x the specs); the PS2 also couldn't do the lighting and texture effects that made Xbox games like Halo and Halo 2 stand out so well.

I really think not bringing GCN to the Wii U can also shrink the chip drastically, which is what I've been talking about with GCN being a heavy architecture. A smaller chip can run at higher clocks without taking a large hit to the TDP... something greater than 640 SPs @ 600 MHz should be completely possible.

What's your idea of a reasonable upper limit to TDP in a console (or particularly in the expected Wii U form factor)?



See, if most Wii U games were to look like that, I'd be satisfied.

I'd be satisfied if they looked like the first one but were insanely good games.
 
The next person who interviews Reggie or someone else from Nintendo really needs to ask them flat out what the power draw of the Wii U will be. Tell him that people like to plan ahead for their electricity budget, and with the system launching this year they have to have the TDP figure. American families struggling to get by will appreciate this info.
 
The next person who interviews Reggie or someone else from Nintendo really needs to ask them flat out what the power draw of the Wii U will be. Tell him that people like to plan ahead for their electricity budget, and with the system launching this year they have to have the TDP figure. American families struggling to get by will appreciate this info.
Reggie has no idea, I bet. The only thing he seems to know is the buzzword "1080p".
 
This was the generation of shaders coming into their own, and a tremendous leap over the last.
 
Yeah, I don't think it was so much a power leap as it was a feature-set differentiation (i.e. programmable shaders vs. fixed function).
 
You wouldn't.


edit: For clarification, if all Wii U games looked like the first pic, even if they were each the GOTY, I admit I would be disappointed at all the bitching on forums and among my friends and such. So that is, in effect, a lack of complete satisfaction.

Gameplay and design matter most, but no way in this fucking HD age and this year would I be happy with games that looked like THAT.
 
And since we're on the topic of graphical leaps.

zelda_hd_for_wii_u_____by_starfoch-d4jcyto.png


Can't wait to find out.

It'll probably be hard to compare, as the style will be totally different from the tech demo.

However, maybe they will see how much praise the demo got and make it resemble that art, but I doubt they care what we like, nor should they, really.
 
So what's going to be the next big advancement in gaming? It probably won't be anything related to graphical power, as we're slowing down there.

Storytelling?
 
Gameplay and design matter most, but no way in this fucking HD age and this year would I be happy with games that looked like THAT.

Aye. Maybe when I get a flat panel television set in a few years, my impression of such things will change. ;)

I do respect and appreciate changes in horsepower that actually do add to the fun of games, mind you. I just don't really care about it that much on a gut level. I'm probably going to be going back to Nethack a couple times a year until the day that I die.
 
Aye. Maybe when I get a flat panel television set in a few years, my impression of such things will change. ;)

I do respect and appreciate changes in horsepower that actually do add to the fun of games, mind you. I just don't really care about it that much on a gut level. I'm probably going to be going back to Nethack a couple times a year until the day that I die.

I still only own an SD set myself.

I know what you are saying, and I expect some bells and whistles all the same, but if Pikmin 3 looks similar to the older games yet is the best in mechanics and concepts, I'll be more than happy.

I guess it depends on the developer and game.

I expect Retro's game to set a new standard for Nintendo games visually.
 
I guess for me it's the context of how you're saying it that's throwing me off. As I said before, I don't see AMD making a non-GCN GPU at 28nm. At the same time, I don't see a similar philosophy to CUDA coming from "stripping away GCN"; VLIW4 didn't show it. Also, by "stripping away GCN" the ALUs wouldn't/shouldn't change, meaning their performance should still be the same, so there would be no gain from taking that away. IMO this is where the flaw seems to be coming from, and it's working more off of assumption. I mean, VLIW5 sounds like what you are talking about, and it saw poor ALU utilization after the change to DX10. I guess we are saying the same thing in different words, as the GPU as I see it would have the "GCN cut out", but I don't see some kind of performance gain because of it.



See above.



No, I gave a reason you're clearly ignoring. As I said, I gave it that clock based on the fact that Nintendo doesn't use high clock speeds. That's not just a random assumption I made. And while it would be nice for them to push for a higher speed while using a smaller process, there's nothing there to say they will other than hope.



I'm assuming you didn't mean to quote EC as he wasn't the one I responded to. That said I don't see how you got that from DC's post. He focused more on cost, heat, and size. Not graphical abilities.



This suggests you didn't see some of my other posts while still assuming some kind of performance gain just from taking away GCN's compute functionality. Although my assumed max gap range would be 1080p on high settings vs 720p on low settings. But it seems you're trying to create a max gap that will be much smaller than what it will be and I'm not seeing the logic behind your support for that.

The performance gain comes from an assumption that the compute hardware in GCN takes away from its overall performance. I base this on the GTX 680, which is much faster than the GTX 580 thanks in large part to shedding its compute performance.

The other, much more noticeable performance gain, which is correct, is in the smaller size of the chip. The die size would be much smaller than if they used GCN; for instance, a 6000-series chip performing similarly to the HD 7770 would, as DCking said, be under 100mm^2, while the HD 7770 is 123mm^2. That means higher clock speeds at the same TDP.

You are also wrong about Nintendo not using higher clock speeds to gain performance in the past... the Wii's GPU is clocked something like 50% faster than the GameCube's, and other than memory sizes, the GPU didn't change with more TEV units, right? So the assumption that Nintendo wouldn't do that with the Wii U is a pretty big assumption indeed, especially when it is basically free performance.

And yes, I did mean to quote EC, though I thought he was saying that the Xbox had effects that the PS2 did not, which added to the distance between them. That is my point: Wii U has the same effects as Xbox3, and if there are fixed-function components, that distance should be much more superficial.

The only real way you are going to have the distance you are describing is if they do actually have separate GPGPU chips, so that they can do much heavier physics, allowing for stuff like individual hair movement, particle effects, and just a more lively world.

Also, I am assuming that Nintendo will use a 28nm or 32nm GPU; either should be enough to break the 600 MHz you are assuming without raising the TDP very much at all...

Having said all of this, I don't know if they will; this is speculation on what they COULD do within the Wii U casing. I do believe that they will target the highest GFLOPs they can from the GPU without pushing the GPU past half of the total TDP.
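On the "50% faster" clock claim and the half-the-TDP budget idea, here's a quick sketch. The GameCube/Wii clocks below are the commonly cited figures (162 MHz Flipper, 243 MHz Hollywood); the total system TDP values are purely illustrative assumptions, since nothing official has been stated.

```python
# The "Wii GPU is ~50% faster than the GameCube's" claim, using the
# commonly cited clocks (public ballpark figures, not official specs).
flipper_mhz = 162      # GameCube GPU (Flipper)
hollywood_mhz = 243    # Wii GPU (Hollywood)
print(f"Clock uplift: {hollywood_mhz / flipper_mhz:.2f}x")   # 1.50x

# "No more than half the TDP for the GPU" budget, for a few assumed totals.
for total_tdp_watts in (35, 45, 60):    # illustrative console TDPs only
    print(f"Total {total_tdp_watts} W -> GPU budget ~{total_tdp_watts / 2:.1f} W")
```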
 
Billy Hatcher 2 Wii U confirmed.

Haha.

This reminds me. Does anyone have that Iwata LOLNO.gif? I've been looking for it off and on the last few years and I can't seem to find it.

I think what next gen can bring us, as far as improving graphics goes, is better lighting and texturing. Right now, IMHO, most of what holds back current-gen games is just that. Though I'm not as worried for the Wii U, since we've seen the bird demo demonstrating some really nice lighting effects.

As far as diminishing returns go, yeah, you double the poly count in some games and it doesn't make a difference to 99% of the people out there. I do think there are some genres that could heavily benefit from more polys, though, like sandbox-style games and huge open-world games.

Though again, lighting is one thing I feel has not kept pace with other aspects of GPUs and hardware.

If Nintendo has put some kind of fixed-function hardware focused on just lighting, etc. into their GPU, that is going to be a huge, huge deal in shrinking the hardware gap IMHO.

The reason why I don't focus too much on that shrinking the gap is that exclusives for PS4 and Xbox 3 would essentially negate it, since Wii U exclusives are going to be the main, if not only, games to use that extra power.

Bgassassin's awkward non-response response to my post just confirms he is a Wii U insider. Thanks! ;)

I still don't know anything. >_>

See, if most Wii U games were to look like that, I'd be satisfied.

You should be satisfied then.
 
If only, if only.

It would be great if the rumor that Nintendo was looking to revive dormant third party franchises was true, but with the idea of releasing them as $20 games on the eShop. There are so many properties that publishers would never want to risk at retail, but a game like a Billy Hatcher sequel could do alright for itself through eShop.
 
I realize mixing partial information from Ideaman and bgassassin is kind of like mixing numbers from Famitsu and Media Create BUT. What's confusing me right now is it sounds like it isn't easy to take a 720p game from X360 and make it 1080p on Wii U, but it should be easy to take a 720p game from Wii U and make it 1080p with even better image quality on Future Xbox. That makes the latter gap seem much larger than the former, though most other things we hear wouldn't indicate that. Does Wii U have some sort of resolution bottleneck?
Gahiggidy said:
The next person who interviews Reggie or someone else from Nintendo really needs to ask them flat out what the power draw of the Wii U will be. Tell him that people like to plan ahead for their electricity budget, and with the system launching this year they have to have the TDP figure. American families struggling to get by will appreciate this info.
I love this man.
 
And since we're on the topic of graphical leaps.

zelda_hd_for_wii_u_____by_starfoch-d4jcyto.png


Can't wait to find out.

N64 is the biggest difference; damn, that tech demo looks like PS1 compared to how the games turned out.

Maybe we will be saying the Zelda tech demo looked like an Xbox 360 game by the time the real Zelda game comes out for Wii U? :p
 
I realize mixing partial information from Ideaman and bgassassin is kind of like mixing numbers from Famitsu and Media Create BUT. What's confusing me right now is it sounds like it isn't easy to take a 720p game from X360 and make it 1080p on Wii U, but it should be easy to take a 720p game from Wii U and make it 1080p with even better image quality on Future Xbox. That makes the latter gap seem much larger than the former, though most other things we hear wouldn't indicate that. Does Wii U have some sort of resolution bottleneck?

I love this man.

I think the big difference there is that Wii U exclusives running in 720p will look a great deal better than multiplatform games on the system, because BG assumes that the automatic fixed functionality we keep hearing about, the kind that doesn't decrease performance, will be ignored by third parties making ports... I somewhat doubt this. I think if it's so easy to do, any team that works with the system in the next 2 years will know how to easily use the fixed functionality of the GPU to produce those visuals... I mean, just look at the bird demo that was made in a few weeks; even a week or so before the event they changed it so drastically that the floor demo was literally almost a generation beyond what they showed in the conference video.

So 360 games built today can't be up-rezzed to 1080p on Wii U in some cases, because the extra oomph of the system is in fixed functions. Though I think for the most part the devs just aren't trying to run them at higher resolutions, and I sort of expect Aliens to actually run at a higher resolution than the current consoles, maybe as much as 1080p.
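For context on why a straight 720p-to-1080p bump isn't free, the raw pixel math looks like this (resolution arithmetic only; it says nothing about where any actual bottleneck sits):

```python
# Raw pixel-count math behind the 720p -> 1080p question.
# This only shows the per-frame workload ratio, not any specific bottleneck.
pixels_720p = 1280 * 720       # 921,600 pixels
pixels_1080p = 1920 * 1080     # 2,073,600 pixels

print(f"1080p needs {pixels_1080p / pixels_720p:.2f}x the pixels of 720p per frame")  # 2.25x
```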
 
The performance gain comes from an assumption that the compute hardware in GCN takes away from its overall performance. I base this on the GTX 680, which is much faster than the GTX 580 thanks in large part to shedding its compute performance.

The other, much more noticeable performance gain, which is correct, is in the smaller size of the chip. The die size would be much smaller than if they used GCN; for instance, a 6000-series chip performing similarly to the HD 7770 would, as DCking said, be under 100mm^2, while the HD 7770 is 123mm^2. That means higher clock speeds at the same TDP.

You are also wrong about Nintendo not using higher clock speeds to gain performance in the past... the Wii's GPU is clocked something like 50% faster than the GameCube's, and other than memory sizes, the GPU didn't change with more TEV units, right? So the assumption that Nintendo wouldn't do that with the Wii U is a pretty big assumption indeed, especially when it is basically free performance.

And yes, I did mean to quote EC, though I thought he was saying that the Xbox had effects that the PS2 did not, which added to the distance between them. That is my point: Wii U has the same effects as Xbox3, and if there are fixed-function components, that distance should be much more superficial.

The only real way you are going to have the distance you are describing is if they do actually have separate GPGPU chips, so that they can do much heavier physics, allowing for stuff like individual hair movement, particle effects, and just a more lively world.

Also, I am assuming that Nintendo will use a 28nm or 32nm GPU; either should be enough to break the 600 MHz you are assuming without raising the TDP very much at all...

Having said all of this, I don't know if they will; this is speculation on what they COULD do within the Wii U casing. I do believe that they will target the highest GFLOPs they can from the GPU without pushing the GPU past half of the total TDP.

I understood the basis for your assumption. :)

It's just that there was nothing logical to me that said it would work. Nvidia uses a totally different architecture, so it can't be used as a comparison. And to give tangible proof of that on the AMD side, here are some benchmarks of a 7870 (1 GHz, 1280 ALUs, 2560 GFLOPs) vs. a 6970 (880 MHz, 1536 ALUs, 2703 GFLOPs). As you'll see, not only are there no gains, the 7870 has better performance in a lot of benchmarks.

http://www.tomshardware.com/reviews/radeon-hd-7870-review-benchmark,3148-5.html
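Those theoretical GFLOPs figures fall straight out of ALU count x 2 ops per clock x clock speed; here's a quick check using only the numbers already quoted above:

```python
# Verifying the theoretical GFLOPs figures quoted above:
# GFLOPs = ALUs * 2 ops per clock (multiply-add) * clock in GHz.
def gflops(alus, clock_ghz):
    return alus * 2 * clock_ghz

print(f"HD 7870: {gflops(1280, 1.00):.0f} GFLOPs")   # 2560
print(f"HD 6970: {gflops(1536, 0.88):.0f} GFLOPs")   # ~2703
# Despite ~5% fewer theoretical GFLOPs, the 7870 wins many of the linked
# benchmarks, i.e. the gains come from architecture, not raw FLOPs.
```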

You're also taking what I said out of context again. I said "high clocks" not "higher clocks". I tend to be very deliberate in my word choices when discussing things. Knowing about the GC's clock changes helped me make sure of that in this case.

And again it's not about the GPU being able to be clocked higher as I agree it could and would be a non-issue for the case. It's about Nintendo's philosophy that suggests it won't happen.
 