The Wii U Speculation Thread V: The Final Frontier

  • Thread starter: Rösti
You only have to cope with it if it's true, but I really don't see it happening any time soon. People were saying the same thing going into this generation. You can look at stuff like GT5's photo mode, and it will oftentimes look significantly better than the in-game visuals. As long as devs have the budget, console power, and time they need, graphics will continue to improve at a fairly steady rate.
You can think that all you want; it doesn't make it any more true. The fact that we're in a generation of $400-500-at-launch consoles that are expected to just meet the minimum bar of a generational leap forward, and that we're arguing over the MOST MINUTE aspects of hardware as though they'll make a gigantic difference, tells me everything I need to know about how real diminishing returns actually are.

I was saying that at the time!

The tech demos I loved for the GC were the 100 Marios and Metroid ones.

It's the noses and the shiny white unitard under Link's tunic. Seriously bad. He's not a gay Jewish elf, he's LINK, FFS.
 
The reason I don't focus too much on that shrinking the gap is that exclusives for PS4 and Xbox 3 would essentially negate it, since Wii U exclusives are going to be the main, if not the only, games to use that extra power.

Well, how much it gets used is going to depend entirely on Nintendo and their SDK for it. If they make it incredibly simple and easy to use, it would be a no-brainer for devs. Let's hope it's easier to use and better documented than the TEV in the GC and Wii was.
 
I realize mixing partial information from Ideaman and bgassassin is kind of like mixing numbers from Famitsu and Media Create, BUT: what's confusing me right now is that it sounds like it isn't easy to take a 720p game from the 360 and make it 1080p on Wii U, yet it should be easy to take a 720p game from Wii U and make it 1080p with even better image quality on the future Xbox. That makes the latter gap seem much larger than the former, though most other things we hear wouldn't indicate that. Does the Wii U have some sort of resolution bottleneck?

This is about comparing multi-plats and focusing on the Wii U's GPU without the extra features. But yeah, I would say the main thing getting crossed is that Ideaman is talking about games that also render an image for the controller, which is why you see it said that it's not easy to take a 360 game and make it 1080p. My view normally leaves out the controller taking up resources, so you would see a jump over the PS360 version if someone's multi-plat pushed for it. Colonial Marines and (for better or worse, given the reviews) Ninja Gaiden 3 might be good examples of that difference being shown graphically.
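To put rough numbers on the extra controller image (a quick back-of-the-envelope in Python; I'm assuming an 854x480 controller screen, which is not something confirmed in this thread):

    # Rough pixel budgets for the scenarios being discussed.
    tv_720p  = 1280 * 720    # 921,600 pixels for a 720p TV image
    tv_1080p = 1920 * 1080   # 2,073,600 pixels, i.e. 2.25x the 720p load
    gamepad  = 854 * 480     # 409,920 pixels (assumed controller resolution)

    # 720p TV image plus a separate controller image:
    print((tv_720p + gamepad) / tv_720p)   # ~1.44x the work of plain 720p

    # 1080p TV image alone:
    print(tv_1080p / tv_720p)              # 2.25x the work of plain 720p

So "720p plus the controller" is a noticeably smaller load than a straight 1080p bump, which is why the two statements aren't really contradicting each other.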

Well, how much it gets used is going to depend entirely on Nintendo and their SDK for it. If they make it incredibly simple and easy to use, it would be a no-brainer for devs. Let's hope it's easier to use and better documented than the TEV in the GC and Wii was.

You made a funny.
 
You should be satisfied then.

 
I understood the basis for your assumption. :)

It's just that there was nothing logical to me that said it would work. Nvidia uses a totally different architecture, so it can't be used as a comparison. And to give tangible proof of that on the AMD side, here are some benchmarks of a 7870 (1GHz, 1280 ALUs, 2560 GFLOPS) vs. a 6970 (880MHz, 1536 ALUs, 2703 GFLOPS). As you'll see, not only are there no gains from the 6970's extra GFLOPS, the 7870 actually has better performance in a lot of the benchmarks.

http://www.tomshardware.com/reviews/radeon-hd-7870-review-benchmark,3148-5.html
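For anyone wondering where those GFLOPS figures come from: peak throughput is just ALU count x 2 FLOPs per clock (multiply-add) x clock speed. A quick sanity check in Python:

    # Peak single-precision throughput: each ALU does 2 FLOPs per clock (a multiply-add).
    def gflops(alus, clock_ghz):
        return alus * 2 * clock_ghz

    print(gflops(1280, 1.00))   # HD 7870: 2560 GFLOPS
    print(gflops(1536, 0.88))   # HD 6970: ~2703 GFLOPS

Which is exactly why raw GFLOPS alone doesn't settle the comparison: the 7870 wins plenty of those benchmarks with the lower number.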

You're also taking what I said out of context again. I said "high clocks" not "higher clocks". I tend to be very deliberate in my word choices when discussing things. Knowing about the GC's clock changes helped me make sure of that in this case.

And again, it's not about whether the GPU could be clocked higher; I agree it could, and that would be a non-issue here. It's Nintendo's philosophy that suggests it won't happen.

Hmm, my memory failed me; I believed the HD 7870 was weaker than the 6950. Those numbers are quite impressive, but I still wouldn't assume that Nintendo used any existing architecture, which is literally your fault :P since I originally assumed they would base it on an existing architecture, but you persuaded me two or three threads ago that it would be a custom Nintendo design.

I think your high clocks vs. higher clocks point is nitpicking; it means the same thing, since whatever I assume to be average clocks, higher clocks would be needed to reach high ones. I'm 27 and don't understand why I have to play games with words; it's virtually the same point. Nintendo chose to increase the clock speed rather than add more muscle to the GameCube GPU in order to create the Wii chip, and that is exactly what should point to them being able to come up with the idea of giving the Wii U higher clocks instead of just adding to costs with something like 800 SPs for the same effect.
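Just to put rough numbers on "the same effect" (both configurations below are purely hypothetical examples for the arithmetic, not claims about the actual chip):

    # Peak FLOPS scales with both shader count and clock, so a narrower chip
    # clocked higher can match a wider chip clocked lower.
    def gflops(shaders, clock_mhz):
        return shaders * 2 * clock_mhz / 1000

    print(gflops(800, 600))   # 960 GFLOPS from 800 SPs at 600 MHz
    print(gflops(600, 800))   # 960 GFLOPS from 600 SPs at 800 MHz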

As for the last part: you agree with the possibility of Nintendo being able to push the chip past 600MHz, but you assume they won't because they never have in the past, which I have just proved wrong...

At this point you could be right about the chip, but it wouldn't be because what I'm saying is impossible; your estimations literally come down to "THIS F'ing COMPANY" lol (though I am paraphrasing, as you didn't say that; you only said that they wouldn't add more power that's easily available, because Nintendo doesn't like to do that).

Well, how much it gets used is going to depend entirely on Nintendo and their SDK for it. If they make it incredibly simple and easy to use, it would be a no-brainer for devs. Let's hope it's easier to use and better documented than the TEV in the GC and Wii was.

Yeah, this is a big reason why the difference between the Wii U and XB3 might be much smaller than BG assumes. I really doubt low-to-ultra settings is a fair comparison either way, but it will be interesting to see what happens. I would assume devs like Ubisoft, who are making an exclusive game for the Wii U (freaks), will end up being able to tap into that functionality for multiplatform ports, and that is very comforting to me; games like Assassin's Creed might take advantage of it from the get-go, and AC4 certainly would be able to.
 
outside of my eyerolling at the meandering positivistic theorizing in this thread, though, i *do* eagerly anticipate the E3 updates all up in heah
 
What? I was talking about the "banned" tag. But it didn't have the "lol".

Wait, was he banned because he trolled here?

But he posted...

Good God, I'm so confused.
 
The "raw-power" gap between Wii U and PS4/XBox3 won't be nearly as big as it is between Wii and PS360. There will still be a definitely noticeable difference between the consoles, but it won't be as big as this gen.

Perhaps more importantly, though, the feature set gap will be almost non-existent. The main reason that we've seen barely any ports from PS360 to Wii this gen is not so much the power difference, but the fact that the Wii has a completely different GPU architecture than the XBox360 and PS3. It simply doesn't have the programmable shader capabilities that the other consoles have, so there hasn't been the option of "scaling down" a game to Wii hardware; you basically have to make the game from scratch (as with the Wii versions of the Call of Duty series, for example). With the next gen of consoles, all three will have essentially identical GPU feature-sets, just with a lot more power on the MS/Sony side of things. This means that, even with the power gap, scaled down ports will be quite doable, and we should expect a lot more multi-platform games on the Wii U than we did on the Wii.
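To make the "scaling down" point concrete, this is the kind of thing a multiplatform engine can do once the feature sets match (the preset names and numbers here are made up purely for illustration):

    # Hypothetical quality presets a cross-platform engine might expose; with the
    # same shader feature set everywhere, a port is mostly a matter of dialing
    # these down rather than rewriting the renderer from scratch.
    PRESETS = {
        "nextbox_ps4": {"resolution": (1920, 1080), "shadow_map": 2048, "msaa": 4},
        "wii_u":       {"resolution": (1280, 720),  "shadow_map": 1024, "msaa": 2},
    }

    def pixel_load(preset):
        w, h = preset["resolution"]
        return w * h * preset["msaa"]

    for name, preset in PRESETS.items():
        print(name, pixel_load(preset))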

Also worth keeping in mind is whatever special hardware Nintendo may have added to the GPU. We don't know much in the way of specifics, but it looks like there's going to be some extra units on the GPU to help with lighting. It wouldn't bring the console on par with the others, but for games that make use of it it might bring them a step closer.

The other thing I'd say to anyone worried about the hardware capabilities of the Wii U is to look at some of the screenshots in the Dolphin thread. Nintendo were able to create games that looked that good on hardware that's now a decade old. Even if the console were literally an XBox360 with a Nintendo logo on the front there'd be some amazing things to look forward to, and the fact that it's a significant jump above that is just icing on the cake.

<your_favorite_clapping_gif.gif>

Great post! This should be at the top of every page of the speculation threads.
 
You can think that all you want; it doesn't make it any more true. The fact that we're in a generation of $400-500-at-launch consoles that are expected to just meet the minimum bar of a generational leap forward, and that we're arguing over the MOST MINUTE aspects of hardware as though they'll make a gigantic difference, tells me everything I need to know about how real diminishing returns actually are.
At least for some genres we still have a lot of headroom for graphics to improve; look at GTA IV, for example.

However, the point s/he misses is that, for example, rendering PS2/Wii games in HD can make them look several times better, and it won't cost shit either.

However, that's not what devs will go for. So I guess no, there's not much room for improvement.

That's because it's not what the market expects from a graphical generation leap, particularly since the market's ceiling looks set at 1080p for as long as I can see.

Good God, I'm so confused.
He's not banned, obviously, and if he ever really gets banned, you won't know.
 
I think what next gen can bring us, as far as improving graphics goes, is better lighting and texturing. Right now, IMHO, that's most of what holds back current-gen games. Though I'm not as worried for the Wii U, since we've seen the bird demo showing off some really nice lighting effects.

As far as diminishing returns go, yeah, you double the poly count in some games and it doesn't make a difference to 99% of the people out there. I do think there are some genres that could benefit heavily from more polys, though: sandbox-style games and huge open-world games.

Though again, lighting is one thing I feel has not kept pace with other aspects of GPUs and hardware.

If Nintendo has put some kind of fixed-function hardware focused just on lighting and the like into their GPU, that is going to be a huge, huge deal in shrinking the hardware gap, IMHO.

Actually I think polygon counts make the most visible differences between console generations, at least to the layman.

PS1 looked very rough, and PS2 was able to round out the major things generally with a higher number of polygons. PS3 added more polygons still, but by that point only the smaller details were being enhanced.

Most of the visual improvements in the current gen came from lighting, texture, and post-processing effects, along with the resolution bump. Even the current top-tier PC games like Crysis and The Witcher seem to stand out mostly because of newer lighting and texture effects, while appearing to use about the same number of polygons as the console versions.

The main difference I noticed with the Samaritan demo was that the characters seemed to contain a lot more polygons than anything I've seen on current gen.

And on diminishing returns overall, developers still say we eventually need to reach the visual level of Avatar. Maybe not next gen, but eventually. Another example I like to bring up is the CG in Final Fantasy XIII. That raises the question, though, of whether any game yet looks better than the original Toy Story.
 
What? I was talking about the "banned" tag. But it didn't have the "lol".

Wait, was he banned because he trolled here?

But he posted...

Good God, I'm so confused.

Drinky is a joke account, I do believe :p Half the fun is him posting with the banned tag, as it confuses the hell out of people.
 
Actually I think polygon counts make the most visible differences between console generations, at least to the layman.

PS1 looked very rough, and PS2 was able to round out the major things generally with a higher number of polygons. PS3 added more polygons still, but by that point only the smaller details were being enhanced.

That's not what I was talking about though. I wasn't commenting on previous gens, but on current gen and next gen.




The main difference I noticed with the Samaritan demo was that the characters seemed to contain a lot more polygons than anything I've seen on current gen.

And on diminishing returns overall, developers still say we eventually need to reach the visual level of Avatar. Maybe not next gen, but eventually. Another example I like to bring up is the CG in Final Fantasy XIII. That raises the question, though, of whether any game yet looks better than the original Toy Story.

I don't know the poly counts in Samaritan, but I have to wonder how much of that comes from them using high-res normal maps instead of more polygons.

I'm not saying more polygons don't do anything, but I think the days of just adding some more polygons and a game looking amazingly better are gone.
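For anyone curious how normal maps pull that off, here's a toy sketch of the idea in Python (just an illustration of tangent-space normal perturbation, not a claim about how Samaritan actually does it):

    import numpy as np

    def lambert(normal, light_dir):
        # Simple diffuse term: the brightness depends only on the shading normal.
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)
        return max(np.dot(n, l), 0.0)

    light = np.array([0.3, 0.5, 1.0])

    # A flat, low-poly surface has one geometric normal everywhere...
    flat_normal = np.array([0.0, 0.0, 1.0])
    print(lambert(flat_normal, light))

    # ...but a normal map can tilt the shading normal per pixel, faking bumps
    # and creases in the lighting without adding a single extra triangle.
    mapped_normal = np.array([0.2, -0.1, 0.97])   # sample "read" from a normal map
    print(lambert(mapped_normal, light))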
 
Fuck, there are Nintendo fans here who don't know who Drinky is. How old am I getting?

I've been on GAF for not even 2 years and I don't recall ever seeing this guy.

And I get it now, he has a banned tag but he got banned for real for what I guess are the last few posts he made on this thread.
 
I'm not saying more polygons don't do anything, but I think the days of just adding some more polygons and a game looking amazingly better are gone.
True, but it's going to make for some downright brilliant moments of comedic journalism coverage from gaming media over the next few years.

I've been thinking for a while that I need to start a thread dedicated to collecting these moments of comedic excellence once E3 is over. Heck, I kinda regret not doing so over the past five years...
 
I've been on GAF for not even 2 years and I don't recall ever seeing this guy.

And I get it now, he has a banned tag but he got banned for real for what I guess are the last few posts he made on this thread.

No, Drinky has free rein to troll Nintendo fans.
 
I've been on GAF for not even 2 years and I don't recall ever seeing this guy.

And I get it now, he has a banned tag but he got banned for real for what I guess are the last few posts he made on this thread.
He isn't banned, lol; this is his shtick, and it's pretty funny imo.
 
No, Drinky has free rein to troll Nintendo fans.

Some even speculate that he's an account set up as the mods' collectively shared troll mouthpiece. You can't put that theory anywhere but the junk pile, though, since there's no way to prove it anyway.
 
He knows something.


I said it as soon as I discovered Nintendo was releasing an HD console: if the games look at least as good as Dolphin Wind Waker, I'm good.

Damn it, BG, what do you know? What is your secret!?

I know enough to have a decent understanding of what it should be capable of, and the speculation I'm making now is an attempt to put together a tangible picture of what it would physically look like.

Hmm, my memory failed me; I believed the HD 7870 was weaker than the 6950. Those numbers are quite impressive, but I still wouldn't assume that Nintendo used any existing architecture, which is literally your fault :P since I originally assumed they would base it on an existing architecture, but you persuaded me two or three threads ago that it would be a custom Nintendo design.

I think your high clocks vs. higher clocks point is nitpicking; it means the same thing, since whatever I assume to be average clocks, higher clocks would be needed to reach high ones. I'm 27 and don't understand why I have to play games with words; it's virtually the same point. Nintendo chose to increase the clock speed rather than add more muscle to the GameCube GPU in order to create the Wii chip, and that is exactly what should point to them being able to come up with the idea of giving the Wii U higher clocks instead of just adding to costs with something like 800 SPs for the same effect.

As for the last part: you agree with the possibility of Nintendo being able to push the chip past 600MHz, but you assume they won't because they never have in the past, which I have just proved wrong...

At this point you could be right about the chip, but it wouldn't be because what I'm saying is impossible; your estimations literally come down to "THIS F'ing COMPANY" lol (though I am paraphrasing, as you didn't say that; you only said that they wouldn't add more power that's easily available, because Nintendo doesn't like to do that).

ROFL! You proved that wrong? Really? And how is that even a nitpick? It's not about "playing games with words"; it's about making sure people can't take what I said out of context, which you keep trying to do despite me being very clear about my point. 800MHz is a high clock. Going to 650MHz over 600MHz is a higher clock. I don't know how much clearer I can make that, other than making sure it's known I'm talking about the GPU. Looking specifically at the GC and GPUs around that time (which is what I was referring to, not the change to the Wii), it was originally clocked a little over 200MHz (before the drop), and even then, compared to ATi GPUs of the day, that wasn't a high clock. And I don't get the logic of using the change from Flipper to Hollywood as a point supporting your argument. That was a new console. But even then it had a GPU on a smaller fab, and a clock speed that still wasn't high compared to GPUs of the time. That's what I'm arguing against in your speculation.

And in the end, while I'm not saying your paraphrase is what I meant, in that context a person could apply it if they wanted. :P

Yeah, this is a big reason why the difference between the Wii U and XB3 might be much smaller than BG assumes. I really doubt low-to-ultra settings is a fair comparison either way, but it will be interesting to see what happens. I would assume devs like Ubisoft, who are making an exclusive game for the Wii U (freaks), will end up being able to tap into that functionality for multiplatform ports, and that is very comforting to me; games like Assassin's Creed might take advantage of it from the get-go, and AC4 certainly would be able to.

I'm assuming you missed my original response based on this.

hahaha, you guys will get a dated mid-range mobile gpu solution and you will -- hilariously -- defend it.

TEAM RADEON 4670M UNITE!

Yeah...well your kitty can't dance anymore.




 