I was looking for this earlier.
http://i.minus.com/iPYwCuRuliShf.gif
God I hate this thing.
Is it his outfit, or the old computer monitor?
Or the fact that it has been used a zillion times the past week?
Everything. Who is this? Why the smug face? Where does it come from? Why does this represent NeoGAF!?
Just... why!?
You young'uns need to watch his mob movies.
http://youtu.be/Eht92AO-QII
Nice video to put diminishing returns into perspective. The PS1 was very rough, the PS2 put in a lot more detail, and then the PS3 smoothed it out. For most genres I don't think next gen will look all that mind-blowingly better. Maybe for GTA-like games, but for the rest I doubt it; just look at PC games. Better looking, sure. Mind-blowing, nope.
I think what next gen can bring us, as far as improving graphics goes, is better lighting and texturing. Right now, IMHO, most of what holds back current-gen games is just that. Though I'm not as worried for the Wii U, since we've seen the bird demo demonstrating some really nice lighting effects.
As far as diminishing returns go, yeah, you double the poly count in some games and it doesn't make a difference to 99% of the people out there. I do think there are some genres that could heavily benefit from more polys: sandbox-style games and huge open-world games.
Though again, lighting is one thing I feel has not kept pace with other aspects of GPUs and hardware.
If Nintendo has put some kind of fixed-function hardware focused on just lighting, etc. into their GPU, that is going to be a huge, huge deal in shrinking the hardware gap IMHO.
What I mean by cutting out the "GCN function" is simply not designing the chip around compute the way GCN so obviously is. If it were designed to ignore compute in favor of pure gaming performance (of course it would still do compute), you would have a card closer to the GTX 680 (closer in philosophy, not performance). That means it would outperform GCN per shader, per Hz, per watt in games, while leaving GCN's compute functionality on the sideline in Nintendo's version of a 28nm or even 32nm chip.
I also don't care about the PS4's and Xbox 3's architecture, but if it is GCN, Nintendo can make up a lot of ground, since GCN is a heavy architecture (size/performance is hurt in order to focus on compute units).
You yourself believe that Nintendo could be using a 640sp chip. You give it a clock of 600MHz for no reason other than that you think it should be around there, and I've shown that it could be as high as 800MHz if it were designed similarly to the E6760. I've also conceded that it might not be as high as 800MHz, but there is nothing locking it in at 600MHz if it's a smaller chip like we both assume; they should at the very least be able to push 650-700MHz on a 32nm process.
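Quick back-of-envelope so we're at least arguing about the same numbers. This assumes AMD's usual rating of 2 FLOPs per SP per cycle and the 640sp count we've both been tossing around; none of it is confirmed Wii U hardware.

```python
# Back-of-envelope only: AMD rates its GPUs at 2 FLOPs (one multiply-add) per SP per cycle.
# The 640sp count and these clocks are this thread's speculation, not confirmed Wii U specs.
def gflops(shaders, clock_mhz, flops_per_sp_per_cycle=2):
    return shaders * flops_per_sp_per_cycle * clock_mhz / 1000.0

for clock in (600, 650, 700, 800):
    print(f"640sp @ {clock} MHz -> {gflops(640, clock):.0f} GFLOPs")
# 600 MHz gives ~768 GFLOPs, 700 MHz ~896, 800 MHz ~1024.
```

So the jump from 600MHz to 700-800MHz is a meaningful chunk of throughput, which is why I keep pushing on the clock question.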
And as for the PS2-to-Xbox power gap, yes, it will be smaller. You and I both expect the same functionality between the Wii U and the other consoles (as far as graphical effects go). We also both know this:
1. His post wasn't just about cost; it also points to heat. A smaller chip produces less heat, and the Wii U chip will be much smaller than the Xbox 3/PS4 chips without losing more than half the power, thanks to "losing" GCN in favor of performance.
Having fixed functionality would be absolutely great, and if so, there is no way that Skyrim video should be taken seriously at all... My E-350, an ~80 GFLOPs GPU, can play Skyrim at those settings (~20fps), while the card needed to run Skyrim on ultra is a 6970, a 3 TFLOPs+ card. The difference between those GPUs is so much larger than where we'd put the Wii U and Xbox 3 that it comes off as a huge joke...
I think you have gathered a lot of information, but you simply don't know how to use it. You aren't comparing properly against the hardware you are expecting out of these boxes; even if the other boxes were 4x the performance and we tossed out fixed functionality, you'd end up with a medium-vs-ultra settings gap on Skyrim.
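Just to put the scale of that Skyrim comparison in numbers (using the rough figures quoted above, which are not exact for any of this hardware, and the 4x gap is purely hypothetical):

```python
# Rough ratio check of the Skyrim comparison above. The ~80 GFLOPs (E-350) and "3 TFLOPs+"
# (HD 6970) figures are the ones quoted in this thread; the 4x console gap is hypothetical.
e350_gflops = 80
hd6970_gflops = 3000
skyrim_video_gap = hd6970_gflops / e350_gflops     # ~37x in raw GPU throughput
assumed_console_gap = 4                            # hypothetical Wii U vs Xbox 3 multiplier
print(f"Skyrim low-vs-ultra comparison spans ~{skyrim_video_gap:.0f}x raw throughput")
print(f"Assumed next-gen console gap: ~{assumed_console_gap}x")
```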
http://youtu.be/Eht92AO-QII
Nice video to put diminishing returns into perspective. The PS1 was very rough, the PS2 put in a lot more detail, and then the PS3 smoothed it out. For most genres I don't think next gen will look all that mind-blowingly better. Maybe for GTA-like games, but for the rest I doubt it; just look at PC games. Better looking, sure. Mind-blowing, nope.
Maybe when you only look at racing games.
Which is why I personally believe the gap between the Wii U and XB3/PS4 is going to be much smaller than PS2 vs. Xbox. That sort of difference was not only performance based (the Xbox being ~2x the specs); the PS2 also couldn't do the lighting and texture effects that made Xbox games like Halo and Halo 2 stand out so well.
I really think not bringing GCN to the Wii U can also shrink the chip drastically, which is what I've been talking about with GCN being a heavy architecture. A smaller chip can run at higher clocks without taking a large hit to TDP... something greater than 640sp @ 600MHz should be completely possible.
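For what it's worth, the first-order reasoning on clocks vs. TDP is just the textbook dynamic-power relation. The relative capacitance and clock numbers below are made up purely to illustrate the point, and this ignores leakage and the fact that higher clocks usually need more voltage.

```python
# First-order CMOS dynamic power: P ~ alpha * C * V^2 * f. Illustrative numbers only -
# this ignores leakage and assumes voltage stays flat, which it usually doesn't.
def dynamic_power(rel_capacitance, voltage, freq_mhz, alpha=1.0):
    return alpha * rel_capacitance * voltage**2 * freq_mhz

baseline = dynamic_power(rel_capacitance=1.00, voltage=1.0, freq_mhz=600)  # "heavier" chip @ 600 MHz
smaller  = dynamic_power(rel_capacitance=0.80, voltage=1.0, freq_mhz=750)  # ~20% smaller chip @ 750 MHz
print(f"Smaller chip at 750 MHz -> ~{smaller / baseline:.2f}x the baseline dynamic power")
# ~1.00x: roughly the same power budget despite the higher clock, under these rosy assumptions.
```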
See, if most Wii U games were to look like that, I'd be satisfied.
Is this the evil guy from Home Alone?
What's going on...
I'd be satisfied if they looked like the first one but were insanely good games.
Gahiggidy said:
The next person who interviews Reggie or someone else from Nintendo really needs to ask them flat out what the power draw of the Wii U will be. Tell him that people like to plan ahead for their electricity budget, and with the system launching this year they have to have the TDP figure. American families struggling to get by will appreciate this info.
Reggie has no idea, I bet. The only thing he seems to know is the buzzword "1080p".
No you wouldn't.
This was the generation of shaders coming into their own, and a tremendous leap over the last.
Indeed. As for next generation... it will be known for GPUs with advanced fixed-function lighting.
You wouldn't.
edit: For clarification, if all Wii U games looked like the first pic, even if they were each the GOTY, I admit I would be disappointed at all the bitching on forums and among my friends and such. So that is, in effect, a lack of complete satisfaction.
And since we're on the topic of graphical leaps.
Can't wait to find out.
Billy Hatcher 2 Wii U confirmed.
...will be known for GPUs with advanced fixed-function lighting.
Which apparently the Wii U should do a lot better in. But yeah, I see the upcoming generation as the "polishing" generation: better/cleaner textures, better lighting, etc.
It'll probably be hard to compare as the style will be totally different than the tech demo.
However, maybe they will see how much praise the demo got and make it resemble that art, but I doubt they care what we like, nor should they, really.
Gameplay and design matter most, but no way in this fucking HD age, and this year, would I be happy with games that looked like THAT.
Aye. Maybe when I get a flat panel television set in a few years, my impression of such things will change.
I do respect and appreciate changes in horsepower that actually do add to the fun of games, mind you. I just don't really care about it that much on a gut level. I'm probably going to be going back to Nethack a couple times a year until the day that I die.
I guess for me it's the context of how you're saying it that's throwing me off. As I said before, I don't see AMD making a non-GCN GPU at 28nm. At the same time, I don't see a philosophy similar to CUDA coming from "stripping away GCN"; VLIW4 didn't show it. Likewise, by "stripping away GCN" the ALUs wouldn't/shouldn't change, meaning their performance should still be the same, so there would be no gain from taking that away. IMO this is where the flaw seems to be coming from, and it's working more off of assumption. I mean, VLIW5 sounds like what you are talking about, and it saw poor ALU utilization after the change to DX10. I guess we are saying the same thing in different words, as the GPU as I see it would have the "GCN cut out", but I don't see some kind of performance gain because of it.
See above.
No, I gave a reason you're clearly ignoring. As I said, I gave it that clock based on the fact that Nintendo doesn't use high clock speeds. That's not just a random assumption I made. And while it would be nice for them to push for a higher speed while using a smaller process, there's nothing there to say they will other than hope.
I'm assuming you didn't mean to quote EC as he wasn't the one I responded to. That said I don't see how you got that from DC's post. He focused more on cost, heat, and size. Not graphical abilities.
This suggests you didn't see some of my other posts, while still assuming some kind of performance gain just from taking away GCN's compute functionality. Although my assumed max gap range would be 1080p on high settings vs. 720p on low settings. But it seems you're trying to argue for a max gap much smaller than what it will actually be, and I'm not seeing the logic behind your support for that.
Bgassassin's awkward non-response response to my post just confirms he is a Wii U insider. Thanks!
See, if most Wii U games were to look like that, I'd be satisfied.
You should be satisfied then.
If only, if only.
God the Gamecube Zelda tech demo was ugly.
What an awful art style.
We'll be saying the same thing about the E3 2011 Zelda tech demo in 12 years.
Gahiggidy said:
The next person who interviews Reggie or someone else from Nintendo really needs to ask them flat out what the power draw of the Wii U will be. Tell him that people like to plan ahead for their electricity budget, and with the system launching this year they have to have the TDP figure. American families struggling to get by will appreciate this info.
I love this man.
I realize mixing partial information from Ideaman and bgassassin is kind of like mixing numbers from Famitsu and Media Create, BUT: what's confusing me right now is that it sounds like it isn't easy to take a 720p game from the X360 and make it 1080p on the Wii U, yet it should be easy to take a 720p game from the Wii U and make it 1080p with even better image quality on the future Xbox. That makes the latter gap seem much larger than the former, though most other things we hear wouldn't indicate that. Does the Wii U have some sort of resolution bottleneck?
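For reference on why that resolution jump is expensive in either direction, it's just pixel math, nothing console-specific:

```python
# Simple pixel arithmetic: 1080p pushes 2.25x the pixels of 720p per frame, before any extra
# image-quality work. This says nothing about either console's actual hardware or bottlenecks.
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080
print(f"1080p has {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")   # 2.25x
```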
The performance gain comes from an assumption that the compute hardware in GCN takes away from its overall gaming performance. I base this on the GTX 680, which is much faster than the GTX 580 thanks in large part to shedding its compute performance.
The other, much more noticeable performance gain, which is correct, is in the smaller size of the chip: the die would be much smaller than if they used GCN. For instance, a 6000-series part performing similarly to the HD 7770 would, as DCking said, be under 100mm^2, while the HD 7770 is 123mm^2. That means higher clock speeds at the same TDP.
You are also wrong about Nintendo not using higher clock speeds to gain performance in the past... The Wii's GPU is clocked something like 50% faster than the GameCube's, and other than memory sizes, the GPU didn't change with more TEV units, right? So the assumption that Nintendo wouldn't do that with the Wii U is a pretty big assumption indeed, especially when it is basically free performance.
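For reference, the commonly cited clocks back that 50% figure up; this is just a sanity check, not new information:

```python
# Commonly cited GPU clocks: GameCube "Flipper" ~162 MHz, Wii "Hollywood" ~243 MHz.
flipper_mhz = 162
hollywood_mhz = 243
print(f"Wii GPU clock is {hollywood_mhz / flipper_mhz:.2f}x the GameCube's")   # ~1.50x, i.e. +50%
```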
And yes, I did mean to quote EC, though I thought he was saying that the Xbox had effects that the PS2 did not, which added to the distance between them. That is my point: the Wii U has the same effects as the Xbox 3, and if there are fixed-function components, that distance should be much more superficial.
The only real way you are going to have the distance you are describing is if they actually do have separate GPGPU chips so that they can do much heavier physics, allowing for stuff like individual hair movement, particle effects, and just a more lively world.
Also, I am assuming that Nintendo will use a 28nm or 32nm GPU; either should be enough to break the 600MHz you are assuming without raising the TDP very much at all...
Having said all of this, I don't know if they will; this is speculation on what they COULD do within the Wii U casing. I do believe that they will target the highest GFLOPs they can from the GPU without pushing the GPU past half of the system's TDP.