WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Nothing. I don't care about it; Krizzx just brought it up.

I asked "you" for nothing, and certainly nothing of Marcan. I said "pros".

I wanted to ask some questions for the people with experience in this thread. I had questions pertaining to the GPU and how the ARM processor relates to it, and homebrew explanations don't answer a single one of them.
 
I asked "you" for nothing, and certainly nothing of Marcan. I said "pros".

I wanted to ask some questions for the people with professional knowledge in this thread. I had questions pertaining to the GPU and how the ARM processor relates to it, and homebrew explanations don't answer a single one of them.

There is no need to lash out at him. He's done nothing to you. You asked about starbuck, and that IIRC is the ONLY source for starbuck.
 
There is no need to lash out at him. He's done nothing to you. You asked about starbuck, and that IIRC is the ONLY source for starbuck.

There is no need to exaggerate my statement into lashing out. I am simply being direct in clarifying that I did not request that, because he was trying to dump his reason for posting that hacking blog from Marcan off on me in his response to Jack Cayman. Don't use me as your excuse.

I asked for a "pro" to answer questions about starbuck as they pertain to the GPU. Pro is the keyword. I'm looking for one of the more experienced posters like blu who has technical knowledge about hardware. How to exploit it and hack the Wii U does not answer any of the questions I had.
 
I asked "you" for nothing, and certainly nothing of Marcan. I said "pros".

I wanted to ask some questions for the people with experience in this thread. I had questions pertaining to the GPU and how the ARM processor relates to it, and homebrew explanations don't answer a single one of them.

Exactly, what would the guy who discovered and indeed named it and cracked its predecessor know of it. Tearing into hardware to exploit its weaknesses definitely doesn't tell you more about it than looking at a picture.

Fwiw there's a teeny 8 bit ARM CPU embedded in Latte that helps with Wii-Wii U GPU compatibility. I don't however know what you're trying to get at with the security core interacting with the GPU.
 
There is no need to exaggerate my statement into lashing out. I am simply being direct in clarifying my purpose, because he was trying to dump his reason for posting that hacking blog from Marcan off on me in his response to Jack Cayman.

I asked for a "pro" to answer questions about starbuck as they pertain to the GPU. Pro is the keyword.

Considering Marcan is a pro...

Or the fact that lherre is also a pro...

Yet you seem to be extremely selective on who you take your information from, and yes, I do believe you "lashed out" at him.

"I asked "you" for nothing, and certainly nothing of Marcan. I said "pros". "

I'm sorry if he or we or most of us in here are not good enough for you, but Marcan is the source for the information on Starbuck.
 
Marcan does not post in this thread and I am not looking for hacking information. I don't want to know how to hack the Wii U. Where was lherre brought up? I don't recall him being part of this. If he had replied I would surely ask him, but he didn't. So why mention him?

Going by your attempts to bring unrelated people in to make an argument, and your insistence on painting me as aggressive when I have said I wasn't, you seem to be trying to stir up more problems where there are none. I am not going to argue with you about this.

I'm tired of the thread being derailed by these ridiculous arguments and posters' personal issues with other posters. Please, stick to the topic.
 
More evidence? Maybe in the past, but not now. I have to disagree. Also, usc didn't "arrive" at any of it. He just jumped on the bandwagon when it came up, as it suited his interests. The original idea came from Beyond3D, and Fourth decided to throw his chips in with it after a while and give his own take on the possibility.

Out of all of the major analysts and contributors in this thread, it was pretty much just Fourth who was pushing for the 176. The rest all had many contradicting arguments. They didn't say he was outright wrong, but that it was unlikely as things stand.

The register banks/cache beside them were found to be variable, which was one of the 2 key points for Fourth's decision. After that was found to not be entirely accurate, he stated that he was going by what was common (something about multiples as well) and said that it could very well be other numbers at that point.

Most of the information we have "now" points to it not being 176 GFLOPS.


The argument for 176 was that
1. The TMUs had fixed-function hardware alongside the ALUs, effectively more than doubling Latte's performance. This was also coupled with an explanation for how backwards compatibility was achieved for the TEV. (first major crux of the argument)
2. That the register banks/cache were consistent across all AMD GPUs (the other major crux of the argument, which was found to not be the case in the end)
3. That the shaders in Latte are more modern and more efficient than the ones in the 360/PS3, allowing higher performance at a lower shader count.

The arguments for other counts are that
1. The Wii U has shown higher levels of shading than the last gen consoles in many scenarios.
2. The TMUs on Latte are 90% larger than the 20 ALU components that AMD produces.
3. The hardware in Latte is more modern, allowing for more efficient design and utilization. (a double-edged sword)
4. The levels of efficiency needed to match, much less exceed, the 360/PS3 shaders were not there at launch, when the dev kits were at their worst and devs were not familiar with the hardware. It would require them to have been utilizing Latte to its fullest from day one for it to be true.
5. It contradicts the statement about no wasted silicon on the die.
6. That fixed function on die was ruled out (I think by either Marcan or B3D), killing the other crux of the theory.
7. That we can't be certain that what's on the die (aside from the eDRAM) is what we think it is.

There were more for the against, but I can't remember them all off the top of my head. I'd have to go back to when that was the topic of discussion, and I don't have time at the moment. If you do a search, you should be able to find them.

The 176 GFLOPS hypothesis is not the most plausible anymore by any means. It's still possible, but there is more against than for.
What exactly are "my interests"? I have followed the Wii U tech threads since it was announced. You just seem like some massive fanboy that jumped in at the last minute to "save the Wii U from haters" or whatever nonsense. No one is hating on the Wii U; it is what it is... Like some insider said after E3 last year, the Wii U is like a 360+.

The reason for the 176 GFLOP GPU:
The power consumption of the console itself, register size, performance and the size of the ALU blocks. I have never said anything about "fixed function" or whatever nonsense that somehow doubles performance.

352 was ruled out for performance reasons, power consumption, register size and the size of the ALU blocks.

So either it's 176 GFLOPS or it's some weird number, but that doesn't match any AMD design or the register size. So that leaves 176 GFLOPS as the most likely option given the information at hand. All this information has been gone over for the last 6 months on here and B3D. I was one of the first to bring up a weird ALU number like 30 per block, but that was shot down and BG even stated this was not correct. So I really do not see any other options.
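For what it's worth, the 176 and 352 figures both fall out of the standard shader arithmetic: ALU count x 2 FLOPs per cycle (one multiply-add) x core clock. A quick sketch, assuming the commonly cited ~550 MHz Latte clock:

```python
def gflops(shaders, clock_mhz, flops_per_cycle=2):
    """Theoretical peak: each shader ALU retires one multiply-add
    (2 FLOPs) per cycle, times the core clock in MHz."""
    return shaders * flops_per_cycle * clock_mhz / 1000.0

# 160 shaders at the commonly cited ~550 MHz clock:
print(gflops(160, 550))   # 176.0
# the 320-shader configuration ruled out above:
print(gflops(320, 550))   # 352.0
```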
 
He does not post in this thread and I am not looking for arbitrary information.
You can ping him on twitter; I did so in the past.

It's not arbitrary, or rather, it's only arbitrary if your intent is to disregard it. He actually tried to register to clear up misconceptions back then but validation took too long. He was actually answering us via twitter, hence the annotated dies and such, in line with what we were trying to do.

In regards to consisting of one-liners you have to discount the fact that it's twitter; he certainly wasn't trying to register here to drop us a tweet.
 
You can ping him on twitter; I did so in the past.

It's not arbitrary, or rather, it's only arbitrary if your intent is to disregard it. He actually tried to register to clear up misconceptions back then but validation took too long. He was actually answering us via twitter, hence the annotated dies and such, in line with what we were trying to do.

Indeed, he was quick to point out even if the cores were low clock, that they were very efficient.
 
It's 'current gen' in terms of performance because everything else blows it out of the water in that regard. It's not 'current gen' in terms of hardware features.

Resolution is the most demanding aspect of GPU performance. 1080p demands 2.25x the performance of 720p and 3.4x the performance of 600p. Xbox One games that run at 1080p should therefore theoretically be possible at 600p on Wii U on those occasions that CPU and memory size differentials aren't an issue and tessellation is kept in check. Just contrast these Crysis 3 comparison shots and tell me you see a huge difference between 'very high' and 'low'. There's little visible difference but I could easily picture Wii U playing it on 'low' at 720p with Xbox One getting 'high' at 1080p and PS4 getting 'very high' at 1080p.
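The resolution ratios above can be checked directly from pixel counts; here I'm assuming "600p" means a 1024x600 framebuffer, which is what makes the ~3.4x figure work out:

```python
def pixels(width, height):
    """Total pixels rendered per frame at a given resolution."""
    return width * height

p1080 = pixels(1920, 1080)
p720 = pixels(1280, 720)
p600 = pixels(1024, 600)   # assumption: "600p" = 1024x600

print(p1080 / p720)   # 2.25
print(p1080 / p600)   # 3.375, i.e. the ~3.4x figure
```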

It's relative and game based is what I'm saying. The performance differential between 'low' and 'very high' in that game is absolutely immense.

No it's not, fps is.
 
You can ping him on twitter; I did so in the past.

It's not arbitrary, or rather, it's only arbitrary if your intent is to disregard it. He actually tried to register to clear up misconceptions back then but validation took too long. He was actually answering us via twitter, hence the annotated dies and such, in line with what we were trying to do.

In regards to consisting of one-liners you have to discount the fact that it's twitter; he certainly wasn't trying to register here to drop us a tweet.

I do not use twitter or facebook. I dislike them greatly. I posed the questions in the CPU thread instead of here if you want to look. Perhaps you could pass them to him on twitter.

I'm not trying to be confrontational, but it seems no progress can be made in this thread without someone trying to intentionally impede it or start flame wars. It is impossible to hold a discussion with most of the current crowd, as their only goals seem to be downplaying anything that sounds next gen about the Wii U and disparaging Nintendo. I have no interest in console wars. Maybe one day, if the other big analysts come back, this thread can move a few inches again.

Maybe we should take another shot at trying to label the die again.
 
To make that perfectly clear: I absolutely can see it being 176 GFLOPS, and the registers are the most obvious hint we have, so it's easy to come to that conclusion. I simply point out that it doesn't quite seem to match other observations.

There's a lot of aspects of the chip that don't line up. If it's a 160 shader part, then it has almost/more than double the transistors (excluding the eDram) of other 160 shader parts. It's also larger than a 160 shader part at 40nm.

There's something going on with that chip beyond the customization we know of. I think for anyone to say one way or the other how many shaders it has or how many flops it is, outside of folks with the documentation telling them so, is pushing an agenda.

(not disagreeing with what you said wsippel just adding to it)
 
Considering Marcan is a pro...

Or the fact that lherre is also a pro...

Yet you seem to be extremely selective on who you take your information from, and yes, I do believe you "lashed out" at him.

"I asked "you" for nothing, and certainly nothing of Marcan. I said "pros". "

I'm sorry if he or we or most of us in here is not good enough for you, but Marcan is the source for the information on Starbuck.
Nothing harmful about a slice of confirmation bias. Love how he continuously tries to weasel his way out. Lashing out or not, he obviously questioned their integrity in regards to their knowledge on the hardware. I wonder if English is his second language and if he just has a bit of a hard time getting his points across, though.

Edit: I have a question. It has probably been covered but this topic is huge and I felt like having something relevant to the topic in my post :P.
If the 360 and PS3 had the same power draw as the Wii U, would they be able to perform at similar levels? I know the architectures of the three vary a lot, but I'm curious just how efficient the Wii U really is.
 
Nothing harmful about a slice of confirmation bias. Love how he continuously tries to weasel his way out. Lashing out or not, he obviously questioned their integrity in regards to their knowledge on the hardware. I wonder if English is his second language and if he just has a bit of a hard time getting his points across, though.

Edit: I have a question. It has probably been covered but this topic is huge and I felt like having something relevant to the topic in my post :P.
If the 360 and PS3 had the same power draw as the Wii U, would they be able to perform at similar levels? I know the architectures of the three vary a lot, but I'm curious just how efficient the Wii U really is.

No. They wouldn't. PS3 and 360 have a higher power draw, so they are less efficient. Considering the Wii U is already more powerful with less power draw, there is literally no way the PS360 could be as capable with less power.
 
Nothing harmful about a slice of confirmation bias. Love how he continuously tries to weasel his way out. Lashing out or not, he obviously questioned their integrity in regards to their knowledge on the hardware. I wonder if English is his second language and if he just has a bit of a hard time getting his points across, though.

Edit: I have a question. It has probably been covered but this topic is huge and I felt like having something relevant to the topic in my post :P.
If the 360 and PS3 had the same power draw as the Wii U, would they be able to perform at similar levels? I know the architectures of the three vary a lot, but I'm curious just how efficient the Wii U really is.

Confirmation bias? I said that it would be fine to pose the question to Marcan, but I don't use twitter and he is not here. I also said that I would ask lherre if he were here, but I see him responding to nothing, and he made no statements to confirm or deny on the matter.

I do not care who gives the information, so long as it is given, and it was not. Itadakimasa tried to weasel out of his actions and pass the buck to me, but I wasn't accepting it. Get your facts right.

Stop trying to twist my actions. This type of garbage is ruining the thread more than anything else. The only bias here is yours. I don't know who you are, but I will not stand for being accused of doing things I haven't done. Go take your beef elsewhere. If you are not discussing the topic, then keep my name and references to me out of your posts.
 
Lherre can't really violate an NDA, so if any questions were posed to him they would have to tiptoe around any questions that revolve around it.

For example, asking him "how many shaders does Latte have" would violate NDA. (Not that the Nintendo docs have that info... =\)

But asking something like "in terms of comparison to the 360, how do you feel Latte bests its performance if it has only 160 shaders" might be something that a developer could answer in a roundabout fashion without breaking the contracts and ending their career.
 
Edit: I have a question. It has probably been covered but this topic is huge and I felt like having something relevant to the topic in my post :P.
If the 360 and PS3 had the same power draw as the Wii U, would they be able to peform at similar levels? I know the architecture between the three vary a lot but I'm curious just how efficient the Wii U really is.

Even all on 45nm processes (and maybe 40nm for the Wii U GPU), no. The PS360 Slim both need higher wattage because they are clocked so much higher. Voltage requirement climbs steeply with clock speed; that's why we've been stuck below ~3 GHz as an average for over a decade, relying instead on more cores, more cache and new architectures for performance.

So that's why, even shrunk down to somewhat modern 45nm fabrication, they still take a significant chunk of power. If you shrunk an ancient 3.6GHz Pentium 4 to an advanced 22nm process, it would still suck a lot of power for its low performance; again, as I said, power draw rises steeply with frequency. Far less than on its original (130nm?) process, but it would still be very inefficient and require lots of power for the performance. Same with Xenon and Cell.
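As a rough illustration of why high clocks are so costly: the standard first-order CMOS model puts dynamic power at P = C * V^2 * f, and since the supply voltage must also rise to sustain higher clocks, power grows much faster than linearly with frequency. A toy sketch, with purely illustrative numbers rather than measurements of any of these chips:

```python
def dynamic_power(capacitance, voltage, freq):
    """First-order CMOS dynamic power: P = C * V^2 * f (arbitrary units)."""
    return capacitance * voltage**2 * freq

# Illustrative numbers only:
base = dynamic_power(1.0, 1.0, 1.0)   # baseline clock and voltage
fast = dynamic_power(1.0, 1.4, 2.0)   # 2x clock, needing ~1.4x voltage
print(fast / base)                    # ~3.9x the power for 2x the clock
```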


The PS360 designs are from the days when lots was sacrificed for raw clock speed. They are in-order designs with long pipelines, etc etc.

Offtopic: I have to say "wow" though. It's been one hundred pages since then and the thread hasn't progressed an inch because of the exact same people who started camping out here to disparage all new info we got. It makes me sad reading back through this thread. There were so many opportunities for advancement, but it kept getting derailed by people trying to defend and promote the PS3 and 360 and downplay any media that favored the Wii U every time something groundbreaking came up.

For some, that's true. For others that's just a straw man, as shooting down overly optimistic theories != downplaying everything positive. It's just trying to stick to reality. And the people doing the opposite of the detractors are just as bad, always clinging to every iota of optimism no matter how unlikely. There's a middle ground between both sets of crazies.
 
Sorry, but I can't answer most of the questions or give more data.

But I know that some people have problems with the Wii U (porting games, the same games that run fine on PS360 without any optimization), others don't, etc; it depends on the game and the things you need to do. I know that this info is not very relevant, but I can't give you more details.

The documentation ... well, let's say it's similar to the GC one.

Edited (a mistake with can and can't)
 
Back to the labeling.

This is the annotation of BG's estimates I made a long time ago. It seems we have nothing else to go on (that won't be attacked), so I will start from back here.
wiiudieblockswip.jpg
NOTE: Ignore the SIMDs and TMUs. Those are likely inaccurate.


Offtopic: I have to say "wow" though. Its been one hundred pages since then and the thread hasn't prgressed an inch because of the exact same people who started camping out here to disparage all new info we get. It makes me sad reading back through this thread. There were so many opportunities for advancement, but it kept getting derailed by people trying to defend and promote the PS3 and 360 and downplay any media that favored the Wii U every time something groundbreaking came up for analysis.

Here's a comment from Fourth about his findings on the shaders.
http://www.neogaf.com/forum/showpost.php?p=59804961&postcount=5917

Fourth's findings on the TMU's
http://www.neogaf.com/forum/showpost.php?p=59514681&postcount=5604

http://www.neogaf.com/forum/showpost.php?p=58101880&postcount=5257
 
Sorry, but I can't answer most of the questions or give more data.

But I know that some people have problems with the Wii U (porting games, the same games that run fine on PS360 without any optimization), others don't, etc; it depends on the game and the things you need to do. I know that this info is not very relevant, but I can't give you more details.

The documentation ... well let's say it's similar to gc one.

Actually, this is fairly helpful. Given the small number of ports that are on the Wii U, it's not hard to figure out which ones had problems and which didn't.

Though, can you at least comment on how much difference there is in shading capability between the consoles? A simple comparison would do, maybe a v4 versus a v6 or something of the like. Also, how does the console's ability to draw polygons compare to the last gen consoles?
 
Back to the labeling.
Offtopic: I have to say "wow" though. Its been one hundred pages since then and the thread hasn't prgressed an inch because of the exact same people who started camping out here to disparage all new info we get. It makes me sad reading back through this thread. There were so many opportunities for advancement, but it kept getting derailed by people trying to defend and promote the PS3 and 360 and downplay any media that favored the Wii U every time something groundbreaking came up for analysis.

This is foolish of me to do, but please stop projecting onto other posters. I've been reading and very occasionally commenting in these threads since the very first WiiU speculation thread and your observation here is NOT what has happened.

It was stated on the first page by several posters that this GPU design is highly custom but in this case custom does not mean better, it means literally custom. It was made specifically for Nintendo to achieve a certain level of performance at a low power. Instead of questioning why in over 100 pages nobody has made any progress, you should note all the work and progress that has been made and perhaps realize all that is obvious has already been found. I am NOT implying some sort of secret sauce here, btw.

I actually caught myself a few days ago noting that USC-Fan, someone who was kind of harsh in early threads, has become one of the most level headed posters in here, same with Phosphor. People have been very kind to you for the most part. The way some people treated USC (and Arkam sp?) in the beginning was pretty bad.

It's been nice to see this and the other WiiU speculation threads go from wild speculation to a calm realization that the WiiU is what it is and that it should be taken for that instead of unrealistic hopes that there is some magic thing everyone is missing.

On a random note, big thanks to Blu, BG, Lostinblue, IdeaMan, fourth_storm, USC, Phosphor, Iherre and everyone who has kept this going. Even those we have lost. One part of the fun story of this console is the interesting roller coaster of arguments and discussion that have come out of it in this thread.
 
Lherre can't really violate an NDA, so if any questions were posed to him they would have to tiptoe around any questions that revolve around it.

For example, asking him "how many shaders does Latte have" would violate NDA. (Not that the Nintendo docs have that info... =\)

But asking something like "in terms of comparison to the 360, how do you feel Latte bests its performance if it has only 160 shaders" might be something that a developer could answer in a roundabout fashion without breaking the contracts and ending their career.

LMAO! They really do keep their "partners" that much in the dark.
 
LMAO! They really do keep their "partners" that much in the dark.

Not really. It's just that specifics like "how many shader units" aren't really relevant to what they present in the docs; they simply aren't their forte. If anyone phones or emails them to ask a question or for help with something, Nintendo is actually very good about responding, whether you're a major publisher or an indie. That's a fairly common misconception on these forums. The problem is that even some developers assume they're completely "closed off". Mind you, what I've seen was pre-launch. I wouldn't know what they look like now.
 
LMAO! They really do keep their "partners" that much in the dark.

It's been like this since the N64 and it's killing them. Granted, it's not as bad as it used to be, but it's still making long-term relationships hard. Some of their stupid policies should be outed and called for what they are.
 
This is foolish of me to do, but please stop projecting onto other posters. I've been reading and very occasionally commenting in these threads since the very first WiiU speculation thread and your observation here is NOT what has happened.

It was stated on the first page by several posters that this GPU design is highly custom but in this case custom does not mean better, it means literally custom. It was made specifically for Nintendo to achieve a certain level of performance at a low power. Instead of questioning why in over 100 pages nobody has made any progress, you should note all the work and progress that has been made and perhaps realize all that is obvious has already been found. I am NOT implying some sort of secret sauce here, btw.

I actually caught myself a few days ago noting that USC-Fan, someone who was kind of harsh in early threads, has become one of the most level headed posters in here, same with Phosphor. People have been very kind to you for the most part. The way some people treated USC (and Arkam sp?) in the beginning was pretty bad.

It's been nice to see this and the other WiiU speculation threads go from wild speculation to a calm realization that the WiiU is what it is and that it should be taken for that instead of unrealistic hopes that there is some magic thing everyone is missing.

On a random note, big thanks to Blu, BG, Lostinblue, IdeaMan, fourth_storm, USC, Phosphor, Iherre and everyone who has kept this going. Even those we have lost. One part of the fun story of this console is the interesting roller coaster of arguments and discussion that have come out of it in this thread.

Level headed? A person who throws insults and attacks when he or something he likes is questioned
http://www.neogaf.com/forum/showpost.php?p=82594717&postcount=10228
http://www.neogaf.com/forum/showpost.php?p=82692909&postcount=10290,
ridicules the Wii and Nintendo at every given opportunity
http://www.neogaf.com/forum/showpost.php?p=82516541&postcount=10151
and claims that things have been confirmed that haven't
http://www.neogaf.com/forum/showpost.php?p=82495529&postcount=10139
and in general makes up data that is favorable to him
http://www.neogaf.com/forum/showpost.php?p=82658105&postcount=10254
and states it like confirmed fact when it isn't remotely true
http://www.neogaf.com/forum/showpost.php?p=82660889&postcount=10259
http://en.wikipedia.org/wiki/Watch_Dogs
and never provides links/media or any explicit technical detail to back up what he says, and he is the most level headed?

Don't ever attack me with nonsense like this again. I'm not here to fight with people. I'm trying to get back to analyzing the GPU, but as I said, every time I start, someone comes in and blindsides it with their personal emotions and personal beef. If you do not like what I have to say, there is an ignore list.

I don't go hunt down and bash people who have done me no wrong just for not having the same opinion as me.
 
Chill out krizzx, that wasn't an attack...more of an observation with no intended malice as far as I can see. You really need to stop being so sensitive I think tbh.
 
Chill out krizzx, that wasn't an attack...more of an observation with no intended malice as far as I can see. You really need to stop being so sensitive I think tbh.

You could have fooled me.

Well, then I will get back to trying to label the GPU components like I was before this.

It would be nice if someone would actually help though. I'm genuinely trying to analyze the GPU as per the purpose of this thread. It is all I ever wanted to do in here. I'm starting to feel like I'm the only one left though.
 
Chill out krizzx, that wasn't an attack...more of an observation with no intended malice as far as I can see. You really need to stop being so sensitive I think tbh.

There was 0 malice. In fact I was trying to note that the amount of overall malice in the thread had decreased pretty significantly.
 
There was 0 malice. In fact I was trying to note that the amount of overall malice in the thread had decreased pretty significantly.

Well, if it wasn't, then my apologies. It's hard to tell when I've been getting gang bashed by people for everything I post for the last week.
 
I'm genuinely trying to analyze the GPU as per the purpose of this thread. It is all I ever wanted to do in here. I'm starting to feel like I'm the only one left though.

I don't think that's the case, as there are plenty of opinions above and interesting comments from multiple GAF users that are not "attack" in nature, but attempting analysis.

When attempting to gauge a component as you are here, you have to take opinions that seem both positive and negative to what you perceive, and attempt to understand why some people have their opinions. Not just positive stuff. Cognitive dissonance gets everyone nowhere, right?

There doesn't seem to be any malice lately, moreso productive discussion from what I've seen the last couple pages.
 
I don't think that's the case, as there are plenty of opinions above and interesting comments from multiple GAF users that are not "attack" in nature, but attempting analysis.

When attempting to gauge a component as you are here, you have to take opinions that seem both positive and negative to what you perceive, and attempt to understand why some people have their opinions. Not just positive stuff. Cognitive dissonance gets everyone nowhere, right?

There doesn't seem to be any malice lately, moreso productive discussion from what I've seen the last couple pages.

As I've said many times, I have no problem with any opinions one way or the other. All I care is that they are backed or properly explored.

It's fine by me if someone says something negative about the Wii U or doesn't like Nintendo, but they should not expect me to accept it with no sources or linked material. If I find it flawed, I will provide a counter argument. Then I usually get attacked following it, i.e. labeled a fanboy or have other insults thrown at me.

I do not throw out analyses that run counter to mine, but if they link no material that corroborates them, or they contradict what I know, then I'm not going to give them much thought. That is why I ask people to explain or provide sources. I like to give them a chance to clarify.

Also, there are people in this thread who have made it clear that they have an alternative agenda to analyzing the GPU. I'm generally not too keen on what those individuals have to say. If you see me ignoring someone, there is a 90% probability that it was one of them. The rest are usually drive-by beggars (i.e. people who don't contribute anything but pop up out of nowhere to try to dismiss what I have to say).

I'm always happy to hear new ideas otherwise, good or bad, so long as they are logically presented.
 
Unfortunately, krizzx, some people cannot provide sources. People's jobs are on the line. It would be extremely productive, instead of dismissing what you perceive as a negative/attack type thing without sources as bunk, to attempt to reason as to why they have that opinion.

As an example, if you disagree with USC-fan (as I have many times in the past, even if I don't necessarily disagree with some of what he's saying right now) you can debate as to why you disagree and present your facts without any sort of antagonistic language or outright dismissal. It may serve the thread for good, quite honestly, to have that type of discussion happen. For the most part, the last couple of pages have been this way.

wsippel is a great example. He has a great amount of knowledge and has a lot of sources in the industry (actually, he's just extremely good at digging - one of the best in this forum), and he says "well maybe it's not so simple as to be a 176gf part. And ______ is why. However, it's also possible that it is such a part. But these metrics aren't necessarily the end-all-be-all because we're seeing so and so in game as efficiency improvements"

Without so much as a whimper of antagonistic language, he presented his opinion on why it could be the case and why it couldn't. That made for a productive back-and-forth with USC-fan, who said "hey, maybe you're right - it's not impossible". That's the kind of discussion that gets people's opinions across, negative and positive, without resorting to any kind of "attacks" - right? It makes for an awesome thread, certainly. We would all be better off dropping talk like "agendas" and extreme viewpoints. If you feel someone has an agenda, instead of saying "you have an agenda!" try debating the person the way wsippel has (i.e. "it's possible you are incorrect, and this is why *insert fact or speculation here*").

edit: and obviously this is directed at the thread in general, not just a single poster, when discussing making this place less of a shitstorm
 
I believe the point was that they were surprised the Wii U could pull it off with its non-SIMD CPU. In other words, it is too soon to define hard limits on the system's capabilities. The concept is not exclusive to Nintendo platforms; I'm quite sure some of the latest 360 and PS3 games have things going on that were beyond what people thought those systems were capable of.
The whole 'non-SIMD' tag is an oversimplification in its own right. Behold:

Bobcat, an 'unquestioned' SIMD design:
  • data path and fp ALUs width: 64 bits
  • register file: 16x 128 bits
  • can do a multiplication and an add per clock per ALU lane (and which is how its theoretical FLOPS are calculated)
  • does scalar fp as a special-case one-lane ALU op

PowerPC 750CL, aka Broadway/Espresso - the 'non-SIMD' CPU:
  • data path and fp ALUs width: 64 bits
  • register file: 32x 64 bits
  • can do a multiply-add per clock per ALU lane (and which is how its theoretical FLOPS are calculated)
  • does scalar fp as a special-case one-lane ALU op

At the end of the day, the only 'more SIMD' aspect of Bobcat over 750cl is that the former can decode twice as many fp ops encoded as vectors, since a single Bobcat op encompasses up to 4 scalar ops, whereas a single 750cl op - up to 2 scalar ops. Conversely, you could argue 750cl has better control granularity over its fp resources.
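To make the "theoretical FLOPS" parentheticals above concrete, here is a rough sketch of the usual peak-FLOPS arithmetic. The clock speeds and lane counts below are my assumptions (commonly cited figures for these chips, not confirmed specs), so treat the numbers as illustrative only.

```python
# Theoretical peak FLOPS, per the framing in the post:
# cores * clock * fp lanes * flops per lane per clock.
# Clocks and lane counts here are assumed, not confirmed specs.

def peak_gflops(cores, clock_ghz, fp_lanes, flops_per_lane):
    """Theoretical peak GFLOPS = cores * clock * lanes * flops/lane/cycle."""
    return cores * clock_ghz * fp_lanes * flops_per_lane

# Espresso (750CL-derived): paired singles = 2 lanes, a fused multiply-add
# counts as 2 flops per lane per clock; ~1.24 GHz is the commonly cited clock.
espresso = peak_gflops(cores=3, clock_ghz=1.24, fp_lanes=2, flops_per_lane=2)

# Bobcat: 64-bit fp datapath = effectively 2 single-precision lanes per clock,
# a mul and an add per lane; 1.6 GHz is a typical SKU.
bobcat = peak_gflops(cores=2, clock_ghz=1.6, fp_lanes=2, flops_per_lane=2)

print(f"Espresso ~{espresso:.1f} GFLOPS, Bobcat ~{bobcat:.1f} GFLOPS")
```

The point being: under this naive accounting, the two designs come out in the same ballpark, which is exactly why the 'non-SIMD' label alone doesn't settle much.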
 

Some fair points, but Jaguar (not Bobcat; maybe you were just bringing that up as an example, but Jaguar is of course the most interesting comparison) supports AVX, SSSE3, SSE4.1, SSE4.2, SSE4A, CLMUL, MOVBE, XSAVE, XSAVEOPT, F16C, BMI, etc. From the little we know, Espresso only appears to have a few instructions added over Broadway, mostly for multicore support. The FPU pipelines of Jaguar are doubled compared to Bobcat, so the new chip comes with 128-bit FPU pipelines capable of executing 128-bit FPU operations in one clock cycle, unlike Bobcat, which required two clock cycles for 128-bit floating-point instructions (as presumably would Espresso).
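On the one-cycle versus two-cycle point, the arithmetic is just datapath width. A minimal sketch (widths only; this deliberately ignores issue ports, latencies, and everything else about the real pipelines):

```python
# Toy model: cycles to issue one FP vector op = op width / pipeline width,
# rounded up. Ignores latency, issue rules, and everything else real.

def issue_cycles(op_bits, pipe_bits):
    """Passes through the FP pipeline needed for one op of op_bits width."""
    return -(-op_bits // pipe_bits)  # ceiling division

print("128-bit op on 64-bit pipes (Bobcat-style):", issue_cycles(128, 64), "cycles")
print("128-bit op on 128-bit pipes (Jaguar-style):", issue_cycles(128, 128), "cycle")
```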

krizzx, buddy, if everyone's been saying the same thing to you as they have on this page, maybe it's time for some introspection and a tonal change. If everyone else appears to be the problem to someone, maybe the real problem is them.
 
Level-headed? A person who throws insults and attacks when he, or something he likes, is questioned,
http://www.neogaf.com/forum/showpost.php?p=82594717&postcount=10228
http://www.neogaf.com/forum/showpost.php?p=82692909&postcount=10290,
ridicules the Wii and Nintendo at every given opportunity,
http://www.neogaf.com/forum/showpost.php?p=82516541&postcount=10151
and claims that things have been confirmed that haven't
http://www.neogaf.com/forum/showpost.php?p=82495529&postcount=10139
and in general makes up data that is favorable to him
http://www.neogaf.com/forum/showpost.php?p=82658105&postcount=10254
and states it like it is confirmed fact as well, when it isn't remotely true,
http://www.neogaf.com/forum/showpost.php?p=82660889&postcount=10259
http://en.wikipedia.org/wiki/Watch_Dogs
and never provides links, media, or any explicit technical detail to back up what he says, is the most level-headed?

Don't ever attack me with nonsense like this again. I'm not here to fight with people. I'm trying to get back to analyzing the GPU, but as I said, every time I start, someone comes in and blindsides it with their personal emotions and personal beef. If you do not like what I have to say, there is an ignore list.

I don't go hunt down and bash people who have done me no wrong just for not having the same opinion as me.
I can't find what you're saying in the posts that you're linking. Where is he ridiculing Nintendo? Where are the insults?

I'm sorry, but despite my benefit of the doubt, and your assertions otherwise, you come across as one of the most hostile posters I've ever encountered on GAF. Look at how you've gone off in this post: someone posts a level-headed and calm analysis, and you respond with a deluge of links to prove you're a victim, using inflammatory language while doing so.

How many times in the past 10 pages have you incorrectly gone after people and been corrected about their intentions, like this? How many people have asked you to calm down now? People who are not out to hate Nintendo, mind you, but people who regularly champion Nintendo and their hardware? You can't keep saying you're not here to fight people when you keep doing just that.

That's a pattern of behavior.

And I know you're already going to say this is an attack, that I'm out to hate you and Nintendo, and you'll no doubt post what my intentions are supposed to be. But whether you believe it or not, I'm making an impersonal observation: you could be so much more constructive as a poster if you let things go.

StevieP wrote a great post. Please heed his advice. The amount of grief you get would go down substantially if you didn't double down and become defensive, but instead confronted people you perceive as attacking you in a calm manner and engaged in constructive dialog.
 
Before anyone gets a banhammer (if mods are even paying attention at this point) we would probably propagate a more productive discussion if we dropped all the previous stuff/personal discussion and what happened in the past and focused on the future and more productive methods of debate - no?
 
I can't find what you're saying in the posts that you're linking. Where is he ridiculing Nintendo?
"Its seem the main performance goal of the wiiu is to be xbox 360+ without breaking BC. That is what they did"
Where are the insults?
"You just seem like some massive fanboy that jump in the last minute to "save to wiiu from haters" or whatever nonsense."


Before anyone gets a banhammer (if mods are even paying attention at this point) we would probably propagate a more productive discussion if we dropped all the previous stuff/personal discussion and what happened in the past and focused on the future and more productive methods of debate - no?

I'm all for it and have been since the beginning. I wasn't the one who drew first blood though, for the record.

Now can we get back to the annotations?
 
Some fair points, but Jaguar (not Bobcat; maybe you were just bringing that up as an example, but Jaguar is of course the most interesting comparison) supports AVX, SSSE3, SSE4.1, SSE4.2, SSE4A, CLMUL, MOVBE, XSAVE, XSAVEOPT, F16C, BMI, etc. From the little we know, Espresso only appears to have a few instructions added over Broadway, mostly for multicore support. The FPU pipelines of Jaguar are doubled compared to Bobcat, so the new chip comes with 128-bit FPU pipelines capable of executing 128-bit FPU operations in one clock cycle, unlike Bobcat, which required two clock cycles for 128-bit floating-point instructions (as presumably would Espresso).
You largely missed what I was saying. Which was: would you call Bobcat a 'non-SIMD' design? If yes - why? If not - why not?
 
Before anyone gets a banhammer (if mods are even paying attention at this point) we would probably propagate a more productive discussion if we dropped all the previous stuff/personal discussion and what happened in the past and focused on the future and more productive methods of debate - no?

Personally, I would love to.
 
I dunno what's going on here, but I was thinking we could use W101 as a test subject for general GPU performance. Kind of a shame it wasn't at a constant 60 fps; then again, there's a vast amount of effects/geometry going on.

Would anyone think Platinum's engine could still use some optimization for better performance? Or is it something a firmware update (a la the recent 3DS one that seemed to take off a lot of processing overhead) can help with?
 
You largely missed what I was saying. Which was: would you call Bobcat a 'non-SIMD' design? If yes - why? If not - why not?

No, even Bobcat supports dedicated SIMD extensions, from MMX all the way up to SSE4A; those are SIMD instructions. I guess I am missing the point. If you look at the base core designs without considering the special instructions at all, sure, I see where you are coming from, but Bobcat has all these SIMD instructions on top of that. What does Espresso have? Paired singles. That does give it SIMD support in a very primitive sense, but compared to Jaguar or anything else, yes, I would say it's a relatively non-SIMD core.

Sorry if I misunderstood, I don't see where you were going with that at all. No, I would not call Bobcat non-SIMD at all.
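As an aside on the width/granularity tradeoff both of you are circling: it can be sketched with a toy instruction count. This is purely illustrative (it models nothing about either chip's real scheduler), but it shows why a wider op isn't unambiguously "more SIMD" for all workloads.

```python
# Toy illustration: wider vector ops retire more scalar work per
# instruction, but waste lanes when the data size isn't a multiple
# of the width. Not a model of any real pipeline.

def vector_ops_needed(n_scalars, width):
    """Vector instructions needed to cover n scalars at a given SIMD width."""
    return -(-n_scalars // width)  # ceiling division

for n in (4, 6, 7):
    two_wide = vector_ops_needed(n, 2)   # paired-singles style (750CL)
    four_wide = vector_ops_needed(n, 4)  # 4-wide SSE style (Bobcat)
    idle = four_wide * 4 - n             # idle lanes in the 4-wide case
    print(f"{n} scalars: {two_wide} two-wide ops vs {four_wide} four-wide ops"
          f" ({idle} idle 4-wide lanes)")
```

For vectors of 3 or 7 floats, the 2-wide machine wastes fewer lanes, which is the "better control granularity" angle; for multiples of 4, the 4-wide machine simply issues half as many instructions.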
 
I dunno what's going on here, but I was thinking we could use W101 as a test subject for general GPU performance. Kind of a shame it wasn't at a constant 60 fps; then again, there's a vast amount of effects/geometry going on.

Would anyone think Platinum's engine could still use some optimization for better performance? Or is it something a firmware update (a la the recent 3DS one that seemed to take off a lot of processing overhead) can help with?

What I'm most curious about with The Wonderful 101 is the polygon counts.

The enemies look extremely rounded. I remember people trying to claim it was CG and not really running on the Wii U when it was first announced as Project P-100.
[screenshot: the-wonderful-101_006.jpg]
 