Wii U CPU |Espresso| Die Photo - Courtesy of Chipworks

That chunk would have to be the actual size of Broadway's L2. I can't remember if we've discussed it, but that's a good question nevertheless. My wild guess would be they'd do it fairly, i.e. they'd meet the requirements, as the alternatives would just not work; fooling around with cache latencies is something immediately noticeable.

At 256KB, it would be pretty small. But unlike the GPU shots, I don't think anyone has noticed anything that could be SRAM here.
 
It wouldn't be smaller than the 6x 32KB L1 SRAM pools, and those are clearly noticeable.
 

Hmm, good point. Googling it, I only found my own comments from weeks/months ago, lol. My theory was that since the L2 on Broadway can only run at the CPU's max clock speed at most, and likely at half of that since IBM usually did things that way back then, maybe the eDRAM just stays at a higher clock to mask the latency.
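For what it's worth, here's a back-of-the-envelope sketch of that theory in code; every number in it except the commonly reported ~1.24 GHz core clock is a made-up placeholder, not a measured figure:

```cpp
#include <cstdio>

// Back-of-the-envelope sketch of the "clock the eDRAM higher to hide its
// latency" idea. The clock ratio and cycle count are hypothetical placeholders.
int main() {
    const double cpu_clock_mhz   = 1243.0;   // commonly reported Espresso core clock
    const double edram_clock_mhz = 2486.0;   // hypothetical: eDRAM macro at 2x core clock
    const double edram_cycles    = 8.0;      // hypothetical: access time in eDRAM cycles

    // Latency as the CPU core sees it, expressed in core cycles.
    const double cpu_visible_cycles = edram_cycles * (cpu_clock_mhz / edram_clock_mhz);
    std::printf("CPU-visible L2 latency: %.1f core cycles\n", cpu_visible_cycles);
    // With the 2x ratio assumed above, an 8-cycle eDRAM access appears to the
    // core as ~4 cycles, i.e. in the same ballpark as a small SRAM L2 would be.
}
```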
 
Time to clear up what I said here, here, and in some other messages a few months ago, as I've already revealed it on IRC.

One of the Wii U's cores wasn't used throughout the development of several launch window games. It's an "engine-related issue", meaning it comes down to the way the teams behind those titles programmed their engines for the Wii U CPU. It wasn't widespread, not universally seen on all the games, but witnessed on at least a few of them. The developers found and resolved this problem mere months/weeks prior to release, and most of them gained a nice increase in FPS. It's one origin of the huge boost in framerate I reported a long time ago that some studios managed to get (from 30 fps to 60 fps for some games), along with new dev kits, etc.

You heard it right: a whole core of the Wii U CPU wasn't put to use for most of the dev cycle of several titles, before it was fixed.

It's rather telling of either:
- the crucial need for studios, accustomed to the HD twins' framework, to adapt their code to the Wii U's specifics
- or the perfectible state of the documentation/dev kits/SDKs at the time
- or [insert your own conclusion/guess derived from this info]

Interesting. That would actually explain a LOT.

This actually makes me even more impressed with the Wii U hardware, given that they were able to get performance equal to the last-gen consoles with only 2 cores of the CPU. I feel that this needs to be covered by major game journalists. It would clear up a lot of misconceptions from the launch period.



I still want to see these benchmarks from blu that were mentioned earlier, though.
 

It wasn't something widespread though, just witnessed on several games (I know of at least two; maybe it concerned more).
 

If you can't name specifics, what manner of games? Were they something we would consider intensive? Or the sort of thing that didn't really need to squeeze out all the juice?
 

Well, in all the games involved, the activation of a whole core resulted in a clear increase in framerate. They are moderately CPU-intensive, I presume, with the average AI, physics and other parameters to handle. It's a less noticeable leap than it would have been if, let's say, 33% of the GPU had been unavailable for these titles until it was unlocked, but it was still a very welcome boost for the teams once they fixed this issue.
 
Wonder if this issue might have been experienced by DICE and 4A too.

That's what I've been wondering since I heard this info: how widespread it was. The teams behind those projects are rather competent, so overlooking a whole core must imply something (perhaps one of the hypotheses I mentioned above, or simply that they messed up big time).
 

Or they had awful tools, which we already know was the case.
 

Did you ever get word on when the tools finally got competent? Miyamoto said during an investor Q&A that it was Mid 2012, but Criterion made it sound like post launch. It would be interesting to see how much those early tools handicapped some of the launch ports that didn't turn out.
 
Nope... unless we all have a sudden-onset loss of counting skills. He's saying it used 2/3. We can physically count the cores in the picture.

I know, I just find it strange that, if developers knew the Wii U had three cores, they didn't realize one of them was not being used.

Not saying Ideaman is wrong or anything.
 

Because there isn't a magical mirror that lets you see it. The only thing you know is what code you are sending to a device. You can't actually pull out a magnifying glass and see what it's doing with the code or how. Where are you coming from with this?

It's a problem that is often run into with PC games, where a game is only using 1 core when it should be using 2/4, or using 2 when it should be using 1, or where splitting the code causes problems, like Fallout 3 on the PC originally had. Then there are issues where the game may work right on one dual-core processor but not on another with a different architecture. This was often run into between AMD and Intel CPUs, and it is apparently the problem here. Most of the games ported to the Wii U were made for the 360/PS3 CPUs, which were of a completely different design. Tools made for those consoles will not work for the Wii U CPU out of the box. They have to be made/tweaked, and that takes time.

Games using the wrong number of cores or using cores improperly is common. If every game released perfectly and every dev utilized 100% of the hardware's potential while making no mistakes, then there wouldn't be patches. This should be common knowledge to anyone familiar with computer hardware.
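As a purely hypothetical illustration of that kind of mistake (nothing here comes from a real Wii U or 360 engine), picture a job system whose worker count was hardcoded for the previous target and never revisited during the port:

```cpp
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical sketch of a port-time core-count bug. The worker count was
// tuned for the previous target and never revisited when the engine moved.
constexpr unsigned kLegacyWorkerCount = 2;   // hypothetical value from the old platform

void worker_loop(unsigned id) {
    // ... pull AI/physics/etc. jobs from a queue ...
    std::printf("worker %u running\n", id);
}

int main() {
    // Bug: on a 3-core machine this still spawns the legacy number of workers,
    // so with only a mostly-idle main thread plus two workers, one core can sit idle.
    unsigned workers = kLegacyWorkerCount;
    // Fix: derive the count from the hardware (or the platform SDK's own query),
    // e.g. workers = std::thread::hardware_concurrency();

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back(worker_loop, i);
    for (auto& t : pool) t.join();
}
```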
 

This is how I've understood what my sources told me. Basically, they mostly worked at a certain level of abstraction, removed from how the CPU processed the code it received. I suppose at one point their engineers studied the CPU and how it handled their engine more thoroughly (less abstraction, more "to the metal"), and found this problem.
 

Quite strange. Looking at developer technical documents, like those from Guerrilla, they are able to lock certain processes to certain cores, so that directly implies they know what's going on on which cores. Not being perfectly optimized is one thing, but not knowing which cores are loaded at all? What kind of dev tool doesn't show per-core load?
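For reference, this is roughly what "locking a thread to a core" looks like with generic Linux/POSIX calls; console SDKs have their own equivalents for this, so treat it purely as an illustration of the concept, not as anything from the Wii U or PS4 toolchain:

```cpp
// Minimal Linux/POSIX sketch of pinning a thread to one core.
// Build with: g++ -pthread pin.cpp
#include <pthread.h>
#include <sched.h>
#include <cstdio>
#include <thread>

void pin_current_thread_to_core(int core_id) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core_id, &set);
    // Ask the scheduler to keep the calling thread on the requested core only.
    int err = pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    if (err != 0)
        std::fprintf(stderr, "pthread_setaffinity_np failed: %d\n", err);
}

int main() {
    std::thread physics([] {
        pin_current_thread_to_core(1);   // e.g. dedicate core 1 to physics jobs
        // ... physics work ...
    });
    physics.join();
}
```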
 

You mean with the PS4? Guerrilla is owned by Sony right? They should probably have the complete spec information in such a case.
 
Because there isn't a magical mirror that lets you see it. The only thing you know is what code you are sending to a device. You can't actually pull out a magnifying glass and see what it's doing with the code or how. Where are you coming from with this?

It's a problem that is often run into with PC games, where a game is only using 1 core when it should be using 2/4, or using 2 when it should be using 1, or where splitting the code causes problems, like Fallout 3 on the PC originally had. Then there are issues where the game may work right on one dual-core processor but not on another with a different architecture. This was often run into between AMD and Intel CPUs, and it is apparently the problem here. Most of the games ported to the Wii U were made for the 360/PS3 CPUs, which were of a completely different design. Tools made for those consoles will not work for the Wii U CPU out of the box. They have to be made/tweaked, and that takes time.

Games using the wrong number of cores or using cores improperly is common. If every game released perfectly and every dev utilized 100% of the hardware's potential while making no mistakes, then there wouldn't be patches. This should be common knowledge to anyone familiar with computer hardware.


Yes, I understand, but it's a console. This issue always comes up because each generation, each competing console comes out with a new design. How can any developer know from the outset the potential of a console?

Instead of developers making statements that the Wii U CPU was weak, etc., why didn't they simply tell the truth and say, "Well, our engines are not optimized for this hardware yet," or, "We still need time to learn the architecture"?

These premature statements probably hurt the console. They could have made other developers jump to conclusions and skip the Wii U, and hurt potential sales.

By now, any developer worth their salt must know Nintendo's philosophy on console design. Don't underestimate it based on the numbers. That goes for any console, actually.
 
You mean with the PS4? Guerrilla is owned by Sony right? They should probably have the complete spec information in such a case.

They were able to show the per-core load in real time. You think every non-Sony-owned developer is blocked from that functionality? Not likely. Imagine the butthurt that would cause. And it would put them at a disadvantage compared to Microsoft's awesome dev tools.

For what it's worth, I've worked in platform-specific development environments; there's no "magic mirror" about it, and performance information is always a big component. If the story was just that they hadn't optimized well, sure, I'd have no reason to doubt it, but not seeing that one core was going unused? Very, very strange.
 
Yes, I understand, but it's a console. This issue always comes up because each generation, each competing console comes out with a new design. How can any developer know from the outset the potential of a console?

Instead of developers making statements that the Wii U CPU was weak, etc., why didn't they simply tell the truth and say, "Well, our engines are not optimized for this hardware yet," or, "We still need time to learn the architecture"?

These premature statements probably hurt the console. They could have made other developers jump to conclusions and skip the Wii U, and hurt potential sales.

By now, any developer worth their salt must know Nintendo's philosophy on console design. Don't underestimate it based on the numbers. That goes for any console, actually.

Most of what you say actually has a simple explanation. Only a few devs legitimately criticized the CPU, and that was only during launch, which is the only time this problem was supposedly an issue.

Unlike the other consoles, the Wii U is mostly custom-made, and its parts don't use architectures that most devs are familiar with. In the case of the original Xbox and the 360, it was pretty much like programming for a PC. That is why the 360 is the lead dev platform for multiplatform games more often than not: it's the easiest to develop for. Knowing what may be possible and actually doing it are two different matters.

The same thing happened with the GC. Many devs outright said it couldn't do things like bump mapping, normal mapping and bloom, but Factor 5 had all of those things in its launch game for the console. Being a developer doesn't automatically make you a magical programming genius. Most of them have little more knowledge than I do, if not less. The only difference between us is the title and the paycheck. They are pros in name, but coding still requires skill and understanding. There is no universal standard that would make you skilled on all architectures and all platforms on the planet.

Most of the devs who made the claim about the CPU didn't even work with it. The Metro: Last Light dev is an example. They actually stated that they hadn't done any work with the console, but people still took their statement to heart. The rest are all partnered with EA, and EA has it out for Nintendo right now.

The other things don't correlate with the issue. Philosophy has nothing to do with the grunt work.

Of course, none of that would have mattered at all in this issue, as it was a software problem from the start, not a hardware problem. There was no way of knowing that it wasn't using one core right off. It's like telling a computer to find flaws with itself: if the code executes, then it reports no flaws. It doesn't know if it executed the way you wanted it to without you telling it, and you would have to be aware of a problem to look for it first.
 
Quite strange. Looking at developer technical documents, like those from Guerrilla, they are able to lock certain processes to certain cores, so that directly implies they know what's going on on which cores. Not being perfectly optimized is one thing, but not knowing which cores are loaded at all? What kind of dev tool doesn't show per-core load?

Yes, that's what I thought too (on the assignment of certain processes to specific cores). But how could my sources have overlooked, for the main part of their development, that one whole core was unused, if the Wii U dev tools allowed the classic per-core load monitoring that you can have on PC or other systems? Or perhaps their games mostly used the "main" core with the most cache, and they were more lax and indulgent with the use of the two other cores, assigning less important stuff to those, before finding that only one additional core was put to use?
 
They were able to show the per-core load in real time. You think every non-Sony-owned developer is blocked from that functionality? Not likely. Imagine the butthurt that would cause. And it would put them at a disadvantage compared to Microsoft's awesome dev tools.

For what it's worth, I've worked in platform-specific development environments; there's no "magic mirror" about it, and performance information is always a big component. If the story was just that they hadn't optimized well, sure, I'd have no reason to doubt it, but not seeing that one core was going unused? Very, very strange.

Well, I don't think any developer would be locked away from this functionality, at least not after the launch window. I thought you were speaking of specs on paper, not of what's available while the games were being worked on.

I have no idea what I am talking about
 
Yes, that's what I thought too (on the assignment of certain processes to specific cores). But how could my sources have overlooked, for the main part of their development, that one whole core was unused, if the Wii U dev tools allowed the classic per-core load monitoring that you can have on PC or other systems? Or perhaps their games mostly used the "main" core with the most cache, and they were more lax and indulgent with the use of the two other cores, assigning less important stuff to those, before finding that only one additional core was put to use?


So the cores in the Wii U are specialized, and developers made the mistake of treating them as homogeneous?
 

From what we know, the cores aren't too different, but perhaps at that time, with the dev tools of that period and their engine, they mostly leveraged one core and hadn't properly taken advantage of the other two before finding that one wasn't used. These are just wild guesses on how they could have overlooked a core. Perhaps they simply messed up big time.
 

Well, if what you are saying is true, and going by the official confirmation of poor support and documentation at the start of the Wii U's life, it was more than likely purely the fault of bad dev kits.
 
From what we know, the cores aren't too different, but perhaps at that time, with the dev tools of that period and their engine, they mostly leveraged one core and hadn't properly taken advantage of the other two before finding that one wasn't used. These are just wild guesses on how they could have overlooked a core. Perhaps they simply messed up big time.

Generally when doing multi-core stuff, what's done is that you spawn another thread. However, this thread does not necessarily run on a different core from other threads in the system (since you can spawn many more threads than cores). It could just be that developers, expecting N cores, spawned N-1 work threads (separate from the main thread) for jobs and expected all the cores to pick up work. However, it's possible that there was some defect in the scheduler in the OS, and no work was ever picked up by one of the cores. That would be pretty obvious when looking at performance counters though.
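Here's a minimal generic C++ sketch of the pattern blu describes (nothing Wii U-specific): the code only decides how many threads to spawn, and which physical core each one lands on is entirely the scheduler's call, which is why an idle core would only show up in per-core performance counters rather than anywhere in the source:

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

// Sketch of the usual "main thread + N-1 workers" job setup. Nothing here
// names a core: thread-to-core placement is left to the OS scheduler, which
// is why a core that never picks up work is invisible at this level.
std::atomic<long> jobs_done{0};

void worker() {
    for (int i = 0; i < 100000; ++i)
        jobs_done.fetch_add(1, std::memory_order_relaxed);  // stand-in for real jobs
}

int main() {
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 3;                 // fall back to an assumed 3-core target
    std::vector<std::thread> workers;
    for (unsigned i = 0; i + 1 < cores; ++i)   // N-1 workers; the main thread is the Nth
        workers.emplace_back(worker);
    for (auto& t : workers) t.join();
    std::printf("jobs completed: %ld\n", jobs_done.load());
}
```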
 
From what we know, the cores aren't too different, but perhaps at that time, with the dev tools of that period and their engine, they mostly leveraged one core and hadn't properly taken advantage of the other two before finding that one wasn't used. These are just wild guesses on how they could have overlooked a core. Perhaps they simply messed up big time.


I think Nintendo went with a heterogeneous design:

From a purely hardware-technology point of view, using dedicated cores for specific tasks is the best choice in terms of efficiency, as it will yield a simpler structure and less power consumption, while delivering higher performance.
 
Today has certainly been progressive.

I wish someone with a homebrewed Wii would test the integer performance vs. floating-point performance. I'm still curious about how much better the architecture in the Nintendo CPUs handles integers than floating point.
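In case anyone with homebrew wants to try, something along these lines would be a starting point; it's plain C++ with placeholder loop counts, and on real Wii/Wii U homebrew you'd time with the PowerPC time base rather than std::chrono:

```cpp
#include <chrono>
#include <cstdio>

// Rough integer-vs-float throughput sketch. On Wii/Wii U homebrew you would
// read the PowerPC time base instead of std::chrono, and you should check the
// generated assembly so the optimizer doesn't delete the loops.
template <typename T>
double run(T seed_in) {
    volatile T seed_v = seed_in;          // volatile read defeats constant folding
    const T seed = seed_v;
    volatile T sink{};                    // volatile write keeps the result "used"
    auto t0 = std::chrono::steady_clock::now();
    T acc = seed;
    for (long i = 0; i < 100000000L; ++i)
        acc = acc * seed + static_cast<T>(1);   // one multiply + one add per iteration
    sink = acc;
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}

int main() {
    // unsigned so wraparound is well defined; 0.9999f so the float chain stays finite
    std::printf("integer loop: %.3f s\n", run<unsigned>(3u));
    std::printf("float   loop: %.3f s\n", run<float>(0.9999f));
}
```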
 
Wow, thanks for the info, Ideaman. It's hard to believe that there were current-gen ports running on just two CPU cores so close to launch. I would imagine the devs that realized this were like, "Seriously?!! :O"

I understand if Darksiders 2 suffered from this issue... among other things.
 
Did you ever get word on when the tools finally got competent? Miyamoto said during an investor Q&A that it was Mid 2012, but Criterion made it sound like post launch. It would be interesting to see how much those early tools handicapped some of the launch ports that didn't turn out.

For my sources, if you remember the WUSTs, I said that the situation started to really improve a short time before E3 2012. They received the dev kit labeled "mass production" a few weeks before the show. There were additional improvements in the dev tools after that, but I would say that for some studios, they got a good (read: not self-hindered and not hugely unoptimized) development framework for the Wii U before last year's E3.

Wow, thanks for the info, Ideaman. It's hard to believe that there were current-gen ports running on just two CPU cores so close to launch. I would imagine the devs that realized this were like, "Seriously?!! :O"

I understand if Darksiders 2 suffered from this issue... among other things.

Yes it was their reaction :)
 

I can see this being an issue for exclusives, but not really with ports.

Because I thought the argument was that most issues with Wii U ports were congruent with issues found on the other consoles as well. Meaning, drops in frame rate happened in the same areas, etc., in all versions. So regardless of how much more powerful the Wii U CPU actually is, with one core or three, because the games were not optimized, these issues would pop up.
 

The concurrent problems were the result of laziness in the porting; a straight port, in other words. That does not rule out other problems from porting to new hardware at all.
 
You mean we have actual benchmarks for Espresso?
We don't. We do extrapolation from Broadway, as that's (1) the best we can do, and (2) not without reason.
 
I can see this being an issue for exclusives, but not really with ports.

Because I thought the argument was that most issues with Wii U ports were congruent with issues found on the other consoles as well. Meaning, drops in frame rate happened in the same areas, etc., in all versions. So regardless of how much more powerful the Wii U CPU actually is, with one core or three, because the games were not optimized, these issues would pop up.

Actually.. I think the games that Ideaman knows this type of info about are current-gen ports.
 
So I have read every single page of this thread over the last 2 months. Please correct me if I'm wrong, but is this a simplistic condensation of the consensus this thread has reached: "Wii U is marvelously efficient but builds off of withered tech that would allow smexy visuals if a dev were willing to code for it explicitly"?
 
Not necessarily "withered tech"; more like capped tech, in the sense that...

say, AMD does a GPU and does 5 flavours of it: integrated low-end, low-end with cooling, mid-range, high-end, and a variant like Titan or those "ultimate card gets more ultimate, GHz Edition!" versions, and Nintendo invested heavily in the integrated low-end one, in order to improve efficiency.

It's like saying they tuned a 1.0 engine, so it's never gonna be a race car; but it's very efficient nonetheless. Of course, 1.0 "tuning" usually goes for fuel efficiency, and that's no different here.

That's for the GPU; it's so radically changed over what has been on the market that it can't be called withered, and it's based on a conservative production line (we're talking about different stuff, like LPG being employed on this by TSMC). The CPU is somewhat withered, yes (eDRAM instead of SRAM for the L2 cache is a bold change, as is SMP), but still competent enough for the performance they're going for.

I still think they should have gone with 4 cores; judging by the silicon it takes and the power it must consume, it would have been painless and inexpensive to implement, and could perhaps have helped down the line.
For my sources, if you remember the WUSTs, I said that the situation started to really improve a short time before E3 2012. They received the dev kit labeled "mass production" a few weeks before the show. There were additional improvements in the dev tools after that, but I would say that for some studios, they got a good (read: not self-hindered and not hugely unoptimized) development framework for the Wii U before last year's E3.
What do your contacts say about the Wii U now?

Because third-party support crumbled: do they want to work on it, do they have no opinion for or against it... I'm trying to understand what's going on, perhaps from a limited sample of people, but still.

Is there any goodwill left?
 
Something I missed which warrants a response:

I'm hoping for something from someone with credentials, not on a forum; like when Timothy Lottes wrote that cool blog post about coding through APIs vs. coding to the metal and why consoles can punch so far above their weight. I don't think it will happen, though, because as that Lottes blog proved, saying anything about consoles is like sticking your dick in a wasp nest.
What's wrong with forums? The moment we decide forums are shitpits only good for emotion dumping is the moment forums do become shitpits only good for emotion dumping.

Let's not forget all the research this forum did over the past year around the advent of the new console gen. I, just like a few other gaffers who've invested personal time in these curiosity matters, could have posted my little pass-time research initiative on a blog, on b3d, or on my personal site. I, along with those other gaffers, decided to post it on GAF, because I have sufficient respect for the community here. Are you telling me I shouldn't have?
 
So I have read every single page of this thread over the last 2 months. Please correct me if I'm wrong, but is this a simplistic condensation of the consensus this thread has reached: "Wii U is marvelously efficient but builds off of withered tech that would allow smexy visuals if a dev were willing to code for it explicitly"?

I think that's how I kind of view it.

Its biggest weakness isn't so much being weak (although it is in comparison to the upcoming consoles) but that it is hard to work with, because nothing is standard about it. It truly is a Nintendo-only game box as far as big-budget games go.
 
Porting doesn't seem like a hard thing to do, be it an up-port or a down-port, seeing as it has the feature set for next gen in there.

It's a little different, yes, and most devs aren't willing to learn it (as always), but it's nowhere near as different as the PS2, PS3, GC or Wii were from the PC/Xbox. Not by a long shot.

Which leaves us with ill will towards it somewhere along the line; perhaps every step of the way.
 
Something I missed which warrants a response:


What's wrong with forums? The moment we decide forums are shitpits only good for emotion dumping is the moment forums do become shitpits only good for emotion dumping.

Let's not forget all the research this forum did over the past year around the advent of the new console gen. I, just like a few other gaffers who've invested personal time in these curiosity matters, could have posted my little pass-time research initiative on a blog, on b3d, or on my personal site. I decided to post it on GAF, because I have sufficient respect for the community here. Are you telling me I shouldn't have?

Forums are great for many things. However, relying on them for accurate information is a terrible idea. Nobody goes to a forum for medical diagnoses or investment advice, or at least I hope not. And what I was looking for was a good breakdown of the merits of the Wii U before ordering one. So what I was hoping for in terms of forum posts was some links to actual non-anonymous sources.

The Lottes thing really bugged me because the egotistical douchebaggery of some anonymous forum types, including a couple at b3d, shot down an interesting topic. Lottes deleted the blog and stopped talking.

And look at the reaction to some random tweets from a completely unknown EA software engineer who was only answering some questions on Twitter to see what I mean. If people were less weird on forums, he would have probably answered more questions on his Twitter.

But no, the quotes get posted on a forum, a shitstorm ensues, the guy shuts up and deletes his comments, just like Lottes. So yeah, sometimes forums suck and actually hamper information dissemination.

So yeah, the good with the bad...
 
Porting doesn't seem like a hard thing to do, be it an up-port or a down-port, seeing as it has the feature set for next gen in there.

It's a little different, yes, and most devs aren't willing to learn it (as always), but it's nowhere near as different as the PS2, PS3, GC or Wii were from the PC/Xbox. Not by a long shot.

From what I understand, it is a lot different from the x86 setup of the other guys. Even Nintendo is having problems.

The Xbox 360 was PPC, not x86 like a PC, btw. It also required a lot of work for developers to understand. That's why the best-performing games came at the end.
 
Forums are great for many things. However, relying on them for accurate information is a terrible idea. Nobody goes to a forum for medical diagnoses or investment advice, or at least I hope not. And what I was looking for was a good breakdown of the merits of the Wii U before ordering one. So what I was hoping for in terms of forum posts was some links to actual non-anonymous sources.

The Lottes thing really bugged me because the egotistical douchebaggery of some anonymous forum types, including a couple at b3d, shot down an interesting topic. Lottes deleted the blog and stopped talking.

And look at the reaction to some random tweets from a completely unknown EA software engineer who was only answering some questions on Twitter to see what I mean. If people were less weird on forums, he would have probably answered more questions on his Twitter.

But no, the quotes get posted on a forum, a shitstorm ensues, the guy shuts up and deletes his comments, just like Lottes. So yeah, sometimes forums suck and actually hamper information dissemination.

So yeah, the good with the bad...

He didn't answer questions; he ranted like an 11-year-old fanboy in the school playground and included lies to make his point.
 

His tweets were in response to some questions from some guy. He didn't just start tweeting for no reason.

And calling him a "lying 11-year-old fanboy" for a couple of blunt 140-character tweets is the kind of egotistical forum douchebaggery I'm talking about. Cuz you know, maybe from his perspective, as a highly trained software engineer with years of experience making certain types of multiplatform games, the Wii U is crap.

But whatever, people like getting angry and indignant I guess, and I can't change that, so let's move on.
 

And that's why, if he can't say anything that comes across intelligently in 140 characters, he shouldn't have said anything at all. In an industry in which companies go bust and people get laid off on a regular basis, talking such trash about a company that he may one day end up having to apply to for a job is just foolish.
 

??

I guess that's why we only ever hear from PR people, and there are 4 PR workers for every 1 journalist. Cuz they know how to tweet intelligently. What a great world of NDAs and PR drones we live in.
 
His tweets were in response to some questions from some guy. He didn't just start tweeting for no reason.

And calling him a "lying 11-year-old fanboy" for a couple of blunt 140-character tweets is the kind of egotistical forum douchebaggery I'm talking about. Cuz you know, maybe from his perspective, as a highly trained software engineer with years of experience making certain types of multiplatform games, the Wii U is crap.

But whatever, people like getting angry and indignant I guess, and I can't change that, so let's move on.
The problem is that he made some throwaway comments without any context whatsoever. We don't know if he ever actually worked on the platform or even just read the documentation. We don't know why he thinks it's "crap", we don't know how exactly it's supposed to be "weaker than 360", and his "3rd parties don't make money on Nintendo platforms, only Mario and Zelda sell" statement is factually wrong and reeks of bias.
 