WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I also find what is going on with the other games for the new consoles interesting. Another downgrade has been reported: Ryse is 900p native, not 1080p. I find this useful for putting the latest Wii U games' performance in the context of their rivals, as a lot of 720p60 games on this low-powered console seem even more impressive to me. The Wii U is capable of handling some games at 1080p, but going by Shin'en, it is not worth it, as at 720p they can add more effects. Maybe with more optimization this could change, but for now I still believe this is an excellent 720p machine.
Resolution is not the end-all, be-all measure of a console's power, so I'm not sure what use comparing games across platforms merely by resolution is meant to illustrate. It completely ignores every other factor that affects a game's IQ.
 
IIRC, DXTC1 through 5 were not mandatory in DX6, but they were (the entire set or thereabout) in DX8.
That actually makes sense, because DirectX 6 was from 1998 and it took until 2001 for OpenGL 1.3 to come out and make it mandatory; if it had been mandatory in DirectX, it wouldn't have made much sense to omit it from OpenGL for so long.

DirectX 6 most likely added support for the first cards that had, or could have had, S3TC. That said, the GeForce 256 and GeForce 2 were capable of it, although not at the GeForce 256's launch in 1999; support came from May 2000 onwards.

I'm guessing they licensed it at that point and it just happened that their cards could pull it off without falling back to software; sadly it was limited to 16-bit color textures, and that seems to have lasted until the GeForce 5 series launch, so the Xbox still suffered from it.
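For reference, here are the numbers behind why S3TC/DXT1 was worth licensing in the first place, and where the 16-bit limitation comes from. This is just the standard DXT1 block layout, nothing vendor specific:

```python
# DXT1 (S3TC) stores each 4x4 texel block in 8 bytes:
#   2 bytes - color0 as RGB565 (a 16-bit color, hence the precision gripe above)
#   2 bytes - color1 as RGB565
#   4 bytes - sixteen 2-bit indices picking color0, color1, or two interpolated colors
BLOCK_BYTES = 8
TEXELS_PER_BLOCK = 4 * 4

bits_per_texel = BLOCK_BYTES * 8 / TEXELS_PER_BLOCK    # 4 bits per texel
print(bits_per_texel)                                   # 4.0
print(32 / bits_per_texel)                              # 8:1 versus uncompressed RGBA8888
print(24 / bits_per_texel)                              # 6:1 versus uncompressed RGB888

# Example: a 1024x1024 texture
print(1024 * 1024 * 4 / 2**20)                          # 4.0 MB uncompressed (RGBA8888)
print((1024 // 4) * (1024 // 4) * BLOCK_BYTES / 2**20)  # 0.5 MB as DXT1
```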
 
Resolution is not the end-all, be-all measure of a console's power, so I'm not sure what use comparing games across platforms merely by resolution is meant to illustrate. It completely ignores every other factor that affects a game's IQ.

And I am not stating it is the only thing, am I? I'm just sharing the info; it's actually interesting since everyone was expecting every next-gen game to be 1080p60. IMO, as far as the IQ you are mentioning goes, there is nothing shown in KI that is so extraordinary the Wii U could not do it.

The 720p60 Wii U titles I am referring to look great IMO, especially Bayo2, MK8, SSB and Sonic.

There is nothing in the launch lineup the Wii U could not do with some scaling, and GTA V goes to show that even current-gen consoles are pretty capable.

Inb4, I am not saying the Wii U is in the same league as the PS4/XB1, but I do believe that most games can be ported in a satisfying way.
 
You're going to see a lot of 720p games and everything in between. Everywhere.
Really? Because so far, that's not the case.

Ryse and Killer Instinct =/= All of next gen.

Seeing as every gen (not counting Wii) had a resolution increase, I have no reason to believe PS4/XBO won't follow that.

Edit: At worst, some multiplats on XBO might drop the resolution while the PS4 is mostly 1080p since after all, PS4 is more powerful. But even that's a guess.
 
Really? Because so far, that's not the case.

Ryse and Killer Instinct =/= All of next gen.

Seeing as every gen (not counting Wii) had a resolution increase, I have no reason to believe PS4/XBO won't follow that.

You don't have to lower your expectations now if you don't want to. They will be lowered for you. If you think you're getting 1080p for most games this 8th generation, I have a bridge in Alaska to sell you. (720p won't be uncommon.)
Edit: and just to reiterate the point. The resolution doesn't really matter when comparing what you're rendering. It's what you do with it, and what you're pushing out of the GPU.
 
While using a fraction of the wattage.

It really is impressive in its own right. Just not the kind of impressive to turn heads.


Yup. That about sums it up, really.

No idea why they weren't just more open about its capabilities. It's a fairly impressive little box imo. They weren't aiming for a powerhouse with this being their best effort; for better or worse they aimed for a 360+ (or PS3.5) with a more progressive architecture and feature set (to avoid the Wii's dev isolation) and a unique input to differentiate it. It might have worked well too, with better marketing/advertising and had MS stuck to their original plans (if rumours are to be believed) - I'm betting Nintendo anticipated being much closer to the XBone than it ended up.

The problem with Nintendo's "secrecy" complex is that to some people it comes across like they were trying to deceive us into thinking it was more powerful, or that they had something to hide.

Anyhoo, let's kick back until some more actual info comes out about the GPU :)
 
I don't remember the exact words, but I think the better description for the Wii U was already made in this thread ... a current gen machine refined, or the perfect current gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.

Of course, I think someone won't agree ...

Yeah, a perfected 7th gen system. It's a real shame we won't see GTA V running on it to judge what difference the system's advancements would make to the overall graphics.
 
I don't remember the exact words, but I think the better description for the Wii U was already made in this thread ... a current gen machine refined, or the perfect current gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.

Of course, I think someone won't agree ...

Now you've done it.
 
I agree with the earlier comment about this thread needing to be renamed. It should be changed to the "let's bash Nintendo and the Wii U fellowship" thread, because that is apparently the only form of opinion that goes unmolested. If someone were to just muck up a PS3/PS4/Xbox One thread with nothing but console bashing, they'd be banned in a heartbeat.

All those raging fits people have been throwing over the past 10-20 pages about derailing the thread with off-topic statements, but apparently it's perfectly fine so long as the opinions doing it are ones they are fond of. Half of this page is nothing but that, yet no one says a word. This is a new height of hypocrisy. The GPU isn't even arbitrarily mentioned so they could at least pretend to be on topic.

I would suggest someone just make a new topic and analyze Latte in there and leave them to have a bashing party in here, but I'm 100% certain that all of the same individuals would just move to that one and continue to derail that thread with off-topic Nintendo bashing, complete with congratulatory fist bumping for anyone who joins in.

Can someone recommend a place where it's possible to have an on-topic technical discussion without the thread being dragged down by people taking repeated shots at business models, promoting their preferred hardware, or endlessly hammering its producer with condescending labels with no intention of making any constructive contribution on the hardware being discussed?

I really hate all of this console war garbage.
 
That actually makes sense, because DirectX 6 was from 1998 and it took until 2001 for OpenGL 1.3 to come out and make it mandatory; if it had been mandatory in DirectX, it wouldn't have made much sense to omit it from OpenGL for so long.
S3TC also made its debut in the Savage3D card in 1998. 3dfx made its own FXT1 in 1999 and DXTC support was made mandatory in DX7 if my memory serves correctly. I don't remember any evolution in texture compression since 3Dc appeared in 2004 and it's a basic function in modern GPUs. I'm not entirely sure why it's even being brought up.

A decent read can be found on the (then current) subject here for those interested.
 
I also agree somewhat. I think the 32MB of EDRAM has not been fully put to good use.

The RAM is 2x PS360, but the eDRAM is 3 times that of the X360 (the same amount as the XB1's eSRAM), plus there's the focus on so many caches.

It's more than that. The PS3/360 both have RAM reserved for the OS, so it's not 512 for games, it's 512 total. How much is reserved in each one, you would have to ask someone else.

The Wii U has over double the memory for games and it has a little over 3 times the eDRAM of the 360. It's also a possibility that in the future the Wii U may have some of the other 1GB freed up for dev use.

That is not really helping us analyze Latte, though. What I'd like to look into, in terms of how the RAM relates to the GPU, is the supposed performance boost.
http://www.gamechup.com/ps4-sony-earlier-thought-about-slow-gddr5-edram-1088gbs/
 

At this point I have no clue what krixx is ranting about. Apparently being impressed with what they've achieved while still being realistic about the system's capabilities is console war stuff. But then again, we've had people who firmly believe ray-traced lighting is feasible, that a smattering of fixed-function shaders is akin to an entirely new fixed-function feature set, and that the entire thing taken as a whole is more than marginally more powerful than consoles on shelves today, while using a fraction of the electricity on a similar if not outright identical die size.

It's not enough to look at it and go "Very impressive efficiency gains there Nintendo and AMD." It must have a hidden reserve. We... just... can't... see... where...
 
And that brings us back around to "it might have a comparable feature set to the PS4/One, but nowhere near the brute capability to put it to good use." I mean, even if it has tessellation units comparable to the PS4/One, it likely doesn't have the poly-crunching capability to put them to good use.

While using a fraction of the wattage.

It really is impressive in its own right. Just not the kind of impressive to turn heads.

This right here. As lherre mentioned, it's a perfect iteration of a current gen machine. And it's doing great graphics (Bayonetta 2, for example, looks so, so good) with what they have under the hood.

Don't mind him. Trust me, it's for the best. Many have tried to talk some sense into him, all have failed.
 
And I am not stating it is the only thing, am I? I'm just sharing the info; it's actually interesting since everyone was expecting every next-gen game to be 1080p60. IMO, as far as the IQ you are mentioning goes, there is nothing shown in KI that is so extraordinary the Wii U could not do it.

The 720p60 Wii U titles I am referring to look great IMO, especially Bayo2, MK8, SSB and Sonic.

There is nothing in the launch lineup the Wii U could not do with some scaling, and GTA V goes to show that even current-gen consoles are pretty capable.

Inb4, I am not saying the Wii U is in the same league as the PS4/XB1, but I do believe that most games can be ported in a satisfying way.

I think you're mostly right. I think the biggest unknown regarding the Wii U and resolution is how the core is configured (in terms of shaders/TMUs/ROPs) rather than what its actual feature set is. Its processing throughput is the big question. We do know that the Xbox One has 16 ROPs to the PS4's 32 ROPs, which makes the PS4 better suited to 1080p resolution. 8 ROPs has been the de facto standard for the previous generation's 720p resolutions, so there's no question that the Wii U is very well suited to that. The bigger question is whether it has 8 or 16 ROPs, which is harder to quantify without disregarding its transistor budget and die size.

At the two extremes (for example), it could have a 320:16:8 core or it could have 160:32:16 (shader processors:TMUs:ROPs). The former configuration is better suited to shader heavy games at 720p while the latter is better suited to shader light games at 1080p. Personally, I'm inclined to think (and hope) it's the former because it would make multiplats more feasible than the latter.
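To put rough numbers on those two hypothetical configurations, here is a back-of-the-envelope sketch. The ~550 MHz GPU clock is an assumption (the commonly reported figure), and both configurations are illustrative, not confirmed:

```python
def peak_rates(shaders, tmus, rops, clock_ghz=0.55):
    """Back-of-the-envelope peak rates for a VLIW-style AMD part.
    clock_ghz=0.55 is an assumed figure, not a confirmed spec."""
    gflops = shaders * 2 * clock_ghz   # one multiply-add = 2 FLOPs per ALU per clock
    gtexels = tmus * clock_ghz         # bilinear texels per second (G/s)
    gpixels = rops * clock_ghz         # pixels written per second (G/s)
    return gflops, gtexels, gpixels

print(peak_rates(320, 16, 8))    # shader-heavy guess:  ~352 GFLOPS, ~8.8 GT/s, ~4.4 GP/s
print(peak_rates(160, 32, 16))   # ROP/TMU-heavy guess: ~176 GFLOPS, ~17.6 GT/s, ~8.8 GP/s
```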

Edit:

And enough with the self-victimization going on in here. This is not a 'bash Nintendo' thread but a 'GPU technology' thread because some of us happen to love Nintendo. Disagreement is fine but illusions don't belong here.
 
Krizzx - lherre has access to the dev kit. He has the documentation. What he says rings true. Why not, instead of a rant, ask him questions you think he could answer without breaking the (very strict) Nintendo NDA?
 
Krizzx - lherre has access to the dev kit. He has the documentation. What he says rings true. Why not, instead of a rant, ask him questions you think he could answer without breaking the (very strict) Nintendo NDA?

Yess!! I have some questions.

The only gripe I have is that, while I have my feet on the ground, some people believe what we are seeing is the Wii U maxing out. To the more tech-savvy people: do you think there is more to squeeze out of the hardware? Throw out your estimations!! And please save your "but 3rd parties will not care".

Also, the X360 has 10MB of eDRAM; what do you think 32MB brings to the table in comparison? Does it work the same, or does having more bring more complexity into the equation? Must new techniques be used?

How does the Wii U eDRAM relate to the XB1 eSRAM? Will they work similarly?
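For context on what I mean, here is the rough framebuffer math I have in mind (assuming common 32-bit color and depth formats; just my back-of-envelope numbers, not anything from documentation):

```python
MB = 2**20

def target_mb(width, height, bytes_per_pixel, msaa=1):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel * msaa / MB

# Xbox 360 case: 720p color + depth with 4xMSAA is roughly 28 MB,
# far beyond its 10 MB of eDRAM, hence Xenos had to render in tiles.
print(target_mb(1280, 720, 4, 4) + target_mb(1280, 720, 4, 4))   # ~28.1

# Wii U case: even a plain 1080p color + depth pair (~15.8 MB) fits in 32 MB.
print(target_mb(1920, 1080, 4) + target_mb(1920, 1080, 4))       # ~15.8
```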
 
I agree with the earlier comment about this thread needing to be renamed. It should be changed to the "let's bash Nintendo and the Wii U fellowship" thread, because that is apparently the only form of opinion that goes unmolested. If someone were to just muck up a PS3/PS4/Xbox One thread with nothing but console bashing, they'd be banned in a heartbeat.

All those raging fits people have been throwing over the past 10-20 pages about derailing the thread with off-topic statements, but apparently it's perfectly fine so long as the opinions doing it are ones they are fond of. Half of this page is nothing but that, yet no one says a word. This is a new height of hypocrisy. The GPU isn't even arbitrarily mentioned so they could at least pretend to be on topic.

I would suggest someone just make a new topic and analyze Latte in there and leave them to have a bashing party in here, but I'm 100% certain that all of the same individuals would just move to that one and continue to derail that thread with off-topic Nintendo bashing, complete with congratulatory fist bumping for anyone who joins in.

Can someone recommend a place where it's possible to have an on-topic technical discussion without the thread being dragged down by people taking repeated shots at business models, promoting their preferred hardware, or endlessly hammering its producer with condescending labels with no intention of making any constructive contribution on the hardware being discussed?

I really hate all of this console war garbage.

Well, in the end, all that garbage was set in place to get to "lol WiiU is current gen", so you are spot on. Knowing that the Wii U IS next-gen is just corrosive for some, especially when they feel entitled to the "value" of what they are paying for.
 
The Wii U, irrespective of its raw power, is an 8th generation console. Let's throw that hogwash away. When people say stuff like "last gen", either they are trolling or they are attempting to approximate its raw power. Which is, as IdeaMan once put it, "360+". I.e. closer to the previous-gen systems in raw power, with more modern architecture, more of everything, and a design paradigm that is more like the other 8th gen systems. Just, obviously, with less of everything.
 
Krizzx - lherre has access to the dev kit. He has the documentation. What he says rings true. Why not, instead of a rant, ask him questions you think he could answer without breaking the (very strict) Nintendo NDA?
Thing is Lherre can't/won't tell us anything more. Testing......

Lherre, how many shaders does Wii U have?
 
Yess!! I have some questions.

The only gripe I have is that some people believe what we are seeing is the Wii U maxing out. To the more tech-savvy people: do you think there is more to squeeze out of the hardware? Throw out your estimations!! And please save your "but 3rd parties will not care".

As hard as I am on it, I don't think the Wii U is even remotely maxed out. PS3/360 ports won't be using the tessellator, for example, and even though it's not DX11 compliant it still exists in the hardware. Also, bear in mind how many years it took developers to fully exploit the PS3 SPUs and 360 eDRAM. Modern hardware demands a lot of R&D and people are still finding new ways of exploiting even those DX9 GPUs. People just need to keep their expectations in check.

To put it bluntly, I expect that there will be Wii U games 5 years from now that look better than some high-end PC games do today. But they will also run at a significantly lower resolution than those PC games.
 
At this point I have no clue what krixx is ranting about. Apparently being impressed with what they've achieved while still being realistic about the system's capabilities is console war stuff. But then again, we've had people who firmly believe ray-traced lighting is feasible, that a smattering of fixed-function shaders is akin to an entirely new fixed-function feature set, and that the entire thing taken as a whole is more than marginally more powerful than consoles on shelves today, while using a fraction of the electricity on a similar if not outright identical die size.

It's not enough to look at it and go "Very impressive efficiency gains there Nintendo and AMD." It must have a hidden reserve. We... just... can't... see... where...

I literally punched myself during that escapade.
 
Well, in the end, all that garbage was set in place to get to "lol WiiU is current gen", so you are spot on. Knowing that the Wii U IS next-gen is just corrosive for some, especially when they feel entitled to the "value" of what they are paying for.

If only the person who called it "a current gen machine refined or the perfect current gen machine" was a developer with a dev kit or documentation.
Oh wait, they were? Well, I'll be damned.
 
If only the person who called it "a current gen machine refined or the perfect current gen machine" was a developer with a dev kit or documentation.
Oh wait, they were? Well, I'll be damned.

It's 'current gen' in terms of performance because everything else blows it out of the water in that regard. It's not 'current gen' in terms of hardware features.

Resolution is the most demanding aspect of GPU performance. 1080p demands 2.25x the performance of 720p and 3.4x the performance of 600p. Xbox One games that run at 1080p should therefore theoretically be possible at 600p on Wii U on those occasions that CPU and memory size differentials aren't an issue and tessellation is kept in check. Just contrast these Crysis 3 comparison shots and tell me you see a huge difference between 'very high' and 'low'. There's little visible difference but I could easily picture Wii U playing it on 'low' at 720p with Xbox One getting 'high' at 1080p and PS4 getting 'very high' at 1080p.

It's relative and game based is what I'm saying. The performance differential between 'low' and 'very high' in that game is absolutely immense.
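Here is the pixel-count arithmetic behind those multipliers, for anyone who wants to check it (the "600p" figure assumes a 1024x600 target, which is one common interpretation):

```python
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p  = 1280 * 720    #   921,600
pixels_600p  = 1024 * 600    #   614,400 (assumed 600p target)

print(pixels_1080p / pixels_720p)   # 2.25
print(pixels_1080p / pixels_600p)   # 3.375, i.e. roughly the 3.4x quoted above
```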
 
It's 'current gen' in terms of performance because everything else blows it out of the water in that regard. It's not 'current gen' in terms of hardware features.

Resolution is the most demanding aspect of GPU performance. 1080p demands 2.25x the performance of 720p and 3.4x the performance of 600p. Xbox One games that run at 1080p should therefore theoretically be possible at 600p on Wii U on those occasions that CPU and memory size differentials aren't an issue and tessellation is kept in check. Just contrast these Crysis 3 comparison shots and tell me you see a huge difference between 'very high' and 'low'. There's little visible difference but I could easily picture Wii U playing it on 'low' at 720p with Xbox One getting 'high' at 1080p and PS4 getting 'very high' at 1080p.

It's relative and game based is what I'm saying. The performance differential between 'low' and 'very high' in that game is absolutely immense.

It's a shame the Wii U version of Crysis 3 never got finished and released; that would have been very interesting to see.
 
As hard as I am on it, I don't think the Wii U is even remotely maxed out. PS3/360 ports won't be using the tessellator, for example, and even though it's not DX11 compliant it still exists in the hardware. Also, bear in mind how many years it took developers to fully exploit the PS3 SPUs and 360 eDRAM. Modern hardware demands a lot of R&D and people are still finding new ways of exploiting even those DX9 GPUs. People just need to keep their expectations in check.

To put it bluntly, I expect that there will be Wii U games 5 years from now that look better than some high-end PC games do today. But they will also run at a significantly lower resolution than those PC games.

Depends on whether someone puts in the R&D over the next 5 years to get the max out of the system, like you have seen with the PS360.
 
It's 'current gen' in terms of performance because everything else blows it out of the water in that regard. It's not 'current gen' in terms of hardware features.

What I said doesn't go against your post. And I agree with you, lherre, and pretty much everyone else who has some common sense.
It was a sarcastic post aimed at people resorting to the old victim mentality.
 
Woah... chill, dude. I was just asking a question. I'm not one of the ones who pretend to just be curious and then attack you over their own presumptions of what you said.

I've been following the Wii U info since launch and I don't remember seeing anything about it being DX10.1 post-launch. (I still want to see the places where it was mentioned, as that would be helpful.)

Your hypotheses are noted.

I seem to be getting two different stories here: that the lack of DX11 limits the console, on one front, and that whether it's DX9/10/11 they can all do the same things so it doesn't matter, on the other.

This doesn't add up to me given all of the other data I've seen. On one hand, you have comments like the one by the KH3 dev, who said the Wii U wasn't getting it due to it not having DX11 capability. Then you have all of these devs, and the Project CARS log, listing the Wii U as receiving nothing but DX11 implementations of features (with the PS4 and Wii U receiving the same one in the last instance, but not the 360/PS3), meaning that it isn't just a downscaled DX11 feature.

Taking all of this in account, none of it is connecting. There is so much contradiction from the devs down. I want to know what is fact, what is fiction, and what is most probable.


We have info that Latte is a custom chip with its own custom API (called GX2). It would make no sense for them to just rebrand a standard API like DX10 or DX10.1; for it to be custom-made there must be customizations to it.

Given all of this info, I believe that Latte does contain DX11 (not simply DX10 or 10.1) implementations of at least some features in its API, or their OpenGL equivalents.

I am not, nor have I ever, claimed that it is a fully DX11-compliant GPU, as that is next to impossible (same with DX10 for that matter) due to DirectX being a Microsoft API and the chip being ATI. Customization seems to be the key that everyone is factoring out when discussing Latte's capabilities. From BG's and Fourth's analysis, Latte seems to have components on it that range from the HD4000 to the HD6000 series. I'm not stating anything is certain, but I'm not writing off any possibilities either, other than Latte explicitly having DirectX, of course.
??? That post was perfectly calm. I'll go over it once again, since you seem to have some misunderstandings about the topic. Obviously the Wii-U would not use DirectX for its graphics API. However, this does not mean that the GPU being DX compliant is "impossible". For example, the GPUs in the XB1 and PS4 are almost identical, and in turn are pretty much identical to existing AMD GCN parts. All three would use a different API, but each could conceivably support the others' APIs.

As for the 10.1/11 thing, it seems you misunderstood. I wasn't saying that people were claiming that Latte was fully DX11 compliant. However, even if it were a bog-standard R700 chip, in a closed-box environment it pretty much would be. The difference between the two isn't something where you could say "oh, it only has SM4.1 support, turn off the DX11 lighting." Two of the biggest changes going from 10.1 to 11 were support for adaptive tessellation and compute shaders. We know that the Wii-U supports both of these, and we also know AMD's DX10.1 chips supported them, even if they weren't exposed by the API. Statements such as "I believe that Latte does contain DX11 (not simply DX10 or 10.1) implementations" are meaningless because, once again, a chip that only supported DX10.1 on PC could run a heap of DX11 code if the API allowed for it.
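To make the feature-set-versus-API-version point concrete, here's a toy sketch. The capability names and values are made up for illustration; this is not the GX2 API or any real driver interface:

```python
# Hypothetical capability report for an R700-class GPU in a closed box.
# On the PC this part would be labelled "DX10.1", yet the hardware features
# below could still be reached through a custom, console-specific API.
gpu_caps = {
    "tessellator": True,       # present in AMD hardware well before DX11
    "compute_shaders": True,   # reachable via vendor paths before DX11 exposed them
    "shader_model": "4.1",
    "pc_directx_level": "10.1",
}

def can_run(required, caps):
    """Console-style check: ask about individual features, not a DirectX version."""
    return all(caps.get(name) for name in required)

# A "DX11-style" technique only needs the actual hardware bits to be there:
print(can_run(["tessellator", "compute_shaders"], gpu_caps))   # True
```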

As an addendum, I googled "project cars directx10" and this is what came up:
 
The Wii U, irrespective of its raw power, is an 8th generation console. Let's throw that hogwash away. When people say stuff like "last gen", either they are trolling or they are attempting to approximate its raw power. Which is, as IdeaMan once put it, "360+". I.e. closer to the previous-gen systems in raw power, with more modern architecture, more of everything, and a design paradigm that is more like the other 8th gen systems. Just, obviously, with less of everything.

When people say the Wii U is last gen or 7th generation, they are just talking about performance. It gives a person a good idea of what games should look like on the platform. It's like when people call the Wii just a GameCube, despite it not really being a GameCube.
 
Depends on whether someone puts in the R&D over the next 5 years to get the max out of the system, like you have seen with the PS360.

True, but some of the other systems' R&D will still translate over to Wii U. It's an ongoing process and not limited to any GPU architecture or DirectX generation.

The difference between Shader Model 4 and 5 lies mostly in instruction length (I believe) but that doesn't mean that Wii U will be limited by shader instruction length. Having the capability to process very long shaders doesn't mean that it's always practical from a performance perspective. It's only a specification and not a performance guarantee.
 
The Wii U, irrespective of its raw power, is an 8th generation console. Let's throw that hogwash away. When people say stuff like "last gen", either they are trolling or they are attempting to approximate its raw power. Which is, as IdeaMan once put it, "360+". I.e. closer to the previous-gen systems in raw power, with more modern architecture, more of everything, and a design paradigm that is more like the other 8th gen systems. Just, obviously, with less of everything.
That was always a dumb argument. It launched in 2012, a year before the One and PS4 and six to seven years after the PS3/360. It was always an 8th generation platform. It not being a powerhouse in comparison to them was always beside the point, because it was intended to compete with them for marketshare.

Just as Wii had (and for many a year destroyed) against PS3/360.

But that's a well fought battle.

Calling it last gen is generally used as a knock... but almost always, those offended would be offended by any term used to describe its capability: weak, not a powerhouse, last gen. For some reason it's taken purely as a pejorative, when in essence it appears to be as large a jump over the 360 as the Wii was over the GCN. Meager at best. Only this time it has efficiency gains and modern architecture to help it along, and some key Nintendo idiosyncrasies that both help and hinder depending on the dev pipeline. I'm impressed with what they've done under case constraints and cooling limitations, all while on the same die size and using comparatively little wattage.

By only looking for magic in the machine, you lose sight of the true beauty in the design.
 
As hard as I am on it, I don't think the Wii U is even remotely maxed out. PS3/360 ports won't be using the tessellator, for example, and even though it's not DX11 compliant it still exists in the hardware. Also, bear in mind how many years it took developers to fully exploit the PS3 SPUs and 360 eDRAM. Modern hardware demands a lot of R&D and people are still finding new ways of exploiting even those DX9 GPUs. People just need to keep their expectations in check.

To put it bluntly, I expect that there will be Wii U games 5 years from now that look better than some high-end PC games do today. But they will also run at a significantly lower resolution than those PC games.

See, my issue isn't resolution, but I do think its rendering capabilities exceed current gen by at least 2 generations, 3 at most. What worries me is devs looking at the eDRAM as just a framebuffer pool.

I actually think Nintendo went all out on customizations: increasing shader clocks and SRAM blocks, maybe going with a dual graphics engine, with eDRAM speed being XXX GB/s.
 
person with the "Accurate" tag.
Does he work at a studio with a Wii U developer kit? Did he participate in a game developed for the Wii U? Actual questions. It would be very informative to at least speculate about the Wii U's tools. What version of the dev kits are they using now? Does Nintendo update the documentation with their own versions of the shader language? It would give us a sense of how Nintendo works with other studios on the development of the tools.
 
Does he work at a studio with a Wii U developer kit? Did he participate in a game developed for the Wii U? Actual questions. It would be very informative to at least speculate about the Wii U's tools. What version of the dev kits are they using now? Does Nintendo update the documentation with their own versions of the shader language? It would give us a sense of how Nintendo works with other studios on the development of the tools.

You won't get answers to those questions. Just trust the tag. He didn't earn that by accident.
 
The only questions I would pose to developers are ones that won't break NDAs. Even if he isn't currently working on the Wii U (and let's face it, being a third-party dev, he isn't, lol) he is still bound by that NDA. Tread carefully, for his job's sake.
 
You won't get answers to those questions. Just trust the tag. He didn't earn that by accident.

Sorry, I do not trust tags. I find it difficult on internet forums to trust someone without credentials. A tag on GAF is not proof that anyone knows about the Wii U. I also find it contradictory that he says the Wii U is a current gen+ system and yet the NDA does not apply to that.

Also, posts like this do fill me with confidence.

http://www.neogaf.com/forum/showpost.php?p=57246112&postcount=537

A little more information about the documentation could change that though...
 
Sorry, I do not trust tags. I find it difficult on internet forums to trust someone without credentials. A tag on GAF is not proof that anyone knows about the Wii U. I also find it contradictory that he says the Wii U is a current gen+ system and yet the NDA does not apply to that.

Also, posts like this do fill me with confidence.

http://www.neogaf.com/forum/showpost...&postcount=537

A little more information about the documentation could change that though...

Tags are given by the mods/admins, who in cases like this verify if people are who they claim to be.
 
Yess!! I have some questions.

The only gripe I have is that, while I have my feet on the ground, some people believe what we are seeing is the Wii U maxing out.
That's completely absurd. That applies to any system btw.

That doesn't mean it is going to be running Avatar in real time next E3, or keep up with PS4/XBone for that matter, but there will be improvements.
 
Tags are given by the mods/admins, who in cases like this verify if people are who they claim to be.

I do not question whether he is a developer or works in the industry; I am curious whether he has ever developed for the Wii U or has any information about the documentation. Because he made a post about Marcan's findings that "blazed" the internet with misinformation about the CPU of the machine.

Clearly, if you have documentation and a dev kit, you already know not to quote guys like Marcan for the architecture of the Wii U.

Shin'en are clearly bound by NDA, but that has never stopped them from talking about what they are doing with their new engine and what the Wii U supports in shader language.

Shin'en Multimedia @ShinenGames 12 Aug

@MaxWill37699872 We've gone deferred+HDR. Very simple and fast on WiiU because all renderTargets fit in EDRAM.
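As a rough sanity check of that claim, here is what a hypothetical deferred + HDR render-target set at 720p adds up to. The layout below is just an illustrative assumption, not Shin'en's actual G-buffer:

```python
MB = 2**20
W, H = 1280, 720

# Hypothetical deferred + HDR target set (bytes per pixel per target):
targets = {
    "albedo (RGBA8)": 4,
    "normals (RGBA8)": 4,
    "material params (RGBA8)": 4,
    "HDR light accumulation (FP16 RGBA)": 8,
    "depth/stencil (D24S8)": 4,
}

total_mb = sum(W * H * bpp for bpp in targets.values()) / MB
print(round(total_mb, 1))   # ~21.1 MB: fits in 32 MB of eDRAM, but not in the 360's 10 MB
```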
 
Sorry, I do not trust tags. I find it difficult on internet forums to trust someone without credentials. A tag on GAF is not proof that anyone knows about the Wii U. I also find it contradictory that he says the Wii U is a current gen+ system and yet the NDA does not apply to that.

Also, posts like this do fill me with confidence.

http://www.neogaf.com/forum/showpost.php?p=57246112&postcount=537

A little more information about the documentation could change that though...
You don't know much about GAF... a tag is all the proof you need on GAF.

The guy he quoted is a source of a ton of info on the Wii U.

The guy owes you nothing; who are you?
 
I do not question whether he is a developer or works in the industry; I am curious whether he has ever developed for the Wii U or has any information about the documentation. Because he made a post about Marcan's findings that "blazed" the internet with misinformation about the CPU of the machine.

Clearly, if you have documentation and a dev kit, you already know not to quote guys like Marcan for the architecture of the Wii U.

Shin'en are clearly bound by NDA, but that has never stopped them from talking about what they are doing with their new engine and what the Wii U supports in shader language.

I'm not sure I understand. Why do you think marcan's findings are "misinformation"? Have you considered that someone who hacks and tests the hardware may actually have more information on it than what is provided to the devs? Or that marcan's findings tally with what lherre already knows and that posting it is a way of disseminating the information without saying anything directly?
 