WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

The Wii U has some PC-like dx10/dx11-like systems like geometry shaders, however, when we started working on the port, we already had our game released on the X360, and going from a console release to Wii U is easier since the consoles have very similar performance characteristics.

I think that some are misreading this quote as meaning something that it doesn't. DX10 parts (like R700) already support many DX11 API features just like they do many OpenGL 4.x features.

The geometry shader came in the DX10 specification, for example, and tessellation in DX11. The R700 still has a tessellator but not a DX11 compliant one. It just doesn't matter to the Wii U because it doesn't run DirectX.
 
Kindly read my post again. I was strictly talking about theoreticals, and the read/write thing halving the bandwidth myth. I acknowledged that the actual bandwidth figures are lower.

And I was saying that the read/write halving of the bandwidth isn't a myth. It was an actual later discovery that contradicted a myth.
 
And I was saying that the read/write halving of the bandwidth isn't a myth. It was an actual later discovery that contradicted a myth.
Look, this has been pretty well documented by Microsoft. The read/write thing is only valid for the bus connecting the CPU and the northbridge, not the one connecting the main RAM to the northbridge/GPU.
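To put rough numbers on this, here's a quick back-of-the-envelope sketch using the commonly cited 360 figures (the numbers below are assumptions pulled from public spec summaries, not from anything documented in this thread):

```python
# Rough sketch of the Xbox 360 bandwidth figures being discussed.
# The numbers are commonly cited public specs, used here as assumptions for illustration only.

def bus_bandwidth_gbs(width_bits, transfer_rate_mts):
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second)."""
    return (width_bits / 8) * transfer_rate_mts * 1e6 / 1e9

# Main memory: 128-bit GDDR3 at 1400 MT/s, shared by CPU and GPU for both reads and writes.
main_ram = bus_bandwidth_gbs(128, 1400)   # ~22.4 GB/s total

# CPU front-side bus: separate read and write lanes, each around 10.8 GB/s.
fsb_read = 10.8
fsb_write = 10.8

print(f"Main RAM <-> northbridge/GPU: ~{main_ram:.1f} GB/s (shared read+write)")
print(f"CPU <-> northbridge: ~{fsb_read} GB/s read + ~{fsb_write} GB/s write")
```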
 
Is this a discovery made by you?

No?

Look, this has been pretty well documented by Microsoft. The read/write thing is only valid for the bus connecting the CPU and the northbridge, not the one connecting the main RAM to the northbridge/GPU.

I'm not disagreeing with that.

I was just pointing out that you can't just write off a bottleneck like it doesn't matter the way his response seemed to.

Then there are the other bottlenecks and discrepancies on top of that, like the RAM/CPU resources in use by the sound and OS. The Wii U has a DSP for sound and an ARM processor for security, though I'm not sure if it handles the OS as well. It has unrestricted memory access for the most part.

Though, I recall there was this one restriction listed in the vgleaks thing about the Wii U memory.
 
Okay...?

So... where do these numbers come from? For a guy so surefire to blast people for not posting sources, you do very little in terms of sourcing information yourself.

Kind of like how I had to search pages (in a thread that's now past 200 pages) to find a changelog that was posted several weeks back, and I failed to find it because you wouldn't tell me where it was.

EDIT: I don't see this on Google either.
 
Okay...?

So... where do these numbers come from? For a guy so surefire to blast people for not posting sources, you do very little in terms of sourcing information yourself.

Kind of like how I had to search pages (in a thread that's now past 200 pages) to find a changelog that was posted several weeks back, and I failed to find it because you wouldn't tell me where it was.

So surefire to blast people for not posting source? Now this is where we hit one of these failure to communicate points.

I don't "blast" anyone for not posting sources. Simply asking "where did you see this?" is not "BLASTING!!!!!!!", and rarely do I ask for a source. If I do ask for one, it's because someone has claimed something completely unfounded as if it were fact, with no sufficient details provided. Exaggerating my statements and misrepresenting what I've said, when I've already explained it on the same page, is annoying.

I do not respond to people who misquote me and accuse me of doing things I have not, so I am cutting communication here.
 
Plus side, this argument that the WiiU does or does not have DX11 features can be put to rest.
Minus side, it sounds like they are approaching the WiiU as they would approach developing for the X360, meaning not looking at the eDRAM as the main RAM.
But then again, maybe for this particular game it's not necessary, which is what makes porting so easy.

Which is disappointing, because it keeps the features they use to a minimum.
 
So surefire to blast people for not posting source? Now this is where we hit one of these failure to communicate points.

I don't "blast" anyone for not posting sources. Simply asking "where did you see this?" is not "BLASTING!!!!!!!", and rarely do I ask for a source. If I do ask for one, it's because someone has claimed something completely unfounded as if it were fact, with no sufficient details provided. Exaggerating my statements and misrepresenting what I've said, when I've already explained it on the same page, is annoying.

I do not respond to people who misquote me and accuse me of doing things I have not, so I am cutting communication here.
The irony. You can't even do the same. What you've claimed is unfounded. You've shown time and time again that you don't know some very basics of technology. Others, including myself, have owned up to their mistakes, but so far you seem to be the one with the "perfect" track record, as you always tend to spin yourself into a position that paints you as someone who's "misunderstood."

Give up the source, otherwise stop making that claim.
 
The irony. You can't even do the same. What you've claimed is unfounded. You've shown time and time again that you don't know some very basics of technology. Others, including myself, have owned up to their mistakes, but so far you seem to be the one with the "perfect" track record, as you always tend to spin yourself into a position that paints you as someone who's "misunderstood."

Give up the source, otherwise stop making that claim.

You're talking to the guy who was proven wrong about the 1080p argument and couldn't even just type up a post saying "sorry, I was wrong", he just twisted his way out of it.
 
You're talking to the guy who was proven wrong about the 1080p argument and couldn't even just type up a post saying "sorry, I was wrong", he just twisted his way out of it.

Trust me, I know. Even in the Bayo 2 thread that EatChildren made, he said those bullshots were direct feed screengrabs, lmao. Even StevieP told him they were bullshit and 720p when he posted them in this thread, and he said "how do you know?"

http://www.neogaf.com/forum/showpost.php?p=81055337&postcount=9413
 
Interview from NintendoLife: Black Forest Games on Bringing Giana Sisters: Twisted Dreams to Wii U





http://www.nintendolife.com/news/20...ringing_giana_sisters_twisted_dreams_to_wii_u

Excellent. We now have another source citing the Wii U as having DX11 features. This makes 4 sources.
It also reaffirms how easily things can be ported to the Wii U by better devs. I haven't heard this since pre-launch. Indies are the future.

As for the two above me: stop derailing the thread with off-topic personal attacks. Being in consensus about your augmented version of reality does not make the things you make up true. I'm not taking the bait. Go spit your social issues with me out somewhere else. It's ridiculous the way so many people come into this thread just to antagonize me.
 
As for the two above me: stop derailing the thread with off-topic personal attacks. Being in consensus about your augmented version of reality does not make the things you make up true. I'm not taking the bait. Go spit your social issues with me out somewhere else. It's ridiculous the way so many people come into this thread just to antagonize me.

You never lived up to ANY of your mistakes. And you still can't even post a source for your claims.
 
Excellent. We now have another source citing the Wii U as having DX11 features. This makes 4 sources.
Already addressed. All DX10 GPUs support some DX11 features because DX11 is a superset of DX10. That doesn't mean that it isn't R700 based because it almost certainly is.

The sheer fact that they cite a geometry shader (a DX10 feature that was considered 'new' in GPUs 6 years ago) and not SM5.0 or a DX11 compliant tessellator speaks volumes. That quote spells out quite plainly that it's decidedly not a DX11 compliant GPU.
 
Already addressed. All DX10 GPUs support some DX11 features because DX11 is a superset of DX10. That doesn't mean that it isn't R700 based because it almost certainly is.

It can't be certainly based on anything, because it's a custom GPU designed around eDRAM. What GPUs does AMD have with eDRAM built in? It might look similar on paper to some AMD GPU, but its performance will be different because of the eDRAM and the components designed/modified to take advantage of it.
 
It can't be certainly based on anything, because it's a custom GPU designed around eDRAM. What GPUs does AMD have with eDRAM built in? It might look similar on paper to some AMD GPU, but its performance will be different because of the eDRAM and the components designed/modified to take advantage of it.

Well, it's obviously based on something. GPUs are ridiculously complex chips and there's no chance that AMD designed something from the ground up just for Nintendo. Modern computer chips are the result of decades of work.

There's definitely some custom logic in there as well (eDRAM, for example) but the rest of the chip is all based on previous technology. The chip wouldn't have any DX11 feature capability if it weren't, because it would then be entirely incompatible.

Edit:

And if it's not based on R700-series then it would be based on R800-series which has full hardware support (and not partial) for the DX11 feature set.
 
Let's try this again:
krizzx, do you have the link for where the split read/write 360 memory thing you stated (the one that 'busted the myth') originates from?
 
I just wanted to add that the Wii U's first GPU (that we know of) was R700. It doesn't sound like they were ever interested in any technology past that, as the last specs still say DX10.1.

And I don't believe Nintendo wanted to fool/trick Sony/MS by switching out at the last second, or that they "really cared about third party games". Nintendo, ever since the Wii, does not play like that. Heck, that "lateral thinking with withered technology" could be the direct proof/rationale for why the Wii U would be that way. The GamePad came first; the graphics (DX11) were an absolute afterthought.

But why does it matter? DX11 or no DX11, the Wii U needs to prove it can even do anything meaningful with it. All the games so far are still built closer to last-gen spec, implying all benefits so far are likely marginal to superficial.
 
That quote spells out quite plainly that it's decidedly not a DX11 compliant GPU.

Come on. We rightfully get on krizzx's case when he reads into quotes things that aren't there. It's only fair you are called out for doing the same. That comment in no way proves anything definitive about the GPU as you're claiming.

Negative spin is just as garbage as positive spin. It's all spin.
 
But why does it matter? DX11 or no DX11, the Wii U needs to prove it can even do anything meaningful with it. All the games so far are still built closer to last-gen spec, implying all benefits so far are likely marginal to superficial.


Nintendo has its own API, developers have to show what they can do with it.

GX shares some similarities with OpenGL, and differs greatly in many ways as well. OpenGL, by design, masks a lot of the nitty-gritty hardware specifics, leaving implementation of it to hardware vendors, whereas the GX API is very close to the metal and many functions have little, if any, processing performed by the CPU. What this means is that, if you write smart code and know how the hardware works under the hood, you can bring out the best performance of the machine, but you'll also be working with an API that is altogether more complex than writing in a higher-level API such as OpenGL.

Since it has been updated for the WiiU, I doubt it was an afterthought for Nintendo.
 
Nintendo has its own API, developers have to show what they can do with it.



Since it has been updated for the WiiU, I doubt it was an afterthought for Nintendo.
I don't think that changes what I said (i.e., I never said they didn't care about their own libraries). Nintendo put more emphasis on their GamePad thingy than on worrying about the latest and greatest graphics, citing the Wii and Lateral Thinking with Withered Technology as a source.
 
Already addressed. All DX10 GPUs support some DX11 features because DX11 is a superset of DX10. That doesn't mean that it isn't R700 based because it almost certainly is.

The sheer fact that they cite a geometry shader (a DX10 feature that was considered 'new' in GPUs 6 years ago) and not SM5.0 or a DX11 compliant tessellator speaks volumes. That quote spells out quite plainly that it's decidedly not a DX11 compliant GPU.

Already addressed? Someone has tried to make that exact claim to dismiss the DX11 functionality every single time it has come up in this thread, but it has no weight. They never substantiate it. They just state it arbitrarily, as if the simple existence of this information automatically brings about some form of contradiction. The developer says DX11 specifically, so I will go with what the developer says. They know better than me or you what features they are using.

It's like with the Project C.A.R.S. logs that were posted. They specifically listed the usage of DX11 features, not 10, and in the last log before the poster said he wasn't going to post anymore due to the rules about it, it listed the PS4 and Wii U as both receiving the exact same DX11 implementation of a feature. Is the PS4 now only DX10 equivalents as well?

We know what the API Nintendo uses is. It's a proprietary API called GX. It was leaked that the Wii U uses GX2, and the basis for it is OpenGL.
 
Come on. We rightfully get on krizzx's case when he reads into quotes things that aren't there. It's only fair you are called out for doing the same. That comment in no way proves anything definitive about the GPU as you're claiming.

Negative spin is just as garbage as positive spin. It's all spin.

Eh? The quote explicitly states 'some DX10 parts and some DX11 parts'. The R700-series does exactly that. Just like the G80-series from nVidia and the same goes for the OpenGL 4.x extensions that I mentioned. That's what GPU technology is and is what this thread is about.

Acting like I'm on some crusade against the Wii U couldn't be farther from the truth because I'm a big Nintendo fan. But this is a GPU technology thread and wild claims have no part in it.
 
Look, the information isn't new. Ideaman and a few others confirmed last year that the Wii U's GPU is a DX10.1+ equivalent component. The + being the fact that it has some bells and whistles that are in fact beyond 10.1. It is not a full DX11 equivalent chip, but it has elements.
 
Look, the information isn't new. Ideaman and a few others confirmed last year that the Wii U's GPU is a DX10.1+ equivalent component. The + being the fact that it has some bells and whistles that are in fact beyond 10.1. It is not a full DX11 equivalent chip, but it has elements.

"Confirmed"? How was anything confirmed, especially that? The Wii U doesn't use any DX feature set specifically. It's a Microsoft API.

When you are talking about the Wii U, you are talking about "equivalent" feature sets. The info released from the Wii U leak last year said its API is GX2.

The only time it would have been using DX10.1 is in the early dev kits, when it was still using a stock HD 4850.
 
"Confirmed"? How was anything confirmed, especially that? The Wii U doesn't use any DX feature set specifically. Its a microsoft API

When you are talking about the Wii U, you are talking about "equivenlent" feature sets. The info released from the Wii U leak last year said its API is GX2.


The only time it would have been using DX10.1 is in the early dev kits where it was still using a stock HD 4850.

You keep ignoring that StevieP has seen documentation.

And it's an SM4.x part according to the documentation.

I'm not sure what happened to his tag, but he's always had ties to the inside.
 
One thing I think needs to be pointed out is that devs are porting their engines over from DirectX-based engines. How often do devs reference OpenGL when commenting on graphics features? Even though the PS4 is in no way connected to the DirectX API, remarks about its feature set are still framed as DirectX 11.1 or .2.

So GX2 being OpenGL-based or similar would mean its features are DX10.1/DX11 equivalents. So this leads me to ask: are there any features that OpenGL possesses that DirectX doesn't?
 
One thing I think needs to be pointed out is that devs are porting their engines over from DirectX-based engines. How often do devs reference OpenGL when commenting on graphics features? Even though the PS4 is in no way connected to the DirectX API, remarks about its feature set are still framed as DirectX 11.1 or .2.

So GX2 being OpenGL-based or similar would mean its features are DX10.1/DX11 equivalents. So this leads me to ask: are there any features that OpenGL possesses that DirectX doesn't?

Considering that graphics APIs are merely a way of exposing the underlying hardware in a uniform manner, you should be able to do everything in one that you can in the other, assuming they expose the same hardware in the same way.
 
One thing I think needs to be pointed out is that devs are porting their engines over from DirectX-based engines. How often do devs reference OpenGL when commenting on graphics features? Even though the PS4 is in no way connected to the DirectX API, remarks about its feature set are still framed as DirectX 11.1 or .2.

So GX2 being OpenGL-based or similar would mean its features are DX10.1/DX11 equivalents. So this leads me to ask: are there any features that OpenGL possesses that DirectX doesn't?

I honestly never see people talking about OpenGL unless it's some nitty-gritty tech stuff from Sony. PSGL is the API they use: custom made, but it's just a more advanced feature set than OpenGL. Using Direct3D descriptors easily gives one an idea of what type of feature set and Shader Model version they are using.

For example, saying something is DX10 means Shader Model 4.0, DX10.1 is SM4.1, and DX11 is SM5.0.
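For reference, a minimal sketch of that mapping (the feature notes are my own shorthand for the commonly listed headline additions, not anything taken from the interview being discussed):

```python
# Quick reference for the DirectX-level to Shader Model mapping mentioned above.
# Standard public figures; the notes are shorthand assumptions, not thread sources.
dx_to_sm = {
    "DX10":   ("SM 4.0", "geometry shaders introduced"),
    "DX10.1": ("SM 4.1", "incremental additions, e.g. gather4, cube map arrays"),
    "DX11":   ("SM 5.0", "hull/domain shaders (hardware tessellation), compute shaders"),
}

for level, (sm, note) in dx_to_sm.items():
    print(f"{level}: {sm} - {note}")
```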
 
... anyway, this thread has been very silly for a while. Unless techy folk like Blu, Thraktor etc. have anything of relevance to the thread's title to post, I'm not sure it's worth posting anything at all.

All I've read for the last several days is mostly a childish to-and-fro 'debate' based on fabricated lists/screenshots/youtube videos/wild assertions... i.e. bloody nonsense.

I didn't chuck $50 into this project to read juvenilia.
Agree 100%.

You know what's wrong with this thread? There's only so much one can say about a subject and most of us do it and get done with it until there's something meaningful to discuss again.

It's not so much as "nothing else can be said" as it is... well, this:



Seriously. This thread should be named "krizzx against logic and the world" or something, dudes that have something meaningful to say have nothing on him (and no, this is not an attack).

Some of us have access to documentation, are developers, engineers or programmers, some are knowledgeable in various degrees or forms (no point in self-defining; really), some like to read technical jargon or simply want to have a good conversation (and I understand you're trying to be in this last criteria; but it's not working out, because...). This thread is always going nowhere when people keep insisting on the very same points; that includes people that, pitch topics whose silver lining is: "help me prove Wii U is really something else" which are honestly as annoying as "it's not even better than X360"; and that's where my interest in this thread is dwindling; I'm very interested in reading knowledge posts, stuff I didn't know or even different non-previously-stated-opinions and don't take me wrong but the krizzx circle is pissing me off grandly for a while now; because we keep having silly arguments all over.

Anyway, my point is: dude, take a chill pill and stop putting your head under the guillotine for the Wii U's graphical prowess; bumping a thread to put things on the table like 1080p or something is not something you're gonna get out of unscathed, not only because there aren't many Wii U games like that but because everyone doubts PS4 or XBone will be able to comply with it either (and for good reason); suggesting the Wii U has hidden muscle to completely cloak that and pull 1080p standard for loads of games, at 60 frames per second no less... is silly. And I understand there's an ego associated and some people won't let you save face, hence you feel cornered all the time and go damage control all over (and you're not an uneducated fellow, so you really try to patch things and don't hold us in bad light, as I don't towards you); but learn to save face before you're under everyone's scrutiny; it's that situation where you took it as routine for a while and somebody says "hey that dude is annoying" and in the next moment it clicked for everyone; you've pushed it past that point, to the point of behaving like a caricature of yourself. I could pitch something to take the piss and know you'd cling to it, provided it catered to your wishes; that's the position you put yourself into.

If anything, the fact this has gone on for so long, and that you totally dominated in post count a thread meant for discussing Wii U technicalities without having much to say/add, has certainly driven away lots of people that perhaps had something to say (I'd like to have seen more posts from Thraktor, Fourth Storm, wsippel, StevieP, bgassassin... I'm forgetting people for sure) instead of 655 posts by you (again, no offense, but you post too much for the reality of this thread). Or perhaps they tried, kept seeing the thread revolving around discussing cyclic misinformation (stuff like the recent "MORE PERFORMANCE WILL BE UNLOCKED" on the next update, or speculation about "incomplete gpu firmware") and just gave up. You keep hoping for hugely unlikely best-case scenarios and, while you won't agree, you keep refusing to see the big picture; take the cue. My point: it's not just karma that's the reason people fall on you; learn to stay back a little more and it'll go away, and I want to read knowledgeable points, not 1080p silliness.

I wish Nintendo would release specs just for that, not because it's gonna change my outlook of whatever (what the console is, is pretty obvious at this point) but because of things like this.

Sorry if I'm being too hard; it's not you, it's the situation. Carry on.
 
I doubt GX has that much to do with GX2, seeing as the GPU completely changed and, this time, it's a more standardized implementation. Stuff like:

Lighting in GX is different from OpenGL. While OpenGL has diffuse, ambient and specular colors, as well as a global ambient color, GX only has diffuse light colors. Additionally, OpenGL has diffuse, ambient, specular and emission material colors, whereas GX has diffuse and ambient.

Are certainly not true now; otherwise we'd be hearing complaints. That reverse engineering effort is interesting, yes, but not really relevant to Wii U.

GX2 is most likely a more modern OpenGL offshoot; it probably retains GX legacy support just so the development environment is familiar enough, but without that many specific quirks (like those described there).

Still, I wouldn't mind seeing that developed further; this is just my opinion.
So GX2 being OpenGL-based or similar would mean its features are DX10.1/DX11 equivalents. So this leads me to ask: are there any features that OpenGL possesses that DirectX doesn't?
Not really, no.

DirectX releases simply chase whatever OpenGL newly implemented, not the other way around, so OpenGL usually has a few months of exclusivity on whatever it implemented; for instance, the upcoming DirectX 11.2 is only meant to close the gap between DirectX 11 and the last OpenGL release, that being OpenGL 4.4, out now.

Then you have performance considerations; performance varies between doing things in OpenGL or DirectX, and sometimes it even varies from one DirectX implementation to another. That's one of the reasons why John Carmack always stuck to OpenGL; if DirectX had been faster he wouldn't have (not the only one, as he also didn't want to be tied to a proprietary implementation, etc). Of course, that meant Nvidia cards, whose drivers had better OpenGL performance, had the advantage. But that's really a can of worms.
 
I doubt GX has that much to do with GX2, seeing as the GPU completely changed and, this time, it's a more standardized implementation. Stuff like:



Are certainly not true now; otherwise we'd be hearing complaints. That reverse engineering effort is interesting, yes, but not really relevant to Wii U.

GX2 is most likely a more modern OpenGL offshoot; it probably retains GX legacy support just so the development environment is familiar enough, but without that many specific quirks (like those described there).

Still, I wouldn't mind seeing that developed further; this is just my opinion.

I wonder if it's up to devs whether they develop using GX2, and if it shares similarities like giving access to coding to the metal.
 
I wonder if it's up to devs whether they develop using GX2, and if it shares similarities like giving access to coding to the metal.

It depends on whether or not Nintendo provides another API, which I doubt. Most console APIs give access to low-level features; whether or not you consider this coding to the metal is up to you.
 
It depends on whether or not Nintendo provides another API, which I doubt. Most console APIs give access to low-level features; whether or not you consider this coding to the metal is up to you.

The reason why I bring up "code to the metal" is because it's mentioned as a feature of GX, or part of the API, in the link I provided.

I'm no programmer, so I can't say what level counts as coding to the metal. I just remember reading that devs would like access to code to the metal.
 
And Chess too! don't forget about the 1080p/60fps Chess game...

... anyway, this thread has been very silly for a while. Unless techy folk like Blu, Thraktor etc. have anything of relevance to the thread's title to post, I'm not sure it's worth posting anything at all.

All I've read for the last several days is mostly a childish to-and-fro 'debate' based on fabricated lists/screenshots/youtube videos/wild assertions... i.e. bloody nonsense.

I didn't chuck $50 into this project to read juvenilia.

JB

You put 50? Damn. I think I put like... 15 or something lol.

EDIT: Also, great post lostinblue, and I'm sorry myself. That's too many posts by me that have been fueled by rage.
 
And much thicker APIs all around.
Yep!

That's not to say that a focus on a singular platform won't bring out its best, even when talking about large multiplatform developers. It's part of the reason that being the dev focus is such a sought-after position for the different manufacturers. It means using your hardware to best represent the title.

You still do get a boon for being the focus, the devs "coding to the metal", but this era is not the same as those before it, when you had to translate from Japanese exactly how mip-mapping was feasible, and then hijack the PS1 innards for audio processing, with VU0 running your Assembly-written motion blur effect while VU1 crunches all of your triangle and texture work and sends it to the GS, whose only use is rendering.

This is a dramatization for effect. We live in a much more consolidated console gaming market, with all three pulling from not only the same hardware manufacturers but, to a degree, the same toolsets. The way of developing a lot of these games will be shockingly similar, just with varying degrees of capability in use and differing stylistic choices.

The Wii U in any sense is not a powerhouse. You can look at power usage readings, GPU clock rate, and the likely design that inspired that GPU, and come to the conclusion that it is an interesting piece of kit, pushing a shockingly impressive array of visuals on such little power. I mean, it's potentially capable of greater-than-360-or-PS3 visuals while using a third of the electricity of even the die-shrunk redesigned models.

So far from a powerhouse, but really impressive in its own right.
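If anyone wants to eyeball that efficiency claim, here's a tiny sketch; the wattages are commonly reported wall-socket measurements under load and are assumptions on my part, not official specs:

```python
# Very rough power comparison in the spirit of the post above.
# All wattages are assumed approximate measurements under gaming load, not official figures.

measured_load_watts = {
    "Wii U":      33,   # assumed ~33 W while gaming
    "Xbox 360 S": 90,   # assumed ~90 W while gaming (die-shrunk model)
    "PS3 Slim":   75,   # assumed ~75 W while gaming
}

wiiu = measured_load_watts["Wii U"]
for console, watts in measured_load_watts.items():
    print(f"{console}: ~{watts} W (Wii U draws ~{wiiu / watts:.2f}x of this)")
```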
 
With these PC-like consoles coming up, I wish they would force devs to code to the API to guarantee BC in succeeding machines. Yeah... pipe dream.
Devs DO use the APIs. They don't want to do low-level coding. Going forward, assuming MS and Sony stick with x86 and AMD, there should be very little preventing them from doing BC.
 
Devs DO use the APIs. They don't want to do low-level coding. Going forward, assuming MS and Sony stick with x86 and AMD, there should be very little preventing them from doing BC.

Whatever it takes so I can run Xbone and PS4 games even betterer with like higher FPS and resolution on their successors *smokes pipe*
 
Agree 100%.

You know what's wrong with this thread? There's only so much one can say about a subject and most of us do it and get done with it until there's something meaningful to discuss again.

It's not so much as "nothing else can be said" as it is... well, this:



Seriously. This thread should be named "krizzx against logic and the world" or something, dudes that have something meaningful to say have nothing on him (and no, this is not an attack).

Some of us have access to documentation, are developers, engineers or programmers, some are knowledgeable in various degrees or forms (no point in self-defining; really), some like to read technical jargon or simply want to have a good conversation (and I understand you're trying to be in this last criteria; but it's not working out, because...). This thread is always going nowhere when people keep insisting on the very same points; that includes people that, pitch topics whose silver lining is: "help me prove Wii U is really something else" which are honestly as annoying as "it's not even better than X360"; and that's where my interest in this thread is dwindling; I'm very interested in reading knowledge posts, stuff I didn't know or even different non-previously-stated-opinions and don't take me wrong but the krizzx circle is pissing me off grandly for a while now; because we keep having silly arguments all over.

Anyway, my point is: dude, take a chill pill and stop putting your head under the guillotine for the Wii U's graphical prowess; bumping a thread to put things on the table like 1080p or something is not something you're gonna get out of unscathed, not only because there aren't many Wii U games like that but because everyone doubts PS4 or XBone will be able to comply with it either (and for good reason); suggesting the Wii U has hidden muscle to completely cloak that and pull 1080p standard for loads of games, at 60 frames per second no less... is silly. And I understand there's an ego associated and some people won't let you save face, hence you feel cornered all the time and go damage control all over (and you're not an uneducated fellow, so you really try to patch things and don't hold us in bad light, as I don't towards you); but learn to save face before you're under everyone's scrutiny; it's that situation where you took it as routine for a while and somebody says "hey that dude is annoying" and in the next moment it clicked for everyone; you've pushed it past that point, to the point of behaving like a caricature of yourself. I could pitch something to take the piss and know you'd cling to it, provided it catered to your wishes; that's the position you put yourself into.

If anything, the fact this has gone on for so long, and that you totally dominated in post count a thread meant for discussing Wii U technicalities without having much to say/add, has certainly driven away lots of people that perhaps had something to say (I'd like to have seen more posts from Thraktor, Fourth Storm, wsippel, StevieP, bgassassin... I'm forgetting people for sure) instead of 655 posts by you (again, no offense, but you post too much for the reality of this thread). Or perhaps they tried, kept seeing the thread revolving around discussing cyclic misinformation (stuff like the recent "MORE PERFORMANCE WILL BE UNLOCKED" on the next update, or speculation about "incomplete gpu firmware") and just gave up. You keep hoping for hugely unlikely best-case scenarios and, while you won't agree, you keep refusing to see the big picture; take the cue. My point: it's not just karma that's the reason people fall on you; learn to stay back a little more and it'll go away, and I want to read knowledgeable points, not 1080p silliness.

I wish Nintendo would release specs just for that, not because it's gonna change my outlook of whatever (what the console is, is pretty obvious at this point) but because of things like this.

Sorry if I'm being too hard; it's not you, it's the situation. Carry on.
x1000

Half of all his posts on GAF are in this one thread. Just crazy... like most, I keep seeing this thread get bumped and click to see if there's new info, but it's just nonsense every day...
 
Agree 100%.

You know what's wrong with this thread? There's only so much one can say about a subject and most of us do it and get done with it until there's something meaningful to discuss again.

If you don't feel this thread has any value then ignore it. Just stop until it interests you again.

Seriously. This thread should be named "krizzx against logic and the world" or something, dudes that have something meaningful to say have nothing on him (and no, this is not an attack).

Saying that this post isn't an attack doesn't make it so. This thread has nothing to do with personal traits of the members participating in the discussion.

And I understand there's an ego associated and some people won't let you save face, hence you feel cornered all the time and go damage control all over (and you're not an uneducated fellow, so you really try to patch things and don't hold us in bad light, as I don't towards you); but learn to save face before you're under everyone's scrutiny; it's that situation where you took it as routine for a while and somebody says "hey that dude is annoying" and in the next moment it clicked for everyone; you've pushed it past that point, to the point of behaving like a caricature of yourself. I could pitch something to take the piss and know you'd cling to it, provided it catered to your wishes; that's the position you put yourself into.

This is a textbook example of an ad-hominem attack and possibly a bit of projection. You know that you sound like this right?


Why care so much that he cares? If he's a fanatic then fuck him, doesn't make him right. I am by NO means experienced with hardware analysis or any of the esoteric details this thread is full of. But I enjoy reading the discussion betwixt those that do. I know I'm not the only one that still keeps up with this thread and continues to be intrigued by the small flow of details as they materialize. Let's get back on topic and forget about who posts how much, as it really isn't relevant to anything. Here, I'll start:

One thing I have been wondering about: if the wireless radios the Wii U uses can handle 5-player multi (Pad + 4 Wiimotes), and if Nintendo is correct when they said it can handle a total of two Pads simultaneously, would it be possible to host 6-player multi (2 Pads + 4 Wiimotes)? Is that a bottleneck issue for the radios or for the processing power of the system? Thanks, love this thread!
 
Guys, look what I have found about the eDRAM architecture of the Wii U. I already saw Mark Cerny's speech at Gamelab, but I did not take notice of the part that explained the 2 different next-gen architectures the next-gen consoles are using, including the Wii U.

According to Mark Cerny, the shared pool of GDDR5 in the PlayStation 4 is the more direct approach for handling the next-gen graphics engines that are being developed, with 176 GB/s of bandwidth. According to him, that is more than enough for developers to handle.

Now, the juicy part of his speech was that he chose this architecture because of simplicity, BUT eDRAM on the die could produce at least 1 TB/s of bandwidth, and IF developers had the right amount of time and produced specific workarounds to TAP into that power, the advantages would be enormous AND BETTER for next-gen graphics.

http://www.youtube.com/watch?v=xHXrBnipHyA&feature=player_detailpage#t=2324

So the Wii U (32 MB) and the Xbox One have this specific architecture that uses a HUGE amount of eDRAM ON die. This brings new elements to the table that I would like the programmers and specialists here on this thread to analyse and inform us about.
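For anyone who wants to sanity-check these kinds of numbers, the usual peak-bandwidth formula is just bus width times transfer rate; the widths and clocks below are illustrative guesses (the Wii U's eDRAM interface has never been disclosed), not real specs:

```python
# Peak bandwidth = bytes per transfer * transfers per second.
# Every configuration below is an illustrative assumption, not a confirmed spec.

def peak_bandwidth_gbs(bus_width_bits, transfers_per_sec_millions):
    """Return peak bandwidth in GB/s for a given bus width and transfer rate."""
    return (bus_width_bits / 8) * transfers_per_sec_millions * 1e6 / 1e9

# External GDDR5 in the PS4's style: 256-bit bus at 5500 MT/s -> ~176 GB/s.
print(peak_bandwidth_gbs(256, 5500))       # 176.0

# Hypothetical on-die eDRAM interfaces (width and clock are pure guesses):
print(peak_bandwidth_gbs(1024, 550))       # 70.4 GB/s
print(peak_bandwidth_gbs(8192, 1000))      # 1024.0 GB/s, the "1 TB/s class" Cerny refers to
```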

Plus, I found out that Nintendo extended its business relationship with S3 Graphics in 2010; S3 developed the texture compression technology used on the GameCube and the 3DS.

http://www.gamasutra.com/view/news/28244/Nintendo_Extends_S3_Texture_Compression_License.php

So, if I remember correctly, the GameCube had a texture cache of approx. 1 MB with a sustainable latency of 6.2 ns (1T-SRAM).

Which, using S3 texture compression, could hold around 6 MB of texture data in that 1 MB. This is stated in the article on Gamasutra; to quote:

S3 says its texture compression tools allow for up to six-fold compression without affecting visual fidelity.

So imagine the possibilities if Nintendo has found a compression method that uses 6 times LESS memory for HIGH resolution textures, without the huge amount of RAM the other consoles are using. We are talking about some amazing efficiency there.
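Here's where that six-fold figure comes from, as a quick sketch; the block sizes are the standard S3TC/DXT1 ones, and treating the GameCube texture cache or the Wii U's 32 MB of eDRAM as a texture pool is just the speculation from this post, not a confirmed design:

```python
# S3TC/DXT1 compresses each 4x4 pixel block to 8 bytes. Against 24-bit RGB source
# data (48 bytes per block) that is a fixed 6:1 ratio, i.e. the "six-fold" figure.
RGB_BYTES_PER_BLOCK  = 4 * 4 * 3   # 48 bytes of uncompressed 24-bit RGB
DXT1_BYTES_PER_BLOCK = 8           # fixed-size compressed block

ratio = RGB_BYTES_PER_BLOCK / DXT1_BYTES_PER_BLOCK
print(f"Compression ratio: {ratio:.0f}:1")

# What that buys you in a small embedded pool (pool sizes taken from this post;
# the idea of textures living wholly in eDRAM is speculation, not a known fact).
for name, pool_mb in [("GameCube texture cache", 1), ("Wii U eDRAM", 32)]:
    print(f"{name}: {pool_mb} MB holds ~{pool_mb * ratio:.0f} MB of equivalent RGB texture data")
```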

So here is some food for thought about that secret sauce of the different architectures of all the next-gen systems.

So the quotes from Criterion about the Wii U punching above its weight are starting to make some sense.

Also, if anyone has information about a good quality capture card that I can use for 1080p 60fps screen and video capture, I would be grateful. I need to post some very interesting findings for all of you to see.
 