WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Guys, look what I have found about the eDRAM architecture of the Wii U. I had already seen Mark Cerny's talk at Gamelab, but I hadn't taken notice of the part that explains the two different next-gen memory architectures the new consoles are using, including the Wii U.

According to Mark Cerny, the shared pool of GDDR5 in the PlayStation 4 takes a more direct approach to handling the next-gen graphics engines being developed, with 176 GB/s of bandwidth. According to him, that is more than enough for developers to work with.

Now, the juicy part of his speech was that he chose this architecture for its simplicity, BUT an eDRAM-on-die design could produce at least 1 TB/s of bandwidth. IF developers had enough time and wrote the specific workarounds needed to TAP into that power, the advantages would be enormous AND BETTER for next-gen graphics.

http://www.youtube.com/watch?v=xHXrBnipHyA&feature=player_detailpage#t=2324

So the Wii U (with 32 MB of eDRAM) and the Xbox One (with 32 MB of eSRAM) use this kind of architecture, with a large amount of embedded memory ON die. This brings new elements to the table that I would like the programmers and specialists in this thread to analyse and explain to us.
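To put those bandwidth numbers side by side, here is a minimal sketch of the arithmetic. The PS4 figure follows from its published 256-bit GDDR5 interface; the Wii U eDRAM line assumes a purely hypothetical 1024-bit interface running at the GPU's 550 MHz clock, since Nintendo has never published the real width.

Code:
# Peak bandwidth = (bus width in bits / 8) * transfers per second.
def peak_bandwidth_gb_s(bus_width_bits, transfers_per_second):
    return bus_width_bits / 8 * transfers_per_second / 1e9

# PS4 main memory: 256-bit GDDR5 at 5.5 GT/s effective.
print(peak_bandwidth_gb_s(256, 5.5e9))   # 176.0 GB/s

# Wii U eDRAM, *hypothetical* 1024-bit interface at the 550 MHz GPU clock.
print(peak_bandwidth_gb_s(1024, 550e6))  # 70.4 GB/s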

Plus, I found out that Nintendo has a long-running business relationship with S3 Graphics, which developed the texture compression technology used on the GameCube, and that the license was extended for the 3DS in 2010.

http://www.gamasutra.com/view/news/28244/Nintendo_Extends_S3_Texture_Compression_License.php

So, if I remember correctly, the GameCube had a texture cache of approx. 1 MB with a sustainable latency of 6.2 ns (1T-SRAM).

According to S3, texture compression let that 1 MB hold about 6 MB of texture data, which is stated in the Gamasutra article; I quote Genyo Takeda:



So imagine the possibilities if Nintendo has found a compression method that uses six times LESS memory for HIGH-resolution textures, without the huge amount of RAM the other consoles are using. We are talking about some amazing efficiency there.
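For anyone wondering where the 6:1 figure comes from, it falls straight out of the S3TC/DXT1 block format: 8 bytes per 4x4 texel block versus 3 bytes per texel for uncompressed 24-bit RGB. A quick sketch of the arithmetic (generic, not GameCube-specific code):

Code:
# S3TC/DXT1 stores each 4x4 texel block in 8 bytes = 0.5 bytes per texel,
# versus 3 bytes per texel for uncompressed 24-bit RGB.
def texture_bytes(width, height, bytes_per_texel):
    return int(width * height * bytes_per_texel)

uncompressed = texture_bytes(1024, 1024, 3)    # 24-bit RGB
dxt1         = texture_bytes(1024, 1024, 0.5)  # DXT1

print(uncompressed / dxt1)    # 6.0 -> the 6:1 ratio quoted for the GameCube
print(dxt1 / 2**20)           # 0.5 MB for a 1024x1024 DXT1 texture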

So here is some food for thought about the "secret sauce" in the different architectures of the next-gen systems.

So Criterion's comments about the Wii U punching above its weight are starting to make some sense.

Also, if anyone has information about a good-quality capture card that I can use for 1080p/60fps screenshots and video capture, I would be grateful. I have some very interesting findings to post for all of you to see.

Everyone uses texture compression. Everyone.
 
So imagine the possibilities if Nintendo has found a compression method that uses six times LESS memory for HIGH-resolution textures, without the huge amount of RAM the other consoles are using. We are talking about some amazing efficiency there.

6x compared to what? uncompressed?
 
6x compared to what? uncompressed?

On the GameCube the texture compression was 6:1, compared to the PlayStation 2, which did not have any hardware texture compression.

Whereas PlayStation 2's CPU and two Vector Units split up the tasks of various graphic procedures, like transformation and lighting, for example, all of this is handled singularly by Gamecube's Flipper chip, which also decompresses textures at a 6:1 ratio. PS2 has no hardware texture compression and seeing as how it only features 4MBs of Embedded DRAM on its graphic synthesizer, developers would need to compress textures in software, which in turn means a significant hit on the Emotion Engine CPU.

http://uk.ign.com/articles/2000/11/04/gamecube-versus-playstation-2

As for the Xbox, it is not clear.

There is also speculation that the GC cache can hold compressed textures and the Xbox cache cannot; if so, that can make a huge difference in the comparison, as with a 6:1 compression ratio the cache can hold 6 times more data! 6 MB of data for the GC compared to 128 KB or 256 KB for the Xbox is a huge difference.

http://segatech.com/technical/consolecompare2/
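To put that quoted speculation into numbers (it assumes, as the quote does, that the GC cache holds textures in compressed form while the Xbox cache does not):

Code:
# Effective texture capacity if a cache holds S3TC-compressed data.
GC_CACHE_KB   = 1024   # ~1 MB GameCube texture cache
XBOX_CACHE_KB = 256    # larger of the two Xbox cache estimates quoted above
S3TC_RATIO    = 6      # 6:1 compression

print(GC_CACHE_KB * S3TC_RATIO)                   # 6144 KB (~6 MB) effective
print(GC_CACHE_KB * S3TC_RATIO // XBOX_CACHE_KB)  # 24x the Xbox figure, *if* the speculation holds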

The point is that Nintendo has had a close partnership with S3 since the 1990s, and S3 is working on GPGPU technologies. I do not know what, if anything, they contributed to the GPGPU tools for Latte; I can only speculate based on the explanation on their official site.

http://www.s3graphics.com/en/technologies/tec_dateil.aspx?supportId=34
 
I don't think that changes what I said (i.e. I never said they didn't care about their own libraries): Nintendo put more emphasis on their GamePad than on worrying about the latest and greatest graphics, citing the Wii and "Lateral Thinking with Withered Technology" as precedent.

What latest and greatest graphics are we talking about?
What special effects can the WiiU not do this generation?
 
Yeah, I thought so too. Google did lead me to this page:

If you click on my source links you will see that the DC vs. GC vs. Xbox comparison is older than the new one that I already posted from the same site.

The original rendering comparison between the Xbox and Gamecube garnered some criticism that the comparison was not accurate enough, as the comparison was based on maximum quoted specs and not on effective bandwidth. We will see, as this comparison will then be based on effective bandwidth. In other words these are the results that can be achieved in a game situation.

http://segatech.com/technical/consolecompare2/

Xbox never confirmed it had S3TC.
 
What latest and greatest graphics are we talking about?
What special effects can the WiiU not do this generation?
In a closed box environment any bog standard DX10 compliant part could conceivably run almost every effect we're likely to see this gen. What some people in this thread don't seem to grasp is that that doesn't mean those effects are feasible for use in-game. This isn't fixed function hardware; if the grunt isn't there some effects simply won't be worth using, because they'll either demand too much memory/flops full stop or look like crap when "optimised" to the appropriate level. This is why latching onto any use of the term "DirectX11" in relation to the system as if it tells anything more than we already knew is pretty silly.
 
The real question is which architecture is more future-proof for this gen. Clearly the Xbox One and PS4 went for the more direct approach and easy PC-to-console optimisation, being built like PC architectures, with the Xbox One also including embedded SRAM on die rather than just the direct pool of memory that Cerny described in his Gamelab speech.

The Wii U architecture, on the other hand, is totally customised, with tricks and features that won't be known until games come out that can give us more information about the shader language Nintendo is using for the machine. Plus, S3 Graphics has some very interesting material on the GPGPU page of their website.

S3 Graphics GPGPU technology helps accelerate parallel-based, computation-intensive applications for the engineering, scientific, medical, and consumer markets.

The complete re-design of the Chrome general purpose programmable shader architecture supports an SIMD instruction set to efficiently calculate computations on parallel data workloads and thousands of concurrent threads. The result is Gigaflops of high-throughput calculations to create a high performance computing (HPC) environment.

Internal research and development along with industry leading ISV (Independent Software Vendor) and academia support help build the S3 Graphics GPGPU ecosystem to bring HPC to the masses. Applications that took days to complete can now churn out answers right at your fingertip in the comfort of your home, office, or research laboratory. Putting multiple advanced graphics processors in a system or across multiple platforms will give true HPC flexibility at fractions of the cost of today's HPC platforms utilizing non-GPU accelerated solutions.

Benefits of GPGPU Computing on S3 Graphics Chrome Products:

Increase calculation throughput and density
Map application/algorithm parallelism to GPU parallelism
Instruction level parallelism
Data level parallelism
Task parallelism
Stream processors can be programmed to match application

Applications Enhanced with S3 Graphics Chrome GPGPU

A variety of computation-intensive markets and applications require the high-throughput parallel processing power of the Chrome Series GPGPU stream processor cores, including:

Image Processing using S3FotoPro™
Video Color Correction
Video Encoding / Transcoding
Scientific Research
Game Physics
Engineering Analysis
Financial Analysis
Signal Processing

S3FotoPro™ Application

S3FotoPro™ is a GPGPU application that uses a proprietary and complex smart-image algorithm to analyze and automatically adjust macro and micro details within a picture to enhance the picture quality.

S3FotoPro calls on the power of the S3 Graphics GPGPU to perform the transformations and adjustments needed to beautify the image. Available picture enhancements include: Color clarity and correction, de-fogging, skin smoothing, gradient blending, saturation and tonal balance adjustments and optimizations, and many more improvements.

Download S3FotoPro here: http://www.s3graphics.com/en/drivers/software.aspx
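None of that copy tells us what, if anything, ended up in Latte, but for readers unfamiliar with the jargon, "mapping data-level parallelism to the GPU" just means running the same operation over many independent elements at once. A toy sketch (plain NumPy standing in for a GPU kernel):

Code:
import numpy as np

# Data-level parallelism in miniature: one operation applied to every element.
# On a GPGPU the elements would be spread across thousands of hardware threads;
# here NumPy's vectorisation plays that role.
def brighten(pixels, gain=1.2):
    """One 'kernel': scale every pixel and clamp to the 8-bit range."""
    return np.clip(pixels * gain, 0, 255).astype(np.uint8)

frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
out = brighten(frame)   # one call, millions of independent per-element operations
print(out.shape)        # (1080, 1920, 3)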

Plus, the main complaint about the Wii U from early games was that the bandwidth of the main pool of RAM was slow, with some sensational articles saying that the "true" bandwidth of the console is the roughly 12.8 GB/s of the shared main memory pool (of which 1 GB is available to games), without saying anything about the large amount of eDRAM present on the die.

Now, according to Cerny, the eDRAM approach has more benefits in the long run once developers get the hang of it, so the Wii U has not yet been used to its full potential. Meanwhile, the new consoles that are coming are already having problems running their own exclusive games at a stable frame rate and resolution.
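For what it's worth, the ~12.8 GB/s main-memory figure is just DDR3 arithmetic; the sketch below assumes the commonly reported (but unofficial) 64-bit DDR3-1600 configuration.

Code:
# Peak Wii U main-memory bandwidth under the commonly reported, unofficial
# configuration: 64-bit bus, DDR3 at 1600 MT/s.
bus_width_bits = 64
transfers_per_second = 1600e6

print(bus_width_bits / 8 * transfers_per_second / 1e9)   # 12.8 GB/s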
 
The Wii U architecture, on the other hand, is totally customised, with tricks and features that won't be known until games come out that can give us more information about the shader language Nintendo is using for the machine. Plus, S3 Graphics has some very interesting material on the GPGPU page of their website.

I think this is taking it a bit far. Yeah, the Wii U is using eDRAM to augment the graphics bandwidth, but to say it's "totally customised" compared to the XB1/PS4 is a bit funny when, other than the aforementioned eDRAM and the Wii/GC backwards-compatibility hardware, have we heard about anything they have changed?
 
I don't think it's taking it too far. I mean, we know Nintendo isn't one to boast about what they have in their console on the performance side. I do think it should be noted that Iwata specifically mentioned in the Wii U Direct that the console has a GPGPU; that's kind of unlike Nintendo. I wouldn't be surprised if Nintendo is still optimising how the console's GPU/CPU are used, through future firmware updates. So, like he said, there are things we won't know or find out until we see them in a game, or until a developer sneaks out some info that can get by without getting them in trouble with the NDA they signed.

It obviously has GPGPU capability; it's a unified-shader part from relatively recent stock, which is a major departure from their previous generation, so it's worth noting. Also, it's a bit hard to update the hardware when it's already been fabricated.
 
Yeah, when compared to 8-year-old tech it's not that bad.
That's true, and it sounds even less bad when you know that it has bigger caches than the "current" tech in the PS4 and Xbox One.

MrPresident said:
The quote explicitly states 'some DX10 parts and some DX11 parts'. The R700-series does exactly that.
The R700 was the base from which this GPU was developed and customized over at least 3-4 years (the silicon wasn't finished until early 2012, if I recall correctly).
To assume that this is a vanilla R700 when we already know that the GPU has at least 32 MB of eDRAM on the die, with all the implications this may have for the chip design, is pure nonsense.
Look, WE ALREADY KNOW THAT THIS GPU HAS BEEN CUSTOMIZED, so why insist on it being nothing more than an R700 when, again, there wasn't a single R700 with eDRAM on it?

antonz said:
Look, the information isn't new. Ideaman and a few others confirmed last year that the Wii U's GPU is a DX10.1+ equivalent component, the "+" being the fact that it has some bells and whistles that are in fact beyond 10.1. It is not a full DX11-equivalent chip, but it has elements of it.
The problem is that DX11 is, in fact, also "DX10.1+". We know that this GPU is custom, but we don't know in which ways it was customized. Is it DX10.1+ with the "+" being some functions that may later be included in DX12?
Is it a plain DX10.1 part with the eDRAM integrated into it? We can't possibly know with the information we have.

What we do know for sure is that the pure GPU logic is around 100 mm² (excluding the eDRAM), and that in the R700 line we had the HD 4670, a 320:32:8 (SPU:TMU:ROP) part that at 55 nm had a 146 mm² die and reached speeds of 750 MHz (released in 2008).

The GPU silicon was finished in January 2012, so let's assume that if this were a PC GPU it would have been released in March-April of that same year, at 40 nm. We know the GPU is clocked at 550 MHz, a whole 200 MHz slower than the HD 4670, and the few things we could guess from the die photos are that it was hand-customized to the point where some identified blocks have different shapes, in order to squeeze every single mm² out of the die area.

There are also plenty of things that customization allows, like ditching functions the company doesn't want in the design, which could also shave off some of those 146 mm².

If this is a 160:16:8 part, as most people believe, then it seems obvious that it has to have some extra functions, or that those 160 unified shaders have capabilities that exceed the ones seen in the vanilla R700.

The fact that both the Wii and the GC had a GPU with three rasterizers (they were really good at polygon output relative to their overall performance) shows that Nintendo likes to customize its GPUs a lot. That on its own isn't proof of anything, of course, but it helps put things in perspective.

What I'm aiming at with this post is to discard, or at least make less popular, the worst-case scenario that many people seem to defend as the most plausible one: a 550 MHz vanilla R700 GPU with eDRAM and a core configuration of 160:16:8 (SPU:TMU:ROP), or even 160:8:8.
Even if it were a "vanilla" 320:32:16 part, it would still be too big at 550 MHz, 40 nm and 100 mm² of area, so it's quite plausible to me that there are more functionalities in there that weren't in the vanilla R700.
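To turn those shader configurations into rough throughput numbers, the usual back-of-the-envelope formula for this GPU family is two FLOPs (one multiply-add) per stream processor per clock. The Latte configurations below are the speculated ones from this thread, not confirmed specs.

Code:
# Peak single-precision throughput for R700/VLIW5-style parts:
# GFLOPS = stream processors * 2 FLOPs (one MAD) * clock in GHz.
def peak_gflops(stream_processors, clock_ghz):
    return stream_processors * 2 * clock_ghz

print(peak_gflops(320, 0.750))   # 480.0 -> HD 4670 (known desktop part)
print(peak_gflops(160, 0.550))   # 176.0 -> speculated 160:16:8 Latte
print(peak_gflops(320, 0.550))   # 352.0 -> speculated 320-shader Latte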

phosphor112 said:
I'm not sure what happened to his tag, but he's always had ties to the inside.
StevieP is referring to the leak that appeared last year, which had a five-line product description for the whole system, stating that the GPU was a "'GPU7' AMD Radeon™-based High Definition GPU. Unique API = GX2, which supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)".
 
Exactly what I said, you're not going to be able to do stuff on it that requires massively more power, nor are you going to be able to do anything on it that requires features that it does not have.


Sorry, but you didn't say anything before, and you haven't said anything now. You want us to use a competitor's API as a standard for Nintendo's own, but you haven't been able to explain how they are different, or why one is better than the other. For all we know, Nintendo's API is better than Microsoft's.
 
Sorry, but you didn't say anything before, and you haven't said anything now. You want us to use a competitor's API as a standard for Nintendo's own, but you haven't been able to explain how they are different, or why one is better than the other. For all we know, Nintendo's API is better than Microsoft's.

I never mentioned an API. I am talking about hardware features, and we know that the Wii U GPU is at least a generation older than the ones in the XB1/PS4.
 
Saying that this post isn't an attack doesn't make it so. This thread has nothing to do with personal traits of the members participating in the discussion.
It's not an attack because I don't mean ill; I'm just sick of the situation, and I don't think he's so much of a victim as a person that puts himself in that position time and time again.

If I behaved like him I'd be a victim too.

And that's why I took the time to put it down in words. Because, like I said, it bothers me. And if anything I figured it's like a band-aid: being direct about where the problem is might feel like an attack, but I really don't think it is.
Why care so much that he cares? If he's a fanatic then fuck him, doesn't make him right. I am by NO means experienced with hardware analysis or any of the esoteric details this thread is full of.
I don't care that he cares. I care that I care, because reading people saying this thread has become obsolete is something that I pin on him, because as it stands, it really is.

I'd prefer it to flow more slowly but with information being poured in when appropriate, like the Giana Sisters comments yesterday or anything else meaningful; there's no shame in trial and error either. But not the "agenda pitching" and jumping to conclusions.

And I really don't enjoy beating around the bush. So if I played the villain and reflected what most have been thinking for a while, so be it.
One thing I have been wondering about: if the wireless radios the Wii U uses can handle 5-player multiplayer (GamePad + 4 Wiimotes), and if Nintendo is correct when they said it can handle a total of two GamePads simultaneously, would it be possible to host 6-player multiplayer (2 GamePads, 4 Wiimotes)? Would the bottleneck be the radios or the processing power of the system? Thanks, love this thread!
The Wii U GamePad uses 802.11-based wireless and Wii controllers use Bluetooth, so they're on separate radios; so yeah, you could pull that off, provided the system can keep up with them.

There's a cost to running two Wii U GamePads, though: the wireless bandwidth stays the same as for one controller. Read: if you were streaming at 60 frames per second, the transmission to each pad is halved to 30.

Depending on what you're doing, it could clearly turn into an issue for the processing power of the system, though (for instance, six local players in Mario Kart).
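The 60-to-30 figure above is just the fixed wireless budget divided across pads; expressed as arithmetic (this restates the claim, it is not a measured number):

Code:
# If the GamePad video stream has a fixed wireless budget, splitting it across
# pads divides the effective frame rate per pad.
def per_pad_fps(base_fps, num_pads):
    return base_fps / num_pads

print(per_pad_fps(60, 1))   # 60.0
print(per_pad_fps(60, 2))   # 30.0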
 
I never mentioned an API. I am talking about hardware features, and we know that the Wii U GPU is at least a generation older than the ones in the XB1/PS4.

You talked about DX11.
Microsoft DirectX is a collection of application programming interfaces (APIs) for handling tasks related to multimedia, especially game programming and video,

And now you are telling us that the Wii U GPU is a generation behind, without any basis.
The Wii U GPU has eDRAM in it, so which of AMD's last-generation GPUs had that? Then we can finally identify Latte.
 
You talked about DX11.


And now you are telling us that the Wii U GPU is a generation behind, without any basis.
The Wii U GPU has eDRAM in it, so which of AMD's last-generation GPUs had that? Then we can finally identify Latte.

I'm sorry, but you seriously think that the Wii U GPU, which is based on AMD's older VLIW5 architecture, is on the same level as the XB1/PS4 GPUs, which are based on AMD's newer GCN architecture?

Are you really trying to argue that? Because it's plainly obvious how wrong you are. Also, isn't it obvious that anything the GPU doesn't support cannot run on it, say anything that would only run on AMD's DX11-class GPUs (GCN), for example? It should also be pointed out that "DX11" can easily be used to refer to a baseline of hardware features a device has to meet, as well as to an API; I was talking about the former, not the latter.

If you want to go down the "eDRAM makes it special and therefore it cannot be based on that" route, then go for it, but know that every GPU we are talking about has had modifications which mean it will never appear in a standard desktop line, so the same argument can be applied to all of them.
 
On the GameCube the texture compression was 6:1, compared to the PlayStation 2, which did not have any hardware texture compression.

As for the Xbox, it is not clear.
You can take as a fact that Xbox had S3TC.

S3TC was implemented as a compliance requirement for DirectX 6:

Its subsequent inclusion in Microsoft's DirectX 6.0 and OpenGL 1.3 (via the GL_EXT_texture_compression_s3tc extension) led to widespread adoption of the technology among hardware and software makers.
Source: http://en.wikipedia.org/wiki/S3_Texture_Compression

So of course a DirectX 8 part is fully compatible with it.
 
And that's why I took the time to put it down in words. Because, like I said, it bothers me. And if anything I figured it's like a band-aid: being direct about where the problem is might feel like an attack, but I really don't think it is. I don't care that he cares. I care that I care, because reading people saying this thread has become obsolete is something that I pin on him, because as it stands, it really is.

In my opinion there are a lot of folks involved in the downfall of this thread, and it's not on the head of any one person. There's too much bickering back and forth from folks on both sides of the discussion. Healthy debate is one thing, but there seem to be a lot of people taking sides, and it's hard to follow the discussion when there's so much noise and no one willing to budge from their stance. It's like reading a religious debate where neither side has any hope of converting the other. I feel like any real discussion left a long time ago.

This is just my point of view as someone looking in from the outside.
 
In my opinion there are a lot of folks involved in the downfall of this thread, and it's not on the head of any one person. There's too much bickering back and forth from folks on both sides of the discussion. Healthy debate is one thing, but there seem to be a lot of people taking sides, and it's hard to follow the discussion when there's so much noise and no one willing to budge from their stance. It's like reading a religious debate where neither side has any hope of converting the other. I feel like any real discussion left a long time ago.

This is just my point of view as someone looking in from the outside.
I agree, but I'm not scapegoating him or meaning to; I simply took the time to write my thoughts about this whole groundhog day situation. It's getting really old, and I imagine really old to him as well.

Sometimes people don't see the way or the way out unless you point it out to them.

Like I said, I know he has an ego (everybody does), and some people, when they're in discussions, really don't let go of the notion of him knowing he's wrong. Krizzx can actually make it worse by being stubborn, like with the 1080p fiasco argument and the subsequent twisting.

Granted, every single one of us can probably be like that every once in a while without being burned to a crisp by others, and he had that free pass a while ago too. But the thing is, he created the pattern, so even if people could let it slide, they remember past misgivings and statements, and the easy shortcut is to throw it all at him. It's not fair, and it creates the cycle (certainly not his fault alone), but the fact that the pattern exists and is so obvious is something he inflicted on himself. The rest is human nature on both sides playing out; had I not thought things through, I could have done the same.

And I can certainly get people getting frustrated with the fact he doesn't backtrack all the way by the 1000th time. There would be no issue if he was an unknown to us all, but we've been dealing with him for months; it's a case of too much exposure.

So, at the risk of this being taken for an attack, it really isn't, as I'm not barring him from the chance to save face, or bashing him or his character; in fact, the fact that he hasn't posted in a while might be a reflection of him thinking his participation through, I don't know. But I know I really don't mean ill towards the dude.

It's certainly not just him, but I think he's the biggest problem we have now; I'm sorry for putting it like that, but I also think that a little bluntness can sometimes do some good.
 
I agree, but I'm not scapegoating him or meaning to; I simply took the time to write my thoughts about this whole groundhog day situation. It's getting really old, and I imagine really old to him as well.

Sometimes people don't see the way or the way out unless you point it out to them.

Like I said, I know he has an ego (everybody does), and some people, when they're in discussions, really don't let go of the notion of him knowing he's wrong. Krizzx can actually make it worse by being stubborn, like with the 1080p fiasco argument and the subsequent twisting.

Granted, every single one of us can probably be like that every once in a while without being burned to a crisp by others, and he had that free pass a while ago too. But the thing is, he created the pattern, so even if people could let it slide, they remember past misgivings and statements, and the easy shortcut is to throw it all at him. It's not fair, and it creates the cycle (certainly not his fault alone), but the fact that the pattern exists and is so obvious is something he inflicted on himself. The rest is human nature on both sides playing out; had I not thought things through, I could have done the same.

And I can certainly get people getting frustrated with the fact he doesn't backtrack all the way by the 1000th time. There would be no issue if he was an unknown to us all, but we've been dealing with him for months; it's a case of too much exposure.

So, at the risk of this being taken for an attack, it really isn't, as I'm not barring him from the chance to save face, or bashing him or his character; in fact, the fact that he hasn't posted in a while might be a reflection of him thinking his participation through, I don't know. But I know I really don't mean ill towards the dude.

It's certainly not just him, but I think he's the biggest problem we have now; I'm sorry for putting it like that, but I also think that a little bluntness can sometimes do some good.

Okay, I'm tired of this 1080p "fiasco" garbage. That was no lie, that list was no lie, and it was not taken from YouTube videos. I did not make that list up: 90% of it I got from article confirmations, and the rest I got from viewing the media myself. I twisted and spun nothing. My argument was always the same from day one: that "most" Wii U games, most Wii U games that did fall under the category of being a port or having had most of their development done in the launch window (because Nintendo and devs attested to a poor development environment), were 1080p. The only spinning was done by people who reworded my argument into a more sensational claim that all Wii U exclusive games are 1080p/60fps, which was never my argument. The only reason I even posted that list in the first place was because of all the people claiming that the Wii U is no more capable of 1080p than the other last-gen consoles, contrary to Shin'en's statement that it is, and who stomp on the possibility of any major game being 1080p as if it were 100% fact that the Wii U is incapable of it.

I stand behind my research and I have seen no one "prove" it wrong by any stretch of the word. All I have seen is posters simply dismiss it outright and tell me I'm wrong, followed by loads of insults and belittling (which is proof now?), when I do not accept their word as unquestionable, 100% absolute fact and do not disregard everything I have researched with my own eyes in favour of their own unproven contrary statements.

I will stand behind what I say, because I do not make up information. If I am wrong about something, I have no problem admitting it but I was not in this instance.

The rest of the problem comes from similar twisting on the same posters' part. If I say "maybe", "to my knowledge", "from what I heard" or "it's possible", those statements have the "potentially, but not sure" part erased and get thrown back at me under the pretext that I made an absolute, 100% claim of fact when I only made a suggestion. I don't like having my words twisted in that manner, and people who twist my words will more than likely get a hostile response equivalent to theirs.


Now, for the last time, I'm not the topic of this thread. If you don't like what I have to say, then there is an ignore button. There are tons of people who come in here who actually do make outrageous claims and do all of the things I get accused of, and I don't derail the thread to attack them when they do it, because I don't really care if they have a negative opinion. Heck, over half the opinions in this thread are negative. You can say whatever you want about the Wii U or its hardware; I do not care. But don't twist my words and accuse me of doing things I haven't done. The reason is clear to me: they have an emotional investment in good things about the hardware being false, and therefore cannot just disagree and move along.

I make no "claim" that I cannot back up.
 
You talked about DX11.
And now you are telling us that the Wii U GPU is a generation behind, without any basis.
The Wii U GPU has eDRAM in it, so which of AMD's last-generation GPUs had that? Then we can finally identify Latte.
You don't seem to understand that even though it's a custom chip, the underlying tech of things like the ALUs is going to be based on an existing product. If the R700s in the dev kits and the constant mentions of DX10.1 in documentation aren't going to convince you that it's pre-GCN, then you'd better have a good reason for thinking otherwise.
 
You don't seem to understand that even though it's a custom chip, the underlying tech of things like the ALUs is going to be based on an existing product. If the R700s in the dev kits and the constant mentions of DX10.1 in documentation aren't going to convince you that it's pre-GCN, then you'd better have a good reason for thinking otherwise.

Where are these documents that constantly mention DX10.1? I only ever saw one back in the day, and that was prior to launch.

All the statements and documents I have seen to date (post-launch) mention DX11, starting with the comments from Unity.
 
Where are these documents that constantly mention DX10.1? I only ever saw one back in the day, and that was prior to launch.

All the statements and documents I have seen to date (post-launch) mention DX11, starting with the comments from Unity.

And pray tell, which AMD GPU is pre-GCN and DX11?
 
I'm sorry, but you seriously think that the Wii U GPU, which is based on AMD's older VLIW5 architecture, is on the same level as the XB1/PS4 GPUs, which are based on AMD's newer GCN architecture?

*snip*

Are you ever going to answer questions?
 
I reminisce about the good old days. You guys remember those days? When Fourth Storm, Blu and those other dudes thought up this great idea. An idea that portrays "good GAF". An idea that, though it may have disappointed some had a definite conclusion been reached, was a brilliant and progressive idea. I, like countless others who know little to nothing about this stuff, was mesmerised and drawn into the discussion and discoveries. It was brilliant. Now I am truly disheartened and very much disappointed that what was once truly informative has become, essentially, a quarrel. This is "bad GAF" at its worst. Locking the thread would be a disservice to all the above-mentioned individuals who had a vision, one that is being eroded by nihilistic elements. But to stop the spread of "bad GAF", it may have to be done.
 
It's not a GCN card, that's what it has to do with it; therefore it's not a DX11 feature-set card.



I've answered all the questions that have been put forth.

Even if it was GCN, it wouldn't be a "dx11 feature-set card" since that's an MS API. People are referring to an equivalent feature set, which is entirely possible (although there's not that much evidence behind it)
 
Even if it was GCN, it wouldn't be a "dx11 feature-set card" since that's an MS API. People are referring to an equivalent feature set, which is entirely possible (although there's not that much evidence behind it)

I was wrong in the first place [the 5xxx series is DX11]. But that's not the point: when you talk about a DX11 feature set in regard to hardware, it doesn't really matter whether it supports the API or not; it only matters whether it has the specific features required to support the API, which the R7xx-based card in the Wii U does not have. It may have things that go beyond DX11, and it may have things that never end up in DX12/13/999999, but we know for a fact that it doesn't support the hardware features required to get to the level of DX11.
 
I was wrong in the first place [the 5xxx series is DX11]. But that's not the point: when you talk about a DX11 feature set in regard to hardware, it doesn't really matter whether it supports the API or not; it only matters whether it has the specific features required to support the API, which the R7xx-based card in the Wii U does not have. It may have things that go beyond DX11, and it may have things that never end up in DX12/13/999999, but we know for a fact that it doesn't support the hardware features required to get to the level of DX11.


Like what?


Edit: That wasn't antagonistic, I genuinely want to know what they are.
 
Like what?


Edit: That wasn't antagonistic, I genuinely want to know what they are.

DX11, compared to DX10, brings a bunch of new features that require new hardware; you can read about them all here.

http://msdn.microsoft.com/en-us/lib...90(v=vs.85).aspx#new_features_for_direct3d_11

  1. HLSL Shader Model 5.0
  2. Dynamic Shader Linkage
  3. Tessellation through Hull and Domain shaders
  4. New block compression formats: BC6H for HDR images, BC7 for higher-fidelity standard images
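As a concrete illustration of the last item, here is how the block-compression formats compare in storage cost; the per-block sizes are defined by the formats themselves and are not Wii U-specific.

Code:
# Storage for a 1024x1024 texture under different block-compression formats.
# BC1 (DXT1) packs each 4x4 block into 8 bytes; the DX11-era BC6H/BC7 use
# 16 bytes per 4x4 block, trading size for HDR support / higher fidelity.
def compressed_bytes(width, height, bytes_per_block):
    return (width // 4) * (height // 4) * bytes_per_block

print(compressed_bytes(1024, 1024, 8))    # 524288   -> BC1/DXT1 (0.5 B/texel)
print(compressed_bytes(1024, 1024, 16))   # 1048576  -> BC6H or BC7 (1 B/texel)
print(1024 * 1024 * 4)                    # 4194304  -> uncompressed RGBA8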
 
DX11, compared to DX10, brings a bunch of new features that require new hardware; you can read about them all here.

http://msdn.microsoft.com/en-us/lib...90(v=vs.85).aspx#new_features_for_direct3d_11


So which of those are you a) 100% certain Latte doesn't have and b) 100% sure GX2 doesn't have equivalent features to replicate? I'm not saying you're wrong by any means, you just seem definitive in your argument considering we know so few facts at this stage. I've not heard any definitive proof before that GX2 is not DX11 equivalent (although I personally don't think it is, or that it matters)
 
I reminisce about the good old days. You guys remember those days? When Fourth Storm, Blu and those other dudes thought up this great idea. An idea that portrays "good GAF". An idea that, though it may have disappointed some had a definite conclusion been reached, was a brilliant and progressive idea. I, like countless others who know little to nothing about this stuff, was mesmerised and drawn into the discussion and discoveries. It was brilliant. Now I am truly disheartened and very much disappointed that what was once truly informative has become, essentially, a quarrel. This is "bad GAF" at its worst. Locking the thread would be a disservice to all the above-mentioned individuals who had a vision, one that is being eroded by nihilistic elements. But to stop the spread of "bad GAF", it may have to be done.

Most of them left because people started attacking and misquoting them for posting unfavorable opinions. I know it happened to BG and Zomie quite a few times. It seems that if people don't like your opinion, they attack "you" in this thread. This has driven off a lot of the progressive posters.

DX11, compared to DX10, brings a bunch of new features that require new hardware; you can read about them all here.

http://msdn.microsoft.com/en-us/lib...90(v=vs.85).aspx#new_features_for_direct3d_11

There was a recent link that had Latte binned as Shader Model 5.0, though no one looked into it.

As for the rest, there is no way to verify one way or the other at the moment. I suppose one may get a response from Shin'en.
 
So which of those are you a) 100% certain Latte doesn't have and b) 100% sure GX2 doesn't have equivalent features to replicate? I'm not saying you're wrong by any means, you just seem definitive in your argument considering we know so few facts at this stage. I've not heard any definitive proof before that GX2 is not DX11 equivalent (although I personally don't think it is, or that it matters)

Well, I'm not 100% sure that the PS4 doesn't contain a sun-exploding missile either, but I can be pretty sure. We know from previous leaks that Latte is based off the RV7xx. And whether or not GX2 supports them is a moot point, considering that they are hardware features and GX2 is a software API; the point being that, based on the information we have, we can be pretty sure it doesn't support said new features.

Whether or not you wish to replicate them in software, at the cost of not only speed but also time, money and headaches, is up to the developer; but if we are quantifying that as "support", then you have opened a whole other kettle of fish, because you should reasonably be able to do most things in software, just at the cost of the four things I mentioned previously.

So, under any sensible definition at least, if it is based off the RV7xx then it does not support a number of features that DX11 does, because if it supported all or even most of them, AMD probably would have tooted their horn and marketed their cards as DX11 after the announcement, but they didn't. They stuck with DX10, and that should give some hint, at least, of what hardware support level exists in the desktop part.
 
Where are these documents that constantly mention DX10.1? I only ever saw one back in the day, and that was prior to launch.

All the statements and documents I have seen to date (post-launch) mention DX11, starting with the comments from Unity.
IIRC the Unity slide at GDC said 10.1. There was a dev comment only in the last page or so of this thread saying it was a mix of the two. The point here isn't about that, though. Do you understand that with only a few changes to things such as the tessellator, the R700/HD 4xxx series would be fully DX11 compliant? So when you have a chip like that in dev kits, and mentions of DX10.1 and SM4.1-equivalent functionality, subsequent talk of DX11 effects being used in a closed-box environment doesn't contradict that. This isn't about talking the system down or saying "lol it's shit, it's not even DX11", because in a console the difference between 10.1 and 11 would basically be nil (apart from maybe the tessellator, which could easily be custom, retrofitted, etc.). It's about recognising where the chip is derived from, and unless you think every mention of R700 or DX10.1 was a red herring, that's where the evidence points.

Even if it was GCN, it wouldn't be a "dx11 feature-set card" since that's an MS API. People are referring to an equivalent feature set, which is entirely possible (although there's not that much evidence behind it)
Yes it would be, if you put it in a PC and had appropriate drivers. What's the difference between a "dx11 feature-set card" and a card with an equivalent feature set? The GPUs in the XB1 and PS4 are almost identical, but both will use different APIs.
 
Well, I'm not 100% sure that the PS4 doesn't contain a sun-exploding missile either, but I can be pretty sure. We know from previous leaks that Latte is based off the RV7xx. And whether or not GX2 supports them is a moot point, considering that they are hardware features and GX2 is a software API; the point being that, based on the information we have, we can be pretty sure it doesn't support said new features.

Whether or not you wish to replicate them in software, at the cost of not only speed but also time, money and headaches, is up to the developer; but if we are quantifying that as "support", then you have opened a whole other kettle of fish, because you should reasonably be able to do most things in software, just at the cost of the four things I mentioned previously.

So, under any sensible definition at least, if it is based off the RV7xx then it does not support a number of features that DX11 does, because if it supported all or even most of them, AMD probably would have tooted their horn and marketed their cards as DX11 after the announcement, but they didn't. They stuck with DX10, and that should give some hint, at least, of what hardware support level exists in the desktop part.

The key phrase there is "based on". It's not simply an RV7xx part, that much is certain. We don't know what Nintendo added or removed. That same leaked documentation also said "Shader Model 4.0", didn't it? So that would already put it beyond the RV700 it's based on. (Edit: no it wouldn't, I'm an idiot.)

There may well even be hardware in there (fixed-function, perhaps) which achieves the same results. Maybe it's more likely there isn't, but we don't know, which is why I'm questioning your apparent conviction. I probably agree with you, but the possibility of Latte having the hardware required to let GX2's feature set replicate DX11 is not as outlandish as you were implying.
 
Sorry if this is old (it surely is), but what I'm reading doesn't really say that it's DX11...

[Attached image: nintendo_gdc-3.jpg (Unity GDC slide)]
http://www.polygon.com/2013/8/20/4641786/unity-for-wii-u-opens-up-gamepad-hardware-and-more-to-developers said:
A representative for Unity ran through how the development tools had been optimized for the Wii U version of the SDK. Most of them were fairly esoteric: The Wii U version of Unity supports DX10 level graphics, deferred rendering, GFX output support on the Wii U GamePad (using its forward-facing camera) and a few other specification-heavy tweaks.

http://www.cinemablend.com/games/Wii-U-GPGPU-Squashes-Xbox-360-PS3-Capable-DirectX-11-Equivalent-Graphics-47126.html said:
In a pre-briefing interview with Helgason before the press announcement went live, Gaming Blend had the opportunity to ask a few questions about the jump from mobile, PC and current-gen consoles to the first next-gen console, and whether developers would be able to make use of all of Unity's latest high-end technology on Nintendo's newest console, including the ability to make use of Unity 4's DirectX 11 equivalent features and shaders. Helgason replied with the following...
Yeah. We'll do a -- we'll make it potentially possible to do.

What's interesting is that our philosophy is always this: We have a match work flow and I'm sure we can make a decent game and prototype, and they're fun. And then we have a shared system that basically allows you to access the full capabilities of the hardware you run. That's going to be good whether you're running [software] on an iPhone, the Wii U, a gaming PC or whatever.
 
Well, I'm not 100% sure that the PS4 doesn't contain a sun-exploding missile either, but I can be pretty sure. We know from previous leaks that Latte is based off the RV7xx. And whether or not GX2 supports them is a moot point, considering that they are hardware features and GX2 is a software API; the point being that, based on the information we have, we can be pretty sure it doesn't support said new features.

Whether or not you wish to replicate them in software, at the cost of not only speed but also time, money and headaches, is up to the developer; but if we are quantifying that as "support", then you have opened a whole other kettle of fish, because you should reasonably be able to do most things in software, just at the cost of the four things I mentioned previously.

So, under any sensible definition at least, if it is based off the RV7xx then it does not support a number of features that DX11 does, because if it supported all or even most of them, AMD probably would have tooted their horn and marketed their cards as DX11 after the announcement, but they didn't. They stuck with DX10, and that should give some hint, at least, of what hardware support level exists in the desktop part.

Pretty sure they have never stated this; it wouldn't make sense to do what you are saying either, since it isn't going to run DirectX at all... DX11 is very old by now (it came out four years ago); I think we can put to bed the idea that it is some sort of magic bullet point that decides whether the Wii U is next-gen or not. There is no such metric, really, and it wouldn't change the graphics output of the box anyway.

Honestly, even the R800 is R700-based. It doesn't mean much unless we have a list of features the Wii U can't do right in front of us, and considering DX9 can approximate everything DX11 does, with DX10.1 having a much easier time of it, I doubt there is much point in arguing whether or not Latte has DX11 hardware in it. Shader Model 4+ has really not seen many changes AFAIK, at least not on the feature-set side: http://en.wikipedia.org/wiki/Shader_Model_4.0
 
The key phrase there is "based on". It's not simply an RV7xx part, that much is certain. We don't know what Nintendo added or removed. That same leaked documentation also said "Shader Model 4.0", didn't it? So that would already put it beyond the RV700 it's based on.

There may well even be hardware in there (fixed-function, perhaps) which achieves the same results. Maybe it's more likely there isn't, but we don't know, which is why I'm questioning your apparent conviction. I probably agree with you, but the possibility of Latte having the hardware required to let GX2's feature set replicate DX11 is not as outlandish as you were implying.

SM4.0 is the shader model for DX10, so SM4.0 would make sense for a DX10 part. Of course it's not a completely standard part, but to imply that they suddenly went and made it completely DX11 compliant seems a bit out there when we have no indication that they did anything of the sort.

I never implied that software emulation of specific bits of hardware is outlandish, just that it's all around not a good solution for a large number of things.
 