WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

It's not a GCN card, which is what it has to do with it; therefore it's not a DX11 feature-set card.

So only GCN is DX11 compatible?
I still don't see what DX11, a competitor's API, has to do with Nintendo's API.


I've answered all the questions that have been put forth.

Not by me, unless you think asking a question is the same as answering it.
 
IIRC the Unity slide at GDC said 10.1. There was a dev comment only in the last page or so of this thread saying it was a mix of the two. The point here isn't about that though. Do you understand that with only a few changes to things such as the tessellator, the R700/HD4XXX series would be fully DX11 compliant? So when you have a chip like that in dev kits, and mention of DX10.1 and SM4.1 equivalent functionality, subsequent talk of DX11 effects being used in a closed-box environment doesn't contradict that. This isn't about talking the system down or saying "lol it's shit, it's not even dx11", because in a console the difference between 10.1 and 11 would basically be nil (apart from maybe the tessellator, which could easily be custom, retrofitted etc...). It's about recognising where the chip is derived from, and unless you think every mention of R700 or DX10.1 was a red herring, that's where the evidence points.

Woah... chill, dude. I was just asking a question. I'm not one of the ones who pretend to just be curious and then attack you for their own presumptions of what you said afterward.

I've been following the Wii U info since launch and I don't remember seeing anything about it being DX10.1 post-launch. (I still want to see the places where it was mentioned, as that would be helpful.)

Your hypotheses are noted.

I seem to be getting two different stories here: that the lack of DX11 limits the console on one front, and that whether it's DX9/10/11 they can all do the same things, so it doesn't matter, on the other.

This doesn't add up to me given all of the other data I've seen. On one hand, you have comments like the one by the KH3 dev who said the Wii U wasn't getting it due to it not having DX11 capability. Then you have all of these devs, and the Project CARS log, listing the Wii U as receiving nothing but DX11 implementations of features (and, in the last instance, the PS4 and Wii U receiving the same version but not the 360/PS3), meaning that it isn't just a downscaled DX11 feature.

Taking all of this in account, none of it is connecting. There is so much contradiction from the devs down. I want to know what is fact, what is fiction, and what is most probable.


We have info that Latte is a custom chip with its own custom API (called GX2). It would make no sense for them to just rebrand a standard API like DX10 or DX10.1; for it to be custom-made, there must be customizations to it.

Given all of this info, I believe that Latte does contain DX11(not simply DX10 or 10.1) implementations of at least some features in its API, or their OpenGL equivalent.

I am not, nor have I ever, claimed that it is a fully DX11 compliant GPU, as that is next to impossible (same with DX10 for that matter) due to DirectX being a Microsoft API. Customization seems to be the key that everyone is factoring out when discussing Latte's capabilities. From BG's and Fourth's analysis, Latte seems to have components on it that range from the HD4000 to HD6000 series. I'm not stating anything is certain, but I'm not writing off any possibilities either, other than Latte having DirectX explicitly, of course.
 
SM4.0 is the shader model for DX10, so SM4.0 would make sense for a DX10 part. Of course it's not completely standard, but to imply that they suddenly went and decided to make it completely DX11 compliant seems a bit out there when we have no indication that they did anything of the sort.

I never implied that software emulation of specific bits of hardware is outlandish, just that it's all around not a good solution for a large number of things.


My point about SM4.0 was that no rv700 supported it.*** So the assumption that 'because Latte is based on an rv700 it can't have equivalent dx11 features' doesn't hold water, as that same base GPU doesn't support SM4.0 either... while Latte supposedly does. So Nintendo have evidently changed some things. Again, I'm not arguing your conclusion at all, just your certainty of it. Let's leave some doors ajar :)

And I was referring to your comparison of Latte having those hardware capabilities (or equivalent) to PS4 having "sun destroying missiles". That would be slightly more outlandish ;)


***Edit. I'm reading the rv700 docs wrong, my bad. Ignore the first paragraph.
 
I seem to be getting two different stories here: that the lack of DX11 limits the console on one front, and that whether it's DX9/10/11 they can all do the same things, so it doesn't matter, on the other.

Being DX10.1 doesn't really limit it much. DX11 is extremely similar to DX10 and 10.1, so 10.1 can still pull off DX11 features, just not as well. The biggest example is probably tessellation.
 
So only GCN is DX11 compatible?
I still don't see what DX11, a competitor's API, has to do with Nintendo's API.




Not by me, unless you think asking a question is the same as answering it.

I actually already corrected myself regarding the first half of your first question, and I have already attempted to explain the difference between a software API and the hardware features that said software API is trying to abstract away, but I will try again either way. The reason it has to do with DX11 is that it has to do with what hardware requirements DX11/OGL4.4/whatever has. That has been the argument in the thread for a while, has it not?


We have info that Latte is a custom chip with its own custom API (called GX2). It would make no sense for them to just rebrand a standard API like DX10 or DX10.1; for it to be custom-made, there must be customizations to it.

Given all of this info, I believe that Latte does contain DX11(not simply DX10 or 10.1) implementations of at least some features in its API, or their OpenGL equivalent.

I am not, nor have I ever, claimed that it is a fully DX11 compliant GPU, as that is next to impossible (same with DX10 for that matter) due to DirectX being a Microsoft API. Customization seems to be the key that everyone is factoring out when discussing Latte's capabilities. From BG's and Fourth's analysis, Latte seems to have components on it that range from the HD4000 to HD6000 series. I'm not stating anything is certain, but I'm not writing off any possibilities either, other than Latte having DirectX explicitly, of course.

You have some things the wrong way around. DX abstracts and exposes the hardware to the programmer; there would be nothing DX-specific in the hardware that you couldn't expose with another API. DX feature levels are just a convenient way of working out what hardware feature set a card has.
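To make that API-versus-hardware point concrete on the PC side (purely illustrative, since the Wii U exposes its hardware through GX2, not Direct3D), here is a minimal sketch of how the DX11 runtime reports a card's hardware feature level. The same DX11 call works on an R700-class part; it simply comes back as feature level 10_1 rather than 11_0:

```cpp
// Windows-only sketch: ask the D3D11 runtime which feature level the
// installed GPU supports. This illustrates "one API, many hardware
// feature sets"; it says nothing about GX2 itself.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // "full DX11" class hardware (Evergreen/GCN)
        D3D_FEATURE_LEVEL_10_1,  // R700 / HD4000 class
        D3D_FEATURE_LEVEL_10_0
    };
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_10_0;

    // The call succeeds on DX10-class cards too; the runtime just returns
    // the highest feature level the hardware can actually do.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
        &device, &achieved, nullptr);

    if (SUCCEEDED(hr)) {
        std::printf("Hardware feature level: 0x%04x\n", achieved);
        device->Release();
    }
    return 0;
}
```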
 
It really doesn't matter what feature set the hardware has... it lacks the horsepower to use the more advanced DX11 features anyway.

Btw, it's a DX10 feature set. We had this confirmed like a year ago. It's about the only thing we know is correct.
 
Guys, look what I have found about the eDRAM architecture of the Wii U. I had already seen Mark Cerny's speech at Gamelab, but I didn't take notice of the part that explained the two different next-gen memory architectures the new consoles are using, including the Wii U.

According to Mark Cerny, the shared pool of GDDR5 in the PlayStation 4 takes a more direct approach to handling the next-gen graphics engines being developed, with 179 GB/s of bandwidth. According to him, that is more than enough for developers to handle.

Now, the juicy part of his speech was that he chose this architecture because of its simplicity, BUT eDRAM on the die could produce at least 1 TB/s of bandwidth, and IF developers had the right amount of time and produced specific workarounds to TAP into that power, the advantages would be enormous AND BETTER for next-gen graphics.

http://www.youtube.com/watch?v=xHXrBnipHyA&feature=player_detailpage#t=2324

So the Wii U (32 MB) and Xbox One have this specific architecture that uses a HUGE amount of embedded RAM ON die. This brings new elements to the table that I would like the programmers and specialists here on this thread to analyse and inform us about.

Why is it so hard to comprehend that eDRAM running at 1 TB/s isn't anywhere near the same as the embedded RAM on either the Xbox One or the Wii U? We're talking about 1 TB/s vs ~200 GB/s at most.
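As a rough sanity check on figures like these: peak bandwidth for an embedded memory pool is just interface width times clock. A minimal sketch, where the Wii U inputs are pure assumptions (a commonly speculated 1024-bit interface at the 550 MHz GPU clock), not confirmed specs:

```cpp
// Back-of-envelope peak bandwidth: bytes moved per clock times clock rate.
// The Wii U numbers below are ASSUMPTIONS for illustration only.
#include <cstdio>

double peak_gb_per_s(double bus_width_bits, double clock_hz) {
    return (bus_width_bits / 8.0) * clock_hz / 1e9;  // bytes/s expressed in GB/s
}

int main() {
    // Hypothetical Wii U eDRAM: 1024-bit interface at the 550 MHz GPU clock.
    std::printf("Wii U eDRAM (speculative): %.1f GB/s\n",
                peak_gb_per_s(1024, 550e6));
    // For scale: 1 TB/s at 550 MHz would need a roughly 14,500-bit interface.
    std::printf("Interface width needed for 1 TB/s at 550 MHz: %.0f bits\n",
                1e12 / 550e6 * 8.0);
    return 0;
}
```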
 
Well, I'm not 100% sure that the PS4 doesn't contain a sun-exploding missile, but I can be pretty sure.

This is way off topic, so I apologize in advance, but your word choice here for the joke is really interesting... maybe even intentional? If you recall, back when the PS2 was releasing, Sony seeded a rumor that the PS2 might not launch in the US because it was so powerful you could launch missiles with it. The ploy worked, and overnight every kid and teenager wanted the incredibly powerful, international-ban-worthy PS2.

COMPERE: From a war of the past to the war of the future. Could Sony's video games machine, Playstation 2, be used to launch an actual nuclear war? Apparently Japan's Ministry of Trade seriously thinks it could. The Ministry's slapped export controls on what's already the world's most sought after games machine saying it contains parts which could be used in real life missile guidance systems.

http://www.abc.net.au/pm/stories/s119754.htm

I wouldn't rule out the sun exploding missile!

Also, freezamite, your posts lately in this thread have been really good and really interesting. I'm learning a lot here, though I can't debate on a technical level. But I wanted to say thanks for adding your ideas to the discussion. Quoting a snippet so you'll see my compliment:
The R-700 was the base from which this GPU was developed and customized during at least 3-4 years (silicon wasn't finished until early 2012 if I don't recall it bad)...

Going back to lurking mode
 
On one hand, you have comments like the one by the KH3 dev who said the Wii U wasn't getting it due to it not having DX11 capability.

Did they really say that? I was under the impression that Nomura stated that the target development was on DX11, and that any platform that could represent their vision of the game would be considered. The media outlets interpreted that as it not coming to the Wii U.

I think this is the specific response: http://www.youtube.com/watch?feature=player_embedded&v=IpzjiMPhNBU#t=3135
 
It really doesn't matter what feature set the hardware has... it lacks the horsepower to use the more advanced DX11 features anyway.

Btw, it's a DX10 feature set. We had this confirmed like a year ago. It's about the only thing we know is correct.

Now this is the type of unfounded comment that causes problems.

Please explain how the Wii U lacks the horsepower to use the more advanced DX11 features, and where its use of the DX10 feature set was confirmed. If this was confirmed by any means, then there should be no difficulty in showing where it was done.
 
Being DX10.1 doesn't really limit it much. DX11 is extremely similar to DX10 and 10.1, so 10.1 can still pull off DX11 features, just not as well. The biggest example is probably tessellation.
And brings us back around to "It might have a comparable featureset to PS4/One but nowhere near the brute capability to put it to good use." I mean if it has tessellation units comparable to PS4/One it likely doesn't have the poly crunching capability to put it to good use.

On a lighter note, I was adopted by a wild kitten yesterday. It climbed up through my basement and started eating at Northstar's food dish. He doesn't want me to touch him unless protected by Northstar, but will rub against my leg if I hide my hands. Once he sees my hands, though, that kitten runs.

I've been calling him "little one" for two months now, laying food outside for him. But he followed Northstar through the basement and has been inside pretty consistently. Decided I'd name him little Juan.
 
So here's some food for thought about that secret sauce of the different architectures of all the next-gen systems.
Oh god, who started this secret sauce nonsense? It wasn't right the first time, when the Xbox One was supposed to have 2x GPUs, and it was never right again. Seriously, can we drop it? The idea that there's actually "secret" power in a console sounds so silly.

MDX said:
What latest and greatest graphics are we talking about?
What special effects can the WiiU not do this generation?
It's the best visuals that can be achieved now and into the foreseeable future. So games like Crysis 3, Ryse, Killzone SF and other games down the line that are clearly above everyone else in power.

I don't really care about the special effects. It's always been possible to approximate effects on lesser hardware. However, that doesn't make the likes of the 3DS the "greatest" in 2013, because there are games on other platforms way more technical than it, despite it supporting normal maps, for example.
 
And brings us back around to "It might have a comparable featureset to PS4/One but nowhere near the brute capability to put it to good use." I mean if it has tessellation units comparable to PS4/One it likely doesn't have the poly crunching capability to put it to good use.

On a lighter note, I was adopted by a wild kitten yesterday. It climbed up through my basement and started eating at Northstar's food dish. He doesn't want me to touch him unless protected by Northstar, but will rub against my leg if I hide my hands. Once he sees my hands, though, that kitten runs.

I've been calling him "little one" for two months now, laying food outside for him. But he followed Northstar through the basement and has been inside pretty consistently. Decided I'd name him little Juan.
Cute. If you have another cat, make sure that kitten doesn't have feline AIDS.

It can transmit between cats easily.
 
Cute. If you have another cat, make sure that kitten doesn't have feline AIDS.

It can transmit between cats easily.
It'd be way too late for that. I can't keep my cats inside this house because of just how old and decrepit this house is. Barely has a foundation and has many open spots between inside and out. I'll do my due diligence when the kitten lets me pick him up. Which should be any time in the next few weeks, but little Juan has been hanging around Northstar for almost two months now.

I gave up on keeping the cats in. Block up holes in the foundation, block up the basement, they just find another way out. I'd have to have a good chunk of this shithole remade before it'd be possible to keep my cats inside. If little Juan has FIV then Northstar would too at this point, in which case they'll die. Tragic, very sad, but to a degree unavoidable.

I have no doubt if someone actually inspected this house it'd be condemned just based on its foundation. Not even thinking about the roof. Best I can do is get them their shots and hope for the best.
 
Something else that makes me think the Wii U wasn't intended to be DX11 is the lack of developer commitment.

Wii U was first unveiled in 2011. Literally 2 years ahead of the PS4/XBO. You're telling me NO ONE in the industry would have heard about this or attempted to try something on it until now?

Either Nintendo's communications and outreach would have to be that bad, or the tech in it wasn't much different from the PS3/360 and couldn't pull off anything good. Or it's what I've been trying to say since the last page.

Nintendo is following a philosophy similar to the Wii and lateral thinking which was to use extremely outdated hardware (in this case, the R700) and focus more resources on selling the gamepad.
 
And brings us back around to "It might have a comparable featureset to PS4/One but nowhere near the brute capability to put it to good use." I mean if it has tessellation units comparable to PS4/One it likely doesn't have the poly crunching capability to put it to good use.

Agreed, the whole discussion is moot at this point. It has been beaten to death for the past year.

I'd love for some new info to come out, but it looks unlikely. Crazy that we know more about the PS4/XBone than the Wii U.

Something else that makes me think the Wii U wasn't intended to be DX11 is the lack of developer commitment.

Wii U was first unveiled in 2011. Literally 2 years ahead of the PS4/XBO. You're telling me NO ONE in the industry would have heard about this or attempted to try something on it until now?

Either Nintendo's communications and outreach would have to be that bad, or the tech in it wasn't much different from the PS3/360 and couldn't pull off anything good. Or it's what I've been trying to say since the last page.

Nintendo is following a philosophy similar to the Wii and lateral thinking which was to use extremely outdated hardware (in this case, the R700) and focus more resources on selling the gamepad.
It seems the main performance goal of the Wii U was to be Xbox 360+ without breaking BC. That is what they did.
 
Agreed, the whole discussion is moot at this point. It has been beaten to death for the past year.

I'd love for some new info to come out, but it looks unlikely. Crazy that we know more about the PS4/XBone than the Wii U.

It seems the main performance goal of the Wii U was to be Xbox 360+ without breaking BC. That is what they did.

While using a fraction of the wattage.

It really is impressive in its own right. Just not the kind of impressive to turn heads.
 
Just done the part of Rayman Legends where the creatures swarm, and it does indeed look like they've nerfed the number of enemies to suit the weaker hardware of the PS3 and 360... unless there's a similar level in the final stage with more enemies in it?

Haven't had a chance to check out the light sources either (cos I'm too busy running for my life lol), but I don't think each individual creature acts as a light source like they did in that screenshot released last year, which showed well over a hundred of them before they decided to make it multiplatform. :o(
 
Something else that makes me think the Wii U wasn't intended to be DX11 is the lack of developer commitment.

Wii U was first unveiled in 2011. Literally 2 years ahead of the PS4/XBO. You're telling me NO ONE in the industry would have heard about this or attempted to try something on it until now?

Either Nintendo's communications and outreach would have to be that bad, or the tech in it wasn't much different from the PS3/360 and couldn't pull off anything good. Or it's what I've been trying to say since the last page.

Nintendo is following a philosophy similar to the Wii and lateral thinking which was to use extremely outdated hardware (in this case, the R700) and focus more resources on selling the gamepad.

Why is it so damn expensive to manufacture, then?
 
You can take as a fact that Xbox had S3TC.

S3TC was implemented as a compliance requirement for DirectX 6:

Source: http://en.wikipedia.org/wiki/S3_Texture_Compression

So of course a DirectX 8 part is fully compatible with it.

It is not the same as the S3 Graphics compression the GameCube used; the texture compression that Nintendo developed with S3 is exclusive to Nintendo hardware. The S3TC that was implemented in DirectX 6 was for Windows, not the original Xbox, which was using custom firmware and an API created in collaboration with NVIDIA and Microsoft.

http://en.wikipedia.org/wiki/Xbox_(console)

The Xbox runs a custom operating system which was once believed to be a modified version of the Windows 2000 kernel. It exposes APIs similar to APIs found in Microsoft Windows, such as DirectX 8.1. The system software may have been based on the Windows NT architecture that powered Windows 2000; it is not a modified version of either.

DirectX 8.0a | 4.08.00.0400 (RC14) | Last supported version for Windows 95 | February 5, 2001
DirectX 8.1 | 4.08.01.0810 | Windows XP, Windows XP SP1, Windows Server 2003 and Xbox exclusive | October 25, 2001
DirectX 8.1 | 4.08.01.0881 (RC7) | Version for the down-level operating systems (Windows 98, Windows Me and Windows 2000)

So it is not clear which APIs were available on the original Xbox, because it was custom-made. If the S3TC that the GameCube used had been available in DirectX 6 on Windows, it would have been available on the Dreamcast as well, which it is not, and I am fairly certain the Dreamcast was using a custom Windows CE firmware (CE2000).

DirectX 6.0 | 4.06.00.0318 (RC3) | Windows CE as implemented on Dreamcast | August 7, 1998

Maybe the original Xbox was using some kind of S3TC, but nowhere near as efficiently as the GameCube, because the Xbox GPU had other priorities. Plus, the memory of the GameCube was more efficient, producing double the amount of fill-rate data compared to the Xbox.

http://en.wikipedia.org/wiki/DirectX

There is no question that GC's main memory of 1T-SRAM is much more efficient than the Xbox's DDR SDRAM, as the latency of GC's 1T-SRAM is 10 ns, and the average latency of 200 MHz DDR SDRAM is estimated to be around 30 ns.

Memory efficiency is largely driven by data streaming. What that means is that developers can do optimizations to their data accesses so that they are more linear and thus suffer from less latency. Latency is highest on the first page fetch, and less on subsequent linear accesses. It's random accesses that drives memory efficiency down, as more latency is introduced in all the new page fetches.

It has been brought up that DDR SDRAM is only 65 percent effective, and it is only 65 percent effective when comparing a SDRAM based GeForce2 graphics card with a DDR based GeForce2 graphics card. The Xbox's main memory efficiency should be around 75 percent effective if one considers that the Geforce3 has a much better memory controller than what is on the Geforce2 chipsets. You can see that incredible efficiency of the Geforce3 memory controller versus the Geforce2 at AnandTech's Geforce3 review, where fill-rate is compared, and that is a good measure of memory effectiveness. The comparison at AnandTech's does not just highlight the effectiveness of the GeForce3's Lightspeed Memory Architecture (memory controller), but also highlights the effectiveness of the texture cache, and the visibility subsystem.

The GC's 1T-SRAM main memory is speculated to be 90 percent effective. A significant difference between the two memories!

So the GameCube was built to take advantage of its strengths.

Frame Buffer and Z-Buffer Accesses
The GC has a 2 MB on-chip frame (draw) buffer and z-buffer, so reads and writes to that on-chip memory buffer do not affect main memory bandwidth. The GC still has to send the frame buffer to memory for display each frame.

The Xbox stores its frame buffer and z-buffer in main memory, and it supports z-buffer compression at a 4:1 ratio, so a 32-bit z-buffer value is only 8 bits in size when compressed. The decompression and compression of z-buffer data, to and from memory, is handled automatically by the Xbox GPU.

Xbox: 640 x 480 (resolution) x 5 (frame buffer write (24-bits) + z-buffer read (1 byte) + z-buffer write (1 byte)) x 3 (overdraw) x 60 FPS = ~277 MB/sec or 0.277 GB/sec. So 4.05 GB/sec - 0.277 GB/sec = 3.77 GB/sec
GC: Only has to write out frame buffer each frame and at 60 FPS is roughly 55 MB/sec or 0.055 GB/sec. So 1.44 GB/sec - 0.055 GB/sec = 1.39 GB/sec.
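A tiny sketch that just re-runs the arithmetic quoted above (decimal GB, same assumptions of 3x overdraw and 24-bit colour), so the figures can be checked:

```cpp
// Reproduces the frame/z-buffer bandwidth estimates from the quoted analysis.
#include <cstdio>

int main() {
    const double w = 640, h = 480, fps = 60;

    // Xbox: 3-byte frame buffer write + 1-byte compressed z read + 1-byte
    // compressed z write per pixel, times an assumed overdraw of 3.
    double xbox = w * h * (3 + 1 + 1) * 3 * fps;  // bytes per second

    // GameCube: frame/z live in the 2 MB on-chip buffer, so main memory only
    // sees the final 24-bit frame buffer copy-out once per frame.
    double gc = w * h * 3 * fps;

    std::printf("Xbox: %.3f GB/s -> %.3f GB/s of 4.05 left\n",
                xbox / 1e9, 4.05 - xbox / 1e9);
    std::printf("GC:   %.3f GB/s -> %.3f GB/s of 1.44 left\n",
                gc / 1e9, 1.44 - gc / 1e9);
    return 0;
}
```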

So if the original Xbox had S3TC, why did the GPU have to do extra compression and decompression, thereby hurting framebuffer performance?

What is known:
The GC's cache is either 8 or 4 times larger than the Xbox's (128 KB or 256 KB).
The Xbox can feed its cache with 3 times more data per second than the GC.

There is also speculation that the GC cache can hold compressed textures and the Xbox cache cannot. If so, that can make a huge difference in the comparison: with a 6:1 compression ratio, the cache can hold 6 times more data. 6 MB of data for the GC compared to 128 KB or 256 KB for the Xbox is a huge difference.
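For what it's worth, the ~6:1 figure is just the standard S3TC/DXT1 block layout (whether the GameCube's variant behaves identically is part of the speculation above): every 4x4 block of 24-bit pixels, 48 bytes raw, is stored as two RGB565 endpoints plus sixteen 2-bit indices, i.e. 8 bytes.

```cpp
// Arithmetic behind the ~6:1 S3TC/DXT1 compression ratio.
#include <cstddef>
#include <cstdio>

std::size_t dxt1_size(std::size_t width, std::size_t height) {
    // Each 4x4 block is 8 bytes; dimensions round up to a multiple of 4.
    return ((width + 3) / 4) * ((height + 3) / 4) * 8;
}

int main() {
    const std::size_t w = 256, h = 256;           // example texture size
    const std::size_t raw  = w * h * 3;           // uncompressed 24-bit RGB
    const std::size_t dxt1 = dxt1_size(w, h);
    std::printf("256x256: %zu bytes raw vs %zu bytes DXT1 (%.1f:1)\n",
                raw, dxt1, static_cast<double>(raw) / dxt1);
    return 0;
}
```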

Since there is so much speculation on the two different caches for each GPU, and there is no clear calculation for an accurate comparison, the cache will not be included in our result below.

That's why it is not clear whether the Xbox had the same S3TC technology as the GameCube or a different approach to the matter.

http://segatech.com/technical/consolecompare2/

Anyway, that is not the point. The point is that Latte has not yet been put to the test by taking advantage of its GPGPU features; the differences in power will become visible in multiplatform titles that hit all three consoles (PS4, Xbox One, Wii U) at the same time.

Do you have those screenshots yet?

I am trying to find a good video capture card that supports 1080p at 60fps. I was looking for the Digital Foundry equipment, AND THE FUNNY part is that THEY DO NOT write WHAT equipment they are using, so I could at least buy the same stuff to make the same comparison. All the cards that I can find here in my country do 720p at 60fps or 1080i at 30fps. So if I am going to make an investment for my personal use, I am going to buy the best value for my money. My friend's capture card maxes out at 720p, so if I posted something from that it would be stupid and immoral on my part, so be a little more patient; I have not forgotten you.

Why is it so hard to comprehend that eDRAM running at 1 TB/s isn't anywhere near the same as the embedded RAM on either the Xbox One or the Wii U? We're talking about 1 TB/s vs ~200 GB/s at most.

That is exactly my point: WE DO NOT KNOW the bandwidth the Wii U's eDRAM provides. Maybe it's 100 or 150 or 200 or 1000 GB/s for all I know; that's why we need to see MORE games, not judge early ports, for god's sake. Some of the posters here do not want to find out the reality about Latte; they would rather bash the Wii U as a shitty, low-tech console. I only want to see what the machine is capable of, not wage console wars or graphics contests so that I can feel vindicated in having purchased the strongest console.
 
Xbox never confirmed it had S3TC.
Every NV starting from NV20 up had full S3TC lineup support (http://ixbtlabs.com/articles/nv20/). Also, PS2's GS had support for CLUT4 & 8 texture compression (4/8bit texel indexing into a lookup table), which is a form of texture compression, albeit rudimentary.

You can take as a fact that Xbox had S3TC.

S3TC was implemented as a compliance requirement for DirectX 6:

Source: http://en.wikipedia.org/wiki/S3_Texture_Compression

So of course a DirectX 8 part is fully compatible with it.
IIRC, DXTC1 through 5 were not mandatory in DX6, but they were (the entire set or thereabout) in DX8.

It's not a GCN card, which is what it has to do with it; therefore it's not a DX11 feature-set card.
Every AMD design since R800 has been DX11-compliant.
 
Something else that makes me think the Wii U wasn't intended to be DX11 is the lack of developer commitment.

Wii U was first unveiled in 2011. Literally 2 years ahead of the PS4/XBO. You're telling me NO ONE in the industry would have heard about this or attempted to try something on it until now?

Either Nintendo's communications and outreach would have to be that bad, or the tech in it wasn't much different from the PS3/360 and couldn't pull off anything good. Or it's what I've been trying to say since the last page.

Nintendo is following a philosophy similar to the Wii and lateral thinking which was to use extremely outdated hardware (in this case, the R700) and focus more resources on selling the gamepad.

Actually, Nintendo started customizing its GPU with the latest tech available at the time they started getting serious about building the thing.

Well I'm not an expert but I do know it's built on the 40nm process so I wouldn't equate it with being new.
Edit: As for the actual costs, I have no idea.

As far as I'm aware, 40nm is still cheaper than 28nm. There are lots of reasons why the Wii U is expensive to produce. That the PS360 are still over $200 in over 7 years of being on the market should tell you something other than "they want to make xtreme profits!!11!". That it's small and produces what will end up being slightly better than last gen output should be a big clue. So should the more expensive controller.
 
Actually, Nintendo started customizing its GPU with the latest available tech at the time.



As far as I'm aware, 40nm is still cheaper than 28nm.

Latest available within the power threshold they wanted, of course. It still amazes me what they've achieved using so little electricity and a mature process node. Capability in excess of the 360/PS3 while using a fraction of the power and a very simple cooling system.

It might not impress the hardcore power gamer but I find it a beautiful little design.
 
Latest available within the power threshold they wanted, of course. It still amazes me what they've achieved using so little electricity and a mature process node. Capability in excess of the 360/PS3 while using a fraction of the power and a very simple cooling system.

It might not impress the hardcore power gamer but I find it a beautiful little design.

It would be impressive if they were selling it for $200.
 
There is so much offtopic Nintendo bashing and congratulatory chest bombing on this page that it's just pitiful.

That is exactly my point: WE DO NOT KNOW the bandwidth the Wii U's eDRAM provides. Maybe it's 100 or 150 or 200 or 1000 GB/s for all I know; that's why we need to see MORE games, not judge early ports, for god's sake. Some of the posters here do not want to find out the reality about Latte; they would rather bash the Wii U as a shitty, low-tech console. I only want to see what the machine is capable of, not wage console wars or graphics contests so that I can feel vindicated in having purchased the strongest console.

Finally, someone else notices it besides me. That is why this topic can't progress right. The moment we start to uncover any features or make headway on the capabilities, it gets gangblocked by people who act as if the possibility that others might start thinking the Wii U is more capable than its launch ports demonstrated causes them physical injury.

I've seen at least 3 purely antagonistic claims made with certainty about the hardware in the last 2 pages that are completely unfounded. No one called them out for it at all, but I make the slightest positive suggestion as a "maybe" and I get gang bashed.

The only facts we know about the GPU are that it's custom-built and the size of its eDRAM, yet here we have people talking about unfounded confirmations. It makes no sense that people who dislike this hardware so much are so omnipresent in topics about it.


As for the compression that the GC used, wasn't it S3 compression?
 
In a closed box environment any bog standard DX10 compliant part could conceivably run almost every effect we're likely to see this gen. What some people in this thread don't seem to grasp is that that doesn't mean those effects are feasible for use in-game. This isn't fixed function hardware; if the grunt isn't there some effects simply won't be worth using, because they'll either demand too much memory/flops full stop or look like crap when "optimised" to the appropriate level. This is why latching onto any use of the term "DirectX11" in relation to the system as if it tells anything more than we already knew is pretty silly.

We don't know that for a fact. There's still over a third of the silicon that we don't have a Scooby Doo about. We know that Nintendo were working closely with the likes of Crytek and other engine creators... who knows? It's quite possible that Nintendo have evolved the TEV unit so that Latte can have the best of both worlds that Flipper and Hollywood didn't have - a fixed-function system alongside a standard rendering pipeline, making porting between the PS3/360/Wii U and PS4/One/Wii U possible.

Nintendo know that the nonstandard rendering pipeline of the Wii was more of a problem for developers and publishers than the Wii's lack of power. Treyarch bringing Modern Warfare to the Wii - despite Infinity Ward saying it was impossible - is proof that the Wii was capable of running your average PS3/360 game without much of a problem.

I've always wondered what would have happened last gen if Nintendo had ditched backwards compatibility and chosen a GPU with traditional programmable shaders while keeping the other components as they were - we may have seen scaled-down ports of all the major third-party titles, including the likes of Dead Space, Bioshock, Fallout 3 etc. You'd have had jaggies up the wazoo, much poorer textures, much longer loading times, pop-in etc, but we may have seen it happen.
 
Yes, the eDRAM bandwidth is a mystery. Also, going by some dev comments on the XB1 (I'll try to look for it; I read it here on GAF), it seems devs don't like the workarounds needed to use the 32 MB of ESRAM properly. In that particular mention they were complimenting the PS4's direct access/high bandwidth.

So given the above, and given the Wii U has a similar setup with 32 MB of eDRAM, plus low port budgets, small teams and new tools, IMO it is almost a given the eDRAM was not used properly in the ports. Add to this the CPU troubles, and I find it amazing the ports even ran decently.

I also find interesting what is going on with the other games for the new consoles. Another downgrade was announced: Ryse is 900p native and not 1080p. I find this interesting for putting the latest Wii U games' performance in the context of their rivals, as a lot of 720p60 games on this low-powered console seem even more amazing to me. The Wii U is capable of handling some games at 1080p, but going by Shin'en, it is not worth it, as by going 720p they can add more effects. Maybe with more optimizations this could change, but for now I still believe this is an excellent 720p machine.
 
I reminisce about the good old days. You guys remember those days? When Fourth Storm, Blu and those other dudes thought of this great idea. An idea that portrays the 'good GAF'. An idea that, though it may have disappointed some had a definite conclusion been reached, was a brilliant and progressive idea. I, like countless others who know little to nothing about this stuff, was mesmerised and drawn into the discussion and discoveries. It was brilliant. Now I am truly disheartened and very much disappointed that what was once truly informative has become what is essentially a quarrel. This is 'bad GAF' at its worst. Locking the thread would be a disservice to all the above-mentioned individuals who had a vision, one that is being eroded by nihilistic elements. To stop the spread of 'bad GAF', though, it may have to be done.

I don't know why but for some reason I just read that word as 'squirrel' lmfao
 
I don't remember the exact words, but I think the best description of the Wii U was already given in this thread... a refined current-gen machine, or the perfect current-gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.

Of course, I think someone won't agree...
 
I don't remember the exact words, but I think the best description of the Wii U was already given in this thread... a refined current-gen machine, or the perfect current-gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.
Coincidentally, Wii U's memory is the same as the Xbox 360 dev kit (1GB).
Edit: My mistake. I thought all the new ones did.
 
I don't remember the exact words, but I think the best description of the Wii U was already given in this thread... a refined current-gen machine, or the perfect current-gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.

Of course, I think someone won't agree...

Unfortunately that definition would limit the Wii U to a DirectX 9-like feature set, which does the console a great injustice.

Coincidentally, Wii U's memory is the same as the Xbox 360 dev kit (1GB).
Edit: My mistake. I thought all the new ones did.

Also, I'm pretty certain the Wii U has 2GB of memory, in the same vein that the PS4/Xbox One both have 8GB of memory.
 
Unfortunately that definition would limit the Wii U to a DirectX 9-like feature set, which does the console a great injustice.

He said current tech. It has an improved feature set, in other words, over 7th gen. He was speaking of power (as in, raw power) when talking about a "perfected" 7th gen console.
 
He said current tech. It has an improved feature set, in other words, over 7th gen. He was speaking of power (as in, raw power) when talking about a "perfected" 7th gen console.

And what is GAF using as a power baseline nowadays?

Gamecubes?

I've seen Floating Points Operations, Memory Bandwidth, Power Supply TDP, and all sorts of discrete measurements and estimates tossed around in this thread for quite some time. I've been a Software Engineer (Business software not entertainment software) for roughly 7 years, and back in the day I took all sorts of Computer Organization courses and still putz around on my FPGAs (in Verilog) when I have the urge.

Unfortunately, all of those metrics really tell us very little about what kind of games the GPU will be able to spit out. We don't have a good example of what a 50, 250, 500, 1000 or 1500 GFLOP game looks like, or what a 33, 48, or 100W TDP would enable.

Where is this thread heading?
What will actually help us get there?
 
I don't remember the exact words, but I think the best description of the Wii U was already given in this thread... a refined current-gen machine, or the perfect current-gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.

Of course, I think someone won't agree...
I agree! Good to see you posting again.
 
I don't remember the exact words, but I think the best description of the Wii U was already given in this thread... a refined current-gen machine, or the perfect current-gen machine. No more, no less (speaking about performance). Obviously with almost "current" tech. Basically what stevieP said.

Of course, I think someone won't agree...

I also agree somewhat. I think the 32MB of EDRAM has not been fully put to good use.

The RAM is 2x the PS360's, but the eDRAM is 3 times that of the X360 (the same amount as the XB1's ESRAM), plus there's the focus on so many caches.
 
I am trying to find a good video capture card that supports 1080p at 60fps. I was looking for the Digital Foundry equipment, AND THE FUNNY part is that THEY DO NOT write WHAT equipment they are using, so I could at least buy the same stuff to make the same comparison. All the cards that I can find here in my country do 720p at 60fps or 1080i at 30fps. So if I am going to make an investment for my personal use, I am going to buy the best value for my money. My friend's capture card maxes out at 720p, so if I posted something from that it would be stupid and immoral on my part, so be a little more patient; I have not forgotten you.
www.amazon.com/AVERMEDIA-Broadcaster-capture-1080p60-Component/dp/B006T8QCYA/

Why not just trust the images from Eurogamer?

Here... look.

[Screenshot crops from the Eurogamer face-off showing the same aliased edge in the 360 and Wii U versions.]
Look, it's the same. You take the resolution of the screengrab, find an aliased line, count the "steps", and see how many pixels those steps span.

As you can see, there are 12 steps, and within those 12 steps it moves vertically by 12 pixels.

Take 1080 divided by the number of pixels spanned, then multiply by the number of steps:

1080 / 12 x 12 = 1080.

The 360 version is native 1080p, just as the Wii U version is.

http://www.eurogamer.net/articles/digitalfoundry-rayman-legends-face-off

Link for the source images.
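For reference, the step-counting method described above boils down to one ratio: estimated native height is roughly capture height times (steps counted / pixels those steps span). A minimal sketch with the numbers from those shots:

```cpp
// Pixel-counting estimate of native vertical resolution from an aliased edge.
#include <cstdio>

int main() {
    const double capture_height = 1080;  // height of the screengrab
    const double steps = 12;             // distinct stair steps counted on the edge
    const double pixels_spanned = 12;    // vertical pixels those steps cover
    std::printf("Estimated native height: %.0f\n",
                capture_height * steps / pixels_spanned);
    return 0;
}
```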
 