Rumor: Wii U final specs

Till the very end, aren't we?

You seem truly frightened by the prospect of a successful Wii U, in my eyes...

The fact that the gap is far smaller than Wii -> xbox 360 somehow has people scared...

Yup, not just here either. I've even got people on another forum I frequent denying that the U has the most impressive launch/launch-window lineup seen in donkey's years. The mind boggles.

Methinks there are going to be plenty of disappointed gamers next gen when the PS4 and 720 launch/launch-window titles aren't too far different in terms of eye candy from U titles at that time. The majority of development will have been done on underpowered/unfinished dev kits, and developers will be more experienced with U development. The PS4 and 720 are going to benefit from developers working with the U having similar architecture, though, which should do them a few favours.

As for the GPU, I'm still convinced we're looking at a 28nm or 32nm process, that we're still looking at around 600-800 GFLOPS, and that Nintendo are still including some sort of fixed functions for lighting, shadows and depth of field. Although I still haven't a clue how that fixed functionality is going to integrate with engines like Unreal, CryEngine, MT Framework, etc.
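For anyone wanting to sanity-check those GFLOPS figures: peak shader throughput is just ALU count x 2 FLOPs per clock (one multiply-add) x clock speed. A minimal sketch, where the shader counts and clocks are purely illustrative guesses rather than confirmed Wii U specs:
Code:
# Rough peak-GFLOPS arithmetic behind the estimates in this thread.
# GFLOPS = shader ALUs * 2 FLOPs per clock (multiply-add) * clock in GHz.
# The ALU counts and clocks below are illustrative guesses, not confirmed specs.
def peak_gflops(shader_alus, clock_ghz):
    return shader_alus * 2 * clock_ghz

for alus, clock in [(320, 0.55), (480, 0.60), (640, 0.60)]:
    print(f"{alus} ALUs @ {clock} GHz -> {peak_gflops(alus, clock):.0f} GFLOPS")
# 352, 576 and 768 GFLOPS respectively -- so the 350-450 vs 600-800 debate
# is really just a debate about ALU count and clock speed.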
 
I'm not going to say anything but some people also said that about the GPGPU and 2GB.

And they were right on both counts. The 2GB was not all for games. I even said they'd likely include 512GB of flash storage for the OS. Plus we had the 1.5GB main RAM rumours, but it turned out worse when we found out it had 2GB: now, instead of 1.5GB for games, we have 1GB for games and 1GB for the OS. This is worse than even what I said with the flash.

Now, about the GPGPU. The whole point was never whether the Wii U GPU could do GPGPU; it's just terrible at running that kind of code. It's based on the R700, and for them to come out and say they designed it to make up for the poor CPU performance... it's just a bad design. You take a chip that's terrible at running GPGPU code and then pair it with a weak CPU.

Funny thing is, I called this stuff long ago when I saw the size of the console. Back then people were saying the LOW end was 600 GFLOPS and the high end was crazy, over a TFLOP. It's going to be around 350-450 GFLOPS at most.

http://67.227.255.239/forum/showpost.php?p=42128123&postcount=2747

Yup, not just here either. I've even got people on another forum I frequent denying that the U has the most impressive launch/launch-window lineup seen in donkey's years. The mind boggles.

Methinks there are going to be plenty of disappointed gamers next gen when the PS4 and 720 launch/launch-window titles aren't too far different in terms of eye candy from U titles at that time. The majority of development will have been done on underpowered/unfinished dev kits, and developers will be more experienced with U development. The PS4 and 720 are going to benefit from developers working with the U having similar architecture, though, which should do them a few favours.

As for the GPU, I'm still convinced we're looking at a 28nm or 32nm process, that we're still looking at around 600-800 GFLOPS, and that Nintendo are still including some sort of fixed functions for lighting, shadows and depth of field. Although I still haven't a clue how that fixed functionality is going to integrate with engines like Unreal, CryEngine, MT Framework, etc.
You are sooo wrong it's not even funny. We already have computer games out now and have seen demos of next-gen games. They look better... We're not even talking true next-gen games, just current-gen games running on a $200 GPU. The console will blow these cards out of the water.

Even at 28nm, the numbers you state are impossible.
 
As for the GPU, I'm still convinced we're looking at a 28nm or 32nm process, that we're still looking at around 600-800 GFLOPS, and that Nintendo are still including some sort of fixed functions for lighting, shadows and depth of field. Although I still haven't a clue how that fixed functionality is going to integrate with engines like Unreal, CryEngine, MT Framework, etc.

Any clues on all this?
 
Now, about the GPGPU. The whole point was never whether the Wii U GPU could do GPGPU; it's just terrible at running that kind of code. It's based on the R700, and for them to come out and say they designed it to make up for the poor CPU performance... it's just a bad design. You take a chip that's terrible at running GPGPU code and then pair it with a weak CPU.
Not again... How many times must you be told? It's not a goddamned R700. It being based on that doesn't mean much, considering the amount of customization it's had.

So are the little chicken game & Puddle the only 1080p Wii U games so far?

Scribblenauts and MHU too, as far as I know
And Trine 2 as well, I think.
 
And they were right on both counts. The 2GB was not all for games. I even said they'd likely include 512GB of flash storage for the OS. Plus we had the 1.5GB main RAM rumours, but it turned out worse when we found out it had 2GB: now, instead of 1.5GB for games, we have 1GB for games and 1GB for the OS. This is worse than even what I said with the flash.

You do know Nintendo can set aside the extra RAM for the OS now and move it over to games later, right? They did it with the 3DS and they'll do it with the Wii U.
 
Yup, not just here either. I've even got people on another forum I frequent denying that the U has the most impressive launch/launch-window lineup seen in donkey's years. The mind boggles.

Methinks there are going to be plenty of disappointed gamers next gen when the PS4 and 720 launch/launch-window titles aren't too far different in terms of eye candy from U titles at that time. The majority of development will have been done on underpowered/unfinished dev kits, and developers will be more experienced with U development. The PS4 and 720 are going to benefit from developers working with the U having similar architecture, though, which should do them a few favours.

As for the GPU, I'm still convinced we're looking at a 28nm or 32nm process, that we're still looking at around 600-800 GFLOPS, and that Nintendo are still including some sort of fixed functions for lighting, shadows and depth of field. Although I still haven't a clue how that fixed functionality is going to integrate with engines like Unreal, CryEngine, MT Framework, etc.

I have a bridge to sell you.
 
So are the little chicken game & Puddle the only 1080p Wii U games so far?
Here.

Not again... How many times must you be told? It's not a goddamned R700. It being based on that doesn't mean much, considering the amount of customization it's had.
You just wasted a minute of your life, but I assume you know that.
 
Not again... How many times must you be told? It's not a goddamned R700. It being based on that doesn't mean much, considering the amount of customization it's had.

It means a whole lot. This is still the core design of the system. Console customization does not rebuild the core design; otherwise, why would you buy the design in the first place?
 
So are the little chicken game & Puddle the only 1080p Wii U games so far?

FWIW, the screenshots in that Puddle article were rendered at 720p - the game does use a post-process AA though. The shots might as well have been from the PS360 versions (if they weren't).
 
FWIW, the screenshots in that Puddle article were rendered at 720p - the game does use a post-process AA though. The shots might as well have been from the PS360 versions (if they weren't).

The developers said it runs at 1080p though, so I'd call those the PS360 versions.
 
Now, about the GPGPU. The whole point was never whether the Wii U GPU could do GPGPU; it's just terrible at running that kind of code. It's based on the R700, and for them to come out and say they designed it to make up for the poor CPU performance... it's just a bad design. You take a chip that's terrible at running GPGPU code and then pair it with a weak CPU.

This, folks, is how you end up on someone's ignore list. This road has been traveled so much there are ruts filled with misconceptions and willful dismissal of logic. The fact that you still refuse to understand the concept of modifying a GPU base boggles my mind.
 
Yup, not just here either. I've even got people on another forum I frequent denying that the U has the most impressive launch/launch window lineup seen in donkeys years. The mind boggles.

Methinks there are going to be plenty of disappointed gamers next gen when the PS4 and 720 launch/launch window titles aren't going to be too far different in terms of eye candy compared to U titles at that time. The majority of development will be done on underpowered/unfinished dev kits and developers will be more experienced with U development. The PS4 and 720 are going to benefit from developers working with the U having similar architecture though which should do them a few favours.

As for the GPU I'm still convinced we're looking at a 28nm or 32nm process, we're still looking at around 600-800GFLOPS and that Nintendo are still including some sort of fixed functions for lighting, shadows and depth of field. Although I still haven't a clue how that fixed functionality is going to integrate with the likes of the Unreal, Cry, MT Framework etc engines.

That doesn't really make any sense.

Lighting is one thing; you could technically have a separate fixed-function lighting pass, similarly to the GC or 3DS, but it would be pretty messy to combine it with pixel shaders. But there's no way they can have a fixed-function system for shadows and DoF. I mean, you can do those things fine on hardware with lots of fixed-function systems, but they're not specific GPU functions. Those are procedures you have to do in several steps, and none of the steps involved is really so complex that it would need any special hardware.
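To put the "several ordinary steps" point in concrete terms, here is a minimal CPU-side sketch of a depth-of-field post-process. The focus depth, the single wide blur and the numpy/scipy implementation are all simplifying assumptions for illustration; the point is that every step is plain shader-style math rather than a dedicated fixed-function unit:
Code:
# Minimal sketch of a depth-of-field post-process, assuming a colour buffer
# and a depth buffer already exist as numpy arrays (values in [0, 1]).
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_of_field(colour, depth, focus_depth=0.5, max_blur=4.0):
    # 1. Circle of confusion: how out of focus each pixel is, from its depth.
    coc = np.clip(np.abs(depth - focus_depth) * max_blur, 0.0, max_blur)
    # 2. A blurred copy of the frame (one wide blur as a crude approximation).
    blurred = gaussian_filter(colour, sigma=(max_blur, max_blur, 0))
    # 3. Blend sharp and blurred images per pixel by the blur weight.
    weight = (coc / max_blur)[..., None]
    return colour * (1.0 - weight) + blurred * weight

Shadow mapping is the same story: render depth from the light, then compare per pixel in the shader. Nothing in either procedure maps to a single fixed GPU function.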
 
Trine 2 also running at 720p? Damn, if any game could benefit from the upped resolution it's Trine 2.

So much detail in those environments.

Thinking Blops 2 is 1080p at this point is just stupid.
 
So as someone who hasn't really been here for a while, what do we know now?

Is the console at least a little meaty?

I'd be perfectly fine if it were in-between the current gen and next.
 
It means a whole lot. This is still the core design of the system. Console customization does not rebuild the core design; otherwise, why would you buy the design in the first place?

It doesn't mean a whole lot. Every Radeon HD GPU released since 2008 is R700-based, and the GPUs in the PS4 and 720 will also be R700-based.
 
In this thread: a bunch of folks trying desperately to pretend they have an MS in computer engineering and an extensive background developing mass-market electronic devices to justify whatever agenda/confirmation bias they're pushing.
 
R700 is DX10, Evergreen is DX11.
I think he's saying the architecture in Evergreen retains some R700 roots, which is true. The stream processors did not evolve.

"The general principle of the computing section has not changed much in the RV870. It is still based on shader processors with superscalar design, each processor incorporating five ALUs four of which are general-purpose ALUs and the fifth is a special-purpose ALU capable of executing complex instructions like SIN, COS, LOG, EXP, etc."
http://www.xbitlabs.com/articles/video/print/radeon-hd5870.html

Edit: The comparison works for those GPUs. I'm not sure about the rest.
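For a sense of what that VLIW-5 layout adds up to, the HD 5870 (Cypress) figures from that article work out as below; the unit counts and clock are the 5870's published specs, nothing here is a Wii U number:
Code:
# Peak throughput of the HD 5870 (Cypress) from its VLIW-5 layout:
# 320 VLIW units x 5 ALUs each = 1600 stream processors, each doing a
# multiply-add (2 FLOPs) per clock at 850 MHz.
vliw_units, alus_per_unit, clock_ghz = 320, 5, 0.85
stream_processors = vliw_units * alus_per_unit      # 1600
peak_gflops = stream_processors * 2 * clock_ghz     # 2720 GFLOPS
print(stream_processors, peak_gflops)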
 
The game wouldn't gain anything going from 720p/60fps to 1080p/60fps unless you own a massive TV, which a very small % of the market does. Even then, it's arguable whether it's worth the effort.

I have my PC hooked up to my TV (which isn't massive) and the difference between 720p and 1080p is night and day. It's not just jaggies; there is a ton of texture detail simply being lost at 720p.
 
Yeah, it doesn't take a very large TV for 1080p to be very noticeable. If you have a 26-incher, though, and are sitting like 10 feet away, it won't matter for you.
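A rough way to put numbers on that, assuming the usual ~1 arcminute visual-acuity rule of thumb and the 26-inch/10-foot example from the post above (both purely illustrative):
Code:
# Angular size of one pixel for a 16:9 panel, in arcminutes.
# ~1 arcminute is the usual rule-of-thumb limit of normal visual acuity.
import math

def pixel_arcminutes(diagonal_in, rows, distance_in):
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # physical panel height
    pixel_in = height_in / rows                       # physical pixel height
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

for rows in (720, 1080):
    print(rows, round(pixel_arcminutes(26, rows, 120), 2), "arcmin per pixel")
# ~0.51 arcmin at 720p and ~0.34 at 1080p: at that size and distance even
# 720p pixels sit below the acuity threshold, so the extra resolution is
# essentially invisible; sit closer or go bigger and it isn't.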
 
I think he's saying the architecture in Evergreen retains some R700 roots, which is true. The stream processors did not evolve.

"The general principle of the computing section has not changed much in the RV870. It is still based on shader processors with superscalar design, each processor incorporating five ALUs four of which are general-purpose ALUs and the fifth is a special-purpose ALU capable of executing complex instructions like SIN, COS, LOG, EXP, etc."
http://www.xbitlabs.com/articles/video/print/radeon-hd5870.html
Eh? Southern Islands' GCN shader architecture has almost nothing to do with the R700's VLIW-5 shader architecture, above all in terms of flexibility and programmability.
 
I think he's saying the architecture in Evergreen has some R700 roots, which is true. The stream processors did not evolve.
Some roots?

Evergreen Family Instruction Set Architecture said:
Differences Between the R700-Family and Evergreen-Family of Devices
The following bullets provide a brief overview of the more important differences between the R700-family and the Evergreen-family of GPUs.
• Pixel parameter interpolation is now performed in kernel code, rather than in fixed-function hardware. Parameter data is preloaded into the local data share (LDS) before a pixel wavefront launch, and the kernel uses new INTERP_* instructions to evaluate a primitive's vertex attribute value of each pixel.
• Texture and vertex fetch clauses are now defined by which cache (TC or VC) services the clause, rather than by fetch-type.
• Local data share (LDS) is now accessed through ALU instructions, rather than fetch instructions.
• Added support for jump-tables.
• Added the ability to write from a maximum of four streams to a maximum of four stream-out buffers.
• Added support for flexible DX11 tessellation using hull shaders (HS) and domain shaders (DS).
• Added work-group synchronization in hardware for compute shaders (CS).
• Added support to dynamically index texture resources, texture samplers, and ALU constant buffers.
• Removed the Fbuffer and the Reduction buffer.
• Added support for floating point rounding and denormal modes.
• ALU clauses can now use up to four constant buffers using the ALU_EXTENDED opcode.
• Added support for exception flag collection.
• Added single-step control of the control flow, allowing instructions to be issued through register writes to the SQ, as well as arbitrary instructions to be inserted in the execution path.
 
I don't think so, jim.

NI and R700 are incomparable. PS4 and 720 are likely SI based.
Architecture-wise, Cypress and Wekiva are quite comparable, and so are Cayman and Cypress. Everything until Tahiti (GCN) is comparable, as it's the same evolution line.
 
Technically speaking, couldn't the Wii U's MCM be considered a SoC? It's got a CPU, GPU, a bit of memory and possibly an I/O chip.

An SoC is everything on one chip.

An MCM is just like the Wii U: CPU/GPU and eDRAM on one piece of silicon.

 
An SoC is everything on one chip.

An MCM is just like the Wii U: CPU/GPU and eDRAM on one piece of silicon.
MCM has nothing to do with what parts are on the same silicon. It's about multiple dies on the same package substrate.
 
R700 is DX10, Evergreen is DX11.

Evergreen is basically an R700 evolved to support a list of features defined by the DX11 API. GPU7 is an R700 evolved to support a list of features defined by Nintendo for its custom Wii U API. In both cases the end result isn't an R700, despite both GPUs being based on that chip.

Nintendo/AMD started development of GPU7 on the newest GPU available when Wii U development started (which happened to be a DX10.1 part) and customised it to support the features they wanted most. There would be no reason for Nintendo to develop it around a Microsoft API like AMD did with Evergreen, which is why the whole "it's not DX11" thing is so silly.

BTW, I'm confused why this Wii U teardown has spawned some negative talk about the GPU. The die looks quite big to me. Even if you remove a significant portion for the eDRAM (by the way, it's a big positive that it's on-chip, which quite a few people doubted but I never did), it's still quite large.
 
Evergreen is basically an R700 evolved to support a list of features defined by the DX11 API. GPU7 is an R700 evolved to support a list of features defined by Nintendo for its custom Wii U API.

They started with the newest GPU available when Wii U development started (which happened to be a DX10.1 GPU) and customised it to support the features they wanted most.

There would be no reason for Nintendo to develop its GPU around a Microsoft API, which is why the whole "it's not DX11" thing is so silly.

BTW, I'm confused why this Wii U teardown has spawned some negative talk about the GPU. The die looks quite big to me. Even if you remove a significant portion for the eDRAM (by the way, it's a big positive that it's on-chip, which quite a few people doubted but I never did), it's still quite large.

Well, what worries me is not the CPU inside the MCM, it's the GPU.

Here's a picture of a Core i5 with integrated graphics. The smaller chip is the CPU and the larger one is the GPU, just like the Wii U.

[image: mobile Core i5 package]

And here's a picture of the R700 die on a dual setup:

[image: dual R700 graphics card]

And again, a picture of the Wii U die:

[image: Wii U MCM]

**I know the images aren't to the same scale, but you can guesstimate the actual sizes.

Notice the following:
- The size of the Wii U CPU compared against the first picture, the mobile Core i5: it is considerably smaller.
- The size of the Wii U GPU is considerably smaller than the R700 GPU die in the second picture.
- Summing up the first two observations, both units are at most half the size of the Core i5 CPU die and the R700 GPU die, so there is no way it will be more than 2-3 times the performance of the current gen.
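One caveat on reading raw die sizes off photos: the chips being compared sit on different process nodes, so equal area doesn't mean an equal transistor budget. A rough sketch of the scaling, where every area and node below is a hypothetical placeholder rather than a measured Wii U figure:
Code:
# Why raw die area is a poor cross-node performance predictor.
# All numbers here are illustrative placeholders, not measured figures.
def relative_transistor_budget(area_mm2, node_nm, ref_area_mm2, ref_node_nm):
    # Transistor density scales roughly with the inverse square of feature size.
    density_gain = (ref_node_nm / node_nm) ** 2
    return (area_mm2 / ref_area_mm2) * density_gain

# Hypothetical: a 150 mm^2 die at 40 nm vs a 180 mm^2 die at 90 nm.
print(relative_transistor_budget(150, 40, 180, 90))  # ~4.2x the transistors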
 
Well, what worries me is not the CPU inside the MCM, it's the GPU.

Here's a picture of a Core i5 with integrated graphics. The smaller chip is the CPU and the larger one is the GPU, just like the Wii U.

[image: mobile Core i5 package]

And here's a picture of the R700 die on a dual setup:

[image: dual R700 graphics card]

And again, a picture of the Wii U die:

[image: Wii U MCM]

**I know the images aren't to the same scale, but you can guesstimate the actual sizes.

Notice the following:
- The size of the Wii U CPU compared against the first picture, the mobile Core i5: it is considerably smaller.
- The size of the Wii U GPU is considerably smaller than the R700 GPU die in the second picture.
- Summing up the first two observations, both units are at most half the size of the Core i5 CPU die and the R700 GPU die, so there is no way it will be more than 2-3 times the performance of the current gen.

The GameCube had a very strange design to it: much smaller than the PS2, more powerful, and $100 cheaper. I wouldn't look at size to judge how powerful a Nintendo console is.
 
Trine 2 also running at 720p? Damn, if any game could benefit from the upped resolution it's Trine 2.

So much detail in those environments.

Thinking Blops 2 is 1080p at this point is just stupid.
Pretty much. Although I still think we should believe Rayman is 1080p until specifically told otherwise, as there's footage of the game running at 1080p.

http://wiiunews.at/wp-content/gallery/rayman_legends_1080/rayman-legends-wallpaper-1.jpg <- screen

and

http://gamersyde.com/download_rayman_legends_trailer_fr_-28017_en.html <- video
 
I wouldn't count on BLOPS 2 being 1080p either. The statement has yet to be clarified by them, and even though "full HD" usually means 1080p, I don't actually believe they have outright said "1080p". They may just be using the full HD term loosely as a buzzword, and I think Activision has been misleading about native resolution statements in the past.

Especially with all the confusion, speculation and contention over the actual resolution, I would think Activision would have cleared this up by now if it truly was 1080p. But at the same time, they have not chosen to correct journalists from big sites that have reported it as 1080p.

I've never bought a COD game and maybe I'm totally wrong about it but I remember reading about COD games being Sub-HD... like 650p or something crazy like that. Maybe they finally managed to get it running at 720p and they're calling it Full-HD lol
anyway, I couldn't care less about this game
 
I've never bought a COD game and maybe I'm totally wrong about it but I remember reading about COD games being Sub-HD... like 650p or something crazy like that. Maybe they finally managed to get it running at 720p and they're calling it Full-HD lol
anyway, I couldn't care less about this game

It runs 540p on the PS3 (or something close to that).
 
It runs 540p on the PS3 (or something close to that).

MW3 ran at 1024x600 on both PS3 and 360. I'd expect BLOPS 2 to run at this res on both of those consoles too. Apparently they run at that resolution on the 360 to avoid tiling (i.e., to fit inside the 10MB eDRAM) and on the PS3 because the GPU is weak. Neither of these things should be a problem for the Wii U.

I think it's safe to presume 720p native, and that the rep meant 'not sub-HD' when he said 'full HD'. I won't be surprised if it's 600p though, but my gut feeling says 720p native based on the reasons Digital Foundry stated for it being 600p on the other consoles... *and 1080p native for Rayman*. Remember, Activision regularly say COD is 60fps, when it doesn't run at anything approaching an average of 60fps on PS3, and regularly fails to hold 60 on the 360.
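The 10MB eDRAM reasoning checks out with back-of-envelope framebuffer maths. The 2x MSAA and 8-bytes-per-sample figures below (32-bit colour plus 32-bit depth/stencil) are assumptions for illustration, not confirmed settings for any particular COD game:
Code:
# Framebuffer size vs the 360's 10 MB eDRAM, behind the tiling argument.
# Assumes 32-bit colour + 32-bit depth/stencil per sample; MSAA multiplies both.
def framebuffer_mb(width, height, msaa=1, bytes_per_sample=8):
    return width * height * msaa * bytes_per_sample / (1024 * 1024)

for w, h in [(1024, 600), (1280, 720), (1920, 1080)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h, msaa=2):.1f} MB with 2x MSAA")
# 1024x600 (~9.4 MB) squeezes inside 10 MB; 1280x720 (~14.1 MB) does not,
# hence either tiling or a sub-720p resolution on the 360.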
 
It runs 540p on the PS3 (or something close to that).

I knew it
my resolution was off tho :D

MW3 ran at 1024x600 on both PS3 and 360. I'd expect BLOPS 2 to run at this res on both of those consoles too. Apparently they run at that resolution on the 360 to avoid tiling (i.e., to fit inside the 10MB eDRAM) and on the PS3 because the GPU is weak. Neither of these things should be a problem for the Wii U.

I think it's safe to presume 720p native, and that the rep meant 'not sub-HD' when he said 'full HD'. I won't be surprised if it's 600p though, but my gut feeling says 720p native based on the reasons Digital Foundry stated for it being 600p on the other consoles... *and 1080p native for Rayman*. Remember, Activision regularly say COD is 60fps, when it doesn't run at anything approaching an average of 60fps on PS3, and regularly fails to hold 60 on the 360.

Then I guess I was right: it will be 720p on Wii U, meaning native HD, but they just fucked it up by saying it was full HD, aka 1080p.
 
Well, we don't know for sure yet, but I think it's the sensible thing to presume.
I think it's a fair assumption on your part. What could be interesting is whether reaching 720p would make such a significant difference. I can tell you that if it wasn't for the pixel counters, not many people would know that the COD games were running at sub-HD resolutions.
 
I think it's a fair assumption on your part. What could be interesting is whether reaching 720p would make such a significant difference. I can tell you that if it wasn't for the pixel counters, not many people would know that the COD games were running at sub-HD resolutions.

I agree with this.
 
I think it's a fair assumption on your part. What could be interesting is whether reaching 720p would make such a significant difference. I can tell you that if it wasn't for the pixel counters, not many people would know that the COD games were running at sub-HD resolutions.

I don't think it'd be an astonishing difference, but it'd definitely be noticeably better.
 