Rumor: Wii U final specs

I was talking in terms of performance, not overall sales potential or 3rd party support.



True. I guess it depends on what aspect of the systems we're looking at. The gap in memory may be comparable to, if not larger than, the DC -> Xbox gap, but the gap in processing performance may be smaller, for example.
Oh... I think that's where the confusion came in. "PS2 of next gen" tends to refer to the system as the base for 3rd party development (then having games ported up to more capable console).

In terms of spec, I'd say DC to GC is probably pretty realistic in general. DC to Xbox might be a bit too far, I'm not expecting 8GB in Durango/PS4.
 
When I refer to DirectX 11-like effects, I'm referring to the efficiency that comes with taking advantage of the new pipeline stages (hull and domain shaders, compute shaders). So while there should be some crossover between the PS3/360 and DirectX 11, there's no way they can render as many complex lights, dynamic shadows, and particles, or subdivide polygons on the fly, with the same efficiency that DirectX 11 can.
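To make the tessellation point concrete, here's a rough back-of-the-envelope sketch (the mesh sizes and tessellation factor are made-up illustrative numbers, not real pipeline code) of why subdividing polygons on the GPU beats shipping a pre-subdivided mesh every frame:

```python
# Rough, illustrative arithmetic only -- not real pipeline code.
# With DX11-style hardware tessellation, you upload a coarse mesh and let
# the hull/domain shaders subdivide it on the GPU, instead of storing the
# fully subdivided mesh in memory and pushing it over the bus.

BYTES_PER_VERTEX = 32  # position + normal + UV, a common layout

def mesh_size(num_patches, verts_per_patch):
    """Memory footprint of a mesh in bytes (vertices only)."""
    return num_patches * verts_per_patch * BYTES_PER_VERTEX

coarse = mesh_size(1_000, 4)             # 1,000 quad patches sent to the GPU
tess_factor = 16                          # each patch subdivided 16x16 on-chip
pre_subdivided = mesh_size(1_000, 4 * tess_factor * tess_factor)

print(f"coarse mesh:     {coarse / 1024:.0f} KiB")
print(f"pre-subdivided:  {pre_subdivided / 1024:.0f} KiB")
print(f"data saved:      {pre_subdivided // coarse}x")
```

The same final triangle count reaches the rasterizer either way; the difference is how much vertex data has to live in memory and cross the bus.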

A higher DX version (in hardware) doesn't always mean better graphics. My PC GPU is a DX10 GPU (better than DX9) with 1GB of VRAM, and it can't run games at the same level as the PS3/360. We need to know more of the specs than just the DX version supported.
 
A higher DX version (in hardware) doesn't always mean better graphics. My PC GPU is a DX10 GPU (better than DX9) with 1GB of VRAM, and it can't run games at the same level as the PS3/360. We need to know more of the specs than just the DX version supported.
Well, we know it's based on a R770. How's that compare to the R520 that Xenos was based off?
 
Well, we know it's based on a R770. How's that compare to the R520 that Xenos was based off?
I believe the R700 series is two generations ahead of what Xenos was based on, so it should be a bit more efficient, in addition to whatever difference there is in raw power and whatever modifications the Wii U's GPU has.
 
Oh... I think that's where the confusion came in. "PS2 of next gen" tends to refer to the system as the base for 3rd party development (then having games ported up to more capable console).

In terms of spec, I'd say DC to GC is probably pretty realistic in general. DC to Xbox might be a bit too far, I'm not expecting 8GB in Durango/PS4.

Yeah, that's definitely where the confusion is coming from on my end lol. I always thought people were referring to the performance of the system relative to the competition.

Even if the other systems have 4GB of memory, that's still a bigger gap than what we saw from the DC -> GC. Though it's still too early to even start comparing specs. =p

Well, we know it's based on a R770. How's that compare to the R520 that Xenos was based off?

Xenos was based on the unreleased R400, not the R520. =)
 
Xenos was based on the unreleased R400, not the R520. =)

It is a mix of R520 and R600 (Wikipedia):

The Xenos is a custom graphics processing unit (GPU) designed by AMD (former ATI), used in the Xbox 360 video game console. Developed under the codename "C1,"[1] it is in many ways related to the R520 architecture and therefore very similar to an ATI Radeon X1800 series of PC graphics cards as far as features and performance are concerned. However, the Xenos introduced new design ideas that were later adopted in the R600 series, such as the unified shader architecture. The package contains two separate silicon dies, the GPU and an eDRAM, featuring a total of 337 million transistors.
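The practical win of that unified shader design can be sketched with toy numbers (purely illustrative; the unit counts and workloads below are made up, not Xenos figures). With discrete vertex and pixel shaders, the slower pool sets the frame time; with unified shaders, every ALU can take either kind of work:

```python
# Toy model (illustrative numbers, not real hardware figures): why a unified
# shader architecture can beat a fixed split of vertex and pixel units.

def frame_time_fixed(vertex_work, pixel_work, vertex_units, pixel_units):
    # Discrete shaders: each pool only works on its own job type,
    # so whichever pool finishes last sets the frame time.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def frame_time_unified(vertex_work, pixel_work, total_units):
    # Unified shaders: every ALU can take either job, so the total
    # work is spread evenly across all units.
    return (vertex_work + pixel_work) / total_units

# A pixel-heavy scene: lots of shading, comparatively little geometry.
vertex_work, pixel_work = 10.0, 90.0

fixed = frame_time_fixed(vertex_work, pixel_work, vertex_units=8, pixel_units=16)
unified = frame_time_unified(vertex_work, pixel_work, total_units=24)

print(f"fixed split: {fixed:.2f}")    # pixel units are the bottleneck
print(f"unified:     {unified:.2f}")  # same 24 ALUs, better balanced
```

The same 24 units finish the frame sooner when they can all chew on whatever work is queued, which is why the idea stuck from R600 onward.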
 
Well, we know it's based on a R770. How's that compare to the R520 that Xenos was based off?

The R770 blows Xenos away. This is coming from someone who had a 4850 in the past. But don't expect it to be the same GPU in the console. It may be based on it, but it'll be a custom GPU.
 
Yeah, that's definitely where the confusion is coming from on my end lol. I always thought people were referring to the performance of the system relative to the competition.

Even if the other systems have 4GB of memory, that's still a bigger gap than what we saw from the DC -> GC. Though it's still too early to even start comparing specs. =p
Well, the GC had two memory pools (24MB & 16MB) while the DC just had its 16MB of main RAM. I did forget the DC's 8MB of VRAM though. In that case I'd say it's probably too little, yeah; I'm expecting 4-6GB depending on the target price point.
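For what it's worth, a quick back-of-the-envelope on those pools (using the commonly cited sizes; the 4GB next-gen figure is just the rumor being discussed, not a confirmed spec):

```python
# Rough RAM-gap comparison in MB, using commonly cited pool sizes.
dreamcast = 16 + 8        # 16MB main RAM + 8MB VRAM
gamecube = 24 + 16        # 24MB 1T-SRAM + 16MB ARAM
wii_u = 2048              # total system RAM
rumored_next_gen = 4096   # the rumored 4GB for Durango/PS4

print(f"DC -> GC:          {gamecube / dreamcast:.2f}x")
print(f"Wii U -> next-gen: {rumored_next_gen / wii_u:.2f}x")
```

Even on that crude count, a 4GB next-gen box would be a wider relative gap than DC -> GC was.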


Xenos was based on the unreleased R400, not the R520. =)
You're right, I had it backwards. R520 was more "based on" Xenos, not the reverse, though really they were developed in tandem. :/
 
So much trolling and people talking about fairytale specs of PS4 / 720 in this thread...

The ONLY person that should be listened to in this thread is Ideaman because at the end of the day he is the only person who is in contact with developers who are working on the system.

If he says they are happy with the combination of its CPU, GPU and RAM, then that is good enough for me.

Roll on Nintendo games made on 2011 hardware, in HD, with a decent online marketplace, with a decent online gaming service and all the innovation that will come with the new controller.

Haters gonna hate.
 
Ok, some things from R520, some R600 (and custom things).
R520 is the closest GPU equivalent to Xenos afaik, but unlike the R770/GPU7, it was developed in parallel with Xenos. Xenos took nothing from R600, those came later.

It's like saying Wii U took some things from R770 and some things from R800 (like compute shaders). Which is actually true, but still isn't a 1:1 comparison with Xenos and R520/R600.

Basically my comparison was wrong in the first place. It's really more like R400:Xenos to R700:GPU7.
 
From a Beyond3D article:

A name that has long since been mentioned in relation to the graphics behind Xenon (the development name for XBOX 360) is R500. Although this name has appeared from various sources, the actual development name ATI uses for Xenon's graphics is "C1", whilst the more "PR friendly" codename that has surfaced is "Xenos". ATI are probably fairly keen not to use the R500 name as this draws parallels with their upcoming series of PC graphics processors starting with R520, however R520 and Xenos are very distinct parts. R520's aim is obviously designed to meet the needs of the PC space and have Shader Model 3.0 capabilities as this is currently the highest DirectX API specification available on the PC, and as such these new parts still have their lineage derived from the R300 core, with discrete Vertex and Pixel Shaders; Xenos, on the other hand, is a custom design specifically built to address the needs and unique characteristics of the game console. ATI had a clean slate with which to design on and no specified API to target. These factors have led to the Unified Shader design, something which ATI have prototyped and tested prior to its eventual implementation ( with the rumoured R400 development ? ) , with capabilities that don't fall within any corresponding API specification. Whilst ostensibly Xenos has been hailed as a Shader Model 3.0 part, its capabilities don't fall directly inline with it and exceed it in some areas giving this more than a whiff of WGF2.0 (Windows Graphics Foundation 2.0 - the new name for DirectX Next / DirectX 10) about it.

R400 but with some R500 features.

R520 is the closest GPU equivalent to Xenos afaik, but unlike the R770/GPU7, it was developed in parallel with Xenos. Xenos took nothing from R600, those came later.

It's like saying Wii U took some things from R770 and some things from R800 (like compute shaders). Which is actually true, but still isn't a 1:1 comparison with Xenos and R520/R600.

Basically my comparison was wrong in the first place. It's really more like R400:Xenos to R700:GPU7.

You are right ;)
 
It is a mix of R520 and R600 (Wikipedia):

There's a reason why we weren't allowed to use Wiki as a reference in college. ;p

It's a common mistake and I'm not surprised it's on wiki. The R520 used discrete pixel and vertex shaders, the R400 was ATI's first attempt at a unified shader architecture.

From B3D:

ATI are probably fairly keen not to use the R500 name as this draws parallels with their upcoming series of PC graphics processors starting with R520, however R520 and Xenos are very distinct parts. R520's aim is obviously designed to meet the needs of the PC space and have Shader Model 3.0 capabilities as this is currently the highest DirectX API specification available on the PC, and as such these new parts still have their lineage derived from the R300 core, with discrete Vertex and Pixel Shaders; Xenos, on the other hand, is a custom design specifically built to address the needs and unique characteristics of the game console. ATI had a clean slate with which to design on and no specified API to target. These factors have led to the Unified Shader design, something which ATI have prototyped and tested prior to its eventual implementation ( with the rumoured R400 development ? ) , with capabilities that don't fall within any corresponding API specification.

http://www.beyond3d.com/content/articles/4/2

New ATI graphics high end core succeeding the R300/R350 (Radeon 9700/9800). The R400 has been delayed until 2004 due to: 3GIO, DDR-3 and DX9 pixel shader 3.0. It is renamed to R500, while the less ambitious R420 takes over in the short term.

http://endian.net/details.aspx?tag=atir400

High end ATI graphics chip for XBox2 built in a 90 nm process. Part of a new generation, comparable to R600 for the PC.
Early on reported to "have 10 times higher geometry and 4 times higher pixel performance compared to the RADEON X800 XT" (believe it if you want to).
May support WGF and Pixel and Vertex shaders 3.0 with 128 bit precision. Also known as Xenos.

http://endian.net/details.aspx?tag=atir500

Edit: beaten
 
I was talking in terms of performance, not overall sales potential or 3rd party support.
Which does make sense considering that this is the spec thread :/ my mistake.

I do believe that in terms of specs it will compare to them the way the PS2/DC compared to the GC/Xbox. The big difference is that while the PS2 was technically weaker than its competitors, all three consoles that generation had a strength the others couldn't quite match. I doubt the Wii U will be stronger in any one area than the PS4 or the next Xbox.
 
No, that's not new to the R800; it's in the R700.

Where has it been confirmed as DX11? lol
Sorry, I meant Iwata confirmed it to be a GPGPU. Though yes, that's not really R800 specific.

I didn't say it was confirmed to be DX11; it won't run any version of DirectX. Unity Technologies confirmed it was capable of DX11-equivalent features.
 
Sorry, I meant Iwata confirmed it to be a GPGPU. Though yes, that's not really R800 specific.

I didn't say it was confirmed to be DX11; it won't run any version of DirectX. Unity Technologies confirmed it was capable of DX11-equivalent features.

Of course it's a GPGPU, because it has compute shaders. We have known for a long time that it's an R700 core.

That statement means nothing.
 
Don't get so defensive, I was just seeing what your $10m figure was based on. And now that I know it's NOTHING, feel free to get defensive. :P

I wasn't getting defensive, or at least that was not my intent, despite how it may have come off. Message boards are tricky that way.

And for the record, it isn't based on nothing. Just do a Google search on game development budgets and you're likely to find articles that put single-platform development at around $10 million on average. Add to that the fact that this isn't some second-rate developer on the project, either. It all adds up to a AAA game to me. Whether it receives a AAA score, however, is another story.

Are they recent articles? Nope... but I'm not basing my opinion on current figures either, as I haven't looked into it recently... :P

Oh yeah.. I worked in the game industry too... that helps
 
I didn't say it was confirmed to be DX11; it won't run any version of DirectX. Unity Technologies confirmed it was capable of DX11-equivalent features.

That means nothing. Here's what Crytek said about Crysis 3 on the PS3/360:

"It is very, very difficult, but it is possible. It just requires a lot of effort. Some of the stuff these guys are making work on consoles now is absolutely amazing. It's render features that shouldn't theoretically work on consoles, but they've managed to construct code that can emulate a similar thing from a… hack and slash sounds wrong, but they don't have the same streamlined pipeline you would have with a DX11 structure, but they can get to a similar result just by experimenting and using tips and tricks."

What Helgason said about Unity on Wii U:

Gaming Blend: While the Wii U doesn't specifically use DirectX functionality, will the Unity Engine for the Wii U allow for DirectX 11 equivalent functionality in regards to shaders, soft and self shadowing as well as potential scalability for shader 5.0 (or higher)?

Helgason: Yeah. We'll do a – we'll make it potentially possible to do.

The full article: http://www.cinemablend.com/games/Interview-Why-Unity-Engine-Good-Fit-Wii-U-47173.html
 
But then, their issues could derive from ten different origins. They may not be putting the co-processors, such as the DSP, to good use to alleviate some of the processing burden; their middleware versions may not be fully optimized for the system; the adaptation of their engine to the Wii U's specifics may not be good enough; etc. This could be a problem witnessed at a certain time: their complaints were relevant six months ago but not now, they took more time than other studios to learn how to exploit the hardware with the CPU/memory/GPU trio, etc. There are tons of variables. But one sure thing is that three studios have developed ambitious titles on the platform without ever criticizing the CPU.

Most studios aren't going to publicly criticize a platform they're working on and that is under NDA...

But there was also the Kotaku article calling it a performance orphan.
 
Most studios aren't going to publicly criticize a platform they're working on and that is under NDA...

But there was also the Kotaku article calling it a performance orphan.

But they haven't publicly talked at all; they have spoken with Ideaman secretly. There's no reason for them, in their anonymity, to talk up the system if they don't think it's up to scratch.
 
Most studios aren't going to publicly criticize a platform they're working on and that is under NDA...

But there was also the Kotaku article calling it a performance orphan.
IdeaMan isn't the public, and by this point prior to the Wii's release the bitching from devs about its shortcomings was louder than the crowd at a football game; so much for NDAs.

But Kotaku's a trusted news source, right?
 
IdeaMan isn't the public, and by this point prior to the Wii's release the bitching from devs about its shortcomings was louder than the crowd at a football game; so much for NDAs.

But Kotaku's a trusted news source, right?

It wasn't just the Wii; last generation showed me devs are babies and like the gaming public, be it consumers or manufacturers, to pick up a tab most really don't want to pay for. The bitching about RAM on the HD twins showed they are divas when it comes to this, and they will gladly play hardware games over something they knock the PC realm for being all about at times.
 
I don't understand what certain regular posters get out of this constant hate campaign against the Wii U. It's just worryingly weird now... like dressing up in your mother's clothes weird.
 
Most studios aren't going to publicly criticize a platform they're working on and that is under NDA...

But there was also the Kotaku article calling it a performance orphan.

Well, I was obviously talking about information given to me informally. I don't know why they would hide problems originating from the CPU when they have criticized other parts of the Wii U over the past year (some SDK revisions, etc.). No, really, the CPU is a perfect fit for the rest. That doesn't mean it's a beast, but it's at the very least balanced with the memory and GPU.
 
Well, I was obviously talking about information given to me informally. I don't know why they would hide problems originating from the CPU when they have criticized other parts of the Wii U over the past year (some SDK revisions, etc.). No, really, the CPU is a perfect fit for the rest. That doesn't mean it's a beast, but it's at the very least balanced with the memory and GPU.

Could it be possible that their games just aren't CPU-intensive? Every game has different performance requirements; sometimes they are CPU-bound, sometimes GPU-bound, sometimes both (if the CPU is a bottleneck, that could affect what's fed to the GPU), etc.

What concerns me most about the CPU is the lack of info. We know there are three cores built up from the Broadway architecture, that it contains some eDRAM, and that's about it. Not that I expect you to answer these questions, but I'd love to know how many threads the chip supports, the clock speed, the amount of eDRAM, etc.
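The CPU-bound vs GPU-bound distinction can be sketched with a toy model (illustrative numbers only, and it assumes CPU and GPU work overlap across frames, which real engines only approximate):

```python
# Illustrative sketch: a frame is roughly gated by whichever of the CPU or
# GPU takes longer, assuming their work pipelines across frames. A game
# that is GPU-bound barely notices a slower CPU; a CPU-bound one does.

def frame_ms(cpu_ms, gpu_ms):
    """Frame time under a simple max() bottleneck model."""
    return max(cpu_ms, gpu_ms)

gpu_heavy = frame_ms(cpu_ms=8.0, gpu_ms=15.0)   # GPU-bound: CPU has headroom
cpu_heavy = frame_ms(cpu_ms=20.0, gpu_ms=12.0)  # CPU-bound: GPU sits idle

print(f"GPU-bound frame: {gpu_heavy:.1f} ms")
print(f"CPU-bound frame: {cpu_heavy:.1f} ms")
```

In the first case a weaker CPU costs nothing; in the second, the GPU could be twice as fast and the frame rate wouldn't move, which is why "balanced" matters more than any single spec.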
 
I don't understand what certain regular posters get out of this constant hate campaign against the Wii U. It's just worryingly weird now... like dressing up in your mother's clothes weird.
I'm starting to wonder if there is some fear going around that if Nintendo does manage to find the sweet spot that they missed with the Wii (not too weak, not prohibitively powerful) that the Wii U will stymie development for the other consoles as the 3DS had done in some respect to the Vita.

I think the thought of more Bayo2 like announcements is enough to cause concern amongst those that don't want to see the market take its direction from Nintendo and leads to the constant negativity and naysaying that we see from the same consistent group of posters in Wii U related threads.
I trust Totillo more than I trust Ideaman.

I'm not even sure what a "balanced" CPU is.
I don't know who IdeaMan is in his professional life so I'm not putting my trust in either but what I do trust is my ears. As much as I didn't want to believe it back in 2006 pretty much any developer that went on record told us that the Wii wasn't the console that most were expecting it to be. Some said it in a polite way and more than a few said it in a not so polite way but the message was clear.

Now, I'm not sure why devs would suddenly start holding their tongues, but the general message we've been getting from interviews so far seems to equate to "nothing to write home about, but we can work with it".

That's actually better than what I was expecting a year ago.
 
IdeaMan isn't the public, and by this point prior to the Wii's release the bitching from devs about its shortcomings was louder than the crowd at a football game; so much for NDAs.

But Kotaku's a trusted news source, right?

I trust Totillo more than I trust Ideaman.

I'm not even sure what a "balanced" CPU is.
 
I don't understand what certain regular posters get out of this constant hate campaign against the Wii U. It's just worryingly weird now... like dressing up in your mother's clothes weird.

The same reason people defend Nintendo tooth and nail. Or the same reason certain posters constantly shit on Sony. Or the same reason people defend Sony tooth and nail. Or the same reason certain posters shit on MS. Or the same reason people defend MS tooth and nail. You are on the internet. Somebody is going to hate everything. There's no deeper meaning. Nothing about being scared that Nintendo will stymie developmental growth on other platforms, or that Nintendo will win and change the industry. Just plain old internet fanboyism. That's it. If it seems like a Nintendo-only problem, it's because that's all you're reading about. Go peep out a Vita thread or the Xbox dashboard update thread. A damn dashboard update thread, shitted up real nice. It just is what it is.
 
I'm starting to wonder if there is some fear going around that if Nintendo does manage to find the sweet spot that they missed with the Wii (not too weak, not prohibitively powerful) that the Wii U will stymie development for the other consoles as the 3DS had done in some respect to the Vita.

I think the thought of more Bayo2 like announcements is enough to cause concern amongst those that don't want to see the market take its direction from Nintendo and leads to the constant negativity and naysaying that we see from the same consistent group of posters in Wii U related threads.

As irrational and illogical as that reasoning may be, I think it's still giving the trolls too much credit lol.

I think it could also just be general hatred for things that doesn't immediately appeal to them, which is equally stupid. It's scary how many times I've basically seen an opinion boil down to "it's not for me, so it must suck". =/

I trust Totillo more than I trust Ideaman.

I'm not even sure what a "balanced" CPU is.

My guess is by saying the system is balanced, the system should be able to hit good utilization with fewer bottlenecks. So with the CPU, it should be able to feed the GPU efficiently while still handling other processing tasks (along with whatever other chips they have to assist the CPU).

Out of curiosity, what did Totillo say again?
 
It's the same song and dance every few weeks, and people fall for it every time. It's the same people doing the same song and dance every few weeks, too.
 
It's the same song and dance every few weeks, and people fall for it every time. It's the same people doing the same song and dance every few weeks, too.

Launch isn't until the 18th, what do you want me to do in the meantime? Don't act like this waltz isn't passing the time!
 
It's the same song and dance every few weeks, and people fall for it every time. It's the same people doing the same song and dance every few weeks, too.

And yet you keep coming back, thinking you're going to see something different. We're not so different, you and I.
 
Launch isn't until the 18th, what do you want me to do in the meantime? Don't act like this waltz isn't passing the time!

You could buy Art Academy like me and find out how bad you suck at painting compared to the person who did the tutorial on the tulip.

And yet you keep coming back, thinking you're going to see something different. We're not so different, you and I.

It feeds my anger so that when the day comes and I murder 30 people I can blame gaf
 
You could buy Art Academy like me and find out how bad you suck at painting compared to the person who did the tutorial on the tulip.

No thanks! I bought a wacom pen thinking I would get into art. Turns out I suck! I had kind of forgotten about it. Thanks :(

So is this thread bumped for a reason today or is it just the usual back and forth?

We're talking about being horrible artists, what do you think?
 