Wii U clock speeds are found by marcan

People are more willing to take any piece of negative news about Wii-U than anything positive... yep sounds like GAF

To be fair, there has been a lot more news that could be framed as negative than there has been news that could be framed as positive. This hasn't been some perfect launch and everyone is just a big meanie.
 
I don't understand the purpose of posts like these. Mind you, I get that clock speed doesn't tell the complete picture, but it's not like it's a piece of meaningless jargon either. It's not as if this is the first we are hearing about the CPU being a potential bottleneck.

Yep, personally I think it's quite an unreasonable stretch to assume that Wii U's CPU architecture is so much better that each CPU cycle on average computes 3x that of Xenon. That's needed to break even with a 7 year old console.
 
Yep, personally I think it's quite an unreasonable stretch to assume that Wii U's CPU architecture is so much better that each CPU cycle on average computes 3x that of Xenon. That's needed to break even with a 7 year old console.

If it's a much shorter pipeline then highly branching routines could very well run 3x faster.
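To put rough numbers on that claim, here is a toy model of how pipeline depth affects branchy code. All figures are assumptions for illustration (the real pipeline depths and misprediction penalties for Espresso and Xenon aren't public); it shows the direction of the effect, though with these numbers it doesn't reach 3x on its own.

```python
# Illustrative model: effective cycles-per-instruction (CPI) when branch
# mispredictions flush the pipeline. All parameter values are made up.

def effective_cpi(base_cpi, branch_freq, mispredict_rate, flush_penalty):
    """CPI inflated by pipeline flushes on mispredicted branches."""
    return base_cpi + branch_freq * mispredict_rate * flush_penalty

# Hypothetical short-pipeline CPU (small flush penalty, e.g. ~4 stages)
short_pipe = effective_cpi(base_cpi=1.0, branch_freq=0.2,
                           mispredict_rate=0.1, flush_penalty=4)

# Hypothetical long-pipeline CPU (large flush penalty, e.g. ~20 stages)
long_pipe = effective_cpi(base_cpi=1.0, branch_freq=0.2,
                          mispredict_rate=0.1, flush_penalty=20)

print(round(short_pipe, 2))  # 1.08 cycles per instruction
print(round(long_pipe, 2))   # 1.4 cycles per instruction
```

The gap widens as code gets branchier and harder to predict, which is the scenario the post is describing.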
 
Unless the Wii U cost 100 dollars to make, there is no way that controller is 45% of the cost. It is a low-res resistive screen, a radio receiver, and a few sensors. That does not cost a ton of money to make. Cheap tablets with a full SoC and RAM can be bought for under 100 dollars, so those probably cost 50 dollars to make. If those can be made for 50, the Wii U tablet can be made for close to that.

Yeah I think people are pretty crazy with their overestimation of the cost of the tablet.
 
To be fair, there has been a lot more news that could be framed as negative than there has been news that could be framed as positive. This hasn't been some perfect launch and everyone is just a big meanie.

Meanie? I thought everybody was being fair and objective...
 
The main issue with this line of reasoning is that the Wii U originally appeared to be an olive branch to their core gamer demographic. If it can't land ports of 720/PS4 games, then their success on that front depends heavily on third parties making solid exclusives - basically identical to the Wii's situation. I don't think the Wii U will have nearly the casual appeal that the Wii did, even if that market isn't tapped out by now. So the question is, what is it that they'll do well?
I think I'll just post what I said earlier today and add a bit to it.
While you're not wrong, Nintendo specifically stated that their goal for this console was to win back the crowd that they lost with the Wii.

Now while it was silly to ever believe that this was true, especially after the name became official, it still would have served Nintendo better if they had just said that they were repeating their strategy with the Wii.

I'm not mad at all; I've had my WiiU for about a week and I'm really happy with it. But as you can see from this thread, the choices that Nintendo made with this console have some generally level-headed posters, including one of the founding fathers of the board, pretty irate. This isn't the best way to start off your supposed campaign to win back this market. I'm not saying that Nintendo can't turn this around, but they're off to a horrible start.
I'm not sure what happened in between the time Nintendo made those statements about their initial strategy for the WiiU and now. Could the 3DS launch debacle have spooked them? Did they see a trend happening in the market that made them change their minds? Whatever it was, I knew the WiiU couldn't be the console to win back the gamers Nintendo spurned with the Wii when they named the system WiiU. It would be like dating your wife's best friend before you met her, saying that she was just a fling, and then naming your first daughter after the friend.

It just wasn't the right move for Nintendo to make if they seriously wanted to make new inroads with the hardcore crowd; there's probably not another brand as tainted with that set as the Wii.

Edit: BTW, I'm almost afraid to ask but how did this

quest
Banned from OT


happen?
 
It just wasn't the right move for Nintendo to make if they seriously wanted to make new inroads with the hardcore crowd; there's probably not another brand as tainted with that set as the Wii.

I thought the gestures of a "hardcore" turn were frankly quite misguided from the outset. That would be terrible for Nintendo; it's not their strength to compete on those terms.

I partly believe that we put too much faith in E3. Iwata knows that's a show watched by traditional gaming enthusiasts, so he may take up those talking points. But it feels increasingly like one little niche of PR for them, and not the whole audience they truly want to address. At least I hope not, because that audience feels increasingly irrelevant to me, no offense to those types on this site. :p
 
In this case, in all honesty, they actually do mean something. The clockspeed difference is something like 250%. The Espresso would have to literally be 250% more efficient at everything else to be equivalent to the Xenon. It isn't, but it is equivalent enough where it matters.

Spot on.

The Wii U's CPU doesn't need to be as powerful or diverse in feature set as the Xbox 360's and PS3's.

SIMD processing will be handled primarily by the GPU
Audio processing handled by the DSP
I/O processing handled by the I/O controller
I believe it also has a dedicated memory controller (not sure if true?)
Wifi, Bluetooth, and data streaming also have their own dedicated processors
The operating system apparently runs off its own dedicated ARM processor (not sure if true?)
Then there's probably some more I've missed

So compared to the Xenon and Cell processors, the Wii U's CPU won't be tasked with processing audio, I/O, running the background OS, SIMD, etc. So the Wii U's CPU should have a reduced workload vs. Xenon and Cell.

Then there are the advantages of the IBM 750 architecture the CPU is based on. Versus Xenon and Cell, the 750 architecture provides better performance per clock, is out of order, has significantly more cache, and with the MCM manufacturing process it should be able to communicate at low latencies with the GPU. There may be, and likely are, other performance advantages, as the CPU would no doubt be customized and tweaked with specific instruction sets and features optimised for 3D games.

So all in all, the Wii U CPU doesn't need to be that good. From memory, the Cell processor reserves an entire core for running the PS3's operating system. And the Xbox 360 has been known to have an entire thread maxed out with audio processing, and certainly I/O + audio can easily take up an entire thread.
 
Yep, personally I think it's quite an unreasonable stretch to assume that Wii U's CPU architecture is so much better that each CPU cycle on average computes 3x that of Xenon. That's needed to break even with a 7 year old console.
With out-of-order execution, I don't see why the Espresso couldn't exceed the Xenon.
 
It's also worth pointing out that Marcan, the guy who revealed these clock speeds, has stated the Wii U's CPU cannot be compared to the Xenon and Cell processors. He's also said the Wii U's CPU should be able to process more instructions per cycle than both of them. Reading his tweets, the negative he pointed out about the Wii U's CPU was its inferior SIMD capabilities vs. Xenon and Cell, something we all knew.

No, he didn't say that. He said that at the same clock speeds it should outperform them, but at the current clock speeds he didn't know. He said it would be much better at IPC but have inferior SIMD capabilities.

Basically, I took it as: the processor may be slightly inferior or equal to the processors in the PS3/360. Yet they're 7 years old. No one expected it to be close to equal.
 
Yeah, I misread his comments on that. Clock for clock, more efficient. At 1.2 GHz, he's not sure specifically how it'd rate.
 
Yeah, that's not true. It was more or less the same CPU the whole time. The only real CPU change was the clock speed being raised 25% between, I believe, Cat-Dev V3 and V4.

Heh, Nintendo seems to be a bit reluctant to raise the CPU clockspeed. Thanks for the info, Arkam.

Though that raises the question of why it was getting reported that the Wii U CPU was "like Xenon, but clocked faster."
 
It seems a big question this situation raises relates to Nintendo's avowed desire to serve the "hardcore" crowd better than they did with the Wii.

However, it seems that serving the hardcore crowd is being interpreted, in the context of Wii U hardware discussion, in a very narrow sense: that the only way to make the "hardcore" happy is to present them with clear technological superiority as they currently interpret it, going into the 8th generation.

At this point, I have a suspicion that - rightly or wrongly - Nintendo's perspective was less about absolute technological superiority or parity with another game platform, and more about online services, control interface, and what kinds of games the console was suitable to host.

In that regard, Wii U generally succeeds (with room for improvement in several areas) at being far more "hardcore" than the Wii: it's capable of online gaming and online games that have come to be associated with core and hardcore gaming, like Black Ops II. It has that hardcore controller available. It's HD. Its online system is still incomplete, but a major step in the right direction.

In short, the package is a genuine attempt to meet the so-called hardcore halfway - but by nature, the enthusiast gamer may still look at halfway as "shouldn't have even bothered trying".

Which unfortunately comes back to the position Nintendo finds themselves in, pulled in two directions. They don't wish to compete in the bloody arena of deficit hardware to win over the hardcore with sheer specs, and seem to have a ceiling on what they want to spend. Within their comfort zone they might've sacrificed any features that didn't serve raw power. It seems that wasn't acceptable either, and they felt they needed a unique concept, not just power. That concept (the U pad) is actually very well executed with excellent technology behind it, but it's not free.

At most, I think one might argue that if Nintendo had abandoned their desire for keeping heat, size, and power down (which may be related not just to the Japanese market, but reliability and lifespan) they could have pushed a little more raw power. But how much without getting rid of stuff like the gamepad, I dunno. They might have judged that in the long run, it wouldn't be enough to make a huge difference in comparisons to the other companies' next consoles.
 
Spot on.

The Wii U's CPU doesn't need to be as powerful or diverse in feature set as the Xbox 360's and PS3's.

SIMD processing will be handled primarily by the GPU
Audio processing handled by the DSP
I/O processing handled by the I/O controller
I believe it also has a dedicated memory controller (not sure if true?)
Wifi, Bluetooth, and data streaming also have their own dedicated processors
The operating system apparently runs off its own dedicated ARM processor (not sure if true?)
Then there's probably some more I've missed

So compared to the Xenon and Cell processors, the Wii U's CPU won't be tasked with processing audio, I/O, running the background OS, SIMD, etc. So the Wii U's CPU should have a reduced workload vs. Xenon and Cell.

Then there are the advantages of the IBM 750 architecture the CPU is based on. Versus Xenon and Cell, the 750 architecture provides better performance per clock, is out of order, has significantly more cache, and with the MCM manufacturing process it should be able to communicate at low latencies with the GPU. There may be, and likely are, other performance advantages, as the CPU would no doubt be customized and tweaked with specific instruction sets and features optimised for 3D games.

So all in all, the Wii U CPU doesn't need to be that good. From memory, the Cell processor reserves an entire core for running the PS3's operating system. And the Xbox 360 has been known to have an entire thread maxed out with audio processing, and certainly I/O + audio can easily take up an entire thread.
You're maybe describing what would be about 10-15% maximum of total CPU time, minus SIMD. That's not that big of a difference. And where would the ARM be, since the one located within the GPU is used for security measures? Was there another found on board?

You can go down that list as many times as you want, but no one here should agree that it makes up for the sheer difference in clockspeed at most angles.
 
You're maybe describing what would be about 10-15% maximum of total CPU time, minus SIMD. That's not that big of a difference. And where would the ARM be, since the one located within the GPU is used for security measures? Was there another found on board?

You can go down that list as many times as you want, but no one here should agree that it makes up for the sheer difference in clockspeed at most angles.

On the 360, audio can take up to 2 full threads (1 complete CPU core).
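As a rough back-of-the-envelope on that figure (thread counts from the posts above; whether audio really saturates a full core is exactly the disputed point):

```python
# Xenon: 3 cores x 2 hardware threads = 6 thread slots.
threads_total = 3 * 2
threads_for_audio = 2  # "up to 2 full threads" per the post above

# Fraction of thread slots freed if a dedicated DSP takes over audio.
freed_fraction = threads_for_audio / threads_total
print(f"{freed_fraction:.0%}")  # 33%
```

That lands well above the 10-15% estimate, which is why the two posts disagree.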
 
In this case, in all honesty, they actually do mean something. The clockspeed difference is something like 250%. The Espresso would have to literally be 250% more efficient at everything else to be equivalent to the Xenon. It isn't, but it is equivalent enough where it matters.

Nitpick:

The difference would be ~157%: (3.2 - 1.243)/1.243 ≈ 157%.

If the Wii U and 360 CPUs were of equal per-clock performance, the Espresso would have to be 157% more efficient to match. 100% more efficient (which means double) would put it level with a ~2.5 GHz 360 CPU; 250% more, ~4.4 GHz, etc.

I have no idea how these CPUs compare to each other, I just wanted to point out a common mistake when talking about percentages.
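The arithmetic in that nitpick checks out, using the clock figures quoted in this thread (3.2 GHz Xenon vs. a reported 1.243 GHz Espresso):

```python
xenon = 3.2       # GHz, Xbox 360 Xenon clock
espresso = 1.243  # GHz, Wii U Espresso clock as reported in the thread

# "X% more efficient" means (1 + X/100) times the per-clock work.
extra_efficiency = (xenon - espresso) / espresso * 100
print(round(extra_efficiency))  # 157

# Sanity checks from the post: 100% more efficient == double the per-clock
# work, i.e. equivalent to a 360 CPU at ~2.5 GHz; 250% more is ~4.35 GHz.
print(round(espresso * 2, 2))    # 2.49
print(round(espresso * 3.5, 2))  # 4.35
```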
 
In this case, in all honesty, they actually do mean something. The clockspeed difference is something like 250%. The Espresso would have to literally be 250% more efficient at everything else to be equivalent to the Xenon. It isn't, but it is equivalent enough where it matters.

It can be more than 250% efficient at some things, and less at others. We really don't have enough knowledge to make any proper judgements in this case. Clock speed really doesn't mean much.
 
You're maybe describing what would be about 10-15% maximum of total CPU time, minus SIMD. That's not that big of a difference. And where would the ARM be, since the one located within the GPU is used for security measures? Was there another found on board?

You can go down that list as many times as you want, but no one here should agree that it makes up for the sheer difference in clockspeed at most angles.
I found a good link that addresses that...

http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power/
More specifically, we’ve heard rumors about the CPU, that it’s supposedly the weakest link of the system. Word has spread that it’s some sort of Broadway (Wii CPU) but in a three-core configuration and improved. Others have argued that based on its reduced size seen in recent pictures and the overall low consumption of the unit, it is not very powerful. Have you encountered any problems during your development because of this component or is it efficient enough?

We didn’t have such problems. The CPU and GPU are a good match. As said before, today’s hardware has bottlenecks with memory throughput when you don’t care about your coding style and data layout. This is true for any hardware and can’t be only cured by throwing more megahertz and cores on it. Fortunately Nintendo made very wise choices for cache layout, ram latency and ram size to work against these pitfalls. Also Nintendo took care that other components like the Wii U GamePad screen streaming, or the built-in camera don’t put a burden on the CPU or GPU.

Concerning the graphical aspect, are the features supported satisfactory for you? Is the system allowing a good amount of effects not present or not really used on current gen consoles, such as tessellation? Iwata has promoted the GPGPU side of the chip, have you taken advantage of this?

For Nano Assault Neo we already used a few tricks that are not possible on the current console cycle.

Due to the modern GPU architecture you have plenty of effects you can use to make Wii U games look better than anything you have seen on consoles before.
Yep, it's going to boil down to the power of the GPU.
 
So how likely is it we'll see a potential hack and subsequent homebrew? I can only imagine how amazing a system like Wii U could be with all kinds of homebrew apps!

Anyway, specs are nothing surprising and pretty much what I expected. Still think it was a horrible idea to gimp the CPU in favor of a smaller/cooler box...
 
Because they only have experience with Nintendo consoles and no experience with AAA game development and cutting edge graphic engines and their requirements. That's why.
 
Here is an unbiased view on the Wii U CPU news:

http://hothardware.com/News/Hackers-Discover-Wii-Us-Clock-Speed-Processor-Design/

"What all this means is that writing good games for the Wii U may well require developers to adopt new practices. It's absolutely fair to compare current Wii U games against the peak of the Xbox 360 / PS3 versions, but we suspect the platform has untapped potential at this early stage -- just as every other console always has."
 
Here is an unbiased view on the Wii U CPU news:

http://hothardware.com/News/Hackers-Discover-Wii-Us-Clock-Speed-Processor-Design/

"What all this means is that writing good games for the Wii U may well require developers to adopt new practices. It's absolutely fair to compare current Wii U games against the peak of the Xbox 360 / PS3 versions, but we suspect the platform has untapped potential at this early stage -- just as every other console always has."

Good article, but I think most of the sane people don't actually believe the Wii U is weaker than the PS3/360 and won't eventually be able to have games that match/exceed their best. The issue has always been comparisons to when the other two come out and ports.
 
Gemüsepizza;44910057 said:
Because they only have experience with Nintendo consoles and no experience with AAA game development and cutting edge graphic engines and their requirements. That's why.

Randy Pitchford. That's the man you're looking for then.
 
Basically.

I'm disappointed in myself that I didn't come up with that example earlier. It's a real killer in getting that point through people's heads.

That's because there isn't really a "point" here.

As you should probably already know by now, the reason the PS3 has a hard time with ports of any kind is its extremely unconventional architecture, even more so than the PS2's, albeit in a different way. So you have two bizarre architectures and you want to port from one to the other.

There is no such evidence that the Wii U's architecture is nearly as bizarre as the PS3's. You're presenting a pretty false equivalence here.
 
There is no such evidence that the Wii U's architecture is nearly as bizarre as the PS3's. You're presenting a pretty false equivalence here.

It doesn't have to be bizarre, only has to be substantially different from what is currently being made.

That said, the simple fact that we have such a (relatively) strong GPU countering a lower-clocked CPU, all with super low latency... I'd say it's pretty safe to call that something very different from what the HD twins are, even if the CPU or GPU tech by itself isn't "difficult" to program for.

You don't program in a bubble.
 
So how likely is it we'll see a potential hack and subsequent homebrew? I can only imagine how amazing a system like Wii U could be with all kinds of homebrew apps!

Anyway, specs are nothing surprising and pretty much what I expected. Still think it was a horrible idea to gimp the CPU in favor of a smaller/cooler box...

They could get away with a slow (efficient, I guess) CPU because they have specialized processors for a lot of functions. So the CPU doesn't have to do as much as the 360's does. Should make everything more stable and reliable, beyond keeping the machine cooler.

It doesn't have to be bizarre, only has to be substantially different from what is currently being made.

That said, the simple fact that we have such a (relatively) strong GPU countering a lower-clocked CPU, all with super low latency... I'd say it's pretty safe to call that something very different from what the HD twins are, even if the CPU or GPU tech by itself isn't "difficult" to program for.

You don't program in a bubble.

I know nothing about software development, but that kind of thing would depend more on Nintendo's dev tools, right? Nobody's coding in assembly language.
 
That's because there isn't really a "point" here.

As you should probably already know by now, the reason the PS3 has a hard time with ports of any kind is its extremely unconventional architecture, even more so than the PS2's, albeit in a different way. So you have two bizarre architectures and you want to port from one to the other.

There is no such evidence that the Wii U's architecture is nearly as bizarre as the PS3's. You're presenting a pretty false equivalence here.

Well, the difference between 360-to-PS3 ports and Wii U ports is that, if we're talking specifically engine- and physics-wise, the former don't have to account for a vastly lower clockspeed. And that's across the board, from general code and all the rest of it. Games didn't need to be written that much differently when the engine could rely on ~3 GHz clocks (PS3 Cell quirks aside), with the PS3's main core being roughly equivalent to one of the 360's three. So it'll be a total reconsideration of how to code an engine and optimize out the edges on Wii U, and granted, pure clocks wouldn't solve everything, but it makes a big difference in how ports are handled or considered.

If the Wii U even gets ports they'll be poor or baseline, from what I can see there's too much money and energy involved in doing a great port to justify the return, imo.
 
Choosing between Borderlands 2 and Aliens: Colonial Marines was probably something Gearbox was looking at more from a game design perspective, in terms of which IP would be more ideal for the Wii U's GamePad. Regarding Borderlands 2, publisher 2K Games is being picky in determining if its big IPs are a good fit on Wii U. The publisher is taking a wait-and-see approach first. 2K Games is going to observe how well Sega's Aliens: Colonial Marines does on Wii U. By the way, the Wii U version of Aliens is being outsourced to developer Demiurge Studios. It will be handling most of the grunt work, but Gearbox is partially involved. It's likely the same way Vigil Games was working on Darksiders 2 Wii U before sending it over to THQ Montreal to do most of the work.

July 2012 interview with Gearbox co-founder and Chief Creative Officer Brian Martel:

So we're really excited to bring Aliens: Colonial Marines to it [Wii U]. And we're going to do a wait and see approach, our publisher 2K is going to see whether they're going to go with that [bring Borderlands 2 to Wii U].

http://www.dealspwn.com/gearbox-cco-borderlands-2-inventory-artwork-fantastic-wii-107801
 
Define that 'weak as fuck' if you can.

Normally people start up the hyperbole once the more powerful hardware is in hand. But now they're starting before we even have a Durango or PS4. Which means the 360 and PS3 that everyone thinks are powerful are now weak as fuck. Major cognitive dissonance going on.

Well...

When people are talking about the cross gen games coming out next year I don't think they're referring to Wii U ports.

Basically that. You think UE4 and Agni's Philosophy were just made for decoration? 1313 and Watch_Dogs are going to be PC only? There will be a clear graphical jump between 360/PS3 and their successors. To believe otherwise is crazy.

Wii U will be in the same position when it comes to power that the Wii was come PS4/720 launch. Such a shame.

It's still a next gen console, though. Just a weak as fuck next gen console. It's weaker than 360 in some ways for goodness sake. That's just unacceptable in 2012.

You need to stay the fuck out of any technical discussion then because that statement just makes you look like an ignoramus.

The N64 had 3x the clockspeed of the PlayStation, yet I never remember people calling the PlayStation last gen upon the arrival of the N64.

Comparing spec numbers between different architectures is completely pointless.

N64 shit on the PS1.

But PS1 at least shit on SNES and Genesis.

Can't say the same regarding Wii U vs. PS3/360.
 
N64 shit on the PS1.

But PS1 at least shit on SNES and Genesis.

Can't say the same regarding Wii U vs. PS3/360.

When the hardware is used well in certain ways (certain engine and art demands) by Retro, the EADs, Monolith and Platinum... I have no doubt Wii U will shit on 360/PS3. It'll just be a light drizzle instead of a full-blown diarrhea surge! :P
 
Slow OS is purely a software issue; the CPU/RAM don't help, but it's 99% software at fault. All Nintendo would need to do is get some competent OS programmers and patch that shit, and it would run smooth as butter.
 
Good article, but I think most of the sane people don't actually believe the Wii U is weaker than the PS3/360 and won't eventually be able to have games that match/exceed their best. The issue has always been comparisons to when the other two come out and ports.

I have my doubts on this point actually.

It's been talked about before, but Nintendo really was put into a bad position with the Wii coming into this gen. They currently have no one, or at least very few people, familiar with modern graphics techniques working for the company, which will lead to a massive learning curve that will take years to overcome. Add to that how expensive it is to make a game that looks like Gears or Uncharted; do you really expect Nintendo to put that sort of investment into a single project?
 
I have my doubts on this point actually.

It's been talked about before, but Nintendo really was put into a bad position with the Wii coming into this gen. They currently have no one, or at least very few people, familiar with modern graphics techniques working for the company, which will lead to a massive learning curve that will take years to overcome. Add to that how expensive it is to make a game that looks like Gears or Uncharted; do you really expect Nintendo to put that sort of investment into a single project?

I would think Retro has some of that expertise, but overall I agree with you. Aside from possibly Retro, a couple of EAD teams (mostly Tokyo) and Monolith, they don't have a lot of stock when it comes to highly adept graphical/technical teams. And again, as you say, it's also a money-on-the-table issue; in my mind that is the biggest hurdle they'll cross on this subject.

Aside from a couple third parties willing to invest and develop exclusively.
 
I know nothing about software development, but that kind of thing would depend more on Nintendo's dev tools, right? Nobody's coding in assembly language.

Again, you can't program in a bubble. Even if you program in a high level language you still have to understand what a system IS and ISN'T good at. Dev tools are helpful in that they can automate many processes and choose the "correct" means but they aren't sentient and they can't design a game for the developers. They still have to know the hardware well enough to know what works and what doesn't.

A good example of this was made earlier about computer AI, specifically pathfinding AI. Let's say a developer wants to create a giant crowd of NPCs. The developer wants the crowd to move around and look lively. There are several ways to do this. You can set a predetermined "path" for each AI to follow. This wouldn't be very processor intensive, but it would be man-hour intensive, as you'd have to design a path for every unit, and it makes it so the player can't interact with the crowd without breaking it. Not a good choice.

The second would be to create a pathfinding routine that can be applied to everyone in the crowd. This is very CPU intensive (every member of the crowd plotting a path, each path overlaying, each member having to be able to navigate every other AI doing the same thing, etc).

The third would be a simple "walk around aimlessly and bounce off your neighbors". This would work nearly as well as the second option and would be much easier to offload onto the relatively powerful GPU, freeing the CPU for other things.

These are DESIGN decisions. These are things that the dev tools can't decide for you, but can have a HUGE impact on performance. If you know your system has a really great CPU, the second option may be better simply to leave the GPU free for rendering other things (and in some instances it'll look better to boot), in others, offloading as much to the GPU as possible would be the better option.

High level or low level software design, you have to have at least some understanding of the hardware and how to make design decisions to get the best out of it. If you're multi-platting a game, you may not have a real choice. You may be forced simply due to time constraints (or to keep a product "similar" across platforms) to go with a method that's not best for a particular hardware... Nature of the business.
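The third option in that crowd example can be sketched in a few lines. This is a toy version with made-up class and parameter names, just to show how little per-agent work "wander and bounce" needs compared to full pathfinding (a real engine would also replace the naive O(n^2) neighbor check with a spatial grid):

```python
import random

class Agent:
    """One crowd member: a position and a unit-step direction."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.dx = random.choice([-1, 1])
        self.dy = random.choice([-1, 1])

def step(agents, radius=1.0):
    # Move every agent one step in its current direction: O(1) per agent.
    for a in agents:
        a.x += a.dx
        a.y += a.dy
    # "Bounce off your neighbors": reverse direction on near-collision.
    for i, a in enumerate(agents):
        for b in agents[i + 1:]:
            if abs(a.x - b.x) < radius and abs(a.y - b.y) < radius:
                a.dx, a.dy = -a.dx, -a.dy
                b.dx, b.dy = -b.dx, -b.dy

crowd = [Agent(random.uniform(0, 100), random.uniform(0, 100))
         for _ in range(50)]
step(crowd)
```

No per-agent path planning ever happens, which is what makes this style of update cheap enough to consider moving off the CPU entirely.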
 
When the hardware is used well in certain ways (certain engine and art demands) by Retro, the EADs, Monolith and Platinum... I have no doubt Wii U will shit on 360/PS3. It'll just be a light drizzle instead of a full-blown diarrhea surge! :P

I certainly hope so!
 