Wii U clock speeds found by marcan

We may really have to appreciate third party devs' hard work preparing ports, instead of pointing fingers at them and calling them lazy :|
 
It's not likely given Anand's numbers, but it's possible the CPU runs at a lower clock speed while not playing games and marcan measured under such conditions.
 
My MacBook Air uses a 1.4GHz dual core CPU and the CPU is usually not the bottleneck when it comes to games. Clock frequencies are quite meaningless.
 
So disregarding clock speed for a second, how do these PPC cores relate to those in the Xbox 360 CPU? Without knowing that, these clock speeds don't tell us much no?
 
ITT: people who don't understand that clock speed isn't as important a factor in modern CPU/GPU architectures

"Architecture" isn't magic either. The die area, indicative of execution resources, is minuscule. We know now it runs very slowly. The efficiencies people claim are somehow going to save the console's CPU performance are simply fantasy, as is the claim the EDRAM is some technical panacea.
 
As for clock speeds, many people are right that clock speed alone is a vague comparison, but let's be real here: no technology right now exists that will give you a good CPU at 1.25 GHz. Add to that the restrictions of the GamePad screen rendering pipeline, and it's worse than the 360's CPU, let alone Cell.
 
So disregarding clock speed for a second, how do these PPC cores relate to those in the Xbox 360 CPU? Without knowing that, these clock speeds don't tell us much no?
Our tech lords mentioned that clock for clock, Broadway is much stronger than Xenon. But by as much as three times?
Back in 2005, that's exactly the kind of CPU we expected in the Wii.



edit: wsippel
Very heavily modified. 750CL is inherently unsuited for SMP.
So, not exactly vanilla Broadways if he is to be believed.
 
GPU speed is about what I expected, but the CPU is quite a bit slower.

ITT: people who don't understand that clock speed isn't as important a factor in modern CPU/GPU architectures

Problem is neither chip is "modern" by today's standards.

And... IIRC, isn't Wii U development GPU-centric?

Can you explain what you mean by this? Are you asking if devs can use the GPU to make up for the weak CPU?

GAF experts LOL

Gotta love them.

So what's your take on this news?
 
If you want to go beyond clock speed... then how well did Broadway compare to a Xenon core?

It's really hard to say. It always seemed like the Wii's problems were graphics-related rather than raw number crunching. I do remember that Dead Rising: Chop Till You Drop saw significantly reduced zombie counts, which obviously could in part be down to the CPU, but I was under the impression that the COD downports had pretty comparable campaigns, just with inferior detail.
 
Seems like this console was made with the Japanese market in mind.
Games playable without a TV, and Japanese people mostly don't care about graphics.
 
So disregarding clock speed for a second, how do these PPC cores relate to those in the Xbox 360 CPU?

I'm not quite sure, but they should share the same set of intrinsics (e.g. AltiVec for SIMD).
If that's the case, a 360 --> Wii U port would be easy, but if the alleged clock speed is real, then they most likely have to rewrite a lot of memory management code and optimize their game loops to get a decent framerate.
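A back-of-envelope sketch of the frame-budget pressure an unoptimized, CPU-bound port would face, using the clocks quoted in the thread and deliberately ignoring IPC, memory, and core-count differences (all figures illustrative, not measurements):

```python
# Frame-time scaling for a naively ported, CPU-bound game loop.
# Assumes work scales with the inverse clock ratio only -- a
# deliberate oversimplification that ignores IPC differences.

XENON_CLOCK_GHZ = 3.2  # Xbox 360 CPU clock
WIIU_CLOCK_GHZ = 1.25  # Wii U CPU clock quoted in the thread

def scaled_frame_time(frame_ms: float) -> float:
    """Estimate per-frame CPU time after a straight port."""
    return frame_ms * (XENON_CLOCK_GHZ / WIIU_CLOCK_GHZ)

budget_360 = 16.7  # example: a 60 fps title's CPU frame time on 360
ported = scaled_frame_time(budget_360)
print(f"Naive ported frame time: {ported:.1f} ms (~{1000 / ported:.0f} fps)")
```

By this crude measure a straight port blows the 60 fps budget by more than 2.5x, which is exactly why the rewriting and optimizing mentioned above would be necessary.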
 
Has there been a single positive hardware leak so far? Honest question, so far everything seems to disappoint even modest expectations.

I still don't understand why the Wii U is being sold at a loss. Everything sounds like it's dirt cheap to produce. Even the GamePad is hardly a marvelous piece of high end hardware...

Nintendo sucks at getting good manufacturing deals?

Something similar was going on with the 3DS, since it doesn't add up how they barely get to profit numbers at the current price.
 
My MacBook Air uses a 1.4GHz dual core CPU and the CPU is usually not the bottleneck when it comes to games. Clock frequencies are quite meaningless.

I don't think so.

My old CPU was an E6420 @ 2.1 GHz.

I had problems with a high number of games, but obviously it depends on the game.

Big-ass open-world games, games with a lot of physics, NPCs and that kind of stuff are very CPU demanding.

Try to play ArmA 2, GTA IV or Red Faction: Guerrilla. That Mac might explode.
 
So disregarding clock speed for a second, how do these PPC cores relate to those in the Xbox 360 CPU? Without knowing that, these clock speeds don't tell us much no?
IPC should be a lot better. Xenon IPC is supposedly really terrible: 2 instructions per cycle on paper, 0.5 or so on average in real-world situations.
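Those figures can be turned into a rough effective-throughput comparison. The Xenon numbers are the estimates quoted above; the Wii U core's IPC of 1.0 is a placeholder assumption purely for illustration:

```python
# Effective instruction throughput = clock * average IPC.
# Xenon figures are the real-world estimates quoted above; the
# Wii U core's IPC is a made-up value for illustration only.

def effective_gips(clock_ghz: float, avg_ipc: float) -> float:
    """Giga-instructions per second for one core."""
    return clock_ghz * avg_ipc

xenon = effective_gips(3.2, 0.5)       # real-world IPC estimate
wiiu_core = effective_gips(1.25, 1.0)  # assumed IPC, illustrative

print(f"Xenon core: {xenon:.2f} GIPS")
print(f"Wii U core: {wiiu_core:.2f} GIPS")
```

Under those assumptions the per-core gap is far smaller than the raw clock ratio suggests, which is the whole point of the IPC argument.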
 
Seems like this console was made with the Japanese market in mind.
Games playable without a TV, and Japanese people mostly don't care about graphics.

That's exactly what Wii U sounds like. Sounds like the system wasn't designed with western audiences and western developers in mind at all.
 
I feel like 1.2 GHz has come up a lot in the speculation about the Wii U CPU clock speed over the last few months, so I'm hardly surprised.

Really, Nintendo has indicated several times that the goal of the Wii U hardware is to bring them up alongside the current competition (i.e. tick that box), and whilst launch games have (sometimes major) problems, it seems like a stretch to think that they won't be able to squeeze enough performance out of the CPU to bring ports up to par with the PS3/360 sooner rather than later. And really, that's all that's been targeted.

And if you thought the Wii U could compete with the next offerings from Sony/MS, you probably weren't thinking realistically. It's never really been in the offing, as much as Nintendo fans have crossed their fingers wishing otherwise.
 
Wait, the Wii U's been hacked already? Hoho, oh boy. I find that a lot more interesting than some (disappointingly low) number. I wonder how though. Via Wii mode? A security hole relating to the HDD?

Judging by the reports on the OS's functionality, it's clear it didn't spend sufficient time in testing and QA. This is why you don't release incomplete software Nintendo, it's bound to have security holes.
 
I'm more and more confused about what this console is supposed to be.

An HD Nintendo console?

On a side note, if this hack was done through the system's Wii Mode, could it be "false" information, since the hardware is running in Wii Mode and the numbers might not be accurate?
 
Can you explain what you mean by this? Are you asking if devs can use the GPU to make up for the weak CPU?

Given I'm a layperson when it comes to technology who mostly understands macro concepts on the surface, I thought what you paraphrased, yes. I read on this forum something about development on the platform utilizes GPGPU (which I assume would point to my previous query about compensation).
 
The thing is, the Wii has been absolutely fucking blown open by hackers. Every single part of it in terms of firmware, security and hardware has been mapped out, and it all started from knowledge of the Gamecube.

So naturally, if the Wii U is based on the Wii in hardware/software (which it is), then the hackers have a great starting point.

I give it a couple of months.
 
As for the hack, it has happened before that some teams "worked on it" before launch, after managing to get their hands on a dev kit (there are various means). I don't know if that's the case here, though.
 
No. Those are pretty much the only positives I can think of for the Wii U HW :|

I wasn't implying that...
We've known since the beginning that the console has eDRAM, and we still don't know how much there is or its characteristics. Same with the GPU: we've known since the beginning that it uses a more modern architecture than the PS3/360, and now the only thing we know beyond that is its clock...
These aren't absolute positives; things can still go badly even with them.
 
I feel like 1.2 GHz has come up a lot in the speculation about the Wii U CPU clock speed over the last few months, so I'm hardly surprised.

Really, Nintendo has indicated several times that the goal of the Wii U hardware is to bring them up alongside the current competition (i.e. tick that box), and whilst launch games have (sometimes major) problems, it seems like a stretch to think that they won't be able to squeeze enough performance out of the CPU to bring ports up to par with the PS3/360 sooner rather than later. And really, that's all that's been targeted.

And if you thought the Wii U could compete with the next offerings from Sony/MS, you probably weren't thinking realistically. It's never really been in the offing, as much as Nintendo fans have crossed their fingers wishing otherwise.

It's okay if Wii U doesn't compete with 720/PS4.

But to create a system where developers have to put in a ton of effort just to make 360/PS3 games look just as good on Wii U is a big mistake.
 
So disregarding clock speed for a second, how do these PPC cores relate to those in the Xbox 360 CPU? Without knowing that, these clock speeds don't tell us much no?

The cores in the 360 are slightly modified versions of the PPE in the PS3. The Wii U cores are slightly modified versions of the Wii's Broadway CPU. I'm gonna go ahead and assume the PPE is more powerful than Broadway. According to marcan, the CPU in the Wii U is based on 1997 technology (the PowerPC 750).
 
Are you fucking kidding me? 1.25 GHz?

I change my statement.

Devs weren't lazy with Wii U ports; they were fucking wizards to achieve what they did on it.

This. I'm sorry, but that's disgraceful; this has killed any interest I had in the Wii U now, to be honest.
 
Explains why they're taking a loss on each system, though.

CPUs with these kinds of clock speeds must be super rare to find in 2012.
 
Did some Googling and put up a comparison. Let me know if I got any of the information wrong.

[comparison image: zKhhB.png]
 
Damn, poor Bayonetta... it ran like shit on the PS3, whose CPU has a clock speed of 3.2 GHz.
Now they want her to run on a CPU at 1.25 GHz... poor girl.
 