Are we getting anywhere with any of this?
We already solved it within 5 minutes of the post. It's bullshit.
CPU: IBM PowerPC 7xx-based tri-core processor "Espresso" reportedly clocked at 1.24 GHz (note: however, it has been rumored that the CPU has been overclocked to 3.24 GHz after the 3.0.0 update).
If only it was only bullshit.
I think the fans now revving higher are just for improved cooling = improved stability, that's all. The Wii U might have had its OS footprint reduced, so more RAM for games = more performance. I don't believe the 2 GHz overclock at all.
Why is that a requirement for understanding how a CPU architecture works?
There's an Arthur Gies joke in there somewhere.
8 gigs of DDR5 was not just some pipe dream.
It was always feasible, even during my cries of "2 gigs in the PS4, most likely." It was just expensive to do, and could have ended up with an overly complicated board. That mattered much less than I would have thought: neither price nor the potential complexity stopped Sony from going the extra mile.
You made the Wii U CPU? You know what it was designed for? It obviously was designed for at least 1.24 GHz; it isn't just three Broadways duct-taped together.
GDDR5
The fact that in Wii mode it just downclocks one of the cores to the Wii's clock speed, runs the code natively, and disables the other cores does mean that it is the same as Broadway.
The higher clock just comes from the newer process node.
for your viewing pleasure, some "comments" from HamSandMan77, the user who wrote this in the first place:
http://tvtropes.org/pmwiki/article_history.php?article=Main.WiiU
"Seeing that people are having mixed views on said clock speeds (even for me when I first seen said clock speeds), I'm placing them as rumors. "
"Homedrewers have said that that is what the CPU & GPU's clock crystal is clocked at and that with said update it up clocked to that spead, also, the Wii U uses a Power6 base chip & the updated speads are placed back into rumor mode. "
If he really "seen said clock speeds", how come he doesn't just put up proof?
Wait wait wait wait, clock crystal? Is that a thing? Like, in a watch?
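For what it's worth, clock crystals are a real thing: a fixed quartz reference oscillator feeds a PLL, which multiplies it up to the core clock. Here's a minimal sketch of the idea; the reference frequency and multiplier are purely hypothetical examples, not confirmed Wii U values.

```python
# A minimal sketch of how a "clock crystal" relates to a core clock.
# The reference frequency and multiplier below are hypothetical examples,
# NOT confirmed Wii U values.

def core_clock_mhz(crystal_mhz: float, pll_multiplier: float) -> float:
    """A core clock is typically a PLL multiple of a fixed quartz-crystal
    reference oscillator: the same principle as a watch crystal, just a
    higher frequency with a multiplier on top."""
    return crystal_mhz * pll_multiplier

# Hypothetical example: a 27 MHz reference crystal multiplied up by the PLL.
print(core_clock_mhz(27.0, 46.0))  # 1242.0 MHz, i.e. a ~1.24 GHz-class clock
# A firmware update could at most reprogram the multiplier; the crystal is
# fixed, and voltage/thermal limits cap how far the multiplier can go.
```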
Exactly, who are these homedrewers he talks about?
Indeed.
If we're going to start basing generations on power, then yes, my newest gaming PC is a generation ahead of the upcoming consoles. It's not my criterion, but it is one for the many who want to deride Nintendo's weak hardware decisions and relegate Nintendo to a previous generation just because its products launch with hardware that doesn't greatly eclipse Microsoft's and Sony's.
That said, the sarcasm in this thread is needless. Even if, in some hypothetical update, Nintendo frees up more memory or does a minor firmware-based overclock of some part (again, not unheard of; Sony and Nintendo have done this before), that doesn't change the fact that the Wii U is orders of magnitude weaker than the upcoming Microsoft and Sony consoles. Nothing can or will change that, because physics never, ever, lies.
The "circle jerk," or whatever GAF calls it nowadays, on either side of the spectrum isn't necessary.
That the GPU in this console is a marked improvement over the previous generation's GPUs isn't a stretch, and that the CPU isn't being overclocked to 3+ GHz is extremely, incredibly obvious to anyone who knows the type of CPU that's in the console. The end.
So the CPU is a 1.24GHz tri-core?
Isn't that even weaker than the 360's three cores?
How's the GPU?
CPU is based on PowerPC 7xx which is an old design (same as Gamecube/Wii). Iwata admits the console can't run everything perfectly and some developers complained about it. One developer has said they're ok with it.
GPU is better than the PS3/360's but probably not by much. There's still some secrecy around the chip, but don't expect dramatic improvements over what the Xbox 360 can do (especially compared to the PS4/720).
This is the reason why companies don't like to share specs of custom silicon. Look at the Apple A5 (even better, the A5X) chip and then compare it to the Tegra 3. The Tegra 3 has three or so times more "cores" and faster clock speeds all around. On paper the Tegra 3 is a beast. But then the "weaker" custom Apple chips beat it out in most categories. I feel that the Wii U can have the same potential, being an MCM die and all. Once people figure out the architecture and the GPGPU I feel like it will hold its own just fine.
Their loss. If interpretation is such a problem, they can give a disclaimer about how each spec is supposed to be read.
Compared to the PS3/360? Sure, but that's not saying much (2012 vs. 2005 tech).
Broadway downclocked itself to the GameCube clock speed, but it did have some other improvements over Gekko.
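To put numbers on the downclocking point, here are the commonly cited clocks for this CPU lineage; these are the widely reported public figures, so treat them as approximate rather than official.

```python
# Commonly cited clock speeds for the PowerPC 7xx lineage discussed above
# (public figures from hardware reporting; treat as approximate).
gekko_mhz    = 486.0     # GameCube (Gekko)
broadway_mhz = 729.0     # Wii (Broadway)
espresso_mhz = 1243.125  # Wii U (Espresso, per core)

print(broadway_mhz / gekko_mhz)     # 1.5   -> Broadway ran GameCube code at a clean 1.5x
print(espresso_mhz / broadway_mhz)  # ~1.71 -> Espresso's bump over Broadway
# Wii mode dropping one core back to 729 MHz mirrors how Broadway handled
# GameCube mode, which supports the "refined Broadway" reading above.
```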
No, the A5 is dual-core; the Tegra 3 is quad-core.
They are the same cores! It's just that the Tegra 3 has 2x the number and a higher clock speed.
The Tegra 3 is faster CPU-wise no matter what.
Tegra 3: quad-core CPU @ 1.6 GHz, 12-core GPU = 16 total cores
A5: dual-core CPU @ 1 GHz, dual-core GPU = 4 total cores
FOUR TIMES as many cores.
I'll give you credit that its CPU is a BIT faster, though not by much at all (according to AnandTech), and the A5 DESTROYS the Tegra, which has six times the GPU cores. The A5 chip is INCREDIBLY efficient: better-designed silicon with "less powerful" specs. @JordanN
THIS is why specs don't matter; real-world performance is what really matters.
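For the record, here's the same tally in a few lines, using only the numbers from the posts above; the point being that "total cores" across different architectures is a near-meaningless unit.

```python
# The tally from the posts above, using only their numbers. "GPU cores"
# are marketing units and aren't comparable across vendors (unified vs.
# non-unified shader pipelines), which is exactly the point.
tegra3_total = 4 + 12  # 4 CPU cores + 12 GPU "cores" = 16
a5_total     = 2 + 2   # 2 CPU cores + 2 GPU cores    = 4

print(tegra3_total / a5_total)  # 4.0 -> "FOUR TIMES as many cores"
# ...and yet the A5 wins most GPU benchmarks: raw core counts across
# different architectures predict very little about real performance.
```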
That your Wii U will overheat and turn off after about 20 seconds.
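The quip has first-order physics behind it. CMOS dynamic power scales roughly with C x V^2 x f, so a rough sanity check on the rumored 1.24 GHz to 3.24 GHz overclock looks like this; the voltage bump is an assumed, conservative figure, not a real spec.

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f.
# A rough sanity check on the rumored 1.24 GHz -> 3.24 GHz overclock.
# The 20% voltage bump is an assumed, conservative figure, not a real spec.

base_f, oc_f = 1.24, 3.24  # clock in GHz
base_v, oc_v = 1.00, 1.20  # normalized core voltage (assumption)

power_ratio = (oc_f / base_f) * (oc_v / base_v) ** 2
print(round(power_ratio, 2))  # ~3.76x the dynamic power

# Nearly 4x the heat, in a chassis and cooler sized for the original clock,
# is why a firmware-update overclock of that size doesn't survive physics.
```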
So what's stopping a Wii U dev from actually giving some info on the GPU/CPU's capabilities?
Now I finally got to play the NBA 2K13 demo. I don't know how it performed before the update, but it's smooth as hell. No frame-rate dips whatsoever. Seems like Nintendo really improved the performance. Might be software-related, though.
So you're saying that the console's form factor is designed around a 550 MHz GPU? That would suggest the chip is at its limit; the core is at its safe spot.
Actually, I'm looking for performance benefits.
You can't just shove CPU cores and GPU cores into the same boat!
The CPU cores in the Tegra 3 are the same stock cores from ARM! It just has more of them, at a higher clock speed!
The GPU specs, on the other hand: the A5's are better than the Tegra 3's, and the "cores" are not comparable; one has unified shader pipelines, the other does not!
Ok... not sure how that makes sense, because even if we are just talking about CPU cores, the Tegra is still a "newer" quad-core at 1.6 GHz and the A5 a dual-core at 1 GHz, yet the Tegra is barely better in most benchmarks. The point I'm trying to make is that you don't have to look at the core specs to "determine" real-world performance. I'm just giving an example to show that maybe, MAYBE, the Wii U is not as bad off as we think, and that it might be really efficient at what it does because of its custom silicon. Yeah, devs will have to figure out how to do better with the chipset, but I just don't think people should rule it out as being so "weak" yet.
Don't know why he's debating you so adamantly: I'm pretty sure what you just said here has been consensus for, like, 3000 posts.
So... How long before this is on "news" sites?
Your computer is 10x the PS4? 20TFLOPS?
Impressive bruh.
1.843 TFLOPS x 10 = 18.43 TFLOPS
Four HD 7970s clocked at 1.2 GHz = 19.661 TFLOPS, so it isn't as impossible as you're making it out to be. The reality is that CrossFire doesn't hold 100% efficiency; however, it is possible for a PC setup to be a generation ahead, in a similar manner to the PS4 being a generation ahead of the Wii U, which is his point.
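For anyone checking the math: theoretical single-precision throughput is just shaders x 2 ops per cycle (multiply-add) x clock. A quick sketch using the clocks from the post and the public shader counts (1152 for the PS4's GPU, 2048 for an HD 7970):

```python
# Theoretical single-precision throughput: shaders * 2 ops/cycle (FMA) * clock.
# Shader counts are the public specs (1152 for the PS4 GPU, 2048 for an
# HD 7970); the 1.2 GHz figure is the overclock from the post above.

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

ps4_tflops = tflops(1152, 0.8)  # ~1.843 TFLOPS (1152 shaders @ 800 MHz)
hd7970_oc  = tflops(2048, 1.2)  # ~4.915 TFLOPS per card at 1.2 GHz

print(ps4_tflops * 10)  # 18.43 -> the "10x PS4" bar
print(hd7970_oc * 4)    # 19.66 -> four overclocked 7970s, on paper
# CrossFire never scales at 100%, so real-world numbers land lower.
```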
Oh my god, I almost posted a comment saying there's no fucking way his PC was 10x current gen unless he had like 4x 7970s.
Didn't know how close I was o.o
Where did Iwata say that, and why have some developers (the ones who actually made good games for it) said the exact opposite?
He obviously knows something we don't.
How did you conclude this?
Honestly, anyone trying to say it's outright weaker than the PS3/360 isn't worth listening to. It would not be running ports at anything close to parity if a single part of the system were that huge of an issue. The CPU has comparatively limited SIMD functionality, but it has co-processors to offload specific tasks to, freeing a modicum of system resources. This is a change; well, the entire system is. It's literally taking a 4-stage-pipeline CPU from a decade ago... no, three of them! Slapping them together with refinements, and using the smaller die size to gain speed.
What co-processors? The audio DSP is official, but what else?
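A rough sketch of the co-processor argument in frame-budget terms; every number below is a made-up illustration, not a measured Wii U figure.

```python
# Frame-budget illustration of the co-processor argument above. Every
# number here is a made-up example, NOT a measured Wii U figure.

frame_ms        = 1000.0 / 60.0  # ~16.7 ms per frame at 60 fps
audio_on_cpu    = 2.0            # hypothetical cost of mixing audio on the CPU
everything_else = 13.5           # hypothetical cost of the rest of the frame

headroom_without_dsp = frame_ms - (audio_on_cpu + everything_else)
headroom_with_dsp    = frame_ms - everything_else  # audio moved to the DSP

print(round(headroom_without_dsp, 2))  # ~1.17 ms -> tight
print(round(headroom_with_dsp, 2))     # ~3.17 ms -> breathing room
# Offloading fixed tasks to dedicated silicon is how a modest CPU can
# still hit rough parity on ports, even without a clock-speed story.
```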