If such an overclock happened, as conservative as Nintendo is, it wouldn't be over 10%.
So, did the overclocking turn out to be true, false, don't know?
Only 1000%? Has anyone hooked the console up to a true/false meter yet? Perhaps it's really 1100% false.
Really, it's sad to see this ruin a good thread.
Actually, we really don't know. It just seems likely that it's false, particularly due to the extremity of the clock speed bump.
An overclock to the CPU of that degree is practically impossible unless the CPU was actually underclocked before, and if it were underclocked, one of the Wii U hackers would have noticed a long time ago. I think there's more chance of some RAM being freed up.
But like the other guy said, a clock increase of more than 10% isn't possible without risking complications.
E3 will confirm the visual capabilities of the WiiU.
The most entertaining thing that could happen is that this turns out to be true. It's just a bit of fun. Hell, if we knew all the facts, this thread wouldn't have been half as entertaining.
I know nothing about tech or power draw, but you guys are saying the Wii U draws like 30 watts at idle and only a few watts more when playing a game like NFS? So if there were a clock jump, how much more power would it draw, if playing a game only adds a few watts?
The Wii U is rated at 75 watts of electrical consumption.
Please understand that this electrical consumption rating is measured at the maximum utilization of all functionality, not just of the Wii U console itself, but also the power provided to accessories connected via USB ports.
However, during normal gameplay, that electrical consumption rating won't be reached.
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.
I say measure the wattage when Pikmin 3 comes out, for example, since that's a Nintendo exclusive; there might be a difference, there might not. But as of now, previous users have measured the wattage (Wii U menu, playing ZombiU) and it's more or less the same. Remember, there's another update coming soon.
Who knows how much farther Pikmin 3 has come from being a Wii port though. 3D Mario will likely be the earliest testament to the extent of the Wii U's power.
Nooo!!
No one took this as true, but it started a discussion about whether it's possible. The CPU at 3GHz was never accepted here.
I don't know what people here are saying; I only know what Iwata has said.
Iwata:
When is the WiiU being pushed to 40 watts?
Because it sounds like most games, if not all currently, are running around the 30's.
What if they gave themselves room to still allow games to reach in the 40's?
I wish people had taken such a stance against that horrible Digital Foundry write-up when it dropped.
I'm guessing that was accepted here by a lot of people despite the small contradicting confirmations.
That or Zelda. I don't believe this rumor, but Marcan hacked the Wii U when it first came out. Could he check again post-update, just to shut this rumor down? Also, just a question for my own curiosity: is it not plausible to raise the clock speed at all? Meaning, if they actually wanted to bump the clock speed, they could?
Funny thing they overshot on the specs in that article....
considering to this day no one actually knows the specs
This phenomenon is not exclusive to Nintendo hardware. This rumor is only being ridiculed because it is not particularly realistic. We also saw the same thing happen with the 8GB of GDDR5 RAM. Even though it turned out to be true, many people, Sony fans included, believed it was an impossible wish.

This is funny, though. I never see people so against something being viewed positively, even if it's false. When people misinterpreted the Cell as being an 8-core processor, you didn't see people jumping up this hard to denounce it. You didn't see this when people were blowing the GDDR5 thing out of proportion.
The way people come out to deny good rumors about this hardware is surreal. You would think that people believing this console more capable than the lowest-end hypothesis is causing them pain, with some of these responses.
I know what a load of nonsense this over-clocking thing is.
But there were some great improvements with a recent 3DS update that saw better/more consistent frame-rates in games. Perhaps new graphics drivers or improved low level code. Just saying.
I'd say the only real hope for performance updates is in the area of drivers - because apparently all system software for Wii U was grossly incomplete when it launched.
I mean, even background downloads caused games to suffer framerate problems - though I actually don't know if that's been fixed by the new firmware.
That it is being ridiculed is not what I'm referring to. It's how.
I don't remember there being too many rumors about the PS4's RAM before Sony's announcement, but that Sony "had" 8GB of GDDR5 is not what I was referring to. It's the godly level the RAM was elevated to when Sony announced it.
I've always wondered if it were possible to upclock the system, though. It has to have some form of voltage stepping for it to be able to underclock in Wii mode.
Out of curiosity, what's the WiiU's power consumption in Wii mode? Would be interesting, me thinks.

fail0verflow has a native-mode hack; the clock speeds Marcan made public are Wii U mode clocks. In Wii mode, the CPU is running at 729MHz (243 × 3).
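For anyone who wants to see it spelled out: the quoted figure is just a base clock times a multiplier, and it lands exactly on the original Wii's CPU speed (a trivial sanity check in Python; nothing here beyond the arithmetic from the quote):

```python
# Sanity check of the clock figure quoted above: in Wii mode the CPU
# runs off a 243 MHz base clock with a 3x multiplier, which is exactly
# the original Wii's 729 MHz CPU speed.
base_mhz = 243
multiplier = 3

cpu_wii_mode_mhz = base_mhz * multiplier
print(cpu_wii_mode_mhz)  # 729
```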
Holy fuck, what's with the insane hyperboles up in here? Yes, the 3.2GHz rumor is clearly false. However, that doesn't mean that this thread disintegrated into nothingness as a result. Goddamn, almost no one here even takes it seriously.
However, it did help spur some discussion, and even informed some people about the possibilities of overclocking hardware and what the repercussions are. Seriously, these "this thread has gone to shit" comments are useless as fuck.
Consider the source.
Does the Wii U GPU even have a "driver"? I know the 3DS does, but I was under the impression the Wii U didn't.
I don't know, but based on my personal experience since the update, a few things I've noticed stood out to me. The animated Wii U icon that appears every time you launch a game no longer has the hiccup it had before. A message appeared when starting up BO2, saying "your software is ready". I played BO2 yesterday, ground war domination, and it was smooth as butter. I also tried to get a hiccup out of NFS:MW by spinning the camera non-stop, but nothing.
The USB ports, he even says it in the quote. 2.5 watts per port x 4 ports = 10 watts.
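Putting the thread's numbers together (the 75 W rating, Iwata's ~40 W "realistic" figure, and the USB math above; the 2.5 W per-port figure matches the standard USB 2.0 high-power budget of 5 V × 500 mA), a rough sketch of the budget looks like this:

```python
# Rough Wii U power budget, using only figures quoted in this thread
# plus the standard USB 2.0 high-power port budget (2.5 W = 5 V x 500 mA).
rated_max_w = 75.0        # Nintendo's maximum rated consumption
usb_ports = 4
usb_per_port_w = 2.5

usb_total_w = usb_ports * usb_per_port_w      # power reserved for accessories
console_budget_w = rated_max_w - usb_total_w  # what the console itself can draw
realistic_w = 40.0                            # Iwata's "realistic" gameplay figure

print(usb_total_w)                # 10.0
print(console_budget_w)           # 65.0
print(realistic_w / rated_max_w)  # roughly 0.53 of the rated maximum
```

Which lines up with the measured ~32-33 W readings sitting comfortably below both figures.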
One thing that did stand out from our Wii U power consumption testing: the uniformity of the results. No matter which retail games we tried, we still saw the same 32w result and only some occasional jumps higher to 33w.
Unlike the 3DS update, where pretty much everyone said they were noticing performance improvements prior to any speculation about an upgrade, I haven't heard any talk of Wii U game improvements prior to this new overclocking rumor.
So, can we take this as first hand a confirmation of improved game performance without a game update?
Why consume a foreign product when your national product is just as good, if not better? That troll thread on GameFAQs is pure comedy gold.
Isn't 3.24GHz really close to the PS4/720 CPUs? Or am I thinking of something else? And the GPU at 800MHz is a lot better than what people thought before the Wii U released.
Is it feasible that they did it to keep heat down, but later deemed it safe enough for extended use within the thermal ceiling, assuming the power supply could support it?
Someone measure the power draw; if it's still ~33 watts, there was no clock speed change.
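This is a decent test because of how dynamic power scales. A first-order sketch (assuming the usual CMOS rule that dynamic power scales with frequency times voltage squared, and using the widely reported 1.24 GHz Espresso clock as the baseline, which is not a figure from this thread):

```python
# First-order estimate: CMOS dynamic power scales roughly with
# frequency * voltage^2. If the rumored 1.24 -> 3.24 GHz jump were
# real, even with no voltage increase the CPU's share of the power
# draw would grow ~2.6x, which would be hard to hide in a ~33 W
# wall reading. (1.24 GHz baseline: widely reported, not official.)

def dynamic_power_scale(f_old_ghz, f_new_ghz, v_old=1.0, v_new=1.0):
    """Relative dynamic power: (f_new / f_old) * (v_new / v_old)**2."""
    return (f_new_ghz / f_old_ghz) * (v_new / v_old) ** 2

scale = dynamic_power_scale(1.24, 3.24)
print(round(scale, 2))  # 2.61
```

In practice a clock jump that large would also need a voltage bump, making the power increase even bigger, so an unchanged ~33 W reading really would rule it out.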
I'm obviously going to wait until we get more info and someone tests the power draw and stuff like that, but Nintendo gave the OS 1GB when we know Nintendo will unlock half of that for games later on, so I think it's possible Nintendo underclocked the Wii U for launch. We'll have to see, of course.
Nintendo may have underclocked because their tools weren't ready or their OS wasn't ready, which clearly it wasn't and still isn't.
Not taking this seriously yet.
The 3.24 could be the total amount, meaning the CPU cores were lowered to 1.08 each (stability issues?).
The GPU at 800MHz, I think that's possible (GPGPU?).
I could see this being an e3 announcement, but I honestly doubt it's true.
Still the speculation is exciting.
A point I'd like to make:
This isn't unprecedented; Nintendo intentionally built the Nintendo 64 to be upgraded at a later time, doubling the RAM through an expansion slot.
They also built the SNES to allow for the use of the SuperFX chips.
I'll admit something: it's probably just my imagination, but ever since the update my Wii U's fan is louder than my PC fan. Probably my imagination, or my PC's fans have gotten quieter.
Yeah, that's true. Has anybody tried measuring Nintendo Land? I have the game but not the tools to measure. When I play Nintendo Land, the fan seems louder than it was weeks ago; could be me, though.
The plan is going smoothly, my friend. FB3 deemed too sh!t for this kind of super power; now we'll get Frost Bites in our asses with a 4 attached.

So, given the current enthusiasm of the WiiU crowd, if the speculations are right, the WiiU could actually compete with the PS4/XB3 (which according to PC GAF are already outdated to the point of a WHOLE generation)?
I mean, one of the fantasy sellers here has already convinced me that WiiU will be better at lighting and DoF than next gen last gen outdated hardware and so this talk of GPGPU isn't surprising.
Looks like Nintendo may not have to worry about "down" ports after all, if the GPU can be upclocked.
Hoping Eurogamer does a comparison; we want to get this over with.
I would rather it be anyone other than Eurogamer. They should be ashamed of the bias, bashing, fabrication and clickbait writing in that article they made.