I'm guessing nobody saw my previous post?
http://www.neogaf.com/forum/showpost.php?p=39730241&postcount=7416
Some games need more power. For those experiences there's hardware out for that. And I'll buy them.
But hardware cannot make a game great. It can make a great game better, but the core game has to be sound. Otherwise you're just polishing a turd.
Looking back Nintendo launched GC in Sept. and announced they changed the clocks in May. So if we're looking at a Nov. or so release, now is the time they would likely do it.
So all this OoOE, efficiency, modern talk makes no difference? Jesus, how low have they clocked it exactly if a PS360 fighting game is struggling? There's no way it's going to make it through a generation if that's the case.
But then again, my thought is: how can that be the case when games like ME3, Arkham City, ACIII etc. must all need much more processing power than Tekken? Those devs must be having some real headaches if that is the case.
There's just so much that doesn't add up... so much doesn't add up... ummm, shit, I just realised this is Nintendo we're talking about!
It makes a difference, of course; the code could be unoptimized for this CPU, and that hurts its efficiency. But it certainly seems that it's clocked under 3.2 GHz, and/or engineers are saying that if it were clocked a tad higher it could match X360 performance whilst running that code (we don't know exactly under which conditions, though).
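Rough back-of-the-envelope illustration (the per-clock figures are assumptions for the sake of argument, nothing from the interview): matching Xenon on Xenon-style code needs roughly clock × work-per-clock ≈ 3.2 GHz-equivalent. If the Wii U core managed about 2x the work per clock on that code, ~1.6 GHz would already be on par; if it only managed about 1.3x, it would need around 3.2 / 1.3 ≈ 2.5 GHz, which would fit the "a tad higher and it matches" reading.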
Might depend on how engines are written/optimized.
No matter how you put it, it doesn't seem to be far off, though.
Aye, I've suggested mistranslation/out-of-context a few times. Let's hope someone can tweet him for a comment on the CPU remarks.
Nintendo being Nintendo. It's always for better or worse and in this case it's pointing to the latter.
It's not surprising, since we've had indications the CPU was the problem since Arkam's posts back in Jan. or Feb. pointed in that direction. But doing it for no other (possible) reasons than keeping power consumption as low as possible and having the console as quiet as possible is asinine. I'm not saying they should push it to the point where it can cause a high failure rate, but Nintendo has a bad habit of overdoing it with stuff like this and its ugly head is rearing again. Hopefully they will back off of that some before launch, since clocks can still be changed at this point at least.
If it makes Nintendo blink and up the clock, I support it. The CPU has been pointed to as a problem for too long and, like I mentioned, this is likely Nintendo unnecessarily holding it back for some perceived need.
Yeah, but again, if it was that close surely they could easily find that little bit extra they need from looking into offloading to the DSP, I/O, GPU etc., unless they have already done that and there are still problems (surely not). OR, is there any circumstance where only CPU clock speed will do and no amount of DSP/I/O offloading makes a difference? What's with all the creative solutions? I think it's a bad translation.
I dunno.
I wouldn't necessarily say it's "Nintendo being Nintendo" (stingy with clocks) if they've decided to go with an architecture like the 47x series. I don't think there are any historical examples of those bad boys going higher than 2GHz (or even 1.6, really), and running straight code from Xenon isn't going to produce any ground-breaking results. Of note: of the 60 or so watts the box is going to be pushing, most of it is going to be the GPU. Most of IBM's embedded solutions like the aforementioned 47x's eat up <2W per core.
I wonder if it's possible to "unlock" more speed as the PSP did well into its life. Perhaps Nintendo kept the CPU clocks low as they attempted to "balance" the system, as they did with Wii and GCN, where the CPU clock = 3x GPU clock. But if the CPU is holding devs back, it may not be as balanced in practice as it perhaps is theoretically. Still, how high could they really clock it? A modified 400 series core is currently the best running theory and those top out at 2 GHz.
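(For reference, the 3x ratio does check out on the older machines: GCN paired a 486 MHz Gekko with a 162 MHz Flipper, and Wii a 729 MHz Broadway with a 243 MHz Hollywood, both exactly 3:1.)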
Is the CPU issue just a matter of cost (not wanting to charge over $300 and still make a profit), or is it more to do with the fact they love tiny consoles, which is obviously a massive problem regarding heat from both the CPU and GPU?
Probably slightly more the latter than the former, but both are factors. Nintendo would simply rather produce a smaller, quieter, more reliable console than a behemoth beast that red rings every two years.
I disagree.
What was the cause of the RROD, the CPU or the GPU overheating?
Mostly the GPU.
OK. For some reason I was under the impression that there was normally a six-month lead time on manufacturing.
Less than six months is very tight for components like the CPU and GPU. But this isn't manufacturing; I bet they could change the clocks on a console with a firmware/software update, provided they set everything else in stone before launch.
It's not out there. He got the percentage wrong.
There are some games on the 360 where audio processing does take up an entire core.
I dunno.
I don't even think they're trying to be negative (or positive); they were trying to be insightful about how it's going for them. Their experience doesn't need to be reflected everywhere, but it's still their experience, and a slightly higher-clocked solution seems to have been the sweet spot for them.
Now, of course that has to be taken with a grain of salt; their tech seems very PS3/X360-centred (and it's good tech at that, meaning they took years to get it that efficient), and they can't change that by snapping their fingers. Wii U is clearly a different beast.
Gotta start reading the news you all are talking about before entering this thread, because the Tekken thing sounds neither as roundly positive nor as negative as it's being interpreted.
(personally, I once again take comfort in the fact that at least we're jumping to HD with a good DD and online solution)
Gimped CPU comes as no surprise. Falls in line with what I've been saying about the system's overall capabilities: slightly more powerful than current gen.
I'd personally drop "neither" and say it's both. Positive in that Wii U can be even more capable than it currently is and negative because at least so far Nintendo is choosing not to allow it to be more capable.
But they can unlock it via firmware update like it happened with the 3DS right?
Impossible to theorize on the porting effort, I'm afraid. Not enough data on their tech innards or the Wii U CPU.
EDIT: Stealth edit by lostinblue. :|
I forget NeoGAF replies come really quick, so I tend to edit everything if I want to add something.
It's not some sort of spite thing. It's just that on current generation consoles, the most sizable base of active digital consumers has been on the X360 and PS3. Bear in mind, this game was green lit more than a year ago when I did the deal with Marvel.
Don't worry though, on some future unannounced Capcom projects, Nintendo platforms will get digital love from us.
I was having problems conveying this.
We're talking about two very different CPUs that are probably in a similar ballpark performance-wise. In that situation you can easily come across problems where optimisations to the code for the older CPU are actually hindering performance on the newer CPU. If you don't have a significant performance advantage to negate that, then you need to put in some work to change the code to get it working well (the "creative solutions" being referred to). Of course this isn't an issue if you can write the code to work well with the new CPU from the beginning.
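To make the idea concrete, here's a minimal, purely hypothetical sketch (not taken from any real engine or SDK) of the kind of hand-scheduling that pays off on an in-order core like Xenon but buys little on an out-of-order core, where it can even hurt through extra register pressure and code size:

```cpp
#include <cstddef>

// Straightforward loop: an out-of-order CPU will reorder the loads and
// multiply-adds on its own to hide latency, so this is usually fine as-is.
void scale_add(float* out, const float* a, const float* b, float k, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] + k * b[i];
}

// Hand-unrolled version: four independent chains per iteration so an in-order
// pipeline always has work while earlier loads are still in flight. This is
// the sort of "optimisation for the older CPU" that a different core may not need.
void scale_add_unrolled(float* out, const float* a, const float* b, float k, std::size_t n) {
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        float a0 = a[i], a1 = a[i + 1], a2 = a[i + 2], a3 = a[i + 3];
        float b0 = b[i], b1 = b[i + 1], b2 = b[i + 2], b3 = b[i + 3];
        out[i]     = a0 + k * b0;
        out[i + 1] = a1 + k * b1;
        out[i + 2] = a2 + k * b2;
        out[i + 3] = a3 + k * b3;
    }
    for (; i < n; ++i)  // handle the remainder
        out[i] = a[i] + k * b[i];
}
```

A codebase full of the second style (plus VMX-specific paths, load-hit-store workarounds and so on) doesn't automatically translate into a win on a different CPU; presumably that's where the reworking comes in.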
By digital love he means virtual-service releases? Mock if old.
I hear you. Got to say though, I greatly respect Nintendo for their build quality; I'd hate it if they dropped down to Sony, or God forbid MS, standards. The only Nintendo console that failed on me was a DS Lite after 4 years of use, and they repaired that for free because it was a known manufacturing fault - micro cracks on the mainboard. I'd love to see Sony or MS help me out with a 4-year-old failed console.
Remember that comment yesterday from Katsuhiro Harada stating "it would be distracting that the Gamepad plays a big role with fighting games"? Well, apparently he never said that; he said "difficult for fighting games". He's mad at Gamespot for changing his words, which makes you wonder if the CPU comment is wrong too.
Looking at the small screen [Wii U GamePad] and the big [TV] screen at the same time is pretty difficult for a fighting game. So we're thinking of making it useful as a way of having shortcuts.
Or, by making progressing through the game more convenient. Or by playing alone on the GamePad screen.
https://twitter.com/Harada_TEKKEN/status/222703388025032704
Also he said
https://twitter.com/Harada_TEKKEN/status/222711445886996480
"WiiU gets 'trolled' too much as it is. I like Namco on Nintendo platforms"
We also have a possible indication the CPU can be clocked up to 4GHz.
That makes me a little frustrated as well; what might be the reason they can't wield the SDK to easily off-load their CPU tasks? Unabashed naivety talking here, but I would have thought Nintendo would make it easy for devs to understand what needs to be done and how to do it, more so for them if they'll be working on a major Nintendo IP soon.
I suppose if what you're saying is that they would just like a little more clock speed so they can piss out a minimum-effort port, because optimising with the DSP etc. in mind is too much messing about, then that's that.
Does anyone know or can anyone theorise on that? Tekken engine: PS360-centric; how much work might be involved in basic optimising for the Wii U architecture, in, say, man-hours or what-have-you?
Games that cost US$100 million to make on the PlayStation 4 and Xbox 720 will also cost US$100 million to make on the Wii U. The industry problem of overly expensive game budgets and a shitty market will be problematic for everyone, and the Wii U will solve nothing in this field. As I said earlier, this is a 'now' problem on 'current' hardware, and it will be a similar problem for similar games on the Wii U.
Where publishers and developers will find success with smaller budget titles will depend on hardware market penetration, sustainable software sales, and distribution models that benefit developers/publishers working on a small budget. If Nintendo is unable to provide these they won't be helping anybody.
Tekken Producer said the Wii U CPU is slightly slower than the PS3/360 CPU and a lot of people are taking it that the Wii U CPU sucks even though that's not what it means.
From what I remember, POWER7 is only made with eight cores. The 4- and 6-core versions are just the 8-core with disabled cores.
A2 is in-order.
So now we have an anonymous dev and a public dev both talking about the CPU having a "low" clock.
There's no way to spin this, it's terrible. There's also no more "why are we trusting anonymous devs?!?" excuse, the Tekken guy just outright publicly said it. Lower clocked than 2005 hardware (not "5-6" years old, but 7). Saying it might be more efficient doesn't make sense based on what he said, that they have to use creative solutions just to get it up to par. If it was more efficient and had some custom magic in it he wouldn't have said that. This is the Wii all over again.
Do Kojima and Crystal Dynamics speak for Nintendo?
I don't see the connection you are trying to make here.
Just because some developers do not want to bring their games to the console doesn't mean Nintendo is not aggressively pursuing third parties.
And they have already shown through their hardware design that they are interested in third parties, by announcing their pro controller, for those third parties not interested in working out the gamepad or wiimote.
Nintendo has worked on their online structure, allowing developers to basically self publish their own titles.
They are most likely offering a console with modern feature sets, yet priced as close as possible to mainstream budgets.
Nintendo is also publishing several games for third parties: Project P-100, Ninja Gaiden 3.
Going after third parties is not just about money-hatting, it can also be about offering an ecosystem that they can thrive in.
Nintendo's mission is to put as many consoles as possible into the hands of users using their key franchises. That means Mario, which is a title bought by both core and casual players. It's up to third parties to take advantage of it. Those that don't will most likely become dinosaurs.
I am not sure why they are trying to achieve quiet operation; it's pointless. That damn Wii disc drive could be heard through half of my house, a freaking earthquake (I have a high sensitivity to faint sounds and vibration; must be a side effect of my long-distance clear-sight ability). It wasn't the cooling at all when I finally checked it out, though I thought it was. What will make the Wii U disc drive less noisy, inertial dampeners, for crying out loud?
Blame Miyamoto's fear of moms.
Are you out of your mind? Just keep reading what we're saying: the Tekken guy is a jerk, he's not an engineer, he probably speaks of silly things like clock speed. Developers talk whatever shit they want; they need to fill in their communication for the marketing. They just tell you their gut feeling when they have no idea. His team has an idea, but he's just playing with customers.
I know I said he is not a technical head, so he's not being overly technical in his explanation, or perhaps he couldn't elaborate on it, so it should be taken with a grain of salt.
How on Earth can they keep even an on-par Xenon-type CPU, 2GB of RAM and a ~500 GFLOPS GPU cool in that thin case?!
My gaming PC's case is the size of 6 or 7 Wii Us piled on top of one another and the CPU and GPU fans still go like crazy!
The GPU will most likely be closer to 600 GFLOPS (not that it means much).
The CPU thing goes back to the way the Wii U handles graphics.....it seems to be all about the GPU.
The CPU doesn't have to worry about the sound department since the system has a dedicated DSP.
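Just to illustrate that point (a made-up sketch; mix_on_cpu and dsp_submit_mix_job are hypothetical names, not real Wii U SDK calls): the per-frame mixing loop below is the kind of work that can eat a big chunk of a core when there's no audio DSP, whereas with dedicated hardware the CPU only fills in a job description and hands it off.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Software mixing on a CPU core: sum every active voice into the output
// buffer each frame. As noted above, on some 360 titles this kind of audio
// work reportedly occupies an entire core.
void mix_on_cpu(int16_t* out, const std::vector<const int16_t*>& voices,
                const std::vector<float>& gains, std::size_t frames) {
    for (std::size_t f = 0; f < frames; ++f) {
        float acc = 0.0f;
        for (std::size_t v = 0; v < voices.size(); ++v)
            acc += gains[v] * static_cast<float>(voices[v][f]);
        if (acc > 32767.0f)  acc = 32767.0f;   // clamp to the 16-bit sample range
        if (acc < -32768.0f) acc = -32768.0f;
        out[f] = static_cast<int16_t>(acc);
    }
}

// With a dedicated DSP, the CPU's share shrinks to describing the work...
struct MixJob {
    const int16_t* const* voices;      // each voice's sample data
    const float*          gains;       // per-voice volume
    std::size_t           voice_count;
    std::size_t           frames;
    int16_t*              out;         // destination buffer
};

// ...and submitting it. Stubbed here; on real hardware this would queue the
// job for the audio DSP and cost the CPU next to nothing.
void dsp_submit_mix_job(const MixJob& /*job*/) {}
```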
Games being ported to the Wii U from Xbox 360/PS3 will use code made from the ground up for those older systems with older technology. The Wii U will be able to handle those ports just fine with some tweaking to how the CPU/GPU work together, but developers are not going to completely re-write code for the Wii U from scratch, otherwise it would be similar to trying to port to the OG Wii (ok, maybe not that hard) and developers would steer clear of that noise. All third parties want to do is make the game run on the Wii U just well enough to look the same as, or maybe a tad better than, the HD twins; other than that, I don't think they are going to care to put more work into "figuring out" how to get better use of the CPU.
Right now developers aren't doing that for the most part and are still in Xbox 360 mode.....which is totally understandable
Games built for next-gen (PS4/Xbox 3) will be able to port to the Wii U due yet again to the GPU being modern with compute shaders and all the DirectX 11 jazz.
Games built from the ground up and made for Wii U specifically will probably be able to match closely the games on PS4/Xbox 3 in terms of graphics if development uses some of the unique "Custom Nintendo Features" that are unknown at this time i.e. just like the Gamecube did with Resident Evil 4, Star Fox Adventures & Rogue Squadron II/III despite the Xbox 1 being about twice as powerful in technical terms.
Huh?...
I was asking someone in the know how such a thin case would be able to keep the Wii U from overheating.
Maybe you quoted the wrong person.
Mostly on paper only, as PS2 and Xbox inflated and gave theoretical performance as a measure while Nintendo went with real-world scenarios (that could be surpassed). It still was supposed to be a little more powerful going by specs alone, seeing the more expensive (and familiar) parts, compliant feature set, Nvidia feature set, hard drive and the effort to trump any feature the other platforms had by the widest possible margin, but it's not as linear as everyone makes it out to be (not twice as powerful by the longest shot); especially if Xbox bottlenecks play a part.
Most of the negative rumors were anonymous.
I'm not sure wsippel ever followed up on that (if he would pop in and correct me I'd be happy), but there exists no Broadway-based (which this CPU isn't) or 47x-based design that goes past 1GHz or 2GHz respectively. I don't think Waternoose ever got that high either - anyone ever tried overclocking their 360 or PS3? lol
Wii U offers the same large level 3 cache found in the POWER7 line and processing technology as Watson (another POWER7).
Anyways, this is getting wack. IBM called it. You can disagree on its definition (i.e. opinion) but facts are facts.
As far as specs go, GameCube CPU speed was 486 MHz vs 733 MHz for the original Xbox and 299 MHz for the latest PS2 models. That is roughly 50% faster in clock speed for Xbox against GC and somewhere around 145% against PS2.
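Working those clocks through: 733 / 486 ≈ 1.51, so Xbox about 50% faster than GC in raw clock; 733 / 299 ≈ 2.45, about 145% faster than PS2; and 486 / 299 ≈ 1.63, so GC about 60% faster than PS2.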
This clock difference, while substantial, didn't make for a general situation where ports were impossible or games in general lagged a generation behind on the slower consoles. What we are talking about now is a difference of a "little bit", so it obviously couldn't make much difference.
Now for the argument that "in the past generation what made the difference was not MHz, but architecture": that is my point. Wii U, while a "little" bit slower than 360 in clock, is likely going to be better in general CPU terms and comparable in functionality with its counterparts. The difference in MHz against current and next gen could be pretty much negligible.
As far as we do know, the Wii U has 3MB of L2 cache.
Just to point out, the POWER7 uses up to 32MB of L3 and 2MB of L2.
fixed
The level 3 is implemented in eDRAM, which is what the Wii U has.
It doesn't work like that exactly (Watson isn't simply "another POWER7"; it was some crazy number of POWER7s put together. It was a supercomputer on a whole other level).
Also, you have to be careful about how much you read into IBM's statements. Fourth Storm is right in that regard; from what we do know (I don't know how far back you've been following), the Wii U CPU is not a POWER7 and there are no facts (as you are claiming) that support that. There will be similarities to an extent, but we don't know how much it actually has in common. For one thing, there is no (as far as we know and have heard) L3 cache at all in the Wii U CPU, and it certainly isn't anywhere near 32MB. It apparently has 32MB of eDRAM, so you might be getting that mixed up.
As far as we do know, the Wii U has 3MB of L2 cache.
Just to point out, the POWER7 uses L3, not L2.
Again, everything is still more on the speculative side of things until we get a Wii U and rip it open and see what's inside, but people who have seen spec sheets have confirmed that.
Even in IBM's release they did not outright say the Wii U CPU is a POWER7 (which we did think it was for a while).