
Wii U Community Thread


Van Owen

Banned
I'm sure Harada got some flak from Nintendo for the comments. I don't see how saying the clock speed is low can be taken out of context. He didn't mention anything about a bad translation...

It's humorous that stuff that IdeaMan or other people here say will be taken as fact, but a credible website reporting from devs that would like to remain anonymous is bullshit lol.
 
I'm sure Harada got some flak from Nintendo for the comments. I don't see how saying the clock speed is low can be taken out of context. He didn't mention anything about a bad translation...

It's humorous that stuff that IdeaMan or other people here say will be taken as fact, but a credible website reporting from devs that would like to remain anonymous is bullshit lol.

A low clock speed doesn't mean the CPU is slow...

That's what was taken out of context. What the hell is wrong with you?

You seem to have some problems processing that fact...

If an anonymous dev said "the 720 is weaksauce", it would be dismissed as fake! But hey, this is against the Wii U, so anonymous dev -> GOSPEL!!!
 

JAYinHD

Member
I'm sure Harada got some flak from Nintendo for the comments. I don't see how saying the clock speed is low can be taken out of context. He didn't mention anything about a bad translation...

It's humorous that stuff that IdeaMan or other people here say will be taken as fact, but a credible website reporting from devs that would like to remain anonymous is bullshit lol.

The problem is that for every anonymous developer that says something negative about it, you have known developers that have something positive to say.

I think some folks' frustration may come from people believing the so-called anonymous developers over actual developers.
 
The problem is that for every anonymous developer that says something negative about it, you have known developers that have something positive to say.

I think some folks' frustration may come from people believing the so-called anonymous developers over actual developers.

There are developers on record saying the Wii U has, I think they said, "a really great CPU".

Van Owen. Discuss.

Things like the Wii U are becoming very sexy with what you can do with the controller, especially with what you're able to do with the motion tracker or whatever the sub gameplay you'd get to see on there. I think the machine itself will have one of the best looking versions of the game because they've got more RAM [and] they're late in the cycle so they've got this really great processor.
~ Martell, chief creative officer and director at Gearbox.
 

Van Owen

Banned
A low clock speed doesn't mean the CPU is slow...

That's what was taken out of context. What the hell is wrong with you?

You seem to have some problems processing that fact...

If an anonymous dev said "the 720 is weaksauce", it would be dismissed as fake! But hey, this is against the Wii U, so anonymous dev -> GOSPEL!!!


He said they had to come up with "creative ways to get around it". That doesn't scream "high performance at a low clock rate" to me.

If a reputable site had anonymous devs saying that about Durango, I would believe there was some truth to it, yes.
 
As far as I know, only Pitchford has said that. I'm not sure what he's comparing it to, and I haven't seen his game running on Wii U.

You would never believe an anonymous dev saying that.

How do you explain that AC3, which is far more demanding on the CPU (tons of NPCs onscreen at once), runs on Wii U, as well as Tekken Tag Tournament 2?
 
GHz =/= performance, dude.

Seriously... A low-clocked i5/i7 runs circles around a much higher-clocked AMD FX.

And the Athlon 64 ran circles around Intel's NetBurst architecture (P4) while being clocked lower.

You really don't miss ANY chance to just TROLL AWAY about Wii U stuff you don't know, do you?

Just for what it's worth, there's a limit to how much clock frequency doesn't affect performance. A 3.2GHz Pentium 4 was most certainly not lower performing than a 2.0GHz Athlon XP or Athlon 64. At worst, it was slightly faster overall. And the P4 was known to be an architecture with particularly miserable IPC, having been architected with a ridiculous number of pipeline stages to eke out the fastest possible clock. The Xenon doesn't have that particular reputation.
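
To put deliberately made-up numbers on that (these are illustrative placeholders, not measured figures for any of the chips above): a core's sustained throughput is roughly clock * average IPC, so a lower clock only loses when the IPC gap doesn't cover it.

# Naive single-core throughput estimate: instructions/sec ~ clock * average IPC.
# The clock and IPC values are made-up placeholders, purely for illustration.
def throughput_gips(clock_ghz, avg_ipc):
    return clock_ghz * avg_ipc  # billions of instructions per second

wide_low_clock    = throughput_gips(clock_ghz=2.0, avg_ipc=2.0)  # hypothetical wider, modern core
narrow_high_clock = throughput_gips(clock_ghz=3.2, avg_ipc=1.0)  # hypothetical long-pipeline core

print(wide_low_clock, narrow_high_clock)  # 4.0 vs 3.2: the lower-clocked core wins here,
# but drop its average IPC to 1.5 and it loses (3.0 < 3.2), which is the P4-era caveat above.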

2.0GHz sounds very low to me. Unless the GPU is insanely fast. Which we know probably isn't the case.*





That's only nine bucks per game! :D



* What we've heard is that it's probably comfortably faster than what's in existing consoles, but certainly not to a staggering degree.
 

Van Owen

Banned
You would never believe an anonymous dev saying that.

How do you explain that AC3, which is far more demanding on the CPU (tons of NPCs onscreen at once), runs on Wii U, as well as Tekken Tag Tournament 2?


Rendering multiple NPCs is more GPU-intensive, I would think. It's not like they all have complex AI routines. AC3 was running at sub-30fps at E3 though.

So can someone explain the general consensus on the CPU to me? Because like a day ago everyone said the system was probably designed so that the GPU would take part of the load off the CPU. But now people are arguing that the CPU is really good?
 
idk about the GamePad, but I'm sure the CC Pro will be around the same price as the DualShock/Xbox wireless controller.

For reasons of clarity, I would like to point out that the "Classic Controller Pro" is a controller for the Wii, not the Wii U; it connects to a Wii Remote. The Wii U Pro Controller is what you're thinking of.



The Wii U will be a solid 3-4x more powerful than 360/PS3. It's a modest jump, but not as huge as the rumored specs of the Xbox 3/PS4.

A solid 3-4x as powerful as 360/PS3 would be on the very high end of what has been discussed on this forum, perhaps slightly above that range in fact.


here we go with nonsense multipliers again.

my foot is 7.334 times more powerful than my blanket.

I've always suspected that your blanket was last gen. Too colourful. Mine is brown.
 
2GHz CPU and compute shaders = enough to create a pixel-perfect shadow for you to argue with.

This ^^

Lol.

Gaf is a really weird forum imo. You can get banned for port begging but major trollage is fine.

Van Owen, if you're so dead set against the U and are convinced that it's going to be severely underpowered (despite leaks from certain good-natured and thoughtful forum members saying the complete opposite), then why on earth do you keep posting in this and every other U-related thread...? You seem to be obsessed and, seriously, that can't be healthy. :eek:/

You don't like the Wii U. We get it. You're not going to get one. We get it. So do us all a favour and please give it a rest; I normally have a great deal of patience with trolls, but for some reason you're really getting under my skin lololol.
 
Rendering multiple NPCs is more GPU-intensive, I would think. It's not like they all have complex AI routines. AC3 was running at sub-30fps at E3 though.

So can someone explain the general consensus on the CPU to me? Because like a day ago everyone said the system was probably designed so that the GPU would take part of the load off the CPU. But now people are arguing that the CPU is really good?

The PS360 versions didn't run at 30 fps either.

Collision detection for all NPCs, for example...

I once stated that as a possibility for Wii U, and now that AMD Jaguars are rumored to be in Durango/Orbis, other people think those consoles will have a beefy GPU to make up for the fact that Jaguars are low-end APUs and really nothing to orgasm over.
 
Rendering multiple NPCs is more GPU-intensive, I would think. It's not like they all have complex AI routines. AC3 was running at sub-30fps at E3 though.

So can someone explain the general consensus on the CPU to me? Because like a day ago everyone said the system was probably designed so that the GPU would take part of the load off the CPU. But now people are arguing that the CPU is really good?

But none of that matters for the purpose of this discussion. Gearbox went on record and said it has a really great processor. No one has gone on record and said it's got a really shit processor. End of.

Edit: people aren't arguing that the CPU is really good; we are arguing that someone went on record to say it is really good, and no one has gone on record to say it is really bad.
 
Rendering multiple NPCs is more GPU-intensive, I would think. It's not like they all have complex AI routines. AC3 was running at sub-30fps at E3 though.

So can someone explain the general consensus on the CPU to me? Because like a day ago everyone said the system was probably designed so that the GPU would take part of the load off the CPU. But now people are arguing that the CPU is really good?

I think some may argue the CPU is more modern, and may perform more efficiently in some areas. But as I understand it, the general gist is that the GPU would indeed take some of the load off the CPU.
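
A minimal sketch of what "taking load off the CPU" means in practice, assuming the work in question is data-parallel (NumPy is just a stand-in for a compute shader here; this is an illustration of the idea, not how any actual Wii U engine is written):

import numpy as np

# Hypothetical per-frame update for N crowd NPCs: each one drifts toward a target.
# The loop is the naive serial CPU version; the batched version is the shape of
# work you'd hand to GPU compute (NumPy only stands in for the GPU here).
N = 10_000
positions = np.random.rand(N, 2).astype(np.float32)
target = np.array([0.5, 0.5], dtype=np.float32)
speed = 0.01

def update_cpu_loop(pos):
    out = pos.copy()
    for i in range(len(pos)):                 # serial: one NPC at a time
        out[i] = pos[i] + speed * (target - pos[i])
    return out

def update_batched(pos):
    return pos + speed * (target - pos)       # one data-parallel operation over all NPCs

assert np.allclose(update_cpu_loop(positions), update_batched(positions), atol=1e-6)

The point is only that per-entity work with no cross-entity dependencies maps naturally onto GPU-style compute, which is why a modest CPU paired with a GPGPU isn't automatically a bottleneck for that kind of workload.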
 

Van Owen

Banned
You don't like the Wii U. We get it. You're not going to get one. We get it. So do us all a favour and please give it a rest; I normally have a great deal of patience with trolls, but for some reason you're really getting under my skin lololol.

I don't dislike Wii U. Do I think it's super powerful? No. I'm trolling for discussing hardware in the hardware community thread? Ok...


The PS360 versions didn't run at 30 fps either.

Um, the demos I saw had it running at 30 or very close to it. Much smoother than Wii U, either way.


But none of that matters for the purpose of this discussion. Gearbox went on record and said it has a really great processor. No one has gone on record and said it's got a really shit processor. End of.

Randy said that, yes. Multiple others have said differently. I never called it "shit". Discuss?
 

AzaK

Member
SMT in the CPU became debatable apparently starting with the 2nd dev kit. Clock still not known.

A modified R700 that will have similarities to the E6760.

2GHz CPU and compute shaders = enough to create a pixel-perfect shadow for you to argue with.
3GHz with SMT + compute would be better. From the rumours we're hearing about the Wii U CPU, it definitely sounds gimped. Or should I say, less than expected and possible. It will be interesting to see what it is, and I hope Nintendo hasn't taken their heat/power usage to too much of an extreme.

You really don't get it. A 2GHz Wii U CPU could easily smoke the Xbox 360 CPU even if that one is clocked at 3.2GHz, because the architecture is way different and more modern.

GHz DOES NOT MEAN PERFORMANCE

But 2GHz with no SMT (if true) feels like it might be pushing it.
 

DrWong

Member
As said on the previous page, I can see Nintendo Land as a retail/eShop game priced low - around 19.90/29.90 - with only a few levels per mini-game and a strong DLC-based model to expand the experience.
 
A solid 3-4x as powerful as 360/PS3 would be on the very high end of what has been discussed on this forum, perhaps slightly above that range in fact.

3-4x is pretty much what the gang here has speculated/confirmed with bgassassin, Ideaman & others.

The leaked dev kit specs from last year are at least on par with that as well.

The GPU & RAM of the Wii U seem to be the strong suits thus far for the system, with the CPU being out-of-order and not able to be used to its full potential in 360/PS3 ports.
 
I'm pretty certain that Nintendo Land will be packed in, as will a Wii Remote Plus and Nunchuk.

And when Wii Sports U comes along, they'd better have golf and bowling with online play!
 
3GHz with SMT + compute would be better. From the rumours we're hearing about the Wii U CPU, it definitely sounds gimped. Or should I say, less than expected and possible. It will be interesting to see what it is, and I hope Nintendo hasn't taken their heat/power usage to too much of an extreme.



But 2GHz with no SMT (if true) feels like it might be pushing it.

Katsuhiro Harada said that the clock speed is only "a little bit" slower than the Xbox 360's.

Devs were probably expecting a 3.5GHz CPU but instead got something clocked at 2.8 to 3GHz.

http://www.digitalspy.com/gaming/ne...compared-to-360-ps3-says-tekken-producer.html
 
I don't wish to take sides here, as it is really none of my business, but I think some of you are being a tad too harsh on Van. His comments ARE a little abrasive at times, but at least he sticks around for discussion afterwards. I'd much rather deal with someone like him than your average, drive-by, ignorant troll.

Anyways, I agree with Mihael and hope that the next thread deals more with the software aspect. By then we should hopefully have more info on the games themselves. Don't get me wrong, the hardware talk is fascinating, but some people tend to lose sight of what's really important.
 
Sorry. I don't think it's anything other than a very marginal leap over current gen.




So now every anonymous source in every facet of reporting is bullshit?

Can't name another source?

That's pretty weak, you know?

As far as I know there was ONE anonymous source, and only one. Not multiple...
 

Van Owen

Banned
Can't name another source?

That's pretty weak, you know?

As far as I know there was ONE anonymous source, and only one. Not multiple...

It's not like Harada really disputed the other story. He just said it was blown out of proportion.

There have definitely been multiple sources saying the overall system specs are very close to current gen.
 
3-4x is pretty much what the gang here has speculated/confirmed with bgassassin, Ideaman & others.

The leaked dev kit specs from last year are at least on par with that as well.

The GPU & RAM of the Wii U seem to be the strong suits thus far for the system, with the CPU being out-of-order and not able to be used to its full potential in 360/PS3 ports.

Yup. On paper we're looking at 3 times more powerful than the 360 going by what's been leaked so far - 3MB eDRAM for the CPU against 1MB, 32MB eDRAM on the GPU against 10MB, 2GB of RAM (with 512MB reserved for the OS) against 512MB.

Mix in a DSP and an OoOE CPU and we should be looking at around 4 times more powerful in terms of real-world performance, by my reckoning.

What's most important is Nintendo's (surprising!) forward thinking in using a GPGPU and what appears to be a similar architecture to the PS4 and 720, meaning that the U should be able to provide up-ports to and receive down-ports from the PS4 and 720, albeit with less impressive eye candy.

You've also got to factor in that, first-party titles aside, the majority of PS4 and 720 titles during the first year of their lives aren't going to outshine U titles by that much, due to the majority of development being done on underpowered/underclocked/unfinished dev kits. The U should have 2 years' worth of superior eye candy compared to competing consoles, and once that third year arrives it should be around halfway through its lifespan, and developers, being more experienced in working with the hardware, should be able to squeeze more out of the box.

Nintendo have a huge advantage in being the first out of the gate next gen, something that did Microsoft a great deal of good this gen. Microsoft and Sony are fools to give them a head start; generational fatigue started setting in a year ago, despite the adoption of motion controls breathing new life into this gen for a minority of people.
 
Just for what it's worth, there's a limit to how much clock frequency doesn't affect performance. A 3.2GHz Pentium 4 was most certainly not lower performing than a 2.0GHz Athlon XP or Athlon 64. At worst, it was slightly faster overall. And the P4 was known to be an architecture with particularly miserable IPC, having been architected with a ridiculous number of pipeline stages to eke out the fastest possible clock. The Xenon doesn't have that particular reputation.

Xenon has in-order execution while P4 has OoOE. From what I've seen, a Xenon core/thread is not better than a P4 when it comes to IPC, despite having the shorter pipeline.
Intel's Ivy Bridge has about 3 times the IPC of a P4.
 
It's not like Harada really disputed the other story. He just said it was blown out of proportion.

We have a studio on record saying it's really great, not pretty good, not alright, not arse... Match that on-record quote with something that says the opposite.

There have definitely been multiple sources saying the overall system specs are very close to current gen.

How interested are you in this argument? List them.
 
It's not like Harada really disputed the other story. He just said it was blown out of proportion.

There have definitely been multiple sources saying the overall system specs are very close to current gen.

"Very close" is very relative. 3-4x PS360 and "very close" may mean the same thing to some.

Have you guys considered that you may be arguing for the same thing from different perspectives?
 

Van Owen

Banned
We have a studio on record saying it's really great, not pretty good, not alright, not arse... Match that on-record quote with something that says the opposite.

Harada saying they have to work around the CPU doesn't count?


How interested are you in this argument? List them.

http://kotaku.com/5920931/the-wii-us-power-problem
http://www.computerandvideogames.co...-as-good-as-ps3-but-its-still-not-as-capable/
http://www.gamesindustry.biz/articl...ess-powerful-than-ps3-xbox-360-developers-say
 

Van Owen

Banned
"Very close" is very relative. 3-4x PS360 and "very close" may mean the same thing to some.

Have you guys considered that you may be arguing for the same thing from different perspectives?

I can believe that the Wii U has 4x the RAM of the 360. I don't believe what the Wii U produces on screen will be 4x better looking than what the 360 can produce.
 
Harada saying they have to work around the CPU doesn't count?

We've already addressed that for you mate.


Thanks.


I can believe that the Wii U has 4x the RAM of the 360. I don't believe what the Wii U produces on screen will be 4x better looking than what the 360 can produce.

You are actually arguing with yourself mate!!! It's mad.
 
3-4x is pretty much what the gang here has speculated/confirmed with bgassassin, Ideaman & others.

The leaked dev kit specs from last year are at least on par with that as well.

The GPU & RAM of the Wii U seem to be the strong suits thus far for the system, with the CPU being out-of-order and not able to be used to its full potential in 360/PS3 ports.

IdeaMan said the visuals would be best described as "360++" simultaneously on the television set and GamePad, and he repeatedly tried to get people away from saying things like "Wii U is [#]X faster than PS360".

3-4x the performance would be like having a 1TFLOPS GPU if you're talking raw numbers, or a much faster GPU if you're talking nebulous "apparent performance".

The Wii U GPU, as most people have been placing it, is more like 2-2.5x the performance of the Xenos, according to the ranges that people like bgassassin have given.
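
For reference on the raw-numbers side (Xenos is commonly cited at roughly 240 GFLOPS theoretical peak; the multipliers themselves are the speculative part being argued here, not confirmed specs):

# Xenos (360 GPU) is commonly cited at ~240 GFLOPS theoretical peak.
# The multipliers are the thread's speculation, not confirmed Wii U specs.
XENOS_GFLOPS = 240

for multiplier in (2.0, 2.5, 3.0, 4.0):
    print(f"{multiplier:g}x Xenos ~ {XENOS_GFLOPS * multiplier:.0f} GFLOPS")
# 2x ~ 480, 2.5x ~ 600, 3x ~ 720, 4x ~ 960:
# "3-4x" is indeed the ~1 TFLOPS ballpark, while 2-2.5x lands around 500-600 GFLOPS.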



Nobody thinks it's super powerful.

His point is that arguing that the performance will be along the low end of the argument continuum isn't trolling.



Xenon has in-order execution while P4 has OoOE. From what I've seen, a Xenon core/thread is not better than a P4 when it comes to IPC, despite having the shorter pipeline.
Intel's Ivy Bridge has about 3 times the IPC of a P4.

I like you. I just wanted to say that this rebuke makes you one of the better posters here in my eyes.

I keep forgetting that it's in-order. It's interesting because OoO and in-order processors don't have peak performance numbers that differ, but the average performance of an OoO processor tends to be closer to the peak unless the compiler for the in-order processor is very good.

My followup question would be: How well tuned was the SDK and compiler for the in-order microarchitecture of the Xenon?
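
To sketch why the compiler matters so much more on an in-order core, here's a toy single-issue model with made-up latencies (nothing to do with the real Xenon pipeline): the in-order pipeline stalls behind the first instruction whose inputs aren't ready, the OoO core slides past it, and a compiler that reorders independent work wins most of that back.

# Toy single-issue pipeline: one instruction can start per cycle, and its result
# becomes available `latency` cycles after it starts. Latencies are made up.
def run(program, in_order):
    """program: list of (name, latency, deps). Returns total cycles to finish."""
    done, issued, cycle = {}, set(), 0
    while len(issued) < len(program):
        for name, latency, deps in program:
            if name in issued:
                continue
            if all(done.get(d, float("inf")) <= cycle for d in deps):
                issued.add(name)
                done[name] = cycle + latency
                break                  # at most one instruction starts per cycle
            if in_order:
                break                  # the oldest stalled instruction blocks the rest
        cycle += 1
    return max(done.values())

# A dependent load/use pair plus two independent instructions.
naive     = [("load", 4, []), ("use", 1, ["load"]), ("a", 1, []), ("b", 1, [])]
scheduled = [("load", 4, []), ("a", 1, []), ("b", 1, []), ("use", 1, ["load"])]

print(run(naive, in_order=True))      # 7: in-order, everything waits behind the load
print(run(naive, in_order=False))     # 5: OoO hardware runs a and b during the stall
print(run(scheduled, in_order=True))  # 5: a compiler that reorders gets the same win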



Lol, we're arguing Van Owen is only interested in trolling a negative view of the Wii U.

This comment seems closer to trolling than anything Van Owen has been posting today. Trolling isn't defined as arguing against the majority opinion.
 

AzaK

Member
I can believe that the Wii U has 4x the RAM of the 360. I don't believe what the Wii U produces on screen will be 4x better looking than what the 360 can produce.

Thing is, whether you're talking about power or perceived results, multipliers mean almost nothing.

Is 4x a combo of sharper textures, better shadows, tessellation, good AA and nice global illumination, or is it 1080p/60fps with twice as many objects on screen?

I can see Wii U doing the former, but not the latter. You could probably argue both of those are 4x in their own way.
 

Sid

Member
We've already addressed that for you mate.



Thanks.
Though one thing you have to keep in mind is that not a single developer with a Wii U devkit would publicly badmouth the Wii U and give it negative press. Even Harada had to backtrack a bit... I'm not saying anything either way.
 
Lol, we're arguing Van Owen is only interested in trolling a negative view of the Wii U.

I dunno, man. What I'm seeing here is the same old, tired Wii U hardware debate. And I don't see a whole lot of trolling, tbh. Just differences in opinion, and an unwillingness on either side to come to a compromise, which, I believe, is causing a whole lot of unnecessary grief.
 

Van Owen

Banned
Though one thing you have to keep in mind is that not a single developer with a Wii U devkit would publicly badmouth the Wii U and give it negative press. Even Harada had to backtrack a bit... I'm not saying anything either way.

This too. Likely everyone is still under NDA. Nobody on record is talking specifics in either direction, just generalities.
 
I can believe that the Wii U has 4x the RAM of the 360. I don't believe what the Wii U produces on screen will be 4x better looking than what the 360 can produce.

Christ on a bike, you're priceless lol. 4 times more powerful does not mean games will be 4 times better looking. Who on earth gave you that impression..? lmfao. Anyone who even entertains that idea is more than a few sandwiches short of a picnic lololol.
 

Shokio

Neo Member
This is what the internet has come to? We're using anonymous unnamed sources to combat the on-record testimony of named developers? Oh LAWD what a world we live in.
 
Though one thing you have to keep in mind is that not a single developer with a Wii U devkit would publicly badmouth the Wii U and give it negative press. Even Harada had to backtrack a bit... I'm not saying anything either way.

This too. Likely everyone is still under NDA. Nobody on record is talking specifics in either direction, just generalities.

Yeah no. Some of them are quoted as "tech insiders" and they still remain anonymous. Others say their games won't be on Wii U because of the CPU, yet you have to prove to Nintendo you will develop games in order to qualify for a dev kit.

At the end of the day, they are anonymous. Anonymous is anonymous is anonymous.

This is what the internet has come to? We're using anonymous unnamed sources to combat the on-record testimony of named developers? Oh LAWD what a world we live in.

Shokio, get your bloody arse in here and lay it down lad ;-)
 