Rumor: Wii U final specs

Can't we infer more from the performance of the games initially being ported to or developed for the system than from whether it uses an overclocked Wii CPU or some other PowerPC?

Launch titles where the developers have no middleware, little to no documentation, and don't even know the final specs until halfway through development? Of course not.
 
Some clarifications to stop the circles, from someone who "knows nothing" (me)

1) It's not a 600 GFLOPS GPU. It's less. But not current-gen less.
2) The GPU7 architecture is firmly based on VLIW R700, but the final result is not a stock R700 as has been trumpeted over and over.
3) The power supply has been confirmed by eyewitnesses to have a 75 W max output, and, like all power supplies in the console space, it will not be running at a constant 100%.

Bonus note: "enhanced broadway" also doesn't mean "let's take our previous CPU, duct tape a few of them together, and call it a day" either. The CPUs are customized creatures as well, so "enhanced broadway" is about as accurate as "R700".
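On point 3: a rated max output only bounds the ceiling; actual draw sits well below it. A rough back-of-envelope sketch, where the 75 W figure is from the post above and the load factors are purely my assumptions:

```python
# Rough sketch: sustained draw implied by a PSU's rated output.
# The 75 W rating is from the post above; the load factors are
# assumptions, not measurements -- console supplies keep headroom
# and are never run at a constant 100%.

PSU_RATED_WATTS = 75.0

load_factors = {  # hypothetical workloads
    "idle / menus": 0.35,
    "typical gameplay": 0.55,
    "worst-case stress": 0.80,
}

for workload, factor in load_factors.items():
    print(f"{workload:>18}: ~{PSU_RATED_WATTS * factor:.0f} W at the wall")
```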
 
Well, according to Ideaman, his sources have their port running on Wii U with AA, extra graphical bells and whistles, and double the framerate.

He won't say what game though...
 
Some clarifications to stop the circles, from someone who "knows nothing" (me)

1) It's not a 600 GFLOPS GPU. It's less. But not current-gen less.
2) The GPU7 architecture is firmly based on VLIW R700, but the final result is not a stock R700 as has been trumpeted over and over.
3) The power supply has been confirmed by eyewitnesses to have a 75 W max output, and, like all power supplies in the console space, it will not be running at a constant 100%.

Bonus note: "enhanced broadway" also doesn't mean "let's take our previous CPU, duct tape a few of them together, and call it a day" either. The CPUs are customized creatures as well, so "enhanced broadway" is about as accurate as "R700".

I assume by "less" you mean considerably less, and not, say, 25 less.
:(
 
3) The power supply has been confirmed by eyewitnesses to have a 75 W max output, and, like all power supplies in the console space, it will not be running at a constant 100%.
We've all seen the pictures already.

Tell your eyewitnesses to hook up a watt-meter.
 
Obviously we're talking custom silicon, but ballpark-wise we're talking the HD 4650-4670 range? TDP would seem to line up with that on an R700-style chip.
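For reference, theoretical throughput on these VLIW parts is just stream processors x 2 ops per cycle (multiply-add) x clock. A quick sanity check using the stock retail figures for those two cards (nothing here is a confirmed console spec):

```python
# Theoretical single-precision throughput for VLIW5 parts:
#   GFLOPS = stream processors * 2 ops/cycle (multiply-add) * clock in GHz
# Stock retail figures for the RV730 cards; not confirmed console specs.

def gflops(stream_processors: int, clock_ghz: float) -> float:
    return stream_processors * 2 * clock_ghz

print(f"HD 4650: {gflops(320, 0.600):.0f} GFLOPS")  # ~384
print(f"HD 4670: {gflops(320, 0.750):.0f} GFLOPS")  # ~480
```

Both land below the 600 GFLOPS figure dismissed earlier while staying comfortably above current-gen numbers, which is why this range keeps coming up.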
 
Well, according to Ideaman, his sources have their port running on Wii U with AA, extra graphical bells and whistles, and double the framerate.

He won't say what game though...

Well, it does have 4x the memory and a GPGPU with maybe 2x the GFLOPS, plus 32 MB of eDRAM, so I'm sure it will outclass the PS3 and Xbox 360 in a lot of games.
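Putting rough numbers on that (the 360 figures are the commonly cited ones; the Wii U column just applies the multipliers from this post, so treat it as rumor math, not fact):

```python
# Rumor math: apply the post's multipliers to commonly cited 360 figures.
# The Wii U column is speculation from this thread, not a confirmed spec.

xbox360 = {"RAM (MB)": 512, "GPU GFLOPS": 240, "eDRAM (MB)": 10}
rumored_wiiu = {
    "RAM (MB)": xbox360["RAM (MB)"] * 4,      # "4x the memory"
    "GPU GFLOPS": xbox360["GPU GFLOPS"] * 2,  # "maybe 2x the GFLOPS"
    "eDRAM (MB)": 32,                         # "32 MB of eDRAM"
}

for key in xbox360:
    print(f"{key:>10}: 360 = {xbox360[key]:>4} | rumored Wii U = {rumored_wiiu[key]:>4}")
```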
 
Do people really think MS and Sony are going to blow 160 W of their power budget on the GPU?

Didn't they do that with the PS3 and Xbox 360? Plus they'll probably downclock it a little for that very reason. But we could easily get a 2.5 TFLOPS GPU in the PS4/720 if they use a downclocked SI 8000-series GPU!
 
Didn't they do that with the PS3 and Xbox 360?
Nope. Around half that. The rest was for the CPUs.
JohnnySasaki said:
Plus they'll probably downclock it a little for that very reason. But we could easily get a 2.5 TFLOPS GPU in the PS4/720 if they use a downclocked SI 8000-series GPU!
Smaller chips at high clocks tend to be cheaper than huge chips at slow clocks, even if aggregate performance comes out the same. Lots of tradeoffs to make there.

I don't think we'll see a full version of any high-end PC GPU in any console ever again. The PC market has become too crazy to still make easy transitions like that.
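The tradeoff falls straight out of the same throughput formula: two hypothetical chips can land on identical FLOPS with very different die sizes, and die size is what drives cost and yield. Illustrative numbers only, not any actual chip:

```python
# Two hypothetical configs with identical theoretical throughput.
# Illustrative numbers only. Die area scales roughly with shader count,
# which is why the small-and-fast chip tends to be the cheaper one.

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

configs = {
    "small die, high clock": (1280, 1.00),
    "huge die, low clock":   (2560, 0.50),
}

for name, (shaders, clock) in configs.items():
    print(f"{name}: {tflops(shaders, clock):.2f} TFLOPS")
# Both print 2.56 TFLOPS, but the big chip costs more per unit and
# yields worse -- hence the tradeoffs mentioned above.
```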
 
I should start ending all of my posts with four exclamation points!!!!

Seriously though, once the texture is in the RAM, couldn't it be used on both? I don't see why the system would have to load duplicates of the same thing... running the engine twice I understand for different angles/scenes, but textures?

Maybe I'm just not understanding how it works, but would you have to?
 
I should start ending all of my posts with four exclamation points!!!!

Seriously though, once the texture is in the RAM, couldn't it be used on both? I don't see why the system would have to load duplicates of the same thing... running the engine twice I understand for different angles/scenes, but textures?

Maybe I'm just not understanding how it works, but would you have to?
If the two screens are showing different views of the same scene then textures would be shared. If it's entirely different scenes then it could be entirely different textures. All of the games we've seen material from so far are of the first type.

ed: Wait, P100 has this indoor mini-game on the pad while the main screen is showing the standard view. That could potentially use fairly different assets for the two views.
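A minimal sketch of why the shared-scene case costs nothing extra: if textures live in a cache keyed by asset name, a second view asking for the same asset just gets the existing copy. All names here are hypothetical, purely to illustrate:

```python
# Minimal sketch of texture sharing between two views of one scene.
# Names and structure are hypothetical; real engines key GPU resources
# in a similar way.

texture_cache: dict[str, str] = {}

def load_texture(name: str) -> str:
    # Later requests for the same asset hit the cache, so two views
    # of one scene hold a single copy in RAM.
    if name not in texture_cache:
        texture_cache[name] = f"<gpu texture: {name}>"  # stand-in for a real upload
    return texture_cache[name]

tv_view  = [load_texture(n) for n in ("wall", "floor", "crate")]
pad_view = [load_texture(n) for n in ("wall", "floor")]  # same scene, reuses entries

print(len(texture_cache))  # 3, not 5, despite two views
```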
 
I don't know what they'd say specifically, but I'd say that whatever comes closest to matching Sony's original spec (AMD HD 7790, 18 CUs / ~1152 SPUs) is the tippy-top end for next gen.
Why?

As we know from the Wii U and the 360, specs often change this far out.

And I even remember a thread a few weeks back stating that the specs aren't final yet. Of course it was a rumor, but so is everything these days.
 
Why?

As we know from the Wii U and the 360, specs often change this far out.

And I even remember a thread a few weeks back stating that the specs aren't final yet. Of course it was a rumor, but so is everything these days.

Can I see a link to that rumor? :O.
 
Can I see a link to that rumor? :O.
I honestly only remember it in the back of my mind from browsing a little while back.

I literally read it and was like, oh that's interesting, wonder if that's true, and then moved on.

I could very well be going cuckoo from college, work, and lack of sleep, though, and have misread the thread.
 
If the two screens are showing different views of the same scene then textures would be shared. If it's entirely different scenes then it could be entirely different textures. All of the games we've seen material from so far are of the first type.

ed: Wait, P100 has this indoor mini-game on the pad while the main screen is showing the standard view. That could potentially use fairly different assets for the two views.

Ok, gotcha, so Black Ops II local multiplayer would have dual texture loads?
 
Ok, gotcha, so Black Ops II local multiplayer would have dual texture loads?
No, BLOPS2 is a prime example where the assets will be shared between the two views as the entire map's assets are all pre-loaded.

ed: I see what may have misled you. A scene is a term which is more akin to 'a level' than to 'a camera view'. You can be sitting in a scene, looking all around and it's the same set of assets in memory, just some of them get clipped by the visibility checks.
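A sketch of that distinction: visibility checks decide what gets drawn each frame, while the scene's assets stay resident in memory regardless. Hypothetical structure, just to make the point concrete:

```python
# Sketch: visibility culling picks what to DRAW per view; it never
# unloads anything. The whole level's assets stay resident in memory.
# Structure is hypothetical and purely illustrative.

scene_assets = {"wall", "floor", "crate", "skybox", "enemy"}  # all pre-loaded

def visible_from(view: str) -> set[str]:
    # Stand-in for a real frustum/occlusion check.
    facing = {"north": {"wall", "skybox"}, "south": {"floor", "crate", "enemy"}}
    return scene_assets & facing[view]

for view in ("north", "south"):
    print(f"view {view}: draw {sorted(visible_from(view))}; "
          f"resident assets = {len(scene_assets)}")
# The memory footprint never changes; only the draw list does.
```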
 
Why?

As we know from the Wii U and the 360, specs often change this far out.

And I even remember a thread a few weeks back stating that the specs aren't final yet. Of course it was a rumor, but so is everything these days.

Dev kit specs change. Often. Don't look at dev kits as an indicator of what's going to be inside the console.

Target specs changing? No... Not so often. Not late in the game, anyway. WYSIWYG is their general rule of thumb.
 
No, BLOPS2 is a prime example where the assets will be shared between the two views as the entire map's assets are all pre-loaded.

Right, now I'm trying to think of a game that would use loads of textures on both screens... or better yet, why a game would. You can't really focus on two unrelated dynamic environments at the same time, unless you're playing a local co-op game in which both players have the full single player experience simultaneously, one on the gamepad, the other on the TV. That idea is probably better suited for a different thread though.

Edit:
ed: I see what may have misled you. A scene is a term which is more akin to 'a level' than to 'a camera view'. You can be sitting in a scene, looking all around and it's the same set of assets in memory, just some of them get clipped by the visibility checks.

Gotcha, for some reason I thought the textures were dropped from RAM during visibility checks. I am clearly not a dev, lol
 
What do you mean?

I didn't mean it as a hard limit so much as "that's a pretty damn awesome baseline to work with". It's like how, when the Wii U speculation began, I had most of my eggs in the "performance similar to an RV740 with Nintendo-isms" basket, which even now seems to hold pretty well.
 