Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

Well, then you're still talking about different scenes aren't you? We could argue semantics, but this is not the different perspective game that IdeaMan was talking about. Whatever the case, that really doesn't yet represent twice the CPU workload either. The CPU does a lot more than preparing the graphics scene.

I'd applaud games that would do what you say, but I really think that most games will put some inventory/map/puzzle/HUD/whatever on the screen. Having two different 3D perspectives will tax the hardware, so I hope Nintendo overcompensates on fillrate in the Wii U GPU so that it can do it better. Saying it's a nose dive is exaggerating it though, it's not as if Nintendo doesn't know what they're doing. Also, we just established that a different viewport in the same scene does not tax the CPU at all.
 
Well, then you're still talking about different scenes aren't you? We could argue semantics, but this is not the different perspective game that IdeaMan was talking about. Whatever the case, that really doesn't yet represent twice the CPU workload either. The CPU does a lot more than preparing the graphics scene.

I'd applaud games that would do what you say, but I really think that most games will put some inventory/map/puzzle/HUD/whatever on the screen. Having two different 3D perspectives will tax the hardware, so I hope Nintendo overcompensates on fillrate in the Wii U GPU so that it can do it better.

So do I.
 
It's an interesting step, but it suggests to me that things aren't as cozy in support land. Microsoft and Sony aren't paying for all the dev tools for developers on their platforms.

I disagree. We know things are cosy in support land due to the titles that have been announced so far from big publishers and developers. Imo, Nintendo are making efforts not only to encourage smaller, independent developers to develop for the U but also to give them the tools needed to produce software of a higher quality.

This is, again imo, an effort to raise the quality of U titles for sale on the eShop.
 
I don't think that will be the case at all. What you're talking about is pretty much like having split screen coop, except the other screen is on the controller.

If you have a separate high-quality screen (the padlet), then the hardware should be powerful enough to allow developers to use the extra screen to complete their vision.

Here are some examples that I'd like to see:-

Rear view cam - racing/driving games.
Spy cams - first person shooters.
A periscope in a submarine game.
A virtual x-ray machine - move it around to see through walls.


etc...
 
I disagree. We know things are cosy in support land due to the titles that have been announced so far from big publishers and developers. Imo, Nintendo are making efforts not only to encourage smaller, independent developers to develop for the U but also to give them the tools needed to produce software of a higher quality.

This is, again imo, an effort to raise the quality of U titles for sale on the eShop.

Well said. Now if only Nintendo could get a deal with EPIC or CRYTEK to license their engines for 3rd-party use in return for maybe $1 from each title developed with said engine.
 
Well, then you're still talking about different scenes aren't you? We could argue semantics, but this is not the different perspective game that IdeaMan was talking about. Whatever the case, that really doesn't yet represent twice the CPU workload either. The CPU does a lot more than preparing the graphics scene.

I'd applaud games that would do what you say, but I really think that most games will put some inventory/map/puzzle/HUD/whatever on the screen. Having two different 3D perspectives will tax the hardware, so I hope Nintendo overcompensates on fillrate in the Wii U GPU so that it can do it better. Saying it's a nose dive is exaggerating it though, it's not as if Nintendo doesn't know what they're doing. Also, we just established that a different viewport in the same scene does not tax the CPU at all.

I feel the same way. The PS3 and 360 each have 4000 Mpixels/sec (4 Gpixels/sec), so I hope Nintendo goes for at least 8000~10,000 Mpixels/sec. It's very important that the Wii U GPU has at least 16 ROPs.

I do realize fillrate is also dependent on clockspeed, but starting out with only 8 ROPs would mean Nintendo and AMD would have to clock the Wii U GPU at 1 GHz to double Xenos and RSX fillrate, which I don't see happening.
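For anyone who wants to sanity-check those numbers, the back-of-the-envelope formula is just pixel fillrate = ROPs x core clock. A tiny sketch (the Wii U configurations are the rumoured/assumed ones from this thread, not confirmed specs):

#include <cstdio>

// Theoretical pixel fillrate in Mpixels/s = ROP count * core clock in MHz.
static double fillrate_mpix(int rops, double clock_mhz) {
    return rops * clock_mhz;
}

int main() {
    std::printf("Xenos (8 ROPs @ 500 MHz): %.0f Mpix/s\n", fillrate_mpix(8, 500.0));   // 4000
    // Hypothetical Wii U configurations (assumptions, not specs):
    std::printf("8 ROPs @ 1 GHz:           %.0f Mpix/s\n", fillrate_mpix(8, 1000.0));  // 8000, double Xenos
    std::printf("16 ROPs @ 500 MHz:        %.0f Mpix/s\n", fillrate_mpix(16, 500.0));  // 8000
    std::printf("16 ROPs @ 600 MHz:        %.0f Mpix/s\n", fillrate_mpix(16, 600.0));  // 9600
    return 0;
}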
 
I disagree. We know things are cosy in support land due to the titles that have been announced so far from big publishers and developers. Imo, Nintendo are making efforts not only to encourage smaller, independent developers to develop for the U but also to give them the tools needed to produce software of a higher quality.

This is, again imo, an effort to raise the quality of U titles for sale on the eShop.
That seems to be the point. Activision or EA probably already have volume licenses for all the tools, and if they don't, compared to the overall budget of current projects, the license fees wouldn't really matter either way.
 
Really? If it's that simple, where does the GPU get all the information from to render the scene? It has to be generated by the game engine, which uses the CPU.

I don't think the CPU handles handing items in memory over to the GPU. I do believe (and perhaps someone with a bit more knowledge can correct me) that stuff like that would be controlled by the memory controller. If I understand what you're trying to say and what people are arguing about, stuff like the polygon models, the texture info and that kind of thing would be stored in RAM and then sent to the GPU by the memory controller, not to the CPU and then to the GPU.

There are some things attributed to the graphics process that the CPU would cover, like the polygon mesh deformations of characters as they move, but that's different. I don't think the process is for the GPU to go "I can haz texture maps?" and then for the CPU to fetch them, say "ok, here you go" and send them to the GPU. I would imagine it goes through the memory controller.

Hey, I could be wrong though; like I said, maybe Blu or Brain Stew could chime in on it. I don't think the CPU is doing as much as you think it is.
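For what it's worth, that's roughly how modern graphics APIs behave from the programmer's side. A small desktop OpenGL sketch (illustrative only, since the actual Wii U API isn't public; assumes a GL 3.x context with a loader like GLEW): the CPU uploads the texture into GPU-accessible memory once at load time, and per-frame draws only reference it by handle, so the GPU/memory subsystem fetches the texels without the CPU shuttling them around.

#include <GL/glew.h>

// Load time (once): the driver copies the texel data into GPU-accessible
// memory; after this the CPU-side copy can be freed.
GLuint loadTexture(const void* pixels, int w, int h) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    return tex;
}

// Every frame: the CPU only binds handles and issues the draw call.
// The GPU reads vertices and texels from video/unified memory on its own;
// no per-frame texture traffic flows through the CPU.
void drawFrame(GLuint tex, GLuint vao, GLsizei indexCount) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glBindVertexArray(vao);
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, nullptr);
}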
 
Sadly that just won't happen. Those licenses cost far too much.

Not really. If, let's say, EPIC gets $1 for every copy of a game that uses their engine, it can more than cover the cost of licensing. Let's say game 'A' uses UE3 and ends up selling a million copies; that right there is $1,000,000 for EPIC. The sheer volume of games that use the engine will offset the high licensing cost.
 
PAUL GALE! You promised info on a Wii U/3DS title...whatever happened to it!?
BLARGAHRGAHGRHRGH
 
Sega is restructuring and there will be a "reduction of number of titles".

Here's the announcement in English:

2. Descriptions of measures

(1) streamline organizations
We will streamline organization in the U.S. and Europe home video game software. This will
create a smaller company positioned for sustained profitability.

(2) Reduction of number of titles
We conducted detailed reviews of earnings projections for titles targeted toward the U.S. and
European markets and decided to narrow down sales titles from the following period and after to
strong IPs, such as “Sonic the Hedgehog”, “Football Manager”, “Total War” and “Aliens”, which
are expected to continue posting solid earnings. In accordance with this, we are canceling the
development of some game software titles.

So Aliens: Colonial Marines for Wii U should be safe. I hope there wasn't a small/niche/exclusive, unheard-of title planned for the Big N's system that got canceled in the process.
 
Not really. If, let's say, EPIC gets $1 for every copy of a game that uses their engine, it can more than cover the cost of licensing. Let's say game 'A' uses UE3 and ends up selling a million copies; that right there is $1,000,000 for EPIC. The sheer volume of games that use the engine will offset the high licensing cost.

Isn't UE3 free? I thought it was, with EPIC just billing for support?
 
Isn't UE3 free? I thought it was, with EPIC just billing for support?

No, there are royalties on the revenue of games developed with the Unreal Development Kit once they exceed a small amount (every "ambitious" title, supported by a rather big studio, with at least anecdotal sales, must pay royalties, so I guess it's better for them to buy the license directly rather than use the free version).
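To put rough numbers on that model (the $99 seat fee, $50,000 threshold and 25% rate are the commonly quoted UDK terms of that era; treat them as illustrative assumptions and see the link below for the actual conditions):

#include <algorithm>
#include <cstdio>

// Illustrative UDK-style cost: a flat per-seat fee plus a royalty on game
// revenue above a threshold. All numbers here are assumptions for illustration.
double udkCost(double revenue, int seats) {
    const double perSeatFee  = 99.0;     // assumed upfront fee per seat
    const double threshold   = 50000.0;  // assumed royalty-free revenue
    const double royaltyRate = 0.25;     // assumed rate above the threshold
    return seats * perSeatFee + std::max(0.0, revenue - threshold) * royaltyRate;
}

int main() {
    // A small hobby title stays essentially royalty-free...
    std::printf("$30k revenue, 2 seats:  $%.0f\n", udkCost(30000.0, 2));      // ~$198
    // ...while an "ambitious" title's royalty quickly dwarfs a full license,
    // which is why bigger studios negotiate a license instead of using UDK.
    std::printf("$5M revenue, 20 seats:  $%.0f\n", udkCost(5000000.0, 20));   // ~$1.24M
    return 0;
}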

http://www.udk.com/licensing.html
 
I don't think the CPU handles handing items in memory over to the GPU. I do believe (and perhaps someone with a bit more knowledge can correct me) that stuff like that would be controlled by the memory controller. If I understand what you're trying to say and what people are arguing about, stuff like the polygon models, the texture info and that kind of thing would be stored in RAM and then sent to the GPU by the memory controller, not to the CPU and then to the GPU.

There are some things attributed to the graphics process that the CPU would cover, like the polygon mesh deformations of characters as they move, but that's different. I don't think the process is for the GPU to go "I can haz texture maps?" and then for the CPU to fetch them, say "ok, here you go" and send them to the GPU. I would imagine it goes through the memory controller.

Hey, I could be wrong though; like I said, maybe Blu or Brain Stew could chime in on it. I don't think the CPU is doing as much as you think it is.
Sending the same scene to another viewport can double the number of drawcalls (i.e. a single draw command and its corresponding state, as issued by the CPU). Depending on how many drawcalls a single instance of the scene constitutes and the specifics of a drawcall as per the GPU API/HAL, that might become an issue if an excessive number of drawcalls were reached. Normally, though, the drawcall count is one of the things developers keep close tabs on, and moreover, console API drawcalls tend to be many times more efficient than desktop API drawcalls - a repeat draw of the same scene on a console can be as cheap as instructing the GPU to "re-execute these command buffers here" while pointing it to a new set of uniforms (i.e. shader constant parameters). Anyway, I would be surprised if the garden demo resulted in a taxing number of drawcalls, particularly to a degree where the CPU/GPU interactions or the CPU alone would become stressed. But that's just my guess.
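To make the "same scene, second viewport" case concrete, here's a rough sketch of the CPU-side work involved, using desktop OpenGL purely for illustration (the console equivalent would be replaying a prerecorded command buffer while pointing the GPU at a new set of uniforms). All the names below are made up for the example:

#include <GL/glew.h>

// Draw an already-prepared scene twice: same geometry, same shaders, same
// state - only the viewport and the camera matrix (a shader uniform) change.
// In a real game each view would also bind its own render target; the point
// here is that the CPU-side cost is just one extra batch of draw calls.
void drawBothViews(GLuint program, GLint viewProjLoc,
                   GLuint sceneVao, GLsizei indexCount,
                   const float* tvViewProj, const float* padViewProj) {
    glUseProgram(program);
    glBindVertexArray(sceneVao);

    // View 1: the TV.
    glViewport(0, 0, 1280, 720);
    glUniformMatrix4fv(viewProjLoc, 1, GL_FALSE, tvViewProj);
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);

    // View 2: the controller screen, same scene from a different camera.
    glViewport(0, 0, 854, 480);
    glUniformMatrix4fv(viewProjLoc, 1, GL_FALSE, padViewProj);
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);
}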
 
Wii U will be Nintendo's last console.

Feeding the troll...

Yeah, not likely. They are way too stable to make this their last home console, plus their handheld market will continue for years to come.

Hardware is where Nintendo are at yo.

Sony on the other hand...

Microsoft will never go under due to sheer investment.
 
Sound processing is often underestimated. We're talking realtime 3D surround mixing with up to several dozen channels, dozens or even hundreds of compressed samples, not to mention sample rate conversions and effects. The audio thread is typically the highest-priority thread as well and needs privileged memory access, as stutter or other audio artifacts are usually far more noticeable and distracting than framerate drops. Our ears are more unforgiving than our eyes.

It's actually crazy that some people underestimate the impact of a dedicated DSP. From beyond3D:

bkillian said:
On the XBox 360, audio mixing for a normal game can use as much as two full hardware threads, 1/3 of the CPU dedicated to audio. That's not even considering complex games like car racers, where each vehicle can have dozens of voices and complex filters. That's hugely wasteful in terms of cost. A general purpose CPU is just not optimised for audio processing.

The difference is that a CPU core costs you dollars, and a DSP core costs you pennies. That's why almost all mobile architectures have dedicated audio silicon. By far the most popular request my team gets from game devs is "Make audio cheaper!". Just running a single good reverb will completely blow the L1 and L2 cache, and require ridiculous amounts of memory bandwidth. We support 320 simultaneous voices on the 360 (that's how many the XMA hardware will decode at a time). AAA games use all of them. Even Plants vs. Zombies uses over 100 and most of a core, _just for audio_. If we were to have the choice of adding a full core to the 360 or a DSP that can handle the same load, we'd probably choose the DSP, since it would be vastly cheaper in BOM and essentially give devs an extra 50% CPU for game logic.

In other words, I'm not sure how some developers could possibly be getting *less* out of the CPU, bar no optimization at all. It has triple the cache, is OoO, and at the very least has helper hardware to offload things like audio. Even at a lower clock, said CPU would produce better results.
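To get a feel for the per-voice cost bkillian is describing, here's a toy software mixer (plain C++, hypothetical types; real engines additionally decode compressed samples, resample, filter and run reverb per voice, which is what really eats cache and bandwidth):

#include <cstddef>
#include <vector>

struct Voice {
    const float* samples;  // already-decoded sample data (decoding is extra work)
    std::size_t  length;
    std::size_t  cursor;
    float        gainL, gainR;
};

// Mix every active voice into a stereo output buffer. Even this bare-bones
// loop walks each voice's sample memory every audio block; with 100+ voices
// plus per-voice DSP effects, the CPU time and cache/bandwidth pressure add
// up quickly - exactly the load a dedicated audio DSP would absorb.
void mix(std::vector<Voice>& voices, float* outLR, std::size_t frames) {
    for (std::size_t i = 0; i < frames * 2; ++i) outLR[i] = 0.0f;
    for (Voice& v : voices) {
        for (std::size_t i = 0; i < frames && v.cursor < v.length; ++i, ++v.cursor) {
            const float s = v.samples[v.cursor];
            outLR[2 * i]     += s * v.gainL;
            outLR[2 * i + 1] += s * v.gainR;
        }
    }
}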

Wii U will be Nintendo's last console.

If this is Nintendo's last console, it's everyone's last console. In the traditional sense of the word, at least.
 
Wii U will be Nintendo's last console.

In which case, I exit the world of console gaming and continue to work on my Nintendo collection. They're approaching 30 years of output in the home console space, and I'll be more than happy to live off of that gigantic library.

I won't lie... seeing the gaming industry taken over by non-gaming-centric companies that are in it for ulterior motives? Companies that would leave gaming in a heartbeat if it served their long-term goals? It puts me off a bit.
 
I understand your viewpoint, but if the visuals on the main screen are only 10% better than the Xbox 360's then, as far as I'm concerned, it's on par. The fact that it's also rendering to the padlet doesn't matter.

One thing does concern me. If the CPU has the same performance as the 360's (based on rumour), and it's working on scenes for both screen and padlet, I can see that in some instances games will perform less well than on PS360.

The CPU won't have the same performance as the 360's. Just because they both have 3 cores, 2 SMT threads per core and perhaps even a similar clockspeed doesn't mean they'll perform equally. The U's CPU has several major advantages:

1) It has 3 times the amount of eDRAM

2) Multithreading in CPUs has progressed during the last 7 years

3) The U's CPU has OoOE rather than IoE, meaning it's a great deal more efficient



With regards to the GPU, I can't see them releasing anything other than a 640 SPU chip clocked at less than 600MHz with a decent tessellation unit. Who knows, if they can manage to get the GPU on a 28nm process (which isn't beyond the realms of possibility) we could see them clocking it as high as 800MHz.
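For scale, the usual rough shader-throughput estimate for an AMD part of that era is stream processors x 2 FLOPs per clock (one multiply-add) x clock. Applied to the hypothetical configurations above (assumptions, not confirmed specs):

#include <cstdio>

// Rough theoretical throughput: SPs * 2 FLOPs/clock (one MAD) * clock in GHz.
static double gflops(int streamProcessors, double clockGHz) {
    return streamProcessors * 2.0 * clockGHz;
}

int main() {
    // Hypothetical Wii U GPU configurations from the post above.
    std::printf("640 SPs @ 600 MHz: %.0f GFLOPS\n", gflops(640, 0.6));  // 768
    std::printf("640 SPs @ 800 MHz: %.0f GFLOPS\n", gflops(640, 0.8));  // 1024
    return 0;
}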


Memory-wise, I'm thinking they currently have 1.5GB of DDR3 with the eDRAM on the CPU and GPU taking care of any bottlenecking issues, and it wouldn't surprise me if they bung in an extra 512MB at the last minute to make it 2GB... the 360 and 3DS both had last-minute increases in RAM, so I've got a feeling in my bones that we'll end up with 2GB when it's released.
 
The patent. If you go back and check it out, the I/O controller, at least in the patent, plays a rather significant role in Wii U. It also talked about the controller and console having hardware codecs. At least based on that, Nintendo seem to have done a lot to keep the CPU and GPU free to do what they're primarily supposed to do.

A very "Nintendo-esque" hardware design trait
 