Wii U Speculation Thread 2: Can't take anymore of this!!!

To put it more clearly: in my opinion, there will not be any Wii U stands at GDC at all.

They just need to build up the hype by any means possible, and to do so they won't lift the lid until they decide it's the right time and they can focus the widest public attention on it - most probably at E3, not any sooner.
Nah, there'll probably be dozens of Wii U demo units at GDC, and not only at Nintendo's booth. But even though I'm probably beginning to sound like a broken record: GDC isn't a trade show. It's not an event for building up hype among fans and showing off your latest games.

There was very little press coverage in recent years, and there'll be even less now that there are no keynotes. As far as I understand, press isn't even allowed on the show floor this year unless they get invited by one of the exhibitors, and considering the focus, that probably mostly covers publications like Develop or Gamasutra, not the enthusiast press.
 
I don't know the exact type of memory. But some other info could hint at it (is it super-fast, cutting edge? Normal? Not as quick/efficient as it could/should be considering the setup? (read: an additional screen to handle, etc.)).

But it's in the specific context that I've talked about (third party, v4 dev kit, etc.). In Nintendo's first-party studios, with a more optimized development framework (the latest hardware and software dev kits), with engines tailored specifically for the Wii U, what I would have heard could have been different.

All in all, the hardware will be balanced and enough for what the Wii U will be on the technical side: not a powerhouse at 5x to 10x the Xbox 360 on paper, capable of running complex 2014 3D engines at 1080p, 60fps (or even 30), max AA, etc., a generational leap as big as some older ones. But it will not be an Xbox 360 in disguise, Nintendo make-up applied to current-gen HD tech. And of course, the main interest is really the padlet and all the wonderful gameplay ideas that can come with it.

And it's a matter of perception also: people who stuck with Nintendo consoles will really feel the difference if the few sessions on their friends' PS3s were not enough to accustom their eyes to HD graphics. Hell, even in the case of "PC + Nintendo" gamers (who don't use Dolphin), Big N's visual style is so distinct and so uncommon on computers that even the power users who upgrade their GPU every year will feel the leap on first-party titles (granted, Nintendo will not release a modern-war FPS). They are far more used to seeing what western developers make of the Epic/Crytek/etc. engines, which produce relatively similar aesthetic results once on screen (and from that come all the complaints that 80% of games look the same: occidental/industrial/cold colors/sharp edges/plastic render/etc.), so Nintendo HD games will refresh their view; it will be a very noticeable step for them.

Cheers for the detailed reply.

I think the biggest problem has been deciphering which version of the devkit a particular developer has.

e.g. the latest V5 devkit could have GDDR5, whereas the previous V4 devkit had GDDR3 (not saying this is the case!).

As exciting as it is to speculate about specs/power, all I really care about is what I see on my screen come E3. I'm really not worried at all.
 

I've not said that; even if it's not impossible that a future dev kit will bring a change in the memory department, I don't know.

When I say:

I don't know the exact type of memory. But some other info could hint at it (is it super-fast, cutting edge? Normal? Not as quick/efficient as it could/should be considering the setup? (read: an additional screen to handle, etc.)).

But it's in the specific context that I've talked about (third party, v4 dev kit, etc.). In Nintendo's first-party studios, with a more optimized development framework (the latest hardware and software dev kits), with engines tailored specifically for the Wii U, what I would have heard could have been different.

You have to read: "OK, from what I know, the 2x thing experienced by my sources could hint at the memory performance, with a bit of stretched thinking and speculation of course. But it's relevant to the v4 dev kit/SDK 2, third parties, what they are doing on Wii U, what engine they use, etc." and "if I had heard my info from sources working for Nintendo, in a different context, it could lead me to think the state of the Wii U's memory is better".

Take that with big caution, because the improvements (not huge, according to lherre) in the latest dev kit compared to the v4 ones may come from a specific hardware upgrade, a general small hardware upgrade, an improvement of the SDK, even an optimization of the engine tested, but I bet there is at least a small hardware upgrade.

To sum up: I don't know the type of memory used :p

Oh, and for the "Nintendo-centric website" guys, really, come to the board and read again what we have said; your articles have some inaccurate information that your readers comment on. I don't want, if I come to the USA, to meet a fellow gamer who throws tomatoes at me when I say I am IdeaMan on GAF, because he believed the Wii U had 7GB of RAM.


On second thought, he can throw chocolates :D
 
Just an FYI, that particular site is banned here.
And yeah, they misconstrue what we say here a lot.
D:

Oh you!

Though thinking about it, Crytek were probably referencing the Valve PC-Console when talking about 8GB RAM.



Well, that really isn't fair at all, seeing as it's just a PC and not a console in the slightest.
 

So:
- It might have fast memory
- It might have ok memory
- It might have slow memory
- It's "balanced and enough for the Wii U"
- Nintendo has a better understanding of the architecture
- It's a matter of perception
- It's a noticeable step up from Wii

In other words, you made a huge post about absolutely nothing. Why do people do this?
 

Really, I'm starting to fear what some sites will say based on the little info I gave/will give, after stumbling by chance upon a "Wii U rumor" thread on a board I sometimes follow. I don't have the time to correct everyone, so I'll stay on NeoGAF, but phew, it's intimidating.

It's because of you, little light-sabered pest! It was you who said 8GB! You owe me a Ninja Turtles pizza with pineapple slices! ><
 

I tried to write a more elaborate answer to Milkman than "I don't know the memory type" :D The context could be interesting as well.
Oh, and the "balanced and enough" part is always important. And the perception one is not seen here very often :p
 

Deal. Next time you're lost in the corn fields of Illinois, I'll show up with said pizza.
 
IdeaMan, I was more referencing the various contradictory articles on gaming websites.

I was guessing at the reason for the differences in how much power it supposedly has: dev 1 saying "a bit more than 360 power", dev 2 "2 times 360 power" and dev 3 "4/5 times 360 power".

My simple logic puts this down to each dev having a different version of the devkit.

dev 1= early devkit
dev 2= intermediate devkit
dev 3= latest devkit

I'm probably completely wrong on this, but that is how my simple mind works...!

I really wasn't questioning the things you said. Sorry for any confusion :)
 
The simple answer is that trying to use a single multiplier for the strength of hardware leads to extremely vague, unspecific results that don't accurately portray the real-world performance of said hardware, especially when the people saying these things have widely varied sources, technical knowledge and information available.

The unfortunate thing about hardware is that the only way to get accurate, concrete information on performance is to know the specifics of the components.
 

Oh, no problem at all, it wasn't an interjection concerning you; I'm becoming paranoid now with what I said about other websites distorting what is discussed here, so I try to be very precise :) So no, I don't know the type of memory used, and I never said it was DDR3 or GDDR5 :p

For the dev kits, I expect third parties to have the V4 one, which has already benefited from a noticeable boost in power, and Nintendo and the first parties to work on the "V5" (it has a different code name, apparently). I doubt there are studios in the world stuck with v3 ones. Nintendo surely has a specific way of managing the distribution of development kits; I don't see them considering studio X negligible and confining them to a revision 1. If they are on board, they must have at least v4 dev kits by now.

If people latched onto that because they hope the statements from Arkam (and maybe some other developers) that led them to think the Wii U will be underpowered were a consequence of a v1 dev kit being used by his studio, then: maybe his team had a v3 dev kit at the time, and they have met less difficulty with the v4 one since. But what he said rests MUCH more on other parameters: what his studio was trying to do/achieve with the dev kit, what kind of use they make of the padlet, what their approach is to the graphical/engine part of their project on Wii U (is it a port of an existing engine?), when his impressions were formed (before the engine was optimized at all, or after), etc.
 
Like what?

Like how to create compelling software without dumbing down an experience (see: Dead Space Extraction and Resident Evil: UC).
Or how to use art and atmosphere to overcome technical limitations.
Or how a stable game is preferable to a few shinies that people won't see.
 
Basically, each DRAM chip has a certain interface width. That can be either 16-bit or 32-bit, but nominally it's 32-bit. These chips are hooked up directly to the memory I/O controller on the GPU or CPU (console situation).

The 16-bit mode just allows one to use double the memory chips in the event that you want more memory but the DRAM density isn't high enough. It was not uncommon for GPUs to have 16x 1Gbit GDDR5 in order to offer 2GB, but that was before the 2Gbit chips started showing up; even now, 2Gbit is only starting to appear en masse. Anyways, that's just an example. The same thing happened in earlier generations.

So with that in mind, you have possibilities for memory configuration. For a console chip, the company is going to be mindful of the die size as well as a roadmap for future die reductions. The memory controller I/O width on the GPU or CPU is going to place a minimum die size restriction because that's one of those things that just doesn't shrink with process node.

You might think you can get away with a migration to a smaller bus whilst using higher speed memory in the future, but that's just added risk and QA for developers. Far simpler to keep everything as identical as possible when doing future redesigns.

Anyways, so you have a fixed memory bus width taking up a huge chunk of the chip perimeter, and also keep in mind they are likely going to have another huge chunk for the eDRAM interface.

Clearly, you don't want to have a huge perimeter necessitating a large chip throughout the lifetime of the console. That has a number of implications for cost/yield/power consumption (because you'd not be able to use a newer process node).

--------------

Where the Wii U is concerned, 128-bit is a good starting point, but that means either 4 or 8 DRAM chips. Space is clearly very limited on the motherboard.

People hoping for GDDR5 will be limited to using 2Gbit density since nothing higher is on the roadmap.

DDR3 is far cheaper and consumes much less power, FWIW, plus there are 4Gbit densities available at modest clocks. Four chips would be very good for conserving motherboard space, since you can put two on each side.

Electrical signalling is a bit of a concern for higher clocks when you're packing the wires very tightly, so... that's just another design concern.

Anyways, to achieve 1.5GB of RAM you have two choices, the first of which is the most obvious: add two more 2Gbit DRAMs to the above, for 6 chips total. The 16/32-bit modes are where the 96-bit/192-bit theories come from. If you use 4Gbit DDR3, then you only need 3 chips, so that's... 96-bit.

The second choice isn't so obvious, because you almost never hear about such a configuration: using a mix of DRAM densities. Consider that for 1.5GB of RAM, you really need 12Gbit worth of DRAM. This can be done with 4 chips (128-bit): 2x 4Gbit + 2x 2Gbit. To my knowledge, only one PC SKU has ever used such a mix of chips (the GTX 550 Ti, 192-bit, 1GB SKU).
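If you want to sanity-check those combinations yourself, here's a quick back-of-the-envelope sketch (Python; the helper and the chip lists are just my own illustration of the hypothetical configurations above, not confirmed hardware):

# Rough capacity/bus-width arithmetic for the DRAM configurations above.
# Each chip is (density in Gbit, interface width in bits); all hypothetical.
def config(chips):
    capacity_gb = sum(d for d, _ in chips) / 8  # 8 Gbit = 1 GB
    bus_bits = sum(w for _, w in chips)         # per-chip widths add up to the bus
    return capacity_gb, bus_bits

print(config([(2, 16)] * 6))  # 6x 2Gbit in 16-bit mode -> (1.5, 96)
print(config([(2, 32)] * 6))  # 6x 2Gbit in 32-bit mode -> (1.5, 192)
print(config([(4, 32)] * 3))  # 3x 4Gbit DDR3 -> (1.5, 96)
print(config([(4, 32), (4, 32), (2, 32), (2, 32)]))  # mixed densities -> (1.5, 128)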


--------------

PC memory modules are a different situation with memory addressing, allowing you to hook up to 16 DRAMs on a DIMM to a 64-bit channel to the CPU. I digress.

Thanks a bunch for explaining this better than I could.

Right now, I'm sticking by my 2 GB DDR3 prediction on a 64-bit bus. Yes, that would look bad on paper. Since that DDR3 would likely be clocked at 600-650 MHz (also about where I would place the GPU speed), that would match the rumor of "some things being worse than Xbox 360." But the bandwidth would be near 2x regardless, so performance would be better overall... in theory. :D
 
Like how to create compelling software without dumbing down an experience (see: Dead Space Extraction and Resident Evil: UC).
Or how to use art and atmosphere to overcome technical limitations.
Or how a stable game is preferable to a few shinies that people won't see.

You should become a developer.
 
Right now, I'm sticking by my 2 GB DDR3 prediction on a 64-bit bus. Yes, that would look bad on paper.

No, that's just bad, period. You sure you meant to type 64-bit?
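Rough peak numbers, for anyone who wants to check the math (a Python sketch; the 650 MHz I/O clock is just the guess from the post above, and I'm assuming DDR3's two transfers per I/O clock):

# Peak theoretical bandwidth = bus width in bytes x I/O clock x transfers per clock.
def peak_bandwidth_gbs(bus_bits, io_clock_mhz, transfers_per_clock=2):
    return (bus_bits / 8) * (io_clock_mhz * 1e6) * transfers_per_clock / 1e9

print(peak_bandwidth_gbs(64, 650))   # ~10.4 GB/s on the predicted 64-bit bus
print(peak_bandwidth_gbs(128, 650))  # ~20.8 GB/s if the bus were 128-bit
print(peak_bandwidth_gbs(128, 700))  # 22.4 GB/s, the Xbox 360's GDDR3 peak, for comparison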
 
The unfortunate thing about hardware is that the only way to get accurate, concrete information on performance is to know the specifics of the components.
And their interconnections, and their peak/average/minimum rates of communication, and, and, and.

See, the one issue I had with Arkam's "slightly less than 360" (r) (tm) statement is that for a technically literate person (read: a coder) to say that, the kit would have to be a verbatim copy of the 360 with just some parameter(s) slightly toned down. Of course, later on Arkam himself said he was not technical staff, ergo we should not hold him accountable for such definitive statements, at which point Arkam became yet another "second-hand info" source. Which is perfectly fine, but sort of worn out by now.
 
And then we were toast.

*puts on glasses*

Warm buttery toast.


You should become a developer.

Nah, but I could be a consultant.
:P
It just amazes me that such creative companies can make such uncreative decisions.
 
You need to aim high. So how would you change the Zelda franchise?

Honestly, it doesn't need a lot of changes.
People are like "OMG! COMPLETE OVERHAUL!"
But why? Then it's not Zelda anymore.
There are only two things it needs.

1. Less handholding. Take it back to Zelda 1, and a bit like LTTP: just set you out on an adventure without a giant setup. Have you learn by doing.

2. A more involved and intricate world. It's not worth saving if you don't care about it. Make me care about the world I'm saving. Games like Okami did this beautifully through memorable locations and characters.

Everything else, from the dungeons to the items, is pretty much fine. No need to change staples that are never used in other games, especially when they work.
 
It just amazes me that such creative companies can make such uncreative decisions.

Well put.
But I think this happens when you don't have the guts to follow your own vision of the game and instead chase some imagined audience. In most studios, it seems like "creativity" is held hostage.
 