
Wii U Community Thread


Pineconn

Member
I have a feeling I'll never make a thread on NeoGAF, hah. I'd be far too terrified to do it.

Ah well, good thing I'm still just a junior! :D
 

Drago

Member
Alright, how's this for the OP?

Shin'en Multimedia talks Wii U (thread title)

http://www.superphillipcentral.com/2...d-linzner.html

SP: Do you currently have a Wii U devkit? If so, has the system exceeded your expectations?

ML: We have a few Wii U devkits since quite some time. We even almost finished our first Wii U game. The kits exceeded our expectations in every way and we still learn how to get the best performance out of it. A good thing about Wii U is that it's very easy to develop for. You don't have so many limitations like on the other current gen platforms. For us the Wii U is the perfect platform with enough horsepower for many years to come. I think we currently only tapped 20% of its potential and our first game already looks and plays brilliant.
SP: What kind of potential for new ideas, games, and developers/publishers do you see for Nintendo's digital platforms?

ML: Digital is of course the future. We love that Nintendo doesn't really put restrictions on the developers what they want to do in the eShop. For us as a small developer it's like a dream come true to be able to design whatever we want and to bring it quickly to the players.
More at the link. All of this sounds very good, if you ask me...
 

Pittree

Member
Hearing praise on the graphics side from Shin'en for the Wii U is always a good thing. I've been hyped since their other interview from a few weeks ago. (Here's a quote for those who missed it.)

"For instance, we have a very action-heavy game with literally thousands of animated objects, but had no problems rendering the complete gameworld, twice, for the Wii U controller display in two-player mode."

By the way, does anybody have an idea of what kind of game they are working on? That "literally thousands of animated objects" sounds impressive coming from them.
 

Regginald

Banned
Shin'en puts effort into their Nintendo games though, you can't expect others to do the same.

Seriously I laughed when Super Meat Boy supposedly couldn't fit on WiiWare but Jett Rocket and FAST could.
 

Drago

Member
Shin'en puts effort into their Nintendo games though, you can't expect others to do the same.

Seriously I laughed when Super Meat Boy supposedly couldn't fit on WiiWare but Jett Rocket and FAST could.

Super Meat Boy was a very big game with large MP3 tracks. They didn't want to cut content or compress too many things, so they just skipped the platform altogether.

Shin'en built their games around the 40 MB limit; Team Meat did not.
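To put that 40 MB figure in perspective, here's a rough back-of-the-envelope sketch in Python; the bitrates and track lengths are illustrative assumptions, not Super Meat Boy's actual numbers.

```python
# Rough sketch of why a big MP3 soundtrack strains a ~40 MB WiiWare-style
# budget. All numbers below are assumptions for illustration.

WIIWARE_LIMIT_MB = 40  # commonly cited WiiWare size cap

def mp3_size_mb(minutes, kbps=192):
    """Approximate MP3 size in MB for a track of the given length and bitrate."""
    return minutes * 60 * kbps / 8 / 1000  # kbit/s -> kB/s -> MB

# Hypothetical soundtrack: twenty 2.5-minute tracks at 192 kbps
soundtrack_mb = sum(mp3_size_mb(2.5) for _ in range(20))
print(f"Soundtrack alone: ~{soundtrack_mb:.0f} MB against a {WIIWARE_LIMIT_MB} MB cap")
```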
 

Drago

Member
Super Meat Boy was originally a WiiWare exclusive?

I believe it started as WiiWare with the intention of bringing it to other platforms.

Microsoft said yes to a release, Sony said no, and it wasn't possible on Wii. PC has very few restrictions like those, so it released there as well.

I don't know this for sure, though.
 

japtor

Member
Let's see it. Uncompressed video, with uncompressed audio, running at full 1080p.
You vastly underestimate the size of uncompressed video:
http://en.wikipedia.org/wiki/Uncompressed_video#1080i_and_1080p_HDTV_uncompressed

1080i and 1080p HDTV uncompressed
8 bit @ 1920 x 1080 @ 24fps = 95 MB per/sec, or 334 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 127 MB per/sec, or 445 GB per/hr.
8 bit @ 1920 x 1080 @ 25fps = 99 MB per/sec, or 348 GB per/hr.
10 bit @ 1920 x 1080 @ 25fps = 132 MB per/sec, or 463 GB per/hr.
8 bit @ 1920 x 1080 @ 29.97fps = 119 MB per/sec, or 417 GB per/hr.
10 bit @ 1920 x 1080 @ 29.97fps = 158 MB per/sec, or 556 GB per/hr.

1080i and 1080p HDTV RGB (4:4:4) uncompressed
10 bit @ 1280 x 720p @ 60fps = 211 MB per/sec, or 742 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 190 MB per/sec, or 667 GB per/hr.
10 bit @ 1920 x 1080 @ 50i = 198 MB per/sec, or 695 GB per/hr.
10 bit @ 1920 x 1080 @ 60i = 237 MB per/sec, or 834 GB per/hr.
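For anyone curious where figures like these come from, here's a minimal Python sketch of the arithmetic; it assumes 8-bit 4:2:2 sampling (two samples per pixel on average) and binary megabytes/gigabytes, which roughly reproduces the 8-bit rows above.

```python
# Uncompressed video is just pixels x bits-per-pixel x frame rate.
# Assumes 4:2:2 chroma sampling (2 samples per pixel on average) and binary
# MB/GB; the 10-bit rows above run a bit higher than this naive formula
# because 10-bit samples are usually stored in padded/packed containers.

def uncompressed_rate(width, height, fps, bits_per_sample, samples_per_pixel=2):
    """Return (MB per second, GB per hour) for uncompressed video."""
    bytes_per_sec = width * height * samples_per_pixel * bits_per_sample * fps / 8
    return bytes_per_sec / 2**20, bytes_per_sec * 3600 / 2**30

print(uncompressed_rate(1920, 1080, 24, 8))     # ~ (95, 334)  -> the 1080p24 8-bit row
print(uncompressed_rate(1920, 1080, 29.97, 8))  # ~ (119, 417) -> the 1080p29.97 8-bit row
```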
 
You know it'll happen, lol. Some dev will refuse to compress their 20 hours of cutscenes in any way at all. I wonder if Square Enix is up to the task. :p
Kojima somehow seems to fit the bill better.

FFXIII and FFXIII-2 aren't all that well encoded, even on PS3:



Look at the reds, certainly not H264 or VC-1 and not VBR.
 

AzaK

Member
To the bold yes when it comes to using compute abilities of a GPU. It's not a simple matter of porting a PS360 game over and then making what was designed to work on the CPU work on the GPU. I posted a little while ago this interview with Tim Sweeney from a few years ago.

http://www.tomshardware.com/news/Sweeney-Epic-GPU-GPGPU,8461.html

I never got around to reading that article and eventually forgot about it. But that's a very good indication about Wii U's emphasis on GPU computing.

Crazy, though. Why not just pair it with a good CPU to give more options and ease porting, especially when the 720/PS4 will have both? It can't really have saved them much to underclock, right?
 

Ryoku

Member
You vastly underestimate the size of uncompressed video:
http://en.wikipedia.org/wiki/Uncompressed_video#1080i_and_1080p_HDTV_uncompressed

1080i and 1080p HDTV uncompressed
8 bit @ 1920 x 1080 @ 24fps = 95 MB per/sec, or 334 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 127 MB per/sec, or 445 GB per/hr.
8 bit @ 1920 x 1080 @ 25fps = 99 MB per/sec, or 348 GB per/hr.
10 bit @ 1920 x 1080 @ 25fps = 132 MB per/sec, or 463 GB per/hr.
8 bit @ 1920 x 1080 @ 29.97fps = 119 MB per/sec, or 417 GB per/hr.
10 bit @ 1920 x 1080 @ 29.97fps = 158 MB per/sec, or 556 GB per/hr.

1080i and 1080p HDTV RGB (4:4:4) uncompressed
10 bit @ 1280 x 720p @ 60fps = 211 MB per/sec, or 742 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 190 MB per/sec, or 667 GB per/hr.
10 bit @ 1920 x 1080 @ 50i = 198 MB per/sec, or 695 GB per/hr.
10 bit @ 1920 x 1080 @ 60i = 237 MB per/sec, or 834 GB per/hr.

Lol, I was kidding :p
It was hyperbole :>
 

alfolla

Neo Member
Remember that comment yesterday from Katsuhiro Harada stating "it would be distracting that the Gamepad plays a big role with fighting games"? Well, apparently he never said that; he said "difficult for fighting games." He's mad at GameSpot for changing his words, which makes you wonder if the CPU comment is wrong too.

“Looking at the small screen [Wii U GamePad] and the big [TV] screen at the same time is pretty difficult for a fighting game. So we’re thinking of making it useful as a way of having shortcuts.”

“Or, by making progressing through the game more convenient. Or by playing alone on the GamePad screen.”

https://twitter.com/Harada_TEKKEN/status/222703388025032704

Also he said

https://twitter.com/Harada_TEKKEN/status/222711445886996480

"WiiU gets 'trolled' too much as it is. I like Namco on Nintendo platforms"
The last one was a retweet by Mr. Harada.
 

TheD

The Detective
Kojima somehow seems to fit the bill better.

FFXIII and FFXIII-2 aren't all that well encoded, even on PS3:



Look at the reds, certainly not H264 or VC-1 and not VBR.


The reason that the reds (and blues and greens) look low res is due to chroma subsampling, something that both VC-1 and H.264 support.
 

Donnie

Member
there is no gpgpu in the wiiu and that is why you don't see people talking about. The gpu in the wiiu can run gpgpu code but its terrible at running this code.

You might want to add here that you actually have no idea what features the GPU is capable of, just in case anybody makes the mistake of thinking you know what you're talking about.
 
holy fuck that's final fantasy xiii?!
looks more like killzone.
i actually thought it was killzone
The drone design is really similar, yeah.
The reason that the reds (and blues and greens) look low res is due to chroma subsampling, something that both VC-1 and H.264 support.
I know, but VC-1 and H.264 mitigate it with variable block-size motion compensation, VBR and other features (plus 4:4:4 support in H.264's case).

FFXIII on PS3 looks like MPEG-2 at really high bitrates; it's still a victim of the encoding technology (the X360 version, on the other hand, is a wreck because that high bitrate isn't there). Anyway, all this came up because we were saying Square Enix isn't willing to compress anything; they are. Kojima Productions are the guys that legend says didn't use audio compression in MGS4; I never dug too deeply into that, though, so I don't know for sure.


BTW, I saw some red subsampling artifacts on the Wii-U controller screen on the games I tested; definitely 4:2:0.
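For anyone wondering what 4:2:0 actually does to those reds, here's a small, purely illustrative Python/NumPy sketch; the plane sizes and values are made up and this isn't any particular codec's exact pipeline.

```python
import numpy as np

# 4:2:0 keeps luma (Y) at full resolution but stores the two chroma planes
# (Cb, Cr) at half resolution in both directions, then upscales them for
# display. Saturated red detail lives almost entirely in the Cr plane, which
# is why hard red edges look blocky or smeared.

def subsample_420(plane):
    """Average 2x2 blocks to get a quarter-size chroma plane."""
    h, w = plane.shape
    return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample_nearest(plane):
    """Blow chroma back up to full size (nearest neighbour), as a display would."""
    return plane.repeat(2, axis=0).repeat(2, axis=1)

# A hard red/black edge that doesn't line up with the 2x2 grid:
cr = np.zeros((8, 8))
cr[:, :3] = 235.0                           # strong red chroma on the left
print(upsample_nearest(subsample_420(cr)))  # the edge column comes back blended
```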
You might want to add here that you actually have no idea what features the GPU is capable of, just in case anybody makes the mistake of thinking you know what you're talking about.
Not to worry. Only Kotaku would use him as a source. :p
 
Which is worse: waiting for the Wii U to release without touching it, or playing the Wii U and knowing that you may not get to touch it again until you buy one?
 
Ahh, good speculation and posts through the night, otherwise known as Heavy Repellent Lotion. Yesterday's Tekken statement is now just today's fish-and-chip paper. Get your Wii U code base built up and stop your moaning ;-)
 

10k

Banned
I just read the first page of the "Miyamoto being compared to Steve Jobs" thread... FML.

Facepalm.gif

Edit: I'm thinking of switching my avatar to my Mario assassin that Nintendo dispatches :). It's a quick GIMP edit. What do you guys think?
 

Nibel

Member
- 21.11.12: Rise of the Guardians (EUR 54.98)
- 29.11.12: Ben10 Omniverse (EUR 54.98)
- 29.11.12: F1 Race Stars (EUR 54.98)
- 29.11.12: Ninja Gaiden 3: Razor's Edge (EUR 59.98)
- 29.11.12: Tekken Tag Tournament 2 (EUR 54.98)
- 30.11.12: Warriors Orochi 3 (EUR 59.98)
- 30.11.12: Sonic & All-Stars Racing Transformed (EUR 59.98)
- 30.12.12: Sport Island (EUR 59.98)
- 15.02.13: PES 13 (EUR 59.98)

Source

Wii U to be released on November 21st?
 

wsippel

Banned
- 21.11.12: Rise of the Guardians (EUR 54.98)
- 29.11.12: Ben10 Omniverse (EUR 54.98)
- 29.11.12: F1 Race Stars (EUR 54.98)
- 29.11.12: Ninja Gaiden 3: Razor's Edge (EUR 59.98)
- 29.11.12: Tekken Tag Tournament 2 (EUR 54.98)
- 30.11.12: Warriors Orochi 3 (EUR 59.98)
- 30.11.12: Sonic & All-Stars Racing Transformed (EUR 59.98)
- 30.12.12: Sport Island (EUR 59.98)
- 15.02.13: PES 13 (EUR 59.98)

Source

Wii U to be released on November 21st?

In Europe!? That would be a Christmas miracle.

I would have thought December 12th in Europe and maybe November 21st in the US and Japan.
 

Stewox

Banned
You vastly underestimate the size of uncompressed video:
http://en.wikipedia.org/wiki/Uncompressed_video#1080i_and_1080p_HDTV_uncompressed

1080i and 1080p HDTV uncompressed
8 bit @ 1920 x 1080 @ 24fps = 95 MB per/sec, or 334 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 127 MB per/sec, or 445 GB per/hr.
8 bit @ 1920 x 1080 @ 25fps = 99 MB per/sec, or 348 GB per/hr.
10 bit @ 1920 x 1080 @ 25fps = 132 MB per/sec, or 463 GB per/hr.
8 bit @ 1920 x 1080 @ 29.97fps = 119 MB per/sec, or 417 GB per/hr.
10 bit @ 1920 x 1080 @ 29.97fps = 158 MB per/sec, or 556 GB per/hr.

1080i and 1080p HDTV RGB (4:4:4) uncompressed
10 bit @ 1280 x 720p @ 60fps = 211 MB per/sec, or 742 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 190 MB per/sec, or 667 GB per/hr.
10 bit @ 1920 x 1080 @ 50i = 198 MB per/sec, or 695 GB per/hr.
10 bit @ 1920 x 1080 @ 60i = 237 MB per/sec, or 834 GB per/hr.



These figures are based on industry HDTV standards, so they don't represent anything beyond those. 1080p is an HDTV standard; what matters is the format and its quality, and it's nothing special, since TV tech is far behind PC monitors.

They also show uncompressed sizes, so they may not apply directly here. We don't know exactly which resolution standard the controller screen uses, but there is surely some bitrate budget tied to whatever resolution was chosen. Standard panel resolutions are mostly a manufacturing decision: ordering custom-sized LCDs costs more than buying what's already being pumped out. I believe the rumor was qHD, and some even go as far as 720p, but the pixel density is high enough that it should look fine even below qHD. They would really need those extra pixels to accommodate switching the image from the TV to the controller, though; a big screen shrunk down may not look great otherwise. That's where the technology comes in: it may well be that the game software and GPU actually produce a separate output rendered at the DRC's native resolution (I think they call it Eyefinity), so scaling would be history.

I just hope the Wii U DRC has a resolution developers can adjust, and some kind of standby mode when it isn't showing heavy game visuals from the console, freeing up power for the main screen; when it's just maps and menus, there shouldn't be much going on.
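If you want to play with the numbers yourself, here's a small Python sketch comparing the two candidate resolutions mentioned above; the screen size, frame rate and pixel format are assumptions for illustration, not confirmed GamePad specs.

```python
import math

# Compare candidate controller resolutions: pixel count, pixel density, and
# the raw (uncompressed) bandwidth a video stream at that resolution implies.
# Screen size, fps and pixel format below are assumptions, not confirmed specs.
candidates = {"qHD": (960, 540), "720p": (1280, 720)}
DIAGONAL_INCHES = 6.2   # assumed screen size
FPS = 60
BYTES_PER_PIXEL = 1.5   # 8-bit 4:2:0 YCbCr

for name, (w, h) in candidates.items():
    ppi = math.hypot(w, h) / DIAGONAL_INCHES
    raw_mbit = w * h * BYTES_PER_PIXEL * 8 * FPS / 1e6
    print(f"{name}: {w * h / 1e6:.2f} MP, ~{ppi:.0f} PPI, "
          f"~{raw_mbit:.0f} Mbit/s uncompressed at {FPS} fps")
```

Either way the raw stream is hundreds of megabits per second, which is why the console has to compress the video it sends to the controller rather than stream it uncompressed.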
 

Pineconn

Member
November 21 sounds about right. I wonder if they're obligated to avoid the release of Halo 4 and Black Ops II. I'd assume GameStop prefers to stagger their huge releases.

You vastly underestimate the size of uncompressed video:
http://en.wikipedia.org/wiki/Uncompressed_video#1080i_and_1080p_HDTV_uncompressed

1080i and 1080p HDTV uncompressed
8 bit @ 1920 x 1080 @ 24fps = 95 MB per/sec, or 334 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 127 MB per/sec, or 445 GB per/hr.
8 bit @ 1920 x 1080 @ 25fps = 99 MB per/sec, or 348 GB per/hr.
10 bit @ 1920 x 1080 @ 25fps = 132 MB per/sec, or 463 GB per/hr.
8 bit @ 1920 x 1080 @ 29.97fps = 119 MB per/sec, or 417 GB per/hr.
10 bit @ 1920 x 1080 @ 29.97fps = 158 MB per/sec, or 556 GB per/hr.

1080i and 1080p HDTV RGB (4:4:4) uncompressed
10 bit @ 1280 x 720p @ 60fps = 211 MB per/sec, or 742 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 190 MB per/sec, or 667 GB per/hr.
10 bit @ 1920 x 1080 @ 50i = 198 MB per/sec, or 695 GB per/hr.
10 bit @ 1920 x 1080 @ 60i = 237 MB per/sec, or 834 GB per/hr.

Holy crap, and that's without audio? .....Well, good thing for compression, haha. Dem codecs.
 

AzaK

Member
Even a completely bog standard R700 is by no means terrible at GPGPU. It's not well suited for everything, but perfectly fine for physics or pathfinding. Examples (all on R700): Havok Cloth OpenCL, AMD March of the Froblins demo (GPGPU pathfinding/ AI with ~3000 entities).

Related question: any idea what order of magnitude of performance increase we're typically looking at when moving code from the CPU to GPGPU? Taking into account the length of a piece of string and all, of course.
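It obviously depends on the workload, but the kind of job the quote mentions (physics, pathfinding) is data-parallel by nature: one small, independent update per entity. Here's a minimal, purely illustrative PyOpenCL sketch of that pattern; the kernel, entity count and time step are made up and say nothing about Wii U's actual tooling.

```python
import numpy as np
import pyopencl as cl

# One tiny "physics" update per entity, offloaded to the GPU via OpenCL.
positions = np.random.rand(3000).astype(np.float32)   # ~3000 entities, as in the Froblins demo
velocities = np.random.rand(3000).astype(np.float32)

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

kernel_src = """
__kernel void integrate(__global float *pos, __global const float *vel, float dt) {
    int i = get_global_id(0);
    pos[i] += vel[i] * dt;   // one independent update per entity
}
"""
prg = cl.Program(ctx, kernel_src).build()

pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=positions)
vel_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=velocities)

# Launch one work-item per entity, then read the results back.
prg.integrate(queue, positions.shape, None, pos_buf, vel_buf, np.float32(1.0 / 60))
cl.enqueue_copy(queue, positions, pos_buf)
```

How much faster this runs than a CPU loop depends entirely on how parallel and memory-bound the workload is and how often you have to copy data back and forth, which is why blanket "X times faster" claims are hard to give.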
 

DrWong

Member
I think all these multipliers mean nothing to Tomorrow Corporation, an indie studio which just announced Little Inferno for Wii U, coming this winter.

From GI >
The next game from the makers of World of Goo is a game about children burning their toys in order to stay warm.

There is a happy song playing, and the children seem perfectly fine with tossing their favorite toys in the flames, but there seems to be a terrifying undertone to the whole teaser. We're still not sure exactly what the game is about, but it seems dark, despite the raging fire. The game will be available this winter on Wii U, PC, Mac, and Linux.
 

alfolla

Neo Member
To the bold yes when it comes to using compute abilities of a GPU. It's not a simple matter of porting a PS360 game over and then making what was designed to work on the CPU work on the GPU. I posted a little while ago this interview with Tim Sweeney from a few years ago.

http://www.tomshardware.com/news/Sweeney-Epic-GPU-GPGPU,8461.html



I never got around to reading that article and eventually forgot about it. But that's a very good indication about Wii U's emphasis on GPU computing.

I don't get it.
Is it a GPU- or CPU-based demo?
 
Regarding game budgets: How common is the $100M game anyway? Games like Uncharted and Gears are made on budgets of ~$20M from what I recall...

Regarding the discussion about POWER7: I actually don't think IBM has ever explicitly stated anything about "POWER7" or used the term "POWER7", or even "Watson technology" through any official release.

That was something of an editorialisation by Engadget I'd say.

IBM's official press release regarding their CPU for Wii U only states the following:
IBM (NYSE: IBM) today announced that it will provide the microprocessors that will serve as the heart of the new Wii U™ system from Nintendo. Unveiled today at the E3 trade show, Nintendo plans for its new console to hit store shelves in 2012.

The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.

IBM's state-of-the-art 300mm chip plant in East Fishkill, N.Y., will be the manufacturing facility for the new game chip the company is building for Nintendo's new game console due to hit store shelves in 2012.

IBM plans to produce millions of chips for Nintendo featuring IBM Silicon on Insulator (SOI) technology at 45 nanometers (45 billionths of a meter). The custom-designed chips will be made at IBM's state-of-the-art 300mm semiconductor development and manufacturing facility in East Fishkill, N.Y.

The relationship between IBM and Nintendo dates to May 1999, when IBM was selected to design and manufacture the central microprocessor for the Nintendo GameCube™ system. Since 2006, IBM has shipped more than 90 million chips for Nintendo Wii systems.

"IBM has been a terrific partner for many years. We truly value IBM's commitment to support Nintendo in delivering an entirely new kind of gaming and entertainment experience for consumers around the world," said Genyo Takeda, Senior Managing Director, Integrated Research and Development, at Nintendo Co., Ltd.

"We're very proud to have delivered to Nintendo consistent technology advancements for three generations of entertainment consoles," said Elmer Corbin, director, IBM's custom chip business. "Our relationship with Nintendo underscores our unique position in the industry -- how we work together with clients to help them leverage IBM technology, intellectual property and research to drive innovation into their own core products."

Built on the open, scalable Power Architecture base, IBM custom processors exploit the performance and power advantages of proven silicon-on-insulator (SOI) technology. The inherent advantages of the technology make it a superior choice for performance-driven applications that demand exceptional, power-efficient processing capability – from entertainment consoles to supercomputers.
 

Rösti

Unconfirmed Member
I got this reply from Nintendo of America today, regarding why there will be no Wii U demo kiosks at Comic-Con:

Hello,

Thank you for taking the time to write to us with your comments regarding Nintendo at Comic-Con. I can certainly understand why you would like to see the Wii U at our booth. However, it is important to note that the Wii U is still in development. Additionally, I do not have an answer as to why there will be no Wii U at Comic-Con.

Having said that, I want to assure you that your comments will be added to our records for Comic-Con and made available for other departments at the company to use as they see fit. As we get closer to the system's launch date, you can count on more information being revealed at our website (www.nintendo.com). Stay tuned!

Sincerely,

Curtis Neal
Nintendo of America Inc.
He didn't even mention the Wii U Experience event taking place at the Nintendo Gaming Lounge in the San Diego Ballroom at the Marriott Marquis & Marina, on the same dates as Comic-Con. Sure, that event is invitation-only as far as I know, but why not mention it as a way to get more, if not necessarily new, information about the console? The New York event provided some new media, at least.
 
Regarding game budgets: How common is the $100M game anyway? Games like Uncharted and Gears are made on budgets of ~$20M from what I recall...
Uncharted? No way, especially the first one, when they had to include technology R&D. Gears of War 1 was said to be a $10 million game, but you have to take into account that the figure doesn't include tech/engine R&D (or promotion costs); it's a poster child with special conditions.

Most games cost more than $20 million, approaching $30 million for multiplatform development.

As for $100 million, it's certainly possible to get there, especially when you have to develop your tech and assets from the ground up and it's your first game using them. Top-range dev costs last gen were around $30 million for such games; with the increase in development time, number of developers and complexity of the tech and work, the end price of course can't stay in the same range.
Regarding the discussion about POWER7: I actually don't think IBM has ever explicitly stated anything about "POWER7" or used the term "POWER7", or even "Watson technology" through any official release.

That was something of an editorialisation by Engadget I'd say.
This, yeah.
 

HylianTom

Banned
I think all these multipliers mean nothing to Tomorrow Corporation, an indie studio which just announced Little Inferno for Wii U, coming this winter.

From GI >
I am positively giddy about this. Another WoG-esque game for the Wii? Bring it!
And as far as my soundtrack expectations go: my iPod is ready.

File this under: Pleasant Surprises

This is turning out to be a rather positive week for the U thus far. I like!

My money is on November 18th or maybe the 25th in the States. Nintendo loves Sundays.
I'm sticking to my expectation of the 18th. The 25th seems a bit late in the shopping season; Nintendo wouldn't want to miss out on Black Friday shopping volumes.

Still wishing for October... such a perfect month... football, pumpkins, tolerable weather...
 

ozfunghi

Member
"The Wii U, you know, it's a really great little system. Nintendo has packed in a great CPU, and we're doing some interesting things with the GPU. I could see some very interesting things being done on the Wii U hardware!"

Unless people are giving concrete numbers or clear, technical details, PR dribble should be taken as just that. It's all relative to how the devs see the hardware and their expectations, and also relative to how much they want to sell the game. You can put a positive, interesting spin on any hardware package, just as you can put a negative one.

Generally, unless I'm listening to a dev talk from GDC, reading design documents, or getting technical specifications, I don't listen to what devs have to say about hardware when on the topic of unreleased games. They might hint and tease and give a vague generalisation of what to expect, but they will almost always put a positive spin on whatever they have to say.

Not much point saying "Oh yeah, the Wii U CPU is kinda crappy but the game runs fine, so please buy Colonial Marines anyway!".


Because those are the only options? "Yeah, it's a great CPU" or "the CPU sucks"? Why not just shut up about the CPU and focus on the GPU or the amount of RAM, features where the difference with current gen is actually worthwhile? The comparison with Wii doesn't stick, because there was basically nothing positive to be said about the Wii hardware, so if you wanted to spin it, you would be lying either way. That is not the case with Wii U. If you want to spin it, just say the GPU has a lot of new features the current gen doesn't have and the amount of RAM allows for much greater texture detail, etc. No need to work yourself into trouble with shady remarks about the CPU that could come back to bite you.
 