bgassassin
Member
I'll make a thread, I guess... it'll be my first one so could some one help me along in making it?
Nope. You won't learn if you don't throw yourself in the fire.
Shin'en Multimedia talks Wii U (thread title)
http://www.superphillipcentral.com/2...d-linzner.html
SP: Do you currently have a Wii U devkit? If so, has the system exceeded your expectations?
ML: We have a few Wii U devkits since quite some time. We even almost finished our first Wii U game. The kits exceeded our expectations in every way and we still learn how to get the best performance out of it. A good thing about Wii U is that it's very easy to develop for. You don't have so many limitations like on the other current gen platforms. For us the Wii U is the perfect platform with enough horsepower for many years to come. I think we currently only tapped 20% of its potential and our first game already looks and plays brilliant.

SP: What kind of potential for new ideas, games, and developers/publishers do you see for Nintendo's digital platforms?

ML: Digital is of course the future. We love that Nintendo doesn't really put restrictions on the developers what they want to do in the eShop. For us as a small developer it's like a dream come true to be able to design whatever we want and to bring it quickly to the players.

More at the link. All of this sounds very good, if you ask me...
Alright, how's this for the OP?
I don't know. Go find out.
"For instance, we have a very action-heavy game with literally thousands of animated objects, but had no problems rendering the complete gameworld, twice, for the Wii U controller display in two-player mode."
ughh ok posted.
i really hope it doesn't backfire :[
It will go better than my first thread that's for sure.
http://www.neogaf.com/forum/showthread.php?t=141155
I think that was my first one. That's as far back as it went.
Shin'en puts effort into their Nintendo games though, you can't expect others to do the same.
Seriously I laughed when Super Meat Boy supposedly couldn't fit on WiiWare but Jett Rocket and FAST could.
Well, it already has more replies
Super Meat Boy was originally a WiiWare exclusive?
Let's see it. Uncompressed video, with uncompressed audio, running at full 1080p.
You vastly underestimate the size of uncompressed video:
Congrats on your first thread Drago.
You know it'll happen, lol. Some dev will refuse to compress their 20 hours of cutscenes in any way at all. I wonder if Square Enix is up to the task.
Kojima somehow seems to fit the bill better.
Heh, thanks.
well I'm gonna go turn in for the night... hope the thread doesn't go awry while I'm gone.
The 20% comment was also interesting. I'd like to see where they're at so far based on that comment.
To the bolded part: yes, when it comes to using the compute abilities of a GPU. It's not a simple matter of porting a PS360 game over and then making what was designed to work on the CPU work on the GPU. A while ago I posted this interview with Tim Sweeney from a few years back.
http://www.tomshardware.com/news/Sweeney-Epic-GPU-GPGPU,8461.html
I never got around to reading that article and eventually forgot about it. But that's a very good indication of Wii U's emphasis on GPU computing.
Kojima somehow seems to fit the bill better.
FFXIII and FFXIII-2 aren't all that well encoded, even on PS3:
[IMG]http://images.eurogamer.net/articles//a/9/9/4/0/4/1/BadCG_001.jpg.jpg[/IMG]
Look at the reds; that's certainly not H.264 or VC-1, and not VBR.
holy fuck that's final fantasy xiii?!
how the fuck could they follow up ffxii with that.
You vastly underestimate the size of uncompressed video:
http://en.wikipedia.org/wiki/Uncompressed_video#1080i_and_1080p_HDTV_uncompressed
1080i and 1080p HDTV uncompressed
8 bit @ 1920 x 1080 @ 24fps = 95 MB per/sec, or 334 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 127 MB per/sec, or 445 GB per/hr.
8 bit @ 1920 x 1080 @ 25fps = 99 MB per/sec, or 348 GB per/hr.
10 bit @ 1920 x 1080 @ 25fps = 132 MB per/sec, or 463 GB per/hr.
8 bit @ 1920 x 1080 @ 29.97fps = 119 MB per/sec, or 417 GB per/hr.
10 bit @ 1920 x 1080 @ 29.97fps = 158 MB per/sec, or 556 GB per/hr.
1080i and 1080p HDTV RGB (4:4:4) uncompressed
10 bit @ 1280 x 720p @ 60fps = 211 MB per/sec, or 742 GB per/hr.
10 bit @ 1920 x 1080 @ 24fps = 190 MB per/sec, or 667 GB per/hr.
10 bit @ 1920 x 1080 @ 50i = 198 MB per/sec, or 695 GB per/hr.
10 bit @ 1920 x 1080 @ 60i = 237 MB per/sec, or 834 GB per/hr.
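For reference, the 8-bit rows fall straight out of width x height x bytes per pixel x frame rate. A quick sketch, assuming 4:2:2 sampling (2 bytes per pixel) and binary MB/GB, reproduces them; the 10-bit rows use a different sample packing, so they don't follow from this simple model:
[CODE]
# Back-of-the-envelope check of the 8-bit 4:2:2 rows above.
# Assumes 2 bytes per pixel (8-bit luma plus half-rate chroma) and binary MB/GB.

def uncompressed_video_rate(width, height, fps, bytes_per_pixel=2):
    bytes_per_sec = width * height * fps * bytes_per_pixel
    mb_per_sec = bytes_per_sec / 2**20          # MiB per second
    gb_per_hour = bytes_per_sec * 3600 / 2**30  # GiB per hour
    return mb_per_sec, gb_per_hour

for fps in (24, 25, 29.97):
    mb, gb = uncompressed_video_rate(1920, 1080, fps)
    print(f"8 bit @ 1920x1080 @ {fps}fps: {mb:.0f} MB/s, {gb:.0f} GB/hr")
# -> ~95 MB/s / 334 GB/hr, ~99 / 348, ~119 / 417 (matches the 8-bit rows above)
[/CODE]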
looks more like killzone.
i actually thought it was killzone
Remember that comment yesterday from Katsuhiro Harada stating "it would be distracting that the Gamepad plays a big role with fighting games"? Well, apparently he never said that; he said "difficult for fighting games". He's mad at Gamespot for changing his words, which makes you wonder if the CPU comment is wrong too. (The last tweet below was a re-tweet by Mr. Harada.)
Looking at the small screen [Wii U GamePad] and the big [TV] screen at the same time is pretty difficult for a fighting game. So we're thinking of making it useful as a way of having shortcuts.
Or, by making progressing through the game more convenient. Or by playing alone on the GamePad screen.
https://twitter.com/Harada_TEKKEN/status/222703388025032704
Also he said
https://twitter.com/Harada_TEKKEN/status/222711445886996480
"WiiU gets 'trolled' too much as it is. I like Namco on Nintendo platforms"
There is no GPGPU in the Wii U, and that is why you don't see people talking about it. The GPU in the Wii U can run GPGPU code, but it's terrible at running that code.
holy fuck that's final fantasy xiii?!
looks more like killzone.
i actually thought it was killzone
The drone design is really similar, yeah.
The reason that the reds (and blues and greens) look low res is due to chroma subsampling, something that both VC-1 and H.264 support.
I know; but VC-1 and H.264 reduce it with variable block-size motion compensation, VBR and other features (also 4:4:4 support in H.264's case).
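To make the chroma-subsampling point concrete, here's a tiny numpy sketch (illustrative only, not the exact pipeline of either codec): a hard black-to-red edge lives mostly in the Cr plane, and storing chroma at quarter resolution smears it over a two-pixel band.
[CODE]
import numpy as np

# Why saturated reds look blocky under chroma subsampling: luma (Y) is stored at
# full resolution, chroma (Cb/Cr) at quarter resolution and upsampled on playback.
# Illustrative only; real encoders filter rather than block-average and repeat.

def rgb_to_ycbcr(rgb):
    """BT.601-style RGB -> YCbCr; rgb is float in [0, 1], shape (H, W, 3)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    return y, cb, cr

def chroma_420_roundtrip(chan):
    """Average 2x2 blocks (4:2:0-style storage), then upsample back to full size."""
    h, w = chan.shape
    small = chan.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

# A hard black-to-red edge shows up mostly in the Cr plane; after the 4:2:0
# round trip the once-sharp step is smeared across a 2-pixel band.
img = np.zeros((8, 8, 3))
img[:, 3:, 0] = 1.0                              # columns 3..7 are pure red
_, _, cr = rgb_to_ycbcr(img)
print(np.round(cr[0], 2))                        # sharp step: 0, 0, 0, 0.5, ...
print(np.round(chroma_420_roundtrip(cr)[0], 2))  # blurred: 0, 0, 0.25, 0.25, 0.5, ...
[/CODE]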
You might want to add here that you actually have no idea what features the GPU is capable of, just in case anybody makes the mistake of thinking you know what you're talking about.
Not to worry. Only Kotaku would use him as a source.
Wii U to be released at November 21st?
My birthday.
There is no GPGPU in the Wii U, and that is why you don't see people talking about it. The GPU in the Wii U can run GPGPU code, but it's terrible at running that code.
Even a completely bog-standard R700 is by no means terrible at GPGPU. It's not well suited for everything, but it's perfectly fine for physics or pathfinding. Examples (all on R700): Havok Cloth OpenCL, and AMD's March of the Froblins demo (GPGPU pathfinding/AI with ~3000 entities).
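For context, the kind of work being described is embarrassingly parallel per-entity math. A minimal sketch below uses OpenCL through pyopencl purely as a stand-in host API; it is not Wii U code and not the Froblins/Havok implementation, just an illustration of a few thousand agents updated in one kernel launch:
[CODE]
import numpy as np
import pyopencl as cl

# Toy GPGPU example: a brute-force per-entity position update for a few thousand
# agents, the sort of data-parallel work (physics, flocking, pathfinding cost
# fields) that even R700-class GPUs handle comfortably. Illustrative only.

KERNEL = """
__kernel void integrate(__global float2 *pos,
                        __global const float2 *vel,
                        const float dt)
{
    int i = get_global_id(0);
    pos[i] += vel[i] * dt;   // one work-item per entity
}
"""

n = 3000                                             # roughly Froblins-demo scale
pos = np.random.rand(n, 2).astype(np.float32)
vel = (np.random.rand(n, 2).astype(np.float32) - 0.5)

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
pos_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=pos)
vel_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=vel)

prg = cl.Program(ctx, KERNEL).build()
prg.integrate(queue, (n,), None, pos_buf, vel_buf, np.float32(1.0 / 60.0))
cl.enqueue_copy(queue, pos, pos_buf)                 # read updated positions back
[/CODE]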
- 21.11.12: Rise of the Guardians (EUR 54.98)
- 29.11.12: Ben10 Omniverse (EUR 54.98)
- 29.11.12: F1 Race Stars (EUR 54.98)
- 29.11.12: Ninja Gaiden 3: Razor's Edge (EUR 59.98)
- 29.11.12: Tekken Tag Tournament 2 (EUR 54.98)
- 30.11.12: Warriors Orochi 3 (EUR 59.98)
- 30.11.12: Sonic & All-Star Racing Transformed (EUR 59.98)
- 30.12.12: Sport Island (EUR 59.98)
- 15.02.13: PES 13 (EUR 59.98)
Source
Wii U to be released at November 21st?
I'd say it's two days early: software/hardware releases are mostly Fridays, right?
The next game from the makers of World of Goo is a game about children burning their toys in order to stay warm.
There is a happy song playing, and the children seem perfectly fine with tossing their favorite toys in the flames, but there seems to be a terrifying undertone to the whole teaser. We're still not sure exactly what the game is about, but it seems dark, despite the raging fire. The game will be available this winter on Wii U, PC, Mac, and Linux.
IBM (NYSE: IBM) today announced that it will provide the microprocessors that will serve as the heart of the new Wii U system from Nintendo. Unveiled today at the E3 trade show, Nintendo plans for its new console to hit store shelves in 2012.
The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.
IBM's state-of-the-art 300mm chip plant in East Fishkill, N.Y., will be the manufacturing facility for the new game chip the company is building for Nintendo's new game console due to hit store shelves in 2012.
IBM plans to produce millions of chips for Nintendo featuring IBM Silicon on Insulator (SOI) technology at 45 nanometers (45 billionths of a meter). The custom-designed chips will be made at IBM's state-of-the-art 300mm semiconductor development and manufacturing facility in East Fishkill, N.Y.
The relationship between IBM and Nintendo dates to May 1999, when IBM was selected to design and manufacture the central microprocessor for the Nintendo GameCube system. Since 2006, IBM has shipped more than 90 million chips for Nintendo Wii systems.
"IBM has been a terrific partner for many years. We truly value IBM's commitment to support Nintendo in delivering an entirely new kind of gaming and entertainment experience for consumers around the world," said Genyo Takeda, Senior Managing Director, Integrated Research and Development, at Nintendo Co., Ltd.
"We're very proud to have delivered to Nintendo consistent technology advancements for three generations of entertainment consoles," said Elmer Corbin, director, IBM's custom chip business. "Our relationship with Nintendo underscores our unique position in the industry -- how we work together with clients to help them leverage IBM technology, intellectual property and research to drive innovation into their own core products."
Built on the open, scalable Power Architecture base, IBM custom processors exploit the performance and power advantages of proven silicon-on-insulator (SOI) technology. The inherent advantages of the technology make it a superior choice for performance-driven applications that demand exceptional, power-efficient processing capability from entertainment consoles to supercomputers.
Hello,
Thank you for taking the time to write to us with your comments regarding Nintendo at Comic-Con. I can certainly understand why you would like to see the Wii U at our booth. However, it is important to note that the Wii U is still in development. Additionally, I do not have an answer as to why there will be no Wii U at Comic-Con.
Having said that, I want to assure you that your comments will be added to our records for Comic-Con and made available for other departments at the company to use as they see fit. As we get closer to the system's launch date, you can count on more information being revealed at our website (www.nintendo.com). Stay tuned!
Sincerely,
Curtis Neal
Nintendo of America Inc.

He didn't even mention the Wii U Experience event taking place at the Nintendo Gaming Lounge in the San Diego Ballroom at the Marriott Marquis & Marina, on the same dates as Comic-Con. Sure, that event is invitation-only as far as I know, but why not mention it as a way to get more, not necessarily new, information about the console? The New York event provided some new media at least.
Regarding game budgets: How common is the $100M game anyway? Games like Uncharted and Gears are made on budgets of ~$20M from what I recall...
Uncharted? No wai, especially the first one, when they had to include technology R&D. Gears of War 1 was said to be a $10 million game, but you have to take into account that they're not including tech/engine R&D (or promotion costs); it's a poster child with special conditions.
Regarding the discussion about POWER7: I actually don't think IBM has ever explicitly stated anything about "POWER7" or used the term "POWER7", or even "Watson technology", in any official release. That was something of an editorialisation by Engadget, I'd say.
This, yeah.
welcome to the boards oh Wise Sage of Nintendo
Haha, you got the digits mixed up. I'm 24 years old as of last month.
I'm guessing he was talking about the meaning of life, not your age xD
I think all these multipliers mean nothing for Tomorrow Corporation, an indie studio which just announced Little Inferno for Wii U, coming this winter.
I am positively giddy about this. Another WoG-esque game for the Wii? Bring it!
From GI >
My money is on November 18th or maybe the 25th in the States. Nintendo loves Sundays.
I'm sticking to my expectation of the 18th. The 25th seems a bit late in the shopping season; Nintendo wouldn't want to miss out on Black Friday shopping volumes.
"The Wii U, you know, it's a really great little system. Nintendo has packed in a great CPU, and we're doing some interesting things with the GPU. I could see some very interesting things being done on the Wii U hardware!"
Unless people are giving concrete numbers or clear, technical details, PR dribble should be taken as just that. It's all relative to how the devs see the hardware and their expectations, and also relative to how much they want to sell the game. You can put a positive, interesting spin on any hardware package, just as you can put a negative one.
Generally, unless I'm listening to a dev talk from GDC, reading design documents, or getting technical specifications, I don't listen to what devs have to say about hardware when on the topic of unreleased games. They might hint and tease and give a vague generalisation of what to expect, but they will almost always put a positive spin on whatever they have to say.
Not much point saying "Oh yeah, the Wii U CPU is kinda crappy but the game runs fine, so please buy Colonial Marines anyway!".