bgassassin said:
... but wouldn't that be more work to do what sounds like an animation build from the ground up as opposed to just utilizing what you already have and tweaking it so to speak?
My ranting was particularly about how the comments were made, not about arguing for one speculation over the other. For what it's worth, I agree that it seems unlikely they built an animation playback component from scratch just for this demo. As I was mentioning earlier, it seems quite possible they reused some of their existing animation and/or cut-scene scripting components from before. Depending on how the game engine was built, specifically whether it was designed with appropriate layers of hardware abstraction (e.g., OpenGL is such an abstraction layer), it could quite possibly have been ported to the new hardware quickly, especially if the same or similar abstractions are directly supported. That would make sense from the point of view of producing something quickly, even if it doesn't take advantage of new hardware features or isn't properly optimized for the new hardware architecture.
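Just to make the abstraction-layer point a bit more concrete, here is a toy sketch in C (every name is made up for illustration; this is obviously not Nintendo's actual code). The engine only ever talks to a small table of function pointers, so "porting" mostly means writing a new backend that fills in that table for the new GPU:

/* Hypothetical hardware abstraction layer: the engine calls through this
 * table and never needs to know which backend is underneath. */
#include <stdio.h>

typedef struct {
    void (*clear)(float r, float g, float b);
    void (*draw_mesh)(int mesh_id);
    void (*present)(void);
} RenderBackend;

/* One backend per platform; stubbed out with printf for the sketch. */
static void gl_clear(float r, float g, float b) { printf("GL clear %.1f %.1f %.1f\n", r, g, b); }
static void gl_draw_mesh(int id)                { printf("GL draw mesh %d\n", id); }
static void gl_present(void)                    { printf("GL present\n"); }

static RenderBackend opengl_backend = { gl_clear, gl_draw_mesh, gl_present };

/* Engine-side code: identical on every platform, whatever assets you feed it. */
static void render_frame(RenderBackend *rb)
{
    rb->clear(0.0f, 0.0f, 0.0f);
    rb->draw_mesh(42);
    rb->present();
}

int main(void)
{
    render_frame(&opengl_backend); /* swap in a new-hardware backend to "port" */
    return 0;
}

If the new hardware exposes the same or a similar abstraction, the engine-side code above doesn't change at all, which is exactly why a quick, unoptimized port is plausible.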
So I agree that they probably used their existing engine to produce this. But just because the demo used assets that are from (or maybe even only inspired by) TP doesn't mean anything about what engine it is running on. It certainly doesn't mean it's "_the_ TP engine" and such a thing might not even exist (as opposed to the EAD Group1 engine or the EAD engine).
bgassassin said:
Plus it was confirmed to be a tech demo so I'm not sure how being a non-engine based animation shows the power of the console.
It really depends on what you want to show. The Samaritan demo people refer to all the time is most probably just a canned animation running in real time with kick-ass surface shaders, lots of geometry and whatnot. It's a graphical showcase highlighting what the GPU could possibly do. I doubt it is doing any game simulation (AI, input device processing... I would be surprised if it even did any data loading from storage as opposed to pre-loading everything into video RAM). So is that demo representative of what real games will be able to do?
It would actually be difficult to fully characterize the "power" of the WiiU. Specs only tell you so much; performance depends on many, many factors. E.g., if you access data in a nice pattern where neighboring data gets used next, caches are going to do wonders for you. But if your data is spread all over the place, their usefulness could be limited.
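To illustrate the access-pattern point, here's a small self-contained C toy (the exact timings depend entirely on the machine, so treat the numbers as illustrative only). It sums the same array row-by-row, which walks memory sequentially, and then column-by-column, which jumps thousands of bytes between accesses; on typical hardware the second pass is noticeably slower:

/* Cache-friendliness toy: same work, different access order. */
#include <stdio.h>
#include <time.h>

#define N 4096

static int a[N][N];   /* 64 MB of ints, zero-initialized */

int main(void)
{
    long long sum = 0;
    clock_t t0, t1;

    t0 = clock();
    for (int i = 0; i < N; i++)         /* row-major: neighboring data next */
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    t1 = clock();
    printf("row-major:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (int j = 0; j < N; j++)         /* column-major: strided, cache hostile */
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    t1 = clock();
    printf("column-major: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    printf("sum = %lld\n", sum);        /* use the result so it isn't optimized away */
    return 0;
}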
bgassassin said:
Looking at the original versus what we saw at E3 makes it seem more plausible they used the engine as the foundation. Especially since Link is still right handed in the demo.
It's plausible that they used the same assets as the foundation. Those are independent of the engine. For example you could use Unreal Engine and script the same animation using the actual TP assets and you'd conceivably get the same animation shown at E3. That doesn't make UE the TP engine.
bgassassin said:
Wow, I beat TP, but I totally forgot about this boss fight. I didn't remember it at all. It's as if I saw this for the first time. Too bad Skyward Sword is coming out soon or I'd go and replay TP right now. I wonder what else I don't remember about it. And before anyone says there's plenty of time before the release... I'd never make it in time or if I did I'd probably be a little burnt out from Zelda and it would diminish my enjoyment of SS (which imho is looking stellar!!)
Oh, right, about the embedded DRAM question for HD rendering:
1080p = 1920x1080 = 2,073,600 pixels
2x MSAA at 1080p = 4,147,200 samples
As far as I understood from the couple of MSAA articles that came up first in searches, the color buffer is not multisampled, but the depth and stencil buffers are. So for a bare minimum of just a plain color buffer (RGBA = 4 B/pixel) with a 2x-sampled depth buffer (4 B/sample):
color: 2,073,600 pixels x 4 B = 8,294,400 B
depth: 4,147,200 samples x 4 B = 16,588,800 B
for a total of 24,883,200 B, so ~24 MB. This would be a very "bare minimum" framebuffer.
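If anyone wants to double-check or play with the numbers, here is the same arithmetic as a tiny C program (assuming an RGBA8 color buffer at 1x and a 4 B depth/stencil buffer at 2 samples per pixel; real layouts with compression, padding or tiling will differ):

/* Back-of-the-envelope framebuffer size for 1080p with 2x-sampled depth. */
#include <stdio.h>

int main(void)
{
    const long long w = 1920, h = 1080;
    const long long pixels = w * h;           /* 2,073,600 */
    const long long color  = pixels * 4;      /* RGBA8 at 1x:        8,294,400 B */
    const long long depth  = pixels * 2 * 4;  /* 4 B depth, 2x MSAA: 16,588,800 B */
    const long long total  = color + depth;   /* 24,883,200 B */

    printf("color %lld B + depth %lld B = %lld B (~%.1f MiB)\n",
           color, depth, total, total / (1024.0 * 1024.0));
    return 0;
}

Bump the sample count or add extra render targets and that total grows quickly, which is why the eDRAM budget matters.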
I'd really like someone with a good understanding of eDRAM to chime in with their take on how much eDRAM would be reasonable on something like an embedded system (WiiU). Unless they resort to tiled rendering (where chunks of the framebuffer are processed in sequence), and considering the amounts of memory being embedded in other chips, it seems folks should not get their hopes up for 1080p with 2x MSAA.