Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

Yes, but hopefully they go with an APU-style chip like the Xbox 360 Slim's. That should allow it to hit even higher wattage, since you'd have more room to cool a single chip instead of having two heaters to cool. It would also make sense with them not having parts cool enough to run yet, and with the dev kits being downclocked to match the performance of what they can currently fit in a Wii U unit (since those wouldn't be combo units yet).
 
Nightbringer said:
About the GPU, I am starting to think it's possible that the Mobility Radeon HD 5650 is the GPU. The reasons are:

*At 450MHz it delivers 360 GFLOPS, 50% more than the Xbox 360 and PS3.
*104mm² at 40nm, around the same size as the entire Flipper and Hollywood chips.
*15W of power consumption, ideal for the type of box that Nintendo is using for the console.

What do you think?

I don't know if they will use exactly this kind of card as a base, but I am sure as hell that it will have something between 320 and 480 shader units @ 500-600MHz (so between 320 and 580 GFLOPS).

You also have to keep in mind that most of Flipper's/Hollywood's die size was eaten by the 1T-SRAM, IIRC.
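For reference, the GFLOPS figures being thrown around here fall straight out of shader count and clock speed. A quick sketch, assuming the usual 2 FLOPs per shader per cycle (one multiply-add) for AMD's VLIW parts:

```python
# Sanity check of the GFLOPS figures quoted above.
# Assumes each shader unit retires 2 FLOPs per cycle (one multiply-add),
# the usual convention for AMD's VLIW parts.
def gflops(shader_units, clock_mhz, flops_per_cycle=2):
    return shader_units * flops_per_cycle * clock_mhz / 1000.0

print(gflops(400, 450))  # 360.0 -> the Mobility HD 5650 figure above
print(gflops(320, 500))  # 320.0 -> low end of the guess
print(gflops(480, 600))  # 576.0 -> high end, i.e. "between 320 and 580 GFLOPS"
```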
 
z0m3le said:
Yes, but hopefully they go with an APU-style chip like the Xbox 360 Slim's. That should allow it to hit even higher wattage, since you'd have more room to cool a single chip instead of having two heaters to cool. It would also make sense with them not having parts cool enough to run yet, and with the dev kits being downclocked to match the performance of what they can currently fit in a Wii U unit (since those wouldn't be combo units yet).
Nintendo seems to go with a proven process and should therefore have great yields, so die-level integration might be possible. That also has some nice additional benefits the XCPU can't even use, like much lower latencies and enormous bandwidths. And it decreases the overall transistor count. With good yields, and thanks to less complex cooling, it should be cheaper overall while also offering better performance.
 
wsippel said:
Nintendo seems to go with a proven process and should therefore have great yields, so die-level integration might be possible. That also has some nice additional benefits the XCPU can't even use, like much lower latencies and enormous bandwidths. And it decreases the overall transistor count. With good yields, and thanks to less complex cooling, it should be cheaper overall while also offering better performance.

Really hoping for this. It is sort of getting my hopes up, but I've been a gamer for a long time and can deal with a little Nintendo disappointment every 5 years or so. As much as I love my 360, I will be Wii U only next gen, unless third parties decide to ignore Nintendo's console again.

THIRD PARTIES: Gamers don't buy boxes, they buy games. If you make 'em, they will come. I bought my Xbox for Halo 2; all my friends bought their Xbox 360s for Gears of War, while I bought mine for Blue Dragon and Mass Effect.

disap.ed said:
I don't know if they will use exactly this kind of card as a base, but I am sure as hell that it will have something between 320 and 480 shader units @ 500-600MHz (so between 320 and 580 GFLOPS).

You also have to keep in mind that most of Flipper's/Hollywood's die size was eaten by the 1T-SRAM, IIRC.

I don't understand how you can "know" this for a fact?

Most rumors point to R770, which I believe doesn't go below 640 shader units; the HD4850, for example, has 800 shader units, hits 1000 GFLOPS, and is a chip @ 55nm. I think they are probably using a HIGHLY modified chip, but its performance target is probably between the HD4730 and HD4850.
 
z0m3le said:
I don't understand how you can "know" this for a fact?

Most rumors point to R770, which I believe doesn't go below 640 shader units; the HD4850, for example, has 800 shader units, hits 1000 GFLOPS, and is a chip @ 55nm. I think they are probably using a HIGHLY modified chip, but its performance target is probably between the HD4730 and HD4850.

I know shit; it is my guess based on the power draw of these cards. No card rated lower than 40-50 watts has more than 480 shader units (maybe there are mobile cards I haven't checked), so that is my conclusion. No more, no less.
 
disap.ed said:
I know shit; it is my guess based on the power draw of these cards. No card rated lower than 40-50 watts has more than 480 shader units (maybe there are mobile cards I haven't checked), so that is my conclusion. No more, no less.

Nice, I never made that comparison, but that info is also not going to lead you to a solid case. Desktop GPUs are built for PCs that don't worry very much about power requirements; below 100 watts, they don't care about power usage at all.

Mobile GPUs do in fact worry about power usage, and the HD4000M series has 800 shaders, though I don't know their TDP. Newer parts can pack far more shaders into a 50W TDP: the HD6900M series, for instance, has a chip at 50W TDP with 960 shaders, and the 5800M series goes up to 800 shaders inside 50 watts, with some 800-shader parts as low as 35 watts.

Wii U will use a modified GPU that will most likely cut some corners for power consumption. AMD did the same thing for the 360, and Nintendo has a group working with AMD to hit exactly where Nintendo is targeting its power envelope.
 
So I just had a look at some newer notebook chips, and there is in fact one card that would be quite nice, the Mobility Radeon 6830M:

800 Shader units / 40 texture units / 16 ROPS @ 575MHz => 24W ... Wow!

http://www.notebookcheck.net/AMD-Radeon-HD-6830M.43739.0.html

The 6870M is only clocked higher (@ 675MHz) and already uses 50W, so the clock rate will very likely be under 600MHz (maybe even lower than the XB360's 500MHz; Hollywood's clock x2 would be 486MHz).

So I have to apologize for my former post; it seems it would be quite possible to have 640-800 shader units. But we still have to keep in mind that we are speaking about Nintendo, so cost is always a factor, and they most probably will again use embedded 1T-SRAM (if they want to implement the Wii's 27MB of 1T-SRAM, that will also be around 200 million transistors, I guess).
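As a sanity check on that transistor guess: 1T-SRAM uses one transistor per bit cell, so the raw array for 27MB lands close to the figure quoted. A rough sketch that ignores sense amps, refresh logic, and redundancy:

```python
# Rough check of the "around 200 million transistors" guess for 27MB
# of embedded 1T-SRAM: one transistor per bit cell, overhead ignored.
bits = 27 * 1024 * 1024 * 8
print(f"~{bits / 1e6:.0f}M transistors for the raw cell array")  # ~226M
```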
 
disap.ed said:
So I have to apologize for my former post; it seems it would be quite possible to have 640-800 shader units. But we still have to keep in mind that we are speaking about Nintendo, so cost is always a factor. And we have to keep embedded RAM in mind.

High-power laptops sound worse than a launch 360 and cost like 3 Wii Us with little profit margin. Very small fans in those cases.
 
disap.ed said:
So I just had a look at some newer notebook chips, and there is in fact one card that would be quite nice, the Mobility Radeon 6830M:

800 Shader units / 40 texture units / 16 ROPS @ 575MHz => 24W ... Wow!

These are 1 billion transistors @ 40nm.

The 6870M is only clocked higher (@ 675MHz) and already uses 50W, so the clock rate will very likely be under 600MHz.

So I have to apologize for my former post; it seems it would be quite possible to have 640-800 shader units. But we still have to keep in mind that we are speaking about Nintendo, so cost is always a factor. And we have to keep embedded RAM in mind.

No worries about apologizing; we are all getting a little nuts waiting for more info. That's why I have committed to only talking about Wii U in this one thread.

I'm thinking that inside an APU they can make it a pretty big chip and cool it well. I wonder if you could share the embedded RAM between the GPU and CPU in this case. Is that possible? (I know the 360 Slim doesn't do this, but that was more a compatibility issue, right?) Or would it be better to go with a Llano-type setup, where the CPU uses its L2 and L3 cache and the GPU uses its own?

If they can share a large L3 cache, say 32MB, then wouldn't that reduce size and TDP?
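For a sense of scale on that question, here is a back-of-envelope comparison of conventional 6T SRAM cells against MoSys-style 1T-SRAM, ignoring tags and control logic. It suggests a 32MB pool is only plausible if it is built from something denser than standard SRAM:

```python
# Back-of-envelope on why a big shared L3 is expensive in die area:
# conventional SRAM needs 6 transistors per bit, 1T-SRAM needs 1
# (tags and control logic ignored in both cases).
BITS_PER_MB = 1024 * 1024 * 8

def cache_transistors(size_mb, transistors_per_bit):
    return size_mb * BITS_PER_MB * transistors_per_bit

print(f"32MB as 6T SRAM: {cache_transistors(32, 6) / 1e9:.2f}B transistors")  # ~1.61B
print(f"32MB as 1T-SRAM: {cache_transistors(32, 1) / 1e6:.0f}M transistors")  # ~268M
```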
 
Luckyman said:
High-power laptops sound worse than a launch 360 and cost like 3 Wii Us with little profit margin. Very small fans in those cases.

But we are talking about low-power GPUs, not high-power laptops!
 
The wattage/temperature numbers for R770 cards don't matter since they'll be using a smaller process than 55nm for the GPU. They'll be able to cut those numbers down considerably, especially if they can do a 32nm process.
 
disap.ed said:
So I just had a look at some newer notebook chips, and there is in fact one card that would be quite nice, the Mobility Radeon 6830M:

800 Shader units / 40 texture units / 16 ROPS @ 575MHz => 24W ... Wow!

http://www.notebookcheck.net/AMD-Radeon-HD-6830M.43739.0.html

The 6870M is only clocked higher (@ 675MHz) and already uses 50W, so the clock rate will very likely be under 600MHz (maybe even lower than the XB360's 500MHz; Hollywood's clock x2 would be 486MHz).

So I have to apologize for my former post; it seems it would be quite possible to have 640-800 shader units. But we still have to keep in mind that we are speaking about Nintendo, so cost is always a factor, and they most probably will again use embedded 1T-SRAM (if they want to implement the Wii's 27MB of 1T-SRAM, that will also be around 200 million transistors, I guess).
You have nothing to apologize for, since there's a major flaw in considering notebook chips. The 6970M is basically an underclocked desktop 6850. The 6850 costs about $160; the 6970M, however, costs over $400. A big part of this is the fact that mobile parts are usually much more heavily binned than desktop parts, and heavier binning means lower yields and higher prices. So there's pretty much zero chance that Nintendo will use a notebook chip.

That said, there's also a flaw in the logic involving the desktop cards. That flaw, funnily enough, is that they are cards. The TDP and measured power consumption apply not just to the GPU but to everything else on the card, most notably the memory. To make this even worse, TDP stands for "Thermal Design Power," so it usually relates more to heat than to power consumption, and it's pretty much always inflated, because no two chips are ever exactly the same (which is of course why binning exists). In other words, this line of reasoning really doesn't get anywhere in the end.

Long story short: PC tech is fucking confusing.
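To make the card-versus-chip distinction concrete, here is a sketch of backing a GPU-only estimate out of a board TDP. Every number in it is an assumption for illustration, not a measured value:

```python
# Illustration of the "TDP covers the whole card, not just the GPU" point.
# All numbers below are assumptions for the sake of the example.
board_tdp_w = 50        # rated board TDP of a hypothetical mobile part
gddr5_chips = 8         # assumed memory configuration
watts_per_chip = 2.0    # assumed per-chip GDDR5 draw
misc_losses_w = 4       # assumed VRM/fan/board overhead

gpu_only_w = board_tdp_w - gddr5_chips * watts_per_chip - misc_losses_w
print(f"GPU silicon alone: ~{gpu_only_w:.0f}W of the {board_tdp_w}W board TDP")
```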
 
BurntPork said:
You have nothing to apologize for, since there's a major flaw in considering notebook chips. The 6970M is basically an underclocked desktop 6850. The 6850 costs about $160; the 6970M, however, costs over $400. A big part of this is the fact that mobile parts are usually much more heavily binned than desktop parts, and heavier binning means lower yields and higher prices. So there's pretty much zero chance that Nintendo will use a notebook chip.

That said, there's also a flaw in the logic involving the desktop cards. That flaw, funnily enough, is that they are cards. The TDP and measured power consumption apply not just to the GPU but to everything else on the card, most notably the memory. To make this even worse, TDP stands for "Thermal Design Power," so it usually relates more to heat than to power consumption, and it's pretty much always inflated, because no two chips are ever exactly the same (which is of course why binning exists). In other words, this line of reasoning really doesn't get anywhere in the end.

Long story short: PC tech is fucking confusing.

I believe his point was that you couldn't fit a GPU with more than 480 shaders inside a 50W TDP, and he has since found that AMD produces a GPU with 960 shaders @ 50W TDP.

Again, Wii U won't be using an off-the-shelf part; it will be a custom part built around the R700-series GPUs. It is completely possible that they are working on getting the most power under a certain wattage with this card, and the price of any GPU drops by up to half when buying in large bulk... For reference, the Wii's GPU cost about $40 apiece in bulk. Wii U's could in fact be just as pricey as a 6970M, though I doubt it would be based on a mobile part; it might borrow some power-saving design choices, though.
 
z0m3le said:
http://wii.ign.com/articles/117/1178879p1.html

Interesting, Ninja Gaiden Wii U is 30% complete:

"'We're looking forward to a merging of Ninja Gaiden 3 gameplay and visuals with Dragon Sword touch commands,' Garza told GiantBomb, noting that because development was early, anything could change. Garza also noted that while Dragon Sword wasn't particularly violent on the DS, potential Wii U owners had nothing to worry about. Prepare for a blood bath."

I'm adding this to the OP. There seemed to be a lot of praise for that control method, so my interest has been piqued.

z0m3le said:
I don't understand how you can "know" this for a fact?

Most rumors point to R770, which I believe doesn't go below 640 shader units; the HD4850, for example, has 800 shader units, hits 1000 GFLOPS, and is a chip @ 55nm. I think they are probably using a HIGHLY modified chip, but its performance target is probably between the HD4730 and HD4850.

According to the rumored specs given to wsippel, it sounds like the current dev kits have an underclocked 4830.
 
z0m3le said:
I believe his point was that you couldn't fit a GPU with more than 480 shaders inside a 50W TDP, and he has since found that AMD produces a GPU with 960 shaders @ 50W TDP.

Again, Wii U won't be using an off-the-shelf part; it will be a custom part built around the R700-series GPUs. It is completely possible that they are working on getting the most power under a certain wattage with this card, and the price of any GPU drops by up to half when buying in large bulk... For reference, the Wii's GPU cost about $40 apiece in bulk. Wii U's could in fact be just as pricey as a 6970M, though I doubt it would be based on a mobile part; it might borrow some power-saving design choices, though.
Price is only a small part of the issue. The real concern is yields. Yields are also why 28nm isn't even being brought up anymore, despite the fact that a 28nm 4870 would work out very well.
 
BurntPork said:
Price is only a small part of the issue. The real concern is yields. Yields are also why 28nm isn't even being brought up anymore, despite the fact that a 28nm 4870 would work out very well.

What does the 28nm process have to do with a 40nm 6970m chip?

Also, just a quick note: the E-350 is on a 32nm process, and Llano, which releases any day now, is a 28nm chip. I guess I really just don't understand what you are trying to say.
 
From EDGE magazine site:

EDGE: Did you make any assumptions about Nintendo’s new hardware before Wii U was announced? Did it tally?
Mark Rein: Oh, wow. A really sticky situation. We were in the enviable position of not having to make assumptions, let’s just say that. I’m pretty impressed with the Wii U. It looks like a great device and I think it’ll do really well.
 
Hi guys, I don't know if this has been asked and/or answered yet, but is there any news about transferring your WiiWare/Virtual Console games to the Wii U?
 
antonz said:
I doubt patching will be an issue that Nintendo actively limits. High Voltage Software developed a whole patching system for the Wii, and Nintendo doesn't try to put up roadblocks.

I can understand the frustration of being told 4MB is all you can work with on a patch, though.
MS does NOT have that 4MB limit anymore. It's been gone for at least 8 months now. With that in mind, I wouldn't really believe anything Gustav Halling has to say.
 
z0m3le said:
What does the 28nm process have to do with a 40nm 6970m chip?

Also, just a quick note: the E-350 is on a 32nm process, and Llano, which releases any day now, is a 28nm chip. I guess I really just don't understand what you are trying to say.

All Llano chips are 32nm.
 
Maxrunner said:
From EDGE magazine site:

EDGE: Did you make any assumptions about Nintendo’s new hardware before Wii U was announced? Did it tally?
Mark Rein: Oh, wow. A really sticky situation. We were in the enviable position of not having to make assumptions, let’s just say that. I’m pretty impressed with the Wii U. It looks like a great device and I think it’ll do really well.
I still feel like I'm in bizarro world.
 
Maxrunner said:
From EDGE magazine site:

EDGE: Did you make any assumptions about Nintendo’s new hardware before Wii U was announced? Did it tally?
Mark Rein: Oh, wow. A really sticky situation. We were in the enviable position of not having to make assumptions, let’s just say that. I’m pretty impressed with the Wii U. It looks like a great device and I think it’ll do really well.
Hey, thanks. A pretty encouraging quote from Mark Rein, and it confirms that they had a dev kit early. Hopefully this means Epic is working on something for the Wii U. Do you have a link for this quote, though?
 
McHuj said:
All Llano chips are 32nm.

Thanks for the correction.


I love that Mark Rein is so into this new box. I think he realizes that PS360 will be around for a few more years, and figures that Wii U will be able to show off new features of his engine.
 
Bulzeeb said:
Hi guys, I don't know if this has been asked and/or answered yet, but is there any news about transferring your WiiWare/Virtual Console games to the Wii U?

As I understand it, the DRM for the Wii is much more archaic than the DSi's, so maybe we won't get any transfer at all, just a voucher for new downloads on Wii U. (It wouldn't surprise me if we did just get a voucher based on our Club Nintendo history.)
 
artwalknoon said:
Hey, thanks. A pretty encouraging quote from Mark Rein, and it confirms that they had a dev kit early.

I doubt they had any input, but I would have loved for them to have had some say in how the Wii U hardware came about.

I think they were pretty instrumental in the Xbox 360's success with their push for 512MB vs 256MB of RAM.
 
McHuj said:
I doubt they had any input, but I would have loved for them to have had some say in how the Wii U hardware came about.

I think they were pretty instrumental in the Xbox 360's success with their push for 512MB vs 256MB of RAM.

Both the 3DS and the GC had third-party input on their hardware... why would you doubt that Epic, probably the largest middleware provider in the industry today, had some level of input into the development of Nintendo's new system?
 
McHuj said:
I doubt they had any input, but I would have loved for them to have had some say in how the Wii U hardware came about.

I think they were pretty instrumental in the Xbox 360's success with their push for 512MB vs 256MB of RAM.
Yeah, given their past relationship with Nintendo, I don't think they have much influence. Capcom on the other hand...

PS: MT Framework >>>> UE3.
 
z0m3le said:
What does the 28nm process have to do with a 40nm 6970m chip?

Also, just a quick note: the E-350 is on a 32nm process, and Llano, which releases any day now, is a 28nm chip. I guess I really just don't understand what you are trying to say.
It has nothing to do with that. I'm just saying that yields are the reason Nintendo isn't using 28nm for the GPU, despite the fact that TSMC has already started producing 28nm chips and it would be a HUGE benefit.

Maxrunner said:
From EDGE magazine site:

EDGE: Did you make any assumptions about Nintendo’s new hardware before Wii U was announced? Did it tally?
Mark Rein: Oh, wow. A really sticky situation. We were in the enviable position of not having to make assumptions, let’s just say that. I’m pretty impressed with the Wii U. It looks like a great device and I think it’ll do really well.
I don't get why Rein is so excited, while that guy from DICE seems to pretty much hate it. Perhaps it's simply that DICE wants to stay PC-only and push graphics to the ultimate limit, but EA won't let them?
 
BurntPork said:
It has nothing to do with that. I'm just saying that yields are the reason Nintendo isn't using 28nm for the GPU, despite the fact that TSMC has already started producing 28nm chips and it would be a HUGE benefit.

COULD (could, mind you) be the reason the hardware isn't ready yet. This console releases next year, and 28nm is coming with the HD7000 series this year, so it will be produced in mass quantities by the end of this year.

BurntPork said:
I don't get why Rein is so excited, while that guy from DICE seems to pretty much hate it. Perhaps it's simply that DICE wants to stay PC-only and push graphics to the ultimate limit, but EA won't let them?

The DICE guy just wants more RAM. He is a PC dev, and BF3 was built from the ground up for PC and is being ported to the consoles. He still says that Wii U will be the most powerful console BY FAR (his capitals, not mine).

I think Wii U has over 1GB of memory, but possibly less than 2GB. If it were 2GB, they wouldn't have any need to worry about the PS360's successors, but Nintendo likes to limit devs... I have my fingers crossed, but if we get 1.5GB I will be happy.
 
I think it might simply be because the Wii U falls drastically short of cutting-edge PCs. That's usually not the case for new consoles, outside of the original Wii, correct? That's likely what devs like DICE want in the next gen of consoles (though it's highly unlikely they'll get it).
 
z0m3le said:
COULD (could, mind you) be the reason the hardware isn't ready yet. This console releases next year, and 28nm is coming with the HD7000 series this year, so it will be produced in mass quantities by the end of this year.
I doubt it. Chances are that they'd still have a lot of trouble making enough, and it would end up being more expensive anyway.

z0m3le said:
The DICE guy just wants more RAM. He is a PC dev, and BF3 was built from the ground up for PC and is being ported to the consoles. He still says that Wii U will be the most powerful console BY FAR (his capitals, not mine).

I think Wii U has over 1GB of memory, but possibly less than 2GB. If it were 2GB, they wouldn't have any need to worry about the PS360's successors, but Nintendo likes to limit devs... I have my fingers crossed, but if we get 1.5GB I will be happy.
Possible. Actually, after that Crytek guy wanting 8GB, I think he may even be expecting 4. :p

guek said:
I think it might simply be because the Wii U falls drastically short of cutting-edge PCs. That's usually not the case for new consoles, outside of the original Wii, correct? That's likely what devs like DICE want in the next gen of consoles (though it's highly unlikely they'll get it).
Again, possible, but could he really have been hoping for a high-end card? I'm sure that devs have some common sense.

It makes even less sense considering that Epic said that they wouldn't support next-gen systems if they can't run Samaritan. I am so confused.
 
guek said:
I think it might simply be because the Wii U falls drastically short of cutting-edge PCs. That's usually not the case for new consoles, outside of the original Wii, correct? That's likely what devs like DICE want in the next gen of consoles (though it's highly unlikely they'll get it).

Yes, I agree with this post. Most people expect it, but they don't realize the power draw of current top-end GPUs. The fastest in the world right now is the 580GTX, but that is a 250-watt card, and even at 20nm (which won't happen till late 2014 or 2015) you'll only cut that power down by 40%. Even Nvidia said that something like Epic's newest showcase won't be possible on a single GPU for a few years.

Closed platforms have their advantage, but that advantage quickly goes away as tech pushes forward. For instance, by 2007, GPUs on Windows platforms could outperform the consoles released in 2005 and 2006.
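On the process-shrink point: dynamic power scales roughly as C x V^2 x f, and supply voltage barely drops between nodes anymore, which is why a full shrink only buys something like the 40% quoted above rather than a halving. A sketch with illustrative scaling factors, not foundry data:

```python
# Why a node shrink doesn't halve power: dynamic power ~ C * V^2 * f,
# and supply voltage barely scales down between nodes anymore.
# The factors below are illustrative assumptions, not foundry numbers.
def relative_power(cap_scale, volt_scale, freq_scale=1.0):
    return cap_scale * volt_scale**2 * freq_scale

# e.g. a shrink where capacitance drops 30% and voltage only 7%,
# at the same clock:
print(f"{relative_power(0.7, 0.93):.2f}x original power")  # ~0.61x, ~40% savings
```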
 
Sorry, I'm sometimes dense and try not to use too much inferred information when parsing posts. I read what is there and try to make minimal assumptions, because those can be false. As I said before, I'm not necessarily arguing the points, but much more the presentation.

bgassassin said:
Well when you take it out of context you lose what I was saying. I explained it right after that.
Sorry, I fail to see how the following sentences support the task being "hard". If you're saying you have to do something, as opposed to nothing, to get it working, then I wouldn't call that "hard".

Also, particularly when it comes to assets, I can say for a fact that a gargantuan amount of work goes into building content pipelines. These depend on two things: 1. the producer and 2. the consumer. In our case, 1. the producer is a varied set of tools like Maya for models/animations, audio composition software, and what have you. These software suites have much longer-lived development cycles and typically change incrementally. 2. The consumer is the game engine, which needs to be able to understand the assets and process them properly. Although the refresh cycle of these engines might be more dramatic than that of the producer software, what assets are and how they are described usually doesn't change. Also, no matter what your rendering special effects are, a mesh is a mesh, so to say, and its description is fairly stable.

90% of a game (if not more) is assets, and optimizing asset management, making sure you get the most out of your artists, is important. All of this supports the following proposition: developers are likely to separate how an asset is managed from other components of the engine, such as the rendering engine. And that in turn supports this: they are likely to reuse their asset pipeline without loss of functionality or "power". So finally, I would rather say that the task of using "old assets" with a "new engine" is likely not "hard".
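A minimal sketch of that producer/consumer split, with purely illustrative names (nothing here is taken from a real engine): the pipeline produces meshes, the renderer consumes them, and only the renderer needs rewriting for new hardware.

```python
# Sketch of the decoupling described above: the asset pipeline knows
# nothing about the renderer, so a new rendering engine can reuse it
# wholesale. All names are illustrative.
class Mesh:
    """A mesh is a mesh, regardless of which engine eventually draws it."""
    def __init__(self, vertices, indices):
        self.vertices = vertices
        self.indices = indices

class AssetPipeline:
    """Consumes exports from tools like Maya; stable across engine versions."""
    def load_mesh(self, path):
        # Real parsing would live here; the output stays the same shape.
        return Mesh(vertices=[(0.0, 0.0, 0.0)], indices=[0])

class Renderer:
    """The part that actually changes between engine generations."""
    def draw(self, mesh):
        # Fixed-function or Shader Model 3+, the mesh input is identical.
        print(f"drawing {len(mesh.indices)} indices")

pipeline = AssetPipeline()   # reused from the previous engine
renderer = Renderer()        # rewritten for the new hardware
renderer.draw(pipeline.load_mesh("levels/example_level.mesh"))
```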


bgassassin said:
But that has nothing to do with the point I was making. Gears of War was made with UE3. Those two engines aren't capable of producing that level of visuals; therefore the visuals of GoW would be gimped by using the older engines. There's plenty of info out there to show the differences between the engine versions. It comes off like you're intentionally "playing dumb" to prove your point.
And you are also missing my point (most likely because I don't express it properly enough). I'm not saying the capabilities of UE2 vs UE3 aren't different, or that running a game on one vs the other is not going to "gimp" it. I was talking about how fundamentally different Engine6 is as opposed to Engine5 (using UE as an illustration; not that I'm going to spend half a day googling and contrasting their internals): say 90% of Engine6 uses the same code as Engine5, would you call it "new"? The most obvious changes from a user's point of view are the graphics, but that is only one component of the engine. The point is that you don't know what the capability of Nintendo's engine is, nor how much needs to change for it to completely exploit the capabilities of the new hardware.

bgassassin said:
Since the mistakes Nintendo made with the N64, they have made fundamental changes in certain things they do: making simpler games, making hardware that's easier to develop for, making hardware that's about efficiency and not all-out power, and, separate from that, using the same engine for multiple titles. Then there was continuing to use friend codes despite their broad unpopularity. And these are just off the top of my head.

Those are just a few of the things that lead to those conclusions.
Thanks, that's all I wanted to see: something to support the claims. I wouldn't say the evidence is convincing to me, but that is a different story. At least with these I can put your other comments into proper context.

bgassassin said:
Yes, I would agree with what you have said, but would you also agree that if the management system is unable to produce what the modified code wants, then the management system would have to be upgraded?
Yes.

bgassassin said:
There's enough information available, because that type of reasoning is based on observation of facts and draws a conclusion that is not 100% fact. You're forgetting the conclusion part that identifies it as inductive. So yes, there is enough available for the conclusions I draw. Speaking strictly from a TP perspective, the engine was at best created and at worst modified for the GameCube hardware, since that was what TP was originally going to be released on. Referring back to your example, Wii U should at worst use Shader Model 3.3. GameCube used Shader Model 0.0 because, as has been said here before, the GC's TEV had no programmable pixel and vertex shaders. That alone calls for a new build. There are a lot of limitations with that hardware, and an engine optimized for that extremely limited hardware can only go so far when considering the huge hardware leap. I'm sure even you would agree with that. Continuing with the UE comparison: why use UE2 when UE3 is designed for more current hardware? I'm expecting Nintendo to do the same with theirs.
Shit, I think how anal I am stems from having to deal with too many academic papers that try to pass BS by you.

1. (underlined above by itself) Your conclusions have to be convincing, though. That depends on the evidence you supply and the knowledge of your audience. It's what differentiates a weak induction from a strong one. IMO the evidence was not strong enough for the conclusion.

2. (bold above) A new build of what? The rendering engine? Sure, fine, I'll give you that. But for what I consider a game engine, that would make up maybe 20% of it. I wouldn't call rebuilding 20% constructing something new. Now again, remember, I'm not arguing that they won't build a new engine.

Something that has changed with the new hardware, and that is much more fundamental, is the level of parallelism exposed: multiple CPU threads and lots of GPU cores (depending on how much these are exposed to general compute/stuff that isn't synthesizing an image). Effectively making use of parallelism is a much, much bigger beast than the evolution of the shader model from 0 to 3. Why? 1. Because in the case of shaders, they still mostly fit into the concept of geometry transformation and fragment shading, which you have even with model 0. So you are not changing the fundamental operations, and the pipelining concept your engine might be based on could still be perfectly relevant: just update it a bit to add binding of the programmable parts. 2. The paradigm shift from mostly sequential execution to algorithms that must be conceived to expose more parallelism is currently a big issue in software development. It requires a different way of thinking about things, and producing such algorithms is much harder. Sorry, I don't have time to find specific resources on that (talks at AMD's Fusion 11 conference might be good, e.g. on C++ AMP, and you can always start with Wikipedia).
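To make that last point concrete, here is a toy sketch of the same per-vertex workload written sequentially and then as a parallel map. Python is just a stand-in for engine code; the point is that the parallel version is only valid because the work was restructured to be independent:

```python
# Toy illustration of the sequential -> parallel shift: identical results,
# but the parallel form only works because each item is independent.
from concurrent.futures import ProcessPoolExecutor

def transform_vertex(v):
    # Stand-in for per-vertex work (skinning, lighting, etc.)
    return v * 2 + 1

if __name__ == "__main__":
    vertices = list(range(100_000))

    # Sequential: one core, trivially correct.
    out_seq = [transform_vertex(v) for v in vertices]

    # Parallel: same result, spread across cores.
    with ProcessPoolExecutor() as pool:
        out_par = list(pool.map(transform_vertex, vertices, chunksize=5_000))

    assert out_par == out_seq
```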
 
BurntPork said:
I don't get why Rein is so excited, while that guy from DICE seems to pretty much hate it. Perhaps it's simply that DICE wants to stay PC-only and push graphics to the ultimate limit, but EA won't let them?

Mark Rein is all PR and has an engine to sell. Wii U runs that engine.
With the Samaritan demo, Mark Rein is also telling Sony/MS to push their hardware or Apple will kill them.
 
Maxrunner said:
EDGE: Did you make any assumptions about Nintendo’s new hardware before Wii U was announced? Did it tally?
Mark Rein: Oh, wow. A really sticky situation. We were in the enviable position of not having to make assumptions, let’s just say that. I’m pretty impressed with the Wii U. It looks like a great device and I think it’ll do really well.

Even my extreme skepticism is not enough to keep this comment from affecting me positively ;)
 
BurntPork said:
It makes even less sense considering that Epic said that they wouldn't support next-gen systems if they can't run Samaritan. I am so confused.
I guess we won't see Epic on consoles until next-next gen then.

But seriously, they'll optimize and remove features to make it run on next gen consoles. Simple as that.
 
Luckyman said:
Mark Rein is all PR and has an engine to sell. Wii U runs that engine.
With the Samaritan demo, Mark Rein is also telling Sony/MS to push their hardware or Apple will kill them.
Hm. That's true. However, it also reveals a sad truth: PR guys are the only ones saying truly good things about it. Actual devs who are working with the hardware aren't enthusiastic about it as a development platform at all.
 
Luckyman said:
Mark Rein is all PR and has an engine to sell. Wii U runs that engine.
With the Samaritan demo, Mark Rein is also telling Sony/MS to push their hardware or Apple will kill them.


Why do people say this? What is Apple doing that makes people think this?
 
BurntPork said:
Hm. That's true. However, it also reveals a sad truth: PR guys are the only ones saying truly good things about it. Actual devs who are working with the hardware aren't enthusiastic about it as a development platform at all.
Wait, who? That DICE guy who doesn't have a dev kit? [I'm basing that off of his Twitter account]
 
guek said:
again, burntpork pulls a fabricated negative statement out of his ass
I'll admit that I may have missed a few things, but I haven't seen much come from devs, and real excitement has been scarce.
 
BurntPork said:
What makes you think he doesn't have a dev kit?

There's also Ken Levine's hesitation, and the BG&E2 guy.
What hesitation? He is finishing BioShock Infinite, and it'll see release well before the Wii U does. He isn't hesitating; he just won't port the game over to the system.
 
BurntPork said:
Hm. That's true. However, it also reveals a sad truth: PR guys are the only ones saying truly good things about it. Actual devs who are working with the hardware aren't enthusiastic about it as a development platform at all.
Huh? Pretty sure no one has said that. Dev-wise, it should be pretty similar to working on the 360. Does it hurt that Nintendo is putting out a piece of hardware that isn't behind the curve?
 
NateDrake said:
What hesitation? He is finishing BioShock Infinite, and it'll see release well before the Wii U does. He isn't hesitating; he just won't port the game over to the system.
I'd even take it a step further. He isn't planning on porting it over to the Wii U.
 
NateDrake said:
What hesitation? He is finishing BioShock Infinite, and it'll see release well before the Wii U does. He isn't hesitating; he just won't port the game over to the system.
I read that he's taking a "wait and see" approach with it. He seems more excited about it as a gamer than as a developer.

Also, I'm not saying that devs don't want to work on it; they just don't seem to be as excited about it as you'd expect them to be with a new console. It seems more like "Hey! Another console to port stuff to!" than "Finally! A powerful new console that takes away some of the annoying limits of the current-gen!"
 
BurntPork said:
What makes you think he doesn't have a dev kit?

There's also Ken Levine's hesitation, and the BG&E2 guy.

ugh

During E3, CVG sat down with BioShock creator Ken Levine and asked him his opinion on the new console, and he responded with a rather direct answer; it sounds "pretty f***ing cool".

Gabe Newell said:
Wii U seems to be a lot more powerful than the previous generation

When we asked Bonstead if he thought it was possible that the Wii U version of Darksiders II would be the best version of the game, he said, "Yeah, just because the hardware is more powerful and it will have some extra features that I think will actually be useful to people playing the game. With its controller, [the Wii U version of Darksiders II] might be the best version of the game."

Sega's initial, "very early doors" reaction to Wii U is that "we're finding it to be quite powerful".

Whether that means more powerful than PS3 and 360 - the billion dollar question - Dunn wouldn't specify. "It's too early to call," he said. "It's different."

The signs, however, are encouraging. Dunn said initial experimentation revealed the Wii U to be "a good platform to develop for".

Really, you need to stop with the negative nancy bullshit, because there are plenty of positive reactions. Yes, there are less positive and more tepid reactions as well, but consider this:

Peter Moore said:
With the Wii U, I don't know a huge amount more than you already know. Our developers know a lot more than me because they're working on the actual hardware. The excitement once again is they are redefining the way that we interact with our games. That is the raw excitement.

Peter fucking Moore didn't know very well how powerful the thing was at E3. Coupled with the fact that the hardware is not yet finalized, and considering how much actual information some devs are privy to when making comments, positive statements are a lot more telling than negative ones, because if anything, the hardware is not going to get worse as new dev kits are shipped out.
 
I fell into the trap, damn it! There has been mixed feedback on the "power" of the console, so I'll just put my expectations back at rock bottom... hey, then I can only be positively surprised. (I also do this for any movie I go to watch these days :( )
 