Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

theBishop said:
Just like Gamecube wasn't that different from N64 which wasn't that different from SNES which wasn't that different from NES.

I really don't understand this "End of History" mentality with certain people on GAF. As if technological progress has just stopped because the hive collectively decided that what we have today is "Good Enough". It's never been true in the past, and there's no reason to think it's true now.

In 5 years, PS3 and 360 games will look like a jaggy mess just like PS2 games do now.

LOL, wasn't the Samaritan demo based off Unreal 3?
 
Luckyman said:
UE will continue to develop all the time. They will slap UE4 on it sometime

I'd have no problem if that's what happened. But Tim Sweeney has made several statements about the general direction of rendering, and if that's what he's working on, it's significantly different from UE3.
 
theBishop said:
Just like Gamecube wasn't that different from N64 which wasn't that different from SNES which wasn't that different from NES.

I really don't understand this "End of History" mentality with certain people on GAF. As if technological progress has just stopped because the hive collectively decided that what we have today is "Good Enough". It's never been true in the past, and there's no reason to think it's true now.

In 5 years, PS3 and 360 games will look like a jaggy mess just like PS2 games do now.
It's not about what's "good enough". As long as the systems are basically identical, feature and architecture wise (and it looks very likely that this will be the case), there's no reason to reinvent the wheel. Gamecube and N64, on the other hand, were radically different designs, with different featuresets and a completely different architecture.
 
DaSorcerer7 said:
LOL, wasn't the Samaritan demo based off Unreal 3?

Epic didn't ship their first UE3 game until 2006. First-generation games on a new platform often use last-gen technology. That Samaritan looks as good as it does should indicate how much better graphics are going to get from there.
 
I feel like a broken record, but here it goes again...

As long as Wii U has Shader Model 4.1 support, I don't see what would be stopping it from getting PS4/720 multi-console ports. Even if it isn't tremendously popular, it still amounts to free money for publishers. Sure, they may need to scale back the graphics a bit and perhaps limit some physics, but that's SO much better than the Wii's situation.

EDIT: The Samaritan demo is the Killzone demo of this next gen. Sure, some future graphics chip could render something close to that, but games aren't going to become short films all of a sudden. There's only so much a publisher is going to spend on realism before it becomes pointless.
 
wsippel said:
It's not about what's "good enough". As long as the systems are basically identical, feature and architecture wise (and it looks very likely that this will be the case), there's no reason to reinvent the wheel. Gamecube and N64, on the other hand, were radically different designs, with different featuresets and a completely different architecture.

In terms of architecture, Xbox360 isn't "radically" different from Xbox1. Yes, they went from a Celeron processor to a multicore PowerPC, but it's nothing too weird from a coding perspective.

You're eager to discount the impact of raw processing power, and I don't understand why.
 
theBishop said:
Epic didn't ship their first UE3 game until 2006. First-generation games on a new platform often use last-gen technology. That Samaritan looks as good as it does should indicate how much better graphics are going to get from there.

Exactly. It also demonstrates that calling for an Unreal 4 engine just because you think the Xbox 720 or PS4 will be somehow light-years ahead of the Wii U in tech is... premature.
 
lherre said:
I mean that:

- first 2 years: X360/PS3 ports (easy money because porting is cheap)
- PS4/Xbox Next arrive: some ports, but less frequent each time; third-party support goes to the new machines because the power gap with the Wii U is too big, so ports become too expensive, with too many compromises needed to downport games (the Wii situation)

That's my personal bet. I could be mistaken, of course.

I'd agree with this post.
The most likely outcome is that it uses current engines while looking like the PC equivalents on a console. Meaning, because of the supposedly higher graphical fidelity, the Wii U would be a more favourable option for companies to publish on than the PC.

The Wii U should aim to become the go-to console for the 'bleeding edge graphics version' of third-party games, with publishers like EA not having to worry about all those little issues most PC gamers complain about, e.g. DRM, dedicated servers, etc.

It has a short amount of time to capitalise on its advantages before the new consoles are pushed out, though. So following a PlayStation/PlayStation 2 style of market capture in the opening of the next round of consoles would be the best model for Nintendo.

If things keep going like they have been, with long periods without games and shitty marketing, then Nintendo will get what's been coming to them.
 
UE4 has of course been in the works for a long time... but Epic "only" sells UE3 right now... Going from UEx to UEx+1 isn't free, so do you think Epic is interested in advertising UE4?
 
theBishop said:
Epic didn't ship their first UE3 game until 2006. First-generation games on a new platform often use last-gen technology. That Samaritan looks as good as it does should indicate how much better graphics are going to get from there.
I'm pretty sure Epic said that they aren't developing a new engine from scratch. When the next-generation starts they will be offering UE4, but in reality UE4 will still be based on UE3. They are just changing the name for marketing reasons.

Also, I'm pretty sure the consensus is that the raw power of the next generation of consoles will be low compared to current 2011 PC hardware, for various technical reasons.
 
lherre said:
UE4 has of course been in the works for a long time... but Epic "only" sells UE3 right now... Going from UEx to UEx+1 isn't free, so do you think Epic is interested in advertising UE4?
I think Epic said they'd roll the Samaritan stuff and some other improvements together and that would be UE4, so I don't think UE4 will be a massive jump like UE2 to UE3 was.
 
lherre said:
I mean that:

- first 2 years: X360/PS3 ports (easy money because porting is cheap)
- PS4/Xbox Next arrive: some ports, but less frequent each time; third-party support goes to the new machines because the power gap with the Wii U is too big, so ports become too expensive, with too many compromises needed to downport games (the Wii situation)

That's my personal bet. I could be mistaken, of course.
Just downgrading settings and resolution shouldn't be all that expensive.
 
Mr_Brit said:
I think Epic said they'd roll the Samaritan stuff and some other improvements together and that would be UE4, so I don't think UE4 will be a massive jump like UE2 to UE3 was.

Remember the so-called "UE2.5" too, very similar to UE3. Obviously all the UE3 tech will be there (it would be silly to start again).
 
theBishop said:
In terms of architecture, Xbox360 isn't "radically" different from Xbox1. Yes, they went from a Celeron processor to a multicore PowerPC, but it's nothing too weird from a coding perspective.

You're eager to discount the impact of raw processing power, and I don't understand why.
Nonsense.
 
theBishop said:
In terms of architecture, Xbox360 isn't "radically" different from Xbox1. Yes, they went from a Celeron processor to a multicore PowerPC, but it's nothing too weird from a coding perspective.

You're eager to discount the impact of raw processing power, and I don't understand why.
Xbox used a single-core, single-threaded, out-of-order 32-bit x86 processor; the 360 uses a triple-core, dual-threaded, in-order 64-bit PPC. That's different enough to warrant a new engine. And that has little to do with overall performance; it's a different concept that absolutely requires a new approach to engine design. The next generation of consoles, however, will be similar enough to run the exact same engines with different assets. We're only talking about an expected performance difference of somewhere between 200 and 500%. That's not nearly enough to warrant completely different, incompatible middleware.
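
To illustrate the kind of restructuring wsippel is describing - a minimal sketch, assuming a generic job-queue model rather than any real engine's API (all function names here are hypothetical stand-ins) - a single-core engine can get away with one sequential update loop, while the 360's three in-order cores only pay off once frame work is decomposed into independent jobs:

```python
# Minimal sketch of the single-core vs. multi-core engine-loop split.
# All names here are illustrative, not from any actual engine.
from concurrent.futures import ThreadPoolExecutor

def update_physics(dt):   return f"physics ({dt:.4f}s)"
def update_ai(dt):        return f"AI ({dt:.4f}s)"
def update_animation(dt): return f"animation ({dt:.4f}s)"

def frame_single_core(dt):
    # Xbox1 style: one fast out-of-order core, one sequential loop.
    return [update_physics(dt), update_ai(dt), update_animation(dt)]

def frame_jobs(dt, pool):
    # 360 style: fan independent jobs out across cores. The hard part
    # isn't this code - it's restructuring the engine so the work is
    # actually independent in the first place.
    futures = [pool.submit(f, dt) for f in
               (update_physics, update_ai, update_animation)]
    return [f.result() for f in futures]

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=3) as pool:
        print(frame_single_core(1 / 60))
        print(frame_jobs(1 / 60, pool))
```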
 
Mr_Brit said:
Who says PS4/next xbox games won't run at 720p?
I know that's a possible issue, but hopefully that won't come into play until closer to the middle of the generation. If it happens at the start... Well, then, Nintendo might as well just give up on third-party support forever. :/
 
wsippel said:
Xbox used a single-core, single-threaded, out-of-order 32-bit x86 processor; the 360 uses a triple-core, dual-threaded, in-order 64-bit PPC. That's different enough to warrant a new engine. And that has little to do with overall performance; it's a different concept that absolutely requires a new approach to engine design. The next generation of consoles, however, will be similar enough to run the exact same engines with different assets. We're only talking about an expected performance difference of somewhere between 200 and 500%. That's not nearly enough to warrant completely different, incompatible middleware.
200%?!! Lol. If PS4/next Xbox are only twice as fast as their predecessors, they're dead on arrival. The Wii U should be at least twice as fast as them, never mind PS4/next Xbox.
 
Lonely1 said:
Nonsense.

We can argue about the meaning of "radical" if you like. All I'm really trying to say is that the Xbox 360 and Xbox 1 both have an essentially PC-like workflow. Unlike the PS2 or PS3, the processor in both Xboxes is a conventional design familiar to PC developers. Yes, there are significant differences between PowerPC and x86, but until you really get deep into optimization, most of those differences are invisible to the coder.

And more to the point, the difference between Xbox2 and Xbox3 will be at least as significant as the difference between Xbox1 and Xbox2.
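
One concrete example of the kind of difference that only surfaces in low-level code (a sketch; byte order is a real PPC-vs-x86 difference, but the snippet isn't tied to either console's actual toolchain): the 360's PPC is big-endian while x86 is little-endian, which gameplay code never notices but serialization code must handle.

```python
# The same 32-bit integer laid out in big-endian (PPC-style) vs
# little-endian (x86-style) byte order.
import struct

value = 0x12345678
big    = struct.pack(">I", value)   # how a big-endian PPC stores it
little = struct.pack("<I", value)   # how little-endian x86 stores it

print(big.hex())     # 12345678
print(little.hex())  # 78563412
# Invisible in ordinary game code; file formats, save data and network
# messages are where it bites.
```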
 
The_Endgamer said:
I'd agree with this post.
The most likely outcome is that it uses current engines while looking like the PC equivalents on a console. Meaning, because of the supposedly higher graphical fidelity, the Wii U would be a more favourable option for companies to publish on than the PC.

The Wii U should aim to become the go-to console for the 'bleeding edge graphics version' of third-party games, with publishers like EA not having to worry about all those little issues most PC gamers complain about, e.g. DRM, dedicated servers, etc.

It has a short amount of time to capitalise on its advantages before the new consoles are pushed out, though. So following a PlayStation/PlayStation 2 style of market capture in the opening of the next round of consoles would be the best model for Nintendo.

If things keep going like they have been, with long periods without games and shitty marketing, then Nintendo will get what's been coming to them.

Bleeding-edge graphics is something Sony and Nintendo have now lost to MS two generations straight. While I think Nintendo should permanently take the Cube approach to console hardware design, they see little benefit in hardware that is exploited by virtually no one in the third-party realm. Nintendo just needs to learn, on the hardware end, not to let either of these companies get a huge advantage, as it continually hurts them.
 
Mr_Brit said:
200%?!! Lol. If PS4/next Xbox are only twice as fast as their predecessors, they're dead on arrival. The Wii U should be at least twice as fast as them, never mind PS4/next Xbox.
Between Wii U and PS4/ Xbox3, not between PS4/ Xbox3 and their respective predecessors.
 
The thread picked up pace while I was writing this.

lherre said:
First, I want to know why people think about the R7XX family all the time. I'm not sure if it was press people or where this rumor comes from, but I don't think any dev said anything about an R7XX model or any particular model. I can't say too much about it (specific details), but in the docs there is a detail that people could interpret as pointing to an R7XX card, though it could be anything at all. Nothing in the documentation (as I said before) mentions any specific models, clock speeds, etc. So at this moment it's a bit premature to say one thing or another.
I'm sure my info comes from the second version of the devkits at least (third if we count internal alpha hardware at Nintendo's quarters that didn't go to licensees). I'll try to gather new info (I'm curious too XD).

Sorry to be so vague.

No need to apologize at all, because I fully understand your situation and tried to be as considerate as possible. I got the answer I was looking for, so I'm content now. At best, the R700 rumor was only based on one (possibly vague) piece of information, and it spread like wildfire. IMO, a variation of an R700 can be eliminated as being in the final kit. And hopefully those issues you mentioned will be rectified for you guys come the next kit. I'm sure it's like having a new toy that you can't play with too roughly or it will break, and that sounds like it would be frustrating and annoying.

With the lack of info you guys have, it sounds like Nintendo's trying to give themselves as much wiggle room as possible for hardware power tweaks.

wsippel said:
Someone told me that an early devkit supposedly used an off-the-shelf RV770LE. But that chip would certainly not be used in the final hardware either way.

That was the chip I speculated on earlier as well. I wonder what fixes Nintendo will be making to get beyond that.

Mr_Brit said:
I'm not talking about final specs but about the specs in the dev kits. Without knowing things like frequencies, number of ROPs, bandwidth, number of TMUs, latency, amount of RAM, bus width etc., it's very hard to develop for the console. In a sense it makes it similar to developing on PC, where you have no idea about the hardware being used and so have no definite performance target, which means the games end up looking awful or running badly.

Seems like they have been pushing it to the point of failure to see where the current cap is. That's all they would need to do right now anyway, if they aren't getting all the details.

BurntPork said:
Well, that basically confirms that it's closer to 2-3x the current gen. So the max is definitely 640 SPs. That said, it also means that it can't be a 500MHz RV740 like Mr. Brit thinks. It has to be something more.

Could also be the minimum.

EatChildren said:
A couple of years isn't long in the course of a generation, and Epic's vision is how I see it.

The main concern is whether Nintendo can iron out the hardware issues, such as the overheating problem. If they can't, they'll be downclocking that GPU.

It's definitely more realistic than what some around here believe. But part of me doesn't agree, because 2 years is roughly 1/3 of a generation, and that could be a considerable amount of time.
 
theBishop said:
Just like Gamecube wasn't that different from N64 which wasn't that different from SNES which wasn't that different from NES.

I really don't understand this "End of History" mentality with certain people on GAF. As if technological progress has just stopped because the hive collectively decided that what we have today is "Good Enough". It's never been true in the past, and there's no reason to think it's true now.

In 5 years, PS3 and 360 games will look like a jaggy mess just like PS2 games do now.

Except you're wrong, and that won't happen. If CG movies only make current HD twin games look worse, there's no way a new console will make them look like a jaggy mess. The fact is, we can look at movie CG technology to get an idea of where console graphics will ultimately end up, and at the moment, CG movies don't completely shit on the HD consoles like they did on the PS2/Xbox/GC.
 
bgassassin said:
That was the chip I speculated on earlier as well. I wonder what fixes Nintendo will be making to get beyond that.

The 770LE seems like a good target. Shrinking it to 40nm and trying to balance clock speed and TDP could account for some of the overheating.
 
theBishop said:
We can argue about the meaning of "radical" if you like. All I'm really trying to say is that the Xbox 360 and Xbox 1 both have an essentially PC-like workflow. Unlike the PS2 or PS3, the processor in both Xboxes is a conventional design familiar to PC developers. Yes, there are significant differences between PowerPC and x86, but until you really get deep into optimization, most of those differences are invisible to the coder.

And more to the point, the difference between Xbox2 and Xbox3 will be at least as significant as the difference between Xbox1 and Xbox2.

Nonsense.
The "PowerPC" inside the 360/Cell is very different from an actual PowerPC, to top it all off.

200%?!! Lol. If PS4/next Xbox are only twice as fast as their predecessors, they're dead on arrival. The Wii U should be at least twice as fast as them, never mind PS4/next Xbox.

And what if MS or Sony decide to go with Fusion Trinity (or a similar Bulldozer-based APU) as their base? (It would still be far faster than "2x more", but wouldn't be a massive distance from whatever the lower-specced Wii U ends up being.)

UE4 blah blah blah

Even if MS and Sony decide they're going to be morons and release $600 beast systems again (which still wouldn't run a game based on Samaritan, btw), Epic and other middleware developers won't be reinventing the wheel. What runs on these uber-expensive consoles will still run on WiiU. Whether they want to make the effort to run it on the WiiU is a completely different discussion, but it will be capable of supporting the middleware.
 
theBishop said:
Just like Gamecube wasn't that different from N64 which wasn't that different from SNES which wasn't that different from NES.

I really don't understand this "End of History" mentality with certain people on GAF. As if technological progress has just stopped because the hive collectively decided that what we have today is "Good Enough". It's never been true in the past, and there's no reason to think it's true now.
It's not about consoles being 'good enough to mark the end of history'; it's about market viability. These consoles have to take their technology from somewhere, and present it in a viable cost envelope for their own market.

Historically, some gens have been technologically revolutionary, while other gens were 'merely' evolutionary. SNES was an evolutionary advancement over NES, while N64 was a true paradigm shift from SNES. Gamecube was a big evolutionary step over N64, but only because the underlying technology was still very young.

This gen too was a big evolutionary step from the last (speaking of PS360), yet most of the difference in the eye of the semi-tech-literate (aka enthusiast) public came from the resolution jump. Heck, some software devs even used that to their advantage - e.g. Polyphony Digital reusing game assets from their last-gen installments.

What I, and it seems several other posters in this thread, believe is that next gen will be a strictly evolutionary advancement.

1. There won't be a resolution jump.
2. There won't be a GPU paradigm shift, outside of tessellation (which WiiU's GPU will most likely feature in one form or another).
3. There won't be a CPU paradigm shift - the multi-core jump was already made this gen, and game engines/middleware followed suit.

Aka, next gen will be a more-of-the-same gen. Whether it will feature quantitative advancements so big that they could bring along qualitative shifts is yet to be seen. I'm in the 'Don't expect Samaritan on next gen' camp, and from all I've heard from Epic, so are they.

In 5 years, PS3 and 360 games will look like a jaggy mess just like PS2 games do now.
See my PD remark in the first paragraph. Quite a few games from the (technological) last gen (i.e. incl. Wii) look quite nice when upscaled to current-tech-gen resolutions. And the tech gap between WiiU and ps720 won't be anywhere near that between, say, ps2 and ps3.
 
BurntPork said:
Ah, I knew I forgot something. The 360 has 24MB, right? So... 24-32MB? Yeah, that sounds right.


Already posted, but the 360 has just 10 MB of eDRAM as part of the "daughter die" connected to the main GPU. The connection between the eDRAM and the main GPU is 32 GB/sec, and it's 256 GB/sec between the eDRAM and the logic on the daughter die.
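
Some napkin math on why that 10 MB figure matters (assuming standard 32-bit color and 32-bit depth/stencil per pixel; the 360's actual predicated tiling scheme has more moving parts than this):

```python
# Does a 720p render target fit in the 360's 10 MB of eDRAM?
EDRAM_BYTES = 10 * 1024 * 1024
PIXELS_720P = 1280 * 720
BYTES_PER_SAMPLE = 4 + 4   # 32-bit color + 32-bit depth/stencil

for msaa in (1, 2, 4):
    size = PIXELS_720P * BYTES_PER_SAMPLE * msaa
    verdict = "fits" if size <= EDRAM_BYTES else "needs tiling"
    print(f"{msaa}x MSAA: {size / 2**20:.1f} MB -> {verdict}")
# 1x (~7.0 MB) fits; 2x (~14.1 MB) and 4x (~28.1 MB) exceed 10 MB,
# which is why 360 games that use MSAA render the frame in tiles.
```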
 
wsippel said:
Xbox used a single-core, single-threaded, out-of-order 32-bit x86 processor; the 360 uses a triple-core, dual-threaded, in-order 64-bit PPC. That's different enough to warrant a new engine. And that has little to do with overall performance; it's a different concept that absolutely requires a new approach to engine design. The next generation of consoles, however, will be similar enough to run the exact same engines with different assets. We're only talking about an expected performance difference of somewhere between 200 and 500%. That's not nearly enough to warrant completely different, incompatible middleware.

Interestingly, Epic's president did throw out some numbers.

"So I think that's the real challenge for us now, rather than worrying about the difference between a couple consoles and some order of magnitude, whether 3X or 4X. It's about how do we deal with iPhone 8... if you watch where the gamers are going that's where they are. Your iPhone 8 will probably plug into your TV, or better yet, wirelessly connect to your television set to give you that big screen gaming experience with good sound. So really, what's the point of those next-gen consoles? It's a very interesting situation to be looking at. That's what we're starting to think about more... not how do we scale from some Nintendo platform to some other future console," he concluded.

http://www.industrygamers.com/news/epic-president-whats-the-point-of-next-gen-consoles/

Does he mean 3x or 4x in an order of magnitude (which would be 30/40x), or just 3x/4x? The latter seems more reasonable and makes more sense.
 
3-4x the performance of current-gen consoles has been the range most of us with our heads in reality have been expecting, though I don't think what was said can be taken as anything definitive.
 
theBishop said:
Just like Gamecube wasn't that different from N64 which wasn't that different from SNES which wasn't that different from NES.

I really don't understand this "End of History" mentality with certain people on GAF. As if technological progress has just stopped because the hive collectively decided that what we have today is "Good Enough". It's never been true in the past, and there's no reason to think it's true now.

In 5 years, PS3 and 360 games will look like a jaggy mess just like PS2 games do now.


Give this man a cookie!

As far as I'm concerned, until "diminishing returns" is a proven fact, there's no reason to believe anything is different from the past.

In fact, personally I've been proclaiming all along that the "next gen" leap, if anything, will be bigger than ever. Two reasons: in past generations, high-end incremental PC development continued alongside the consoles, "spoiling" next-gen graphics before they got here. High-end PC development has slowed or stopped drastically this generation, though; most PC games are just console ports with a few bells and whistles added, if you're lucky. So we haven't had next-gen graphics spoiled for us yet, and the leap will be huge in a way it has never been in history, I believe.

The second reason is common sense: past gens were 5 or 6 years; this gen will be 7 or 8 years. A longer time between hardware means the jump will naturally be bigger just because of time.

Anyway, just look at Battlefield 3 PC footage. If next gen doesn't deliver anything beyond that (and I believe it will actually deliver far, far beyond that), it's pretty much a next-gen leap right there.

Or let's look at it another way: do people honestly believe a console with 4GB, or even 8GB, of RAM won't deliver games that are drastically better looking than today's consoles with 512MB? It's not even possible, even ignoring the CPU and GPU (which obviously isn't the case).

But the easiest way to tell is just look at this thread, and how the Nintendo fans are spoiling for Wii U to have some serious horsepower under the hood. If it didn't matter, they wouldn't care.
 
I'm really out of it from heat exhaustion, so what has Epic said regarding the Wii U GPU, as far as how many SPs are in the R7xx GPU? If anything. Or what kind of leap from the 360/PS3? I really can't read much of the last page or so of this thread. Thanks in advance for any clarification. Or maybe I was just reading someone's guess - it was about the 3-4x leap. Sorry guys, I'm really wiped out and can't read anything clearly ATM.
 
Does he mean 3x or 4x in an order of magnitude (which would be 30/40x), or just 3x/4x? The latter seems more reasonable and makes more sense.

He means 3-4X, but I've noticed in past interviews that Capps isn't the sharpest knife in the drawer, and he's a business strategy guy, not a technical guy. Don't put much stock in a specific number he throws out.

I would pretty much bet the next consoles besides the Wii U will be at least 10X more powerful. It would be hard for it to be otherwise. We're at 10X now, PC GPU vs console GPU (at least). But we still have 1-2 more doublings of GPU power due before the next consoles hit, probably no earlier than 2013. AMD Southern Islands should be ~2x current AMD, and it's due fall 2011, for example. Then there should be one more generation/doubling by 2013 (2012 will be a "refresh" year). So by 2013, high-end PC should be at 40X current consoles.

So yeah, if the next consoles use anything remotely up to date, 10X in 2013 would be a mid-range PC GPU at best. If anything, my bet is on a greater-than-10X leap.
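
His arithmetic, spelled out (taking the 10X starting gap and the doubling cadence as his assumptions, not established fact):

```python
# specialguy's projection: ~10x today, then 1-2 more doublings by 2013.
current_gap = 10   # claimed high-end PC GPU vs. console GPU, mid-2011

for doublings in (1, 2):
    print(f"{doublings} more doubling(s): "
          f"{current_gap * 2**doublings}X current consoles")
# 1 doubling -> 20X, 2 doublings -> 40X: the "40X by 2013" figure above.
```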
 
wsippel said:
UE3 will continue to be Epic's engine on all next gen systems. PS4 and Xbox3 won't be that radically different, just more of the same - no reason to develop a whole new engine.


Doubt that. In an interview, Rein said the Samaritan demo wasn't UE4, but if he had to put a number on it, it was UE 3.997 (or something very close to 4; I don't remember the exact decimal he used to make his point).

So I would expect something like the Samaritan demo, optimized = UE4 = targeted at next-gen Sony/MS.

Oh, and a little googling found this, where Rein specifically states UE4 = next-gen consoles:

http://www.vg247.com/2011/03/11/mar...me-a-full-game-ue4-when-next-consoles-arrive/
“What we’ve done in the past is that we draw the line with the console generations. I don’t want to pigeon-hole Unreal Engine as a console-only technology – it’s also mobile, handheld, and PC technology. But so far, that’s how it works. When these new ultra-mass market home entertainment devices come out and set a new bar, that’s when you’ll see us clicking over to ‘Unreal Engine 4’ – which will support those features and who knows what else.”
 
herzogzwei1989 said:
I'm really out of it from heat exhaustion, so what has Epic said regarding the Wii U GPU, as far as how many shader processors are in the R7xx GPU? If anything. Or what kind of leap from the 360/PS3? I really can't read much of the last page or so of this thread. Thanks in advance for any clarification.

Nothing too specific. Only that it's stronger than the HD twins.

http://www.industrygamers.com/news/epic-confirms-theyre-very-interested-in-wii-u/

"At the launch event at E3, some of the products that you saw running on Wii U were based on Unreal Engine technology. So that kind of gives you an idea of where we are in that space. You can certainly use our engine on that platform – it's a natural fit from a technology perspective," Capps added. "It opens up some doors that weren't open before on current generation consoles because it is going to be a powerful box. I'm sure [Epic VP] Mark Rein would love anyone who's interested to know how official our support is to get in touch with him!”
 
specialguy said:
Doubt that. In an interview, Rein said the Samaritan demo wasn't UE4, but if he had to put a number on it, it was UE 3.997 (or something very close to 4; I don't remember the exact decimal he used to make his point).

So I would expect something like the Samaritan demo, optimized = UE4 = targeted at next-gen Sony/MS.

Oh, and a little googling found this, where Rein specifically states UE4 = next-gen consoles:

http://www.vg247.com/2011/03/11/mar...me-a-full-game-ue4-when-next-consoles-arrive/
Samaritan is not possible on a single GPU at the kind of performance that was shown in the demo; Nvidia has said as much. For Samaritan to appear on a single GPU, it's gonna be watered down to hell. Nvidia stated it's gonna take a generation or two of graphics card advancement before we reach single-card GPUs pulling off the Samaritan demo without it being watered down.
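
For a sense of scale (hedged: the widely reported figure is that the demo ran on three GTX 580s, and these are theoretical peak numbers, which never translate one-to-one into real performance):

```python
# Rough scale of the Samaritan demo rig vs. the 360's GPU, on paper.
GTX580_GFLOPS = 1581                # theoretical peak of one GTX 580
SAMARITAN_RIG = 3 * GTX580_GFLOPS   # demo reportedly used three of them
XENOS_GFLOPS  = 240                 # Xbox 360 GPU theoretical peak

print(f"Samaritan rig: ~{SAMARITAN_RIG} GFLOPS")
print(f"That is ~{SAMARITAN_RIG / XENOS_GFLOPS:.0f}x Xenos")
# ~4.7 TFLOPS, roughly 20x the 360's GPU - far beyond a 3-4x console.
```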
 
Sony/MS are obviously gonna get some kind of super GPU that can run Samaritan at 1080p60 and run super cool and Nintendo will be DOOOOOOOOOOOOOOOOOOOOOOOMED!
 
snesfreak said:
Sony/MS are obviously gonna get some kind of super GPU that can run Samaritan at 1080p60 and run super cool and Nintendo will be DOOOOOOOOOOOOOOOOOOOOOOOMED!
They won't, no, but they are going to tell everyone that they did. Or at least MS will.
 
specialguy said:
Give this man a cookie!

As far as I'm concerned, until "diminishing returns" is a proven fact, there's no reason to believe anything is different from the past.

In fact, personally I've been proclaiming all along that the "next gen" leap, if anything, will be bigger than ever. Two reasons: in past generations, high-end incremental PC development continued alongside the consoles, "spoiling" next-gen graphics before they got here. High-end PC development has slowed or stopped drastically this generation, though; most PC games are just console ports with a few bells and whistles added, if you're lucky. So we haven't had next-gen graphics spoiled for us yet, and the leap will be huge in a way it has never been in history, I believe.

The second reason is common sense: past gens were 5 or 6 years; this gen will be 7 or 8 years. A longer time between hardware means the jump will naturally be bigger just because of time.

Anyway, just look at Battlefield 3 PC footage. If next gen doesn't deliver anything beyond that (and I believe it will actually deliver far, far beyond that), it's pretty much a next-gen leap right there.

Or let's look at it another way: do people honestly believe a console with 4GB, or even 8GB, of RAM won't deliver games that are drastically better looking than today's consoles with 512MB? It's not even possible, even ignoring the CPU and GPU (which obviously isn't the case).

But the easiest way to tell is just look at this thread, and how the Nintendo fans are spoiling for Wii U to have some serious horsepower under the hood. If it didn't matter, they wouldn't care.

Chip technology aside, there are more factors that Sony and Microsoft will have to consider for the next generation that will affect the performance of their consoles. These include heat, power consumption, and cost. PCs will still have the advantage of being able to be more expensive and to have bigger casing than consoles.
 
lwilliams3 said:
Chip technology aside, there are more factors that Sony and Microsoft will have to consider for the next generation that will affect the performance of their consoles. These include heat, power consumption, and cost. PCs will still have the advantage of being able to be more expensive and to have bigger casing than consoles.
Exactly, but this will be ignored as usual.
 
lwilliams3 said:
Chip technology aside, there are more factors that Sony and Microsoft will have to consider for the next generation that will affect the performance of their consoles. These include heat, power consumption, and cost. PCs will still have the advantage of being able to be more expensive and to have bigger casing than consoles.

The xbox1 came close though. lulz
 
All I want now is a 3-4x leap beyond the 360/PS3 in terms of GPU performance. Developers will be able to harness that extra power and do amazing things with it. We just wouldn't see that on the PC side of things, because PC games are not optimized to the same level as console games. That's why PC games running on 3-4x more powerful hardware won't look as good as a console with that much more power. Someone else can explain this better than I can. The Wii U badly needs more pixel fillrate than the 360/PS3, among other things. I'm hoping for the R7xx in the Wii U to have 16 ROPs instead of the 8 that the HD consoles have. Hopefully 600 or more SPs and 32-40 TMUs. If it also has plenty of eDRAM, things will only get that much faster, especially in a closed-box environment.
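
Napkin math on that wish list (the 600 MHz clock is a made-up number purely for illustration, and Xenos' 48 vec4+scalar ALUs are normalized to 240 scalar SPs here; none of this is a confirmed spec):

```python
# Theoretical throughput of the wished-for Wii U GPU vs. the 360's Xenos.
def pixel_fill_gpix(rops, mhz):
    return rops * mhz / 1000        # Gpixels/s

def shader_gflops(sps, mhz):
    return sps * 2 * mhz / 1000     # 2 flops (MAD) per SP per clock

xenos_fill = pixel_fill_gpix(8, 500)    # 4.0 Gpix/s
wish_fill  = pixel_fill_gpix(16, 600)   # 9.6 Gpix/s

xenos_alu = shader_gflops(240, 500)     # 240 GFLOPS
wish_alu  = shader_gflops(640, 600)     # 768 GFLOPS

print(f"fillrate: {wish_fill / xenos_fill:.1f}x Xenos")  # 2.4x
print(f"shaders:  {wish_alu / xenos_alu:.1f}x Xenos")    # 3.2x
```

On those (hypothetical) numbers, the wish list lands right in the 3-4x range people keep throwing around.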
 
Am I the only one who thinks next gen will push a 1080p standard? Or at least the "faux" 1080p that Sony likes to use? I'm sure these guys are planning their hardware around implementing things like MLAA/MXAA (since MSAA has some issues with deferred rendering engines), so why is it too much to ask for? It increases IQ by a lot, and with TVs only getting better, low-IQ games will stick out. Hell, let's not forget Sony is pushing 3D real hard. They won't just STOP pushing how many pixels their console can put out. Surely they will try a 1080p standard so they can assure a 720p 3D standard without any hassle.
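
The pixel budgets behind that argument, for reference (straight arithmetic, no hardware assumptions):

```python
# Pixel counts for the resolution targets being discussed.
P720  = 1280 * 720    #   921,600 px
P1080 = 1920 * 1080   # 2,073,600 px

print(f"1080p vs 720p: {P1080 / P720:.2f}x the pixels")      # 2.25x
print(f"720p stereo 3D: {2 * P720:,} px vs 1080p: {P1080:,} px")
# Two 720p eyes (~1.84M px) cost slightly less than one 1080p frame,
# so hardware that comfortably hits 1080p can also hit 720p in 3D.
```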
 
^ I would say stable frame rates at 1080p would be a great target for them to achieve. That to me would make a big difference. With Nintendo coming out before the others and making that their target, 1080p should become the standard for all.

Fourth Storm said:
The 770LE seems like a good target. Shrinking it to 40nm and trying to balance clock speed and TDP could account for some of the overheating.

I would now believe that it's not a target, and if it really was in the dev kit, it was merely a placeholder. lherre said there was only one thing in the info they had that could be compared to an R700, so I don't see that as the target anymore. I'm not expecting it to be something much greater, but it's something that apparently has been the key stumbling block in the development process.

OG_Original Gamer said:
What would it cost Nintendo to redesign the Wii U casing to accommodate heat dissipation?

I'm assuming it wouldn't be much.

It shouldn't since it's still early.

specialguy said:
But the easiest way to tell is just look at this thread, and how the Nintendo fans are spoiling for Wii U to have some serious horsepower under the hood. If it didn't matter, they wouldn't care.

That's not true at all. If fans were "spoiling" for more power, you would be seeing multiple complaints about the current design direction. The only thing that's been going on is guessing what might be in it. It's like someone putting a present in front of you and telling you that you can't open it; it's natural to then try to guess what's inside.
 
phosphor112 said:
Am I the only one who thinks next gen will push a 1080p standard? Or at least the "faux" 1080p that Sony likes to use? I'm sure these guys are planning their hardware around implementing things like MLAA/MXAA (since MSAA has some issues with deferred rendering engines), so why is it too much to ask for? It increases IQ by a lot, and with TVs only getting better, low-IQ games will stick out. Hell, let's not forget Sony is pushing 3D real hard. They won't just STOP pushing how many pixels their console can put out. Surely they will try a 1080p standard so they can assure a 720p 3D standard without any hassle.
That would make everyone happy. The Wii U can use a 720p standard and remain relevant, and the PS4 and XB3 become premium (but still affordable) consoles compared to the Wii U. The separation of low-end and high-end consoles, with both tiers supporting core games, in theory and combined with more bridge games from Nintendo, would make core gaming more accessible than ever before and should allow it to expand even faster.

However, that's precisely why 1080p won't be standard next gen. Sony and MS won't push for it at all, and will instead try to get devs to push ultra-high details at 720p so that games can't be ported down to the Wii U without becoming ugly.
 