Rumor: Wii U final specs

Ports of Xbox 3 or PS4 would fare better since they would be built using the same principles as the Wii U CPU/GPU, albeit those systems would have higher specs than Wii U. So basically the performance would be less on Wii U, with a few corners cut, but the core game should/would look very close to the other systems due to the Wii U's development being made with modern tech.
How on Earth does this make any sense? Would fare better how? Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best, and you think that portdowns of games made for multiple times faster machines will fare better than that?

Do you really think that if a game, let's say runs at 1080p/60FPS on X720 and it runs at 720p/30FPS on WiiU, that this would mean the game fares better than ports from X360 made on WiiU today?
 
How on Earth does this make any sense? Would fare better how? Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best, and you think that portdowns of games made for multiple times faster machines will fare better than that?

Do you really think that if a game, let's say runs at 1080p/60FPS on X720 and it runs at 720p/30FPS on WiiU, that this would mean the game fares better than ports from X360 made on WiiU today?

You don't seem to know much about how these things work..... Read that last comment a few times over. His reasoning (which is actually logical) for the claim is stated in there.

Those of us that followed the WUSTs came to that conclusion (that Wii U will have an easier time with next-gen ports) a while ago.
 
How on Earth does this make any sense? Would fare better how? Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best, and you think that portdowns of games made for multiple times faster machines will fare better than that?

Do you really think that if a game, let's say runs at 1080p/60FPS on X720 and it runs at 720p/30FPS on WiiU, that this would mean the game fares better than ports from X360 made on WiiU today?

It's about the consoles' architecture. WiiU's is quite different from PS360's but will probably be more similar to PS4/X720's architecture. So this could become true.
 
How on Earth does this make any sense? Would fare better how? Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best, and you think that portdowns of games made for multiple times faster machines will fare better than that?

Do you really think that if a game, let's say runs at 1080p/60FPS on X720 and it runs at 720p/30FPS on WiiU, that this would mean the game fares better than ports from X360 made on WiiU today?
Wishful thinking I suppose. I prefer to expect the worst and be pleasantly surprised by anything better.
So basically the performance would be less on Wii U, with a few corners cut, but the core game should/would look very close to the other systems due to the Wii U's development being made with modern tech.
Very close? A few corners cut? I'm just not seeing that as a realistic possibility.
 
If an architecture is slightly closer but (warning: stupid multiplier, everything else would take too much time) has a 6x more powerful CPU, then porting could well be significantly harder than from a system that is similarly powerful but somewhat different in design.

People seem to assume that it's simply a matter of dialing down some knobs, and for some games and graphics only this might be the case, but it hardly seems like GPU performance will be the biggest issue facing Wii U when it comes to the prospects of releasing fully-featured PS4/720 ports.

Those of us that followed the WUSTs came to that conclusion (that Wii U will have an easier time with next-gen ports) a while ago.
If there was ever an indication of the hive mentality fomented in the so-called "WUSTs" this would be it.
 
If an architecture is slightly closer but (warning: stupid multiplier, everything else would take too much time) has a 6x more powerful CPU, then porting could well be significantly harder than from a system that is similarly powerful but somewhat different in design.

People seem to assume that it's simply a matter of dialing down some knobs, and for some games and graphics only this might be the case, but it hardly seems like GPU performance will be the biggest issue facing Wii U when it comes to the prospects of releasing fully-featured PS4/720 ports.

If there was ever an indication of the hive mentality fomented in the so-called "WUSTs" this would be it.

No? I just stated that people who followed the WUSTs came to that conclusion a while ago. What about it is so "hivemind"? I honestly can't believe that this thread is in so much deep shit now that people have resorted to what are pretty much personal attacks.
 
I don't think they're implying PS4/Nextbox Wii U ports would be closer to their counterparts than the PS360 Wii U ports are to those versions (in the latter case we could even reach 100% parity, or even improvements on Wii U, over the next year or two, depending on the effort publishers care to fund). Just that it can probably do next-gen ports (cut down; to what extent is on a per-game basis and depends on exactly what kind of power gap we're talking about, which is still unknown) and so fare better than Wii did, where ports were only rarely viable. And more than this, that not handling current ports perfectly already doesn't mean it can't have next-gen ports at all, like others want to assume (an assumption which only makes sense if you take things purely as "power" based, ignoring any architectural differences).

If the other next-gen systems have a similar architecture (again, cut down to different degrees), ports could be viable, unlike in the case of Wii. That's all most agree on, as far as I have seen. It makes sense: new low-end PC hardware can have issues with games made for different, older hardware, yet at the same time run games that older hardware can't run at all, by virtue of being similar to the modern high-end models, even if the user has to turn sliders down.

But few respond to this kind of rational speculation; most just pick their fights against the most irrational posts, with just as irrational opposing opinions.
 
It's about the consoles' architecture. WiiU's is quite different from PS360's but will probably be more similar to PS4/X720's architecture. So this could become true.

CPU

PowerPC: WiiU (3 cores), Xbox 360 (3 cores), PS3 (1 core+SPUs)

x86 (probably): Xbox 3 (~4 cores), PS4 (~4 cores)

If the Xbox 3 and PS4 have a more powerful CPU, chances are devs will need to put CPU tasks on the GPU for Wii U ports. That means even fewer resources for graphics.
 
You don't seem to know much about how these things work..... Read that last comment a few times over. His reasoning (which is actually logical) for the claim is stated in there.

Those of us that followed the WUSTs came to that conclusion (that Wii U will have an easier time with next-gen ports) a while ago.
I know quite a bit about "how these things work". The only way his statement makes any sense is if he's talking in relative terms. I.e. PSP is far less powerful than PS3, but LittleBigPlanet was still possible to make for it, with big sacrifices, but still. So in that sense PSP fared better than anyone would expect.
 
If an architecture is slightly closer but (warning: stupid multiplier, everything else would take too much time) has a 6x more powerful CPU, then porting could well be significantly harder than from a system that is similarly powerful but somewhat different in design.

People seem to assume that it's simply a matter of dialing down some knobs, and for some games and graphics only this might be the case, but it hardly seems like GPU performance will be the biggest issue facing Wii U when it comes to the prospects of releasing fully-featured PS4/720 ports.

If there was ever an indication of the hive mentality fomented in the so-called "WUSTs" this would be it.

Sure, but from the rumors, the PS4 doesn't have a CPU like that. Current dev kits use an A10 (a 4-core CPU with 4 threads); it's faster than Wii U's CPU, but nothing like those numbers. Also, Jaguar is the accepted final CPU core in the rumor thread; as I've pointed out on an earlier page, this CPU core is 3.1mm² per core @ 28nm, which is about the same size as Wii U's CPU if the processes are taken into consideration. They are maxed out at 4 cores, 2GHz and single-channel memory.

Jaguar isn't some magic bullet point; the rumor is actually speculating that the PS4's CPU will be on par with Wii U's and much slower than Cell, which even the A10 is in a lot of respects.

Gemüsepizza;44807206 said:
CPU

PowerPC: WiiU (3 cores, 3 threads; basically confirmed), Xbox 360 (3 cores, 6 threads), PS3 (1 core+SPUs; I believe the core is also multithreaded?)

x86 (probably): Xbox 3 (~4 cores), PS4 (~4 cores). AMD has no multithreading, so these are single-threaded for each core, or 4 threads.

If the Xbox 3 and PS4 have a more powerful CPU, chances are devs will need to put CPU tasks on the GPU for Wii U ports. That means even fewer resources for graphics.
 
This isn't related to CPU, GPU or memory, but I thought it could be convenient to post it here anyway, as I haven't seen it posted before:

Geneva and Santa Rosa, November 27, 2012 – STMicroelectronics (NYSE:STM), the world’s leading MEMS manufacturer, and PNI Sensor Corporation, the U.S.- based geomagnetic-sensor manufacturer, today announced that Nintendo Co., Ltd. has adopted an advanced sensor solution for Nintendo’s newly launched Wii U™ that includes sensors from ST and PNI. PNI’s 3-axis geomagnetic sensor, based on proprietary magneto-inductive technology and driven by ST’s ASIC, together with an ST 3-axis accelerometer, enables intuitive motion sensing in gaming applications.

The Wii U supports the need of users to enjoy intuitive motion control that is stable under various environments. The geomagnetic sensor delivers new playability in combination with other sensors. Further, the accelerometer accommodates a wide variety of players’ motions in gaming.

http://www.st.com/internet/com/press_release/t3346.jsp

http://www.pnicorp.com/gaming/RM3000_in_WiiU
 
No? I just stated that people who followed the WUSTs came to that conclusion a while ago. What about it is so "hivemind"?

Coming to any conclusion with so little info is impossible. His point is that some consider it a foregone conclusion because a bunch of people in that thread agreed.
 
Sure, but from the rumors, the PS4 doesn't have a CPU like that. Current dev kits use an A10 (a 4-core CPU with 4 threads); it's faster than Wii U's CPU, but nothing like those numbers. Also, Jaguar is the accepted final CPU core in the rumor thread; as I've pointed out on an earlier page, this CPU core is 3.1mm² per core @ 28nm, which is about the same size as Wii U's CPU if the processes are taken into consideration. They are maxed out at 4 cores, 2GHz and single-channel memory.

Jaguar isn't some magic bullet point; the rumor is actually speculating that the PS4's CPU will be on par with Wii U's and much slower than Cell, which even the A10 is in a lot of respects.

If the PS4 CPU is so slow, then why switch at all? They could just use the CELL again and have backwards compatibility.
 
No? I just stated that people who followed the WUSTs came to that conclusion a while ago. What about it is so "hivemind"?
Basically, the "WUSTs" were, for months, an echo chamber where any negative speculation (or even just speculation perceived as negative) about the Wii U was met with derision, while every positive tidbit was immediately absorbed into the collective "knowledge" base. I feel like the expectations built up in such an environment are the main reason we have this very negative and bipartisan situation now.

I still remember the time I had the audacity to suggest that the Wii U CPU might be clocked lower (not even be slower, literally just have a lower clock) than PS360's CPUs, and the page-long outpouring of outrage that caused.


Sure, but from the rumors, the PS4 doesn't have a CPU like that. [...]
Your point is valid if the PS4 were to feature 4 Jaguar cores clocked at 2 GHz or less. I consider that incredibly unlikely.
 
Basically, the "WUSTs" were, for months, an echo chamber where any negative speculation (or even just speculation perceived as negative) about the Wii U was met with derision, while every positive tidbit was immediately absorbed into the collective "knowledge" base. I feel like the expectations built up in such an environment are the main reason we have this very negative and bipartisan situation now.

I still remember the time I had the audacity to suggest that the Wii U CPU might be clocked lower (not even be slower, literally just have a lower clock) than PS360's CPUs, and the page-long outpouring of outrage that caused.

I'm not disagreeing with you here. I still don't think my last comment deserved to be relegated to the "hivemind mentality" found in those threads. It was completely and utterly beside the point--not to mention unnecessary.
 
Gemüsepizza;44807405 said:
If the PS4 CPU is so slow, then why switch at all? They could just use the CELL again and have backwards compatibility.

Cell has given PS3 a lot of problems with 360 ports, and it will fare no better with XB3 ports. x86 is a good way to ensure that ports will be easy, since PC gaming won't be leaving x86 behind anytime soon.

Beyond that, while the A10 might be weaker in some respects, it's still faster in others... the ones that will matter next gen. Even Wii U's CPU should be stronger in this regard (though that is speculation at this point).

When was it confirmed that the WiiU CPU had 3 physical/logical cores?

Developers here posted that Wii U's early dev kits had SMT; they later reported that SMT was removed. It should just be 3 cores with 1 thread each, though like I said, it was only basically confirmed, meaning not actually confirmed but highly unlikely to be anything else, thanks to GAF insiders who actually work on the dev kits.
 
I'm not disagreeing with you here. I still don't think my last comment deserved to be relegated to the "hivemind mentality" found in those threads. It was completely and utterly beside the point--not to mention unnecessary.
The way I read your comment -- starting from "You don't seem to know much about how these things work" and culminating in wisdom gathered from the "WUSTs" of all places -- may have ticked me off unnecessarily, sorry about that.

Still, I think claiming that ports from PS4/720 will be "easier" than from 360/PS3 is at best misleading, in my opinion. Sure, there may be similarities that enable developers to more effectively use Wii U's architecture, but will that increase in effectiveness make up for the greatly increased overall requirements? I don't think so.
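To put very rough numbers on that question, here is a back-of-the-envelope sketch of the trade-off in Python. Every figure below (the raw power gaps, the efficiency multipliers) is an illustrative assumption, not a known spec:

```python
# Illustrative only: every number below is an assumption, not a known spec.
def effective_gap(raw_gap, efficiency_advantage):
    """Performance shortfall left after architectural similarity helps a port."""
    return raw_gap / efficiency_advantage

# 360 -> Wii U today: roughly comparable raw power, but early ports extract
# less from the unfamiliar architecture (assume 0.9x relative efficiency).
print(effective_gap(1.0, 0.9))  # ~1.1x short, i.e. slightly worse ports

# PS4/720 -> Wii U: assume a 4-5x raw gap, and assume similarity buys a
# 1.5x efficiency edge over today's ports. The shortfall still grows.
print(effective_gap(4.5, 1.5))  # 3.0x short
```

Under those assumed numbers, the architectural similarity narrows the gap but comes nowhere near closing it, which is the argument being made.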
 
Cell has given PS3 a lot of problems with 360 ports, and it will fare no better with XB3 ports. x86 is a good way to ensure that ports will be easy, since PC gaming won't be leaving x86 behind anytime soon.

Beyond that, while the A10 might be weaker in some respects, it's still faster in others... the ones that will matter next gen. Even Wii U's CPU should be stronger in this regard (though that is speculation at this point).

Cell has not created any problems for PS3.

Cell has saved the PS3 from its shitty GPU.

It's the weak GPU that is the problem, not Cell.
 
I'm not disagreeing with you here. I still don't think my last comment deserved to be relegated to the "hivemind mentality" found in those threads. It was completely and utterly beside the point--not to mention unnecessary.

You don't see how using a group of people believing something to corroborate/legitimize a theory someone is disagreeing with merits consideration as to how this corroborating group reaches its conclusions?

What you did is the equivalent of trying to legitimize the "if evolution is real why we still got monkeys!?" argument by saying Liberty University supports this theory. Now you're acting offended that someone called out your bullshit.

Pointing out the inherent bias was entirely necessary after you propped the group up as something to legitimize people actually thinking the Wii U is going to handle Xbox 3/PS4 downports well.
 
Still, I think claiming that ports from PS4/720 will be "easier" than from 360/PS3 is at best misleading, in my opinion. Sure, there may be similarities that enable developers to more effectively use Wii U's architecture, but will that increase in effectiveness make up for the greatly increased overall requirements? I don't think so.

Indeed. If PS4/720 are hitting 2Tflops on the GPU side, that might be > 5x Wii U's GPU performance. Even if we assume PS4/720 shift things to a more GPU-oriented model for more non-graphics tasks, how does this benefit Wii U relative to PS3/360 ports if the GPU deficit vs PS4/720 is actually greater than the CPU deficit vs the former? (I'm assuming Xenon doesn't offer 5x the performance of Wii U's CPU)

I think it would actually be better for Wii U if 'the game' side of next-gen processing didn't start taking advantage of the GPU. Graphics tasks could possibly be scaled downward up to a point without affecting the core game, but it's much harder to start talking about scaling down simulation and logic processing while keeping the actual 'game' side of things the same. If games - meaning game logic and simulation - start pivoting around GPU chips 5 times more powerful than Wii U's, Wii U is likely even more screwed there than it is in bringing over processing from PS3/360's CPUs.

Also, regarding Jaguars, can someone explain the 'hard' 2GHz cap being claimed? Are we sure that's not a suggested max clock frequency for AMD skus based on their likely product contexts (lower power devices)? If a client was willing to allow their chips more power draw, surely the architecture can clock up?
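For what it's worth, the "> 5x" figure in the first paragraph only holds for the lower end of the Wii U GPU estimates thrown around in this thread. A quick sketch of the arithmetic (the 2 TFLOPS number is the rumor cited above; the Wii U figures are this thread's speculation, none of them confirmed):

```python
# How the rumored PS4/720 GPU compares against speculated Wii U GPU figures.
ps4_720_gflops = 2000  # the ~2 TFLOPS rumor discussed above

for wiiu_gflops in (400, 600, 800):  # speculative thread values; none confirmed
    print(f"{wiiu_gflops} GFLOPS -> {ps4_720_gflops / wiiu_gflops:.1f}x gap")
# 400 GFLOPS -> 5.0x gap
# 600 GFLOPS -> 3.3x gap
# 800 GFLOPS -> 2.5x gap
```

So ">5x" implicitly assumes Wii U's GPU sits at or below roughly 400 GFLOPS.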
 
Your point is valid if the PS4 were to feature 4 Jaguar cores clocked at 2 GHz or less. I consider that incredibly unlikely.

I'm just talking about the PS4 rumor to use one. I think the A10 makes a lot more sense, but it also takes up a lot more room on the die; if the chip really houses both a Pitcairn GPU and the A10 CPU (just the CPU) you are looking at over 320mm^2, while using a 4-core Jaguar it would be only 224mm^2, both of course at 28nm.

It really depends on what they go for to be perfectly honest.
 
You don't see how using a group of people believing something to corroborate/legitimize a theory someone is disagreeing with merits consideration as to how this corroborating group reaches its conclusions?

What you did is the equivalent of trying to legitimize the "if evolution is real why we still got monkeys!?" argument by saying Liberty University supports this theory. Now you're acting offended that someone called out your bullshit.

Pointing out the inherent bias was entirely necessary after you propped the group up as something to legitimize people actually thinking the Wii U is going to handle Xbox 3/PS4 downports well.

No. I told him that the reasoning is found within the same comment that he (Lord Error) responded to. I then added that a bunch of people came to that conclusion a while ago (in the WUSTs). I didn't use the WUST as the reason why I thought he was wrong.
 
Eh what? I thought it was Cell being incredibly hard to program for that made it very hard to port to, not the GPU.

I think it's a mixture of that, the less flexible split memory pool and the non-unified shader architecture. Altogether, not the best conditions for easy 360 -> PS3 ports.
But at least they were in the same power range. Porting from PS4/720 to Wii U will most likely be significantly harder.
 
Basically, the "WUSTs" were, for months, an echo chamber where any negative speculation (or even just speculation perceived as negative) about the Wii U was met with derision, while every positive tidbit was immediately absorbed into the collective "knowledge" base. I feel like the expectations built up in such an environment are the main reason we have this very negative and bipartisan situation now.

I still remember the time I had the audacity to suggest that the Wii U CPU might be clocked lower (not even be slower, literally just have a lower clock) than PS360's CPUs, and the page-long outpouring of outrage that caused.

I think the WUSTs' overreaction to things was a counterbalance/reaction to the extreme negativity of a very vocal minority (I'm not saying you or your comment were a part of this). I mean, before they even showed any footage people assumed it was weaker than current gen, and when PS360 footage was shown at its initial showing people were POSITIVE the footage looked worse than the exact same footage.

I'm not saying the attitude in the WUSTs was the correct one; I'm just saying that it sprang forth from the extreme negativity the forum was pouring out.
 
gofreak, the GPGPU side does depend on how much it's used for game logic, sure, but how much would be used is likely not that much; the GPU is far better at doing those tasks, so we just don't know the outcome of the switchover. But Wii U should be able to handle a downport if the publisher wants to bring the game to the platform, even if some "cool" effects are lost. That answer is on a case-by-case basis though, not one general brush stroke. We assume far too much and stop people from speculating because they don't fit in our box; that is being done by a great deal of us. I'm 100% sure I've done it before.

As for the Jaguar cores, you can only overclock a CPU so much; these cores were designed with a 2GHz limit in mind. If AMD was heavily modifying Jaguar, it would no longer be called Jaguar. I am sure however that it could probably be overclocked, but I heavily doubt much beyond 10%. There are a lot of limits to the CPU, like being designed to work as a quad core or dual core, nothing larger, or the single-channel memory interface... If so much of it has changed, they wouldn't call them Jaguar cores, and they would also end up a lot bigger.

Basically, why build an A10 out of Jaguar if you have a bunch sitting on the table? That sort of customization just isn't expected or logical.
 
Some posts from lherre:

Nop, core 0 has one size, and 1 and 2 have another totally different.

Think that one of the cores is the "master" for the other 2 so its necessities are different.

L2 cache isn't near 16 mb, it is less (more less). The 3 cores hasn't the same amount of L2 cache between them.
 
Also, regarding Jaguars, can someone explain the 'hard' 2GHz cap being claimed? Are we sure that's not a suggested max clock frequency for AMD skus based on their likely product contexts (lower power devices)? If a client was willing to allow their chips more power draw, surely the architecture can clock up?

It may have to do with a thermal limit. Clocking it above 2GHz may make it really hard to cool effectively, but I don't know for sure. That's just one possibility.
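The thermal intuition has a standard basis: CMOS dynamic power scales linearly with frequency and quadratically with voltage, and higher clocks usually need more voltage. A minimal sketch of that relation (the ratios below are illustrative, not Jaguar-specific figures):

```python
# Standard CMOS dynamic-power approximation: P ~ a * C * V^2 * f.
def relative_power(freq_ratio, voltage_ratio):
    """Power draw relative to stock for a given clock and voltage bump."""
    return freq_ratio * voltage_ratio ** 2

print(relative_power(1.25, 1.0))  # +25% clock at stock voltage -> 1.25x power
print(relative_power(1.25, 1.1))  # +25% clock, +10% voltage    -> ~1.51x power
```

Which is why a core optimized around a low clock target gets disproportionately hot when pushed past it.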
 
Also, regarding Jaguars, can someone explain the 'hard' 2GHz cap being claimed? Are we sure that's not a suggested max clock frequency for AMD skus based on their likely product contexts (lower power devices)? If a client was willing to allow their chips more power draw, surely the architecture can clock up?


It's related to transistor size. The higher the frequency, the larger the transistors need to be in the design. The larger the transistors, the more area and power they consume. Jaguar was designed to be small and power efficient. The clocks were probably chosen after carefully optimizing the design for power and area consumption. Yeah, you could overclock those designs, but you're probably better off just going with a design built for high frequency operation. It would be more optimal.


If the PS4 consists of an APU plus discrete GPU where the APU is something like 4 Jaguar cores + 384 shaders, it will be absolutely more powerful than the Cell and certainly the WiiU CPU. Think of the 4 Jaguar cores as replacements for the PPU, and the APU's shaders as the SPE replacements. Each Jaguar core could probably handle the workload of the PPU with ease. The raw flop performance of, let's say, 384 shaders @ 500 MHz would be more than the theoretical numbers on the Cell, but a lot more accessible due to an easier programming model. This is again assuming APU+GPU.
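The arithmetic behind that comparison checks out if you assume AMD-style shader ALUs doing one fused multiply-add (2 flops) per clock, and take the commonly cited theoretical peak for Cell. Note the 384-shader/500 MHz figures are the post's hypotheticals, not known specs:

```python
# Peak single-precision throughput for the post's hypothetical APU shader block.
shaders = 384            # hypothetical ALU count from the post
clock_ghz = 0.5          # hypothetical 500 MHz
flops_per_clock = 2      # one FMA per ALU per clock = 2 flops
print(shaders * flops_per_clock * clock_ghz)  # 384 GFLOPS

# Commonly cited Cell theoretical peak at 3.2 GHz:
# 8 SPEs * 25.6 GFLOPS each, plus ~25.6 for the PPU's VMX unit.
print(8 * 25.6 + 25.6)   # ~230 GFLOPS (PS3 games see less: only 6 SPEs usable)
```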
 
It's related to transistor size. The higher the frequency, the larger the transistors need to be in the design. The larger the transistors, the more area and power they consume. Jaguar was designed to be small and power efficient. The clocks were probably chosen after carefully optimizing the design for power and area consumption. Yeah, you could overclock those designs, but you're probably better off just going with a design built for high frequency operation. It would be more optimal.


If the PS4 consists of an APU plus discrete GPU where the APU is something like 4 Jaguar cores + 384 shaders, it will be absolutely more powerful than the Cell and certainly the WiiU CPU. Think of the 4 Jaguar cores as replacements for the PPU, and the APU's shaders as the SPE replacements. Each Jaguar core could probably handle the workload of the PPU with ease. The raw flop performance of, let's say, 384 shaders @ 500 MHz would be more than the theoretical numbers on the Cell, but a lot more accessible due to an easier programming model. This is again assuming APU+GPU.

Of course at this point you are talking about using a 212mm^2 GPU along with another ~200mm^2 APU. Expensive, and power hungry. Besides, using 2 GPUs is less efficient than just one much faster one, which does all the same jobs with less heat and a faster communication bus between the CPU and GPU, since they share a die.

Sure, a 350mm^2 APU is huge, but it would create less heat, use less power, be cheaper and perform better than the dedicated GPU and APU combo.
 
Eh what? I thought it was Cell being incredibly hard to program for that made it very hard to port to, not the GPU.

There probably was a learning curve.

But that is not why PS3 multiplatform ports are still left behind. The situation is the way it is because of PS3's weak GPU.

When they say they take advantage of Cell, most of the time it actually means Cell does things the RSX should be doing.

If PS3 had Xenos and Cell, things would be completely different.

PS3 multiplatform games falter because of the weak GPU compared to Xenos and slightly less RAM.

Not because of Cell.
 
Gemüsepizza;44807206 said:
CPU

PowerPC: WiiU (3 cores), Xbox 360 (3 cores), PS3 (1 core+SPUs)

x86 (probably): Xbox 3 (~4 cores), PS4 (~4 cores)

If the Xbox 3 and PS4 have a more powerful CPU, chances are devs will need to put CPU tasks on the GPU for Wii U ports. That means even fewer resources for graphics.

You might want to double your core estimates for the next Xbox and PlayStation.
 
You might want to double your core estimates for the next Xbox and PlayStation.

Why not quadruple it? I hear AMD will be using HD8000 GPUs when those consoles launch; are you sure they are dusting off GCN to use for them?

The reason why I am even responding to this post is because it seems pulled right out of thin air. Is there a new rumor about PS4? Because a quad-core A10 should be in the most current dev kits right now.
 
Of course at this point you are talking about using a 212mm^2 GPU along with another ~200mm^2 APU.
PS3 had a 258mm² GPU and a 236mm² CPU ("APU" even!) at launch.

Now I'm not suggesting that we will see a repeat performance of that level of awesomeness, but it's already quite a bit smaller than what they used in PS3.

And I don't get why you seem to reject 8 Jaguar cores out of hand. As you yourself pointed out, those cores are tiny. If they are using them, they better be using 8!
 
Why not quadruple it? I hear AMD will be using HD8000 GPUs when those consoles launch; are you sure they are dusting off GCN to use for them?

The reason why I am even responding to this post is because it seems pulled right out of thin air. Is there a new rumor about PS4? Because a quad-core A10 should be in the most current dev kits right now.

Well, I think he works for Polygon.

But there are a lot of rumours going around.

Anyway, where I think 720/PS4 development could help out with the Wii U porting situation is if the CPUs are also less powerful than a Xenon, or similarly require some level of CPU workarounds for cross-generational ports.

Then devs would have 2 or 3 targets requiring CPU 'work' when bringing over a 360 game. That would help Wii U presumably, making it no longer the odd man out here.

That scenario might not be so much of a help for next-gen exclusive games, though, depending on what devs are doing with GPUs.

BUT:

z0m3le said:
if the publisher wants to bring the game to the platform

If pubs decide Wii U is the jumping-off point for next-gen games, these problems don't materialise anyway :) For non-cross-generational games, anyway.

Question is if that'll happen. I have my doubts based on apparent pub commitment so far.

Thanks also to McHuj for the insight on clocking... I wonder how far a Jaguar can be pushed before reliability becomes an issue. A switch to Jaguar precipitating an increase in the number of cores - with an eye on aegies' comment - wouldn't be too surprising though, if '10x PS3 PPU' was still Sony's target (assuming that prior rumour was legit).
 
If aegies works for Polygon (and if they have the currently-circulating spec sheets), then we've basically confirmed in this thread that the Durango and Orbis are using 8-core Jaguar variants.
 
If aegies works for Polygon (and if they have the currently-circulating spec sheets), then we've basically confirmed in this thread that the Durango and Orbis are using 8-core Jaguar variants.
I wouldn't be surprised in the least if this is true. It would make sense:
- Tiny cores
- Cheap cores
- Don't have to worry about any kind of legacy, so might as well put lots of cores and let developers figure out how to use them.
 
I wouldn't be surprised in the least if this is true. It would make sense:
- Tiny cores
- Cheap cores
- Don't have to worry about any kind of legacy, so might as well put lots of cores and let developers figure out how to use them.

Well, depends how the spec sheet is interpreted.
It could be a 4-core Steamroller unit with 8 threads... but 4 "cores" being doubled would indicate, in language, that it's 8 cores. And because we're talking about a console with a TDP limit, you can infer that if this person has legitimate information that we're talking about 8 full cores, then we're dealing with the likes of:

1) Low power IBM cores (A2, Wii U CPU, 470s, etc) - least likely
2) Low power ARM cores (I don't expect an 8-core entirely ARM based console) - second least likely
3) Low power Intel cores (though the Durango dev kits are supposedly intel-based, rumour has it they are Xeons in basic server cases. You're not getting 8-core Xeons in a console, and they are likely just there to stand in for x86 hardware to be developed on. Intel also isn't an easy partner to work with). - very unlikely
4) Low power AMD cores (ie Jaguar) - I'd say this is most likely still, based on all the rumours.

It would be customized to some degree in any case, but the relation will probably be to case 4 if you take what we've all heard at face value.
 
Basically, the "WUSTs" were, for months, an echo chamber where any negative speculation (or even just speculation perceived as negative) about the Wii U was met with derision, while every positive tidbit was immediately absorbed into the collective "knowledge" base. I feel like the expectations built up in such an environment are the main reason we have this very negative and bipartisan situation now.

I still remember the time I had the audacity to suggest that the Wii U CPU might be clocked lower (not even be slower, literally just have a lower clock) than PS360's CPUs, and the page-long outpouring of outrage that caused.


Your point is valid if the PS4 were to feature 4 Jaguar cores clocked at 2 GHz or less. I consider that incredibly unlikely.

Fair enough if you posted regularly in them from the start, but I really can't remember people expecting anything more than a 1 TFLOP GPU, 3GB of RAM and a very 360-like CPU at max. Not exactly super unrealistic expectations.

I mostly listened to BG and Ideaman as I had no idea anyone else was a developer, 'insider', or had 'sources'.

Those guys were always very fast to play down specs. I was told around WUST 3 that there was no chance of the GPU even approaching 1TF, 800 GFLOPs at most, and by the end of the fourth thread BG had said on several occasions that, from his calculations, it would be around 600 GFLOPs at most.

Regarding the CPU, people that asked were told it would be similar to Xenon but clocked slower (a tri-core 1.5-2GHz CPU wasn't far off).

The console I had in my head after several months of reading the WUSTs daily was:

Tri-core 2GHz CPU.
2-3GB of RAM.
~600 GFLOP GPU.

Not far off what we are seeing at all. I really don't get this attitude that people were expecting some 2TF GPU monster with 4-6GB of RAM and a 4GHz CPU. If you can find people saying that and being backed up by the guys 'in the know' then I would be interested in reading them, because I never saw anything remotely along those lines.

*Edit

How fast do you expect PS4's CPU to be? 4GHz? My guess would be that it will be 2-2.5GHz, supported by a really, really powerful 2TF GPU and 4GB of RAM. GPU-centric, much like WiiU.
 
There probably was a learning curve.

But that is not why PS3 multiplatform ports are still left behind. The situation is the way it is because of PS3's weak GPU.

When they say they take advantage of Cell, most of the time it actually means Cell does things the RSX should be doing.

If PS3 had Xenos and Cell, things would be completely different.

PS3 multiplatform games falter because of the weak GPU compared to Xenos and slightly less RAM.

Not because of Cell.

It's both; the PPU is rather shit. SPUs are the only part of the PS3 design that are actually nice, but it's a PITA to make them sing.

I've heard so many stories of the first X360 -> PS3 ports running at <10fps. It was a pain until people started using SPUs, and not just for helping the GPU.
 
How on Earth does this make any sense? Would fare better how? Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best, and you think that portdowns of games made for multiple times faster machines will fare better than that?

Do you really think that if a game, let's say runs at 1080p/60FPS on X720 and it runs at 720p/30FPS on WiiU, that this would mean the game fares better than ports from X360 made on WiiU today?

Wii U modern architecture vs PS4/Xbox3 modern architecture

Wii U modern architecture vs Xbox 360 last gen architecture


These ports are being made from the ground up first (and with years of development) on the Xbox 360 and then ported over to Wii U, in some cases in only 6 months. The Wii U also has plenty of tricks up its graphical sleeve that ports will never show you. You said: "Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best" and the reason is the same as why crappy 360 to PC ports generally turn out the way they do.

We shouldn't need a whole ton of power to run a port of a game from 7-year-old tech on a modern gaming PC, but usually they don't run as well as they could due to poor optimization. Wii U has a similar "problem" during launch, and yes, I do think that in the future downports from those coming systems would work pretty well, given that the games have equal time in development across all platforms, taking advantage of what each system does best. The Wii U will be very similar to the GameCube in terms of performance: deceptively powerful and not expected at first.

Again, go read Iwata Asks on Wii U hardware.
 
Wii U modern architecture vs PS4/Xbox3 modern architecture

Wii U modern architecture vs Xbox 360 last gen architecture


These ports are being made from the ground up first (and with years of development) on the Xbox 360 and then ported over to Wii U, in some cases in only 6 months. The Wii U also has plenty of tricks up its graphical sleeve that ports will never show you. You said: "Right now X360 ports to WiiU are generally slightly worse than the original or maybe on par at best" and the reason is the same as why crappy 360 to PC ports generally turn out the way they do.

We shouldn't need a whole ton of power to run a port of a game from 7-year-old tech on a modern gaming PC, but usually they don't run as well as they could due to poor optimization. Wii U has a similar "problem" during launch, and yes, I do think that in the future downports from those coming systems would work pretty well, given that the games have equal time in development across all platforms, taking advantage of what each system does best. The Wii U will be very similar to the GameCube in terms of performance: deceptively powerful and not expected at first.

Again, go read Iwata Asks on Wii U hardware.

WiiU architecture is not meaningfully different from X360 other than missing 3 hardware threads if it's not SMT. So assuming a 30fps game you have 180ms per frame to do stuff with your shitty slow in-order Xbox 360 CPU and 90ms to do things with your nice "modern" out of order WiiU CPU (modern is in scare quotes because OoOE is not anywhere close to new). So to break even you have to get 2x performance out of WiiU or offload shit to GPU (which you can do on Xbox 360 anyway). Early ports indicate that it probably isn't breaking even, but it's close enough.

Assuming the next-gen Xbox and PlayStation are fast out-of-order processors with a minimum of 4 hardware threads and a GPU with a compute ring, then this puts WiiU at an even bigger potential disadvantage, since it can't lean on gains from OoOE to make up the slack, nor will it be able to lean on GPGPU (in whatever sense of the word) to make up the slack, since newer consoles should also have that capability.

Add to this the slashdot anon claiming WiiU uses paired singles which are vastly inferior SIMD to VMX or SSE, and you've got another area where the WiiU is at a disadvantage.

Basically claiming ports from next-gen will fare better is ridiculous since you'll have an even larger power gap to make up. (Edit: I should add that this is only true if the extra power of the next-gen machines is actually used.)
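For anyone puzzled by the 180ms/90ms figures above: they are aggregate hardware-thread time per frame, with the ~33ms frame at 30fps rounded down to 30ms. Restated:

```python
# Aggregate CPU-thread time per frame = hardware threads * frame interval.
frame_ms = 30  # the post rounds the 33.3ms frame at 30fps down to ~30ms

print(6 * frame_ms)  # Xbox 360: 3 cores * 2 SMT threads = 180ms of thread time
print(3 * frame_ms)  # Wii U: 3 cores, no SMT (per this thread) = 90ms
```

Hence the break-even condition: each Wii U thread needs roughly 2x the per-thread throughput, or work has to move to the GPU.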
 
WiiU architecture is not meaningfully different from X360 other than missing 3 hardware threads if it's not SMT. So assuming a 30fps game you have 180ms per frame to do stuff with your shitty slow in-order Xbox 360 CPU and 90ms to do things with your nice "modern" out of order WiiU CPU (modern is in scare quotes because OoOE is not anywhere close to new). So to break even you have to get 2x performance out of WiiU or offload shit to GPU (which you can do on Xbox 360 anyway). Early ports indicate that it probably isn't breaking even, but it's close enough.

Assuming the next-gen Xbox and PlayStation are fast out-of-order processors with a minimum of 4 hardware threads and a GPU with a compute ring, then this puts WiiU at an even bigger potential disadvantage, since it can't lean on gains from OoOE to make up the slack, nor will it be able to lean on GPGPU (in whatever sense of the word) to make up the slack, since newer consoles should also have that capability.

Add to this the slashdot anon claiming WiiU uses paired singles which are vastly inferior SIMD to VMX or SSE, and you've got another area where the WiiU is at a disadvantage.

Basically claiming ports from next-gen will fare better is ridiculous since you'll have an even larger power gap to make up. (Edit: I should add that this is only true if the extra power of the next-gen machines is actually used.)

Don't get that at all. WiiU will be at most 4 times weaker than PS4 / 720; the Wii was using totally out-of-date hardware that couldn't even support the most basic versions of UE3 and was 10-15 times weaker than PS360.

PS4 / 720 are rumoured to be GPU-centric consoles like WiiU, with around 2 TFLOP GPUs and 4-6GB of RAM. If the publishers decide to, they can simply scale down the games for WiiU, from 1080p / 60fps / high graphical settings on PS4 / 720 to 720p / 30fps / lower graphical effects on WiiU.

All of the next-gen engines are highly scalable, so that multiplatform games can be released on as many different platforms as possible.

Third-party publishers simply can't afford to go through another expensive generation releasing only two versions of each game. I think for the first 2-3 years of PS4 / 720's life, multiplatform games will be available for PS3 / PS4 / 360 / 720 and WiiU.
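The resolution/framerate cut described above is itself most of a 4-5x gap: 1080p60 pushes about 4.5x the pixels per second of 720p30. The arithmetic:

```python
# Pixel throughput ratio between the two targets mentioned above.
full = 1920 * 1080 * 60  # 1080p @ 60fps on PS4/720
cut  = 1280 * 720 * 30   # 720p @ 30fps on Wii U
print(full / cut)        # 4.5 -> 4.5x fewer pixels per second to shade
```

Per-frame cost doesn't scale perfectly linearly with pixel count, but it shows why a straight scale-down is at least plausible if publishers fund it.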
 
Don't get that at all. WiiU will be at most 4 times weaker than PS4 / 720; the Wii was using totally out-of-date hardware that couldn't even support the most basic versions of UE3 and was 10-15 times weaker than PS360.

PS4 / 720 are rumoured to be GPU-centric consoles like WiiU, with around 2 TFLOP GPUs and 4-6GB of RAM. If the publishers decide to, they can simply scale down the games for WiiU, from 1080p / 60fps / high graphical settings on PS4 / 720 to 720p / 30fps / lower graphical effects on WiiU.

All of the next-gen engines are highly scalable, so that multiplatform games can be released on as many different platforms as possible.

Third-party publishers simply can't afford to go through another expensive generation releasing only two versions of each game. I think for the first 2-3 years of PS4 / 720's life, multiplatform games will be available for PS3 / PS4 / 360 / 720 and WiiU.

This. Also, I'm surprised to see some of the more sensible posters on here (posters who in fact PC game with monster rigs and are knowledgeable about the PC landscape) downright selectively rule out the Wii U in terms of scalability. You need to look no further than PCs to see that scalability is the Wii U's saving grace.

For instance, on paper, or 'on the surface' (as some seem to judge the Wii U), my 'weak' A8-4500M laptop with integrated graphics shouldn't hope to run the latest games at an acceptable performance level seeing as higher end PC hardware is comfortably more than 15 times as powerful. Yet, by the 'miraculous' power of scaling, I am able to run Medal of Honor Warfighter at a little more than a constant 30 fps with ALL effects (albeit scaled down). And through wonderful optimization by the developers, I am able to run Dishonored at above 50 fps with MAX (in-game) settings. All running at 768p and all on my 'crappy' (but MODERN) laptop. And the best part? This is all under a 35W, low power draw.
 