How big is the power gap between Wii U and PS3/360?

Red Dead Redemption looks better... it also seems to have a bigger draw distance:

[two Red Dead Redemption comparison gifs]

Are you serious... it looks so much worse compared to Zelda U.
 
Looks worse and runs sub-30fps much of the time. Was that game even native 720p?

To my eyes it looks much better, since it doesn't have the messed-up, unphysical bloom effect that is present on most characters in Wii U first-party games. Makes me want to take sandpaper and grind the characters until they reflect light in a natural way.
 
I don't understand why everyone is saying Mario Kart 8 has some of the best graphics. My GF bought it, and when I played it there was nothing about the graphics that screamed "this is better than PS3". If you are comparing it to the old versions of the same game, then yes, it's a big jump, but not compared to what's been done on PS3/360.
 
To my eyes it looks much better, since it doesn't have the messed-up, unphysical bloom effect that is present on most characters in Wii U first-party games. Makes me want to take sandpaper and grind the characters until they reflect light in a natural way.

The bloom is a design choice and doesn't really have anything to do with power.

I agree it looks out of place sometimes, though; some sections in WWHD were a bit overpowered by bloom.
 
[GET OUT! reaction gif]


Seriously, Bayo 2 looks amazing and is a day 1 purchase, but it looks nothing like GoW3. Hell, there aren't many games on PS4 that look as good yet!

Both games look about as good to me. Going by what has been shown so far of Bayo 2, hopefully the full game will look great and hold a consistent 60fps framerate.

I don't understand why everyone is saying Mario Kart 8 has some of the best graphics. My GF bought it, and when I played it there was nothing about the graphics that screamed "this is better than PS3". If you are comparing it to the old versions of the same game, then yes, it's a big jump, but not compared to what's been done on PS3/360.

It is a big jump if you factor in framerates and resolutions. The Sega racing games and Banjo-Kazooie: N&B are comparable to Nintendo's look, and those games were sub-30fps at 720p, while MK8 is 60fps/720p with much better lighting and shadows.
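Quick pixel-throughput math, just to put a number on that framerate difference (back-of-the-envelope, assuming native 720p on both sides):

```python
# Pixels pushed per second at 1280x720
w, h = 1280, 720
print(w * h * 30 / 1e6)  # sub-30fps game, best case: ~27.6 Mpixels/s
print(w * h * 60 / 1e6)  # MK8 at 60fps:              ~55.3 Mpixels/s
```

Double the pixels per second, before you even get to the better lighting and shadows.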

This thread is just gonna go in circles, with people posting pics of sub-30fps games to compare them to 60fps games. Not to mention the comparisons between completely different styles.
 
To my eyes it looks much better, since it doesn't have the messed-up, unphysical bloom effect that is present on most characters in Wii U first-party games. Makes me want to take sandpaper and grind the characters until they reflect light in a natural way.

So you are talking personal preference as opposed to raw horsepower... This thread is about horsepower.
 
It is a big jump if you factor in framerates and resolutions. The Sega racing games and Banjo-Kazooie: N&B are comparable to Nintendo's look, and those games were sub-30fps at 720p, while MK8 is 60fps/720p with much better lighting and shadows.

This thread is just gonna go in circles, with people posting pics of sub-30fps games to compare them to 60fps games. Not to mention the comparisons between completely different styles.

It is not, when you look at the fact that the geometry is piss-poor, with some objects having a far lower poly count than GT and Forza on PS360. Nintendo achieved 60fps by butchering a bunch of stuff like AA and poly count. So yeah, MK8 proves nothing to me.
 
I am quite familiar with that thread. Are you suggesting that any kind of consensus has ever been reached there?
Yes, there has been, and it left the Wii U with a 160-ALU / 176 GFLOPS GPU, which is significantly less than what both the Xbox 360 and the PS3 have. People can believe all they want, but there is no evidence that it's really more powerful.
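For context, that 176 figure is just the standard peak-FLOPS formula applied to the 160-ALU estimate and the commonly reported ~550MHz Latte clock; a back-of-the-envelope ceiling, not a measurement:

```python
# Peak GFLOPS = ALUs x 2 FLOPs/cycle (one fused multiply-add) x clock (GHz)
alus = 160
clock_ghz = 0.55  # ~549.9MHz is the commonly reported Latte clock
print(alus * 2 * clock_ghz)  # -> 176.0 GFLOPS
```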
 
I am quite familiar with that thread. Are you suggesting that any kind of consensus has ever been reached there?

No reasoning with loons who keep on going "Shinen, 1080p" and "DR, DR, DR" when the damn game has very low-poly objects and looks like absolute garbage. Fourth Storm was the most reasonable poster in that thread, and his estimate of 176 GFLOPS was independently corroborated by another trusted forum poster who was earlier sceptical of it. I will take their word, and the fact that there is no game on the Wii U rivalling GoW3 or U3, as proof that the Wii U GPU is 176 GFLOPS.
 
M°°nblade;116977952 said:
Yes, there has been, and it left the Wii U with a 160-ALU / 176 GFLOPS GPU, which is significantly less than what both the Xbox 360 and the PS3 have. People can believe all they want, but there is no evidence that it's really more powerful.

Whereabouts in the thread is this, out of curiosity? I've read a few pages but didn't really see anything.

Being able to "switch on" PC textures in NFS:MW would suggest otherwise, but I'm not really knowledgeable enough on the tech side to know.

edit: never mind, I've just read Fourth Storm's post earlier in this thread. So the GPU has less grunt but a better feature set.
 
It is not, when you look at the fact that the geometry is piss-poor, with some objects having a far lower poly count than GT and Forza on PS360. Nintendo achieved 60fps by butchering a bunch of stuff like AA and poly count. So yeah, MK8 proves nothing to me.

Those games have nice cars, but the tracks are super simplistic. Shadows are pretty blurry too.

No reasoning with loons who keep on going "Shinen, 1080p" and "DR, DR, DR" when the damn game has very low-poly objects and looks like absolute garbage. Fourth Storm was the most reasonable poster in that thread, and his estimate of 176 GFLOPS was independently corroborated by another trusted forum poster who was earlier sceptical of it. I will take their word, and the fact that there is no game on the Wii U rivalling GoW3 or U3, as proof that the Wii U GPU is 176 GFLOPS.

Bayonetta 2 seems to come close to GoW 3, so we'll see.
 
M°°nblade;116975675 said:
You forgot to mention that only half of its memory is used for games, that it's only half as fast, and that a 119-page-long GPU feature & power analysis thread pointed out the GPU being less powerful after all. :p
The GPU has a lower theoretical peak, but it's still faster in real-world applications. For starters, it's a true unified shader architecture, whereas RSX still had dedicated vertex and pixel shader units, and Xenos had a sort of proto-unified shader setup where not every shader unit supported the same operations, making it pretty much impossible for developers to max it out and potentially causing stalls. Also, the Wii U architecture relies heavily on local stores/caches and DMA, so main memory bandwidth isn't really all that important.
 
Whereabouts in the thread is this, out of curiosity? I've read a few pages but didn't really see anything.

Being able to "switch on" PC textures in NFS:MW would suggest otherwise, but I'm not really knowledgeable enough on the tech side to know.
Me neither. But having higher-res textures seems like something that can be explained by having a larger amount of memory available.
 
M°°nblade;116975675 said:
You forgot to mention that only half of its memory is used for games, that it's only half as fast, and that a 119-page-long GPU feature & power analysis thread pointed out the GPU being less powerful after all. :p

It's 4x total memory. Not all of the PS360's memory is available to use either.

Also, as I pointed out on the previous page, it makes no sense to say "less powerful" unless you specify which aspect of graphics processing output you're talking about. I'm sure you know from that thread that the general consensus was that, in this case, fewer shaders/flops didn't mean 'worse'. Those shaders are much more modern, and flop count alone is not an accurate yardstick when you're comparing GPUs from different generations.
 
It is not, when you look at the fact that the geometry is piss-poor, with some objects having a far lower poly count than GT and Forza on PS360. Nintendo achieved 60fps by butchering a bunch of stuff like AA and poly count. So yeah, MK8 proves nothing to me.

Lol, the fuck? Does GT or Forza have to calculate what items 11 other computer players are going to get, in addition to directing them? The physics that go along with using said items? Shadows cast on individual pieces of Koopa shells as they break? Level geometry that isn't static? There are so many other factors that go into Mario Kart.

No reasoning with loons who keep on going "Shinen, 1080p" and "DR, DR, DR" when the damn game has very low-poly objects and looks like absolute garbage. Fourth Storm was the most reasonable poster in that thread, and his estimate of 176 GFLOPS was independently corroborated by another trusted forum poster who was earlier sceptical of it. I will take their word, and the fact that there is no game on the Wii U rivalling GoW3 or U3, as proof that the Wii U GPU is 176 GFLOPS.
Bayonetta 2 runs at a rock-solid 60FPS; GoW3 fluctuates down to 30, probably even lower in certain scenes.

Perhaps you missed the video I posted on the previous page?
https://www.youtube.com/watch?v=OAHup4CXF9E&feature=youtu.be&t=22m30s
 
M°°nblade;116977952 said:
Yes, there has been, and it left the Wii U with a 160-ALU / 176 GFLOPS GPU, which is significantly less than what both the Xbox 360 and the PS3 have. People can believe all they want, but there is no evidence that it's really more powerful.

The problem with theoretical maximum calculations, especially in the case of FLOPS, is that the theoretical maximum number of MUL-ADD operations a GPU or CPU/GPU combo can crank out in a second doesn't really matter. Comparing such a number across multiple computer organizations/architectures is foolish.

Example:

An HD6870 uses a 5-way VLIW architecture (as did all AMD GPUs from the HD 2000-5000 series) and is a 2016 GFLOP GPU.

An HD7850 uses the GCN 1.0 architecture and is a 1760 GFLOP GPU.

An HD7850 produces better graphical output than an HD6870 despite having a lower theoretical ability to crank out MUL-ADD operations.

How this relates:

The GPU in the 360 is from an architecture that predates even VLIW5. So attempting to compare graphical performance as a theoretical floating-point number directly between the GPU (or CPU/GPU combo) in the Xbox 360 and the GPU in the Wii U is not an acceptable comparison.
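To put actual numbers on that example, here's the same naive peak formula applied to both cards (stock specs, with every ALU assumed to retire one fused multiply-add per cycle). The formula says the VLIW5 card should win; in practice it doesn't, because VLIW5 almost never keeps all five lanes busy on real shader code:

```python
def peak_gflops(alus, clock_mhz):
    """Theoretical peak: every ALU retires one FMA (2 FLOPs) per cycle."""
    return alus * 2 * clock_mhz / 1000

print(peak_gflops(1120, 900))  # HD6870 (VLIW5): 2016.0 GFLOPS
print(peak_gflops(1024, 860))  # HD7850 (GCN):   ~1761 GFLOPS
```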
 
Simply not true. The Wii U has more / faster / better-structured memory. 4x more. A faster GPU with more modern shader capabilities. And it is capable of running great-looking games at 60fps.

4X more memory, but only just over double for the game. The PS360 use 45-60MB for the OS iirc, while the Wii U reserves 1GB.

And "better structured" has to be defined. I'm absolutely sure it has a better memory controller, which allows lower latencies and higher efficiency than the PS360, but its RAM modules can only manage a total of 12.8GB/s of bandwidth for a unified CPU/GPU pool, while the PS3 had two pools of ~20GB/s and the 360 had a unified ~23GB/s as well, plus 10MB of very fast eDRAM for the GPU alone. Sure, real-world efficiencies will bring them closer, but I don't think enough to jump that gap.

The 32MB eDRAM will no doubt help, but it's like the XBO problem on crack: fast eDRAM, limited speed to the rest of the memory compared to the others.
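For anyone wanting to check, those bandwidth figures are just bus width x transfer rate (assuming the commonly reported chip specs, so treat this as back-of-the-envelope):

```python
def bandwidth_gbs(bus_bits, mtransfers_s):
    """Peak bandwidth: bus width in bytes x transfer rate in MT/s."""
    return bus_bits / 8 * mtransfers_s / 1000

print(bandwidth_gbs(64, 1600))   # Wii U DDR3-1600, 64-bit bus -> 12.8 GB/s
print(bandwidth_gbs(128, 1400))  # 360 GDDR3 @ 700MHz, 128-bit -> 22.4 ("~23")
print(bandwidth_gbs(128, 1300))  # PS3 RSX GDDR3 @ 650MHz      -> 20.8 ("~20")
# PS3's other pool (XDR) is ~25.6 GB/s; the eDRAM pools sit on top of these.
```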

Remember these things? Good times in the mid 'aughts.

[Xbox 360 and PS3 memory bandwidth diagrams]

Also, as for the consensus not being formed in that Latte GPU thread: I'm pretty sure the 320-shader theory was mostly thought of as crackpot, while 160 was considered very likely. Xenos would equate to 240 shaders if counted the way they are now, though of course less efficient ones. Also, nice of the above link to only show midway through that thread, rather than nearer the end, where most people were convinced, with maybe two or three holdouts left. 160 shaders is simply the most likely given the die size once you remove the eDRAM, and the 8 compute blocks.
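For what it's worth, the 240 figure is just Xenos's 48 vec4+scalar units counted per-lane, the way shaders are counted now (same peak-number caveats as always):

```python
units = 48            # unified shader units in Xenos
lanes = units * 5     # vec4 + scalar = 5 ALU lanes per unit -> 240 "shaders"
print(lanes, lanes * 2 * 0.5)  # 240 shaders; x2 FLOPs x 0.5GHz = 240.0 GFLOPS
```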

I'm sure the Wii U has a better GPU than last gen due to architectural efficiency improvements. The CPU will maybe be better for integer work, especially if devs don't take care of pipeline bubbles (large hits for the PS360, 4-6 cycles for the Wii U's pipeline), and far worse for floating point and SIMD. If a game relies on a lot of SIMD, it will need a lot of reworking for the U. To answer the OP's question of "how big is the power difference": it's not easy to say in a nutshell. It's a very mixed bag. Better in some ways, worse in others.
 
4X more memory, but only just over double for the game. The PS360 use 45-60MB for the OS iirc, while the Wii U reserves 1GB.

And "better structured" has to be defined. I'm absolutely sure it has a better memory controller, which allows lower latencies and higher efficiency than the PS360, but its RAM modules can only manage a total of 12.8GB/s of bandwidth for a unified CPU/GPU pool, while the PS3 had two pools of ~20GB/s and the 360 had a unified ~23GB/s as well, plus 10MB of very fast eDRAM for the GPU alone. Sure, real-world efficiencies will bring them closer, but I don't think enough to jump that gap.

The 32MB eDRAM will no doubt help, but it's like the XBO problem on crack: fast eDRAM, limited speed to the rest of the memory compared to the others.

360 uses 32MB and PS3's OS footprint varies depending on what optional programs you choose to run with your game.
 
Wii U games look great due to Nintendo's art direction and color palette decisions. It's pretty well documented that the system is comparable to the last-generation systems power-wise. Not sure where all this debate comes from.

Those who say that Wii U games are technically more advanced than PS4/XBone, or that the new systems would "melt" trying to run Zelda, are embarrassing themselves though.
 
I am quite familiar with that thread. Are you suggesting that any kind of consensus has ever been reached there?

Yeah, the 160 shaders thing. Well, imagine this: I present to you a GPU with 1400 shaders. Now, you would think this is great. The problem is, each shader is supported by a measly 2KB of SRAM. For some reason, despite having so many shaders, this GPU can't outdo Xenos. It's obviously the memory embedded with the shaders: the shader core is bottlenecked, and the memory available to the shaders is small and slow.

This is before you get to main memory or the eDRAM on Latte.
 
It's 4x total memory. Not all of the PS360's memory is available to use either.
But those systems only use 32-42MB for their OS and didn't reserve 1000MB for it like the Wii U does, so mentioning total memory is a useless parameter, since games don't benefit from it. It's only approximately 2x the amount of in-game memory.

Also, as I pointed out on the previous page, it makes no sense to say "less powerful" unless you specify which aspect of graphics processing output you're talking about. I'm sure you know from that thread that the general consensus was that, in this case, fewer shaders/flops didn't mean 'worse'. Those shaders are much more modern, and flop count alone is not an accurate yardstick when you're comparing GPUs from different generations.
I agree with all that, yet there's still little basis for Clockwork5 to say it's more powerful.
 
Lol, the fuck? Does GT or Forza have to calculate what items 11 other computer players are going to get, in addition to directing them? The physics that go along with using said items? Shadows cast on individual pieces of Koopa shells as they break? Level geometry that isn't static? There are so many other factors that go into Mario Kart.


Bayonetta 2 runs at a rock-solid 60FPS; GoW3 fluctuates down to 30, probably even lower in certain scenes.

Perhaps you missed the video I posted on the previous page?
https://www.youtube.com/watch?v=OAHup4CXF9E&feature=youtu.be&t=22m30s

Exactly, people are just easily impressed by realism.
 
The Wii U certainly has more RAM and a more modern GPU than its last-gen equivalents, but in terms of overall power, its performance is much closer to the 360/PS3 than to the Xbox One and PS4.

Nintendo's strength is (and always has been) creating a visual style that looks impressive while not being particularly taxing on the hardware. They've been doing this since the GameCube days; there's a reason the graphics in games like Wind Waker hold up so well even more than a decade later. The games might be pretty, but the PS4 and Xbox One could easily do what's being done there, many times over. Sadly, the reverse isn't true, which is hurting Nintendo on the 3rd-party front. Sure, the PS3 and 360 would have no hope of keeping up, but those systems had far less RAM and a much more archaic GPU architecture (not to mention that they are 8-9 years old now; handily beating them should be expected, not some special achievement deserving of praise).
 
I wonder why the Wii U needs so much RAM just for the OS. I wonder if they will ever lower it and give it back for games?
 
M°°nblade;116980757 said:
But those systems only use 32-42MB for their OS and didn't reserve 1000MB for it like the Wii U does, so mentioning total memory is a useless parameter, since games don't benefit from it. It's only approximately 2x the amount of in-game memory.

Agreed. I was pointing out that it's not a useful stick for either side to use.

M°°nblade;116980757 said:
I agree with all that, yet there's still little basis for Clockwork5 to say it's more powerful.

Agreed again :) See my posts from the previous page on using phrases like "more powerful" and "less capable"; it's all nonsense without specific context and just bogs down the discussion imo.
 
360 uses 32MB and PS3's OS footprint varies depending on what optional programs you choose to run with your game.

Ah, they shrunk the OS a few times; I wasn't up to date. Even more to my point, though: the Wii U does not give 4X more memory to the game, it's just over double.
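In rough numbers, using the figures cited in this thread (the PS360 free-memory figure is approximate and shrank over the gen):

```python
total_ratio = 2048 / 512  # Wii U 2GB vs PS360 512MB      -> 4.0x total
game_ratio = 1024 / 470   # 1GB for games vs ~470MB free  -> ~2.2x usable
print(total_ratio, round(game_ratio, 2))
```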
 
Wii U has more RAM and a much better GPU, but a much worse CPU.

The difference is hard to assess, it's more complicated than just "it's X times more powerful".

But it's certainly closer to the competition's past gen than their current gen.
 
4X more memory, but only just over double for the game. The PS360 use 45-60MB for the OS iirc, while the Wii U reserves 1GB.

And "better structured" has to be defined. I'm absolutely sure it has a better memory controller, which allows lower latencies and higher efficiency than the PS360, but its RAM modules can only manage a total of 12.8GB/s of bandwidth for a unified CPU/GPU pool, while the PS3 had two pools of ~20GB/s and the 360 had a unified ~23GB/s as well, plus 10MB of very fast eDRAM for the GPU alone. Sure, real-world efficiencies will bring them closer, but I don't think enough to jump that gap.

The 32MB eDRAM will no doubt help, but it's like the XBO problem on crack: fast eDRAM, limited speed to the rest of the memory compared to the others.
You don't constantly max out the memory bus; you mostly need high bandwidth to deal with spikes. As I just wrote, the Wii U relies on local stores and DMA. It's not just MEM1 (the eDRAM) - the CPU, GPU, IO processor, audio DSP and video encoder all have large dedicated local caches, which helps reduce or even eliminate those spikes.
 
I wonder why the Wii U needs so much RAM just for the OS. I wonder if they will ever lower it and give it back for games?

I think it's so you can pop into the browser at any time, plus a few other background functions. I'd love for them to shrink it and give devs another 50% more RAM on the thing, but with an already slow OS, shrinking the RAM it uses AND speeding it up at the same time is going to prove quite difficult.

You don't constantly max out the memory bus; you mostly need high bandwidth to deal with spikes. As I just wrote, the Wii U relies on local stores and DMA. It's not just MEM1 (the eDRAM) - the CPU, GPU, IO processor, audio DSP and video encoder all have large dedicated local caches, which helps reduce or even eliminate those spikes.

I'm aware, and like I said, it's a bit like the XBO problem on crack, I would imagine. Nintendo themselves will obviously do great at optimizing around it, but for third-party devs, working with a single fast memory pool is easier than micromanaging an eDRAM/eSRAM cache.


The PS4 vs XBO memory architectures are a bit (and I emphasize "bit" before someone runs away thinking I said the situations are the same) like comparing the 360's or PS3's memory bandwidth to the Wii U's. Both had faster main pools of memory; the 360 had less eDRAM and the PS3 had none. Perhaps the Wii U could punch above them by combining the eDRAM with the newer memory controller, but it's something else devs have to worry about.
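To illustrate the spirit of that micromanaging, here's a toy sketch: stream a large buffer through a small fast pool in tiles, with plain slices standing in for explicit DMA transfers. Pure illustration under my own assumptions, not any real SDK API:

```python
EDRAM_BYTES = 32 * 1024 * 1024  # size of MEM1, the fast pool (for scale)

def process_tiled(main_ram, work, tile_bytes):
    """Carve work into tiles that fit the fast pool, moved explicitly."""
    for off in range(0, len(main_ram), tile_bytes):
        tile = main_ram[off:off + tile_bytes]        # "DMA in" over slow bus
        main_ram[off:off + tile_bytes] = work(tile)  # "DMA out" the result

# Toy-scale demo: 1MB buffer, 64KB tiles
buf = bytearray(1024 * 1024)
process_tiled(buf, lambda t: bytes(b ^ 0xFF for b in t), 64 * 1024)
print(buf[0])  # -> 255
```

With a single fast pool you just touch the buffer in place; here every access pattern has to be planned around the transfers, which is the extra work devs complain about.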
 
Bayonetta 2 is much, much more impressive than God of War 3. I do not even get the "look at dem big dudes fighting" argument... scale is fun, but it does not make your game technically superior in any way.
 
Bayonetta 2 is much, much more impressive than God of War 3. I do not even get the "look at dem big dudes fighting" argument... scale is fun, but it does not make your game technically superior in any way.

The discussion is not about games' technical superiority though. Otherwise GTA5 would win easily. How Rockstar managed to pull that off on both 360 AND PS3 is a miracle.

If you want to compare the consoles power you should look at the hardware, not at the games.
 
The discussion is not about games' technical superiority though. Otherwise GTA5 would win easily. How Rockstar managed to pull that off on both 360 AND PS3 is a miracle.

If you want to compare the consoles power you should look at the hardware, not at the games.

Not sure if serious? The game dipped to 20FPS.

Meanwhile here's some 60FPS gameplay of Bayonetta 2:
http://videos.videopress.com/ebOzQphn/e3-2014-60fps_hd.mp4

and Mario Kart 8:
http://www.gamersyde.com/hqstream_mario_kart_8_electrodrome-32084_en.html
http://www.gamersyde.com/hqstream_mario_kart_8_bowser_castle-32085_en.html
 
The lack of power is overrated to me, because Nintendo games have unique art styles rather than going for photorealism. I think in the long run it will pay off for Nintendo.
 
The lack of power is overrated to me, because Nintendo games have unique art styles rather than going for photorealism. I think in the long run it will pay off for Nintendo.

I think this boils down to the chicken-and-egg argument. Nintendo HAS to rely on cartoonish art styles due to the lack of sufficient power to drive anything else that could be eye-catching. I think the recent Nintendo games look fantastic, but I do not turn a blind eye to how those visuals are being achieved and what they are masking.
 
Bayonetta 2 runs at a rock-solid 60FPS; GoW3 fluctuates down to 30, probably even lower in certain scenes.

Perhaps you missed the video I posted on the previous page?
https://www.youtube.com/watch?v=OAHup4CXF9E&feature=youtu.be&t=22m30s
Probably? Lol. God of War 3 runs at an unlocked framerate, with the framerate rarely dipping below 30FPS and averaging between about 40 and 50 FPS.

Of course, if you had actually played the game, you would know this. But please, continue making more uninformed statements about games you clearly haven't even played.
 
Aonuma confirmed it was all in-game running on a Wii U.

Yes, it was "in game" (i.e. in-engine/real-time on Wii U), but it probably wasn't "gameplay" (i.e. someone actually playing part of the game). It might have been, but Aonuma's translation seems to have meant in-game rather than gameplay.
 
Yes, it was "in game" (i.e. in-engine/real-time on Wii U), but it probably wasn't "gameplay" (i.e. someone actually playing part of the game). It might have been, but Aonuma's translation seems to have meant in-game rather than gameplay.

Yes that's what I meant. It's running realtime.
 
Probably? Lol. God of War 3 runs at an unlocked framerate, with the framerate rarely dipping below 30FPS and averaging between about 40 and 50 FPS.

Of course, if you had actually played the game, you would know this. But please, continue making more uninformed statements about games you clearly haven't even played.

Lol, I've never played it? Perhaps you want me to go into detail on the Hercules gloves? Or maybe the final boss fight? Or how about the Cerberus boss battle that takes place prior to repositioning the stone hands? I played through that button-mashing POS last year.


Also here's a video buddy:
https://www.youtube.com/watch?v=kYnydQ4t0aU

It dips to 35 FPS in this video alone, and below 30 at the beginning of the video.
 
Wii U has more RAM and a much better GPU, but a much worse CPU.

The difference is hard to assess, it's more complicated than just "it's X times more powerful".

But it's certainly closer to the competition's past gen than their current gen.

The CPU isn't much worse, it's just very different; it's far more efficient, and on some types of code it actually performs much better.
 
Lol, I've never played it? Perhaps you want me to go into detail on the Hercules gloves? Or maybe the final boss fight? Or how about the Cerberus boss battle that takes place prior to repositioning the stone hands? I played through that button-mashing POS last year.


Also here's a video buddy:
https://www.youtube.com/watch?v=kYnydQ4t0aU

It dips to 35 FPS in this video alone, and below 30 at the beginning of the video.
Sorry, you sound like someone who hasn't played it. If you had played the game, you wouldn't be lumping it with other 30FPS games like TLoU and Uncharted.

So either you played it and you're making disingenuous claims about its technical details, like it being "30FPS", or you haven't played it and you're uninformed. Either way, you're a fool.
 
Sorry, you sound like someone who hasn't played it. If you had played the game, you wouldn't be lumping it with other 30FPS games like TLoU and Uncharted.

So either you played it and you're making disingenuous claims about its technical details, like it being "30FPS", or you haven't played it and you're uninformed. Either way, you're a fool.

Still don't believe me? Perhaps I should go further?
The final boss fight with Zeus has you destroying Gaia's heart.
Do you want me to keep listing portions of the game? It really wasn't very good; I'd rather not.

If you are referring to this post of me "lumping" them together, when it was someone else who did that: there's a reason why I put a tilde in front of the 30. Hint: it isn't for Uncharted or TLOU.
http://www.neogaf.com/forum/showpost.php?p=116970122&postcount=543
 
The CPU isn't much worse, it's just very different; it's far more efficient, and on some types of code it actually performs much better.

To add to this, its short pipeline (4-6 stages [it adds 2 cycles for FP] vs ~30 or whatever in Xenon/Cell) makes it good for general and branchy code, especially if programmers don't take care to avoid pipeline bubbles. It's only a 4-6 cycle hit for Espresso, versus a ~30-cycle hit for the others. I think it's between 12 and 16 for Jaguar, but I don't remember. It's also very efficient at integer work.

Where it's far worse is in SIMD and floating point. The PowerPC 750 doesn't have traditional SIMD units, but the GameCube version added basic SIMD in the form of paired singles, far less broad than what was in Xenon/Cell. The Wii and Wii U versions still have that very basic SIMD block. And floating point was always a weakness on the 750, which carried over.


Now, in the 7th generation, since Cell and Xenon were so good at SIMD and floating point, games leaned on those very heavily by the end. So they'll have to be reworked with integer and shitty SIMD in mind for the Espresso.

Saying one is better than the other is a very complicated matter, but in a nutshell: Xenon and Cell are much higher in theoretical performance, while the Wii U is easier to take advantage of, but hampered by current cross-gen games using what it's bad at. Towards the end of the 7th-gen cycle, Xenon and Cell are being made very good use of; still below their theoreticals, I'm sure, but much closer to them than they were at the start of their gen.

~15 GFLOPS of 32-bit paired-single flops on the 64-bit FPU for the Wii U seems insurmountably smaller than the 200 GFLOPS of the Cell, or more realistically ~100 for the Xenon, but those numbers were reached with all their SIMD units, so they only count for code running on them.
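That ~15 falls straight out of the reported specs, assuming one paired-single FMA per core per cycle (a ceiling that only applies to code actually written for paired singles):

```python
cores = 3
clock_ghz = 1.24      # Espresso's commonly reported clock
flops_per_cycle = 4   # 2-wide paired singles x 2 FLOPs per FMA
print(cores * clock_ghz * flops_per_cycle)  # -> 14.88, i.e. ~15 GFLOPS
```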
 