Orders of magnitude? An order of magnitude is a ten times increase, so orders of magnitude would be hundreds, thousands, etc.
Sorry, I meant multiple times faster.
I'm not a native English speaker.
Both questions are a bit loaded. A game that runs at 60fps is going to need more CPU power than when running that same game at 30fps, and that's independent of resolution. And even with enough CPU power, it could still be limited by memory bandwidth.
It's not debatable that both Xbox One and PS4 are orders of magnitude faster than the Wii U on pretty much all fronts. But that doesn't mean that some of the same games aren't possible on all platforms. There are countless PC games out there playable across systems with similarly large performance differentials.
The hardest thing to get around is the difference in memory size. A game using 5+ gigabytes of RAM on anything other than texture data is pretty much impossible on a system with just 1GB of usable RAM.
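To put a number on the CPU point above (this is just the frame-rate arithmetic, not a measurement from any of these consoles): the time each frame gets is set by the target frame rate alone, so the same game logic has half as long to finish at 60fps whatever the output resolution. A minimal sketch:

```cpp
#include <cstdio>

int main() {
    // Per-frame time budget depends only on the target frame rate;
    // resolution never enters this calculation.
    for (int fps : {30, 60}) {
        double budget_ms = 1000.0 / fps;
        std::printf("%d fps -> %.1f ms per frame for all CPU-side work "
                    "(game logic, physics, draw submission)\n", fps, budget_ms);
    }
    // 30 fps -> 33.3 ms, 60 fps -> 16.7 ms: same simulation, half the time,
    // whether the GPU is drawing 720p or 1080p.
    return 0;
}
```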
Another question: is 720p60 comparable to 1080p30? Which one is more taxing?
Is 720p60 as taxing as 1080p30?
Slandering Montpellier without a source, not nice...
That's a good question. Some point to the 2.25X increase in pixels, directly compare it to the 2X increase in framerate, and conclude that the former is more taxing. But that's not a fair direct comparison, because in one case you're just rendering more pixels. In the other case, the GPU has extra frames of pixels to render, textures to sample for those new frames, lights to evaluate for those new frames, etc. Double the work in every area. Then there are new frames of animation, sounds that need to be synced to those frames, and whatever work the CPU has cut out for it. It's a lot of extra work all around!
I've seen a dev (on GAF, I believe) say something to the effect of: "Boosting resolution is much easier than doubling framerate, so if devs have a bit of performance to spare, most would opt for an increase in pixels." This is also why 60fps is normally a goal decided upon very early in development; the whole engine must be built around it.
So although it's hard to quantify by a direct comparison of numbers, I'm inclined to believe that 720p60 may be more taxing, in general.
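For the raw pixel math behind that 2.25X figure (these are just resolution times frame-rate products, not benchmarks of any actual game): 1080p30 actually pushes slightly more pixels per second than 720p60, but it's the per-frame work that doubles at 60fps. A quick sketch:

```cpp
#include <cstdio>

int main() {
    const long long px_720p  = 1280LL * 720;   //   921,600 pixels per frame
    const long long px_1080p = 1920LL * 1080;  // 2,073,600 pixels per frame (2.25x)

    // Raw pixel throughput: pixels per frame * frames per second.
    std::printf("720p60 : %lld pixels/s\n", px_720p * 60);   // ~55.3 million
    std::printf("1080p30: %lld pixels/s\n", px_1080p * 30);  // ~62.2 million

    // Pixel throughput is in the same ballpark; the real difference is that
    // 60 fps also doubles everything done once per frame (animation, physics,
    // culling, draw submission), which a higher resolution barely touches.
    return 0;
}
```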
One thing is that framerate decisions often come down to how long it takes to process a frame. CPU and even GPU calculations take time, and the more a developer wants to do during a frame, the more of that time gets used up. Eventually the processing time goes beyond ~16.7ms (1000ms ÷ 60), which makes a constant 60fps impossible. The decision is then to either cut back on things that need processing, or drop to 30fps, or something lower (or variable).
1080p/30 vs 720p/60 is pretty hard to judge.
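A small sketch of that decision in code (the frame costs below are invented for illustration, not profiled from anything real): once the worst-case frame can't fit in the ~16.7 ms a 60fps target allows, the options are the ones described above.

```cpp
#include <cstdio>

// Hypothetical helper: classify a measured worst-case frame cost (in ms)
// against the 60 fps and 30 fps budgets.
const char* frame_rate_verdict(double frame_cost_ms) {
    if (frame_cost_ms <= 1000.0 / 60.0) return "a locked 60 fps is feasible";
    if (frame_cost_ms <= 1000.0 / 30.0) return "cut work for 60 fps, or cap at 30 fps";
    return "even 30 fps needs cuts, or accept a variable frame rate";
}

int main() {
    // Purely illustrative frame costs.
    for (double cost_ms : {14.0, 22.0, 38.0}) {
        std::printf("%.0f ms/frame -> %s\n", cost_ms, frame_rate_verdict(cost_ms));
    }
    return 0;
}
```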
Thanks for your input, this is all very interesting. Basically there are so many variables in play (engine, effects, non-graphics calculations, etc.) that comparing the two is very hard to do.
To be honest I did not expect so many Wii U games to be 720p60; while some are simpler, others like Bayo2, W101 and MK8 seem to be doing a lot.
I have a question too, and sorry if it has been discussed already, but do we know why the Wii U and PS360 versions of Rayman look the same? Before it went multiplat, didn't someone at Ubisoft say that they were doing things graphically that could only be done on the Wii U? Was that BS or what?
Here is the PC picture of Rayman Legends
http://images.eurogamer.net/2013/articles//a/1/6/1/2/3/1/6/PC_000.bmp.jpg
Here is the 360 version of the same scene. Use the cage to compare the two; the aliasing on it makes the differences easy to spot.
http://images.eurogamer.net/2013/articles//a/1/6/1/2/3/1/6/360_000.bmp.jpg
Here is the Wii U picture just for the comparison.
http://images.eurogamer.net/2013/articles//a/1/6/1/2/3/1/6/WiiU_000.bmp.jpg
Same native resolutions.
The PC and Wii U versions look about the same, minus particles (fireflies). The 360 version's lighting looks different. Compare the sides of the dragon: the Wii U and PC look identical, while the 360 looks darker... looks like a different lighting engine, maybe? Dunno, hard to tell.
Just like how the HD twins have existed next to PCs with a lot more memory? There are always ways around it, it's just how the results pan out.
I really am not the type to care about the graphics differences between the 3 consoles, but it kinda ground my gears when Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.
I wonder if they're spending so much time learning the console that they just put that idea on the backburner.
There's not a 'yes or no' answer to that. Both take roughly the same amount of GPU resources, but the rest of the system also needs to be up to snuff.
Actually, it's not so simple, even just for the GPU. In terms of framebuffer and bandwidth, yeah, they're pretty similar, but things like shader calculations take time as well, and doubling how many of them you need to do per second is non-trivial, since much of that work (per-vertex and per-draw shading) doesn't get any cheaper at a lower resolution.
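A toy cost model makes that split concrete; the constants below are invented purely for illustration, not derived from any real hardware. The point is only which terms scale with resolution and which scale with frame rate.

```cpp
#include <cstdio>

// Toy per-frame cost model with made-up constants: a fixed chunk of
// per-frame work (vertex shading, culling, draw submission) plus a
// per-pixel chunk (pixel shading, blending).
double frame_cost_ms(long long pixels) {
    const double per_frame_ms = 4.0;      // does not shrink at lower resolutions
    const double per_pixel_ms = 5.0e-6;   // scales directly with pixel count
    return per_frame_ms + per_pixel_ms * pixels;
}

int main() {
    const long long px_720p  = 1280LL * 720;
    const long long px_1080p = 1920LL * 1080;

    // Work per second = cost per frame * frames per second.
    std::printf("720p60 : %.0f ms of GPU/CPU work per second\n",
                frame_cost_ms(px_720p) * 60);
    std::printf("1080p30: %.0f ms of GPU/CPU work per second\n",
                frame_cost_ms(px_1080p) * 30);

    // Doubling the frame rate doubles *both* terms; raising the resolution
    // only grows the per-pixel term, which is why the two targets aren't
    // simply interchangeable.
    return 0;
}
```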
You are correct. It's not that it's impossible to adapt game engines to the system despite the massive gulf in performance (for example, see the Frostbite 2/3 debate even though those engines run on the 360, or UE4 and its mobile variants). The issue with the Wii U is that there is near-zero business incentive for the platform as far as the bean counters are concerned (both prior to launch, and obviously after).
Thanks to all of you for those elaborate answers. I underestimated how many more resources are needed to push a game from 30 to 60 frames.
I wouldn't necessarily say that the Wii U is balanced for 60fps, more that the games targeting the system (that are 60fps) are balanced well for 720p/60fps...
For that idea, they may also want to reduce the production cost of the controller. Nintendo would likely want to increase the popularity of their console before those production issues significantly improve.
Thanks, so it seems 1080p is more taxing on the graphics card and taking the game to 60fps affects the system in general. IMO Wii U seems indeed very balanced for 720p60.
I think I saw a mod mention that keeping a consistent 60 fps is closer to 3x the performance requirement of 30 fps due to those issues.
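One way a figure like that can arise, under assumptions invented here purely for illustration (a 30fps game tolerating the occasional overrun while a locked 60fps game cannot, and the heaviest frames costing roughly 1.8x an average one):

```cpp
#include <cstdio>

int main() {
    const double spike_factor = 1.8;  // assumed worst-frame-to-average ratio

    // A 30 fps target that shrugs off the odd long frame can let its
    // average frame run close to the full 33.3 ms budget.
    double avg_ok_at_30 = 1000.0 / 30.0;

    // A *consistent* 60 fps target must fit its worst frame in 16.7 ms,
    // so the average has to sit well below that.
    double avg_ok_at_locked_60 = (1000.0 / 60.0) / spike_factor;

    std::printf("allowed average work at 30 fps       : %.1f ms\n", avg_ok_at_30);
    std::printf("allowed average work at locked 60 fps: %.1f ms\n", avg_ok_at_locked_60);
    std::printf("effective gap: %.1fx\n", avg_ok_at_30 / avg_ok_at_locked_60);  // ~3.6x
    return 0;
}
```

The exact number depends entirely on the assumed spike factor; the only takeaway is that demanding consistency pushes the effective gap past the naive 2x.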
Well, we already know that a 2nd GamePad will add limitations to the experience (half the refresh rate). Given that Nintendo is already having trouble selling the GamePad, and spreading the message about it, I don't think it's in Nintendo's best interest to open the floodgates for negativity about the experience when using two of them.
Could they not just disable the display to the TV when using two GamePads?
While it's yet to be seen how general gaming performance will be affected by having two GamePads, the limitation we know of isn't where you're thinking (I believe). It's actually the data communication between the Wii U and the GamePad: the Wii U is only able to send 60 frames of GamePad video per second (unrelated to GPU performance), so to support two GamePads it has to split those 60 frames between the two devices.
Whether a game targets 720p60 is a development decision.
Interesting, the Wii U seems built for strong 720p games IMO, with either 60 or 30fps depending on the effects etc. that devs are willing to push.
To put it in context, a game designed explicitly to run at 1080p30 on a PS4 would be far too CPU intensive to run at 720p60 on a Wii U.
No, the 360 has a known gamma issue.
Just wondering if it's possible that Nintendo could have disabled some of the shaders until PS4/XB1 production started, then unlocked them in the October update. The console may start pushing out warmer air with the extra shaders unlocked.
So does the Wii U, but it isn't represented in the shots...
There would be no reason to do this, so no.
My biggest hope is that they can really optimize the OS over the coming year and eventually unlock a further half a gig of RAM for developers.
Is Rolf NB a dev? If so, who does he work for?
I am employed as a programmer, but what we do has nothing at all to do with games.
Well, to be more accurate, the Wii U needs a set that plays well with Limited RGB. Some capture equipment (like the gear Lens of Truth uses) doesn't play well with this, but my TV sets and projectors certainly do, so it looks "normal".
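For anyone unfamiliar with the Limited vs Full RGB distinction being discussed: limited ("video") range maps black to 16 and white to 235 instead of 0 and 255, and a display or capture chain that misinterprets the range makes the picture look washed out or crushed. A minimal conversion sketch using the standard 8-bit video-range math; nothing here is Wii U specific.

```cpp
#include <algorithm>
#include <cstdio>

// Expand limited-range (16..235) 8-bit video levels to full range (0..255).
unsigned char limited_to_full(unsigned char v) {
    int full = (static_cast<int>(v) - 16) * 255 / (235 - 16);
    return static_cast<unsigned char>(std::clamp(full, 0, 255));
}

int main() {
    for (int level : {16, 64, 128, 200, 235}) {
        std::printf("limited %3d -> full %3d\n", level,
                    limited_to_full(static_cast<unsigned char>(level)));
    }
    // If a capture box or TV treats these limited-range values as if they
    // were already full range, blacks sit at 16/255 (greyish) and whites
    // at 235/255 (dim), which is why the output can look "off".
    return 0;
}
```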
We will find out soon if they do this with the upcoming update.
...Can't the Internet Browser take up a hefty chunk of RAM?
Didn't Iwata say around January this year that people have not seen 50% of what the Wii U is capable of? He could be, and most probably is, talking about developers learning the hardware, though.
This is the same guy that told us (paraphrasing here):
We've learned from our mistakes with the 3DS. The Wii U will have a strong lineup of games within the launch window.
No, its launch window was not fine. Even now, it's not fine.
Not even Nintendo agree with you. Only last month they went on record saying they believe their biggest issue is lack of compelling software.
Which is why they had games with 2:3 attach rates. Lol yea, that's not compelling at all...
Its launch day was fine. Everything after 2012 was a bungled mess. IIRC, the main reason so many third parties came to launch was because Nintendo promised a better royalty percentage for launch-day games. In all honesty they should have extended that offer through the whole window, so there would have been a more consistent release schedule instead of 25-some-odd games of varying port quality (many bad) all landing on pretty much one day.
So Nintendo are lying then, are they? Even they're saying the lack of compelling software is hurting the Wii U.
Also, it's the major reason why the Wii U is breaking records as the worst-selling major console since the Saturn.
You're dreaming if you think the Wii U's software lineup even now is acceptable. It's pathetic.
Wii U's launch window was fine. It was everything between then and about now that blew.
There is a difference between not understanding what is going on and knowing the issue but being unable to do much about it. The software wasn't ready, and they decided not to rush it. They figured the first half of the year would be rough for the Wii U; that was why they did that epic January Nintendo Direct.
Thanks for the reply. Looks like all the enhancements were due to the upgraded shader effects and RAM. I personally found that pretty cool. Nintendo may have a fun time taking advantage of the complete hardware.
My slightly different outtake:
- Added non-anamorphic 16:9; renders at 1080p
- Reworked the graphics pipeline from Flipper/TEV to a modern GPU/programmable shader environment
- High Dynamic Range (HDR)
- Screen-Space Ambient Occlusion (SSAO)
- Self-shadowing
- Removed the horrible fake-DOF blur filter
- Loads up the entire ocean at once to improve traveling/pacing
- 32-bit color framebuffer (up from 16-bit with dithering; see the sketch after this list)
- Redone sky and clouds
- Higher-resolution texture assets
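As a side note on the framebuffer item above, here is a quick sketch of what moving from a 16-bit buffer to a 32-bit (8 bits per channel) one means for colour precision. RGB565 is just a common 16-bit layout used for illustration; the list doesn't say which 16-bit format the original game actually used.

```cpp
#include <cstdio>

// Quantize an 8-bit channel value down to n bits and back, to show the
// rounding a 16-bit (e.g. RGB565-style) framebuffer introduces compared
// with a full 8-bits-per-channel (32-bit) buffer.
int quantize(int value8, int bits) {
    int levels = (1 << bits) - 1;
    int q = (value8 * levels + 127) / 255;   // down to n bits
    return q * 255 / levels;                 // back up to 8 bits for display
}

int main() {
    std::printf("8-bit in | as 5-bit channel | as 8-bit channel\n");
    for (int v : {10, 100, 200, 250}) {
        std::printf("%8d | %16d | %16d\n", v, quantize(v, 5), quantize(v, 8));
    }
    // With only 32 or 64 levels per channel, smooth gradients show visible
    // steps (hence the dithering mentioned above); 8 bits per channel gives
    // 256 levels, so banding is far less of an issue.
    return 0;
}
```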
Nothing wrong with the way you were wording it all, though.
It did, notice the background texture:
It's not even the same asset; it was clearly painted over and reinterpreted, and it doesn't 100% match.
Other comparison, since we've been at it:
It's a crying shame they didn't spruce up geometry a little.
The machine was designed to use a lower amount of RAM than the other next-gen consoles. It will be interesting to see how devs end up using all of that RAM; devs were able to do some impressive stuff while managing with less than 512MB.
I really wouldn't be surprised, though, if clock changes were possible through a system update, although obviously not to the ridiculous degree that the rumour from a few months ago suggested. Maybe a small boost to the CPU and/or GPU would be possible.
My biggest hope is that they can really optimize the OS over the coming year and eventually unlock a further half a gig of RAM for developers.
The Wii U specs are what they are, but I believe that if you gave Nintendo the chance to change or tweak one thing, they would have put another 2GB of RAM in the system. That would give it 3GB for games, with the possibility of 3.5GB in the future, which isn't all that far behind the 5GB the PS4 and Xbox One offer for games. It would make porting games from the other two next-gen consoles far, far easier.
As for shitty sales, the Wii U was selling okay up until it fell off a cliff in January (conveniently when the launch-window titles just about stopped coming), so I guess that further supports my argument that the launch-window software wasn't that bad.