Smurfman256
Member
Wind Waker is 30 fps. I saw it at the Best Buy Nintendo E3 event. I always thought 30 fps was intentionally implemented.
Thanks. It'd be a lot easier to tell if all of the trailers on the Nintendo eShop weren't locked at 30FPS.
I haven't really been here for long but I always hear you mentioning "WUST days". What does that mean? Just curious.
If this company doesn't make technically advanced games, the same goes for Bayonetta 2.
For now, Bayonetta 2 is at a level which is ahead of what was possible last generation. Even The Last of Us (the highest achievement seen on current gen) is below it technically.
What pushes Bayonetta 2 above current gen?
It seems to share the compromises current gen games have to make to run at 60fps: 720p, no AA, pixelated shadows and even textures, baked shadows, relatively simple geometry, low quality depth of field...
This is why I think that the only reason people think it looks better is because it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.
USC-fan said: No.
They are about as similar as x360 is to xbone.
I don't think it's evidence of a generational leap but at the very least, Bayo2 looks better than Bayo1. Textures are noticeably higher res, the framerate is much more stable around 60fps, polycount seems pushed farther between the boss and the hair-demon-thingy, the lighting is better, etc.
But ultimately I don't even think Bayo2 is the best looking Wii U game, it just performs admirably with a lot of flair.
I think Nintendo is doing well.
You know that the difference between 1080p and 720p comes down to power and RAM.
For example, a PS4/Xbox One game released at 1080p might use 7 GB of RAM.
To release the same game on Wii U at 720p, it would need about 1.75 GB of RAM.
And this is the math:
1080p uses about four times the power of 720p.
8 GB of RAM is four times the size of 2 GB.
The PS4/Xbox One are about four times as powerful as the Wii U.
If the graphics card in the Wii U has customized functions that are already on the PS4/Xbox One graphics cards, then the only difference between them is that factor of four in power and size.
That would mean the Wii U is safe for multiplatform games, because it has those customized functions.
That is the theory.
But technically speaking, it's clear that any game released on PS4/Xbox One at 720p will never be released on Wii U as-is; to release it on Wii U they would have to drop it to 576p and remove many techniques.
I think third parties will never release multiplatform games on Wii U because of how many techniques would have to be removed, and they don't want to make their games look much worse.
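(A quick back-of-the-envelope check of those ratios, just plain arithmetic on my part and nothing official: by raw pixel count 1080p is 2.25x 720p rather than 4x, while 8 GB really is 4x 2 GB.)

```cpp
// Plain arithmetic on the resolution and RAM figures discussed above.
#include <cstdio>

int main() {
    const long px_1080p = 1920L * 1080;   // 2,073,600 pixels
    const long px_720p  = 1280L * 720;    //   921,600 pixels
    std::printf("1080p / 720p pixel ratio: %.2f\n",
                static_cast<double>(px_1080p) / px_720p);        // prints 2.25
    std::printf("8 GB / 2 GB RAM ratio:    %.2f\n", 8.0 / 2.0);  // prints 4.00
    return 0;
}
```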
Uhhhmmm, no. People have been gaming at 1080p@30fps for years on PC systems less "powerful" than a Wii U, with less RAM and a worse graphics card.
edit: You have to play to your system's strengths and set a minimum acceptable target. Optimization takes you miles with proper hardware utilization.
In a way, you are right.
Recently I visited my friend, who is a PC gamer, and I took my Wii U with me.
I showed him Trine 2 and he was impressed with the Wii U; he said he never thought the Wii U could make games look this nice, and that its graphics were close to his PC's, nearly the same.
Then he started Trine 2 on his PC with FXAA, 1080p, 60 fps.
He commented that my $300 Wii U showed really good graphics while his $3000 PC showed nearly the same graphics, with only a few advantages on the PC side.
He was also happy with Nintendo Land and NSMB, and said the Wii U is really nice when you consider it only costs $300-350 and comes with the best controller ever.
He wished he could have the GamePad on PC.
I'm not saying that it has bad graphics. I have a Wii U. I also have a damn fine PC. If your friend paid $3k for his PC he either has an extremely overpowered PC or he did not build it himself.
I'm not knocking the Wii U, I just wanted you to know that PCs with less "power", as you like to put it, have been running the same games as the Wii U, but at 1080p@30fps.
I love the off-TV play, and I think the GamePad is comfortable as hell. But, more than anything, the Wii U is going to be a system that is left with a ton of exclusive titles, first-party or third-party, as the multiplatform games coming out for the PS4/Xbox One are very soon not going to be worth the effort of scaling down to Wii U capabilities. Which in turn means that the games released on the platform will most likely be tailor-made to play to the strengths of the system, and as time goes by we won't see much parity between the Wii U and the SonyM$ twins either.
It'll be an interesting future for sure.
Completely unrelated, are you using a text translator? It seems English is not your first language, and I'm curious as to what is.
Smurfman256 said: This is why I think that the only reason people think it looks better is because it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.
It's funny to see how it seems to some people that the WiiU has a "color trick" or anything.
TheGuardian said: Although I'm not dismissing potential improvements due to the hardware having some strong points above the PS360, it is natural that after all the experience that Platinum gained since the first Bayonetta, doing stuff like Vanquish and MGR, the sequel would look better, even if it was on the old consoles.
The jump between Bayonetta 2 and any game made by this studio on past-gen consoles is MUCH bigger than the jump between any of the games they did there.
What pushes Bayonetta 2 above current gen? It seems to share the compromises current gen games have to make to run at 60fps: 720p, no AA, pixelated shadows and even textures, baked shadows, relatively simple geometry, low quality depth of field...
Geometry is higher than the one seen in The Last of Us, and the same goes for texture quality.
I mean, this is the geometry of the Killzone 4 scenario:
The funny thing about that Killzone 4 pic is that it was from one of the most impressive scenes shown in the game so far. The geometry is really simple, the colors and textures are reused everywhere, yet it gets a pass but Bayonetta 2 doesn't? Now, don't get me wrong, the overall image still looked pretty good, but it was funny how quick people were to point out similarly simple geometry in the buildings in Bayo2.
Never mind the fact that you are speeding past those buildings on a jet at 60fps during gameplay (vs. a 30fps cutscene for KZ:SF), or that in the Gomorrah boss fight toward the end you see a large cityscape similar in scope to the one in that opening sequence of KZ:SF, with much less monotonous textures, insane water particle FX and a ton of stuff happening on screen at any given time, or the fact that close-ups of the character models hold their own - quite easily too - against Killzone's. Nope, people chose not to see those things, because a Wii U game is not allowed to be impressive, even while they praise a PS4 game that doesn't offer as much to marvel at.
I often wonder if the people claiming Bayonetta 2 looks like current gen even watched the whole Gomorrah fight in the demo playthroughs. Never have I seen anything looking this good on PS360. As you said, during that fight scene we see the following:
-> you fight that huge, super detailed monster which is scaling a building
-> the windows of the building reflect the sunlight and other buildings of the city
-> said city is also fully rendered in the background
-> a lot of small chips from the buildings crumble off and fly through the air
-> WHILE it appears to be raining... (which might only be water from the building's broken pipes, though)
No one is claiming the Wii U is as strong as the PS4/Xbox One. It clearly isn't. But can we now please all agree on the fact that the Wii U is a pretty good step beyond the PS360?
What does it mean?
I've asked this in another thread but didn't get a response. Perhaps it's been answered before.
Is there a comprehensive comparison between the Wii and the Wii U hardware?
What's the difference in RAM, CPU, and GPU?
What's the difference in FLOPs?
Maybe the Wii U is considerably more powerful than the Wii. In that case, the complaints from gamers would mostly stem from people who played on 360/PS3 and PC.
Otherwise, Wii gamers would probably be impressed by the jump, right?
This would imply that the WiiU OS doesn't have a scheduler; or a traditional one anyway.
Normally threads run for a period of time (a 'slice') then check to see if they should carry on running or handover processing time to another thread.
Time slicing allows for multitasking. The lack of a scheduler would imply that the WiiU does not pre-emptively multitask. This would imply that it would be up to the developer to add handover instructions within their code to stop something hogging a hardware thread/core.
The 'main thread' is probably a reference to the main loop (int main()), which is always running. This generally does little other than some initial dispatch instructions. If this has the same 'nice' value as other threads then a core could be sat there doing nothing much at all; and this could be what Ideaman was talking about when he said some devs had to alter their code in order to get it running across all cores.
The lack of a scheduler would mean that more CPU time is freed to process instructions undisturbed.. but it would also mean that developers have to be more aware of what they're doing in order to make the most out of the power.
/cpu_derail
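As a rough illustration of that kind of explicit handover, here is a minimal C++ sketch; the actual Wii U SDK primitives aren't public, so std::this_thread::yield() below is just a stand-in for whatever handover call a real engine on that hardware would use.

```cpp
// Cooperative-style worker: without a pre-emptive scheduler slicing time for it,
// the code itself gives the core back between chunks of work.
#include <cstdio>
#include <thread>
#include <vector>

void process_chunk(int worker_id, int chunk) {
    // stand-in for a slice of real game work (animation, audio mixing, ...)
    std::printf("worker %d processed chunk %d\n", worker_id, chunk);
}

void worker(int worker_id, int chunks) {
    for (int i = 0; i < chunks; ++i) {
        process_chunk(worker_id, i);
        // Explicit handover point: let another thread sharing this core run
        // instead of hogging it until all the work is done.
        std::this_thread::yield();
    }
}

int main() {
    std::vector<std::thread> threads;
    for (int id = 0; id < 3; ++id)
        threads.emplace_back(worker, id, 4);   // three workers sharing CPU time
    for (auto& t : threads)
        t.join();
    return 0;
}
```

Under a pre-emptive scheduler the yield calls would be unnecessary, which is the difference the post above is getting at.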
Very interesting, and that does seem to match up with the situation Ideaman described. Thanks for your explanation.
The lighting does indeed look better, but textures are also important, and DX10 doesn't support tessellation, does it?
The funny thing is that this strong rumor about the CPU being underutilized early on has not been reported by the media. That Project Cars log seems to support it.
Guess there is no fun in reporting it; the only fun comes from bashing the Wii U CPU.
Seeing as the current-gen consoles' graphics cards are based on DirectX 9 features, the Wii U's on DirectX 10.1 and the PS4/XB1's on DirectX 11, I found the following interesting. Note that, as has been clarified many times, this refers to the feature set and not the actual DirectX API, which I think only MS uses.
Disclaimer: I know it's not as simple as this, as many other factors weigh in, but I thought it was interesting nonetheless. There are also the other components in the PS4 and XB1, and the various mysteries in the Wii U, like the tessellation unit, which we know exists but whose implementation we're not sure about.
http://www.overclock.net/t/597046/dx11-vs-dx10-vs-dx-9-pics
From there I added the comparisons below. From the pictures, IMO there is a bigger difference from DX9 to DX10 than from DX10 to DX11 in the lighting, shadows, DOF and detail.
This seems to be around the same kind of difference I am seeing in the batch of Wii U games shown at E3: even SSB, whose Brawl roots are still obvious, looks very, very good with just the lighting, and in the WW port the HD-ification and better lighting go a long way.
Road DX9 [image]
Road DX10 [image]
Road DX11 [image]
Dragon DX9 [image]
Dragon DX10 [image]
Dragon DX11 [image]
House DX9 [image]
House DX10 [image]
House DX11 [image]
Another shot from Call of Juarez: Bound in Blood comparing DX9 and DX10. [image]
Other DX9 vs DX10 comparisons:
Crysis [image]
Age of Conan: Hyborian Adventures [image]
Just lighting, shadows and AA go a long way in creating more life-like scenes.
AMD cards have had hardware tessellation since 2001. It was only recently implemented into DX11.
What has been said around here (please correct me if I am wrong) is that earlier, non-DX11 cards had tessellation units, but they were not as efficient as those in DX11 cards.
The Wii U has a tessellation unit and a developer (Shinen) is using it. It is not clear which tessellation unit the Wii U has, as the chip is custom.
From what I understand, the Wii U supports many of the key features in DirectX 11 but is actually running DirectX 10.1...?
Thinking about when Tetsuya Nomura said that both Kingdom Hearts 3 and Final Fantasy XV will not be coming to Wii U due to it not supporting DirectX 11: can someone explain to me the limitations of a DirectX 10.1 system that supports DirectX 11 effects?
My questions are:
- Can DirectX 11 games not run on DirectX 10.1 at all?
- What is stopping DirectX 11 games from running on DirectX 10.1?
- What features are missing that might prevent optimizing and porting?
-Set default memory alignment on WiiU to be 64 bytes due to FS_IO_BUFFER_ALIGN requirement
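For anyone wondering what that line (apparently from the Project Cars log mentioned earlier) means in practice, here is a minimal sketch in standard C++; FS_IO_BUFFER_ALIGN itself is a Wii U SDK constant I don't have access to, so the 64 below simply mirrors the stated requirement rather than using any real SDK call.

```cpp
// Allocating an I/O buffer that satisfies a 64-byte alignment requirement,
// using only standard C++ facilities.
#include <cstdint>
#include <cstdio>
#include <cstdlib>

int main() {
    const std::size_t kAlign = 64;    // mirrors the 64-byte requirement described above
    const std::size_t kSize  = 4096;  // aligned_alloc needs a size that is a multiple of the alignment
    void* io_buffer = std::aligned_alloc(kAlign, kSize);
    if (!io_buffer) return 1;
    std::printf("buffer at %p, 64-byte aligned: %s\n", io_buffer,
                reinterpret_cast<std::uintptr_t>(io_buffer) % kAlign == 0 ? "yes" : "no");
    std::free(io_buffer);
    return 0;
}
```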
No. There's a BIG misunderstanding around this. What happened is that the WiiU's GPU (Latte) started development in 2008 over an R700, which is a DX10.1-compliant card with some other features not included in the DX10.1 standard.
Because we don't know what we currently have, the safe bet is to assume that at least the WiiU GPU will be an R700.
That being said, EVERY SINGLE GPU ON THE MARKET STARTS DEVELOPMENT OVER AN OLDER DESIGN, and not the past year's design like some people say. A GPU is not designed, tested, tuned, tested again, fabricated, distributed and commercialized in 1 year.
The HD5870 was the first ATI DX11 GPU, and some people here act as if it was made from scratch. The reality is that this GPU not only was really similar to the HD4870 (R700) in terms of design, but it was also designed over an even older design (maybe the R600 or even the R500). And the same goes for the R700: in terms of architecture it was an evolution of the R600, but this doesn't mean that its development didn't start until the R600 was finalized and commercialized.
We have rumours that the final silicon of the WiiU GPU wasn't finished until early 2012. We know that the devkits of the console weren't at version 1.0 (that is, complete) until AFTER the console was released (December 2012, confirmed by Nintendo), which gives more strength to the rumour of the GPU not being finalized until very late.
So in other words, starting from the DX10.1-plus-extras design that was the R700, Nintendo has had 4 whole years to modify, redesign and adapt the GPU. I insist: the final silicon wasn't finished until early 2012, which means that all the tuning (that is, increasing or decreasing frequencies and testing its reliability under multiple scenarios) was done during the first half of that year.
What this means is nothing and everything. If the design wasn't finalized until that late, and if, when examined, the GPU turned out to have a completely hand-made layout to squeeze every single mm^2 of chip area, then a LOT of changes could have been made.
Maybe this GPU has features not included in the DX11 standard, and maybe it lacks some of those DX11 features.
Only time will tell us what this console is good at, but seeing recent games like Bayonetta 2 or X, it is at least a noticeable jump from Xbox 360 and PS3 and only developers know how far it can go.
A few comments:
PS4/XB1 GPUs support DirectX 11.1 (not that it is a big difference, just saying). Even the upcoming DX11.2 will be supported.
The most obvious difference here is tessellation. Just because you have hardware that supports it doesn't mean you can use it for free, though. For example, on my GPU (GTX 460), the same scene will run at half the fps with the high tessellation factor used in this screenshot.
You can achieve a decent effect for the road without having to use tessellation, btw. It's called parallax occlusion mapping and can be enabled in Crysis using DX9 (example); there's a rough sketch of the idea at the end of this post.
The difference in texture quality has nothing to do with DX10. They could've enabled it in DX9, they just chose not to. Not your fault of course; sadly, some comparisons are made to be misleading.
This is a similar case. All features of Crysis' DX10 mode (except the motion blur solution) could be enabled in DX9 mode, just not through the main menu options but through the .ini files. It might have performed slightly better in DX10, though.
This one features different lighting, no doubt. Keep in mind though that god rays like this are achievable in DX9 without problems, as we have seen in many games. The developers chose not to implement them specifically for DX9 because it would've meant additional effort. Most people with a decent PC would just use the DX10 mode anyway.
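Since parallax occlusion mapping came up a couple of comments above, here is a rough CPU-side sketch of the idea; real implementations do this per pixel in a shader, and sample_height() below is just a stand-in for the road's height texture. The trick is to march along the view ray through a height field until the ray drops below the surface, then sample the texture at that offset.

```cpp
// Toy version of parallax occlusion mapping: step a ray through a height field and
// return the offset texture coordinate where the ray first dips below the surface.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Stand-in for sampling a height/bump texture (returns a value in 0..1).
float sample_height(Vec2 uv) {
    return 0.5f + 0.5f * std::sin(uv.x * 40.0f) * std::cos(uv.y * 40.0f);
}

Vec2 parallax_occlusion_uv(Vec2 uv, Vec2 view_dir_ts, float height_scale, int steps) {
    // view_dir_ts: xy of the normalized tangent-space view direction
    const Vec2 step_uv = { view_dir_ts.x * height_scale / steps,
                           view_dir_ts.y * height_scale / steps };
    const float layer_step = 1.0f / steps;
    float layer = 1.0f;                      // start at the top of the height field
    float h     = sample_height(uv);
    for (int i = 0; i < steps && layer > h; ++i) {
        uv.x  -= step_uv.x;                  // move along the view ray in texture space
        uv.y  -= step_uv.y;
        h      = sample_height(uv);
        layer -= layer_step;                 // descend one height layer per step
    }
    return uv;  // sampling the color/normal maps here gives the "3D" look with flat geometry
}

int main() {
    Vec2 uv = parallax_occlusion_uv({0.5f, 0.5f}, {0.4f, 0.2f}, 0.05f, 16);
    std::printf("offset uv: %.3f, %.3f\n", uv.x, uv.y);
    return 0;
}
```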
The DirectX 10 SDK became available in February 2007.[9]
New features:
Fixed pipelines[10] are being done away with in favor of fully programmable pipelines (often referred to as unified pipeline architecture), which can be programmed to emulate the same.
New state object to enable (mostly) the CPU to change states efficiently.
Shader model 4.0 enhances the programmability of the graphics pipeline. It adds instructions for integer and bitwise calculations.
Geometry shaders, which work on adjacent triangles which form a mesh.
Texture arrays enable swapping of textures in GPU without CPU intervention.
Predicated Rendering allows drawing calls to be ignored based on some other conditions. This enables rapid occlusion culling, which prevents objects from being rendered if it is not visible or too far to be visible.
Instancing 2.0 support, allowing multiple instances of similar meshes, such as armies, or grass or trees, to be rendered in a single draw call, reducing the processing time needed for multiple similar objects to that of a single one.[11]
Direct3D 10.1
Direct3D 10.1 was announced by Microsoft shortly after the release of Direct3D 10 as a minor update. The specification was finalized with the release of November 2007 DirectX SDK and the runtime was shipped with the Windows Vista SP1, which is available since mid-March 2008.
Direct3D 10.1 sets a few more image quality standards for graphics vendors, and gives developers more control over image quality.[12][13] Features include finer control over anti-aliasing (both multisampling and supersampling with per sample shading and application control over sample position) and more flexibilities to some of the existing features (cubemap arrays and independent blending modes). Direct3D 10.1 level hardware must support the following features:
Mandatory 32-bit floating point filtering.
Mandatory support for 4x anti-aliasing
Shader model 4.1
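To make the "Instancing 2.0" entry in that list a bit more concrete, here is a conceptual sketch; draw_call() is a hypothetical stand-in for a real API submission, used only to count how many CPU-side calls each approach ends up making.

```cpp
// Why instancing helps: the per-draw-call CPU/driver overhead is paid once instead of N times.
#include <cstdio>
#include <vector>

struct Transform { float x, y, z; };

static int draw_calls = 0;
void draw_call() { ++draw_calls; }   // hypothetical stand-in for a real draw submission

// Without instancing: one submission per tree.
void draw_forest_naive(const std::vector<Transform>& trees) {
    for (const Transform& t : trees) {
        (void)t;                     // per-object state setup would go here
        draw_call();
    }
}

// With instancing: transforms go into a per-instance buffer, one submission total.
void draw_forest_instanced(const std::vector<Transform>& trees) {
    (void)trees;                     // imagine these uploaded as per-instance data
    draw_call();
}

int main() {
    std::vector<Transform> forest(10000);
    draw_forest_naive(forest);
    std::printf("naive:     %d draw calls\n", draw_calls);      // 10000
    draw_calls = 0;
    draw_forest_instanced(forest);
    std::printf("instanced: %d draw call(s)\n", draw_calls);    // 1
    return 0;
}
```

The GPU still shades every instance; what's saved is the CPU-side cost of issuing and validating thousands of separate calls.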
Thanks for your input. Very interesting indeed; it makes the whole diminishing-returns idea make more sense, and it makes the other components, like the CPU, available memory and memory bandwidth, all the more important in building on the differences between consoles. So IQ, resolution and FPS play a major role too, but those are mostly determined by power.
So can I ask what the advantages of a DX10-compliant card are over a DX9 one, and of a DX10.1 card over a DX10 one?
One thing I read is that they can do the same "techniques" in a more efficient way. Certainly important, but I am more curious to know if there is some technique that cannot be done in DX9 at all.
And again, please remember this is a graphics card that is compliant with the DX10.1 feature set, not one that uses the DX10.1 API.
Also, what do Shader Model 4.1 and geometry shaders bring to the table? Or better, can someone translate the following into dummy mode, if that's even possible:
From http://en.wikipedia.org/wiki/Microsoft_Direct3D
Interesting, the bit about 4x anti-aliasing being mandatory.
Was anything significant changed between DirectX 11 and 11.1?
That and 11.2 seem to be centered more on the API itself, but someone more techie than me could chime in.
Am I the only one who finds it funny that people expected games back in the early PS3 era to be using ray tracing, while the only film to use it up to this point was Monsters University?
Eh? Films have used it for years.
My mistake. MU is the first PIXAR film where ALL of the lighting is ray traced.
We are still considering SteamWorld Dig for Wii U, but I also think we have other ideas that suit the Wii U better. The Wii U is a very powerful console, but I think Nintendo has a hard time explaining to consumers who should buy it and why. It becomes a paradox: if not enough people buy the console, developers are not going to flock to it which means that it takes longer for the console to establish itself.
Julius: We have really enjoyed working with the Wii U hardware. It was rather easy to port our modern proprietary engine to it, and it does pack the punch to bring to life some really awesome visuals. The Wii U is a very modern console with a lot of RAM which helped us out a lot during development. The hardware capabilities have improved quite a lot from the original Wii, and the Wii U is a truly powerful console. The console is definitely more powerful than the Xbox 360 and PS3.
With Splot though, the situation is a bit more complex, as the technology we're using isn't the same as for Trine 2: Director's Cut, so there's a lot of technical work in getting Splot running on high-end consoles like the Wii U.