Naked Prime
Do we know if the WiiU will let you watch videos off of your home network like the PS3 & 360 does?
We don't know, but knowing Nintendo it's extremely unlikely.
Also... do we think third-party GamePads might see the light of day after some time? I can see little changes like a capacitive or multi-touch screen (not sure it would be more than a bullet point on the back of the box), a higher-DPI screen, being able to use the built-in mic for in-game chat, increased range (though I suspect someone might make an attempt at GamePad range extenders), etc. Though I'll laugh my ass off if I see a MadCatz-brand GamePad feat. Beats by Dre audio.
Yeah, that's the sad truth I'm leaning towards. The Wii U is gonna replace my 360 as my all-in-one box for TV (fuck paying for Live to use other services I already pay for). I was hoping to extend that to my collection on my PC. But I can hope.
MadCatz is bringing a FightStick over, as well as a WiiU version of their Black Ops / FPS Pro controller.
The switches let you assign the extra buttons to decrease stick sensitivity when held, or double as other buttons on the controller.
Why are things like this hidden in this thread? I just spent over $100 on Amazon yesterday trying to get new controllers for my Wii U. I wish I'd known this was coming.
"Tekken Tag, Lego City, Pikmin, Rayman, Arkham Asylum, and Monster Hunter are all missing the feature according to that chart. Not sure if they use touch extensively or in any special way. But if they still sell well, pubs and devs will think it's not a very appealing feature, or at least one that consumers wouldn't mind if it were missing."

MH3U doesn't have it, ostensibly because they want people to buy the (offline) 3DS version as a companion release. LAME :/
I thought Rayman Legends was confirmed though?
"Yes, but remember: IT'S MADCATZ!! Their controllers are craaaaaaaaaaaaap! Wouldn't be surprised if the analog sticks gave out after a month of regular use."

My SFIV TE fightsticks are still going strong after 4 years. They're better build quality than my Hori stick, IMO.
Hey guys,
One of my friends has a really interesting question (that I would like the answer to as well!) for the hardware gurus of this forum.
He says, "We all know that framerate drops occur when the hardware can't handle the game. Take a PC, for example: if your setup can't handle the game at a certain graphics setting, framerate drops occur. Tone down the settings, or beef up the hardware, and you solve the problem. How will this work on Wii U with Wii games that pushed the Wii too hard and caused framerate drops? The Wii U will have more than enough horsepower to run the games. So if MW3 can't hold 30fps (the max for that game) on the Wii because the Wii can't handle it, will it hold 30fps on the Wii U with the beefed-up hardware? I don't know that it will, but I can't think of a reason why it wouldn't."
Can anyone answer this?
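One way to picture the logic of the question: a 30fps cap gives each frame a fixed time budget, and hardware speed only decides whether the work fits inside that budget. This is a generic sketch with made-up numbers, not how the Wii or Wii U actually schedules frames (and whether a Wii game even sees faster hardware on Wii U is the separate backward-compatibility question discussed below).

```python
# Sketch of a 30fps frame cap. Each frame gets a ~33.3ms budget; if the
# hardware needs longer than that, the frame is late (a drop); if it
# finishes early, it just idles until the next vsync. All numbers here
# are illustrative, not measurements of any real game.
FRAME_BUDGET_MS = 1000 / 30  # ~33.33ms per frame at a 30fps cap

def frame_outcome(work_ms):
    """Classify a frame that took `work_ms` of CPU/GPU work."""
    if work_ms <= FRAME_BUDGET_MS:
        return "on time (idles until vsync)"
    return "late (frame drop)"

print(frame_outcome(40.0))  # over budget: late (frame drop)
print(frame_outcome(20.0))  # under budget: on time (idles until vsync)
```

So *if* a Wii game ran on genuinely faster hardware, frames that used to blow the budget could fit inside it; the open question is whether Wii mode exposes that extra speed at all.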
Rayman Legends isn't on either 360 or PS3. Origins is, but Legends is upgraded graphically. I'd hope those upgrades wouldn't require a resolution drop, but who knows. If I had to guess, though, I'd say it'll be 1080p.
I'm sure someone will make a media server program that supports Wii U, or add the support to an already existing one. People did that with all the other consoles, including Wii. Having an HTML5-capable browser will make it even easier. DLNA is just one method of getting content playing on a device, it's not the only way to go about it.
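To make that concrete, here is a minimal sketch (not any existing Wii U media app, just a generic illustration) of sharing a folder of videos over plain HTTP so a console's HTML5-capable browser could fetch them. `MEDIA_DIR`, the port, and `make_server` are all made up for the example; real streaming with seeking would also need HTTP Range support, which `SimpleHTTPRequestHandler` does not provide, so a proper media server or DLNA implementation would still do better.

```python
# Minimal sketch: serve a folder of video files over HTTP so any
# HTML5-capable browser on the network (e.g. a console's) can fetch
# them. Playback still depends on the browser supporting the video
# container/codec (e.g. MP4/H.264).
import http.server
import socketserver

MEDIA_DIR = "."  # hypothetical: point this at your video folder
PORT = 8000      # arbitrary port chosen for illustration

class MediaHandler(http.server.SimpleHTTPRequestHandler):
    """Serves files (and directory listings) from MEDIA_DIR."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory=MEDIA_DIR, **kwargs)

def make_server(port=PORT):
    # Caller runs httpd.serve_forever() and later httpd.server_close().
    return socketserver.TCPServer(("", port), MediaHandler)

# Usage: httpd = make_server(); httpd.serve_forever()
# then browse to http://<this-pc-ip>:8000/ from the console's browser.
```

No Range support means no seeking within a video, which is exactly the gap dedicated media-server software fills.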
"Does the Wii let you do this? If not, 99.9% likely the Wii U won't either."

You 99.9% think Wii U is going to be limited to what Wii set as precedent?
Does the Wii let you do this? If not, 99.9% likely the Wii U won't either.
Not the whole game. http://gamingbolt.com/nano-assault-neo-dev-impressed-with-the-wii-u-power
Apparently Nintendo knew about the system's bottlenecks and showed devs how to bypass them with minimal work.
EDIT: Apparently the game also only uses ONE CORE.
In a sense, yes it was. At E3 (if I remember correctly), you could play on either the TV or the GamePad.
I think that Wii-mode will be like GameCube-mode; the hardware will be locked at the performance of the Wii to provide the most accurate playback.
Shiota said: The designers were already incredibly familiar with the Wii, so without getting hung up on the two machines' completely different structures, they came up with ideas we would never have thought of. There were times when you would usually just incorporate both the Wii U and Wii circuits, like 1+1. But instead of just adding like that, they adjusted the new parts added to Wii U so they could be used for Wii as well.
"Tekken Tag, Lego City, Pikmin, Rayman, Arkham Asylum, and Monster Hunter all are missing the feature according to that chart. Not sure if they use touch extensively or in any special way. But if they still sell well, pubs and devs will think it's not a very appealing feature, or at least one that consumers wouldn't mind if it were missing."
Pikmin, probably because of the stupid New Play Control-style gyro aiming. It'd be trivial to fall back to analogue-stick aiming when on the GamePad.
Bah, if Nintendo themselves aren't even going to push for off-TV play in more of their games, I'm worried it'll end up an ignored feature quite quickly.
The Wii U is a different thing altogether. Even though the CPU is reportedly based on the Wii's Broadway, just clocking down to 729MHz and running the code on one of the cores won't result in identical performance, as the significantly increased cache and possible microarchitectural changes could certainly affect how well CPU-limited games run.
Well, they have to show off how important the GamePad is so people don't call it a gimmick (although some of the ways they're using it seem like complete gimmicks). I'd expect later software to offer off-screen play like the new Mario.
And how do you know that the Wii U's CPU doesn't also have a fallback mode, an internal switch that makes it behave exactly like Broadway? Depending on how many "enhancements" were actually made to the Broadway cores, this could be possible, IMO.
You recently made positive statements about the Wii U: its more modern GPU, the computational power that allows more enemies in Nano Assault Neo, and the fact that you only used a fraction of its capacity. Broadly, what are the advantages in graphics, physics, and AI of developing on Wii U compared to the Wii, and to the 3DS too, which, albeit obviously weaker, also has relatively modern graphical functions? Tell us how your team felt when it first worked with the console dev kits after being accustomed to Wii technology.
The 3DS and Wii U GPUs are totally different. The 3DS GPU is very specialized, while the Wii U GPU is quite open. For both designs you have to choose wisely how to use them. Both can generate great visuals and have lots of options.
When testing our first code on Wii U we were amazed how much we could throw at it without any slowdowns, and at that time we had zero optimizations. The performance problem of hardware nowadays is not clock speed but RAM latency. Fortunately Nintendo went to great efforts to ensure developers can really work around that typical bottleneck on Wii U. They put a lot of thought into how the CPU, GPU, caches and memory controllers work together to amplify your code speed. For instance, with only some tiny changes we were able to optimize certain heavy-load parts of the rendering pipeline to 6x the original speed, and that was even without using any of the extra cores.
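The 6x figure is Shin'en's own and their code isn't public, but the kind of data-layout change they describe can be sketched generically. The classic example is switching an array-of-structs to a struct-of-arrays so a hot loop walks contiguous memory instead of dragging whole structs through the cache. (In Python the layout difference is only notation; the cache effect is real in C/C++-style languages. The snippet just shows the transformation itself.)

```python
# Illustrative sketch of the cache-friendly data-layout idea, NOT
# Shin'en's actual optimization. Array-of-structs interleaves every
# field per particle; struct-of-arrays keeps each field contiguous,
# so a pass touching only position and velocity streams through
# exactly the memory it needs.

# Array-of-structs: one record per particle.
aos = [{"x": float(i), "vx": 1.0} for i in range(1000)]

def update_aos(particles, dt):
    for p in particles:  # each iteration pulls in the whole struct
        p["x"] += p["vx"] * dt

# Struct-of-arrays: one contiguous sequence per field.
xs = [float(i) for i in range(1000)]
vxs = [1.0] * 1000

def update_soa(xs, vxs, dt):
    for i in range(len(xs)):  # touches only the two hot arrays
        xs[i] += vxs[i] * dt

update_aos(aos, 0.5)
update_soa(xs, vxs, 0.5)
# Both layouts compute the same result; only the memory traffic differs.
assert all(abs(p["x"] - x) < 1e-9 for p, x in zip(aos, xs))
```

On hardware where RAM latency dominates, this sort of restructuring is exactly how "tiny changes" can yield large speedups without touching clock speed or core count.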
In comparison to the Wii, the Wii U has much more potential for optimizing. On Wii you knew what was possible and used that power. On Wii U you can take many different approaches to tackle a problem. Fortunately, you already have lots of power at hand without digging deeper, so I'm pretty sure we will see many cool things on the Wii U as developers understand it better.
More specifically, we've heard rumors about the CPU, that it's supposedly the weakest link of the system. Word has spread that it's some sort of Broadway (Wii CPU), but in a three-core configuration and improved. Others have argued that, based on its reduced size seen in recent pictures and the overall low consumption of the unit, it is not very powerful. Have you encountered any problems during your development because of this component, or is it efficient enough?
We didn't have such problems. The CPU and GPU are a good match. As said before, today's hardware has bottlenecks with memory throughput when you don't care about your coding style and data layout. This is true for any hardware and can't be cured only by throwing more megahertz and cores at it. Fortunately Nintendo made very wise choices for cache layout, RAM latency and RAM size to work against these pitfalls. Nintendo also took care that other components, like the Wii U GamePad screen streaming or the built-in camera, don't put a burden on the CPU or GPU.
Neo's resolution is 720p. Why is it not 1080p? Besides, we've witnessed jaggies and a seeming lack of anti-aliasing in some other games' footage; can you reassure us on the image quality of your title? With its more up-to-date GPU and other factors such as cache amount, the Wii U should be pretty capable in this area.
Any modern GPU supports various anti-aliasing modes, with the usual pros and cons, and that's the case for the Wii U's. Many GPUs even get a certain amount of AA for free when rendering. Usage of these modes depends on your rendering style (forward or deferred) and other implementation details.
Nano Assault Neo is running in 720p, yes. We also had the game running in 1080p, but the difference was not distinguishable when playing. Therefore we used 720p and put the free GPU cycles into higher-resolution post-FX. This was much more visible. If we had a project with fewer quick motions, we would have gone 1080p instead, I guess.
It's not a problem to make beautiful 1080p games on the Wii U. As on any console or PC, such titles need about 2.25x the fill rate of 720p. You can use this power either for 1080p rendering or for more particles, better post-FX, better materials, etc.
It should also not be forgotten that many current-gen games don't even run at 720p, but at lower resolutions which are scaled up (not to mention that most also run at only 30fps).
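For reference, the raw pixel math behind the 720p/1080p comparison (simple arithmetic, not anything from the interview):

```python
# Pixel-count check for the 720p vs 1080p fill-rate comparison.
px_720p = 1280 * 720    # 921,600 pixels per frame
px_1080p = 1920 * 1080  # 2,073,600 pixels per frame

ratio = px_1080p / px_720p
print(ratio)  # 2.25: 1080p pushes 2.25x the pixels of 720p
# That is 125% *more* pixels per frame, so quoted figures like
# "~200% more fill rate" should be read as a loose ballpark.
```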
"Nano Assault Neo is running in 720p yes. We had the game also running in 1080p but the difference was not distinguishable when playing."

Oh, stop bullshitting.
I'm confused... I'd say the far GREATER fear for me is that off-screen gameplay is ALL developers use it for. I don't simply want to play a PS360 game on my tablet!
"Oh stop bullshitting."

Maybe because their texture resolution wasn't significant enough to make a difference in 1080p? In which case, design a better game and don't use old assets!
"Oh stop bullshitting."
"Maybe because their texture resolution wasn't significant enough to make a difference in 1080p? In which case, design a better game and don't use old assets!"
I don't know enough about this - you clearly are the man when it comes to 720-->1080 conversions!!
"Maybe because their texture resolution wasn't significant enough to make a difference in 1080p? In which case, design a better game and don't use old assets!"

Unless something changed from the screenshots they released earlier, the game would greatly benefit from higher resolution simply in terms of IQ -- even with the same assets. It's really aliased.
"Oh stop bullshitting."

Well, apparently they thought they had higher priorities than squeezing out the extra performance required for the res jump. That does not mean they did not use (some of) that difference for a better look at the original res. Last but not least, it's a launch game (as in launch day), so no need to be overly critical of them. IMO, of course.
"Well, apparently they thought they had higher priorities than squeezing the extra performance required for the res jump. That does not mean they did not use (some of) that difference for a better look at the original res. Last but not least, it's a launch game (as in launch day), so no need to be overly-critical of them. IMO, of course."

I'm not really criticizing their accomplishments (Shin'en have always appeared to be very competent), just that one statement. To me it's on the same level as Capcom saying that their 30 FPS title "feels" like 60 FPS. It should be possible to talk about your title and what it does (and doesn't) offer without being unreasonable.
"Plus I dunno how anybody can look at this in motion and say that additional AA would've been a 'great benefit.' That's some serious nitpicking right there."

You can call it nitpicking, but I hope you don't think I'm being disingenuous. To me that really does look quite messy in terms of IQ when scaled to fullscreen. (Note that I prefer using SGSSAA at 2560x1440, since regular MSAA just isn't temporally stable.)
"They should've just said 'we wanted more fillrate'."

I'd be perfectly fine with that. Or even that they considered the post-processing to be more important. Just skip the part about the resolution not being noticeable.
"I'm not really criticizing their accomplishments (Shin'en have always appeared to be very competent), just that one statement. To me it's on the same level as Capcom saying that their 30 FPS title 'feels' like 60 FPS. It should be possible to talk about your title and what it does (and doesn't) offer without being unreasonable."

While I do agree with your general statement, I find nothing unreasonable in Shin'en's answer to that particular question. AAMOF, I think it's an all-in-all great interview, particularly in terms of developer honesty on matters few devs would be so honest about, even among themselves, and even taking into account the obligatory platform lip service.
"The bright shading along the contours of objects (I think this was called rim lighting) increases the aliasing issue IMHO."

Yes, in screenshots it's all clearly visible (not just from the rim lighting, but from the few instances of objects 'flashing'), but as I stated earlier, it's all below my sensitivity thresholds during gameplay. And I'm not exactly insensitive to such things ; )
I hate to be "that guy" in this argument... but I could easily have mistaken the direct-feed footage of it on my 1080p TV for 1080p. Super-fast motion with a rock-solid framerate does hide the imperfections that are VERY obvious in screenshots.
Now if it was 60fps with a very static screen, it would have been a lot more obvious.