Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.
It cannot be a default Tegra X1. It will be custom, so the theory that "Nvidia had thousands of Tegra chips in a box and had to get rid of them" simply doesn't hold. Maybe those chips that had to be sold are now in devkits, and in the meantime Nvidia made a custom chip for Nintendo on the Maxwell or Pascal architecture, whatever.
 
I'm expecting around 450 GFLOPS while it's in handheld mode and 650-700 GFLOPS docked, which in my opinion is excellent for a portable. The Wii U is 350 GFLOPS and I've always been satisfied with how Nintendo games look because of their great art direction; the modern architecture should also help separate the lines a little more. Anyone who is looking at the Switch to play anything more than first-party and indie games like Minecraft and Terraria is probably setting themselves up for disappointment, though.
 
Man, if only we weren't so disillusioned with PR talk. But you know, fool me once and all that. No Man's Sky this year and some other blunders certainly don't inspire confidence in company talk either.

Still, my hunch is that neither Nvidia nor Nintendo is half-assing this. I really think they are going all out, because the Switch will be their way forward. Unifying their libraries was always going to be a major goal, and now they are doing it.

Maybe I'm too naive. I'm also definitely ignorant in various ways regarding this matter. But it is what I choose to think for the time being.
 
I'm expecting around 450 GFLOPS while it's in handheld mode and 650-700 GFLOPS docked, which in my opinion is excellent for a portable. The Wii U is 350 GFLOPS and I've always been satisfied with how Nintendo games look because of their great art direction; the modern architecture should also help separate the lines a little more. Anyone who is looking at the Switch to play anything more than first-party and indie games like Minecraft and Terraria is probably setting themselves up for disappointment, though.

The Wii U is actually closer to half that (176 GFLOPS), though because it was much more modern than its competition at the time (360/PS3), it still more than matched their 250 GFLOPS. So, as you say, more modern architecture does help.
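That 176 GFLOPS figure follows from the usual peak-FP32 arithmetic (ALUs x 2 FLOPs per multiply-add cycle x clock). A rough sketch in Python; the 160-ALU count and 550 MHz clock for the Wii U's GPU are the commonly cited, never officially confirmed, values:

```python
def peak_gflops(alus, clock_ghz, flops_per_alu_per_cycle=2):
    """Peak FP32 throughput: each ALU can retire one fused
    multiply-add (2 FLOPs) per clock cycle."""
    return alus * flops_per_alu_per_cycle * clock_ghz

# Wii U "Latte" GPU, assuming 160 ALUs at 550 MHz
print(peak_gflops(160, 0.55))  # ~176 GFLOPS
```

Every GFLOPS number in this thread boils down to the same formula; only the ALU counts and clocks change.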
 
2 years is your speculation, so you're kind of building on your own bias here.

Straight from Jen-Hsun's own words:

"And so we've worked with them now for almost two years."

Nvidia CEO talking about Switch

But the fan you just mentioned is on while undocked, and you don't need a fan to cool a 20nm Tegra Maxwell (X1) at 585 MHz (300 GFLOPS). That chip runs passively cooled at up to 850 MHz (435 GFLOPS).
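For reference, those clock-to-GFLOPS conversions assume the TX1's 256 Maxwell CUDA cores, each doing a 2-FLOP fused multiply-add per cycle; a quick sketch:

```python
def tx1_fp32_gflops(clock_ghz, cuda_cores=256):
    """Peak FP32 GFLOPS for a 256-core Maxwell part:
    cores x 2 FLOPs (FMA) x clock in GHz."""
    return cuda_cores * 2 * clock_ghz

print(tx1_fp32_gflops(0.585))  # ~300 GFLOPS, passively coolable
print(tx1_fp32_gflops(0.850))  # ~435 GFLOPS, Pixel C territory
print(tx1_fp32_gflops(1.000))  # 512 GFLOPS, stock Shield TV clock
```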

I know, I can't quite figure out the clocks that fit:

* Undocked Switch needing active cooling
* Shield TV Tegra X1 1Ghz 20w TDP
* Two profiles that would allow 720p or 1080p

Thing is, on 16nm I highly doubt active cooling would be needed to achieve a 720p goal, so that points towards 20nm. Then again, the Pixel C example raises doubts about the cooling solution. Someone also mentioned an interesting point about avoiding throttling issues, and I think it's safe to say that the CPU cores are not going to be downclocked in portable mode, so as to keep coherency between both modes, and that could be a reason for the active cooling.

I disagree, mainly because this is Nintendo. Making the chips in their systems custom is in their DNA. They are not putting a stock, or even close-to-stock "minor changes", chip into the Switch.

Is it? The last custom design from Nintendo in the console market was the GC 15 years ago; both the Wii and Wii U are rehashes of that architecture, and the portables have been designed with old/cheap technology forever.

People should not read into my words any kind of doom and gloom, or me trying to downplay Nintendo. For me this is the most exciting product coming from Nintendo since the GC, and the second time I'm going to be there day one since the N64.

But just following basic logic / Occam's razor with the info we really have (not speculation), the close-to-stock 20nm Tegra X1 scenario seems a lot more likely than a super-customized 16nm SoC. I may be wrong, and feel free to disregard my mumblings, but the funny thing is I'm super pumped about playing portable 720p no-slowdown BotW (and maybe, fingers crossed, 1080p on TV), so whatever happens it's already a big win-win for me.
 
What if the fan and the customization are there to guarantee full X1 power in portable mode (512 GFLOPS), with an overclock to 700 GFLOPS or more when docked?

That would be very close to the Xbox One and would really show an improvement over the Wii U.
 
They used the same kind of PR when they announced the RSX (later discovered to have a buggy scaler chip, among other issues) for the PS3, while they were also working on their unified-shader chip, which released almost alongside the PS3 itself.

Why waste a chance to put out some PR that makes them sound amazing?

I think this is the big takeaway in this discussion. While Nintendo might want to get those chips super cheap at 20nm, I'm not sure Nvidia would necessarily want to sell them older tech for their first big jump into console gaming.

Nvidia loves being on the cutting edge and talking up their products; if this were just a dump of their old tech they wouldn't be able to do that, and it would send the message that Nintendo only went with them because they were cheap. I think Nvidia wants people to see their logo on this and be wowed by the graphics and battery life. I'd be surprised if they let Nintendo use old tech in a product that means so much to them.
 
They probably bank on bankrupting the competition and giving businesses no choice :).
Their competition being AMD and Intel on the desktop, and a plethora of healthy chipmakers in the mobile segment? ; )
 
Thing is, on 16nm I highly doubt active cooling would be needed to achieve a 720p goal, so that points towards 20nm. Then again, the Pixel C example raises doubts about the cooling solution. Someone also mentioned an interesting point about avoiding throttling issues, and I think it's safe to say that the CPU cores are not going to be downclocked in portable mode, so as to keep coherency between both modes, and that could be a reason for the active cooling.

I'm just wondering if they're going to downclock the CPU in portable mode, only because they detailed that in the patent from the recent patent thread.

Of course, they could have just covered that in the patent for legal reasons and aren't actually going to downclock the CPU on the Switch.
 
I think this is the big takeaway in this discussion. While Nintendo might want to get those chips super cheap at 20nm, I'm not sure Nvidia would necessarily want to sell them older tech for their first big jump into console gaming.

Nvidia loves being on the cutting edge and talking up their products; if this were just a dump of their old tech they wouldn't be able to do that, and it would send the message that Nintendo only went with them because they were cheap. I think Nvidia wants people to see their logo on this and be wowed by the graphics and battery life. I'd be surprised if they let Nintendo use old tech in a product that means so much to them.
People will be wowed even in the worst-case scenarios we are talking about, because Nintendo, Monolith Soft, etc. optimize the shit out of their games, and the tools/APIs will be top notch. Even in the worst case, the games will blow away any Android or iOS device.
 
Just as a reminder: the whole "Nvidia has a bunch of chips lying around that they need to get rid of" or "Nvidia has a large unfulfilled order that they need to sell off cheap" were only theories presented by Thraktor earlier this year, not rumors, and not based on any evidence. They were theories for how the "Nvidia may even be taking a loss" line from the SemiAccurate post might be resolved. So let's not treat it as a rumor when it's 100% speculation.

And regarding customization, I think that regardless of any PR, we can all agree the Switch should be at the very least 512 GFLOPS when docked, right? Which is about what we've been expecting.
 
I stand corrected.

Wii U has a very custom GPU. So yeah, Nintendo customised the shit out of every console's chip.

I would say we don't know everything about what Nintendo does, but even just looking at their last console, the Wii U was heavily customized. We had the two threads on GAF for the CPU and the GPU; both were customized in a manner that doesn't resemble anything off-the-shelf, or even close to it. But none of that matters if you still get it wrong. I think the deal with Nvidia was just what they needed: someone who will help them understand what is needed in hardware and won't let them cheap out and make dumb decisions. I feel like AMD was just taking their money and saying "fuck it, if that's what you want."

I would say that both the Wii and Wii U were the wrong kind of customization, if you ask me, and would have fared better with some up-to-date cheap off-the-shelf parts instead of rehashing the GC architecture just for the sake of backwards compatibility, but I don't want to derail the thread.

And regarding customization, I think that regardless of any PR, we can all agree the Switch should be at the very least 512 GFLOPS when docked, right? Which is about what we've been expecting.

I agree; that baseline performance is all but guaranteed.
 
Is 512 GFlops a magic threshold Nintendo absolutely needs to hit or is that merely an arbitrary number?

That's the standard performance of Tegra X1, which is what has been heavily rumored to be in the early Switch devkits. It would be strange if the final product doesn't at least reach that number.

And since it's really all we have to work on, I think it's worthwhile to use it as the basis for our expectations.
 
I would say that both the Wii and Wii U were the wrong kind of customization, if you ask me, and would have fared better with some up-to-date cheap off-the-shelf parts instead of rehashing the GC architecture just for the sake of backwards compatibility, but I don't want to derail the thread.

I think most of us would agree.
 
Sooooo, from someone who's still rather green, can someone explain the "Ninjablade" situation for me?


Sorta like Spieler Eins, but more focused on graphics/tech discussion far beyond his depth of knowledge.
 
I would say that both the Wii and Wii U were the wrong kind of customization, if you ask me, and would have fared better with some up-to-date cheap off-the-shelf parts instead of rehashing the GC architecture just for the sake of backwards compatibility, but I don't want to derail the thread.
Wii U, yes, but I'd argue the Wii was really a clever solution for Nintendo at the time. They salvaged the incredibly efficient and affordable GameCube design, beefing up clocks and memory to what they likely should've been in the first place, and it really helped them move ahead in an uncertain console market by giving them the space to focus on the new interface design. Familiar hardware and tools also allowed them to get out an unprecedented year-one lineup, almost a game a month for 18 months, which is really what drove the platform into the stratosphere.
 
Wii U, yes, but I'd argue the Wii was really a clever solution for Nintendo at the time. They salvaged the incredibly efficient and affordable GameCube design, beefing up clocks and memory to what they likely should've been in the first place, and it really helped them move ahead in an uncertain console market by giving them the space to focus on the new interface design. Familiar hardware and tools also allowed them to get out an unprecedented year-one lineup, almost a game a month for 18 months, which is really what drove the platform into the stratosphere.

Yeah, the Wii U was kind of the worst of all worlds. At least the Wii was cheap and easy to develop for; they could use the same tools, and they knew it well because it was a beefed-up GameCube. The Wii U was different enough to require a lot of work and new tools, not cheap enough to be sold cheaply, and not powerful enough to make even PS3/360 ports easy. The only advantage of all that customization was backwards compatibility, which turned out to not matter much at all.
 
Yeah, the Wii U was kind of the worst of all worlds. At least the Wii was cheap and easy to develop for; they could use the same tools, and they knew it well because it was a beefed-up GameCube. The Wii U was different enough to require a lot of work and new tools, not cheap enough to be sold cheaply, and not powerful enough to make even PS3/360 ports easy. The only advantage of all that customization was backwards compatibility, which turned out to not matter much at all.
I think even the Wii U chip is pretty clever from an engineering standpoint (and it isn't that costly itself, btw); it just wasn't the right move for a 2012 console with ANOTHER new-yet-old interface approach. I do feel like it could've worked in a hypothetical "Wii HD" a couple of years earlier, but as is, it was the wrong architecture for the wrong console at the wrong time.
 
I'm just wondering if they're going to downclock the CPU in portable mode, only because they detailed that in the patent from the recent patent thread.

Of course, they could have just covered that in the patent for legal reasons and aren't actually going to downclock the CPU on the Switch.

Seems unlikely they would do that for gaming.

It might downclock when idle or in the OS.

Does fillrate scale at all with this raw graphics-processing measurement?

Yes; the ROPs and shader units have been on the same clock domain since Kepler.
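So peak pixel fillrate moves in lockstep with the clock that sets the GFLOPS figure. A minimal sketch, assuming the TX1's 16 ROPs at one pixel per ROP per cycle (the ROP count is the assumption here):

```python
def pixel_fillrate_gpix_s(rops, clock_ghz):
    """Peak pixel fillrate: one pixel per ROP per clock cycle."""
    return rops * clock_ghz

full = pixel_fillrate_gpix_s(16, 1.0)       # 16 Gpix/s at stock 1 GHz
portable = pixel_fillrate_gpix_s(16, 0.585)
print(portable / full)  # same ratio as the GFLOPS drop: 0.585
```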
 
Is it? The last custom design from Nintendo in the console market was the GC 15 years ago; both the Wii and Wii U are rehashes of that architecture, and the portables have been designed with old/cheap technology forever.

People should not read into my words any kind of doom and gloom, or me trying to downplay Nintendo. For me this is the most exciting product coming from Nintendo since the GC, and the second time I'm going to be there day one since the N64.

But just following basic logic / Occam's razor with the info we really have (not speculation), the close-to-stock 20nm Tegra X1 scenario seems a lot more likely than a super-customized 16nm SoC. I may be wrong, and feel free to disregard my mumblings, but the funny thing is I'm super pumped about playing portable 720p no-slowdown BotW (and maybe, fingers crossed, 1080p on TV), so whatever happens it's already a big win-win for me.

Nintendo's handhelds may have used "old and cheap" technology, but that definitely doesn't equate to off-the-shelf chips. The GBA, DS and 3DS had custom SoCs designed in-house by Nintendo using ARM (and, in the case of the 3DS, DMP) IP. They even designed whole new chips for the DSi and n3DS as well.

On the console front, for their 3D consoles, we've seen:

N64 - CPU: Lightly customised NEC MIPS core - GPU: Heavily customised by Silicon Graphics (loosely based on an existing CPU/DSP)
GC - CPU: Existing IBM PPC core with significant customisations, including a large set of ISA extensions - GPU: Fully custom GPU designed from the ground up for Nintendo by ArtX.
Wii - CPU & GPU: Lightly modified versions of existing GC hardware.
Wii U - CPU: GC/Wii core modified with multicore support and new cache - GPU: Custom chip designed in-house by Nintendo using AMD IP.

You can argue that Wii and Wii U were "rehashes" of existing tech, but they sure as hell weren't off the shelf chips. Even going back to the NES and SNES Nintendo used custom hardware.

Now I'm not arguing that they're going to use a custom GPU designed from the ground up like back in the Gamecube days, that's clearly not practical for any console maker nowadays. They're going to use the Maxwell/Pascal GPU architecture and they'll almost certainly use stock ARM CPU cores, but their choice of how the GPU is configured (ie number of SMs, ROPs, cache configuration, etc) will be entirely unconstrained by what the TX1 happened to use. Similarly, the choice of CPU cores they use, their number, how they're clustered and what size cache to use is entirely open for them to choose as they wish without constraint. Ditto with memory interface and any other number of aspects of the SoC.

We also need to keep in mind how big a deal this chip is from the perspective of Nvidia's Tegra division. Even in the worst case scenario it will be the biggest selling Tegra powered device by at least an order of magnitude (possibly two) and will almost certainly comfortably outsell every other Tegra chip combined. They're not going to put piles of effort into the TX1 and then just phone this one in. We also know that Nintendo "contributed significantly" to Nvidia's big jump in earnings last quarter, which is about the time you may expect Nintendo to make final payments for completed R&D. They wouldn't be paying Nvidia that kind of money for a lightly modified TX1.

That's not to say I expect something vastly more powerful than TX1, as a 2 SM GPU would seem the most likely given Nintendo's cost and thermal constraints. However, any other similarities to TX1 will likely be little more than coincidence.
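To put a number on that "2 SM" baseline: Maxwell/Pascal SMs carry 128 CUDA cores each, so the same peak-FLOPS arithmetic used elsewhere in the thread gives (the clock values below are purely illustrative, not leaks):

```python
CORES_PER_SM = 128   # Maxwell/Pascal SM width
FMA_FLOPS = 2        # fused multiply-add = 2 FLOPs per core per cycle

def config_gflops(sms, clock_ghz):
    """Peak FP32 GFLOPS for a hypothetical SM count and clock."""
    return sms * CORES_PER_SM * FMA_FLOPS * clock_ghz

# A 2-SM part matches the TX1's shader count, so clocks decide everything
for clock_ghz in (0.585, 0.768, 1.0):   # illustrative clock points
    print(clock_ghz, config_gflops(2, clock_ghz))
```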
 
Can some of the more knowledgeable tech/graphics people explain what improvements we should expect, not only from the power leap over the Wii U but in terms of improved features and effects on the Switch?

Better textures with the bump in RAM, and some AA at last, are the obvious ones.

What kinds of things will be possible with the CPU leap? Will all Nintendo games now have PBR, for example? Thanks.
 
Nintendo's handhelds may have used "old and cheap" technology, but that definitely doesn't equate to off-the-shelf chips. The GBA, DS and 3DS had custom SoCs designed in-house by Nintendo using ARM (and, in the case of the 3DS, DMP) IP. They even designed whole new chips for the DSi and n3DS as well.

On the console front, for their 3D consoles, we've seen:

N64 - CPU: Lightly customised NEC MIPS core - GPU: Heavily customised by Silicon Graphics (loosely based on an existing CPU/DSP)
GC - CPU: Existing IBM PPC core with significant customisations, including a large set of ISA extensions - GPU: Fully custom GPU designed from the ground up for Nintendo by ArtX.
Wii - CPU & GPU: Lightly modified versions of existing GC hardware.
Wii U - CPU: GC/Wii core modified with multicore support and new cache - GPU: Custom chip designed in-house by Nintendo using AMD IP.

You can argue that Wii and Wii U were "rehashes" of existing tech, but they sure as hell weren't off the shelf chips. Even going back to the NES and SNES Nintendo used custom hardware.

Now I'm not arguing that they're going to use a custom GPU designed from the ground up like back in the Gamecube days, that's clearly not practical for any console maker nowadays. They're going to use the Maxwell/Pascal GPU architecture and they'll almost certainly use stock ARM CPU cores, but their choice of how the GPU is configured (ie number of SMs, ROPs, cache configuration, etc) will be entirely unconstrained by what the TX1 happened to use. Similarly, the choice of CPU cores they use, their number, how they're clustered and what size cache to use is entirely open for them to choose as they wish without constraint. Ditto with memory interface and any other number of aspects of the SoC.

We also need to keep in mind how big a deal this chip is from the perspective of Nvidia's Tegra division. Even in the worst case scenario it will be the biggest selling Tegra powered device by at least an order of magnitude (possibly two) and will almost certainly comfortably outsell every other Tegra chip combined. They're not going to put piles of effort into the TX1 and then just phone this one in. We also know that Nintendo "contributed significantly" to Nvidia's big jump in earnings last quarter, which is about the time you may expect Nintendo to make final payments for completed R&D. They wouldn't be paying Nvidia that kind of money for a lightly modified TX1.

That's not to say I expect something vastly more powerful than TX1, as a 2 SM GPU would seem the most likely given Nintendo's cost and thermal constraints. However, any other similarities to TX1 will likely be little more than coincidence.

Think about the Classic Edition Mini NES ;p
 
Can some of the more knowledgeable tech/graphics people explain what improvements we should expect, not only from the power leap over the Wii U but in terms of improved features and effects on the Switch?

Better textures with the bump in RAM, and some AA at last, are the obvious ones.

What kinds of things will be possible with the CPU leap? Will all Nintendo games now have PBR, for example? Thanks.

PBR is GPU-side, but I think we can expect it to run pretty much any modern shader; the question is how well and at what fidelity. That said, UE4 is officially supported by Epic, so take that as you will.
 
Think about the Classic Edition Mini NES ;p

Well, yes, but that's not exactly a console in the traditional sense, and given how low a performance bar is necessary to emulate NES games there was no point designing a custom chip when so many cheap ARM SoCs are available.

Seems unlikely they would do that for gaming.

It might downclock in idle or in OS.

You're most likely right, but in theory if they were to use an adaptive sync display in portable mode they may decide that lower frame rates are more acceptable in portable mode versus docked mode with fixed-sync TVs, and could clock down the CPU on the assumption that, say, 45fps is good enough in portable mode versus 60fps in docked. It would allow them to clock the CPU down by about 25% and bring the GPU down even further (assuming a resolution drop in portable as well), saving a reasonable amount of power.

That's not to say that I'd expect it to happen, not least because, even if 60fps Nintendo games may look fine at 45fps on an adaptive sync display, 30fps third party games wouldn't fare as well at 22fps (or below if they struggle to hit 30 docked).
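The 25% figure above is just the frame-rate ratio: if the CPU work per frame stays constant, the clock can scale with the target frame rate. A toy sketch of that reasoning (the frame rates and the 1 GHz base clock are hypothetical):

```python
def scaled_cpu_clock_ghz(docked_clock_ghz, docked_fps, portable_fps):
    """If per-frame CPU work is fixed, clock scales with frame rate."""
    return docked_clock_ghz * (portable_fps / docked_fps)

# 60 fps docked -> 45 fps portable allows a 25% CPU downclock
print(scaled_cpu_clock_ghz(1.0, 60, 45))   # 0.75 GHz

# The catch: a 30 fps docked game would drop to 22.5 fps portable
print(30 * 45 / 60)                        # 22.5 fps
```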
 
PBR is GPU-side, but I think we can expect it to run pretty much any modern shader; the question is how well and at what fidelity. That said, UE4 is officially supported by Epic, so take that as you will.

Yeah, I know that much, lol!

It was interesting hearing Miyamoto complain about the Wii U CPU. I wonder what they can come up with physics-wise now that they have a much more capable CPU to work with. That might play a large role in Mario Switch.

Hopefully the more powerful hardware can spark their developers' crazy imaginations. I suppose there's a limit to how good their characters can look given their chosen art style.
 
Can some of the more knowledgeable tech / graphical people explain what improvements we should expect not only from the power leap over WiiU but in terms of the improved features and effects on Switch ?

Better textures with the bump in RAM and some AA at last are obvious ones.

What kinds of things will be possible from the CPU leap ? Will all Nintendo games now have PBR for example ? Thanks.
Since you mention PBR - that will be mainly a function of modern graphics pipelines - UE4, Unity, etc.
 
Yeah, I know that much, lol!

It was interesting hearing Miyamoto complain about the Wii U CPU. I wonder what they can come up with physics-wise now that they have a much more capable CPU to work with. That might play a large role in Mario Switch.

Hopefully the more powerful hardware can spark their developers' crazy imaginations. I suppose there's a limit to how good their characters can look given their chosen art style.
Also, according to the Nvidia blog post they made a custom physics engine for the Switch, whatever that means.

Is it possible the Switch version of BotW uses more advanced physics?
 
Since you mention PBR - that will be mainly a function of modern graphics pipelines - UE4, Unity, etc.

Interesting thanks.

Do you think Nintendo would entertain the possibility of using UE4 as their main engine for first-party games, or will they stick with their own in-house engines? Maybe the FP16 performance advantages when using UE4 could sway it?

Also, I've noticed some (at least to my layman's eye) advanced lighting techniques in MK8 and BotW (especially in the shrines) that seem to mimic more advanced lighting solutions that should be impossible on Wii U hardware. I take it that's just their custom solution to lighting?

Also, according to the Nvidia blog post they made a custom physics engine for the Switch, whatever that means.

Is it possible the Switch version of BotW uses more advanced physics?

That's very interesting indeed. I can't wait to see the differences between the WiiU and Switch version of BotW that go beyond visuals (if there are any).
 
Interesting thanks.

Do you think Nintendo would entertain the possibility of using UE4 as their main engine for first-party games, or will they stick with their own in-house engines? Maybe the FP16 performance advantages when using UE4 could sway it?

Also, I've noticed some (at least to my layman's eye) advanced lighting techniques in MK8 and BotW (especially in the shrines) that seem to mimic more advanced lighting solutions that should be impossible on Wii U hardware. I take it that's just their custom solution to lighting?

I can guarantee they will stick with their own engines. Do you think Nintendo would pay anyone a license for anything in their own games?
 
Yeah, I know that much, lol!

It was interesting hearing Miyamoto complain about the Wii U CPU. I wonder what they can come up with physics-wise now that they have a much more capable CPU to work with. That might play a large role in Mario Switch.

Hopefully the more powerful hardware can spark their developers' crazy imaginations. I suppose there's a limit to how good their characters can look given their chosen art style.

I'm kinda impressed by the physics on display in Breath of the Wild; it'll definitely be interesting to see what they can cook up on better hardware.
 
What are the chances that in docked mode it hits 512 GFLOPS and in portable mode it matches the Wii U at 176 GFLOPS?

It has a fan running in portable mode, and the fan runs even faster in docked mode. That seems to suggest the numbers are on the higher end. An X1 underclocked by 20% requires no fan at all to stay cool in a Switch-sized device. The Shield TV is Switch-sized and its fan runs basically silent, so you would not even know it was there, and the Shield TV is a full-powered default X1.
 
It has a fan running in portable mode, and the fan runs even faster in docked mode. That seems to suggest the numbers are on the higher end. An X1 underclocked by 20% requires no fan at all to stay cool in a Switch-sized device. The Shield TV is Switch-sized and its fan runs basically silent, so you would not even know it was there, and the Shield TV is a full-powered default X1.

I'd sooner say, given the fan revelation, that this thing is running at stock X1 speeds on portable, and then whatever+% when docked. This also leads me to think this isn't a 20nm X1 derivative either.
 
What are the chances that in docked mode it hits 512 GFLOPS and in portable mode it matches the Wii U at 176 GFLOPS?

Zero. I hate to keep repeating it, but you don't need a fan to cool that kind of performance. No matter what reasons are considered (no throttling, 20nm, etc.), it's just pointless to have a fan for a 340 MHz Tegra chip, and Nintendo doesn't just throw fans into handhelds for no reason (they've never used a fan in a handheld before).
 
I know, I can't quite figure out the clocks that fit:

* Undocked Switch needing active cooling
* Shield TV Tegra X1 1Ghz 20w TDP
* Two profiles that would allow 720p or 1080p

Thing is, on 16nm I highly doubt active cooling would be needed to achieve a 720p goal, so that points towards 20nm. Then again, the Pixel C example raises doubts about the cooling solution. Someone also mentioned an interesting point about avoiding throttling issues, and I think it's safe to say that the CPU cores are not going to be downclocked in portable mode, so as to keep coherency between both modes, and that could be a reason for the active cooling.

I think the need to avoid any throttling could explain the use of a fan to some degree, but not to the degree where it's needed for a Tegra at 600 MHz or so. Maybe at Pixel C speeds of 850 MHz, I can see that.

Unless the CPU is more powerful than we're expecting, as obviously during gaming it will need to stay at full speed even in portable mode. Perhaps the CPU is 4x A72 and 2x A53, all running at pretty high clocks constantly during gaming; that may require a fan even when the GPU is running at 600 MHz or so.
 
Why? It seems people always pick the negative rumours when it concerns Nintendo. Suddenly they all forget the machine is rumoured to be running Dark Souls 3.

The developers didn't exactly give it a glowing review on performance. They stated it was running at "a level of performance they are happy with." If the performance were amazing, they wouldn't be waiting to see the install base before porting it.
 
The developers didn't exactly give it a glowing review on performance. They stated it was running at "a level of performance they are happy with." If the performance were amazing, they wouldn't be waiting to see the install base before porting it.

I'm sure they would but that's not the point.
 