A Nintendo Switch has been taken apart

It doesn't bother me one way or another whether it is 20nm or 16nm, and I have no technical evidence, but my gut says 16nm. Why else would Nintendo choose this launch timing and wait to deliver the final dev machines, if they weren't waiting on a particular change in tech?

Zelda has been done for a while now, and I am of the opinion that Pokemon Sun and Moon were back-ported to 3DS from the work they did on a Switch game just so they wouldn't miss their anniversary. If Nintendo were waiting to launch the Switch until they got more 3rd-party support, they would have gotten the final dev machines out sooner.

Are there any alternative theories as to why the Switch's design was finalized so late?

Look at the launch dates for the 1st party games. It's pretty obvious Nintendo struggled to finalise enough games in time for Switch's first 6 months.

Plus, this is also the official reason Nintendo gave for releasing in March 2017: they needed the time to make the software.

The Pokemon theory has very little basis in reality, considering the timing of the Pokemon releases on DS and 3DS combined with the 3DS launch.
 
But now docked mode seems to lag behind what a straight 720p-to-1080p conversion would need. I would have expected this handheld increase to be paired with an increase in docked speed as well.

I was thinking the same thing; there should be a power boost for docked mode too, imo.
 
So the Foxconn leak mentioned an enhanced unit. I'm still not sure what it is, but I'm wondering if it is something you plug into the Switch, or if it is a dock with a powerful GPU?

I'm wondering whether, if it plugs into the Switch, the CPU clocks might get boosted because of the new configuration. I'm an iPhone developer, but I'm not as knowledgeable as some of you are about the latest hardware designs. I was thinking that portable mode power levels actually hold back docked mode power levels, because you don't want too big of a difference between handheld and docked mode.
 
The new boost power mode seems to be intended for the games that couldn't run at 1080p while docked anyway, allowing them to run at native resolution on the tablet at least.
 
The new boost power mode seems to be intended for the games that couldn't run at 1080p while docked anyway, allowing them to run at native resolution on the tablet at least.

It fits with the Foxconn leak; 5x the base frequency was already an assumption based on Foxconn's 921MHz. It also gives a different multiplier from handheld to docked mode: instead of 2.5x it is 2.4x, so slightly less performance available for the resolution bump, but more than the 2.25x required. This does make the Foxconn leak much more plausible, because Eurogamer's "final clocks" weren't final.
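
As a rough sanity check on those multipliers, here's the arithmetic (a minimal sketch; the 76.8MHz clock step, the 307.2/384/768MHz Eurogamer figures and the 921.6MHz Foxconn figure are the numbers assumed in this thread, not confirmed specs):

```python
# Pixel and clock ratios behind the 2.25x / 2.4x / 2.5x / 5x figures above.
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

print(pixels_1080p / pixels_720p)  # 2.25 -> pixel increase for a straight 720p -> 1080p bump
print(768.0 / 307.2)               # 2.5  -> Eurogamer docked vs. original handheld clock
print(768.0 / 384.0)               # 2.0  -> Eurogamer docked vs. boosted handheld clock
print(921.6 / 384.0)               # 2.4  -> Foxconn docked clock vs. boosted handheld clock
print(384.0 / 76.8)                # 5.0  -> boosted handheld clock as a multiple of the 76.8MHz step
```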

Edit: read the article; it seems the undocked clocks might be the only change, with the CPU unchanged.
 
because Eurogamer's "final clocks" weren't final.

This will be the only time I will address this line of thinking, because it's wasted time.

Using semantics as an argument is a lame approach to a debate. The "final" clocks are actually still there; just one additional GPU mode was added (with the corresponding cut to the memory clock). The CPU clock is unchanged.

But whatever, fight the semantic fight if you want.
 
Would they really boost handheld mode and not docked mode? Seems very strange considering battery life and heat (to a certain extent) wouldn't be an issue while docked.
 
Is the fan on the Switch too weak to dissipate the heat if they try to boost the docked speed? If the GPU throttles at 768MHz, is the fan on then?
 
Would they really boost handheld mode and not docked mode? Seems very strange considering battery life and heat (to a certain extent) wouldn't be an issue while docked.

I don't see it as a "boost", more like fine-tuning the system. Maybe they didn't need 1600MHz RAM speed in handheld mode, and cut it to 1300, and that gave them more power headroom for the GPU.
 
On top of all this data and analysis, we can add an additional metric on top - power consumption from the wall in docked mode, measured from final retail Switch hardware. Under load, the highest power draw we've seen so far is 16W.
How would you even get to 16W with that hardware while docked? oO
Even if we assume the worst case, 2W each for the CPU and RAM, and in docked mode there is no display and no speakers to power. Add another watt for the fan and one for wireless connectivity, which leaves us with 10 watts for the GPU at 768MHz.

The only options I can think of are:
- It was only a short peak and the average power draw is way lower
- The power measurements for the GPU from Nvidia were actually done at a clock closer to 250-300MHz and not the estimated ~500MHz, or it was a really well-binned X1, maybe both
- Running a Wii U port is much more taxing than the Manhattan benchmark
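
Spelling out that back-of-the-envelope budget (all per-component figures are the rough worst-case guesses from the post above, not measurements):

```python
# Rough docked power budget in watts, using the guesses above.
wall_draw = 16.0  # highest power draw measured from the wall while docked

components = {
    "cpu": 2.0,       # worst-case guess
    "ram": 2.0,       # worst-case guess
    "fan": 1.0,
    "wireless": 1.0,
}

gpu_budget = wall_draw - sum(components.values())
print(gpu_budget)  # 10.0 W left over for the GPU at 768MHz, which looks far too high
```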
 
Sorry for asking, but I haven't read the thread in a while and would just like to know: does that mean we could get more speed in docked mode, or is that impossible?

How would you even get to 16W with that hardware while docked? oO
Even if we assume the worst case, 2W each for the CPU and RAM, and in docked mode there is no display and no speakers to power. Add another watt for the fan and one for wireless connectivity, which leaves us with 10 watts for the GPU at 768MHz.

The only options I can think of are:
- It was only a short peak and the average power draw is way lower
- The power measurements for the GPU from Nvidia were actually done at a clock closer to 250-300MHz and not the estimated ~500MHz, or it was a really well-binned X1, maybe both
- Running a Wii U port is much more taxing than the Manhattan benchmark

The guy from the German GamePro tested it as well and measured 12 watts when everything was fully charged, and up to 18 watts when the console was docked and charging both the console and the Joy-Cons.
It's also worth mentioning that the console runs at about 104 degrees Fahrenheit (40°C) in both docked and handheld mode while playing Zelda, and gets about 38dB loud.
 
Would they really boost handheld mode and not docked mode? Seems very strange considering battery life and heat (to a certain extent) wouldn't be an issue while docked.
When docked the console runs at full speed. It is what it is. Unless we start to talk about overclocking, which is not the case here.

When in portable mode the console is purposefully throttled to conserve energy, but it's a purely programmed restriction. So there's room to lift that restriction. Instead of just limiting it to 40% of the docked clock speed, it now has a mode available where it's limited to 50% (which represents a 25% increase).
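
Putting numbers to that (using the 307.2/384/768MHz clocks reported elsewhere in this thread):

```python
docked = 768.0          # MHz, docked GPU clock
handheld_old = 307.2    # MHz, original handheld GPU clock
handheld_boost = 384.0  # MHz, new boosted handheld GPU clock

print(handheld_old / docked)          # 0.4  -> old handheld cap is 40% of the docked clock
print(handheld_boost / docked)        # 0.5  -> new cap is 50% of the docked clock
print(handheld_boost / handheld_old)  # 1.25 -> a 25% increase over the old handheld clock
```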
 
In usual Nintendo fashion I can see where they're going with it. You'd get some amazing games that perform at 60fps using the typical Nintendo art style. The problem is that 3rd-party games go for the more realistic approach, and I've been telling myself that the Switch is never going to be for that. When I tried it at a demo booth it felt like an arcade machine: intense, fun, vibrant visuals, but not super realistic graphics. I feel that if it succeeds, 3rd-party devs will take a different approach: not release their serious, realistic games on it, and maybe even create simpler, more basic games dedicated to the platform. It will also be where indies truly flourish. So basically, if a game doesn't have a similar art direction, which means simpler, less intensive graphics, it won't release on the Switch. For example, Borderlands will go on Switch but Battlefield will not, etc. Sorry, it's late here and I'm probably ranting and derailing this a bit. Lol.

The question is why 3rd parties would go to the effort of creating unique games just for the Switch. This is the problem Nintendo has. PS4/XONE and PC have games that can just slide across; you can't do that with Switch without significant outlay. The userbase is effectively 0 vs nearly 100m for consoles and PC combined. It doesn't make financial sense to develop exclusive software.
 
This will be the only time I will address this line of thinking, because it's wasted time.

Using semantics as an argument is a lame approach to a debate. The "final" clocks are actually still there; just one additional GPU mode was added (with the corresponding cut to the memory clock). The CPU clock is unchanged.

But whatever, fight the semantic fight if you want.

After reading the article: they say they had a look at the updated dev doc, so I'm not trying to stick to the Foxconn leak, I just hadn't read the article yet. However, the power consumption wouldn't match in portable mode, and the memory bump has also been disabled in portable mode as a compromise. They also treat this teardown as final hardware. We are less than a week away, so I see no reason to assume this is in fact final hardware, since that is speculation right now. Nothing about the SoC would need to be changed to reach the Foxconn clocks either, as they are within the X1's capacity already, and moving to 16nm actually shouldn't increase or decrease the die size, as logic size doesn't change.

I'd also like to point out that clocking the Switch up from 384MHz to 768MHz isn't going to take the docked power consumption from a ~2.5W SoC to a 12W power draw. It's a bit ridiculous to take all this stuff at face value; I'll wait a couple of weeks still.
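
To put rough numbers on why such a jump looks implausible (the ~2.5W SoC figure is from the post; the 1W GPU share at 384MHz and the purely linear clock scaling are my own assumptions, and a real clock bump usually needs a voltage increase too, so treat this as a ballpark only):

```python
# If GPU power scaled linearly with clock, doubling 384MHz -> 768MHz
# would only double the GPU's share of the SoC power, not the whole system draw.
soc_handheld_w = 2.5   # rough handheld SoC estimate from the post
gpu_share_w = 1.0      # assumed GPU portion of that at 384MHz

soc_docked_w = (soc_handheld_w - gpu_share_w) + gpu_share_w * (768.0 / 384.0)
print(soc_docked_w)    # ~3.5 W -- nowhere near a 12W system-level draw on its own
```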
 
This thread is interesting but I'm not very knowledgeable when it comes to all these flops and processes and whatnot.

In pure simple terms for me to understand, how much more powerful is Switch compared to Wii U? Are we talking 2-3 Wii U's of power?
 
This will be the only time I will address this line of thinking, because it's wasted time.

Using semantics as an argument is a lame approach to a debate. The "final" clocks are actually still there; just one additional GPU mode was added (with the corresponding cut to the memory clock). The CPU clock is unchanged.

But whatever, fight the semantic fight if you want.

So Eurogamer's final clocks were final, it's just that Nintendo added a new mode with a higher clock? Now that's semantics. Clearly what was being touted as final clocks wasn't entirely final, as handheld mode now has the option of a higher clock speed.

There are obviously still very reasonable arguments against the Foxconn clocks, I'm not denying that. But one thing's for sure: "we already have final clocks from Eurogamer" isn't one of them.
 
This thread is interesting but I'm not very knowledgeable when it comes to all these flops and processes and whatnot.

In pure simple terms for me to understand, how much more powerful is Switch compared to Wii U? Are we talking 2-3 Wii U's of power?
Maybe 1.5x to 2x in handheld mode at most, considering the differences in architecture. But with more advantages, like triple the memory.
 
This thread is interesting but I'm not very knowledgeable when it comes to all these flops and processes and whatnot.

In pure simple terms for me to understand, how much more powerful is Switch compared to Wii U? Are we talking 2-3 Wii U's of power?

Undocked, it's about twice as powerful, with about a 20% raw performance increase over Wii U before architecture and fp16 optimizations are taken into account.

If a developer, for instance, comes along and can do ~70% of the GPU's work in fp16, it would reach about 333GFLOPS, or well over 3 times Wii U's performance. This is a peak optimization estimate and not something you'd see everywhere, but the handheld is much faster now.
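
For anyone wondering where the ~333GFLOPS figure comes from, here is the arithmetic (assuming 256 Maxwell CUDA cores at the boosted 384MHz handheld clock, 2 FLOPs per core per clock, and double-rate fp16; these X1 figures are the ones assumed throughout this thread):

```python
cores = 256          # Maxwell CUDA cores assumed for the X1-based SoC
clock_hz = 384e6     # boosted handheld GPU clock

fp32_gflops = cores * 2 * clock_hz / 1e9   # ~196.6 GFLOPS at fp32
fp16_gflops = fp32_gflops * 2              # ~393.2 GFLOPS with double-rate fp16

# 70% of the workload in fp16, the remaining 30% in fp32:
mixed_gflops = 0.7 * fp16_gflops + 0.3 * fp32_gflops
print(fp32_gflops, fp16_gflops, mixed_gflops)  # ~196.6, ~393.2, ~334.2 GFLOPS
```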
 
This is a forest-for-the-trees misapprehension, though. In the context of wider multi-platform development, having a contemporaneous console design whose power is out of line with competitors can be very developer-unfriendly, even if the generation-to-generation portability is more straightforward. Think art reduction costs, re-design of gameplay systems for divergent CPU capabilities, etc.

NB: I do not speak for the company, etc.

That's only been the case like one time, though...
 
When docked the console runs at full speed. It is what it is. Unless we start to talk about overclocking, which is not the case here.

When in portable mode the console is purposefully throttled to conserve energy, but it's a purely programmed restriction. So there's room to lift that restriction. Instead of just limiting it to 40% of the docked clock speed, it now has a mode available where it's limited to 50% (which represents a 25% increase).

Overclocking isn't relevant here; it's an end-user thing, not a chip designer/system manufacturer thing.
 
Undocked, it's about twice as powerful, with about a 20% raw performance increase over Wii U before architecture and fp16 optimizations are taken into account.

If a developer, for instance, comes along and can do ~70% of the GPU's work in fp16, it would reach about 333GFLOPS, or well over 3 times Wii U's performance. This is a peak optimization estimate and not something you'd see everywhere, but the handheld is much faster now.

Maybe 1.5x to 2x in handheld mode at most, considering the differences in architecture. But with more advantages, like triple the memory.

Ahh thanks, so it's like double the power but with better-performing tech; that's good. Nintendo did some great stuff on Wii U, especially BotW, so I'm excited to see what else they do with more power.
 
Ahh thanks, so it's like double the power but with better-performing tech; that's good. Nintendo did some great stuff on Wii U, especially BotW, so I'm excited to see what else they do with more power.

What is really exciting to me is second-wave software from Monolith. I mean, if they did a full game in only 3 years with unfinished hardware, and did XBCX on Wii U, just think what is possible for them with this new hardware in 2020/2021.
 
What is really exciting to me is second-wave software from Monolith. I mean, if they did a full game in only 3 years with unfinished hardware, and did XBCX on Wii U, just think what is possible for them with this new hardware in 2020/2021.

I never played Xenoblade Chronicles X, but it always looked quite amazing for a Wii U game. I am quite interested in seeing them take full advantage of a system that is, when docked, about 3-5 times as powerful.
 
So Eurogamer's final clocks were final, it's just that Nintendo added a new mode with a higher clock? Now that's semantics. Clearly what was being touted as final clocks wasn't entirely final, as handheld mode now has the option of a higher clock speed.

There are obviously still very reasonable arguments against the Foxconn clocks, I'm not denying that. But one thing's for sure: "we already have final clocks from Eurogamer" isn't one of them.

You want to discuss semantics? Let's discuss semantics. Let's do it based on actual quotes.

http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-switch-spec-analysis

Eurogamer said:
Documentation supplied to developers along with the table above ends with this stark message: "The information in this table is the final specification for the combinations of performance configurations and performance modes that applications will be able to use at launch."

It's a direct quote from the docs; it belongs to Nintendo, not to Eurogamer.

http://www.eurogamer.net/articles/d...-boosts-handheld-switch-clocks-by-25-per-cent

As things stand, our previously reported CPU and GPU clocks remain the default configurations for docked and handheld modes. However, having looked first-hand at a revised version of the document we previously saw in December, a new 'NX add-on' note appended to the doc introduces an expanded table of operating modes. This is how the table looks now, with the new additions in bold.

Now tell me again how this works against the credibility of Eurogamer?
 
You want to discuss semantics? Let's discuss semantics. Let's do it based on actual quotes.

http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-switch-spec-analysis



It's a direct quote from the docs; it belongs to Nintendo, not to Eurogamer.

http://www.eurogamer.net/articles/d...-boosts-handheld-switch-clocks-by-25-per-cent



Now tell me again how this works against the credibility of Eurogamer?

Well, they did say that they got a look at that document a lot longer ago than December... By their own account, the reason they posted the article on December 6th was VentureBeat's article, which led them to write their own, not seeing the docs that week; they said they sat on them for months.

What we do know is that a 500MHz X1 draws 1.5 watts, so you have 384MHz drawing at most a watt. That gives the SoC a maximum power draw of 2.83 watts, leaving 3.6 watts for the rest of the system? Ridiculous.
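
Walking through that estimate (a sketch only; the 1.5W-at-500MHz figure and the linear scaling come from the post and from the Nvidia numbers mentioned earlier in the thread, not from measurements of the Switch itself):

```python
# Scale the quoted 1.5W @ 500MHz figure linearly down to 384MHz.
gpu_500mhz_w = 1.5
gpu_384mhz_w = gpu_500mhz_w * 384.0 / 500.0
print(gpu_384mhz_w)    # ~1.15 W with naive linear scaling; the post rounds this to "at most a watt"

# The post's totals: ~2.83W for the whole SoC leaves ~3.6W for everything else.
soc_w = 2.83
rest_w = 3.6
print(soc_w + rest_w)  # ~6.4 W implied total portable power budget
```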
 
You want to discuss semantics? Let's discuss semantics. Let's do it based on actual quotes.

http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-switch-spec-analysis



It's a direct quote from the docs; it belongs to Nintendo, not to Eurogamer.

http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-switch-spec-analysis



Now tell me again how this works against the credibility of Eurogamer?

True, there is nothing in the new Eurogamer leak that invalidates the first report being, at the time, the final specs. It's not as if all bets are off now. The validity of this new report hinges on how recent it is: if they have a leak from, say, August, September or October, then it is not the most recent rumour, and nothing would rule out the Foxconn leak's speeds coming to fruition at a later point in time (since those clocks were still being tested at the time of the Foxconn leak). I personally don't think this is the case, and I am counting on the new Eurogamer specs being what we are dealing with for launch (although the questions asked here about power consumption remain interesting nonetheless).

Edit: Although if they had this new spec sheet before December, then they most likely would not have published the old spec sheet in December, so that kinda tells us that they at least received the new spec sheet after airing their first report.
 
What we do know is that a 500MHz X1 draws 1.5 watts, so you have 384MHz drawing at most a watt. That gives the SoC a maximum power draw of 2.83 watts, leaving 3.6 watts for the entire system? Ridiculous.

You're totally ignoring the power draw of the memory.

Seeing how the increase in the GPU clock is offset by decreasing the memory clock, it might be significant. Edit: the decrease in the memory clock applies to both handheld modes.

I posted some info earlier in the thread on how the power draw of LPDDR4 increases dramatically with performance, and we don't know exactly how it works in the Switch.

Edit: and all these calculations are based on the assumption that the Manhattan benchmark pushes the X1 as much as Zelda does.
 
You're totally ignoring the power draw of the memory.

Seeing how the increase in the GPU clock is offset by decreasing the memory clock, it might be significant.

I posted some info earlier in the thread on how the power draw of LPDDR4 increases dramatically with performance, and we don't know exactly how it works in the Switch.

The memory speed decrease and GPU speed increase need not necessarily be related, though. They could have just found that the higher memory speed turned out to be unnecessary for handheld mode, and disabled it to save some battery. The memory's power draw could still be significant, but I don't think we can deduce that from this change alone.
 
The memory speed decrease and GPU speed increase need not necessarily be related, though. They could have just found that the higher memory speed turned out to be unnecessary for handheld mode, and disabled it to save some battery. The memory's power draw could still be significant, but I don't think we can deduce that from this change alone.

Edit: read it again; the reduced memory clock is for both handheld modes, so it is unrelated.

Anyhow, we still don't know what the power draw for it is.
 
You're totally ignoring the power draw of the memory.

Seeing how the increase in the GPU clock is offset by decreasing the memory clock, it might be significant.

I posted some info earlier in the thread on how the power draw of LPDDR4 increases dramatically with performance, and we don't know exactly how it works in the Switch.

Edit: and all these calculations are based on the assumption that the Manhattan benchmark pushes the X1 as much as Zelda does.

Probably a fairly safe assumption when you consider Zelda is a Wii U port and had less than a year of development time. We will know for sure in a couple of weeks what is going on; considering Nintendo's memory options, a 2W+ memory solution seems ridiculous.
 
The Eurogamer article explicitly said they actually think the Foxconn clocks were used as a stress-testing clock speed, yet everything we know about the Tegra X1 in the Shield TV tells us those speeds shouldn't be possible for any reasonable amount of time.

The Foxconn leak does not explicitly say those clocks were run for hours or days, but it heavily suggests it. I still think that, even if this was just a stress test and the clocks will never be raised to that level, the only reasonable conclusion from the chip reaching those clocks is that it's on a 16nm process node. I'm not sure why Eurogamer seems to ignore that discrepancy in their article. Maybe it's because the Shield TV throttling still isn't common knowledge?

Of course, if the Foxconn clocks were not for a stress test and were just listed as the maximum possible speeds, like 2GHz and 1GHz are for the TX1, that changes the story.
 
Maybe Mdave can test the Foxconn clocks on the Shield TV.

If he can input specific clocks like that, this would indeed be a good test. Based on what he's said, though, I think the clocks typically go far lower than that. And keep in mind the Shield TV has a much larger volume for heat dissipation, though I guess we can't be sure that the cooling is that much more efficient.
 
True, there is nothing in the new Eurogamer leak that invalidates the first report being, at the time, the final specs. It's not as if all bets are off now. The validity of this new report hinges on how recent it is: if they have a leak from, say, August, September or October, then it is not the most recent rumour, and nothing would rule out the Foxconn leak's speeds coming to fruition at a later point in time (since those clocks were still being tested at the time of the Foxconn leak). I personally don't think this is the case, and I am counting on the new Eurogamer specs being what we are dealing with for launch (although the questions asked here about power consumption remain interesting nonetheless).

Edit: Although if they had this new spec sheet before December, then they most likely would not have published the old spec sheet in December, so that kinda tells us that they at least received the new spec sheet after airing their first report.

There is newer documentation than December. That much I do know. I haven't seen it personally though. I have friends who have access to the portal, but they won't tell me much. They said they have documentation from January 12th.
 
Exactly. And reducing that amount of effort to get acceptable performance for most games seems like an important hardware design goal even for consoles these days, much more so than it used to be.

Just look at the platform development from PS2 and PS3 to PS4.
But reducing the amount of effort (i.e. ultimately to no extra optimisation cost) means providing the performance levels of a desktop, whatever that entails. Where do we draw the line with that?

I'd love to see that...
A few screengrabs.
 
There is newer documentation than December. That much I do know. I haven't seen it personally though. I have friends who have access to the portal, but they won't tell me much. They said they have documentation from January 12th.
That's interesting. If the Eurogamer documentation is from January 12th as well, then it would stand to reason that the clock speeds of the presentation are aimed at the boost mode as well (since we can likely assume Zelda uses it, right?).
 
That's interesting. If the Eurogamer documentation is from January 12th as well, then it would stand to reason that the clock speeds of the presentation are aimed at the boost mode as well (since we can likely assume Zelda uses it, right?).

January 12th? It would be too much of a coincidence not to.

If he can input specific clocks like that, this would indeed be a good test. Based on what he's said, though, I think the clocks typically go far lower than that. And keep in mind the Shield TV has a much larger volume for heat dissipation, though I guess we can't be sure that the cooling is that much more efficient.

The new Shield is much smaller while using a revision of the same SoC (which at first glance resembles the Switch SoC), so the larger volume doesn't seem to do too much for it.
 
The Eurogamer article explicitly said they actually think the Foxconn clocks were used as a stress-testing clock speed, yet everything we know about the Tegra X1 in the Shield TV tells us those speeds shouldn't be possible for any reasonable amount of time.

The Foxconn leak does not explicitly say those clocks were run for hours or days, but it heavily suggests it. I still think that, even if this was just a stress test and the clocks will never be raised to that level, the only reasonable conclusion from the chip reaching those clocks is that it's on a 16nm process node. I'm not sure why Eurogamer seems to ignore that discrepancy in their article. Maybe it's because the Shield TV throttling still isn't common knowledge?

Of course, if the Foxconn clocks were not for a stress test and were just listed as the maximum possible speeds, like 2GHz and 1GHz are for the TX1, that changes the story.

The maximum possible speeds would still be what you listed for the TX1.

What you go through in this post is what I'm actively thinking about. It's easy to just accept what you're told, but the article makes a fair number of assumptions. I also think it's odd for Nintendo to have moved away from their concept of docking the device for 1080p, because the jump from 384MHz to 768MHz would not be enough to cover the resolution change.

Also, Jose at IGN commented that when the device is docked it gets much, much hotter than when it is undocked, and a 100% clock increase on the GPU alone shouldn't be enough to account for that. It's also compelling that the handheld boost clock of 384MHz is exactly what we assumed it would be under the Foxconn clocks.

January 12th? It would be too much of a coincidence not to.



The new Shield is much smaller while using a revision of the same SoC (which at first glance resembles the Switch SoC), so the larger volume doesn't seem to do too much for it.

Zelda's power draw doesn't max out the system, given the "about 3 hours" of battery life, so who knows?

The new Shield is the same size as the old Shield; the hard drive is just removed. Though has anyone checked the power consumption of that device? There was a slight performance upgrade over the original Shield TV.
 
For CPU experts (*cough* blu *cough*) would 3x A57s at 1GHz with these GPU clocks be "less CPU limited" as Matt said than the PS4 or XB1 CPUs are? I'm assuming he meant the balance between GPU and CPU is better for the Switch than it is for the PS4 and XB1.

I still haven't quite figured out how A53s compare to twice as many (much faster) Jaguar cores. EDIT: oops meant A57s.

The maximum possible speeds would still be what you listed for the TX1.

What you go through in this post is what I'm actively thinking about. It's easy to just accept what you're told, but the article makes a fair number of assumptions. I also think it's odd for Nintendo to have moved away from their concept of docking the device for 1080p, because the jump from 384MHz to 768MHz would not be enough to cover the resolution change.

Also, Jose at IGN commented that when the device is docked it gets much, much hotter than when it is undocked, and a 100% clock increase on the GPU alone shouldn't be enough to account for that. It's also compelling that the handheld boost clock of 384MHz is exactly what we assumed it would be under the Foxconn clocks.

I agree with almost all of that, but the Switch getting much hotter when docked can be easily explained by the battery being charged.
 
That's interesting. If the Eurogamer documentation is from January 12th as well, then it would stand to reason that the clock speeds of the presentation are aimed at the boost mode as well (since we can likely assume Zelda uses it, right?).

I really have no idea. I would assume Zelda uses it, simply because of how demanding open worlds are.
 