A Nintendo Switch has been taken apart

I think it's also odd for Nintendo to have moved away from their concept of docking the device for 1080p, because the jump from 384MHz to 768MHz would not be enough to cover the resolution change
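The arithmetic behind that point can be sketched quickly (purely illustrative, assuming cost scales linearly with pixel count):

```python
# Back-of-the-envelope check of the clock-vs-resolution argument.
# 720p -> 1080p multiplies the pixel count by 2.25x, while
# 384 MHz -> 768 MHz only doubles raw GPU throughput.

handheld_px = 1280 * 720      # 921,600 pixels
docked_px = 1920 * 1080       # 2,073,600 pixels
pixel_ratio = docked_px / handheld_px

clock_ratio = 768 / 384

print(f"pixel ratio: {pixel_ratio:.2f}x, clock ratio: {clock_ratio:.2f}x")
# The 2.0x clock scaling falls short of the 2.25x pixel scaling,
# which is the gap the post is pointing at.
```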

You can't make an argument about sticking to a convention that you yourself are not respecting when your flagship game is 900p/720p

Also, Jose at IGN commented that when the device is docked it runs much, much hotter than when it is undocked, and a 100% clock increase on the GPU alone shouldn't be enough to account for that.

That's a rather empirical measurement for such a definitive conclusion.
 
He played for 5-6 hours straight on a tablet-ish device that was connected to a power source; you bet it's warm.

EDIT: I don't know what you're trying to say with the Manhattan benchmark, but it barely looks better than a PS2 game, thanks to more modern-ish effects and a higher res.
 
For CPU experts (*cough* blu *cough*) would 3x A57s at 1GHz with these GPU clocks be "less CPU limited" as Matt said than the PS4 or XB1 CPUs are? I'm assuming he meant the balance between GPU and CPU is better for the Switch than it is for the PS4 and XB1.

I still haven't quite figured out how A53s compare to twice as many (much faster) Jaguar cores.



I agree with almost all of that, but the Switch getting much hotter when docked can be easily explained by the battery being charged.

He said it was after 4 to 5 hours with the battery fully charged. It's in the NVC podcast from this week.
 
You can't make an argument about sticking to a convention that you yourself are not respecting when your flagship game is 900p/720p



That's a rather empirical measurement for such a definitive conclusion.
But many other games do get such a resolution bump... to the point where Zelda is an outlier so far. And the only games that are 720p in both handheld and docked mode are all still a long way off.
Also, AC4 was 900p at launch on PS4 and 1080p with the day-one patch. COD Ghosts was 720p.
 
He said it was after 4 to 5 hours with the battery fully charged. It's in the NVC podcast from this week.

It would still be constantly charging, though I guess electronics typically don't get that hot when charging at 100%, so good point.

It would be interesting to see some hard data about temperatures/fan noise.
 
But many other games do get such a resolution bump... to the point where Zelda is an outlier so far. And the only games that are 720p in both handheld and docked mode are all still a long way off.
Also, AC4 was 900p at launch on PS4 and 1080p with the day-one patch. COD Ghosts was 720p.

And that's why this new handheld mode is optional.

If a game can run at 1080p docked, then it can use the lower-clocked handheld mode for 720p and better battery life.

If a game has trouble running at 1080p while docked and runs instead at 900p or even lower, the new boost mode is a godsend for handheld mode.

It would still be constantly charging, though I guess electronics typically don't get that hot when charging at 100%, so good point.

Anecdotal evidence: my phone runs much hotter while plugged in even for the most mundane tasks.
 
It would still be constantly charging, though I guess electronics typically don't get that hot when charging at 100%, so good point.

It would be interesting to see some hard data about temperatures/fan noise.

I'm sure we will soon. Now that I think about it, he did say that when he pulled the Switch off the dock the battery was at 83%, so the dock was only lightly recharging the device, not at the full below-50% charge rate, but it should still add something. Didn't someone give us a 12W usage figure when docked? Too bad we don't know how much more the X1's 768MHz would draw over 384MHz; an extra 5.6W seems possible, though the Pixel C drew 8W for the entire system during the Manhattan demo, and it was likely throttled, since the CPU alone should be drawing around that.
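For what it's worth, the plausibility of that ~5.6W delta can be sketched with a simple dynamic-power model. The 3W GPU baseline and the 1.15 voltage ratio below are illustrative assumptions, not measured figures:

```python
# Dynamic power scales roughly with f * V^2; if the voltage also rises
# with the clock, doubling frequency can more than double GPU power.

def dynamic_power(base_w, f_ratio, v_ratio=1.0):
    """Scale a baseline power figure by frequency and voltage ratios."""
    return base_w * f_ratio * v_ratio ** 2

gpu_base_w = 3.0  # ASSUMED GPU power at the handheld clock (illustrative)

same_voltage = dynamic_power(gpu_base_w, 768 / 384)          # 6.0 W
higher_voltage = dynamic_power(gpu_base_w, 768 / 384, 1.15)  # ~7.9 W

print(f"{same_voltage:.1f} W .. {higher_voltage:.1f} W extra-draw range")
```

So a delta in the 5-8W range is at least in the right ballpark under those assumptions.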

And that's why this new handheld mode is optional.

If a game can run at 1080p docked, then it can use the lower-clocked handheld mode for 720p and better battery life.

If a game has trouble running at 1080p while docked and runs instead at 900p or even lower, the new boost mode is a godsend for handheld mode.



Anecdotal evidence: my phone runs much hotter while plugged in even for the most mundane tasks.

Docked mode in Zelda probably involves more than just a resolution bump. Nintendo quotes about 3 hours of battery life while playing Zelda, and Jose got 2 hours and 40 minutes before he docked it. He didn't let it run out, which could have given him another 10-15 minutes; he didn't know at the time that the Switch saves state when the battery dies. Really cool feature IMO
 
He said it was after 4 to 5 hours with the battery fully charged. It's in the NVC podcast from this week.

No, he said it was charging, and he complained that if you play while it charges it takes forever to charge. All in all, it's normal that it was warm.
 
For CPU experts (*cough* blu *cough*) would 3x A57s at 1GHz with these GPU clocks be "less CPU limited" as Matt said than the PS4 or XB1 CPUs are? I'm assuming he meant the balance between GPU and CPU is better for the Switch than it is for the PS4 and XB1.
To adequately answer that question takes probing the full setup - i.e. the afore-mentioned CPU and a TX1 GPU. Since I don't have access to either a Switch or a Shield, I really cannot give an adequate answer.

I still haven't quite figured out how A53s compare to twice as many (much faster) Jaguar cores.
Jaguars still have the better IPC and the twice-wider SIMD ALUs. That said, 28nm A53s run happily at 2GHz (with the appropriate heat dissipation), so for general purpose code it's not that hard for the A53s to stay within 1.6GHz jags, core per core. Or did you mean something else?
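blu's core-per-core parity point can be expressed as a toy throughput model. The 25% IPC edge for Jaguar below is a placeholder assumption to make the point concrete, not a measured number:

```python
# Toy model: per-core general-purpose throughput ~ clock * relative IPC.

def throughput(cores, ghz, rel_ipc):
    return cores * ghz * rel_ipc

jaguar = throughput(cores=1, ghz=1.6, rel_ipc=1.25)  # assumed ~25% IPC edge
a53 = throughput(cores=1, ghz=2.0, rel_ipc=1.0)

print(jaguar, a53)  # roughly parity, core per core, as blu suggests
```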
 
To adequately answer that question takes probing the full setup - i.e. the afore-mentioned CPU and a TX1 GPU. Since I don't have access to either a Switch or a Shield, I really cannot give an adequate answer.

Jaguars still have the better IPC and the twice-wider SIMD ALUs. That said, 28nm A53s run happily at 2GHz (with the appropriate heat dissipation), so for general purpose code it's not that hard for the A53s to stay within 1.6GHz jags, core per core. Or did you mean something else?

I actually meant A57 in my second paragraph, my bad. But thanks for the answer.

My hands type a lot faster than my brain does at times.
 
You want to discuss semantics? Let's discuss semantics. Let's do it based on actual quotes.

http://www.eurogamer.net/articles/digitalfoundry-2016-nintendo-switch-spec-analysis



It's a direct quote from the docs, it belongs to Nintendo, not to Eurogamer.

http://www.eurogamer.net/articles/d...-boosts-handheld-switch-clocks-by-25-per-cent



Now tell me again how this works against the credibility of Eurogamer?

I didn't say it worked against the credibility of Eurogamer; you're imagining that. I said the argument that clocks can't change because Eurogamer has seen docs with final clocks (which plenty of people used) can't be used anymore. My point is that some of us said to be careful assuming these truly are final when the doc itself specifically mentions "clocks to use at launch". There is no reason to say "clocks to use at launch" if those clocks are 100% final.
 
Totally. But it's also a bit more complicated than just the install base. Games-wise, there will also be little competition out of the gate, Nintendo's own games excluded. So if the Switch gains traction, and there's some data showing that Switch consumers buy more than just Nintendo games (one of the biggest issues in the past), then it might be worth the risk to sell to the smaller, more content-starved audience rather than throw a game into an overcrowded marketplace elsewhere. Granted, with non-indie third parties, that's a very large financial risk.

I feel this is the biggest red herring. I think Nintendo fans would buy non-Nintendo games if they were of a similar quality to those on the other consoles; I know I would. The problem at the moment is that only Nintendo can put 100% into games there, because it is in their interest for the console to do well.

To port a COD/FIFA/Battlefield/GTA you have to make sacrifices the publishers are unwilling to make, because you would cut the game down so much it would no longer be attractive to consumers. If Nintendo only swallowed their pride and had power parity, it wouldn't be an issue at all.

I feel we will forever be stuck in this mediocrity spiral Nintendo is in, with regard to console sales, until they realise this. Up the power, reap the rewards.
 
You could argue that the Switch has mediocre specs; however, it has a really good excuse, in that it has to be a workable handheld in terms of battery life and of being a practical size. So the goal is to get the best bang per buck out of its form factor. You could argue that it needed to be 16nm with A72s, but I think that might have meant abandoning work already done and would have delayed the Switch further. Now, if Nintendo had gone for a pure home console, there would be no excuse for not going after the PS4/Xbox One this time round.
 
It looks like the exact same chip, just a slightly different serial code/printing layout.

Yeah, I was just pointing out that these aren't the X1's memory chips. The bandwidth is the same, but people are jumping the gun too much, whether to protect their feelings, to avoid getting angry, or whatever else stops people from speculating.
 
Have any of these tear downs answered the only question that matters?

That being can the default analog sticks be replaced with Xbox One analogs?
 
I never played Xenoblade Chronicles X, but it always looked quite amazing for a Wii U game. I am quite interested in seeing them take full advantage of a system that is, when docked, about 3-5 times as powerful.

I think the graphics in that game are overrated (I'm actually playing it right now). It looks better in videos and screenshots than when you actually are running around in the game's world yourself (too much pop-in).
 
I never played Xenoblade Chronicles X, but it always looked quite amazing for a Wii U game. I am quite interested in seeing them take full advantage of a system that is, when docked, about 3-5 times as powerful.

What they were able to achieve with X with the unfortunate hardware-imposed limitations they had to work with is nothing short of impressive.
 
What they were able to achieve with X with the unfortunate hardware-imposed limitations they had to work with is nothing short of impressive.

With the Switch they finally have hardware that can push a realistic style well, and they decided to do an anime style. It will be interesting to see what they do after XB2, because they've only been able to put out one title per console generation for Nintendo; but this game is coming so early, and the Switch is so easy to develop for, that I think they will get a second and even a third title out on the Switch.
 
I've asked this on another thread, but no one has bothered to answer me. So, with all the information we have so far on the Switch's hardware, can anyone tell me how it compares to other mobile SoCs like Apple's A10 or the Snapdragon 821?
 
With the Switch they finally have hardware that can push a realistic style well, and they decided to do an anime style. It will be interesting to see what they do after XB2, because they've only been able to put out one title per console generation for Nintendo; but this game is coming so early, and the Switch is so easy to develop for, that I think they will get a second and even a third title out on the Switch.

I think they will too. And that X cliffhanger must be resolved!
 
Don't know if this has been mentioned yet but Nintendo's Japanese support site says the Switch supports UHS-I Class 1 cards (just like those licensed HORI cards).

Q: 【Switch】使用できるmicroSDカードは何ですか?

A: Nintendo Switchは、microSD/microSDHC/microSDXCメモリーカードに対応しています。

また、UHS-I規格およびUHSスピードクラス1に対応しています。

"Q: What microSD cards could be used with the Switch?

A: Nintendo Switch can support microSD/microSDHC/microSDXC memory cards.

It also supports the UHS-I standard as well as UHS Speed Class 1."

--------------

On other news, the SD Association has just introduced the UHS-III standard which still utilizes the same layout as UHS-II cards.

[SD bus speed comparison chart: bus_speed.jpg]
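To put the quoted spec in practical terms: UHS-I caps the bus at roughly 104 MB/s, and UHS Speed Class 1 (U1) only guarantees a 10 MB/s minimum sequential write. A quick sketch of what that means for copy times; the 16 GB game size and the 90 MB/s "typical read" figure are illustrative assumptions:

```python
# Rough copy-time estimates for a hypothetical 16 GB game at a few
# microSD speeds relevant to the Switch's UHS-I / U1 support.

GAME_GB = 16  # ASSUMED download size, for illustration only

speeds = [
    ("U1 guaranteed minimum write", 10),  # MB/s, per the U1 spec
    ("typical good UHS-I read", 90),      # MB/s, illustrative
]

for label, mb_s in speeds:
    seconds = GAME_GB * 1024 / mb_s
    print(f"{label}: {seconds / 60:.1f} min")
```

So even a card that only meets the U1 floor is workable, but read speed is what you'd shop for.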
 
With the Switch they finally have hardware that can push a realistic style well, and they decided to do an anime style. It will be interesting to see what they do after XB2, because they've only been able to put out one title per console generation for Nintendo; but this game is coming so early, and the Switch is so easy to develop for, that I think they will get a second and even a third title out on the Switch.

Not completely correct on the past. I mean, that's kinda correct for Takahashi as a director, but it also applies to him before joining Nintendo, although his role in Xenosaga III was important.
As far as Monolith goes, due to their peculiar structure they've usually been able to output 4-5 "full" games per gen, more if they collaborate with others.

Currently it's not clear how deep their role is in the new Zelda, but since they went as far as putting it on their homepage beside XB2, I believe a good amount of personnel was "lent" to the Zelda team, and I'd factor that in when reasoning about the future.

Anyway, I'm not sure how heavy the development of XB2 is; the trailer totally gave me the impression of a smaller project than X and even XB were, one which can surely leave room for another important project to be in development in parallel (hopefully X-2).
It remains to be seen whether team #2 has some involvement in this, or whether they're another variable to be considered: although the two PxZ games weren't comparable to the Xeno games, they have proven able to output good/great secondary projects (without Takahashi) too, like Disaster or BK (and unlike Soma Bringer). The development context and experience from X should give a nice boost too, as you say.

tl;dr: I absolutely see Monolith releasing two more games besides Zelda and XB2 by 2020, and three could easily happen, since they always have on normal platforms (bad Wii U, bad).
 
I can't imagine XC2 being a smaller project/budget than Xenoblade, or even X, which had ugly, simple cutscenes. Shorter? Maybe, due to the massive increase in costs. But not worse.
 
So, would the Switch's support of UHS-II cards only be possible in a future hardware revision?

Depends on whether the pins are there or not in the current Switch. I'm hoping they are, and are just waiting for a firmware update to be put to use (similar to how only USB 2.0 is supported until a firmware update enables USB 3.0).
 
Don't know if this has been mentioned yet but Nintendo's Japanese support site says the Switch supports UHS-I Class 1 cards (just like those licensed HORI cards).



"Q: What microSD cards could be used with the Switch?

A: Nintendo Switch can support microSD/microSDHC/microSDXC memory cards.

It also supports the UHS-I standard as well as UHS Speed Class 1."

--------------

On other news, the SD Association has just introduced the UHS-III standard which still utilizes the same layout as UHS-II cards.

[SD bus speed comparison chart: bus_speed.jpg]

UHS-III looks sexy as hell. I hope future iterations of the Switch support higher UHS modes, even if it's just UHS-II, to get even faster loading times.
 
I've asked this on another thread, but no one has bothered to answer me. So, with all the information we have so far on the Switch's hardware, can anyone tell me how it compares to other mobile SoCs like Apple's A10 or the Snapdragon 821?
Well, I'll state the obvious: not favourably CPU-wise, and favourably (but not by much) GPU-wise.

I'm guessing he means the picture on Samsung's site which confirms it was the module used by the Switch.
Nobody saved that?
 
Don't know if this has been mentioned yet but Nintendo's Japanese support site says the Switch supports UHS-I Class 1 cards (just like those licensed HORI cards).



"Q: What microSD cards could be used with the Switch?

A: Nintendo Switch can support microSD/microSDHC/microSDXC memory cards.

It also supports the UHS-I standard as well as UHS Speed Class 1."

--------------

On other news, the SD Association has just introduced the UHS-III standard which still utilizes the same layout as UHS-II cards.

[SD bus speed comparison chart: bus_speed.jpg]

I'm not familiar with these things, so any recommendations for this type of card?
The higher the capacity the better; I'd prefer one over 128GB.
 
But reducing the amount of effort (i.e. ultimately to no extra optimisation cost) means providing the performance levels of a desktop, whatever that entails. Where do we draw the line with that?


A few screengrabs.

How well does that run on an Nvidia Shield TV? Just for shits and giggles. I realize that we're still about 20+ years (maybe even closer to 40, TBH, assuming that TV manufacturers stop at 8K for consumer devices and keep higher resolutions for archival and professional use only, since the human eye has issues with even 4K at current TV sizes) from real-time ray-traced lighting, but I'd still like to see how this tiny little thing pumps it out.
 
I've asked this on another thread, but no one has bothered to answer me. So, with all the information we have so far on the Switch's hardware, can anyone tell me how it compares to other mobile SoCs like Apple's A10 or the Snapdragon 821?

Bearing in mind that:
1) The Shield TV throttles a bit under combined load, but not down to Switch clocks; see the Shield throttling tests somewhere in here.
2) The Switch is clocked at half the full Shield CPU speed, and at 35-70% of the full GPU clock (undocked/docked)

Shield
[Shield benchmark chart: 74671.png]

A10
[A10 benchmark chart: 83906.png]


Shield
https://www.futuremark.com/hardware/mobile/NVIDIA+Shield+Android+TV/review
A10
https://www.futuremark.com/hardware/mobile/Apple+iPhone+7/review


So, the short of it: the A10 rolls even the Shield TV's CPU at full clocks, so the Switch has little chance there. The Shield TV enjoys enough thermal headroom to still beat the iPhone 7 on the GPU side; at docked clocks the Switch would be close-ish, while undocked it's obviously at around half of its full performance.
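A crude way to place the Switch against those Shield numbers is to scale by clock ratio alone. The ~1000 MHz figure is the commonly cited TX1 maximum GPU clock; this ignores bandwidth limits and throttling, so treat it as an upper-bound sketch:

```python
# Scale Shield GPU throughput down by the Switch's known GPU clocks.

shield_gpu_mhz = 1000.0  # approximate TX1 max GPU clock (assumed)

for mode, mhz in [("docked", 768.0), ("handheld", 307.2)]:
    ratio = mhz / shield_gpu_mhz
    print(f"{mode}: ~{ratio:.0%} of full Shield GPU throughput")
# Assumes performance scales linearly with clock, which real
# benchmarks (memory-bound cases especially) won't quite match.
```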



It's crazy that Apple just decided to get serious about building ARM cores and started consistently rolling everyone year over year. I guess that's what happens when you buy PA Semi and Intrinsity and actually keep the talent and manage it well. I think an Apple TV that kept up to date with iPhone chips, maybe with a heatsink and active cooling, and with more storage, would be one beast of a microconsole.
 
Sweet, thanks for sharing those.
Sure. Funnily enough, that started life as an experiment in building a real-time ray accelerator (sparse octrees as an acceleration structure, built per frame) and ended up as a Tetris, which eventually got OpenCL and network-distributed versions (it's hard to come by 8-core+ desktops outside the office; hopefully things will change come Ryzen). The code even got used in a public event intro, but most importantly my kids like playing it ; )

How well does that run on an Nvidia Shield TV? Just for shits and giggles. I realize that we're still about 20+ years (maybe even closer to 40, TBH, assuming that TV manufacturers stop at 8K for consumer devices and keep higher resolutions for archival and professional use only, since the human eye has issues with even 4K at current TV sizes) from real-time ray-traced lighting, but I'd still like to see how this tiny little thing pumps it out.
It may sound weird, but this was mostly coded on a dual-core Bobcat before it got moved to big iron. So even though I haven't run it on a Shield, I have an idea how it'd perform. On the Bobcat, to get a playable framerate (~12fps) I had to reduce the framebuffer to 128x128 and drop the AO ray count to 16 (per pixel). For comparison, the screengrabs show 64 and 256 AO rays per pixel (first 3 and second 3, respectively). So a Shield CPU should hypothetically be able to run this at 256x256 at somewhat higher fps, but definitely not at 64 AO rays, and 256 is strictly GPGPU territory ; )
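Those numbers imply the AO cost scales roughly with pixels × rays, which makes the relative workloads easy to compare against the playable Bobcat baseline:

```python
# Relative AO tracing cost (pixels * rays per pixel), normalised to the
# configuration blu said was playable on the Bobcat (~12 fps).

base = 128 * 128 * 16  # 128x128 framebuffer, 16 AO rays per pixel

configs = [
    ("Bobcat baseline (128x128, 16 rays)", 128, 16),
    ("Screengrabs, first 3 (256x256, 64 rays)", 256, 64),
    ("Screengrabs, second 3 (256x256, 256 rays)", 256, 256),
]

for label, res, rays in configs:
    cost = (res * res * rays) / base
    print(f"{label}: {cost:.0f}x baseline cost")
```

Which lines up with the post: 64 rays at 256x256 is 16x the playable workload, and 256 rays is 64x, hence "strictly GPGPU territory".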
 
Don't know if this has been mentioned yet but Nintendo's Japanese support site says the Switch supports UHS-I Class 1 cards (just like those licensed HORI cards).



"Q: What microSD cards could be used with the Switch?

A: Nintendo Switch can support microSD/microSDHC/microSDXC memory cards.

It also supports the UHS-I standard as well as UHS Speed Class 1."

--------------

On other news, the SD Association has just introduced the UHS-III standard which still utilizes the same layout as UHS-II cards.

[SD bus speed comparison chart: bus_speed.jpg]

A bit sad that it doesn't support UHS-II cards, which would make the console more future-proof, but oh well. USB-C wasn't expected either, so we are still lucky :)
 
So the short of it would be the A10 rolls even the Shield TVs CPU at full clocks, the Switch has little chance there. The Shield TV enjoys enough thermal overhead to still be beating the iPhone 7 on the GPU side, but at docked clocks the Switch would be close-ish, while undocked obviously it's around half of its full performance.

Implying that the iPhone doesn't throttle as well outside of 2-minute-long benchmarks?
 
Implying that the iPhone doesn't throttle as well outside of 2-minute-long benchmarks?

Far less than the TX1 or any other full-clocked high-performance mobile ARM chip out there (the Switch clocks were obviously chosen to never throttle).

[iPhone 6s throttling chart: iphone6s-1.png]


[iPhone 6s Plus GFXBench T-Rex rundown chart: iPhone6sPlus_TRex_Rundown.png]


That's the 6s; I can't find the 7, but it got even better about throttling.
 