Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

...

Urban legend SoCs aside, the article raises more questions than it answers.
Yep, it is like we switched out the unknown of the clocks for the unknown of the Cuda Core count.
The final devkit/retail unit may have removed the fan. That would explain all of those.

If it still has a fan though then it kind of has to have more CUDA cores. Otherwise that's a ridiculous waste of space and added cost. And a point of failure which seemingly is completely unnecessary. Considering the DF article only has info on the clock speed I'm gonna hope for more SMs.

And likely be disappointed again.

Even then, it has a vent; they would have to close that up. I mean, NOA is in the Seattle area, and the vent is at the top of the device. How did this pass QA testing if it didn't need active cooling?
 
...

Urban legend SoCs aside, the article raises more questions than it answers.

Yes, the clock speeds are surprising, but they don't really tell us about the specs in full.

Is it really just a downclocked X1? Are there really no other customisations to it, like the SM setup?

The annoying thing is still no confirmation about the CPU cores; I guess we're really getting A57 as opposed to A72?
 
You have to take into consideration that the Switch will be running under constant load to achieve its max performance, which would be the reason it needs to be actively cooled. The Pixel C most likely would never be that stressed running anything in its library.
 
It would draw less than 4 watts (the 3DS XL draws about 4 watts), yet it has a fan and is much thicker than the 3DS. This news doesn't make sense with what we assumed before (4 A57 cores and 256 CUDA cores).
The main role of the fan is probably to cool the device in console mode. The idea of running the fan at a low RPM while in portable mode does seem a little bit excessive though, especially with the leaked low clock speeds.
 
You have to take into consideration that the Switch will be running under constant load to achieve its max performance, which would be the reason it needs to be actively cooled. The Pixel C most likely would never be that stressed running anything in its library.

Haven't benchmarks shown that the Pixel C doesn't do this?

The main role of the fan is probably to cool the device in console mode. The idea of running the fan at a low RPM while in portable mode does seem a little bit excessive though, especially with the leaked low clock speeds.

But even this docked speed is lower than the Pixel C's GPU speed, which does not require a fan. A fan in a portable device is a big commitment, as it takes up quite a bit of space, adds cost, and becomes a very common point of failure. If this could get by with passive cooling they most certainly would not have included a fan.
 
Haven't benchmarks shown that the Pixel C doesn't do this?



But even this docked speed is lower than the Pixel C's GPU speed, which does not require a fan. A fan in a portable device is a big commitment, as it takes up quite a bit of space, adds cost, and becomes a very common point of failure. If this could get by with passive cooling they most certainly would not have included a fan.
As you said, the Pixel C is passively cooled, which means that it probably throttles its GPU clock speed when reaching a certain temperature. Something that a dedicated handheld gaming device can't afford to do, hence the ability to control the temperature with an active cooling solution like a fan.
 
As you said, the Pixel C is passively cooled, which means that it probably throttles its GPU clock speed when reaching a certain temperature. Something that a dedicated handheld gaming device can't afford to do.

But like I was saying, haven't people been sharing benchmarks that show the Pixel C runs a certain program at its peak clock speed consistently? Meaning, it doesn't necessarily throttle its GPU clocks?
 
But like I was saying, haven't people been sharing benchmarks that show the Pixel C runs a certain program at its peak clock speed consistently? Meaning, it doesn't necessarily throttle its GPU clocks?
But are we talking about the CPU or GPU running at peak clock speed? Fully loading the CPU for a long period of time shouldn't be a problem. The GPU, on the other hand, generates a lot of heat.
 
Doesn't the Pixel C use the case as a heat sink, aka it gets pretty fucking hot under heavy loads, based on reviews?

Not sure if that is so safe if you want to make a dedicated gaming platform that's at peak load all the time.
 
Doesn't the Pixel C use the case as a heat sink, aka it gets pretty fucking hot under heavy loads, based on reviews?

Not sure if that is so safe if you want to make a dedicated gaming platform that's at peak load all the time.
And since it's a tablet, there's a lot of surface area to dissipate all that heat.
 
But are we talking about the CPU or GPU running at peak clock speed? Fully loading the CPU for a long period of time shouldn't be a problem. The GPU, on the other hand, generates a lot of heat.

It was running the Manhattan demo, which loads both GPU and CPU. It consumed 8 watts, btw, and the CPU is clocked at 1.9 GHz, which actually consumes 4 of those watts; at 1 GHz it would consume ~1 watt. This device doesn't need a vent, much less a fan. It doesn't make sense if this is just an X1 configuration at all.

Doesn't the Pixel C use the case as a heat sink, aka it gets pretty fucking hot under heavy loads, based on reviews?

Not sure if that is so safe if you want to make a dedicated gaming platform that's at peak load all the time.

The Pixel C is 7mm, the Switch is 15mm. The Pixel C's CPU alone has much, much higher power draw (about 400% more), and the GPU clocked at 768 MHz (when docked) would still only draw around 2 watts. A ~3 watt SoC with a fan and heat sink, and the vent is on the top of the device? There is just no way that it makes technical sense. I mean, you can put a fan on a calculator, but I'd have the same questions.
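For what it's worth, here's a rough sketch of the kind of scaling math behind these wattage claims. The ~4 W CPU figure is this thread's estimate from the Pixel C benchmark run; the 4 W @ 1 GHz GPU baseline is a placeholder of mine, and the power-vs-frequency exponents are crude assumptions either way:

```python
# Crude dynamic-power scaling for the wattage guesses in this thread.
# Dynamic power ~ f * V^2; if voltage also drops with frequency, power
# falls off much faster than linearly (exponent ~3 below). None of the
# baseline figures are confirmed.

def scale_power(base_watts, base_mhz, target_mhz, exponent=3.0):
    """exponent=1 ignores voltage scaling; exponent=3 assumes voltage
    drops roughly with frequency as well (very rough)."""
    return base_watts * (target_mhz / base_mhz) ** exponent

# Thread's estimate: the 4x A57 cluster draws ~4 W at 1.9 GHz in the Pixel C.
cpu_base_w, cpu_base_mhz = 4.0, 1900

# Hypothetical baseline: a 2 SM Maxwell GPU drawing ~4 W at 1 GHz
# (my placeholder figure, not a measured number).
gpu_base_w, gpu_base_mhz = 4.0, 1000

print(f"CPU @ 1020 MHz: {scale_power(cpu_base_w, cpu_base_mhz, 1020):.1f}"
      f"-{scale_power(cpu_base_w, cpu_base_mhz, 1020, 1.0):.1f} W")        # ~0.6-2.1 W
print(f"GPU @ 768 MHz (docked):     ~{scale_power(gpu_base_w, gpu_base_mhz, 768):.1f} W")   # ~1.8 W
print(f"GPU @ 307.2 MHz (portable): ~{scale_power(gpu_base_w, gpu_base_mhz, 307.2):.1f} W") # ~0.1 W
```

On those assumptions the whole SoC lands somewhere in the ~2-4 W range when docked, which is the "a ~3 watt SoC with a fan?" point being made above.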
 
I'm rather confused by the eurogamer article given other information. /shrug

In fact these specs don't make too much sense with Matt's info, or Laura's...
 
I'm rather confused by the eurogamer article given other information. /shrug

In fact these specs don't make too much sense with Matt's info, or Laura's...
Or anybody's, for that matter.

I'm wondering how many wiiU up-ports might have a CPU issue as well..
 
Or anybody's, for that matter.

I'm wondering how many wiiU up-ports might have a CPU issue as well..

What are we thinking is off? Eurogamer's article is only clocks, they don't have actual guts info as far as I can tell.

But that's even off from the stuff in this very thread [OP], which is really bizarre.
 
What are we thinking is off? Eurogamer's article is only clocks, they don't have actual guts info as far as I can tell.

But that's even off from the stuff in this very thread [OP], which is really bizarre.
Yeah, ultimately the clock speeds alone mean very little.
 
So, since it's going to be like last time with the Wii U, with nobody being able to confirm anything until we see pictures of the chip itself, how long would it be after release before we know how customized it is? One month?
 
I'm rather confused by the eurogamer article given other information. /shrug

In fact these specs don't make too much sense with Matt's info, or Laura's...

Me too. Did they do a last minute downclock?

I don't understand: why would Nintendo customize a Tegra X1 to make it less powerful? Honest question.
 
What are we thinking is off? Eurogamer's article is only clocks, they don't have actual guts info as far as I can tell.

But that's even off from the stuff in this very thread [OP], which is really bizarre.
Very much. While the GPU's still largely an unknown, CPU clocks are the most striking discrepancy: not only does the suggested CPU complex not outdo those Jaguars (and it's far from the stock TX1), it could even have issues with wiiU up-ports, which could be CPU-limited now, depending on bad star alignment. Truly bizarre.
 
Or anybody's, for that matter.

I'm wondering how many wiiU up-ports might have a CPU issue as well..

I remember this from matt:

I don't think there will be technical limitations preventing pretty much any game from showing up on the Switch.

http://www.neogaf.com/forum/showpost.php?p=225430134&postcount=551
Of course they both could. Maybe not as dense for Unity, I don't know, but it certainly wouldn't be a night and day difference.


With just the frequencies, I find this article from Eurogamer really not telling at all (especially if we base our information on past Matt posts).
 
So I guess this generation's WUST was this here NSDKSL (Nusdicksul)?
 
Digital Foundry pretty much confirmed the specs from the first post and is suggesting Nintendo is briefing developers with the same specs as well.

Dang we just went circles for over 100 pages for nothing lol
 
Digital Foundry pretty much confirmed the specs from the first post and is suggesting Nintendo is briefing developers with the same specs as well.

No, all they said is the same thing that Emily Rogers and NateDrake have said, which is that the specs in the OP look "pretty close" to what the final specs are. We have absolutely no idea what the major differences are, and Digital Foundry explicitly said they have no idea what customizations have been made.
 
I'm optimistic about the €199-249 price range.
Will be a great platform for that price

The price won't change because of clocks; they are meaningless to the price of the device. Core count and active vs. passive cooling (Switch uses the more expensive option here) are what matter. You'll still be lucky if this is $250 (USD).
 
Digital Foundry pretty much confirmed the specs from the first post and is suggesting Nintendo is briefing developers with the same specs as well.

Dang we just went circles for over 100 pages for nothing lol

No, they haven't; in fact, they've confirmed completely different spec targets than the OP.
 
Going over the specs further, before Switch was revealed, some information popped up on Twitter that was thought to be out of date / fake. However, Digital Foundry says that “Nintendo has briefed developers recently with the same information.” Almost everything was on point besides the 4K30 aspect of the spec.

These were the specs that had surfaced on Twitter:

CPU: Four ARM Cortex A57 cores, max 2GHz
GPU: 256 CUDA cores, maximum 1GHz
Architecture: Nvidia second generation Maxwell
Texture: 16 pixels/cycle
Fill: 14.4 pixels/cycle
Memory: 4GB
Memory Bandwidth: 25.6GB/s
VRAM: shared
System memory: 32GB, max transfer rate: 400MB/s
USB: USB 2.0/3.0
Video output: 1080p60/4K30
Display: 6.2-inch IPS LCD, 1280×720 pixels, 10-point multi-touch support

VS

Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch. #1
https://twitter.com/NWPlayer123/stat...16886109655041

Four ARM Cortex-A57 cores, max 2GHz
NVidia second-generation Maxwell architecture
256 CUDA cores, max 1 GHz, 1024 FLOPS/cycle
4GB RAM (25.6 GB/s, VRAM shared)
32 GB storage (Max transfer 400 MB/s)
USB 2.0 & 3.0
1280 x 720 6.2" IPS LCD
1080p at 60 fps or 4k at 30 fps max video output
Capacitance method, 10-point multi-touch
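As an aside, the "256 CUDA cores, max 1 GHz, 1024 FLOPS/cycle" line in that list works out if you read it as FP16 throughput. The double-rate FP16 interpretation is my reading of second-generation Maxwell (TX1), not something the leak itself spells out:

```python
# One reading of "256 CUDA cores, max 1 GHz, 1024 FLOPS/cycle".
# Assumes one FMA (2 FLOPs) per CUDA core per cycle for FP32 and
# double-rate FP16, as on the Tegra X1.

cuda_cores = 256
clock_ghz = 1.0

fp32_per_cycle = cuda_cores * 2        # 512 FLOPS/cycle
fp16_per_cycle = fp32_per_cycle * 2    # 1024 FLOPS/cycle, matching the leaked line

print(f"FP32: {fp32_per_cycle} FLOPS/cycle -> {fp32_per_cycle * clock_ghz:.0f} GFLOPS @ 1 GHz")
print(f"FP16: {fp16_per_cycle} FLOPS/cycle -> {fp16_per_cycle * clock_ghz:.0f} GFLOPS @ 1 GHz")
```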
 
No, they haven't; in fact, they've confirmed completely different spec targets than the OP.

Right, that's a good point. The CPU is clocked at 2GHz in the OP here and the GPU is clocked at 1GHz.

People seem to be forgetting that the leaker from the OP has explicitly said on Twitter that these specs are not leaks; they are just guesses and speculation. I don't know why this is still being taken as a leak in this case.
 
The interesting part is that the clock speed of the Switch according to Digital Foundry is 1 GHz vs. a Shield TV, which runs at 2 GHz (which the above suggested). That does not sound good...
 
Right, that's a good point. The CPU is clocked at 2GHz in the OP here and the GPU is clocked at 1GHz.

People seem to be forgetting that the leaker from the OP has explicitly said on Twitter that these specs are not leaks; they are just guesses and speculation. I don't know why this is still being taken as a leak in this case.

The OP was always the max clocks for the Tegra X1. If Nintendo went with lower clocks, perhaps they went wider with more CPU + GPU cores than standard to overcome some of the power consumption deficiencies of 20nm. Otherwise we're looking at a device that will essentially be limited to Nintendo games once again, barring 3rd parties willing to take a ground up approach on the platform.
 
The OP was always the max clocks for the Tegra X1. If Nintendo went with lower clocks, perhaps they went wider with more CPU + GPU cores than standard to overcome some of the power consumption deficiencies of 20nm. Otherwise we're looking at a device that will essentially be limited to Nintendo games once again, barring 3rd parties willing to take a ground up approach on the platform.

But that is the thing: we already have rumors about 3rd party games running fine on the Switch. We don't have a complete picture here. It looks like the configuration has changed from 2SM to 3SM or even 4SM, and possibly to more CPU cores / A72 cores; this thing just doesn't need expensive active cooling if it is just an X1 with those clocks.
 
The OP was always the max clocks for the Tegra X1. If Nintendo went with lower clocks, perhaps they went wider with more CPU + GPU cores than standard to overcome some of the power consumption deficiencies of 20nm. Otherwise we're looking at a device that will essentially be limited to Nintendo games once again, barring 3rd parties willing to take a ground up approach on the platform.

Which goes against what pretty much every insider has said. I'm hopeful that we'll see 3SMs somehow but it's just odd that they'd increase the price so much when just upping the clocks would give them a comparable power range. Maybe adding SMs is cheaper than increasing the battery size?

Either way, if this thing has even one fan, let alone two as reported by LKD, the specs discussed by Digital Foundry make absolutely no sense.
 
According to Digital Foundry's exclusive leaked info:

Undocked: 1020 MHz CPU / 307.2 MHz GPU / 1331-1600 MHz RAM (developer can choose)
Docked: 1020 MHz CPU / 768 MHz GPU / 1600 MHz RAM
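Assuming a stock 2 SM / 256 CUDA core Maxwell GPU (which the article does not confirm), those clocks translate to roughly:

```python
# FP32 throughput at the Digital Foundry clocks, *assuming* a stock
# 2 SM / 256 CUDA core Maxwell setup; the core count is not in the article.

def gflops_fp32(cores, clock_mhz):
    return cores * 2 * clock_mhz / 1000.0   # 2 FLOPs per core per cycle (FMA)

cores = 256
print(f"Portable, 307.2 MHz: ~{gflops_fp32(cores, 307.2):.0f} GFLOPS")  # ~157
print(f"Docked,   768 MHz:   ~{gflops_fp32(cores, 768):.0f} GFLOPS")    # ~393
print(f"Stock TX1, 1000 MHz:  {gflops_fp32(cores, 1000):.0f} GFLOPS")   # 512
```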
 
Clocks are a bit lower than my predictions (especially the CPU ones) but in the range I was expecting. This does, however, solve some pieces of the puzzle for me:

* As I expected, CPU clocks are the same in docked/portable mode.
* GPU clocks: 307.2 vs 768 MHz. Now this is the best thing out of the news: the GPU clock ratio is x2.5 when docked, which is over the 2.25 ratio between 720p and 1080p, so this strongly points to the system being designed to give a 720p experience on the go and a full 1080p one on the TV (quick math at the end of this post).

The Jimmy Fallon demo was hard to judge, but these clocks put my 1080p BotW dream really, really close to being real!!!
* Memory downclocking was expected, as the GPU doesn't need as much bandwidth if it's working at lower clocks.

Still a mystery:

* Portable active cooling: With those clocks it's puzzling why the heck it needs active cooling.
-> We could speculate that it's truly still using 20nm as the node because they got a really nice deal with some semiconductor plant, like half the cost per chip vs. 14/16nm nodes.
* Heavy customization?: My theory is that there isn't any. I was rooting for memory setup changes, but they don't seem as necessary with those clocks; the base 25.6 GB/s bandwidth may even do the trick. More CPU cores, on the other hand, would really make sense, but seeing the overall design decisions I'm doubting them as well.
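The quick math behind the clock-ratio point above, for anyone who wants to check it:

```python
# Docked/portable GPU clock ratio vs. the 1080p/720p pixel-count ratio.

portable_mhz, docked_mhz = 307.2, 768.0
clock_ratio = docked_mhz / portable_mhz        # 2.5

pixel_ratio = (1920 * 1080) / (1280 * 720)     # 2.25

print(f"GPU clock ratio (docked/portable): {clock_ratio:.2f}")
print(f"Pixel ratio (1080p/720p):          {pixel_ratio:.2f}")
# 2.5 > 2.25: slightly more per-pixel GPU headroom docked than portable,
# which is what the 720p-portable / 1080p-docked reading rests on.
```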
 
But that is the thing: we already have rumors about 3rd party games running fine on the Switch. We don't have a complete picture here. It looks like the configuration has changed from 2SM to 3SM or even 4SM, and possibly to more CPU cores / A72 cores; this thing just doesn't need expensive active cooling if it is just an X1 with those clocks.

There's nothing to indicate that the cooling system is particularly elaborate.

I could see them going wider if it could be done cheap enough, but it's just idle speculation. One of the many ways to avoid the heat and power consumption penalties of 20nm in comparison to 16nm. It could very well be that they just decided they didn't need the additional performance. It wouldn't be the first time.
 
But that is the thing: we already have rumors about 3rd party games running fine on the Switch. We don't have a complete picture here. It looks like the configuration has changed from 2SM to 3SM or even 4SM, and possibly to more CPU cores / A72 cores; this thing just doesn't need expensive active cooling if it is just an X1 with those clocks.

Weren't those rumors just that the games can "run" on Switch? That really doesn't say anything about performance.
 
Which goes against what pretty much every insider has said. I'm hopeful that we'll see 3SMs somehow but it's just odd that they'd increase the price so much when just upping the clocks would give them a comparable power range. Maybe adding SMs is cheaper than increasing the battery size?

Either way, if this thing has even one fan, let alone two as reported by LKD, the specs discussed by Digital Foundry make absolutely no sense.
I think it has more to do with overheating than with the battery. Also, a bigger battery leaves you with less space for other components.

3SM would make much more sense: it would justify the fans, and at least when docked it would be in line with our expectations (actually a bit better than the stock TX1, ~600 GFLOPS vs 512), but the undocked and CPU clocks are still incredibly bad. I just don't understand.
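For reference, the "600 GFLOPS vs 512" comparison works out like this (FP32, 128 CUDA cores per Maxwell SM, 2 FLOPs per core per cycle); the 3 SM configuration is speculation, not something DF reported:

```python
# Speculative 3 SM docked throughput vs. a stock Tegra X1 (FP32).

CORES_PER_SM = 128

def gflops_fp32(sms, clock_mhz):
    return sms * CORES_PER_SM * 2 * clock_mhz / 1000.0

print(f"3 SM @ 768 MHz (speculative, docked): ~{gflops_fp32(3, 768):.0f} GFLOPS")  # ~590
print(f"2 SM @ 1000 MHz (stock TX1):           {gflops_fp32(2, 1000):.0f} GFLOPS")  # 512
```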

Not that I expect 3SM anyway; this could just be a series of bad choices/priorities like previously seen with the Wii U. I don't think they realize what kind of shitstorm they'll see after the first DF analysis of third-party games, starting with Skyrim. Which is hilarious considering the exact same thing helped kill the Wii U's image for "core" gamers.
 
I'm wondering how many wiiU up-ports might have a CPU issue as well..

I might have asked a long time ago, but how do the A57 cores compare to PPC 750 cores (instr issue, pipelines, FPU, integer, cashews etc.)? Surely the OOOE in A57 is significantly better. :P
 
To be fair, that could mean unlocked 20-30fps. From gives no fucks about consistent performance.

That is still a pretty good result if the Switch really does have wiiu power when undocked, though.

Edit: actually a bit more powerful than wiiu considering wiiu's older architecture.
 
Very odd how we haven't heard any negative comments from third party devs regarding the power/capabilities of the Switch, this close to launch. Not even comments about difficulties getting games running at desired levels etc. Unless I missed them?

Feels like we had two or three high profile examples at this point with the Wii U.
 
Very odd how we haven't heard any negative comments from third party devs regarding the power/capabilities of the Switch, this close to launch. Not even comments about difficulties getting games running at desired levels etc. Unless I missed them?

Feels like we had two or three high profile examples at this point with the Wii U.

No games planned confirmed
 
LOL...gaf insiders are such a joke. Sorry.
You realize that everything reported is just "rumor", right? And even if you take this article at face value, it leaves a lot of info out, and they themselves don't have any idea what's inside the box. You realize that, right? And also, one of these insiders has pretty much been right all along, like LKD. There's more to the picture, that much I am sure of.

Folks should calm down.
 
That is still pretty miraculous if the Switch really does have wiiu power when undocked, though.

I think most AAA-style games that make it to Switch will be running well under native resolution while undocked. It won't look that bad, necessarily, but expecting 720p is a pipe dream if we're dealing with an underclocked TX1.
 
Can someone explain to me how they came up with the 1020 MHz CPU clock and the GPU clock speeds? It looks to me like they are just guessing these due to the fact that it's smaller than the Shield Android TV.

Furthermore, their entire line of argumentation is almost non-existent, since they always rely on rumors (spec info from Twitter and that VentureBeat article).

So I'm kinda shocked that everybody is losing their minds about this! The CPU and GPU speeds can very well be higher than that.
 