Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

The GPUs in the PS4 and Xbone were cutting edge when they were announced. The GPU in the PS4 Pro compares to a mid/high-end Radeon RX 480.

The Switch is not even being released until 2017, and these leaks claim GPU tech from over a year ago.

I think people are wishing too hard for the Switch to have both an A72 CPU and a Pascal-based GPU (the latter being the more likely of the two), so I think an A57 and a Maxwell-based GPU is probably what Nintendo went with.

It'll be whatever hits a good consumer price and profit for Nintendo with passable graphics.

How can you say it's underpowered if it's one of the highest end portable devices out there?
Ahh but remember, this is Nintendo's next Home Console, not a handheld.
 
Pascal has been on the (automotive) market since January. The rumours had Switch production starting with trials sometime in September. There is plenty of time for Pascal.

So two months before the final hardware went into production developers were still using X1 dev kits? I'm starting to think NS is basically a handheld SHIELD Android TV.

For comparison, Orbis dev kits with the final Liverpool APU were sent to developers in January 2013, PS4 went into full production 7 months later in August.
 
So two months before the final hardware went into production developers were still using X1 dev kits? I'm starting to think NS is basically a handheld SHIELD Android TV.

For comparison, Orbis dev kits with the final Liverpool APU were sent to developers in January 2013, PS4 went into full production 7 months later in August.
And this comparison matters how?
 
How long a battery life do people reasonably expect and want this device to have? I guess I'm in the minority here, but unless I take it on a camping trip or something, I can't think of a situation where I'll be gaming continuously for more than 3 hours in mobile mode without access to a wall outlet or power source.

I guess roadtrips? Even then, though, motion sickness would probably kick in before I run outta juice.

My thoughts exactly.

I think I don't play that much even on my 3DS.
 
Have you missed that it sits in a dock which increases performance? Nobody believes it will have a 1 TFLOP FP32 GPU in mobile mode (though technically, with the newest mobile hardware, it may be possible). But 1 TFLOP in docked mode is more than possible.

Even in docked mode, the cooling system would need to be some super cutting-edge technology not even used in the best laptops to cool a 1 TF FP32 GPU in such a compact area, and from the video you can clearly see there is only one vent. Most new laptops with a decent cooling system have at least two fans (one for the CPU and one for the GPU) and several vents to exhaust heat. And don't forget we are talking about a device the size of a tablet, where heat gets trapped easily.

And I don't think there will be magic sauce inside the dock to magically remove the accumulated heat from the unit. It is all about thermals, hence it is quite impossible to fit a 1 TF FP32 GPU inside such a device unless they use some cutting-edge technology not available to the public, which is quite improbable because it would raise the price of the unit considerably. I am just being realistic.
 
Even in docked mode, the cooling system would need to be some super cutting-edge technology not even used in the best laptops to cool a 1 TF FP32 GPU in such a compact area, and from the video you can clearly see there is only one vent. Most new laptops with a decent cooling system have at least two fans (one for the CPU and one for the GPU) and several vents to exhaust heat. And don't forget we are talking about a device the size of a tablet, where heat gets trapped easily.

And I don't think there will be magic sauce inside the dock to magically remove the accumulated heat from the unit. It is all about thermals, hence it is quite impossible to fit a 1 TF FP32 GPU inside such a device unless they use some cutting-edge technology not available to the public, which is quite improbable because it would raise the price of the unit considerably. I am just being realistic.

From the Nvidia Switch thread:

Not that I expect Switch to hit XBO or PS4 performance levels, but it would be entirely technically possible for them to do so within a 10W envelope. The GP104 Pascal GPU (which is 20 SMs, or 2560 "cores") consumes 36W at 1060 MHz. On that basis, an appropriately scaled-down Pascal GPU with 6 SMs (768 cores) should be able to achieve ~1GHz within 10W, for 1.5 TF of FP32 or 3 TF of FP16. Easily the match of PS4 or XBO, provided the CPU/RAM/etc. are up to it.

As I say, I'm certainly not expecting that (perhaps half the performance is plausible), but there's a lot to be gained from a wide application of an energy-efficient architecture on a new node with a modest clock speed.

So it's certainly technically possible. Just likely not cheap.
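
To put rough numbers on the arithmetic quoted above, here's a quick back-of-the-envelope sketch. The SM count, clock and double-rate FP16 assumption all come from that post; nothing here is a confirmed Switch spec.

```python
# Back-of-the-envelope throughput check for the hypothetical 6-SM Pascal part
# described in the quoted post. None of these numbers are confirmed Switch specs.

CORES_PER_SM = 128             # CUDA cores per SM on consumer Pascal (GP104-style)
FLOPS_PER_CORE_PER_CLOCK = 2   # one fused multiply-add counts as 2 FLOPs

def fp32_tflops(sm_count: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOPS for a given SM count and clock."""
    return sm_count * CORES_PER_SM * FLOPS_PER_CORE_PER_CLOCK * clock_ghz / 1000.0

sms, clock = 6, 1.0              # the post's hypothetical 768-core part at ~1 GHz
fp32 = fp32_tflops(sms, clock)
fp16 = fp32 * 2                  # assumes double-rate FP16, as the post does

print(f"{sms} SMs ({sms * CORES_PER_SM} cores) @ {clock} GHz: "
      f"~{fp32:.2f} TFLOPS FP32, ~{fp16:.2f} TFLOPS FP16")
# -> ~1.54 TFLOPS FP32 and ~3.07 TFLOPS FP16, matching the 1.5/3 TF figures above
```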
 
Do we know (or have reason to suspect) if the dock has a fan that blows air up through that center vent in the tablet? If it does then I don't see why they can't basically max that chip out when docked. That would also explain the vertical orientation of the docked system.
 
How can you say it's underpowered if it's one of the highest end portable devices out there?

Underpowered home console. How is that so difficult to understand? They've said the NS is first and foremost a home console (although in reality it is of course a (pretty large) mobile device which can be docked for TV-out).
 
And this comparison matters how?

What do you think? It's a fairly obvious comparison.

PS4 dev kits were on final hardware 7 months before production started.

If NS started production in September, dev kits with the final SoC should have been available months in advance. So why then do the July leaks point to the X1 still being used in dev kits?
 
From the Nvidia Switch thread:



So it's certainly technically possible. Just likely not cheap.

Technically it is possible; however, there are still other factors to consider for the TDP of the whole device, and every component will produce heat, since any current through a resistance dissipates power (P = I²R).
So as I said before, unless they make it a premium item like the Surface Pro and charge a premium price, a high-performance detachable device is quite improbable.
 
What do you think? It's a fairly obvious comparison.

PS4 dev kits were on final hardware 7 months before production started.

If NS started production in September, dev kits with the final SoC should have been available months in advance. So why then do the July leaks point to the X1 still being used in dev kits?

The NS most certainly is not "finalized" with a Jetson X1 board that I can, right now, go and buy on Amazon. In fact the very notion that a commonly available, non-custom board is somehow the final kit for a custom board is ludicrous. Logic aside, the grammar of the sentence doesn't even make any sense.
 
So two months before the final hardware went into production developers were still using X1 dev kits? I'm starting to think NS is basically a handheld SHIELD Android TV.

For comparison, Orbis dev kits with the final Liverpool APU were sent to developers in January 2013, PS4 went into full production 7 months later in August.
All this tells me is PS4 used well established (rather than bleeding edge) tech. Xbox 360 XDKs didn't get final hardware until July/August 2005 (launch in Nov 2005) for comparison.
 
What do you think? It's a fairly obvious comparison.

PS4 dev kits were on final hardware 7 months before production started.

If NS started production in September, dev kits with the final SoC should have been available months in advance. So why then do the July leaks point to the X1 still being used in dev kits?
The leaks also pointed to them having noisy active cooling, so it's possible they were overclocked to approximate the power of the final chip.

Could be Nvidia is really cutting it close with getting the architecture they're using into production. Or at least I'd like to think it's that cutting-edge.
 
The NS most certainly is not "finalized" with a Jetson X1 board that I can, right now, go and buy on Amazon. In fact the very notion that a commonly available, non-custom board is somehow the final kit for a custom board is ludicrous. Logic aside, the grammar of the sentence doesn't even make any sense.

Sorry, which part are you having issues with?
 
What do you think? It's a fairly obvious comparison.

PS4 dev kits were on final hardware 7 months before production started.

If NS started production in September, dev kits with the final SoC should have been available months in advance. So why then do the July leaks point to the X1 still being used in dev kits?

Nvidia has confirmed that the Switch uses a custom SoC. The devkits we know about use stock TX1s. Therefore we have confirmation that the devkits we know about (which could have been sent out even earlier than July, when the report came out) are not using finalized hardware.

Technically it is possible; however, there are still other factors to consider for the TDP of the whole device, and every component will produce heat, since any current through a resistance dissipates power (P = I²R).
So as I said before, unless they make it a premium item like the Surface Pro and charge a premium price, a high-performance detachable device is quite improbable.

If you read Thraktor's post, that should be able to get 1.5 TF of FP32 at ~10W, so excessive cooling wouldn't be needed.

The SoC itself would likely be far more expensive than Nintendo wants, but your initial post claimed that it was "quite impossible" to get a 1TF GPU into that form factor, which his post clearly shows to be an incorrect claim.
 
So two months before the final hardware went into production developers were still using X1 dev kits? I'm starting to think NS is basically a handheld SHIELD Android TV.

There were also rumours about new kits arriving in August. Your insistence on finding some definitive proof where there is none yet is charming, though.

PS4 was using old tech, of course they had them in the kits earlier. It's not that complicated to understand why there can be differences.
 
If you read Thraktor's post, that should be able to get 1.5 TF of FP32 at ~10W, so excessive cooling wouldn't be needed.

The SoC itself would likely be far more expensive than Nintendo wants, but your initial post claimed that it was "quite impossible" to get a 1TF GPU into that form factor, which his post clearly shows to be an incorrect claim.

Have you seen the GTX1050's leaked specs of 1.8 TF and 75 watts???
 
What do you think? It's a fairly obvious comparison.

PS4 dev kits were on final hardware 7 months before production started.

If NS started production in September, dev kits with the final SoC should have been available months in advance. So why then do the July leaks point to the X1 still being used in dev kits?
Not all systems follow an identical manufacturing roadmap. Again to use Xbox 360 as a counter example, XDKs got final hardware roughly a month before mass production started.

You keep pointing to PS4 to rationalize NS using a stock X1 like the early devkits, but PS4 is entirely irrelevant to this really. BTW where are you getting that NS production even started in September?
 
What do you think? It's a fairly obvious comparison.

PS4 dev kits were on final hardware 7 months before production started.

If NS started production in September, dev kits with the final SoC should have been available months in advance. So why then do the July leaks point to the X1 still being used in dev kits?

I'll link you a little story to see how Nintendo works and why it's irrelevant to compare Nintendo's processes with the others'

http://www.eurogamer.net/articles/digitalfoundry-2014-secret-developers-wii-u-the-inside-story

Another curious thing to note at this point was that over the course of six months we received multiple different development kits in a variety of colours, none of which revealed why they were different from the previous one.
 
Have you seen the GTX1050's leaked specs of 1.8 TF and 75 watts???

Ah, after a little digging I think his post was referring to the power draw of the GPU cores alone, whereas the 75 watts for the GTX1050 includes the whole board, fan included.

But it's also obviously a prioritization of cores vs. clock speeds. More cores = more cost but more performance per watt.

I'll admit I'm no expert here, I'm just going by what people a lot smarter than me have been saying.
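
To illustrate the "more cores = more performance per watt" point, here's a minimal sketch using the standard dynamic-power relation P ≈ C·V²·f (the same relation discussed further down the thread). The core counts, clocks and voltages are made-up illustrative values, not real GPU specs.

```python
# Wide-and-slow vs. narrow-and-fast at equal throughput, using P ~ C * V^2 * f.
# All core counts, clocks and voltages below are invented for illustration only.

def tflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz / 1000.0            # 2 FLOPs per core per clock

def relative_dynamic_power(cores: int, volts: float, clock_ghz: float) -> float:
    # Switched capacitance is assumed proportional to core count (arbitrary units).
    return cores * volts ** 2 * clock_ghz

narrow_fast = dict(cores=768,  volts=1.00, clock_ghz=1.50)   # hypothetical config
wide_slow   = dict(cores=1536, volts=0.80, clock_ghz=0.75)   # hypothetical config

for name, cfg in [("narrow/fast", narrow_fast), ("wide/slow", wide_slow)]:
    print(f"{name}: {tflops(cfg['cores'], cfg['clock_ghz']):.2f} TFLOPS, "
          f"relative dynamic power {relative_dynamic_power(**cfg):.0f}")

# Both configurations land at ~2.30 TFLOPS, but the wide/slow one draws roughly a
# third less dynamic power, because the lower clock allows a lower voltage.
```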
 
Sorry, which part are you having issues with?

Standard Jetson X1 board.

Custom Tegra board.

These are not equatable, by virtue of the very simple word "custom", which is where the grammar of *my* sentence made no sense when trying to equate them.
 
Ah, but the WiiU uses AMD and supposedly Nvidia FLOPS are better than AMD FLOPS (I never could figure out why however. No one really explains it).
FLOPs are FLOPs. What people mean when they make such a statement (which is both inaccurate and somewhat right at the same time) is that historically (and to this day) Nvidia GPU architectures with N FLOPs will perform better over a wide variety of games than AMD architectures with N FLOPs. AMD also requires more memory bandwidth than NV for similar performance.

Of course, this is mostly based on PC comparisons, so it's unclear how much of it is applicable in the console space -- it depends on how much of the difference is down to hardware or low-level software and how much of it is down to drivers. For example, regarding the bandwidth comparison we recently learned that NV have been using a far more efficient rasterization order since the Maxwell architecture.
 
The GPUs in the PS4 and Xbone were cutting edge when they were announced. The GPU in the PS4 Pro compares to a mid/high-end Radeon RX 480.

The Switch is not even being released until 2017, and these leaks claim GPU tech from over a year ago.

ROTFLMAO!!! talk about revisionist history 😂
 
I'm actually kind of worried that, despite everyone's optimism, Switch will not in fact run all Xbone ports.

If true, it would leave Nintendo in a tricky situation. Wii U, Vita, 3DS, and PS360 up-ports are all most likely easily done. But I'm not sure that is enough to sustain the Switch. Valve games should be easily ported to Switch, if Nintendo can convince Valve.

Or Nintendo is creating a streaming service that streams more complex games. But I'm not sure that will be very popular.

I'm just not sure what to think.
 
Have you seen the GTX1050's leaked specs of 1.8 TF and 75 watts???

The quoted TDP of GTX1050/1050Ti aren't really comparable to the hypothetical Tegra chip I was describing for a number of reasons:

1. They're clocked much higher (~1.4GHz boost clocks)
2. That's a full-board TDP including GDDR5 (not the most power efficient RAM)
3. They're manufactured on Samsung's 14nm process, not TSMC's 16nm process which the rest of the Pascal line is manufactured on (although admittedly there's no particular reason to rule out Samsung's 14nm for Switch)

To illustrate the difference, all you need to do is to look at the 180W TDP for the GTX1080, which uses a GP104 at ~1.7GHz boost clock, and compare it to the mere 36W the exact same chip pulls once you clock it down to a little over 1GHz and remove the RAM.

This is pretty typical for CPUs, GPUs and any other manner of integrated circuits. The relationship between power consumption and clock speed is close to exponential, and if you clock down by only a relatively small amount you can save a surprisingly large amount of power. It's how Intel can sell the exact same die as either a 65W TDP desktop chip or a 4.5W TDP ultra-thin laptop chip.
 
I'm actually kind of worried that, despite everyone's optimism, Switch will not in fact run all Xbone ports.

If true, it would leave Nintendo in a tricky situation. Wii U, Vita, 3DS, and PS360 up-ports are all most likely easily done. But I'm not sure that is enough to sustain the Switch. Valve games should be easily ported to Switch, if Nintendo can convince Valve.

Or Nintendo is creating a streaming service that streams more complex games. But I'm not sure that will be very popular.

I'm just not sure what to think.

If Nintendo cares about that it's possible to get close enough. Operative word being if.
 
I'm actually kind of worried that, despite everyone's optimism, Switch will not in fact run all Xbone ports.

If true, it would leave Nintendo in a tricky situation. Wii U, Vita, 3DS, and PS360 up-ports are all most likely easily done. But I'm not sure that is enough to sustain the Switch. Valve games should be easily ported to Switch, if Nintendo can convince Valve.

Or Nintendo is creating a streaming service that streams more complex games. But I'm not sure that will be very popular.

I'm just not sure what to think.
The biggest potential impediment to ports will be how the system sells.
 
There were also rumours about new kits arriving in August. Your insistence on finding some definitive proof where there is none yet is charming, though.

PS4 was using old tech, of course they had them in the kits earlier. It's not that complicated to understand why there can be differences.

Old tech? I'm pretty sure no such APUs existed before they were developed specifically for the consoles.

Also TSMC's 16nmFF has been in volume production for low power parts for over a year, and Maxwell Tegra already had most of the low power optimizations and features (like double rate fp16) before they came to Pascal.
 
Old tech? I'm pretty sure no such APUs existed before they were developed specifically for the consoles.

That specific AMD architecture was 1 year old. So the APU was most likely ready to be produced much earlier in the devkit's life.

Also TSMC's 16nmFF has been in volume production for low power parts for over a year, and Maxwell Tegra already had most of the low power optimizations and features (like double rate fp16) before they came to Pascal.

So you're saying that even if Nvidia didn't yet have some Parker chip ready to provide to Nintendo for devkits, they could have used anything produced by TSMC on the 16nmFF node? Or what's your point here?
 
The quoted TDP of GTX1050/1050Ti aren't really comparable to the hypothetical Tegra chip I was describing for a number of reasons:

1. They're clocked much higher (~1.4GHz boost clocks)
2. That's a full-board TDP including GDDR5 (not the most power efficient RAM)
3. They're manufactured on Samsung's 14nm process, not TSMC's 16nm process which the rest of the Pascal line is manufactured on (although admittedly there's no particular reason to rule out Samsung's 14nm for Switch)

To illustrate the difference, all you need to do is to look at the 180W TDP for the GTX1080, which uses a GP104 at ~1.7GHz boost clock, and compare it to the mere 36W the exact same chip pulls once you clock it down to a little over 1GHz and remove the RAM.

This is pretty typical for CPUs, GPUs and any other manner of integrated circuits. The relationship between power consumption and clock speed is close to exponential, and if you clock down by only a relatively small amount you can save a surprisingly large amount of power. It's how Intel can sell the exact same die as either a 65W TDP desktop chip or a 4.5W TDP ultra-thin laptop chip.

Power consumption is about linear with frequency, while it scales with the square of the voltage.

P = C·V²·f + P_s, where P_s is the zero-frequency static power dissipation.

However, there are many more factors and constraints to consider which lead to the huge power increase once frequency rises beyond a certain margin, such as the transistor node size. And as frequency and power increase, the temperature produced by the chip rises, which puts the electrons inside the chip into a relatively more excited state and causes higher losses, hence more voltage margin is needed when increasing frequency above a certain threshold.
 
Yeah, it's linear with frequency, but as you explain, higher frequencies commonly require higher voltages, so you get a superlinear (voltage-squared) component there too. In practice, you probably require significantly less than half the power to run, e.g., a 1050 at 750 MHz compared to 1.5 GHz.
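
As a concrete (and entirely made-up) instance of the P = C·V²·f + P_s relation from a couple of posts up: the capacitance, voltages and static power below are invented illustrative values, not measured GTX 1050 figures, but they show why halving the clock and dropping the voltage with it lands well below half the power.

```python
# Minimal sketch of the dynamic-power relation quoted above, P = C*V^2*f + P_s.
# C_EFF, the voltages and STATIC_W are invented illustrative values, not measured
# GTX 1050 figures.

def total_power(c_eff: float, volts: float, freq_ghz: float, static_w: float) -> float:
    """Dynamic power C*V^2*f plus a static (leakage) floor, in watts."""
    return c_eff * volts ** 2 * freq_ghz + static_w

C_EFF = 40.0     # effective switched capacitance (arbitrary units)
STATIC_W = 5.0   # assumed static/leakage floor

high = total_power(C_EFF, volts=1.05, freq_ghz=1.50, static_w=STATIC_W)  # ~71 W
low  = total_power(C_EFF, volts=0.80, freq_ghz=0.75, static_w=STATIC_W)  # ~24 W

print(f"1.50 GHz @ 1.05 V: ~{high:.0f} W")
print(f"0.75 GHz @ 0.80 V: ~{low:.0f} W ({low / high:.0%} of the high-clock power)")

# Halving the clock alone would only halve the dynamic term; it's the accompanying
# voltage drop, squared in the formula, that pushes consumption well below half.
```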
 
I'll link you a little story to see how Nintendo works and why it's irrelevant to compare Nintendo's processes with the others'

http://www.eurogamer.net/articles/digitalfoundry-2014-secret-developers-wii-u-the-inside-story

I linked that very same article in this thread a couple of pages back. What's notable about it is that throughout the entire development process the dev kits remained almost identical to the target hardware, aside from bug fixes.

If that's how Nintendo works, then we would have seen something long in advance of full production.

BTW a Tegra X1 in the NS is logically consistent with the custom SoC claim since the dev kit was never explicitly identified as a Jetson board. It could be a die shrunk X1 with a few alterations for power saving, and therefore custom in the same way RSX was a custom GPU. Would totally fit in with the way Nvidia works.
 
Ah, but the WiiU uses AMD and supposedly Nvidia FLOPS are better than AMD FLOPS (I never could figure out why however. No one really explains it).

Saying that Nvidia is generally more efficient than AMD flop for flop is the better way of putting it. The best analogy someone gave me is two people walking down a sidewalk from point A to point B: one walks straight there, while the other walks and stops occasionally. They both reach the same point, just one gets distracted or trips along the way. Well, that's how it was described to me.
 
BTW a Tegra X1 in the NS is logically consistent with the custom SoC claim since the dev kit was never explicitly identified as a Jetson board. It could be a die shrunk X1 with a few alterations for power saving, and therefore custom in the same way RSX was a custom GPU. Would totally fit in with the way Nvidia works.

Pascal is essentially a die-shrunk Maxwell that allows for higher clocks.

https://www.youtube.com/watch?v=nDaekpMBYUA

If Nintendo went with a "shrunken down TX1", they'd be getting Pascal. At that point it would just be a matter of determining whether they use the extra power overhead to increase efficiency or performance. My guess is they would increase the clocks in docked mode for better performance, and use the efficiency gains while mobile to increase battery life.

They would be fools to stick with TX1 and 20nm.
 
If Nintendo went with a "shrunken down TX1", they'd be getting Pascal. At that point it would just be a matter of determining whether they use the extra power overhead to increase efficiency or performance. My guess is they would increase the clocks in docked mode for better performance, and use the efficiency gains while mobile to increase battery life.

Can it really do that? I thought the chip was stuck with either one or the other. Like if you choose efficiency, then you're stuck with it, and not something that can be switched back and forth. Unless that is why the system is called Switch.
 
Can it really do that? I thought the chip was stuck with either one or the other. Like if you choose efficiency, then you're stuck with it, and not something that can be switched back and forth. Unless that is why the system is called Switch.

The way I understand it, the "60% increased efficiency, 40% increased performance" line is just a way of saying that Pascal architecture allows for higher performance per watt. The GPU clocks can be changed dynamically depending on what Nintendo wants in any given scenario, and that still doesn't override the performance per watt gain.
 
That specific AMD architecture was 1 year old. So the APU was most likely ready to be produced much earlier in the devkit's life.

So you're saying that even if Nvidia didn't yet have some Parker chip ready to provide to Nintendo for devkits, they could have used anything produced by TSMC on the 16nmFF node? Or what's your point here?
My point is a 16nmFF Parker SoC could have been ready to sample much earlier than August, if such a design was in the works, in order to meet Nintendo's timeline.

Also, GCN in the PS4's APU had custom feature sets that weren't in GCN 1.0, not to mention that no AMD APUs at the time even featured GCN GPUs and certainly no APU had ever featured a 1152 shader core part.

Manufacturing larger, power hungry APUs is much more complicated than mobile chips, and attempting to downplay the effort it took to make the console APUs doesn't bolster your argument about the NS SoC being so difficult to sample it had to have come out much later.
 