Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Given the size of the Switch I'd go with 4x A72 for games and 2x A53s for OS functions. All clocked around 1.8GHz or so.

What I was hoping for:

4x A72 at 1.8GHz
2x A53 at 1.8GHz
6GB LPDDR4 (5GB reserved for games)
1.5TF FP16 or 768 GFLOPS FP32

What are the leaked specs?

4x A57 at unknown speed
2x Denver cores at unknown speed
4GB (3.2GB for games)
Unknown GPU spec (some Japanese writer said it was 768 GFLOPS FP32)
 
768 GFLOPS FP32 doesn't sound too bad.

Part of what has been so funny about the last few days of doom and gloom: the expected power range from the beginning has been around 512-768 GFLOPS, yet suddenly over the last few days people have had aneurysms over the final power range fitting within what has been expected all along.
 
Given the size of the Switch I'd go with 4x A72 for games and 2x A53s for OS functions. All clocked around 1.8GHz or so.

What I was hoping for:

4x A72 at 1.8GHz
2x A53 at 1.8GHz
6GB LPDDR4 (5GB reserved for games)
1.5TF FP16 or 768 GFLOPS FP32

What are the leaked specs?

4x A57 at unknown speed
2x Denver cores at unknown speed
4GB (3.2GB for games)
Unknown GPU spec (some Japanese writer said it was 768 GFLOPS FP32)

Nothing has been confirmed about the CPU. There has been no indication it is using Denver. The 80% RAM for games should be taken with a grain of salt.

It's weird how a custom Nvidia GPU is what leaked, but there is still no confirmation of the final CPU and RAM. That's why it will probably stick with A57 cores for the CPU and LPDDR4 for the RAM. The latter was very likely anyway; the exact memory setup is just unknown.
 
Part of what has been so funny about the last few days of doom and gloom: the expected power range from the beginning has been around 512-768 GFLOPS, yet suddenly over the last few days people have had aneurysms over the final power range fitting within what has been expected all along.
Well, I think it's because it's below that "1TF" threshold. If it were 950 GFLOPS it would be considered weak, but at 1TF it sounds like it's in another category.

Truth is, at 720p, with Tegra's efficiency at FP16 and Unreal Engine 4 now making use of FP16, 1.5TF of Pascal FP16 (768 GFLOPS FP32) is certainly enough to run anything that the PS4 and Xbox One can do.

It's already going to have better CPUs than both of them, and even the Pro; it's the RAM amount I'm worried about.

But we've gotten at least two different verified posters saying the NS can run anything the other two consoles can or that games not coming to NS won't be because of power. So my fears aren't that high.

But please Nintendo, go with 6GB lol.
 
^^

Hopefully Nintendo and Nvidia are working on something to make this process easier than it was on PPC. I'm hoping the Vulkan API can be used as well.

I know this is selfish and 99% likely to be unrealistic, but I'd hope Nvidia's marketing deals for their GPUs carry over to Switch support.


And where does this 768 GFLOPS at FP32 come from?
 
I know this is selfish and 99% likely to be unrealistic, but I'd hope Nvidia's marketing deals for their GPUs carry over to Switch support.


And where does this 768 GFLOPS at FP32 come from?

A Japanese interview mentioned 768 GFLOPS while docked. It's unclear whether that's inside knowledge or speculation. It fits with what a Pascal Tegra can do, though.
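
For anyone wondering how those numbers line up, here's the usual back-of-the-envelope math, assuming the leaked 256 CUDA cores and the standard 2 FLOPs per core per clock (one FMA); the clock speeds are pure assumptions, not confirmed:

# Rough GPU throughput: GFLOPS = CUDA cores * 2 FLOPs/clock (FMA) * clock in GHz
def gflops_fp32(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

cores = 256                       # core count from the leak (assumed accurate)
for clock in (1.0, 1.5):          # assumed portable vs. docked clocks, not confirmed
    fp32 = gflops_fp32(cores, clock)
    fp16 = fp32 * 2               # TX1/Parker-style GPUs can issue FP16 at 2x the FP32 rate
    print(f"{clock} GHz -> {fp32:.0f} GFLOPS FP32, {fp16 / 1000:.2f} TFLOPS FP16")

That gives 512 GFLOPS FP32 at 1GHz and 768 GFLOPS FP32 (~1.5 TFLOPS FP16) at 1.5GHz, which is why both figures keep coming up in this thread.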
 
I'm starting to like the idea of the Switch more and more. Yeah, the stats don't look impressive for a home console, but as a portable it looks fantastic. I think what it's going to come down to for me is the display technology used.
 
A Japanese interview mentioned 768 GFLOPS while docked. It's unclear whether that's inside knowledge or speculation. It fits with what a Pascal Tegra can do, though.

768 GFLOPS FP32 while docked is probably the most I'd expect, so I'd be happy with that. Newer architecture + FP16 usage would put it right around Xbox One performance. Got a link to that article?
 
Are you sure about the X1's CPU cores not all being able to be active simultaneously? I remember AnandTech saying that all 8 cores are exposed by the OS, and so usable at the same time, in the Shield TV.

Tegra X1 uses cluster migration. Only one cluster is active at any time, so even if only one of your jobs requires the big cluster, all your jobs will run on the A57 cores.
 
768 GFLOPS FP32 while docked is probably the most I'd expect, so I'd be happy with that. Newer architecture + FP16 usage would put it right around Xbox One performance. Got a link to that article?

This is the original article:
http://www.4gamer.net/games/990/G999026/20161021062/

This is the one people are referencing:
http://wccftech.com/nintendo-switch-nvidia-tegra-pascal/

It still sounds like speculation; it would be preferable if someone fluent in Japanese could read and interpret it.
 
Given the size of the Switch I'd go with 4x A72 for games and 2x A53s for OS functions. All clocked around 1.8GHz or so.

What I was hoping for:

4x A72 at 1.8GHz
2x A53 at 1.8GHz
6GB LPDDR4 (5GB reserved for games)
1.5TF FP16 or 768 GFLOPS FP32

What are the leaked specs?

4x A57 at unknown speed
2x Denver cores at unknown speed
4GB (3.2GB for games)
Unknown GPU spec (some Japanese writer said it was 768 GFLOPS FP32)

Stop daydreaming.

And btw, there is no mention of Denver cores in the leaked spec; it mentions 1GHz / 256 cores, which maxes out at 512 GFLOPS FP32. (That Japanese writer is speculating, so not a valid source.)

Plus we don't even have confirmation that docked mode enables the full clock speed, and I personally doubt the handheld can run at a full 1GHz clock.
 
768 GFLOPS FP32 while docked is probably the most I'd expect, so I'd be happy with that. Newer architecture + FP16 usage would put it right around Xbox One performance. Got a link to that article?

A Japanese interview mentioned 768 GFLOPS while docked. It's unclear whether that's inside knowledge or speculation. It fits with what a Pascal Tegra can do, though.

It's basically just the GPU from the TX1 with the benefits of being put on a 16nm node, which is what I'm expecting from a Pascal-based successor to the Maxwell GPU inside the TX1.
 
Stop daydreaming.

And btw, there is no mention of Denver cores in the leaked spec; it mentions 1GHz / 256 cores, which maxes out at 512 GFLOPS FP32. (That Japanese writer is speculating, so not a valid source.)

Plus we don't even have confirmation that docked mode enables the full clock speed, and I personally doubt the handheld can run at a full 1GHz clock.

It's almost a law of nature that in these threads with every page we try to bargain for slightly better specs than we thought plausible the page before. In the end, everybody has silently accepted speculative specs on the upper end of the spectrum of what could maybe be possible. What we'll get in the end usually ends up being on the other end of that spectrum.

As I've written a few pages before, I agree with you that we should not uncritically assume performance figures based on the maximum clock speeds of Nvidia's chips, given that the Switch as a portable system will have to balance performance with battery life. And even if the SoC runs at max speed when docked, games will have to be made such that they are compatible with what the device can provide in mobile mode.

Devices like an iPad consume ~9-12W under full load, and that includes the display. Four 16nm A57 cores at 2GHz by themselves already consume 5.5W under full load. Add to that the power consumption of the GPU, display, and all the rest, and it looks wise to manage expectations a bit.
 
How large of an L3 cache would you think necessary? If they went with the same cache configuration as Parker, L1 and L2 are both 2MB.

I honestly don't know. Part of the issue is that I can't actually find official confirmation of the GPU L2 cache size (which is the important one in this case) for TX1 or Parker. I seem to recall that TX1 had a 512KB GPU L2 cache, but that isn't something I can find any hard data on. My gut says that an L3 victim cache of somewhere between 2MB - 4MB would probably do the job (Apple uses a 4MB L3 victim cache for the same purpose in its SoCs), but it's not something you can really tell without testing, and without a TX1 or Parker customised with a large L3 victim cache that can be fractionally disabled it's not something we can test.

(For what it's worth, the CPU L1 and L2 caches on Parker aren't both 2MB. The L1 depends on the cores, with the Denver cores each getting 128K I-cache and 64K D-cache, and the A57s getting 48K I/32K D each. There's then a 2MB L2 shared between the Denvers and a separate 2MB L2 shared among the A57s)
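
For anyone unfamiliar with the term, here's a toy sketch (nothing to do with any actual Switch hardware) of what an L3 victim cache does: lines evicted from the L2 get parked there, so a later miss on one of them can be served from the L3 instead of going all the way out to main memory.

from collections import OrderedDict

class VictimCache:
    """Toy L3 victim cache: holds lines evicted from the L2, LRU replacement."""
    def __init__(self, capacity_lines):
        self.capacity = capacity_lines
        self.lines = OrderedDict()           # address -> cache line data

    def insert_evicted(self, addr, data):
        # Called when the L2 evicts a line: park it in the L3.
        self.lines[addr] = data
        self.lines.move_to_end(addr)
        if len(self.lines) > self.capacity:  # over capacity: drop the least-recently-used line
            self.lines.popitem(last=False)

    def lookup(self, addr):
        # Called on an L2 miss: a hit here avoids a trip to DRAM.
        if addr in self.lines:
            return self.lines.pop(addr)      # line migrates back up into the L2
        return None                          # miss: fetch from main memory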

Well, Nvidia already has an API named NVAPI, so the N could have just been added because it's Nintendo's version.

https://developer.nvidia.com/nvapi

From my brief reading of this, it seems to be more a PC-oriented API designed to be complementary to DX/OGL/etc. Things like thermal management, driver initialisation, display configuration and overclocking controls aren't entirely relevant in the console space.

Hmm... Maybe the HBM stack is for Xavier? I was reading the Drive PX2 specs, and it had 8GB of 128-bit LPDDR4 for system memory and 4GB of GDDR5 for graphics memory.

HBM is probably overkill for Switch, so the other option is higher LPDDR4 memory bandwidth (so 128-bit, two chips) and more cache.

____

16nm for the GPUs of the Slim consoles makes sense, and likely for the PS4 Pro.

Yes, Xavier would be the most likely suspect. It is worth noting that the article claimed the chip was "in production", and Xavier isn't, although it is possible this was mis-interpreted or mis-transcribed by the author. It would be an unusual option for Xavier, though, in that GDDR5(X) could give them the same bandwidth and substantially more capacity for a lower cost, and I wouldn't expect them to be so heavily power-constrained to choose HBM2 on the basis of power savings. The target render Nvidia showed of the Xavier board also showed what clearly looked like RAM modules sitting next to the SoC (although of course one can read too much into early target renders like this).

Alternatively it is technically possible that it's for a GPU, but given the capacity and bandwidth it would only be suitable for an entry-level card, and Nvidia would have to tape out a completely different die (probably something equivalent to a GP107) with a HBM interface just to be able to use it for a single slightly more power efficient laptop GPU. This would be an extremely unusual move for Nvidia, as they typically have a variety of SKUs per die across both laptop and desktop in order to take advantage of binning (the most power efficient dies going to laptops) and to reduce design costs and inventory risk. They wouldn't be able to do this here (no point selling it for desktops if it's just a more expensive version of the 1050Ti), and in any case it's unlikely that they'd be able to warrant the increased cost of a HBM-powered card versus its GDDR5 equivalent, as the power savings would be relatively small (especially as you're comparing to binned GP107's in laptops).

The last option (unless there's any other super-secret SoC due soon) is the Nintendo Switch. We know it's in production, we know Nintendo likes esoteric memory, and we have it on good authority that Switch will have 4GB of RAM. As you say, though, it would be overkill for a device of that performance level (you'd potentially be looking at equivalent bandwidth to PS4 Pro, despite being at best 1/5th as powerful and using a considerably more bandwidth efficient architecture), and it would almost certainly be very expensive for a device which we've been told is targeting a "surprisingly low" price point.

Effectively it has to be one of the above three. Xavier seems the least unlikely (although on the surface it wouldn't have been something I would have predicted), and I'd actually argue that a GPU with 4GB of HBM2 is more unlikely than Switch using it. For Switch, at least, it would appear to be the only feasible way of hitting 200GB/s+ of bandwidth in a portable form factor, if they were for some reason to decide that's what they want.
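
On the bandwidth point, the 200GB/s+ figure for a single HBM2 stack works out roughly like this (the per-pin rates are assumptions based on early HBM2 parts, not anything from the leak):

# Single HBM2 stack: 1024-bit interface; GB/s = bus width in bits * pin rate in Gbps / 8
def stack_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    return bus_width_bits * pin_rate_gbps / 8

for pin_rate in (1.6, 2.0):   # assumed per-pin data rates for first-generation HBM2
    print(f"{pin_rate} Gbps/pin -> {stack_bandwidth_gbs(1024, pin_rate):.0f} GB/s per stack")

So roughly 205-256 GB/s from one stack, versus the ~25.6 GB/s you'd get from a single 64-bit LPDDR4 channel at 3200MT/s.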
 
It's almost a law of nature that in these threads with every page we try to bargain for slightly better specs than we thought plausible the page before. In the end, everybody has silently accepted speculative specs on the upper end of the spectrum of what could maybe be possible. What we'll get in the end usually ends up being on the other end of that spectrum.

As I've written a few pages before, I agree with you that we should not uncritically assume performance figures based on the maximum clock speeds of Nvidia's chips, given that the Switch as a portable system will have to balance performance with battery life. And even if the SoC runs at max speed when docked, games will have to be made such that they are compatible with what the device can provide in mobile mode.

Devices like an iPad consume ~9-12W under full load, and that includes the display. Four 16nm A57 cores at 2GHz by themselves already consume 5.5W under full load. Add to that the power consumption of the GPU, display, and all the rest, and it looks wise to manage expectations a bit.

Exactly, maybe we will get something somewhat better than the leaked specs, but for the time being people should try to assume that those specs are quite close to the final product.
 
Given the size of the Switch I'd go with 4x A72 for games and 2x A53s for OS functions. All clocked around 1.8GHz or so.

What I was hoping for:

4x A72 at 1.8GHz
2x A53 at 1.8GHz
6GB LPDDR4 (5GB reserved for games)
1.5TF FP16 or 768 GFLOPS FP32

What are the leaked specs?

4x A57 at unknown speed
2x Denver cores at unknown speed
4GB (3.2GB for games)
Unknown GPU spec (some Japanese writer said it was 768 GFLOPS FP32)

The leaked dev kit specs appear to be more or less the chipset from the NVIDIA Shield TV. No Denver cores, and it can be clocked up to a max of 512 GFLOPS FP32.
 
Something we might be missing is that Nintendo thinks of the Switch as one part of a platform.

They see the Switch as the entry-level system, with a more powerful system coming out later IF the Switch does well enough.

If they price this thing at $249 in 2017, in 2019 they can release a $299 model that competes with PS4 Pro and Scorpio while keeping the Switch at $199 for longer.
 
Exactly, maybe we will get something somewhat better than the leaked specs, but for the time being people should try to assume that those specs are quite close to the final product.

Wow, it's almost as if that's what we have been saying all along: 512-768 GFLOPS. You people come in acting like you are some voice of reason when you are just parroting what we have been saying all along. No one is expecting a 25W monster or anything of the sort.
 
Something we might be missing is that Nintendo thinks of the Switch as one part of a platform.

They see the Switch as the entry-level system, with a more powerful system coming out later IF the Switch does well enough.

If they price this thing at $249 in 2017, in 2019 they can release a $299 model that competes with PS4 Pro and Scorpio while keeping the Switch at $199 for longer.
By 2019-2020 a PS4.75 will probably come out with 10 TFLOPS...
 
Wow, it's almost as if that's what we have been saying all along: 512-768 GFLOPS. You people come in acting like you are some voice of reason when you are just parroting what we have been saying all along. No one is expecting a 25W monster or anything of the sort.

On a related note, why couldn't the Wii U handle UE4? Phones with weaker GPUs have been able to use it, though I recall it being said that the Wii U's CPU was surpassed even by portables a long time ago.
 
Wow, it's almost as if that's what we have been saying all along: 512-768 GFLOPS. You people come in acting like you are some voice of reason when you are just parroting what we have been saying all along. No one is expecting a 25W monster or anything of the sort.

The numbers that you are quoting as reasonable include the numbers based on the most powerful possible SoC (Parker) at maximum clock speed. So you kinda make my point. I don't think many people have, for instance, critically questioned the maximum CPU clock speed. Everyone is assuming that the CPU cores will run at maximum clock speed and that this is the baseline of performance comparisons, when in reality it's reasonable to assume that the baseline will be set by how fast the CPU will clock in mobile mode.
 
Devices like an iPad consume ~9-12W under full load, and that includes the display. Four 16nm A57 cores at 2GHz by themselves already consume 5.5W under full load. Add to that the power consumption of the GPU, display, and all the rest, and it looks wise to manage expectations a bit.

It was 4.7W at 2GHz, and that was also mentioned in the article as an absolute maximum that no real-world load would reach (they used a power virus to get the absolute maximum possible power consumption at each frequency), with 80-90% of that number being more reasonable for real-world heavy workloads.
 
On a related note, why couldn't the Wii U handle UE4? Phones with weaker GPUs have been able to use it, though I recall it being said that the Wii U's CPU was surpassed even by portables a long time ago.

It had to do with architecture, not power. And the Wii U could technically do it.
 
I know of at least one game where the Wii U version was canned and they're just putting it on Switch instead (spoiler: it's not a very well kept miserable pile of secrets) because the Switch fully supports UE4.
 
I know of at least one game where the Wii U version was canned and they're just putting it on Switch instead (spoiler: it's not a very well kept miserable pile of secrets) because the Switch fully supports UE4.

I imagine this is going to be a more common choice among developers going forward. I wouldn't be surprised if something like Bloodstained got this treatment (and I imagine it will).
 
On a related note, why couldn't the Wii U handle UE4? Phones with weaker GPUs have been able to use it, though I recall it being said that the Wii U's CPU was surpassed even by portables a long time ago.

Literally only because Epic wouldn't officially support it. They didn't even port UE3 themselves. Nobody was gonna do it with UE4 when you could just make your own engine at that point.

Keep in mind they barely support the platforms they are officially on.
 
It had to do with architecture, not power. And the Wii U could technically do it.
Was it the DX10-equivalent feature set of the Wii U that was the issue? Back in the WUST days, we were hoping that Nintendo had customized the R700-based architecture to handle more modern features, but I guess that was too much of a change too late in the system's (hellish) development.
 
Wow, it's almost as if that's what we have been saying all along: 512-768 GFLOPS. You people come in acting like you are some voice of reason when you are just parroting what we have been saying all along. No one is expecting a 25W monster or anything of the sort.
Nintendo Switch Super, 4 TFLOPs in 2020 using 8 watts with 3hrs battery life. Believe.
 
It was 4.7W at 2GHz, and that was also mentioned in the article as an absolute maximum that no real-world load would reach (they used a power virus to get the absolute maximum possible power consumption at each frequency), with 80-90% of that number being more reasonable for real-world heavy workloads.

All that wouldn't really affect my argument. To look at it from another angle, the internal space of an iPad Air 2 is almost entirely taken up by its batteries, and they provide ~27 watt-hours of energy. The batteries of an Nvidia Shield Tablet only provide ~19 watt-hours. The Switch certainly doesn't look like it has room for larger batteries than that. Targeting a battery life of 3-4 hours of non-stop gaming, everybody can roughly estimate how much power the Switch's display, GPU, CPU, and all the rest are allowed to consume in total. Possible clock speeds then certainly become an interesting point of discussion.
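
To put rough numbers on that (every input here is an assumption pulled from this discussion, not a spec):

# Back-of-envelope portable power budget: battery energy (Wh) / target runtime (h) = average draw (W)
battery_wh   = 19.0   # assumed Shield Tablet-class battery; an iPad Air 2 is ~27Wh
target_hours = 3.0    # assumed battery-life target for non-stop gaming
display_w    = 1.5    # assumed draw for a ~6" 720p screen at moderate brightness

total_budget_w = battery_wh / target_hours
soc_budget_w   = total_budget_w - display_w   # what's left for CPU, GPU, RAM, and everything else

print(f"Total average draw allowed: {total_budget_w:.1f} W")
print(f"Left over after the display: {soc_budget_w:.1f} W")

With those assumptions you end up with only around 5W for the whole SoC and memory, which is why the undocked clock speeds matter so much.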
 
Something we might be missing is that Nintendo thinks of the Switch as one part of a platform.

They see the Switch as the entry-level system, with a more powerful system coming out later IF the Switch does well enough.

If they price this thing at $249 in 2017, in 2019 they can release a $299 model that competes with PS4 Pro and Scorpio while keeping the Switch at $199 for longer.
If not all devs are enthused about needing to deal with both PS4 and PS4 Pro, I'd worry that needing modes for Switch Undocked, Switch Docked, and Switch Pro would explode some heads.
On a related note, why couldn't the Wii U handle UE4? Phones with weaker GPUs have been able to use it, though I recall it being said that the Wii U's CPU was surpassed even by portables a long time ago.
Somewhere along the line UE4 changed. Initially they were talking about UE4 being for machines like the Xbone and beyond, while weaker ones would still use UE3. That's no longer the case, but once the Wii U failed to take off it probably no longer seemed important.
 