Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

No, a portable PS4 it isn't; we agree on that. But we're talking about them lightly modifying an off-the-shelf configuration and making it worse than originally intended, when it had already been released in a similar form factor at higher performance. This is likely because they were concerned about hitting some arbitrary line in the sand on power consumption, or couldn't be bothered to spend a few extra dollars per unit on a larger battery.

It's easy to argue that they are just trying to save every penny they can to pass on the savings to consumers, but I would gladly pay an extra $50 for the extra horsepower needed for the possibility of a greater variety of software, and a battery that lasts an extra hour or two.

Well, that's being talked about, but we don't know that's the case. Personally I think they'll have made some significant customisations. Whether that includes shader count is anyone's guess.
 
There is little reason that a $200 dock couldn't match Scorpio, considering this last holiday you could buy a $200 GPU from Nvidia with 6GB of VRAM that matches the ~6 TFLOPs RX 480 (and exceeds it in many games).

The dock alone is going to cost maybe $15 to manufacture, leaving plenty of room for a 4+ TFLOPs GPU in the dock at $199 retail. They could market it as a 4K dock, and they only really need 4GB of VRAM in the dock rather than 6GB, so they could probably even undercut that price point. And that's just this year; next year they might be able to do it for $99. It's hard to say really, but the idea behind an SCD makes a good amount of sense for Nintendo to pursue if they want to pass the cost of high-fidelity graphics on to the customers. They could even add 4K DLC packs to games just for texture packs.

Again, given what we know, there should be plenty of overhead in Switch for higher clocks too. I'm not aware of a technical reason the A57 cores couldn't hit 1.5GHz even in portable mode, other than battery life; another ~1W of power draw isn't going to mean much, and that should put it roughly in line with the six Jaguar cores at 1.6GHz that games get on PS4.
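
Rough math on that comparison (a sketch, and the per-clock assumption at the end is mine; it's doing all the work in the parity claim):

```python
# Back-of-envelope CPU comparison for the A57-vs-Jaguar claim above.
# Aggregate "core-GHz" ignores IPC entirely, so the last line shows the
# per-clock advantage the A57 would need for rough parity. Illustrative only.

a57_core_ghz = 4 * 1.5     # four Cortex-A57 cores at a hypothetical 1.5GHz
jaguar_core_ghz = 6 * 1.6  # roughly six Jaguar cores at 1.6GHz available on PS4

# The A57 would need ~1.6x Jaguar's per-clock performance for parity,
# which is the crux of the "in line with PS4's CPU" argument.
needed_per_clock_ratio = jaguar_core_ghz / a57_core_ghz
print(f"{a57_core_ghz:.1f} vs {jaguar_core_ghz:.1f} core-GHz -> "
      f"A57 needs ~{needed_per_clock_ratio:.1f}x per clock to match")
```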

We also have iterations to think about. A "New" Switch could be over twice as powerful in the same power envelope and even match PS4 if they wanted. People tend to think power envelopes make that impossible, but 4 SMs at 1.25GHz on 16nm Pascal with four A72 cores at 1.7GHz (the same power draw as four A57 cores at 1GHz on 20nm, by the way) would draw less than 20 watts, and that should be fine for the docked device. With the same docked/portable performance ratio, that gives you 512 GFLOPs portable. (This is just speculation on a future iteration. While the technology was available to do this last year, no one should have expected it, and it was always a best-case scenario; but for an iteration it makes a lot of sense, as it would handle the rumored Switch docked performance on the go.)
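
For what it's worth, the FLOPs math behind that 512 GFLOPs figure checks out, assuming the usual Nvidia layout of 128 CUDA cores per SM and the rumored 307.2/768MHz portable/docked clock ratio. A quick sketch:

```python
# FP32 peak for an Nvidia Maxwell/Pascal SM: 128 CUDA cores * 2 FLOPs/cycle (FMA).
def gpu_gflops(sms, clock_ghz, cores_per_sm=128):
    return sms * cores_per_sm * 2 * clock_ghz

docked = gpu_gflops(sms=4, clock_ghz=1.25)  # 1280 GFLOPs for the speculated iteration
portable = docked * (307.2 / 768.0)         # rumored portable/docked clock ratio = 0.4
print(f"docked: {docked:.0f} GFLOPs, portable: {portable:.0f} GFLOPs")  # 1280 / 512
```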

The true innovation here, though, is how modular the design really is. Nintendo can have you plug the Switch into any new gimmick they want. VR? Check. Some standalone new device powered by Switch? Check. Overclocking the Switch adds plenty of performance as well: even the version of the Switch we've been talking about for the last few pages is capable of CPU performance on par with PS4 Pro if upclocked, and the GPU could match up to half of an Xbox One if upclocked when docked.

They can use Switch as a Trojan horse for pretty much any new idea they want to push, since it really is just a standalone screen with capable CPU performance and a modern GPU that ensures compatibility with the industry's gaming pipeline.

Thanks. So for me it comes down to reasonable price. Switch is already on the back foot for me as I'm paying for low end power just because it's portable, which I will probably never use. So they need to be really smart on price for me to bite. That is, $150-$199 max and then another $100 to be PS4/Pro level in a year or something like that. Either way, console only gamers like me will be paying a premium for less.
 
It is not going to be below $199.99. You aren't going to find any device on the go that is going to match the Switch, and if you do, it won't have the form factor or the exclusive library that comes with it.

This is going to be the best portable gaming device on the market, and I don't see why anyone thinks it will be less than 200, let alone 250 bucks. Nintendo had no problem selling 3DSes with PS4s and Xbox Ones available. The Switch is going to fill that 3DS market along with what is left of the Nintendo console market.

Your PS4 doesn't have exclusive Nintendo titles and can't be played anywhere. So you aren't paying a premium for less; you are just paying for a different type of more.
 
Well, that's being talked about, but we don't know that's the case. Personally I think they'll have made some significant customisations. Whether that includes shader count is anyone's guess.

None of the evidence we have points to anything else. The most likely explanation is Thraktor's, which is that they moved the Tegra X1 design to the newest 28nm process because it's cheap and has nearly the same leakage as 20nm, and lowered the clocks to meet their thermal and power consumption needs. That pretty much precludes adding a large block of SRAM or more CUDA cores, because the chip would simply get too large. We know that they customized the USB setup, because the TX1 didn't have USB 3.1 on board, but beyond that, what is there to expect?
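
To give a rough sense of the size problem (a sketch; both numbers are ballpark assumptions on my part, not confirmed figures):

```python
# Crude die-area sketch for the "chip would get too large" point.
# Both values below are ballpark assumptions, not confirmed figures.
tx1_area_20nm_mm2 = 120.0      # commonly cited approximate Tegra X1 die size
area_inflation_28_vs_20 = 1.8  # assumed density penalty going 20nm -> 28nm

tx1_area_28nm_mm2 = tx1_area_20nm_mm2 * area_inflation_28_vs_20
print(f"TX1 ported to 28nm: ~{tx1_area_28nm_mm2:.0f} mm^2 before adding anything")
# At ~216 mm^2 before any extra SRAM or CUDA cores, additions get expensive fast.
```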
 
None of the evidence points to anything very specific regarding final hardware, though. We know that at some point devs were shown Jetson X1 specs and some clock speeds; that's it. Also, Nintendo don't make decisions like that "because it's cheap", and I can't remember them ever using an off-the-shelf GPU. Even the Wii U's GPU, for all its low-end performance, was completely custom and actually quite expensive for what it was.

I'm not assuming Nintendo broke the habit of a lifetime based on one article which itself admits they have no idea what customisations have been made to the final chip.
 
None of the evidence we have points to anything else. The most likely explanation is Thraktor's, which is that they moved the Tegra X1 design to the newest 28nm process because it's cheap and has nearly the same leakage as 20nm, and lowered the clocks to meet their thermal and power consumption needs. That pretty much precludes adding a large block of SRAM or more CUDA cores, because the chip would simply get too large. We know that they customized the USB setup, because the TX1 didn't have USB 3.1 on board, but beyond that, what is there to expect?

The opposite is more likely. The final dev kits in October were more powerful than the ones from July, and the device also had better battery life. 16nm makes more sense than 28nm in that context. I know pessimism is looked at as a safe bet around here, but honestly, no matter which process they went with, the performance of the device will end up being the same. The clocks Eurogamer listed are either old (July kits) or from October, but they did say they were sitting on that information for a while, and the October kits increased performance; so either the kits in July had even lower clocks, or the numbers could be outdated.

Final targets change, and even if they were final clocks when they heard them, that could have changed when they moved to 16nm. For me this is a puzzle: it's not just about the easiest solution, but rather the one that fits our information best. It's not that 28nm allowed Nintendo to release a stronger devkit with better battery life; those two factors can't both improve with a process node that at best matches what they were already using. Sorry, it just doesn't make sense imo.
 
It makes zero sense why they wouldn't use 16nm in the final product. 16nm with leaked clocks that low would lead to pretty damn good battery life. It would also leave room to loosen up the clocks later, once they get everything under control after launch.

But it's Nintendo, which really is the reason we are discussing which process node they went with to begin with, lol.
 
None of the evidence points to anything specific regarding final hardware. Also, Nintendo don't make decisions like that "because it's cheap", and they don't just use off-the-shelf chips. Even the Wii U's GPU, for all its low-end performance, was completely custom and actually quite expensive for what it was. Can you name the last off-the-shelf GPU they used (or the console it was in)?

I'm not assuming Nintendo broke the habit of a lifetime based on one article which itself admits they have no idea what customisations have been made to the final chip.

I can't remember exactly, but did the PICA200 (the 3DS GPU) have more done to it than being downclocked and shrunk from 65nm to 45nm?
 
It is not going to be below $199.99. You aren't going to find any device on the go that is going to match the Switch, and if you do, it won't have the form factor or the exclusive library that comes with it.

This is going to be the best portable gaming device on the market, and I don't see why anyone thinks it will be less than 200, let alone 250 bucks. Nintendo had no problem selling 3DSes with PS4s and Xbox Ones available. The Switch is going to fill that 3DS market along with what is left of the Nintendo console market.

Your PS4 doesn't have exclusive Nintendo titles and can't be played anywhere. So you aren't paying a premium for less; you are just paying for a different type of more.
Exactly. You are not going to find a system with a similar library and specs at that price, azaK. The Switch is as close to top-of-the-line mobile tech as Nintendo has been in many years; they even got Nvidia to do it.
 
It makes zero sense why they wouldn't use 16nm in the final product. 16nm with leaked clocks that low would lead to pretty damn good battery life. It would also leave room to loosen up the clocks later, once they get everything under control after launch.

But it's Nintendo, which really is the reason we are discussing which process node they went with to begin with, lol.

Even on 28nm, there would be plenty of headroom for higher clocks later, especially if they are getting 5+ hours of battery life as reported.
 
Sounds like a straight Xbox 360 port without proper optimization, and probably buggy as hell. Maybe four people if you don't count QA, but that would be a good producer and three SENIOR engineers. Porting/optimizing is not something you want to trust to someone without experience. Probably less than a year to dev; including testing and bug fixing, I would guess it's a six-month project. I'm probably being optimistic. QA would need to run a full test plan against it, and if they find a lot of bugs, that could mean pulling in more engineers. This would probably all be outsourced to another studio, so there's some overhead there too.

As other people have said, T2 is probably better off spending the money on something else.

Porting games shouldn't require hundreds of people and years to do... unless they're basically remaking the game. This won't be the case with an NS version of GTA V. All of the art, models, logic, and gameplay elements are done, so a group of programmers only has to move the code over to the NS and make adjustments where necessary. Rockstar, I'm sure, has its own QA department, so they don't have to farm that out. They shouldn't need that many people to do the port if the tools for converting are good, and we haven't heard anything negative about this so far. Get the game running, then optimize. Run through QA and squash bugs as necessary. Could be done in a few months and shouldn't cost anywhere near $1 million.
 
None of the evidence points to anything specific regarding final hardware. Also, Nintendo don't make decisions like that "because it's cheap", and they don't just use off-the-shelf chips. Even the Wii U's GPU, for all its low-end performance, was completely custom and actually quite expensive for what it was. Can you name the last off-the-shelf GPU they used (or the console it was in)?

I'm not assuming Nintendo broke the habit of a lifetime based on one article which itself admits they have no idea what customisations have been made to the final chip.

I think the Wii U being so customized makes it more likely that they would use a more "off the shelf" part this time. They put in a lot of work with Wii U and it didn't pay off at all.

Gamecube was a sales disappointment too, and they basically reused that hardware for Wii. They went bold with the controller and design, but the CPU/GPU were very close to Gamecube. Not trying to predict anything here, just saying I could imagine a similar situation with Switch.
 
Thraktor actually made a pretty good case for it. It's the only thing that jibes with the low clock speeds and still requiring a fan.

Doesn't jibe with the improved battery life, though. The most recent rumor from Laura was 3 hours at full blast, and that's roughly in line with Nate's previous ballpark of (potentially) 5-8 hours in portable.
 
For all we know, that could have been a dev-kit model with a 3DS battery stuck into it.

There's no reason they would use batteries specific to the Switch in dev kits when those weren't final and they still had to produce a retail model or working prototype.
 
Doesn't jibe with the improved battery life, though. The most recent rumor from Laura was 3 hours at full blast, and that's roughly in line with Nate's previous ballpark of (potentially) 5-8 hours in portable.

Couldn't improved battery life just come from a larger battery? Why does this necessarily mean hardware improvements?
 
Battery technology has not caught up to hardware in terms of size. So unless the internals of the Switch changed to allow for a different-sized battery, I doubt it changed drastically enough to go from 3 hours to 5-8.

It would be more likely that the final product is using a better fab process to allow for better battery life. This is assuming the 5-8 is even an accurate leak.
 
Thraktor actually made a pretty good case for it. It's the only thing that jibes with the low clock speeds and still requiring a fan.

Well, having 3 SMs would also jibe with that, as well as with the VentureBeat article and what NateDrake initially heard, but... yeah... it's 28nm for sure.
 
Battery technology has not caught up to hardware in terms of size. So unless the internals of the Switch changed to allow for a different-sized battery, I doubt it changed drastically enough to go from 3 hours to 5-8.

It would be more likely that the final product is using a better fab process to allow for better battery life. This is assuming the 5-8 is even an accurate leak.

I don't get what you're saying. Going by the clock speeds and the tech involved, the Switch most likely runs on 8W or less.

For comparison, the Wii U GamePad runs at 7.6W and had a crappy battery that made it last 3-5 hours, and then Nintendo released a higher-capacity battery to make it last up to 8 hours.

The sizes of the GamePad and the Switch are similar, and the Switch is pretty thick; that suggests putting in a bigger battery shouldn't be a great hurdle.
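
The arithmetic is simple either way: runtime is just capacity in watt-hours divided by draw in watts. A sketch with hypothetical pack sizes (the 8W draw is my estimate from above; the capacities are made up for illustration):

```python
# Battery-life arithmetic: hours = capacity (Wh) / draw (W).
# The 8W draw is the estimate from the post above; pack sizes are hypothetical.
def battery_hours(capacity_wh, draw_w):
    return capacity_wh / draw_w

for capacity_wh in (10.0, 16.0, 24.0):
    print(f"{capacity_wh:>4.0f} Wh @ 8W -> {battery_hours(capacity_wh, 8.0):.1f} h")
# 10 Wh -> 1.2 h, 16 Wh -> 2.0 h, 24 Wh -> 3.0 h: a bigger pack alone can
# plausibly explain an improvement, no silicon changes required.
```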
 
It is not going to be below $199.99. You aren't going to find any device on the go that is going to match the Switch, and if you do, it won't have the form factor or the exclusive library that comes with it.

This is going to be the best portable gaming device on the market, and I don't see why anyone thinks it will be less than 200, let alone 250 bucks. Nintendo had no problem selling 3DSes with PS4s and Xbox Ones available. The Switch is going to fill that 3DS market along with what is left of the Nintendo console market.

Your PS4 doesn't have exclusive Nintendo titles and can't be played anywhere. So you aren't paying a premium for less; you are just paying for a different type of more.

I think it's always a given that when someone states something, it's their opinion. That is, it's less for me. Also, my PS4 has all the third-party games, which, if history is anything to go by, is MORE than what I get on a Nintendo console. I also get good online, a fast OS, and a good feature set.

I agree though, it won't be under US$199; $150 is what I think it's worth for me. Basically all it needs to be is a controller for Nintendo games that I can later augment with an SCD... although there's no promise that'll happen.
 
Mate, I'm a PC player. They're going to have to try DAMN hard to convince me to purchase AAA games that I can already play at significantly higher resolutions and frame rates than consoles allow for. Maybe not on the go, but given the option between 60+ FPS and portability, I'm going with the improved response time.
 
Batteries tend to be rather expensive; the more efficient you make your system, the smaller the battery you can get away with.

16nm wafers cost a lot of money too. There are multiple interpretations you can come up with for why the Switch runs at these clock speeds yet still requires a fan and has moderate battery life. None of it is certain, but 28nm would jibe with all of that, and free up budget for a slightly larger battery.
 
Not really. The X1 doesn't even use 28nm, it uses 20nm, so I have no idea why it's even being brought up. No way it'd be cheaper to use that for ONLY the Switch chips and nothing else.

People are bringing it up because of the leak that referenced lower clock speeds (half of what the Tegra X1 can do) and because of the included fan.

People are insinuating that at 20nm or below there is no reason to include a fan at those clocks. So the assumption is that 28nm is being used.
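
The intuition behind that insinuation: dynamic power scales roughly as P ∝ C·V²·f, and newer nodes cut both capacitance and voltage. A sketch with illustrative scaling factors (my assumptions, not published foundry data):

```python
# Rough dynamic-power comparison across nodes at the same clock.
# P ~ C * V^2 * f; the relative capacitance and voltage values below are
# illustrative assumptions, not published foundry numbers.
def rel_dynamic_power(cap_rel, volts, freq_ghz):
    return cap_rel * volts**2 * freq_ghz

p_28nm = rel_dynamic_power(cap_rel=1.00, volts=1.0, freq_ghz=0.768)
p_16nm = rel_dynamic_power(cap_rel=0.55, volts=0.8, freq_ghz=0.768)
print(f"16nmFF at the same clock: ~{p_16nm / p_28nm:.0%} of 28nm power")
# ~35%: hence the argument that a 16nm (or 20nm) chip at these low
# clocks shouldn't need active cooling at all.
```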
 
Isn't another theory that Nintendo has made additions/customizations that generate more heat and need a fan? Or are people only assuming the customizations are removals from the default chip?
 
I am not sure, exactly. I doubt anything they have added would generate that much more heat than the standard chip.

I am going with the most Nintendo path possible: the fan was included because the device ran warm, and Nintendo wanted to make sure it was cool to the touch during extended gaming sessions or rapid charging.
 
How can an old process be cheaper than a new process (I am assuming you are referencing economies of scale)? Aren't they two generations past 28nm already? It would seem that they are phasing it out in favor of FinFET at 16nm and smaller.
16nm FinFET Pascal seems like such a no-brainer if it was any other company.
 
Well, there are the memory pools they've added to pretty much every chip they've used for their consoles, and they could have added more CUDA cores as well. Couldn't either of those generate significantly more heat?
 
I am not sure, exactly. I mean, you are adding more cores to something that is running at half of what it could be running at.
 
Here's a list of companies still making 20nm SoCs:

- MediaTek

That's it, and it's only a single die, which they introduced in late 2015 with their X20/X25 and are now using for the X23/X27. Nvidia is just about still selling 20nm TX1-based devices, but I don't imagine the chip is still being fabricated, given the Shield TV is about to be replaced. In fact, I'm not aware of a single new 20nm chip going into production over the whole of 2016, let alone 2017. With the exception of a handful of high-end mobile SoCs over 2014-2015, everyone's either moving straight to FinFET or sticking on 28nm.

There doesn't seem to be much of a reason to use 20nm. The price is too close to 16nm and the performance is too close to 28nm. If it weren't for the fact that TX1 is fabbed on 20nm we probably wouldn't even be considering it. That's not to say it's impossible, though. Perhaps TSMC is offering an exceptionally good deal to use up their remaining 20nm capacity.

The clock speeds we are aware of are perfectly doable on 28nm, and if Nintendo's goal is to make Switch as affordable as possible from day one then 28nm would seem like the sensible choice.

... Yeah, if that's the case this thing is not gonna be 20nm in any case. Maybe it's 16nm with 3-4 SMs, maybe it's 28nm with 2 SMs, but it's definitely not 20nm. There's no point.

Personally, I'm team 28nm.
 
It often pays to be a pessimist with these things, but damnit Nintendo!
 
Not really. The X1 doesn't even use 28nm, it uses 20nm, so I have no idea why it's even being brought up. No way it'd be cheaper to use that for ONLY the Switch chips and nothing else.

Because it's less likely to be 20nm at TSMC when clients have abandoned that process node and have either gone with 16nmFF or 28nm.

Of course TSMC refines their process nodes to improve leakage and power consumption.

http://www.tsmc.com/english/dedicatedFoundry/technology/28nm.htm

They've refined their 28nm and 16nm nodes but have hardly touched the 20nm nodes according to their foundry page.

So as Thraktor stated in a post a couple of pages ago, they have a 28nm HPC+ node which comes very close to the 20nm node.

On the other hand, there are 16nmFF+ and 16nmFFC, which are obviously better in power consumption and density. The question is, which is cheaper and gives better yields?

Thraktor summarises the above in this post anyway so take a look: http://m.neogaf.com/showpost.php?p=227547584
 
There is no way the chip will be 28nm; if it was going to be, we would have been getting reports of the dev kit being a Tegra K1 chip instead of a custom Tegra X1. Switch will be 20nm or 16nm, and we will see how many SMs it has on the 12th.
 
How can an old process be cheaper than a new process (I am assuming you are referencing economies of scale)? Aren't they two generations past 28nm already? It would seem that they are phasing it out in favor of FinFET at 16nm and smaller.

Not really. The X1 doesn't even use 28nm, it uses 20nm, so I have no idea why it's even being brought up. No way it'd be cheaper to use that for ONLY the Switch chips and nothing else.

28nm is still a very commonly used process, and very cheap because the node is mature and fully amortized. It's also not going away anytime soon. It's what makes the most sense given what we know about the Nintendo of today. Before Wii U I'd say that they wouldn't use an old process, as even Wii used the same process as PS3, but the Nintendo of today only cares about keeping costs down. They don't give a shit about efficiency. 28nm is definitely what we're getting. 20nm is a dead-end process, and 16nmFF wouldn't need a fan running in portable mode even if it has 3 SMs. There's just no way that it's anything other than 28nm.

There is no way the chip will be 28nm; if it was going to be, we would have been getting reports of the dev kit being a Tegra K1 chip instead of a custom Tegra X1. Switch will be 20nm or 16nm, and we will see how many SMs it has on the 12th.

The K1 is based on an older architecture and is slower than Switch's docked mode. X1 would be used in the dev kit even if it's 28nm.
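
A quick FLOPs comparison backs that up; the clocks here are my assumptions (the TK1's GPU topped out around 950MHz, and the leaked docked figure for Switch is 768MHz):

```python
# Peak FP32 throughput: CUDA cores * 2 FLOPs/cycle (FMA) * clock.
def gflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

tk1 = gflops(192, 0.95)             # Tegra K1: 192 Kepler cores near max clock
switch_docked = gflops(256, 0.768)  # rumored Switch docked: 2 SMs = 256 Maxwell cores
print(f"TK1: ~{tk1:.0f} GFLOPs, Switch docked: ~{switch_docked:.0f} GFLOPs")
# ~365 vs ~393 GFLOPs -- and per-FLOP, Maxwell comfortably outperforms Kepler.
```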
 
Before Wii U I'd say that they wouldn't use an old process, as even Wii used the same process as PS3, but the Nintendo of today only cares about keeping costs down.

If anything, doesn't this point to the Wii U being the exception, not the rule going forward? Also, their decision to keep using the PowerPC line that started with the GC probably didn't keep costs down for the Wii U.
 
Mate, I'm a PC player. They're going to have to try DAMN hard to convince me to purchase AAA games that I can already play at significantly higher resolutions and frame rates than consoles allow for. Maybe not on the go, but given the option between 60+ FPS and portability, I'm going with the improved response time.

Yeah, but you PC players are a weird bunch :) Seriously though, of course a person invested in a gaming PC is not going to bother with any console, really, unless there are exclusives. However, there are tens of millions of console gamers that could be persuaded. That's who Nintendo's potential market is.
 
If anything, doesn't this point to the Wii U being the exception, not the rule going forward? Also, their decision to keep using the PowerPC line that started with the GC probably didn't keep costs down for the Wii U.
IIRC, the reason Nintendo had to use 45nm for the Wii U was that the eDRAM it used couldn't be manufactured on a smaller process.
 
If anything, doesn't this point to the Wii U being the exception, not the rule going forward? Also, their decision to keep using the PowerPC line that started with the GC probably didn't keep costs down for the Wii U.

Wii U was definitely the exception. In fact, the only reason for Wii U even being an exception is that it was in development for so long and stuck with things that couldn't cheaply be moved to 28nm at the time, like its eDRAM. With that said, the only possibilities at this point are 28nm or 3 SMs; nothing else works with what we know. Classically, 16nmFF would actually be too new for any console; only PS4 Pro and Xbox 360 have ever used a brand-new process at launch. Nintendo's very much stuck in the last century when it comes to engineering their hardware, so if Switch was initially targeted for late 2016, it makes sense that they wouldn't use 16nmFF. As for 20nm, it's a dead end, and it's not even much better than 28nm. Nintendo also really wants to keep the price as low as possible without taking a loss (much to the dismay of much of GAF, which seems to feel entitled to hardware being sold at a loss and curses Nintendo for not doing so), so anything that helps them save a bit of money probably sounds good to them.
 