Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.
We are going to get an X1 with better memory bandwidth and custom OS, tools / APIs. It will be a great first party / indie machine.
 
I think Thraktor did some analysis about what would be better for battery, 2 SM with higher clocks or 3 SM with lower clocks, but I can't find it now. I wonder if it's feasible to have 3 SM on 20nm.
 
I just find it hard to believe it is X1. It had heat issues in almost all devices it was in. The Pixel C had to downclock at points. 20nm was awful for mobile... Hope it is a rumor or battery life will blow.

That didn't stop Nintendo with the Wii U. From what bgassassin found out, the Wii U had problems, so they just downgraded the fuck out of it.
 
That best case isn't happening; that "worst case" is closer to the actual best case. 1.2GHz on 20nm docked is optimistic due to yields; 1.7GHz on 16nmFF is just trying to make sure that you're disappointed.

Explain to me why an 800MHz 20nm Maxwell is a best-case scenario. What exactly limits the chip to that performance level with active cooling when other devices run faster with passive cooling?

Also, 1.7GHz isn't necessary for 850 GFLOPS if you increase the core count, which would be an option in a custom design.
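As a back-of-the-envelope check, peak FP32 throughput is just core count x 2 FLOPs per cycle (one FMA per core) x clock. A quick sketch; the 384-core part at 1.1GHz is a purely hypothetical wider design, not a real chip:

```python
def peak_gflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 rate: each CUDA core retires one FMA (2 FLOPs) per cycle."""
    return cuda_cores * 2 * clock_ghz

# Stock Tegra X1: 2 SM x 128 cores = 256 cores.
print(round(peak_gflops(256, 1.7), 1))  # 870.4 GFLOPS -- needs the aggressive 1.7GHz clock
# A hypothetical 3 SM (384-core) custom part gets there at a far tamer clock:
print(round(peak_gflops(384, 1.1), 1))  # 844.8 GFLOPS
```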
 
Nate wasn't even close to my mind when I typed that.



Okay, so tell me: what do you lose from going with the most likely scenario, that it's 20nm Maxwell? Also, no, there's no chance that the final dev kits and the retail units are using different architectures. That's pure desperation. If the dev kit isn't using final hardware, it's not a final dev kit.



That best case isn't happening, that "worst case" is closer to the actual best case.1.2GHz on 20nm docked is optimistic due to yields; 1.7GHz on 16nmFFis just trying to make sure that you're disappointed.

Do you know something we don't? Because damn, you are so freaking certain about things, and all from rumours.
 
Okay, so tell me: what do you lose from going with the most likely scenario, that it's 20nm Maxwell? Also, no, there's no chance that the final dev kits and the retail units are using different architectures. That's pure desperation. If the dev kit isn't using final hardware, it's not a final dev kit.

Again, I think you're misunderstanding me. I'm saying that it's possible, and in my opinion likely, that the final product (and devkits) are based on the Maxwell architecture on a 16nm process node. If this is the case, it could explain why Nate had heard Pascal, because as you can plainly see from the VentureBeat thread, a lot of people think Pascal = 16nm and Maxwell = 20nm/28nm.

If it's 20nm Maxwell (worst-case scenario), judging from the fact that there is a fan that can be active in portable mode, the main thing you lose in portable mode is battery life; in console mode you get a lower maximum clock rate, meaning lower peak performance. That's what you lose if you don't go 16nm.
 
Don't dev kits usually use off-the-shelf parts because it takes a long time to design and manufacture custom chips before retail? You put something close to the chip you're designing in the dev kits so developers can get an idea of what they're working with before retail units are given out. Hasn't it worked like this for a looooooooong time? Why do I feel like people have forgotten that in this whole Maxwell-in-the-final-devkit debacle?
 
I think Thraktor did some analysis about what would be better for battery, 2 SM with higher clocks or 3 SM with lower clocks, but I can't find it now. I wonder if it's feasible to have 3 SM on 20nm.
More units @ lower clock is always better for battery. It's just that the chip might become too big (read: either too expensive or just non-viably big).
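The trade-off can be sketched with the usual first-order model: dynamic power is roughly units x f x V², with voltage scaling approximately linearly with clock near the nominal point. Both are simplifying assumptions, so treat the numbers as illustrative only:

```python
def relative_power(sm_count: int, clock_ghz: float) -> float:
    """Relative dynamic power ~ units * f * V^2, with V assumed
    proportional to clock (so power per unit grows ~ f^3)."""
    voltage = clock_ghz  # normalized: V tracks f near the nominal point
    return sm_count * clock_ghz * voltage ** 2

# Two configurations with similar throughput (SM count x clock):
narrow = relative_power(2, 1.2)  # 2 SM, high clock
wide = relative_power(3, 0.8)    # 3 SM, low clock
print(round(wide / narrow, 3))   # 0.444 -- wide-and-slow burns less than half the power
```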
 
Don't dev kits usually use off-the-shelf parts because it takes a long time to design and manufacture custom chips before retail? You put something close to the chip you're designing in the dev kits so developers can get an idea of what they're working with before retail units are given out. Hasn't it worked like this for a looooooooong time? Why do I feel like people have forgotten that in this whole Maxwell-in-the-final-devkit debacle?
Early devkits sometimes do that. Final devkits should be near identical or identical.
 
Don't dev kits usually use off-the-shelf parts because it takes a long time to design and manufacture custom chips before retail? You put something close to the chip you're designing in the dev kits so developers can get an idea of what they're working with before retail units are given out. Hasn't it worked like this for a looooooooong time? Why do I feel like people have forgotten that in this whole Maxwell-in-the-final-devkit debacle?
Because Nintendo.
/s

Basically, we don't know yet and nobody can confirm whether their sources are talking about the development kit, the final product, or both. So we wait for January or new sources and spin in circles in two different threads.
 
Even if portable mode is 400 GFLOPS and docked around 600-700 GFLOPS, that is still a big step up from PS360 and Wii U. How many FLOPS do the iPhone 7 and the fastest Android have?
 
Here's the thing: this revelation doesn't change anything compared to previous rumors other than potentially battery life. None of this is really a huge deal. I'm also annoyed by people pretending to know more than they actually do.

This is funny, considering who's writing it...

There are MULTIPLE people on this forum who clearly have deeper technical knowledge than your average forum member, and yet everything they say can just be ignored...
 
Here's the thing: this revelation doesn't change anything compared to previous rumors other than potentially battery life. None of this is really a huge deal. I'm also annoyed by people pretending to know more than they actually do.

Forgive my ignorance, but what is your background? Developer, engineer, something else?
 
Even if portable mode is 400 GFLOPS and docked around 600-700 GFLOPS, that is still a big step up from PS360 and Wii U. How many FLOPS do the iPhone 7 and the fastest Android have?

The iPhone 7 is 172 GFLOPS; not sure about the fastest Android phone, similar I'd think.
 
Even if portable mode is 400 GFLOPS and docked around 600-700 GFLOPS, that is still a big step up from PS360 and Wii U. How many FLOPS do the iPhone 7 and the fastest Android have?

The Snapdragon 820 is at 500 GFLOPS of FP32 perf right now. It's being hampered by notoriously bad GFX drivers. Despite that, it comes remarkably close to the X1 in real-world gaming performance. Not sure about synthetic benchmarks.
 
The Snapdragon 820 is at 500 GFLOPS of FP32 perf right now. It's being hampered by notoriously bad GFX drivers. Despite that, it comes remarkably close to the X1 in real-world gaming performance. Not sure about synthetic benchmarks.

Interested to see the benchmarks if you can link them. I can only find one benchmark where an 820 matched the X1 in the Pixel C (so not full clock). Still very impressive to have that performance in a phone if it can really do it in most situations, though! No doubt 14nm helps.
 
Interested to see the benchmarks if you can link them. I can only find one benchmark where an 820 matched the X1 in the Pixel C. Impressive to have that performance in a phone if it can really do it in most situations, though! No doubt 14nm helps.

I'm not sure about synthetic benchmarks at all, but in real-world performance for demanding games it's pretty similar.
 
I'm just going to expect 2x Wii U GPU, 2x Wii U RAM for games, and perhaps a 5x leap in Wii U CPU performance; then if the Switch is anything better than that, it will be a bonus.

I'm pretty excited about having a portable WiiU+ tbh.

I imagine they will release a much more powerful dedicated console based on the same architecture in late 2018.
 
NCL finally moves away from 200X PowerPC tech to 201X ARM SoC tech and a small minority still find something to complain about. Meh. I'm just fucking glad they're finally ditching exotically designed hardware in favour of more commonly used architecture that can support modern-day APIs and whatnot.
Bring me The Mysterious Murasame Castle by PlatinumGames, Nintendo, and you can have all my money!
 
A gaming console can't throttle its clock speed when the SoC reaches a certain temperature, like most mobile devices do. The fan should allow the Switch to keep a constant power target without sacrificing performance in heavy-load scenarios.

Ding!

https://en.m.wikipedia.org/wiki/Dynamic_frequency_scaling

Dynamic frequency scaling or "throttling" occurs to prevent overheating and/or to reduce power consumption.

Phones and tablets are automatically set to throttle when running something with a high CPU load, especially since they don't have built-in fans. Monster Strike, for example, does this and causes the phone to get very hot even though it's a 2D marble game. The phone runs hot, which causes the battery to get hot from being encased with the SoC. As I said before, a battery running hot or very cold will not last as long as a battery running at around room temperature.

So with the Switch having a fan that operates in portable mode, it is more so to prevent the device from overheating, as Nintendo do not want to throttle any clock speeds, which would be bad for performance.
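The behaviour described above is essentially a DVFS governor loop; a naive sketch, with made-up temperatures, step sizes and clock limits purely for illustration:

```python
def next_clock(temp_c: float, clock_mhz: int,
               target_temp: float = 75.0, step: int = 50,
               min_clock: int = 300, max_clock: int = 1000) -> int:
    """One governor tick: back off the clock when over the thermal
    target, creep back up when there is headroom. A fan raises the
    effective headroom, so the clock never needs to come down."""
    if temp_c > target_temp:
        return max(min_clock, clock_mhz - step)
    return min(max_clock, clock_mhz + step)

print(next_clock(82.0, 900))  # 850 -- over target, throttling down
print(next_clock(60.0, 900))  # 950 -- headroom, clock recovers
```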
 
I'm seeing speculation here suggesting that the difference between mobile and docked FLOPS will be on the order of +50% to +75%.

Does the fillrate at all scale with this raw graphics processing measurement? I was wondering if there could be a much higher delta. Would it be bizarre to think that Nintendo might have targeted a +125% boost in peak GFLOPS that would correlate with the +125% increase in fillrate from 720p to 1080p?

That would mean mobile 264 GFLOPS → docked 594 GFLOPS, or 352 GFLOPS → 792 GFLOPS, or 400 GFLOPS → 900 GFLOPS (or 455 → 1024 in deference to post #1 of this thread). A larger delta could make it still faster than the Wii U in mobile mode (as hinted by the GameXplain et al. framerate counters suggesting a difference in dropped frames in the two versions of Zelda) while staying low enough to hit the rumoured target battery life.

I dunno, is it feasible that Nintendo might decide to have such a large difference in performance between these two modes?
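For what it's worth, the +125% figure is just the pixel-count ratio between 720p and 1080p; a quick check of the pairings above:

```python
def pixel_delta(src: tuple, dst: tuple) -> float:
    """Fractional increase in pixel count between two resolutions."""
    (w1, h1), (w2, h2) = src, dst
    return (w2 * h2) / (w1 * h1) - 1.0

boost = pixel_delta((1280, 720), (1920, 1080))
print(f"{boost:+.0%}")  # +125%
for mobile in (264, 352, 400):
    print(mobile, "->", mobile * (1 + boost))  # 594.0, 792.0, 900.0 GFLOPS docked
```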
 
Interested to see the benchmarks if you can link them. I can only find one benchmark where an 820 matched the X1 in the Pixel C (so not full clock). Still very impressive to have that performance in a phone if it can really do it in most situations, though! No doubt 14nm helps.

The Pixel C seems to have an edge on Gfxbench.

Even if the Switch is stock 20nm TX1, it would still be a high-end handheld device.
 
ninjablade alt account~
Holy fuck

 
All these mentions of ninjablade trigger me. I've had to deal with him since back on the BannedChartz forums and then he appears here one day and survives his junior member phase. Thought I'd never see him gone

/offtopic
 
All these mentions of ninjablade trigger me. I've had to deal with him since back on the BannedChartz forums and then he appears here one day and survives his junior member phase. Thought I'd never see him gone

/offtopic

Sooooo, from someone who's still rather green, can someone explain the "ninjablade" situation for me?
 
The one thing that has always bugged me about this OP is the 4 cores. Even the standard X1 has eight in a big.LITTLE setup. I would expect Nintendo to go with something similar to but better than the X1 as a minimum.
 
The one thing that has always bugged me about this OP is the 4 cores. Even the standard X1 has eight in a big.LITTLE setup. I would expect Nintendo to go with something similar to but better than the X1 as a minimum.
big.LITTLE is only needed when trying to optimize multitasking and battery power. If the Switch can maintain a constant amount of CPU performance under load with the big cores, there's no reason for the LITTLE cores.
 
I remember a SemiAccurate article from around a year back (I cannot find it at the moment, sorry!) that essentially said Nvidia had a TON of 20nm wafers they needed to use, and were willing to dump them off SUPER cheap to anyone.

I always wondered if that's what the Switch was: just a giant dump of a lot of silicon Nvidia was stuck with on 20nm Maxwell.

As disappointing as that is (not being Pascal), the actual abilities of the chips are the same, just hotter / more power-hungry (which isn't great since it's a handheld, but if they invest the savings in the screen and/or battery it's probably worth it).

I remember this, and also that Nvidia was losing money with the deal (which a lot of people laughed at), but ultimately it could very well all be true.

Why would Nintendo not take advantage of a great deal like this? They get a massive leap technology-wise from their archaic Flipper-Gekko architecture at super cheap prices, and overall it's a good stepping stone towards the new unified architecture they had in mind. The only downside is that they would have to work around the heat problem, but aside from that they are saving big bucks on the most costly part of the new device, and not even having to invest much as the design is already done.

I have defended the OP specs in this thread as a reasonable expectation, and I thought 16nm had a good chance of happening, but after the confirmation that the device is actively cooled in handheld mode I'm leaning towards the 20nm bargain being a thing. This also opens the possibility of a Switch Mini in the not-too-distant future (E3 2018 presentation, fall 2018 release), as a 16nm shrink / no fan / 4-5 inch screen / no detachable Joy-Cons / no TV mode would be easy to make without resorting to more cutting-edge fab nodes like 10nm.

So at this point I'm expecting an almost straightforward 20nm Tegra X1 chip (512 GFLOPS docked, ~300ish undocked) and crossing my fingers that some work has been done on the memory setup, as it's the only weak spot overall in the SoC.
 
big.LITTLE is only needed when trying to optimize multitasking and battery power. If the Switch can maintain a constant amount of CPU performance under load with the big cores, there's no reason for the LITTLE cores.

OS tasks, background tasks for the OS and for the games, etc... why waste any time slice from the big cores when you do not need to? You are better off offering games more deterministic performance and letting the OS do its thing with the LITTLE ones.
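On Linux-style schedulers the idea looks roughly like this. The 4+4 core layout and the helper name are hypothetical, purely for illustration, and `os.sched_setaffinity` is a Linux-only API:

```python
import os

# Hypothetical big.LITTLE layout: cores 0-3 are the big cluster (games),
# cores 4-7 are the LITTLE cluster (OS and background work).
BIG_CORES = {0, 1, 2, 3}
LITTLE_CORES = {4, 5, 6, 7}

def pin_background_task(pid: int = 0) -> None:
    """Confine a background/OS task to the LITTLE cluster so the big
    cores stay fully (and deterministically) available to the game.
    pid=0 means the calling process. Linux-only."""
    os.sched_setaffinity(pid, LITTLE_CORES)
```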
 
I remember this, and also that Nvidia was losing money with the deal (which a lot of people laughed at), but ultimately it could very well all be true.

Why would Nintendo not take advantage of a great deal like this? They get a massive leap technology-wise from their archaic Flipper-Gekko architecture at super cheap prices, and overall it's a good stepping stone towards the new unified architecture they had in mind. The only downside is that they would have to work around the heat problem, but aside from that they are saving big bucks on the most costly part of the new device, and not even having to invest much as the design is already done.

I have defended the OP specs in this thread as a reasonable expectation, and I thought 16nm had a good chance of happening, but after the confirmation that the device is actively cooled in handheld mode I'm leaning towards the 20nm bargain being a thing. This also opens the possibility of a Switch Mini in the not-too-distant future (E3 2018 presentation, fall 2018 release), as a 16nm shrink / no fan / 4-5 inch screen / no detachable Joy-Cons / no TV mode would be easy to make without resorting to more cutting-edge fab nodes like 10nm.

So at this point I'm expecting an almost straightforward 20nm Tegra X1 chip (512 GFLOPS docked, ~300ish undocked) and crossing my fingers that some work has been done on the memory setup, as it's the only weak spot overall in the SoC.

But Nvidia specifically stated that this was a custom chip. I'm not saying it can't be 20nm; that's a real possibility. But what would be custom about it if it was simply a stock Tegra X1?
 
I remember this, and also that Nvidia was losing money with the deal (which a lot of people laughed at), but ultimately it could very well all be true.

Why would Nintendo not take advantage of a great deal like this? They get a massive leap technology-wise from their archaic Flipper-Gekko architecture at super cheap prices, and overall it's a good stepping stone towards the new unified architecture they had in mind. The only downside is that they would have to work around the heat problem, but aside from that they are saving big bucks on the most costly part of the new device, and not even having to invest much as the design is already done.

I have defended the OP specs in this thread as a reasonable expectation, and I thought 16nm had a good chance of happening, but after the confirmation that the device is actively cooled in handheld mode I'm leaning towards the 20nm bargain being a thing. This also opens the possibility of a Switch Mini in the not-too-distant future (E3 2018 presentation, fall 2018 release), as a 16nm shrink / no fan / 4-5 inch screen / no detachable Joy-Cons / no TV mode would be easy to make without resorting to more cutting-edge fab nodes like 10nm.

So at this point I'm expecting an almost straightforward 20nm Tegra X1 chip (512 GFLOPS docked, ~300ish undocked) and crossing my fingers that some work has been done on the memory setup, as it's the only weak spot overall in the SoC.

That makes sense, but then why the 500 man-years of work line from Nvidia? Just for the memory setup? They were not forced to make this statement; they could just have said nothing. And remember that this is presented as the start of a "20 years" partnership. It's not just Nintendo buying a bulk of unused chips.
 
That makes sense, but then why the 500 man-years of work line from Nvidia? Just for the memory setup? They were not forced to make this statement; they could just have said nothing. And remember that this is presented as the start of a "20 years" partnership. It's not just Nintendo buying a bulk of unused chips.

They used the same kind of PR when they announced the RSX for the PS3 (later discovered to be buggy... scaler chip and more) while they were also working on their unified shader chip, which released almost alongside the PS3 itself.

Why waste a chance to put out some PR that makes them sound amazing?
 
So at this point I'm expecting an almost straightforward 20nm Tegra X1 chip (512 GFLOPS docked, ~300ish undocked) and crossing my fingers that some work has been done on the memory setup, as it's the only weak spot overall in the SoC.

This doesn't make any sense considering the Nvidia blog post.
 
They used the same kind of PR when they announced the RSX for the PS3 (later discovered to be buggy... scaler chip and more) while they were also working on their unified shader chip, which released almost alongside the PS3 itself.

Why waste a chance to put out some PR that makes them sound amazing?

I think in this case they need a big win and a half-assed chip just isn't gonna cut it. Of course they're going to play it up, but I think they went all out considering their other failures. It sends a message to everyone else that they can do this and did a bang up job with the NS. If it turns out that they ended up not doing much with this, then it'll look bad on them for future business.
 
But Nvidia specifically stated that this was a custom chip. I'm not saying it can't be 20nm; that's a real possibility. But what would be custom about it if it was simply a stock Tegra X1?

At this point (at least from my point of view) people should stop reading too much into the "custom chip" label; by itself it doesn't mean that there have been major changes, just that there are some modifications, possibly minor ones.

That makes sense, but then why the 500 man-years of work line from Nvidia? Just for the memory setup? They were not forced to make this statement; they could just have said nothing. And remember that this is presented as the start of a "20 years" partnership. It's not just Nintendo buying a bulk of unused chips.

This doesn't make any sense considering the Nvidia blog post.

Anyone who knows Jen-Hsun Huang knows his statements are PR diamonds (i.e. Pascal being a 10x leap, 3x VR performance...), so as advice, don't read too much into them. Most of those 500 man-years (250 people for 2 years) could very well be the time spent designing the original Tegra X1, and the beauty of it is that he wouldn't even be lying.

As a side note that follows this line of thought, a 2-year relationship between Nvidia and Nintendo seems really short for a true custom job from the ground up, so this also reinforces the theory for me.
 
I have defended the OP specs in this thread as a reasonable expectation, and I thought 16nm had a good chance of happening, but after the confirmation that the device is actively cooled in handheld mode I'm leaning towards the 20nm bargain being a thing.

So at this point I'm expecting an almost straightforward 20nm Tegra X1 chip (512 GFLOPS docked, ~300ish undocked) and crossing my fingers that some work has been done on the memory setup, as it's the only weak spot overall in the SoC.

But the fan you just mentioned is on while undocked, and you don't need a fan to cool a 20nm Tegra Maxwell (X1) at 585MHz (300 GFLOPS). That chip runs passively cooled at up to 850MHz (435 GFLOPS).
 
Anyone who knows Jen-Hsun Huang knows his statements are PR diamonds (i.e. Pascal being a 10x leap, 3x VR performance...), so as advice, don't read too much into them. Most of those 500 man-years (250 people for 2 years) could very well be the time spent designing the original Tegra X1, and the beauty of it is that he wouldn't even be lying.

As a side note that follows this line of thought, a 2-year relationship between Nvidia and Nintendo seems really short for a true custom job from the ground up, so this also reinforces the theory for me.

2 years is your speculation, so you're kind of building on your own bias here.
 
I think in this case they need a big win and a half-assed chip just isn't gonna cut it. Of course they're going to play it up, but I think they went all out considering their other failures. It sends a message to everyone else that they can do this and did a bang up job with the NS. If it turns out that they ended up not doing much with this, then it'll look bad on them for future business.

They probably bank on bankrupting the competition and giving businesses no choice :).
 
I disagree, mainly because this is Nintendo. Their DNA is to make things custom in their systems. They are not putting a stock or even close-to-stock "minor changes" chip into the Switch.





Very much agree. No one knows; at least, no one is saying anything. I'm just willing to bet the bank that Nintendo and Nvidia went all out on the Switch. Nintendo doesn't start new tech in their products with something crappy or half-assed. This will be closer to a GameCube situation than any other console release we have seen from them... especially the last two consoles.

The RAM solution will definitely be custom; there is no way Nintendo would allow 25GB/s to bottleneck the system.
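For context, that ~25GB/s figure is what a stock TX1-style 64-bit LPDDR4 interface at 3200 MT/s works out to; the 128-bit line below is a hypothetical widened bus, not a leaked spec:

```python
def bandwidth_gb_s(bus_width_bits: int, transfer_rate_mts: int) -> float:
    """Peak memory bandwidth: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * transfer_rate_mts / 1000

print(bandwidth_gb_s(64, 3200))   # 25.6 -- stock TX1-style 64-bit LPDDR4
print(bandwidth_gb_s(128, 3200))  # 51.2 -- hypothetical 128-bit bus
```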
 