Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Definitely, which is why I think the marketing is focused on "core" gamers right now. Get those guys on board first, then go after everyone else. Good word of mouth from the "core" group will help bring the casuals. Good games will keep them around.
Casuals make up the largest demographic, right? That makes them the core demographic. They are the ones who flock to Skyrim, GTA, COD, etc. and make them sell so much.

It seems like a lot of people use the term core when they really mean hardcore. Core and casual are the same demographic based on the way they are being used here.
 
Casuals make up the largest demographic, right? That makes them the core demographic. They are the ones who flock to Skyrim, GTA, COD, etc. and make them sell so much.

It seems like a lot of people use the term core when they really mean hardcore. Core and casual are the same demographic based on the way they are being used here.

Yeah. I mean "hardcore gamer". I just hate typing it all out. I just call them the "core" guys since they're the folks who usually jump on first. Sorry for the confusion, but I think most people understand what I was getting at. Though, I think most folks do call the average Joe gamer a "casual".
 
Casuals make up the largest demographic, right? That makes them the core demographic. They are the ones who flock to Skyrim, GTA, COD, etc. and make them sell so much.

It seems like a lot of people use the term core when they really mean hardcore. Core and casual are the same demographic based on the way they are being used here.

Maybe we should take Reggie's cue and call them "enthusiast gamers".
 
To be fair, this is addressed directly in his quote: tiled rasterisation has no effect on pure compute shaders, so that bandwidth-saving technique would help in other areas, but not with the bandwidth required for pure compute.

That wasn't my point. The Xbox One has a lot more shaders to feed than the Switch, and its main memory bandwidth would be even more constrained without the addition of ESRAM. We don't know the entirety of the Switch's memory setup, so it's impossible to know whether or not Nintendo have taken additional steps like ESRAM or cache customization to combat whatever deficiencies may exist in bandwidth for compute.
 
Yeah. I mean "hardcore gamer". I just hate typing it all out. I just call them the "core" guys since they're the folks who usually jump on first. Sorry for the confusion, but I think most people understand what I was getting at. Though, I think most folks do call the average Joe gamer a "casual".


Maybe we should take Reggie's cue and call them "enthusiast gamers".
Or, just realize there's a difference between core and hardcore... It's only four extra letters to type the correct word and not sound ridiculous comparing the same demographic against itself using two synonyms. The way the term "casual" is used in these discussions is just an elitist way of talking down to the core demographic.

This use of casual started during the Wii's lifespan when people were salty that it was selling well, then continued as more of the core shifted to mobile. It's weird seeing people be dismissive of the group considering that you're talking about the group that will make or break a console. Without the core you don't sell.
 
I'm with you if you can find the talent, but that's the hard part.

I'm talking more in the context of official emulators (i.e. Virtual Console), where they should have access to all the necessary original hardware documentation (and potentially the designers as well). A team like M2 could probably manage it given the proper resources, although I doubt there's enough demand for Saturn VC for Sega/Nintendo to go to the expense.

IIRC, the Saturn rendered quads instead of triangles, right? That would also add a layer of complication to the mix.

It did indeed (and as far as I'm aware it's the only dedicated 3D rendering hardware ever made to use quads). It's the least of the problems when it comes to emulation, though, as you can simply render one quad as two triangles.
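
To picture the split (a minimal sketch; the Vertex struct and winding order are just made up for illustration, not how the Saturn's VDP1 actually works internally):

[CODE]
/* Split quad ABCD into two triangles sharing the A-C diagonal. */
typedef struct { float x, y, u, v; } Vertex;

void quad_to_triangles(const Vertex q[4], Vertex out[6])
{
    out[0] = q[0]; out[1] = q[1]; out[2] = q[2]; /* triangle A-B-C */
    out[3] = q[0]; out[4] = q[2]; out[5] = q[3]; /* triangle A-C-D */
}
[/CODE]

The one catch is that VDP1 forward-maps the texture across the whole quad, so two affinely-interpolated triangles can shade the diagonal slightly differently, but it's close enough for most games.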

I guess it depends on how isolated the work that each of the processors does is. Kinda makes me curious now about how feasible it is.

Here's the source code for one of the main Saturn emulators currently available, if you want to pore through it.

There are actually a few Saturn emulators out there, and given that most of them are either made by a single person or open-sourced with two or three contributors writing 99% of the code it's actually quite surprising how well they've done in terms of compatibility with Saturn's library. It's certainly not an easy machine to emulate, but the lack of interest in it (compared to something like Dolphin) is definitely holding it back too.

I can see it being interesting to someone just for the technical challenge of accurately emulating such a complex system, though, which might explain the number of closed-source emulators.
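
For what it's worth, the standard trick for multi-processor systems is to interleave the emulated CPUs in small time slices rather than run them truly in parallel. A bare-bones sketch (cpu_run and bus_service_events are hypothetical names, not from any real emulator):

[CODE]
/* Interleave two emulated CPUs (e.g. the Saturn's twin SH-2s) in
   fixed time slices. Smaller slices give more accurate inter-CPU
   timing, at the cost of a slower emulator. */
#define SLICE_CYCLES 64

typedef struct Cpu Cpu;
void cpu_run(Cpu *cpu, int cycles);   /* hypothetical: execute N cycles */
void bus_service_events(long now);    /* hypothetical: IRQs, DMA, timers */

void run_frame(Cpu *master, Cpu *slave, long frame_cycles)
{
    for (long t = 0; t < frame_cycles; t += SLICE_CYCLES) {
        cpu_run(master, SLICE_CYCLES);
        cpu_run(slave, SLICE_CYCLES);
        bus_service_events(t + SLICE_CYCLES);
    }
}
[/CODE]

The pain with the Saturn is that it isn't just the two SH-2s: the VDP1, VDP2, SCU DSP, and the 68000 sound CPU all have to stay in sync too, which is why the slice size ends up mattering so much.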
 
Any chance of smaller processors/chips taking some of the load off the main processor?

Like a dedicated sound chip? Or a tessellation chip? Or physics, etc.? Or is that a thing of the past?
 
I'm talking more in the context of official emulators (i.e. Virtual Console), where they should have access to all the necessary original hardware documentation (and potentially the designers as well). A team like M2 could probably manage it given the proper resources, although I doubt there's enough demand for Saturn VC for Sega/Nintendo to go to the expense.



It did indeed (and as far as I'm aware it's the only dedicated 3D rendering hardware ever made to use quads). It's the least of the problems when it comes to emulation, though, as you can simply render one quad as two triangles.



[URL=" the source code for one of the main Saturn emulators currently available, if you want to pore through it.[/URL]

There are actually a few Saturn emulators out there, and given that most of them are either made by a single person or open-sourced with two or three contributors writing 99% of the code it's actually quite surprising how well they've done in terms of compatibility with Saturn's library. It's certainly not an easy machine to emulate, but the lack of interest in it (compared to something like Dolphin) is definitely holding it back too.

I can see it being interesting to someone just for the technical challenge of accurately emulating such a complex system, though, which might explain the number of closed-source emulators.

Yeah, thanks, but I doubt I would be able to understand most of their code. I don't have a lot of programming experience yet. But from what I hear, the main reason Saturn emulators (the open-source ones, of course) have advanced so slowly over the years is a lack of documentation.
 
Any chance of smaller processors/chips taking some of the load off the main processor?

Like a dedicated sound chip? Or a tessellation chip? Or physics, etc.? Or is that a thing of the past?

A DSP is already onboard the Tegra X1 for sound. It uses an older Cortex-A9 for that, and it's incredibly small at 20 or 16nm. Nvidia's design already has native tessellators and does PhysX, so if anything they'd just add more programmable CUDA cores to the die rather than waste space on dedicated silicon.
 
I just watched a pretty interesting video of some guy speculating that the July devkits and all the Maxwell rumors are based on the Jetson TX1, and that the more powerful devkits sent out in October are the Jetson TP1.

https://m.youtube.com/watch?feature=youtu.be&v=n3_IE1LMmSY

Don't know if you guys have already talked about this yet or if it even adds anything to the discussion. I'm not technically savvy but I thought it was an interesting theory and worth sharing.
 
I just watched a pretty interesting video of some guy speculating that the July devkits and all the Maxwell rumors are based on the Jetson TX1, and that the more powerful devkits sent out in October are the Jetson TP1.

https://m.youtube.com/watch?feature=youtu.be&v=n3_IE1LMmSY

Don't know if you guys have already talked about this yet or if it even adds anything to the discussion. I'm not technically savvy but I thought it was an interesting theory and worth sharing.

Well, we have CES this week to see what Nvidia announces. We know a new Shield TV will be revealed, but a new Jetson dev kit? I don't know.

The other problem I have with his speculation is that he states Pascal runs better than Maxwell at lower clock speeds, which he thinks explains the lower clock speeds on the dev-kits.

We know Pascal isn't significantly different from Maxwell, which is why the only difference we'd expect is more efficiency at the same clock speed due to the different fab nodes.

Switch would have to have some new architecture that isn't Pascal to go along with this guy's idea that Switch runs as well as a Jetson TX1 but at lower clock speeds.

To add to that, that would mean the CPU would have to be newer than an A57 as well.

Edit: Watch people now try to rationalise that it's using Volta as the only explanation for it running better at lower clock speeds.
 
A DSP is already onboard the Tegra X1 for sound. It uses an older Cortex-A9 for that, and it's incredibly small at 20 or 16nm. Nvidia's design already has native tessellators and does PhysX, so if anything they'd just add more programmable CUDA cores to the die rather than waste space on dedicated silicon.
Thanks for sharing that. Is there a place where you can publicly get more info about the onboard chips in the TX1?
 
I just watched a pretty interesting video of some guy speculating that the July devkits and all the Maxwell rumors are based on the Jetson TX1, and that the more powerful devkits sent out in October are the Jetson TP1.

https://m.youtube.com/watch?feature=youtu.be&v=n3_IE1LMmSY

Don't know if you guys have already talked about this yet or if it even adds anything to the discussion. I'm not technically savvy but I thought it was an interesting theory and worth sharing.

That's not breaking news or a discovery really, since it was already mentioned at the start of this thread that the leaked Dev Kit specs matched those of the Jetson TX1 to a T.

As ggx2ac points out, the whole "Pascal explains the low clock speeds" idea doesn't make sense, since architecture-wise their performance is almost the same; the Pascal cards' main trump card is that they use a 16nm FinFET process that allows them to hit really high clocks at very low TDP.

So now that we know the clocks, the explanation has to be something else: an older fab process being used, avoiding throttling, battery life / thermal concerns, or something extra going on in the Switch SoC's memory/CPU setup.

Personally I think it's a bit of everything mentioned (process, battery, thermals) with a high chance of some work on the CPU setup, maaaaaybe memory as well, but I can't see the whole "more SMs" conspiracy having any chance.
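
For a sense of scale, here's the back-of-the-envelope math on a stock 256-core TX1 at the clocks Eurogamer reported (assuming 2 FP32 ops per core per clock, the usual figure for Maxwell; the clocks are still just rumour):

[CODE]
#include <stdio.h>

/* Naive peak throughput: cores x FLOPs-per-clock x clock.
   256 cores is the stock TX1 layout; 307.2/768 MHz are the
   rumoured Eurogamer portable/docked GPU clocks. */
int main(void)
{
    const double cores = 256.0, flops_per_clock = 2.0;

    printf("portable: %.0f GFLOPS FP32\n", cores * flops_per_clock * 307.2 / 1000.0);
    printf("docked:   %.0f GFLOPS FP32\n", cores * flops_per_clock * 768.0 / 1000.0);
    return 0;
}
/* prints: portable: 157 GFLOPS FP32, docked: 393 GFLOPS FP32
   (roughly double those in FP16, which the TX1 runs at 2x rate) */
[/CODE]

Cut an SM or drop the clocks further and those numbers scale linearly, which is why the fab-node question matters so much.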
 
Nearly two decades. I wonder how many DMA people are still around? Hell, Nintendo has lost a couple presidents since then...
Rockstar is earning BIG BIG BIG without Nintendo. Why should they bother?

It is Nintendo who needs to reel third parties in. If a third party is not developing for the Switch, it is Nintendo's fault.

Nintendo needs to sell the Switch to devs and publishers in a good way: good documentation, free support, free upgrades, clear pricing, English support, middleware support, tools for porting, engine support, etc.

If Nintendo doesn't do an awesome job, third parties won't come. And why should they?
 
Nearly two decades. I wonder how many DMA people are still around? Hell, Nintendo has lost a couple presidents since then...

Sam and Dan are still around and are the two critical members of the studio. If they're the ones with the grudge, then they probably won't forget. Remember what Sam said about Ray Liotta:

Did he really get what you were doing? His performance was good, obviously.
Yeah, his performance was very good. He was a very interesting guy to work with because we had to have him in for quite a long time – it was the most time we’ve ever had someone like that around, actually – and in some sessions he was so fired up and he was so into it, but then sometimes it’d be like he was in some kind of a hole, and he was very dark and couldn’t work. He’s a pretty amazing guy, kind of an amazing actor. He’s not been in as many good things as he should have been, I think. He is so good in Goodfellas that he kind of doesn’t need to do anything else, but whatever he’s in he always catches your eye because he’s got something about him, and in the flesh he’s definitely got that about him, too. I think he did get it, to be fair. But he made some comments later on through his agent, something like, “Hey, that game was so big I should have charged them more money”, and I hate that kind of chat. It’s like, be cool. You know? I hate that – it’s so cheesy. Like he’s saying, “Next time I’m really going to pin it to them”. Well, how about we just killed off your character? So he doesn’t exist – there is no next time. That’s how we handle that.

Rockstar completely changed how they cast characters since then. They basically get a bunch of unknowns now instead of major actors.
 
A DSP is already onboard the Tegra X1 for sound. It uses an older Cortex-A9 for that, and it's incredibly small at 20 or 16nm. Nvidia's design already has native tessellators and does PhysX, so if anything they'd just add more programmable CUDA cores to the die rather than waste space on dedicated silicon.

That's pretty cool.
 
If the Switch is using 20nm fab chips (due to wafer contracts etc.), do you think that once the contracts run out Nintendo and Nvidia would make the rest of the chips on 16nm, since it seems like it would be more expensive to continue using the larger fab process?
 
It's everyone's job, not just Nintendo's. Yes, Nintendo has to give the best hardware and tools available to third parties. But the publishers have to put effort in and not treat a Nintendo game or port as a red-headed stepchild. Finally, gamers have to support good content when it's released.
nope.

It's business. If Nintendo doesn't do a good job creating opportunities for 3rd parties to make money, they won't come.

Who wants to lose money?

It's not a game, it's about making or losing millions of dollars.

If only a bit of money can be made, then don't expect a big effort. Also, Wii U games had lots of trouble because Nintendo didn't give any good documentation and issues had to be sent to Japan first. Creating Wii U games was a living hell (thanks to Nintendo).
 
nope.

It's business. If Nintendo doesn't do a good job creating opportunities for 3rd parties to make money, they won't come.

Who wants to lose money?

It's not a game, it's about making or losing millions of dollars.

If only a bit of money can be made, then don't expect a big effort. Also, Wii U games had lots of trouble because Nintendo didn't give any good documentation and issues had to be sent to Japan first. Creating Wii U games was a living hell (thanks to Nintendo).

Wasn't creating PS3 games an even bigger hell? Yet developers stuck with that console when it didn't sell well during its first years.
 
Wasn't creating PS3 games an even bigger hell? Yet developers stuck with that console when it didn't sell well during its first years.
Sony did a better job than Nintendo at giving support, updating documentation, implementing new tools, giving insight into the future of the console, sending technical experts, etc.

Sony kept them on board. Sony also needed to re-introduce the PlayStation 3 (rebrand) to finally win.

But it was Sony, not the 3rd-party devs, who was willing to throw away money... devs need good promises, help, and a bright future. Sony gave exactly that.
 
Thanks for sharing that. Is there a place where you can publicly get more info about the onboard chips in the TX1?

Nvidia released a whitepaper on it.

PDF Warning
http://international.download.nvidia.com/pdf/tegra/Tegra-X1-whitepaper-v1.0.pdf

The major wildcard is what Nintendo would have added, removed, or rearranged to suit their needs. They need a USB 3.1 chip to support DisplayPort over USB-C and quick charging, which I bet is the reason the Switch supports 5-15V at 2.6A on its AC adapter. Then you have to wonder about embedded memory, additional CUDA cores, etc. The spec leaks we have from the July devkits are clearly based on the Jetson TX1, but the October ones are likely final hardware. Hopefully we get some new information soon.
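
(For what it's worth, the top of that range works out to 15 V × 2.6 A ≈ 39 W, far more than a mobile SoC alone would draw, so most of that headroom would be for charging the battery while running docked.)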
 
Can you summarize it?

This isn't news. He put together that the July devkit spec leaks were based on the Jetson TX1, which you could have learned from the first few pages of this thread quite a while ago. He also comments on the October kits being more powerful overall, which is just repeating Laura Dale's information.
 
Buuuh, thanks for the clarification. Why so mad? It's hard to keep up with the thread and all the rumors.

Anyways... thanks for your input.
 
Sony did a better job than Nintendo at giving support, updating documentation, implementing new tools, giving insight into the future of the console, sending technical experts, etc.

Sony kept them on board. Sony also needed to re-introduce the PlayStation 3 (rebrand) to finally win.

But it was Sony, not the 3rd-party devs, who was willing to throw away money... devs need good promises, help, and a bright future. Sony gave exactly that.
Third parties came to the Wii (just not in the way you expected), third parties came to the DS, third parties even came to the 3DS (Japanese ones). Will third parties come to the Switch? 😆

Keep in mind, no matter how good the documentation is, no matter how good the tools are, hell, it doesn't even matter how much the Switch sells: in many cases games won't be ported due to graphics limitations. There is only so much you can do before it looks like a completely different game, or before you have to redesign it to fit the Switch's handheld capabilities.
 
Yeah, thanks, but I doubt I would be able to understand most of their code. I don't have a lot of programming experience yet. But from what I hear, the main reason Saturn emulators (the open-source ones, of course) have advanced so slowly over the years is a lack of documentation.

Yeah, the suggestion was a bit tongue-in-cheek, although with well-commented code you can potentially still learn quite a bit without much programming experience. I have no idea how well commented it is, though.

I can certainly see lack of documentation being a problem for emulating a system like the Saturn. I know for the N64, for example, there is some documentation out in the wild (although it's mid-90s era Nintendo documentation, so less than ideal), which would be a big help compared to doing it blind.

If the Switch is using 20nm fab chips (due to wafer contracts etc.), do you think that once the contracts run out Nintendo and Nvidia would make the rest of the chips on 16nm, since it seems like it would be more expensive to continue using the larger fab process?

They'd move to a new node when it becomes financially advantageous for them to do so, and/or when a new form factor requires it (e.g. a Switch mini with a smaller battery and no active cooling), just as they (and Sony and MS) have done in the past. I kind of suspect they're actually using a 28nm process at this point, though, and 28nm is unlikely to go away for a while.
 
Yeah, the suggestion was a bit tongue-in-cheek, although with well-commented code you can potentially still learn quite a bit without much programming experience. I have no idea how well commented it is, though.

I can certainly see lack of documentation being a problem for emulating a system like the Saturn. I know for the N64, for example, there is some documentation out in the wild (although it's mid-90s era Nintendo documentation, so less than ideal), which would be a big help compared to doing it blind.



They'd move to a new node when it becomes financially advantageous for them to do so, and/or when a new form factor requires it (e.g. a Switch mini with a smaller battery and no active cooling), just as they (and Sony and MS) have done in the past. I kind of suspect they're actually using a 28nm process at this point, though, and 28nm is unlikely to go away for a while.

Nvidia: So we've got this cool Tegra line of processor, how can we customise it for you?
Nintendo: Well you know how we fuck up every piece of hardware we make...
Nvidia: Say no more.
 
Buuuh, thanks for the clarification. Why so mad? It's hard to keep up with the thread and all the rumors.

Anyways... thanks for your input.

I'm not mad (if you're referring to me); this was posted almost 12 hours ago and is still on the current page because thread activity is slow, since there haven't been any new rumours.

The guy in the video said he was just speculating, but he isn't aware that Pascal is only more power-efficient than Maxwell (they're not significantly different architectures) and that it cannot run better than Maxwell at lower clock speeds.

The rumours have slowed down since Eurogamer mentioned the clock speeds. You can expect that to pick up again when Nvidia announces their new products at CES this week, with people trying to come up with something that doesn't make the Switch weaker than a TX1, when it's possible the Switch could be.
 
Nvidia: So we've got this cool Tegra line of processor, how can we customise it for you?
Nintendo: Well you know how we fuck up every piece of hardware we make...
Nvidia: Say no more.

The more recent 28nm processes (say TSMC's 28HPC+) are pretty damn close to matching 20nm in terms of performance and power efficiency while being a hell of a lot cheaper. There's a good reason that mid-range mobile SoCs like the Snapdragon 652 are on 28nm instead of 20nm.
 
They'd move to a new node when it becomes financially advantageous for them to do so, and/or when a new form factor requires it (e.g. a Switch mini with a smaller battery and no active cooling), just as they (and Sony and MS) have done in the past. I kind of suspect they're actually using a 28nm process at this point, though, and 28nm is unlikely to go away for a while.

Nvidia: So we've got this cool Tegra line of processor, how can we customise it for you?
Nintendo: Well you know how we fuck up every piece of hardware we make...
Nvidia: Say no more.

lol

Edit:

The more recent 28nm processes (say TSMC's 28HPC+) are pretty damn close to matching 20nm in terms of performance and power efficiency while being a hell of a lot cheaper. There's a good reason that mid-range mobile SoCs like the Snapdragon 652 are on 28nm instead of 20nm.

Well, I suggested 28nm as a possibility too but was called foolish for it, even though I gave my reasons regarding the thermals, the fan, and the currently rumoured clock speeds.

But anyway, that 28HPC+ is interesting to hear about.
 
The more recent 28nm processes (say TSMC's 28HPC+) are pretty damn close to matching 20nm in terms of performance and power efficiency while being a hell of a lot cheaper. There's a good reason that mid-range mobile SoCs like the Snapdragon 652 are on 28nm instead of 20nm.

How close are we talking though? Will those cost savings be worth potentially handicapping the system?
 
How close are we talking though? Will those cost savings be worth potentially handicapping the system?

I am reading it:

http://www.tsmc.com/english/dedicatedFoundry/technology/28nm.htm

The 28LP process boasts a 20 percent speed improvement over the 40LP process at the same leakage/gate.

Compared with TSMC’s 28LP, 28HPC provides 10% smaller die size and more than 30% power reduction at all levels of speed.

For 20nm:

TSMC's 20nm process technology can provide 30 percent higher speed, 1.9 times the density, or 25 percent less power than its 28nm technology.

The comparisons concern me, so maybe ignore the 28LP then... apparently 28HPC should be close to 20nm in comparison, as Thraktor says?

We do know that 20nm was skipped because the performance gains weren't worth it.
 
The more recent 28nm processes (say TSMC's 28HPC+) are pretty damn close to matching 20nm in terms of performance and power efficiency while being a hell of a lot cheaper. There's a good reason that mid-range mobile SoCs like the Snapdragon 652 are on 28nm instead of 20nm.

If they left the Tegra X1 as is on 28nm, it would be nearly 230mm^2. Seems like it would have a hugely adverse impact on yields. If this is the case, I'd guess they'd be cutting an SM at 28nm, halving our current performance estimates.
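
(For reference, that figure is presumably just naive area scaling: (28/20)^2 ≈ 1.96, so the TX1's roughly 120mm^2 die at 20nm balloons to about 230mm^2 if nothing gets cut.)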
 
If they left the Tegra X1 as is on 28nm, it would be nearly 230mm^2. Seems like it would have a hugely adverse impact on yields. If this is the case, I'd guess they'd be cutting an SM at 28nm, halving our current performance estimates.

If the performance was halved, wouldn't that essentially make the system like a Wii U+ at best?
 
They'd move to a new node when it becomes financially advantageous for them to do so, and/or when a new form factor requires it (e.g. a Switch mini with a smaller battery and no active cooling), just as they (and Sony and MS) have done in the past. I kind of suspect they're actually using a 28nm process at this point, though, and 28nm is unlikely to go away for a while.

Doesn't the mention of a Tegra X1 as the devkit sorta indicate that the devkits at least were 20nm? I guess they could be using a slightly downclocked 20nm devkit to approximate 28nm final hardware?
 
I don't understand why Pascal was such a big deal for months and months and suddenly it's just the same as Maxwell... :/

Pascal indicated a 16nm process, but that's not exclusive to Pascal. The 16nm process is the real big deal, and the fact that it could be Maxwell on 16nm changes nothing.

However if it's 28nm then we were all way off anyway.
 
I don't understand why Pascal was such a big deal for months and months and suddenly it's just the same as Maxwell... :/

They are very similar. Pascal would have allowed for higher clock speeds at the same efficiency, or the same clocks at higher efficiency. Performance per MHz is roughly the same.
 
I don't understand why Pascal was such a big deal for months and months and suddenly it's just the same as Maxwell... :/

It's more so the fab node that was a big deal than the architectures, since they're similar.

16nm meant better power efficiency compared to 20nm.

There's no 16nm Maxwell, only Pascal.

We were told by way of rumour that Pascal was going to happen and apparently things changed.

Eurogamer mentions the GPU is Maxwell with Pascal features, apparently, but 2nd-gen Maxwell was pretty close to Pascal in features as well.

NateDrake also says the final or most recent dev-kits sent out are using Maxwell.

Maybe Nintendo will still use a 16nm fab node, but it doesn't tie in with the leaked clock speeds of the dev-kits, unless Nintendo wanted to make this run below 4W in portable mode.

Who knows.
 
Yep. Much lower performance while portable. Transplanting the 20nm Tegra X1 design to 28nm would increase the die size by 90%.

I know Nintendo would be shitty enough to do this :'(

From what we saw of BOTW on Jimmy Fallon, would the handheld performance have been possible on hardware much weaker than the Wii U in portable mode?

From what I have been following, it honestly seems to me like we are getting a slightly custom X1. I actually think they are probably going to stay on a 20nm process. It would be strange to me if they took the fully functional 20nm design that everything was based on and just went 28nm, when you know they will do a shrink at some point anyway.
 
I know Nintendo would be shitty enough to do this :'(

From what we saw of BOTW on Jimmy Fallon, would the handheld performance have been possible on hardware much weaker than the Wii U in portable mode?

From what I have been following, it honestly seems to me like we are getting a slightly custom X1. I actually think they are probably going to stay on a 20nm process. It would be strange to me if they took the fully functional 20nm design that everything was based on and just went 28nm, when you know they will do a shrink at some point anyway.

If Nintendo has taught us anything, it should be to take your modest expectations and reduce them by 50%.
 
Because they could earn even more? I don't understand this logic.

It's fairly simple.

The opportunity cost of developing a game is non-zero.

Publishers are businesses; if they think the opportunity cost is too high compared to the gains, they are not going to risk the losses involved in development, or if they do, they will moderate their expenditure on that title to reduce the opportunity cost.

To do otherwise would be a quick path to bankruptcy.
 
If Nintendo has taught us anything, it should be to take your modest expectations and reduce them by 50%.

I don't have high expectations anymore. The low clocks killed that. Right now I'm just interested in seeing what they reveal at the event. That should give a good idea of what we're looking at capability-wise. I guess my only expectation is literally that we are getting an X1 with a few modifications, maybe for bandwidth, at the Eurogamer clocks. I don't think we are getting any process changes. I think it's going to be 20nm. I don't think they will go 16nm, and I don't think they will produce it on 28nm when it will "eventually" get shrunk in a revision anyway.
 