Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Which isn't different from the PS4 Pro and Scorpio situation right now, or PC gaming in general. By all accounts it's not half as difficult during development as you're implying.

It's still a closed system with two output specs rather than one.

Well, it isn't if you just make a game for the Switch from the ground up or port it from the Wii U.

But the workflow for a game that operates from an Xbox One baseline of around 1.3TF with 8GB RAM moving to a system with something like 0.7TF to 0.5TF with 4GB RAM is already quite the challenge.
 
Well, it isn't if you just make a game for the Switch from the ground up or port it from the Wii U.

But the workflow for a game that operates from an Xbox One baseline of around 1.3TF with 8GB RAM moving to a system with something like 0.7TF to 0.5TF with 4GB RAM is already quite the challenge.
No, it's still not. Have you ever gamed on a PC at all? The average spec difference can be a lot larger than that while playing the same game, and that's without much optimisation. Rise of the Tomb Raider had an Xbox 360 port, and most major yearly franchises had last-gen versions until recently. It's really not a big deal at all. CPU and memory bandwidth would be the biggest problems, but even they can be circumvented to a small degree.
 
People are seriously getting their hopes up for native resolutions higher than 1080p? Have you even met Nintendo? This is the problem when the internet's starved for news: expectations move in the opposite direction of reality.
 
Think Switch will be made on Samsung's 14nm node or TSMC's 16nm node? Is there even a big difference between them? I know Nvidia goes with Samsung for its laptop GPUs.
 
4GB of RAM is insane at this point.

They need to use 6GB or 8GB. The Shield TV uses 3GB of LPDDR4, so they could just double it and call it a day.

If devs have to cut content to port then it's a bad idea. At least this way things will port over easily.

720p is fine. 900p/1080p on a TV is fine too since ordinary people struggle to tell the difference between 1080p and 720p anyways.

Not too worried about the FLOPS either at 720p. It's the RAM that's an issue for porting.

The difference in RAM is not going to make or break any potential port. We had cross-gen games despite an order of magnitude difference in RAM.
 
The difference in RAM is not going to make or break any potential port. We had cross-gen games despite an order of magnitude difference in RAM.

We don't talk much about the Wii U's RAM amount, but that's because the CPU was an even bigger problem.
It's sad, but we know developers are lazy, and at this point every little bit of extra work they have to do weighs a hundred times heavier against Nintendo. So half the RAM (and slower) is a problem, and we don't know where the NX OS fits into this picture.


Regarding the clockspeed/battery in handheld mode: on second thought, I think it's better not to enable this by default.
Let developers proactively enable the downclock if they intend to optimize their game to save battery, even if they should be aware that battery life will be horrible if they don't.
 
We don't talk much about the Wii U's RAM amount, but that's because the CPU was an even bigger problem.
It's sad, but we know developers are lazy, and at this point every little bit of extra work they have to do weighs a hundred times heavier against Nintendo. So half the RAM (and slower) is a problem, and we don't know where the NX OS fits into this picture.


Regarding the clockspeed/battery in handheld mode: on second thought, I think it's better not to enable this by default.
Let developers proactively enable the downclock if they intend to optimize their game to save battery, even if they should be aware that battery life will be horrible if they don't.
Ugh.
 
People are seriously getting their hopes up for native resolutions higher than 1080p? Have you even met Nintendo? This is the problem when the internet's starved for news: expectations move in the opposite direction of reality.

I'm sure there will be a 4K HDR screen on the unit itself.
 
Why have we gone off the 'X2' (not a real name, I know)/Pascal idea recently, then? No chance of Nintendo using a more efficient custom chip?
 
People are seriously getting their hopes up for native resolutions higher than 1080p? Have you even met Nintendo? This is the problem when the internet's starved for news: expectations move in the opposite direction of reality.

It's like pulling teeth with some of you.

No sane person expects resolutions beyond 1080p.

These instances where people like Pdot/the3sickness take a single insane post and act like literally everyone is expecting Scorpio in a tablet are beyond tiring, as are these "this is Nintendo / temper your expectations" reminders.
 
Is anyone claiming that the Switch's overall performance will be better than XB1 or PS4? I don't know why you keep bringing that up. No one is expecting that.

Just having a CPU which performs better than Jaguar is not a big feat, and it won't make that big of a difference. The GPU of the Switch will still be weaker than the XB1's and PS4's, and it will have less and slower RAM. So it will still be weaker overall.

Two people have argued the Switch can be better than the Xbox One because the XB1 can't do FP16 calcs, it can only do FP32. Even then, they both said it will be a game-by-game thing. I don't agree, but this is your FYI.
 
It's like pulling teeth with some of you.

No sane person expects resolutions beyond 1080p.

These instances where people like Pdot/the3sickness take a single insane post and act like everyone is expecting Scorpio in a tablet are beyond tiring, as are these "this is Nintendo / temper your expectations" reminders.

I'm simply commenting based on posts I've read in this thread. If you're not one of these insane people then sit back and relax -- my comment wasn't aimed at you.
 

Too lazy to even comment? J/K Matt.

I do not think it is laziness, just that a dev team is a business, and in lots of cases this kind of work does not give a good enough ROI to invest in... see the paltry PS4 Pro updates in some titles.

OK, poor choice of words on my part, sorry about that.

Actually, it's like Pana' said; I meant extra work -> extra costs -> they choose not to port.
 
4GB of RAM is insane at this point.

They need to use 6GB or 8GB. The Shield TV uses 3GB of LPDDR4, so they could just double it and call it a day.

If devs have to cut content to port then it's a bad idea. At least this way things will port over easily.

720p is fine. 900p/1080p on a TV is fine too since ordinary people struggle to tell the difference between 1080p and 720p anyways.

Not too worried about the FLOPS either at 720p. It's the RAM that's an issue for porting.

This is getting tiresome, and it only takes like two minutes on YouTube to see that memory size is not a problem at all: any modern PC game runs fine on a decently capable 2GB GPU as long as you don't hit very high/ultra settings, and I don't think anyone expects those kinds of settings on Switch.

Memory bandwidth, on the other hand, is much more important, but if the Switch SoC can get a 128-bit bus at 50GB/s like Parker, the bandwidth-per-GFLOPS ratio would be way above the usual GTX 1000 series ratio; there is no need at all to hit PS4/XB1 raw bandwidth numbers for the power envelope Switch is targeting.
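To put rough numbers on that ratio claim, here's a back-of-the-envelope sketch. The Switch figures are this thread's rumors (768 GFLOPS docked, Parker-like 50 GB/s), not confirmed specs; the GTX numbers are stock desktop specs.

```python
# Bandwidth-per-FLOP comparison. Switch entries are rumored, not confirmed.
chips = {
    "Switch docked (rumored)": (768, 50),   # (GFLOPS FP32, GB/s)
    "GTX 1060": (4372, 192),
    "GTX 1050": (1862, 112),
}

for name, (gflops, gbps) in chips.items():
    # GB/s per GFLOPS reduces to bytes of bandwidth per FP32 FLOP.
    print(f"{name}: {gbps / gflops:.3f} bytes/FLOP")
# Switch ~0.065, GTX 1060 ~0.044, GTX 1050 ~0.060
```

By this toy measure the rumored chip would indeed have proportionally more bandwidth per FLOP than typical desktop Pascal cards.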
 
No, it's still not. Have you ever gamed on a PC at all? The average spec difference can be a lot larger than that while playing the same game, and that's without much optimisation. Rise of the Tomb Raider had an Xbox 360 port, and most major yearly franchises had last-gen versions until recently. It's really not a big deal at all. CPU and memory bandwidth would be the biggest problems, but even they can be circumvented to a small degree.

And they were running on last gen engines.

The power gap is an important factor.
 
And they were running on last gen engines.

The power gap is an important factor.

Honestly, it's more likely that CPU bottlenecks become an issue than GPU power. Effects can be turned down, but if your game can't run on the CPU, you have bigger issues.

If the CPU rumors are true, then there's no reason to think that modern games can't come to the Switch with settings turned down.
 
I believe that will heavily depend on how powerful the Switch is in portable mode relative to "dock mode". The Switch in portable mode is capped at 720p, for example, so the extra power in dock mode may only be designed to boost the resolution.

I see what you are trying to get at, though: if developers were told that the system had a certain amount of power, but Nintendo later clarified that that power can only be maintained or reached in "dock mode", that would suck. I doubt it happened that way, though. From what it sounds like, the performance of the system does improve in dock mode, but Nintendo may not have even told devs what was boosted. Perhaps they are trying to prevent some of the issues that came up with the PS4 Pro. *shrug*

If done well, that can be managed, I think. Yes, technically it's downgrading in portable mode, but there are at least a couple of ways to handle that:

- home console 1080p, portable 720p - done solely through downclocking the GPU/not using all shaders etc. The CPU would still need to run at the same speed in both modes to maintain a common framerate

- home console 60fps, portable 30fps - done through downclocking the CPU/disabling cores.

While plenty of GAF might hate the second option, it does at least provide a bit more granularity in terms of power consumption. Going from 1080p/60 docked to 720p/30 handheld could be a huge power saving and still provide a massive jump over any other handheld.
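For a sense of why downclocking pays off so much, here's a toy dynamic-power model (P roughly proportional to f·V²); the clock and voltage scales below are purely illustrative, not leaked figures.

```python
# Toy CMOS dynamic-power model: P ~ C * f * V^2. Lower clocks usually
# permit lower voltage, so downclocking saves more than linearly.
def relative_power(freq_scale: float, volt_scale: float) -> float:
    return freq_scale * volt_scale ** 2

docked = relative_power(1.0, 1.0)
# Hypothetical handheld profile: half the clock at ~85% of the voltage.
handheld = relative_power(0.5, 0.85)
print(f"handheld power ~{handheld / docked:.0%} of docked")  # ~36%
```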
 
If done well, that can be managed, I think. Yes, technically it's downgrading in portable mode, but there are at least a couple of ways to handle that:

- home console 1080p, portable 720p - done solely through downclocking the GPU/not using all shaders etc. The CPU would still need to run at the same speed in both modes to maintain a common framerate

- home console 60fps, portable 30fps - done through downclocking the CPU/disabling cores.

While plenty of GAF might hate the second option, it does at least provide a bit more granularity in terms of power consumption. Going from 1080p/60 docked to 720p/30 handheld could be a huge power saving and still provide a massive jump over any other handheld.
Scaling the CPU might give developers bigger headaches than scaling the GPU, which makes it the lesser option of the two.

Also, the framerate isn't always directly dependent on the CPU clock.
 
A 720p to 1080p jump would require more than twice the upclock, all other things being the same.
That's not how it works. You don't need to increase performance by 2.25x to go from 720p to 1080p (2.25x increase in resolution) at the same frame rate.
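The arithmetic behind both posts: 1080p is 2.25x the pixels of 720p, but only the resolution-dependent part of the frame cost scales with that factor. The cost split below is made up for illustration.

```python
# 1080p pushes 2.25x the pixels of 720p...
pixels_720p = 1280 * 720     # 921,600
pixels_1080p = 1920 * 1080   # 2,073,600
print(pixels_1080p / pixels_720p)  # 2.25

# ...but vertex work, CPU-side submission, etc. don't scale with
# resolution, so the required GPU uplift is usually well under 2.25x.
fill_rate_share = 0.6  # hypothetical fraction of frame cost tied to pixels
uplift = (1 - fill_rate_share) + fill_rate_share * 2.25
print(f"{uplift:.2f}x")  # ~1.75x under this made-up split
```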
 
And they were running on last gen engines.

The power gap is an important factor.

It's not an important factor at all. There is nothing in terms of modern shaders and graphical techniques that the XB1 and PS4 can do and the Switch cannot; it's purely CPU, memory bandwidth and GPU power. Considering modern engines are more scalable than last-gen engines, there won't be any issues at all. Porting will be more than doable; the question is what the final performance ends up at and what sacrifices get made, but the porting itself won't be a problem.
 
Hey guys, sorry if this is a stupid question, but why is there so much focus on hardware advancement and seemingly less on development innovation? Like, for example, is it possible to get double the performance and graphics simply from really intelligent coding? If this is possible, wouldn't it make more sense to invest in better coding techniques than hardware?
 
Hey guys, sorry if this is a stupid question, but why is there so much focus on hardware advancement and seemingly less on development innovation? Like, for example, is it possible to get double the performance and graphics simply from really intelligent coding? If this is possible, wouldn't it make more sense to invest in better coding techniques than hardware?
You don't think this happens?
 
Hey guys, sorry if this is a stupid question, but why is there so much focus on hardware advancement and seemingly less on development innovation? Like, for example, is it possible to get double the performance and graphics simply from really intelligent coding? If this is possible, wouldn't it make more sense to invest in better coding techniques than hardware?

It's called code optimisation and low-level APIs.
 
So the max technical ability of the Switch being thrown around is basically its limit? There is no possible way, for example, to double this again with intelligent coding? Is it more a matter of physics? Just wondering how far coding can really take things. Sorry if it's a dumb query. I'm learning :)
 
I don't know why some people are quibbling over definitions like "full power" and "overclocked" in different modes. There is no such thing as "overclocked" in a custom SoC; the top speed it runs at is its normal clock speed, obviously.

At the same time, unlike most consoles, the limit here isn't the maximum they can manage to run at as a console; the limit comes from how far they can push it in mobile mode. Mobile mode is the baseline, with the system then up-clocking to whatever speed they need to increase resolution for TV output. That doesn't mean anyone is trying to claim it can "run above Nvidia's specs by overclocking in docked mode" or whatever other nonsense. There are no "Nvidia's specs" for this chip anyway; again, it's a custom SoC. It's simply the fact that, if mobile mode weren't there, it could presumably be clocked even higher as a pure home console.
 
So the max technical ability of the Switch being thrown around is basically its limit? There is no possible way, for example, to double this again with intelligent coding? Is it more a matter of physics? Just wondering how far coding can really take things. Sorry if it's a dumb query. I'm learning :)

What do you mean by max technical ability? We don't have any specs for the device at all.

Did these specs get any further corroboration?

No, they aren't directly related to Switch at all, though the performance they provide may not be too far off from what Switch will provide.
 
Did these specs get any further corroboration?
No, but this is the ballpark we can expect.
So..... It's true that the Switch should be able to handle Breath of the Wild better than Wii U, right?....... Right?
Yes, but it won't run like in those videos of the Wii U version. You won't get bad encoding and fucked-up gamma on your TV.
I do think Switch can perform better for certain, heavily optimized games, than the XB1
It really can't. Maybe in these particular circumstances (heavily optimized UE4 and in-house games) it can be closer than what the paper specs say, but performing better? No way.
 
If the SCD is being used to do, say, increased graphics processing akin to an external GPU: there is nothing in the design of the Switch that indicates it has a PCIe port or NVLink equivalent to connect an external GPU to the Switch.

I don't even see how a new dock would work when the Switch has only one USB-C port, which has to connect to the TV via an HDMI-to-USB-C cable to do video output.

Not that I think such an SCD will happen, but recall that the SCD patent did mention wireless connections between the console and SCD for assisting with graphical processing. So it's possible, I guess, but again, that's very unlikely.

Two people have argued the Switch can be better than the Xbox One because the XB1 can't do FP16 calcs, it can only do FP32. Even then, they both said it will be a game-by-game thing. I don't agree, but this is your FYI.

Ha, I think I've been one of those people. I do think Switch can perform better for certain, heavily optimized games, than the XB1, but overall it will be a weaker machine.

Did these specs get any further corroboration?

No, they are essentially a guess about what is in the devkit, based only on Eurogamer reporting that the devkit contains a Tegra X1. That means the specs in the OP are for a standard Tegra X1, which Nvidia has 100% confirmed is not the final hardware in the Switch. Some insiders have said it's fairly similar to what we should expect, though.

So..... It's true that the Switch should be able to handle Breath of the Wild better than Wii U, right?....... Right?

Yes, definitely (assuming the TX1 rumors as a baseline are true). The Wii U is very, very outdated at this point. It's a miracle they have a game like that running on it, to be honest.

EDIT:
It really can't. Maybe in these particular circumstances (heavily optimized UE4 and in-house games) it can be closer than what the paper specs say, but performing better? No way.

If we wind up with the best-case-scenario SoC (768 GFLOPS docked) and a CPU better than what's in the OP, then heavy use of FP16 in a UE4 game, for instance (like 1/3 of the code in FP16), should get real-world performance at least close to XB1 levels. Depending on how much better Nvidia's dev software works (if "Nvidia flops > AMD flops" holds true for consoles, which is unknown as of now), that might push it over XB1 levels, again for a hypothetical, highly optimized game. Especially if that game relies heavily on CPU functions.

But that's clearly all very hypothetical and in general this shouldn't reach XB1 levels for most (if not all) games.
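For what it's worth, the "1/3 of the code in FP16" argument works out like this, assuming FP16 executes at twice the FP32 rate (as on Tegra X1) and treating the 768 GFLOPS figure as the hypothetical it is:

```python
# Effective-throughput sketch for mixed FP16/FP32 shader work.
fp32_gflops = 768      # rumored best-case docked figure, not confirmed
fp16_fraction = 1 / 3  # hypothetical share of work moved to FP16

# Work moved to FP16 takes half the time if it runs at double rate.
relative_time = (1 - fp16_fraction) + fp16_fraction / 2
effective = fp32_gflops / relative_time
print(f"~{effective:.0f} GFLOPS FP32-equivalent")
# ~922 GFLOPS -- closer to, but still under, XB1's ~1300
```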
 
4GB of RAM is insane at this point.

They need to use 6GB or 8GB. The Shield TV uses 3GB of LPDDR4, so they could just double it and call it a day.

If devs have to cut content to port then it's a bad idea. At least this way things will port over easily.

720p is fine. 900p/1080p on a TV is fine too since ordinary people struggle to tell the difference between 1080p and 720p anyways.

Not too worried about the FLOPS either at 720p. It's the RAM that's an issue for porting.

Are we having this argument again? Yes, devs will have to cut content, but not as much as you seem to think. The PS4 and X1 have 8GB of RAM, but only about 5GB is freed for games, while the Switch is rumored to have 3.2GB freed for games. Devs can easily cut back some texture quality or draw distance to make up for the missing 2GB. It's not like they don't already allow scalability for things like mobile and PC. Not only that, but the scale-down in resolution for the 720p screen also helps. Hell, at 720p or 900p, they may not even have to scale anything back.
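To illustrate why texture cuts close a ~2GB gap quickly: each halving of texture dimensions quarters the memory footprint. The sizes below are illustrative uncompressed RGBA8 figures; real games use block compression, which shrinks everything but keeps the same ratios.

```python
# Memory footprint of a square RGBA8 texture at various resolutions.
def texture_mb(size: int, bytes_per_pixel: int = 4) -> float:
    return size * size * bytes_per_pixel / 1024 ** 2

for size in (4096, 2048, 1024):
    print(f"{size}x{size}: {texture_mb(size):.0f} MB")
# 4096x4096: 64 MB -> 2048x2048: 16 MB -> 1024x1024: 4 MB
```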
 
Just a general question, really. Just wondering, for example, if it's possible for Nintendo to double the power of cheaper, less powerful hardware simply through more innovative, breakthrough coding. With both their progressive and evolving development combined with their hardware architecture know-how, you'd think this would be possible, but perhaps it really does mainly come down to hardware and less to coding when talking about graphical capability.
 
So the max technical ability of the Switch being thrown around is basically its limit? There is no possible way, for example, to double this again with intelligent coding? Is it more a matter of physics? Just wondering how far coding can really take things. Sorry if it's a dumb query. I'm learning :)

For relevance.

 
So the max technical ability of the Switch being thrown around is basically its limit? There is no possible way, for example, to double this again with intelligent coding? Is it more a matter of physics? Just wondering how far coding can really take things. Sorry if it's a dumb query. I'm learning :)
Hardware and software will both make a difference, but that is always the case. It's not like they could take hardware with half the power of the Xbox One and program it to get results as good as the Xbox One, because a programmer skilled enough to do that would be able to get more out of the Xbox One too.
 
Just a general question, really. Just wondering, for example, if it's possible for Nintendo to double the power of cheaper, less powerful hardware simply through more innovative, breakthrough coding. With both their progressive and evolving development combined with their hardware architecture know-how, you'd think this would be possible, but perhaps it really does mainly come down to hardware and less to coding when talking about graphical capability.

What you are describing (improving software to get more out of hardware) is something that everybody has been doing since the dawn of computing. Software is constantly improving to get more and more out of hardware, but it's at a stage where all of these companies have become very efficient at it, and it would be very difficult to come out with some new breakthrough API that somehow doubles performance over the competition.

Now, Nvidia typically does provide better tools and software for their hardware than AMD does, which is why Nvidia chips often perform better than similarly specced AMD chips in a PC environment, so the Switch might have that type of software-side advantage, though in a much less extreme way than you seem to want (i.e. double the power).
 
I don't think Switch's Nvidia hardware will have that much of an advantage over AMD hardware if they were similarly specced and coded to the bone.
 
Hey guys, sorry if this is a stupid question, but why is there so much focus on hardware advancement and seemingly less on development innovation? Like, for example, is it possible to get double the performance and graphics simply from really intelligent coding? If this is possible, wouldn't it make more sense to invest in better coding techniques than hardware?

This happens constantly, on every level, from APIs to shaders to Vulkan to physics engines etc. Look at how Uncharted evolved from the first game on PS3 to get an idea of how hardware gets put to better use by the same team once they know the system to the core and the APIs and libraries are mature:
 
Just a general question, really. Just wondering, for example, if it's possible for Nintendo to double the power of cheaper, less powerful hardware simply through more innovative, breakthrough coding. With both their progressive and evolving development combined with their hardware architecture know-how, you'd think this would be possible, but perhaps it really does mainly come down to hardware and less to coding when talking about graphical capability.

To put it plainly, no. There is no special coding Nintendo can use to get more out of the system that any other dev can't do. There are certainly tricks you can use to get better-looking and better-performing games, but there is no Nintendo voodoo.
 
I don't think Switch's Nvidia hardware will have that much of an advantage over AMD hardware if they were similarly specced and coded to the bone.

We know that there are two clear advantages on Nvidia's side when it comes to Tegra: tile-based rasterization and FP16 performance. How well these will be used by the tools on Switch remains to be seen. We know already that UE4 is capable of using both FP16 and FP32, but we need to see it in practice.
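A quick illustration of why only part of a shader can move to FP16: half precision carries only about three decimal digits of precision.

```python
import numpy as np

# Values near 1.0 (colors, lighting terms) survive the FP16 round-trip...
print(np.float16(0.7183))   # ~0.7183, rounding error only in the 4th digit
# ...but large-magnitude values (e.g. world-space positions) lose precision:
print(np.float16(4096.3))   # 4096.0 -- FP16 spacing at this magnitude is 4.0
```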
 
I don't think Switch's Nvidia hardware will have that much of an advantage over AMD hardware if they were similarly specced and coded to the bone.

Even in a console you still have APIs. I don't think we really have any prior example where an AMD GPU and an Nvidia GPU were fairly similarly specced (same order of magnitude) in competing consoles at the same time, so I don't really know if we can say for sure whether this will happen either way.

It's certainly an advantage on PCs -- that's pretty much been proven. But I think we'll need to hear from developers to see if there's some similar advantage in consoles.

EDIT: I'm speaking specifically about the whole Nvidia flops > AMD flops which exists in PCs.
 
I don't think Switch's Nvidia hardware will have that much of an advantage over AMD hardware if they were similarly specced and coded to the bone.

Nvidia uses a different rasterizer and has proprietary methods in place to get more out of its memory bandwidth, so in some ways it would perform better. AMD bet heavily on async compute, so in games that use it heavily, you'd see the advantage go to AMD. But I would guess more games would be able to take advantage of the former.
 
So the max technical ability of the Switch being thrown around is basically its limit? There is no possible way, for example, to double this again with intelligent coding? Is it more a matter of physics? Just wondering how far coding can really take things. Sorry if it's a dumb query. I'm learning :)
This is why late-generation console games tend to look better than early ones, and first-party games tend to look better than third-party ones.
 
Are we having this argument again? Yes, devs will have to cut content, but not as much as you seem to think. The PS4 and X1 have 8GB of RAM, but only about 5GB is freed for games, while the Switch is rumored to have 3.2GB freed for games. Devs can easily cut back some texture quality or draw distance to make up for the missing 2GB. It's not like they don't already allow scalability for things like mobile and PC. Not only that, but the scale-down in resolution for the 720p screen also helps. Hell, at 720p or 900p, they may not even have to scale anything back.

I'm more worried that the OS is gonna run like shit. An allocation of less than a gig of RAM for OS functions in 2017 worries the fuck out of me.
 
Just a general question, really. Just wondering, for example, if it's possible for Nintendo to double the power of cheaper, less powerful hardware simply through more innovative, breakthrough coding. With both their progressive and evolving development combined with their hardware architecture know-how, you'd think this would be possible, but perhaps it really does mainly come down to hardware and less to coding when talking about graphical capability.

Quite simply: no, that's not how it works. Anyone capable of coding so efficiently that you'd "double the power" (which isn't really a thing) would be able to do so for the other consoles as well. The hardware is still unknown spec-wise, but we know it's not some special snowflake that'll trump better-specced consoles thanks to some crazy coding.

But we still don't know what the orange port on the back does, so it might have some special "hidden power" there....
 
Hehe, little does unsuspecting GAF know that Nvidia actually created a supercharged API for the Switch, allowing for seamless 4K 60 FPS gaming.....in handheld mode!
 