Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Go back to the "can it do 4k?" part of the thread. Be prepared to cringe.

I simply asked whether, if the dock dramatically increased performance or SCDs were used in conjunction with the Switch, Wii U assets at 4K would be possible, after personally calculating that an Nvidia GPU of around 1.5 TFLOPS would be enough for 4K/30fps.

I got my answer and we all moved on :)

On the current topic, I'd be delighted with a 5 hour battery life while under load.
 
Based on the live Zelda demo on Fallon, playing on the Switch itself, we can at least conclude that it matches or somewhat exceeds the performance of the Wii U while docked. When they went portable it didn't look significantly downgraded... pretty good imo.
 
GameCube is 15-year-old hardware.

There is no reason to believe that a device that operates possibly around 350Gflops in mobile mode couldn't emulate it.

You realize that the Dolphin emulator requires really beefy hardware (on the CPU side) to work, right?

Or are you implying that emulation is a simple task across different architectures?
 
You realize that the Dolphin emulator requires really beefy hardware (on the CPU side) to work, right?

Or are you implying that emulation is a simple task across different architectures?

Well, first of all, the CPU will be at full clock speed in handheld mode, so it's not a weak CPU. Also, there are two big differences between a third party creating a PC emulator and a first party creating a console emulator.

The developers of Dolphin have to make the emulator work on lots of different hardware profiles and also don't know everything about the systems they're emulating.

Nintendo only have to make things work on a single fixed system, one they had a big hand in designing, and they're emulating a system they know inside and out.
 
The nice thing about the whole concept of the Switch is that, when docked, you can get that 40% increased performance at the same power draw (10W) and when undocked you can get the 60% increased power efficiency to bring it down to 4W (this is all based on 10W for 512GFlops for TX1).

This is why ensuring that the final product uses the most power efficient process node is so important. It means both more battery life AND more power when fully clocked.
You mean Pascal's concept? O_o

And this power/energy performance is separate from downclocking and upclocking, right? This will be interesting. Though what if what Laura Kate Dale says about increased performance when docked actually refers to this?
 
You mean Pascal's concept? O_o

And this power/energy performance is separate from downclocking and upclocking, right? This will be interesting. Though what if what Laura Kate Dale says about increased performance when docked actually refers to this?

I'll quote my post from the other thread-

Yes. The "40% more performance" or "60% more efficiency" are two ways of saying the same thing- Pascal chips get more performance per watt. This means, if you're keeping the exact same configuration of the TX1 but with Pascal/16nm architecture, the 10W for 512Gflops of the TX1 translates to either 4W for 512Gflops or 10W for 768Gflops for the Pascal chip. In theory anyway.

Meaning, due to the nature of the Switch, Nintendo can downclock to the 4W mode when undocked, and upclock back to the 10W mode when docked to take advantage of both the "40%" and the "60%"

This is how I understand it. Pascal/16nm basically means that you have the same processing capability using 60% less power.

So if the Pascal GPU is clocked at 1GHz (which is what the TX1 GPU is) you're getting the same exact performance as the TX1- 512GFlops, but at 60% less power draw, aka 4W.

On the other side, you can clock the GPU higher (I don't know the exact number, but I think it's around 1.5GHz) to reach a performance of 768GFlops but the power draw remains 10W. EDIT: Actually wouldn't 40% more be 716GFlops? I guess I don't know what the real numbers here would be as it's all theoretical.

This is all based on the fact that Pascal architecture enables the GPU to perform better per watt, meaning it performs better with less power supplied. So that 40% and 60% thing is just the same way of saying more performance per watt- and you get to choose how you take advantage of that simply by choosing a clock speed.
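As a quick sanity check on those numbers, here's a minimal sketch, assuming the quoted 40%/60% marketing figures (and the 10W for 512GFlops TX1 baseline) can be applied directly, which is a big if:

# Rough check of the Pascal-vs-TX1 figures quoted above.
# All inputs are this thread's quoted numbers, not measured values.
tx1_gflops = 512.0   # TX1 GPU at ~1 GHz
tx1_watts = 10.0     # commonly quoted TX1 SoC power figure

# "60% better efficiency" read as: same performance at 40% of the power
undocked_watts = tx1_watts * (1 - 0.60)        # 4.0 W for 512 GFLOPS

# "40% more performance" read as: same power, 40% higher throughput
docked_gflops = tx1_gflops * 1.40              # ~716.8 GFLOPS at 10 W

print(undocked_watts, round(docked_gflops, 1)) # 4.0 716.8

So under that reading, 40% more would indeed land around 716GFlops rather than 768, as the edit above suspects.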

Edit:
Oh I get that. I was a bit surprised you can switch between both interchangeably.

Yeah it's as simple as changing the clock speed, which is exactly what happens when docking/undocking.
 
The nice thing about the whole concept of the Switch is that, when docked, you can get that 40% increased performance at the same power draw (10W) and when undocked you can get the 60% increased power efficiency to bring it down to 4W (this is all based on 10W for 512GFlops for TX1).

This is why ensuring that the final product uses the most power efficient process node is so important. It means both more battery life AND more power when fully clocked.

I just want to note that the X1 uses well over 10W in real-world usage. It's closer to 12-15W.
 
I need a lay person translator. :-)

So with the info and speculation currently available, sounds like 768 gflops in docked mode is at the very upper end of what we can reasonably expect. So then what's at the lower end (while not docked) of what we can reasonably expect? 300 gflops?

Because that's the only number that matters when we're trying to assess what this thing can do. Because whatever it can do, it has to be able to do it in 720p while not docked.

My other question: when assessing Switch VS Wii-U, wouldn't the flops be an apples-to-oranges comparison anyway, because the Switch GPU will have a lot more features than the Wii-U one has? Wouldn't that mean that if the flop counts were identical, the Switch game would still look or run better?

Thanks in advance.
 
Ultimately, with the Switch, I'm operating under my rule of thumb to err on the side of caution. Throwing around speculation is all well and good, but Nintendo proved quite blatantly with the Wii, 3DS, and Wii U that they take an extremely conservative approach to hardware ceilings. And in all three cases (notably with the Wii U) there was a lot of pre-full-reveal optimistic speculation regarding the size of the case, what could fit inside, modern hardware, etc. And it turned out that Nintendo went cheap as fuck.

Not to disagree with the general gist of your post, but the bolded isn't really true at all for the Wii U. There's a reason it's still so expensive. Even aside from the GamePad and associated tech, Nintendo made two pretty expensive decisions when designing Wii U:

1. Native backwards compatibility with Wii
2. Large eDRAM pool on-die with the GPU

The former meant they had to use an IBM CPU, which meant they couldn't use a single-die SoC, and instead had to go with a much more expensive MCM. The latter (which was likely driven by BC as well, given the need to replicate the latency of Wii's 1T-SRAM) substantially increased the GPU die size, and prevented them from reducing costs by means of die shrink.

Nintendo spent a lot of money on making sure the Wii U was backwards compatible. Had they ignored BC they could have got more performance at lower cost with a simple ARM+AMD SoC.

GameCube is 15-year-old hardware.

There is no reason to believe that a device that operates possibly around 350Gflops in mobile mode couldn't emulate it.

GPU performance is largely irrelevant for emulation. It's mostly a factor of CPU performance (and not generally parallelizable across more than a couple of cores).

Except it does. Simple algebra.
ΔE = P * Δt
Voltage of Li-Ion / Li-Po cells is a fixed material property: 3.3~3.7V depending on cell chemistry. So the commonly quoted mAh capacities convert nicely to Wh capacities by multiplying by 3.5, give or take.
A 4000mAh battery at 3.5V is a 14Wh battery. If you want 3 hours out of a 14Wh battery, make sure to draw no more than 4.7W "give or take a few".
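Here's that same calculation in code form, a minimal sketch where the 4000mAh / 3.5V / 3-hour figures are placeholders, not leaked specs:

# E = P * t, rearranged to find the sustainable draw for a target runtime.
capacity_mah = 4000.0
cell_voltage = 3.5                  # typical Li-Ion / Li-Po nominal voltage

energy_wh = capacity_mah / 1000.0 * cell_voltage   # 14.0 Wh

target_hours = 3.0
max_draw_w = energy_wh / target_hours              # ~4.67 W average system draw

print(energy_wh, round(max_draw_w, 2))             # 14.0 4.67

Swap in whatever battery size and power budget you believe and the same two lines give you the runtime, or vice versa.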

As soon as we agree what power dissipation headroom a device the size of the Switch might reasonably afford, we can fill in the variables and calculate.

Sidenote: Wii U is 34W, with the major offenders in terms of power draw made on a 28nm process. Switch is 16/14nm, yes?

The algebra is relatively simple, but it doesn't give us the battery size without knowing the system power draw, and it doesn't give us the power draw without knowing the battery size, so without any hard information on either it's not really giving us any new info on one or the other.

You can find 256+ GB storage that is cheap, or tiny, but not both yet.

Embedded UFS is available at 256GB size, and likely no more expensive than physically larger SSDs. It would be too expensive for a base model (unless Nintendo wants to bump up the price by about $50 to cover it), but it would be plausible for a more expensive model.
 
Here's an interesting video..

Earlier this week Eurogamer "leaked" VC games on the Switch, possibly including GC games. Well, just recently Digital Foundry/Eurogamer made a comparison video of GameCube and Wii games vs Dolphin-emulated games on the X1. Some of the GameCube games running on Dolphin on the X1 at 480p were pretty close to hitting 60fps (around 50fps)..

Perhaps it gives us a glimpse of what to expect.. Something like Pascal/X2 could certainly help with matching GameCube framerates.
https://www.youtube.com/watch?v=UGdHM8AIhvw

I hope they are able to up the resolution too.
 
GPU performance is largely irrelevant for emulation. It's mostly a factor of CPU performance (and not generally parallelizable across more than a couple of cores).
GameCube emulation is a great feat IMO. Do you think the games are being modded to offload the calculations from PPC instructions? Or do you think it will parse the instructions natively?
I know you probably don't have the answer, heh, just wondering if the Switch could handle raw instructions and processing dynamically.
 
Based on the gameplay on Jimmy Fallon, do these specs seem like they could be correct?

I don't think that gameplay on Fallon really says anything, but the rumor of 5-8 hours of battery life essentially tells us this has to be Pascal based, so the specs in the OP (as many of us have been saying through this whole thread) are definitely incorrect.

And again, the specs in the OP were never presented as a leak- of the devkit or of the actual hardware. The Twitter user just posted the specs of an off-the-shelf Jetson TX1, as previous rumors said TX1s were used in the NX devkits.
 
GameCube emulation is a great feat IMO. Do you think the games are being modded to offload the calculations from PPC instructions? Or do you think it will parse the instructions natively?
I know you probably don't have the answer, heh, just wondering if the Switch could handle raw instructions and processing dynamically.

Depends on what CPU it ends up having and the games themselves.

The Shield TV can do some games pretty well.

https://www.youtube.com/watch?v=aZeBB1i6rtk

But if you want to play something like this on a tablet, good luck.

https://www.youtube.com/watch?v=SErjEgELy14


So it should be possible, assuming the Switch has a better processor than the Shield, and Nintendo is able to make an emulator at least as fast as Dolphin on Android (probably doable).
 
Isn't the 19-20W number for the entire device, while the 10W is for the SoC?

That's the entire Shield TV system: ports, storage, power supply losses, wireless communication, etc.

http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/9

None of that stuff is going to use 10W combined. 5W for everything else sounds right, or else the Switch has an issue, because it has to deal with all of that as well, plus a screen. You're not expecting the Switch to use nearly 20W on the go, are you?
 
None of that stuff is going to use 10W combined. 5W for everything else sounds right, or else the Switch has an issue, because it has to deal with all of that as well, plus a screen. You're not expecting the Switch to use nearly 20W on the go, are you?

Doesn't the Shield TV have a hard drive? At least one version does, right? Which version was measured at 19W in that article?

I'd imagine the hard drive (if present) uses quite a bit of that 9W. If the Shield TV without the hard drive is what hits 19W then yeah, I don't know what we can say about the SoC.

Is there no specific info about the TX1 SoC by itself, outside of the Shield (maybe in the Pixel C?)
 
I don't see any reason why Nintendo couldn't run any GameCube or Wii games at 1080p with their original framerate on Switch. Judging the hardware's potential by completely unoptimised games running on a fan-made emulator on top of a non-Nintendo OS is foolish.

I imagine the improved resolution and need for optimisation is the reason they are apparently only releasing a few at a time (the leak mentioned three GameCube games).
 
None of that stuff is going to use 10W combined. 5W for everything else sounds right, or else the Switch has an issue, because it has to deal with all of that as well, plus a screen. You're not expecting the Switch to use nearly 20W on the go, are you?

"Sounds right" based on?

That's also on the 20nm half node.

You also ignored power supply efficiency, that's 19W at the wall, say an 80% efficient power supply, that's then 15W total peak power draw while gaming for the entire system...5W "sounds right" for everything else you say?
Because that would mean a 10W SoC.


Also, I have no clue how you think I was saying the Switch would draw 20W on the go. What I'm saying would lead to the opposite of raising the Switch's hypothetical power draw.
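To make that arithmetic explicit, here's a minimal sketch; the 80% adapter efficiency and the 5W for "everything else" are exactly the assumptions under debate, not known values:

# Working backwards from the Shield TV's measured wall draw to an implied SoC budget.
wall_draw_w = 19.0       # AnandTech's gaming load, measured at the wall
psu_efficiency = 0.80    # assumed AC adapter efficiency

dc_power_w = wall_draw_w * psu_efficiency          # ~15.2 W delivered to the board

other_components_w = 5.0                           # RAM, storage, ports, wireless, fan (assumed)
implied_soc_w = dc_power_w - other_components_w    # ~10.2 W left for the SoC

print(round(dc_power_w, 1), round(implied_soc_w, 1))   # 15.2 10.2

Change either assumption and the implied SoC number moves with it, which is the whole disagreement here.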
 
I was thinking about that last night. Nintendo is the only one out of the three that doesn't up the native resolution for their VC games. They really should do 720p and 1080p this time..
Nintendo actually did it first, with N64 on Wii VC being 480p (still 480p on Wii U, btw). Next was MS with Xbox Originals being 720p on 360, and finally now Sony with PS2 Classics being 1080p on PS4. I believe those are the only cases of improved native rendering among the three, although hackers discovered the DS emulator on Wii U VC has an option to disable up-rendering.
 
"Sounds right" based on?

That's also on the 20nm half node.

You also ignored power supply efficiency, that's 19W at the wall, say an 80% efficient power supply, that's then 15W total peak power draw while gaming for the entire system...5W "sounds right" for everything else you say?
Because that would mean a 10W SoC.


Also, I have no clue how you think I was saying the Switch would draw 20W on the go. What I'm saying would lead to the opposite of raising the Switch's hypothetical power draw.

5W is actually on the high end for all of that, and 80% efficiency is on the low end for a low-power device. Really, I was being generous. Just think about it: if the RAM, storage, ports (which was reaching considering that none of them were in use at the time), wireless and fan use that much energy in your mind, and Switch has all of that plus a screen, isn't there a bit of an issue here?
 
I was thinking about that last night. Nintendo is the only one out of the three that doesn't up the native resolution for their VC games. They really should do 720p and 1080p this time..

Personally I'd much rather they stick to native resolution but dramatically improve the framerates, especially on N64 games. I know that would be a challenge since some games' timing is tied to framerate like Mario Kart: Double Dash, but still.
 
I was thinking about that last night. Nintendo is the only one out of the three that doesn't up the native resolution for their VC games. They really should do 720p and 1080p this time..

I'd be perfectly fine with 720p tbh. If they somehow reach for 1080p then that's even better but this is Nintendo after all. They could give us the original resolution and be done with it :P
 
5W is actually on the high end for all of that, and 80% efficiency is on the low end for a low-power device. Really, I was being generous. Just think about it: if the RAM, storage, ports (which was reaching considering that none of them were in use at the time), wireless and fan use that much energy in your mind, and Switch has all of that plus a screen, isn't there a bit of an issue here?

I haven't said anything about the Switch's power draw; my point is that your assertion that the Tegra X1 draws 15W needs backing up.

Nvidia says it draws "under 10W" in the Shield

https://shield.nvidia.com/blog/tegra-x1-processor-and-shield

Anandtech says the GPU draws 1.51W on sustained load, being the single biggest source of power draw in an SoC. You're suggesting the rest of the SoC, of which the GPU probably occupies at least half the space, consumes 10x what the GPU does.

http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3

Until you have proof the X1 actually drew 15W on average under load, it's guesswork based on a larger platform in the Shield TV.

Since you're bringing the Switch into that, again, the Shield TV was on 20nm, etc etc.
 
5W is actually on the high end for all of that, and 80% efficiency is on the low end for a low-power device. Really, I was being generous. Just think about it: if the RAM, storage, ports (which was reaching considering that none of them were in use at the time), wireless and fan use that much energy in your mind, and Switch has all of that plus a screen, isn't there a bit of an issue here?

How were none of them in use at the time? I might be misreading your post here, but it sounds like you just said RAM wasn't in use in a benchmark.
 
I don't see any reason why Nintendo couldn't run any GameCube or Wii games at 1080p with their original framerate on Switch. Judging the hardware's potential by completely unoptimised games running on a fan-made emulator on top of a non-Nintendo OS is foolish.

Aspect ratio. All GameCube games run in a 4:3 aspect ratio. If you changed them to 16:9 then you would either be stretching the original image to fit, meaning all bitmapped elements of the game (HUD etc.) would look incredibly distorted and blurry, or you risk unintended consequences of having a larger field of view. For example, Metroid Prime aggressively culls anything that is outside of the player's FOV, but (based on VR mods for Dolphin) this FOV cull value is hard-coded and cannot be changed without significant work (i.e. it doesn't just update when you change the resolution).

In order to fix this, and have the games play in what you might call "proper" 1080p, actual maintenance would have to be done to make them suitable. That is probably not cost effective for a Virtual Console title, and is more fitting of an "HD Remaster" job.
 
Aspect ratio. All GameCube games run in a 4:3 aspect ratio. If you changed them to 16:9 then you would either be stretching the original image to fit, meaning all bitmapped elements of the game (HUD etc.) would look incredibly distorted and blurry, or you risk unintended consequences of having a larger field of view. For example, Metroid Prime aggressively culls anything that is outside of the player's FOV, but (based on VR mods for Dolphin) this FOV cull value is hard-coded and cannot be changed without significant work (i.e. it doesn't just update when you change the resolution).

In order to fix this, and have the games play in what you might call "proper" 1080p, actual maintenance would have to be done to make them suitable. That is probably not cost effective for a Virtual Console title, and is more fitting of an "HD Remaster" job.

So upres it while maintaining the aspect ratio.
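To put numbers on that (just geometry, nothing Switch-specific, a minimal sketch assuming a standard 1920x1080 output):

# Rendering a 4:3 game on a 16:9 1080p display without stretching:
# scale the image to full height and pillarbox the sides.
screen_w, screen_h = 1920, 1080
aspect_w, aspect_h = 4, 3

content_w = screen_h * aspect_w // aspect_h   # 1440 px wide at the full 1080 px height
bar_w = (screen_w - content_w) // 2           # 240 px black bar on each side

print(content_w, bar_w)                       # 1440 240

So a "1080p" GameCube VC title would most plausibly mean a 1440x1080 image with 240px bars on each side.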
 
GPU performance is largely irrelevant for emulation. It's mostly a factor of CPU performance (and not generally parallelizable across more than a couple of cores).

It showcases the big performance gap between the two devices.

And even in the worst case Nintendo could still provide GameCube VC games through game by game optimized emulation.
 
I'm curious..

Unlikely hypothetical situation: although it's supposedly a custom chip, if the Switch ends up using something similar to Tegra X2/Pascal with 1.5 TFLOPS FP16 / 0.75 TFLOPS FP32 (averaging about, err, 1.12 TFLOPS on paper), a CPU that's like 25% better than the Xbox One's (LCGeek rumor from way back when), and let's say 6-8GB of RAM..

How would it match up head to head vs the Xbone and PS4 in framerates and resolution?

A strong CPU is integral for framerate stability especially, while the GPU matters more for resolution, right? I haven't even accounted for bandwidth, and we can't really factor in the architectural differences, with the Nvidia chip being more modern.
 
I'm curious..

Unlikely hypothetical situation: although it's supposedly a custom chip, if the Switch ends up using something similar to Tegra X2/Pascal with 1.5 TFLOPS FP16 / 0.75 TFLOPS FP32 (averaging about, err, 1.12 TFLOPS on paper), a CPU that's like 25% better than the Xbox One's (LCGeek rumor from way back when), and let's say 6-8GB of RAM..

How would it match up head to head vs the Xbone and PS4 in framerates and resolution?

A strong CPU is integral for framerate stability especially, while the GPU matters more for resolution, right? I haven't even accounted for bandwidth, and we can't really factor in the architectural differences, with the Nvidia chip being more modern.

It doesn't work like that. It depends on whether the devs take advantage of fp16. Some engines, like UE4, are designed to use fp16, while others may not even use it, since it doesn't have the same benefits on the other consoles. If the CPU is more powerful than the other consoles', that can be used to move some tasks away from the less powerful GPU. Physics and AI could even surpass the other consoles.

Whatever specs the GPU ends up with, ports should still be runnable without severe optimization, but the system will greatly benefit from devs taking full advantage of its feature set. It will likely not reach the XB1's raw power, but it will be able to render some impressive visuals.
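As a rough illustration of why a straight "average" of the two peak numbers is misleading, here's a minimal sketch using the hypothetical 0.75/1.5 TFLOPS figures from the quote above (not real Switch specs): effective throughput depends on how much of a given workload can actually run at fp16.

# Effective throughput when only a fraction of the math can use double-rate fp16.
fp32_tflops = 0.75
fp16_tflops = 1.50   # Pascal-style double-rate fp16 (hypothetical figure from the post)

def effective_tflops(fp16_fraction):
    # Time per unit of work: the fp16 share runs at the fast rate, the rest at fp32 rate.
    time = fp16_fraction / fp16_tflops + (1 - fp16_fraction) / fp32_tflops
    return 1.0 / time

print(round(effective_tflops(0.0), 2))   # 0.75 - engine ignores fp16 entirely
print(round(effective_tflops(0.5), 2))   # 1.0  - half the math in fp16
print(round(effective_tflops(1.0), 2))   # 1.5  - everything in fp16 (unrealistic)

So the paper "1.12 TFLOPS average" isn't a number any real workload would see; what matters is how fp16-friendly each engine is.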
 
I know it likely won't. Realistically speaking, I'm expecting 500 GFLOPS min docked to 1 TFLOP max, with 4GB of RAM, as Emily Rogers has said since October.. Just wondering how it would fare against the other two in a hypothetical scenario.
 
It doesn't work like that. It depends on whether the devs take advantage of fp16. Some engines, like UE4, are designed to use fp16, while others may not even use it, since it doesn't have the same benefits on the other consoles. If the CPU is more powerful than the other consoles', that can be used to move some tasks away from the less powerful GPU. Physics and AI could even surpass the other consoles.

Whatever specs the GPU ends up with, ports should still be runnable without severe optimization, but the system will greatly benefit from devs taking full advantage of its feature set. It will likely not reach the XB1's raw power, but it will be able to render some impressive visuals.

That was something I was thinking about recently. If those tasks can be moved away from the GPU, it not only makes them run better overall, it should make the system punch above its proverbial weight.
 
Aspect ratio. All GameCube games run in a 4:3 aspect ratio. If you changed them to 16:9 then you would either be stretching the original image to fit, meaning all bitmapped elements of the game (HUD etc.) would look incredibly distorted and blurry, or you risk unintended consequences of having a larger field of view. For example, Metroid Prime aggressively culls anything that is outside of the player's FOV, but (based on VR mods for Dolphin) this FOV cull value is hard-coded and cannot be changed without significant work (i.e. it doesn't just update when you change the resolution).

In order to fix this, and have the games play in what you might call "proper" 1080p, actual maintenance would have to be done to make them suitable. That is probably not cost effective for a Virtual Console title, and is more fitting of an "HD Remaster" job.
Wouldn't the textures also need work? I think that's another point to consider.
 
Aspect ratio. All GameCube games run in a 4:3 aspect ratio. If you changed them to 16:9 then you would either be stretching the original image to fit, meaning all bitmapped elements of the game (HUD etc.) would look incredibly distorted and blurry, or you risk unintended consequences of having a larger field of view. For example, Metroid Prime aggressively culls anything that is outside of the player's FOV, but (based on VR mods for Dolphin) this FOV cull value is hard-coded and cannot be changed without significant work (i.e. it doesn't just update when you change the resolution).

In order to fix this, and have the games play in what you might call "proper" 1080p, actual maintenance would have to be done to make them suitable. That is probably not cost effective for a Virtual Console title, and is more fitting of an "HD Remaster" job.

I'd imagine any GameCube VC titles would have to be in 4:3 with black bars at the sides.
 