Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

It has active cooling when docked, going by the leaks and the vents on the device.

This doesn't make any sense. Shield TV is capable of rendering at higher resolutions. Pixel C is capable of rendering at higher resolutions. Those statements go against most of the known real-world use cases of recent Tegra chips.

So is the rumored device. It's just that if you want PS4/XBOne games at similar framerates, that's the compromise.

If he's legit, I'm thinking he's talking about a specific scenario (a game?), where the effective performance they're getting is 1/3 of the XBOne. Deferred rendering or some other high-bandwidth-dependent scenario could have that effect, while other shading scenarios would offer different compromises.
 
Some poster at the Anandtech forums (zlatan), who apparently has some reputation over there, is saying some interesting stuff about the Switch that seems feasible:

  • It's based on Pascal.
  • Memory speed is the weakest link, capping its performance at 1/3 of the XBOne.
  • The Tegra X1 is 1/8 of the XBOne, so the Switch is considerably more powerful.
  • Most (third party?) games will run at 540p, which is the minimum resolution allowed by Nintendo.
  • No up-clocking when docked, but that would be easy to implement if Nintendo chooses to.
  • The dock has a dedicated upscaling chip.
All of this sounds pretty reasonable to me, except it would be the first Nintendo console since the N64 to be bottlenecked by its memory setup, which sounds pretty anti-Nintendo to me. Maybe this device is more Nvidia than Nintendo?

The 540p is a downer; that's why some of us wanted that resolution for the device's screen in the first place!

Tales out of my ass Part deux.

Someone else posted this already.

Huge grain of salt:
Pascal based.
One-third as powerful as the XB1, mainly due to being greatly memory-bandwidth starved.
504p is the minimum resolution required by Nintendo and what most games will use:

https://forums.anandtech.com/threads...#post-38530757

Nonsense.
 
He seems to think that memory bandwidth caps the power of the Tegra X1 at 1/8th-1/10th of the Bone, according to his post, and the Switch chip at 1/3rd of the Bone despite the rest of the hardware being better. Which, quite frankly, sounds ridiculous.

He's basically dividing the total combined bandwidth of the Xbox One's main memory + eSRAM (201GB/s) by the main memory bandwidth of the X1 (25GB/s) to come up with 1/8th. That's silly enough as it is (counting eSRAM bandwidth while ignoring the fact that every GPU has some embedded memory which eases main memory access), but he's then mixing up the X1 (Maxwell GPU) with Parker (Pascal GPU). The latter is faster and has much more memory bandwidth. In short, he doesn't seem to have a clue.
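To make the criticism concrete, the division being attributed to him can be written out with the figures quoted in this post (201GB/s combined XB1, 25GB/s TX1). Treating this bandwidth ratio as a performance cap is exactly the flawed step:

```python
# Sketch of the arithmetic criticized above, using the post's own figures.
# A peak-bandwidth ratio is not a performance ratio; this only shows
# where the "1/8th" number comes from.
XB1_COMBINED = 201.0  # GB/s, XB1 DDR3 + eSRAM peaks added together
TX1_LPDDR4 = 25.0     # GB/s, Tegra X1 main memory (~25.6 GB/s actual)

ratio = XB1_COMBINED / TX1_LPDDR4
print(f"Naive bandwidth ratio: {ratio:.2f}x")  # ~8x, hence "1/8th of the Bone"
```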
 
He seems to think that memory bandwidth caps the power of the Tegra X1 at 1/8th-1/10th of the Bone, according to his post, and the Switch chip at 1/3rd of the Bone despite the rest of the hardware being better. Which, quite frankly, sounds ridiculous.

As oversimplified as Gflops are as a means of comparing console performance, comparing a Maxwell/Pascal GPU to a GCN GPU based on memory bandwidth is straight-up crazy, given the vastly different ways in which they access memory, and the potential impact a sizable L2/L3 cache could have on bandwidth consumption. Even if this person does have insider info on Switch, I'd be extremely hesitant to read too much into any analysis derived from memory bandwidth.

I wouldn't be surprised, though, to see some of the more intensive third party games run at sub-HD resolutions when in handheld mode (ie the games which are running at 720p on XBO), but I don't expect that to be much of an issue given the screen size. If people are happy to play 720p games on a 60" screen I can't see them having that much of a problem with a 540p game on a 6" screen.

It appears the air inlets for the actual console are in the lower back, pretty much exactly where the dock has a massive indentation. It's never shown in the teaser, but I assume the dock also has vents where the indentation is, which would suggest a fan in the actual unit that, while docked, draws air through those vents in the dock, through the air inlets in the actual unit, and out via the vent at the top.

Yeah, that looks to be the way it operates. I'll be really interested to see the inevitable teardowns on this.
 
Some poster at the Anandtech forums (zlatan), who apparently has some reputation over there, is saying some interesting stuff about the Switch that seems feasible:

  • It's based on Pascal.
  • Memory speed is the weakest link, capping its performance at 1/3 of the XBOne.
  • The Tegra X1 is 1/8 of the XBOne, so the Switch is considerably more powerful.
  • Most (third party?) games will run at 540p, which is the minimum resolution allowed by Nintendo.
  • No up-clocking when docked, but that would be easy to implement if Nintendo chooses to.
  • The dock has a dedicated upscaling chip.
All of this sounds pretty reasonable to me, except it would be the first Nintendo console since the N64 to be bottlenecked by its memory setup, which sounds pretty anti-Nintendo to me. Maybe this device is more Nvidia than Nintendo?

The 540p is a downer; that's why some of us wanted that resolution for the device's screen in the first place!


There's a lot that was going on with Nvidia: they had a wafer agreement to buy a certain number of Tegra dies, no one was buying Tegra, they tried making Shields to get rid of some, and nowhere near enough sold... It was all the perfect storm for Nintendo to take advantage of.

So to the bolded...I wouldn't be surprised if it's a more standard solution than usual for Nintendo. It seems a whole lot like an updated Shield Portable conceptually.

I do hope there's better news on the bandwidth story though. Pascal Tegra does support separate VRAM.
 
Around 440 GFLOPS for the GPU sounds pretty credible to me (FP32, of course).
896x504 is 451k pixels, which is around a third of the 1600x900 resolution (1.44 million pixels).
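The "around a third" claim above is plain pixel arithmetic and checks out:

```python
# Pixel-count arithmetic behind the "around a third" claim above.
portable = 896 * 504   # 451,584 pixels ("504p")
target = 1600 * 900    # 1,440,000 pixels (900p)
print(portable, target, portable / target)  # ratio is roughly one third
```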
 
So is the rumored device. It's just that if you want PS4/XBOne games at similar framerates, that's the compromise.

If he's legit, I'm thinking he's talking about a specific scenario (a game?), where the effective performance they're getting is 1/3 of the XBOne. Deferred rendering or some other high-bandwidth-dependent scenario could have that effect, while other shading scenarios would offer different compromises.


zlatan said:
I'm not familiar with NV's planned product line. I can say that this is an SoC with a Pascal GPU. It's fast compared to other mobile products, but nowhere near as good as the Xbox One SoC. But hey, it consumes a lot less power. :) It is a very good product for the mobile market.

Tales from his ass. He has no idea about Tegra Parker.

Edit2: I think these are useful and might help keep the discussion away from dumb assumptions:

[image: NVIDIA-Tegra-Parker-SOC_Specs-1.png]
 
It appears the air inlets for the actual console are in the lower back, pretty much exactly where the dock has a massive indentation. It's never shown in the teaser, but I assume the dock also has vents where the indentation is, which would suggest a fan in the actual unit that, while docked, draws air through those vents in the dock, through the air inlets in the actual unit, and out via the vent at the top.

Ah if that's the case it's a very elegant setup, as LordOfChaos said. Can't wait to get some more images of this device and dock!

Cool, which part of the video could you see the bottom in?

This is all proving to be surprisingly close to what I had envisioned when the first leaks of a dockable tablet were mentioned, hah. A tablet, with a pass-through heatsink but no fan, that a dock with a fan could blow through to make it faster.
 
I have to agree with bytesized here, for obvious reasons.

I mean... guys... this is Nintendo. You know surprises are never in a good direction with them.

You actually think there's a universe where Nintendo would show a worldwide reveal trailer with games that are way below what the machine can do, for reasons... and then suddenly, in 5 months, everything will be way better?

The sad truth is, I remember everyone in those speculation threads talking about what the minimum Tegra performance would be for that price range, etc. That wasn't counting on Nintendo's philosophy.

Which may have been from the start: "Having the Wii U in portable form is INCREDIBLE and people will be AMAZED by it. It's like 100x more advanced than a 3DS, for Christ's sake. So let's have that. Let's have a console that has the same graphics on TV and in portable mode, because that will be amazing. So if we have Wii U graphics in the portable, let's scale the TV mode to that for parity."

Or, of course, the console is super powerful, and showing 3 games (4 with Mario, though some seem to disagree) with exact Wii U graphics for the worldwide reveal may have been a sad mistake on their part. Not even counting the absolutely abysmal message it sent everyone: "Hey, this is the Wii U again."
 
Around 440 GFLOPS for the GPU sounds pretty credible to me (FP32, of course).
896x504 is 451k pixels, which is around a third of the 1600x900 resolution (1.44 million pixels).

How is the Google Pixel C touted as having around 1 TFLOP of power, but the Switch is weaker, with more RAM and a more modern iteration of the same chip?
 
So now, from the beginning of their existence about 23 years ago, Nvidia is batting 3 for 5 as far as game system chip design wins.

NV2 chip for Sega during 1995-96, intended for a new console to replace the Saturn
original XBOX GPU: NV2A (plus the MCPX)
PlayStation 3 GPU - RSX
Tegra 2 in early 3DS devkits
Nintendo Switch - custom Tegra SoC
 
How is the Google Pixel C touted as having around 1 TFLOP of power, but the Switch is weaker, with more RAM and a more modern iteration of the same chip?

That's half-precision FP16, so around 500 GFLOPS single precision. The above 440 FP32 would be around 880 FP16.

Nvidia started the whole mess of comparing FP16 numbers when we've traditionally used FP32, though new architectures can take advantage of half precision.
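The conversion in the post above is just a factor of two, since these Tegra GPUs can issue two packed FP16 operations per FP32 lane:

```python
# FP16 marketing numbers vs FP32, per the post above: these Tegra GPUs
# pack two FP16 ops into each FP32 operation, doubling the headline figure.
def fp16_from_fp32(fp32_gflops):
    return fp32_gflops * 2

print(fp16_from_fp32(512))  # Tegra X1: ~512 FP32 -> ~1024 FP16 ("1 TFLOP")
print(fp16_from_fp32(440))  # the rumored 440 FP32 -> 880 FP16
```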
 
Around 440 GFLOPS for the GPU sounds pretty credible to me (FP32, of course).
896x504 is 451k pixels, which is around a third of the 1600x900 resolution (1.44 million pixels).

That range of flops isn't incredible, no. It's just that the other info the guy comes out with is complete rubbish, quite obviously based on flawed logic (dividing certain bandwidth numbers linearly to get performance, and not knowing the difference between Pascal and Maxwell).
 
How is the Google Pixel C touted as having around 1 TFLOP of power, but the Switch is weaker, with more RAM and a more modern iteration of the same chip?
How is the rumor saying that the Switch is less powerful than the Pixel C? Hell, if accurate, it would be the most powerful handheld device on the market by a fair margin.
 
"504p"

So credible.

To be fair, 504p (896 x 504) is a true 16:9 resolution that is divisible by 8. Most mobile games run at a resolution far below their display resolution in order to save battery life and avoid overheating, sometimes at very odd numbers. The PlayStation Vita, for example, has a native resolution of 960 x 544, yet many notable titles (Uncharted, Gravity Rush, Soul Sacrifice, Assassin's Creed, P4G, and LBP, to name a few) run at 720 x 408. A few other games even run at 640 x 384. Many people wanted the Switch to have a 540p display for exactly this reason, as that was a much more realistic target for native-res games in portable mode, while also being easy to scale up to 1080p (a clean 2x in each dimension, i.e. 4x the pixels) for console mode.

I'm sure Nintendo's games will run at native res though, maybe not BotW, but Mario/Kart should.
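The resolution properties claimed above (504p being true 16:9 and divisible by 8, 540p scaling cleanly to 1080p) are easy to verify:

```python
# Quick checks on the resolution claims above: 896x504 is exact 16:9 and
# divisible by 8, and 540p upscales to 1080p by an integer factor of 2
# (so 4x the pixel count).
def exact_16_9(w, h):
    return w * 9 == h * 16

assert exact_16_9(896, 504) and 896 % 8 == 0 and 504 % 8 == 0
assert 1920 == 960 * 2 and 1080 == 540 * 2   # clean 2x upscale
assert 1920 * 1080 == 4 * (960 * 540)        # "multiply by 4" in pixels
print("resolution claims check out")
```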
 
There is nothing that guy says that stands out as overly credible. If it fits a mental bias, then sure, it may seem legit with no reason to question it.

Having even a hint of what the Tegra X1 and Parker can do would quickly reveal the guy is full of shit.
 
That's half-precision FP16, so around 500 GFLOPS single precision. The above 440 FP32 would be around 880 FP16.

Nvidia started the whole mess of comparing FP16 numbers when we've traditionally used FP32, though new architectures can take advantage of half precision.

Worth mentioning again, however, that Tegra can actually make good use of FP16, so it's not a useless number in that respect.
 
There is nothing that guy says that stands out as overly credible. If it fits a mental bias, then sure, it may seem legit with no reason to question it.

Having even a hint of what the Tegra X1 and Parker can do would quickly reveal the guy is full of shit.
On paper specs, anyone can say the Tegra X1 is near 1/3 of the XB1. Another thing is the improvement a Pascal version on 16nm could bring.
 
We really need to know if all the docking station does is up-clock the CPU and GPU and run the fans faster (outside of providing additional ports)... OR if it actually contains an additional GPU or some sort of extra processing that gains more performance beyond the standard clock-up.
I sort of expect a new dock to go on sale holiday 2018 that is an SCD (supplemental computing device) as Nintendo's mid-gen refresh. You know: "Upgrade your Switch without having to buy a whole new system."
 
I have to agree with bytesized here, for obvious reasons.

I mean... guys... this is Nintendo. You know surprises are never in a good direction with them.

You actually think there's a universe where Nintendo would show a worldwide reveal trailer with games that are way below what the machine can do, for reasons... and then suddenly, in 5 months, everything will be way better?

The sad truth is, I remember everyone in those speculation threads talking about what the minimum Tegra performance would be for that price range, etc. That wasn't counting on Nintendo's philosophy.

Which may have been from the start: "Having the Wii U in portable form is INCREDIBLE and people will be AMAZED by it. It's like 100x more advanced than a 3DS, for Christ's sake. So let's have that. Let's have a console that has the same graphics on TV and in portable mode, because that will be amazing. So if we have Wii U graphics in the portable, let's scale the TV mode to that for parity."

Or, of course, the console is super powerful, and showing 3 games (4 with Mario, though some seem to disagree) with exact Wii U graphics for the worldwide reveal may have been a sad mistake on their part. Not even counting the absolutely abysmal message it sent everyone: "Hey, this is the Wii U again."

my head hurts
 
I have to agree with bytesized here, for obvious reasons.

I mean... guys... this is Nintendo. You know surprises are never in a good direction with them.
While I'd normally tend to agree, this thing was developed by a very different team. This was an NTD joint, whereas previous handhelds have been designed in Japan. On top of that, even the current NTD isn't what it used to be. A lot of the old guard left a couple years back, including their former lead architect and their head of system architecture. The new boss of NTD is a former Nvidia guy, and he brought some talent with him. Folks who worked on Tegra, and the iPod before that. The project lead on NX is a former Qualcomm lead engineer.

It's obviously no guarantee, but certainly encouraging. These folks know mobile. I don't know how much weight "this is Nintendo" has given those circumstances.
 
Not that I believe it (why not leak it sooner?), but 440 GFLOPS is still nearly 3x more powerful than the Wii U. That would make for some nice-looking Nintendo games.
 
So I can't understand console power unless we compare it to DBZ power levels. Is my understanding of the switch about right in DBZ terms?
PC = Pissed off SSJ2? Trunks
Scorpio = SSJ God Goku
PS Pro = SSJ God Vegeta
PS4 = SSJ3 Goku
XBONE = SSJ2 Gohan
Switch = SSJ Goku
Wii U = Krillin

Fixed as of latest Super eps :D
 

To add to that, it seems like the dock may blow air in at the bottom here, and it goes out through the top. I believe Emily that it's actively cooled; my question is whether the fan is in the dock or the tablet.

[image: CvQDl4ZUMAAztKJ.jpg]
Leaving the fan in the dock would be pretty elegant, imo: air passes through a relatively large heatsink, while there's no moving part in the expensive and battery-bound tablet.
 
Active cooling is assuredly happening. The X1 and Parker, while rather efficient for what they are, still haven't reached the passive-cooling stage. The only way it's passively cooled is with a pretty sizeable drop in performance when undocked.
 
There is nothing that guy says that stands out as overly credible. If it fits a mental bias then sure it may seem legit and no reason to question it.

Having even a hint of what Tegra X1 and Parker can do would quickly reveal the guy is full of shit

Well, he's indeed saying stuff that makes sense to me, like a Tegra (any Tegra, or any mobile chip for that matter) being bandwidth-limited compared to the consoles, which has a negative effect in rendering scenarios that require high bandwidth, and that happens to describe many console games.

Now, I would be very happy if Nintendo included a pool of faster VRAM so that this wouldn't be an issue! But I'm kind of skeptical about a handheld approaching XBOne GPU power at this point in time.
 
Active cooling is assuredly happening. The X1 and Parker, while rather efficient for what they are, still haven't reached the passive-cooling stage. The only way it's passively cooled is with a pretty sizeable drop in performance when undocked.

It depends entirely on the clock speeds. A Pascal 2 SM GPU at around 600-700MHz should be quite doable with passive cooling. In fact the TX1 was passively cooled in the Pixel C at around those clock speeds, although in a more favourable cooling environment than Switch (ie a large aluminium chassis to dissipate heat).

But with the vents it definitely looks like there's active cooling in there.
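As a rough sanity check on the clock figures above, FP32 throughput scales linearly with clock: cores x 2 ops per FMA x clock. Assuming the hypothetical 2-SM Pascal configuration from the post (256 CUDA cores, 128 per SM):

```python
# Rough FP32 throughput for a hypothetical 2-SM Pascal GPU (256 CUDA
# cores assumed, 128 per Pascal SM): cores x 2 ops (FMA) x clock in GHz.
CORES = 2 * 128

def gflops(clock_ghz):
    return CORES * 2 * clock_ghz

print(gflops(0.6))   # ~307 GFLOPS at the passive-cooling-friendly low end
print(gflops(0.7))   # ~358 GFLOPS
print(gflops(0.86))  # ~440 GFLOPS, the figure floated earlier in the thread
```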
 
For the ones who are theorizing about the dock for nothing, you can stop; the dock... is just a dock...

Nintendo: Switch dock only for charging and TV-out, Amiibo support, no comment on touch #1
Thanks to casey_contra

Originally Posted by casey_contra

From Nintendo themselves:

"The main function of the Nintendo Switch Dock is to provide an output to the TV, as well as charging and providing power to the system."


Source: http://www.ign.com/articles/2016/10/...ional-features

Maybe a non-junior wants to create a new thread? Or is this not worthy?

Nintendo continued: "The dock is not the main console unit of Nintendo Switch. The main unit of Nintendo Switch is the unit that has the LCD screen, which the two Joy-Con controllers can be attached to and detached from. The main function of the Nintendo Switch Dock is to provide an output to the TV, as well as charging and providing power to the system."

 
Active cooling is assuredly happening. The X1 and Parker, while rather efficient for what they are, still haven't reached the passive-cooling stage. The only way it's passively cooled is with a pretty sizeable drop in performance when undocked.

Yeah, my Pixel C isn't actively cooled. Drastically different form factor, but still.
 
Now, I would be very happy if Nintendo included a pool of faster VRAM so that this wouldn't be an issue! But I'm kind of skeptical about a handheld approaching XBOne GPU power at this point in time.

Are you talking about approaching Xbox One power on paper, or on screen? Because those are two very different things. If the Switch uses Pascal and 128-bit LPDDR4, it should have ~50GB/s of bandwidth, which is about 75% of the bandwidth of the Xbox One's main memory pool. The XB1 also has eSRAM, but we aren't familiar enough with the Switch SoC design to know if Nintendo has any on-die memory. Could it approach the Xbox One if Nintendo clocks it high enough while docked? I think the math says it could, but we'll see what the people making games for it end up saying.
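The ~50GB/s figure falls straight out of the bus math (LPDDR4-3200 on a 128-bit bus; the comparison uses the XB1's 68.3GB/s DDR3 pool):

```python
# Bus math behind the ~50GB/s figure above: LPDDR4-3200 on a 128-bit bus.
transfers_per_sec = 3200e6   # LPDDR4-3200 = 3200 MT/s
bus_bytes = 128 / 8          # 128-bit bus = 16 bytes per transfer
bw = transfers_per_sec * bus_bytes / 1e9
print(bw)                    # 51.2 GB/s
print(bw / 68.3)             # ~0.75 of the XB1's 68.3 GB/s DDR3 pool
```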
 
You can see exhaust vents on the top if you look closely enough. What I imagine they're doing is, as mentioned, underclocking the console whilst in portable mode, and running it at full speed, thus activating the active cooling, when docked. That way, they can maintain similar image quality from the assumed 720p mobile display to the 1080p found on home televisions.

Exactly. If Nintendo is serious about this being a home console first, 1080p in dock mode would be perfect. Time will tell.

Just a slight clarification on language, but I don't believe she's saying there's processing in the dock, it appears she's saying that there's extra performance while docked. It would definitely point to increased clock speeds while docked, as people have been speculating.

Ah, okay, gotcha. It really is a no-brainer to take this route, and I'm really happy to hear that the dock could aid with power in some manner.
 
Well, he's indeed saying stuff that makes sense to me, like a Tegra (any Tegra, or any mobile chip for that matter) being bandwidth-limited compared to the consoles, which has a negative effect in rendering scenarios that require high bandwidth, and that happens to describe many console games.

Now, I would be very happy if Nintendo included a pool of faster VRAM so that this wouldn't be an issue! But I'm kind of skeptical about a handheld approaching XBOne GPU power at this point in time.

http://www.anandtech.com/show/10596/hot-chips-2016-nvidia-discloses-tegra-parker-details
Outside of the CPU, NVIDIA has added some new features to Parker such as doubling memory bandwidth. For the longest time NVIDIA stuck with a 64-bit memory bus on what was essentially a tablet SoC lineup, which despite what you may think from the specs worked well enough for NVIDIA, presumably due to their experience in GPU designs, and as we've since learned, compression & tiling. Parker in turn finally moves to a 128-bit memory bus, doubling the aggregate memory bandwidth to 50GB/sec (which works out to roughly LPDDR4-3200).

http://www.realworldtech.com/tile-based-rasterization-nvidia-gpus/
Starting with the Maxwell architecture, Nvidia high-performance GPUs have borrowed techniques from low-power mobile graphics architectures. Specifically, Maxwell and Pascal use tile-based immediate-mode rasterizers that buffer pixel output, instead of conventional full-screen immediate-mode rasterizers. Using simple DirectX shaders, we demonstrate the tile-based rasterization in Nvidia’s Maxwell and Pascal GPUs and contrast this behavior to the immediate-mode rasterizer used by AMD.

Using tiled regions and buffering the rasterizer data on-die reduces the memory bandwidth for rendering, improving performance and power-efficiency. Consistent with this hypothesis, our testing shows that Nvidia GPUs change the tile size to ensure that the pixel output from rasterization fits within a fixed size on-chip buffer or cache.

Comparing just the memory bandwidth of an AMD APU with the memory bandwidth of an Nvidia SoC is not as relevant as it seems at first glance.
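A toy model of the effect described in the RWT excerpt: with overdraw, an immediate-mode renderer may write each covered pixel to DRAM several times, while a tiler buffers intermediate writes on-chip and resolves each pixel once. The numbers below are illustrative assumptions, not measurements:

```python
# Toy illustration (illustrative numbers, not measurements) of why on-chip
# tile buffering cuts DRAM traffic: a tiler resolves each pixel to DRAM
# once, while an immediate-mode renderer writes every overdrawn pixel.
PIXELS = 1280 * 720
BYTES_PER_PIXEL = 4   # RGBA8 color only, ignoring depth for simplicity
OVERDRAW = 3          # each pixel shaded ~3 times, an assumed figure

immediate = PIXELS * OVERDRAW * BYTES_PER_PIXEL  # every write hits DRAM
tiled = PIXELS * BYTES_PER_PIXEL                 # one resolve per pixel
print(immediate / 1e6, tiled / 1e6)              # MB/frame: ~11.1 vs ~3.7
```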
 
Yeah, my Pixel C isn't actively cooled. Drastically different form factor, but still.

But it most likely throttles. Not something you want from a dedicated gaming device.

Phones (and tablets, though less so) are generally optimized for very bursty tasks, not being placed under continuous load. Given the tiny enclosure and the power of the device, it seems prudent to actively cool it.
 
Because at the absolute low end of Tegra Maxwell with no active cooling (so downclocked), you still have a chip with over twice the performance of the Wii U.

Then you have the fact that the Switch is actively cooled, and therefore logically not downclocked. A quick look at Mario and Zelda doesn't tell you as much as basic facts about the kind of hardware Nintendo is using. There's no way that hardware is anything but significantly more powerful than the Wii U.

That doesn't tell us anything. I doubt any of those games are running on final hardware. They used Xbox 360 footage for a lot of Wii U pre-release promos, for example.
 
For the ones who are theorizing about the dock for nothing, you can stop; the dock... is just a dock...

"The main function of the Nintendo Switch Dock is to provide an output to the TV, as well as charging and providing power to the system."

Doesn't mean there isn't some type of processing in there to facilitate that, rather than to increase performance.
 
There are two controllers: Full, and split.
There are no observable differences between the Pro Controller and the Joycons together. All the buttons are the same, and short of gyro *possibly* being in only one of those two, they are identical not only in capabilities but also in button nomenclature. This isn't a case where you have to worry that the Classic Controller doesn't have clicking sticks, the CCP doesn't have analog triggers, the Wiimote is... the Wiimote... and the Gamepad has a screen while none of the rest do. The only difference between the Pro and the Joycons is that the Pro has different ergonomics.
The single joycons are only there for lighter fare multiplayer, to give some multipurpose to the default controller in a way that gives this handheld some local multiplayer right out of the box.

I don't think controller support will be an issue.

There is a fairly big difference: the Joycons do not have a true d-pad, only buttons with arrow labels. The Pro Controller has a true d-pad. I'm not too worried about it, though, because I assume they will sell a Joycon with a true d-pad separately. At least I hope so.
 
Comparing just the memory bandwidth of an AMD APU with the memory bandwidth of a Nvidia SoC is not as relevant as it seems at first glance.

Oh, I do know that Nvidia is more efficient than AMD when it comes to bandwidth (and overall power)! But its advantages have to be used correctly by the developers themselves, as the performance differences between DX12 and DX11 PC games on Nvidia hardware have shown.

I think it is quite probable for a small, time- and budget-limited Activision/Ubisoft/etc. team not to get the most out of a lower-specced Nvidia machine when tasked with porting code designed for the AMD twins.
 
Oh, I do know that Nvidia is more efficient than AMD when it comes to bandwidth (and overall power)! But its advantages have to be used correctly by the developers themselves, as the performance differences between DX12 and DX11 PC games on Nvidia hardware have shown.

I think it is quite probable for a small, time- and budget-limited Activision/Ubisoft/etc. team not to get the most out of a lower-specced Nvidia machine when tasked with porting code designed for the AMD twins.

The bolded isn't true at all (at least when it comes to bandwidth). The main differentiator between the two (tile-based rendering on Nvidia) is completely invisible to software, and wasn't even known about until a year or two after the hardware had already been released.
 