Nintendo Switch: Powered by Custom Nvidia Tegra Chip (Official)

I follow you just fine. I think we're on different subjects. You were talking about the memory bandwidth deficit between the Switch and the other consoles, and somehow that became a tangent about APIs.

There is no chance the Switch is as powerful as the PS4 and Xbox One. My point was that the bandwidth could be made less of a handicap if they used embedded memory or a sizeable cache setup. The source you pulled from AnandTech seems to want to make a direct comparison between the raw numbers, which can be misleading given how differently Tegra and AMD GCN manage memory. I don't think the Switch is some beast of a Pascal GPU that is held back by its paltry memory bus. It's a much more balanced design than Maxwell.
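
To put rough numbers on why raw bandwidth comparisons can mislead, here's a back-of-envelope sketch. The bandwidth figures are the commonly cited or rumored ones (the 25.6 GB/s TX1-class number for the Switch is an assumption, not a confirmed spec), and the traffic model deliberately ignores everything a tile cache or ESRAM would keep on-chip:

```python
# Back-of-envelope sketch (not official figures): how much of each system's DRAM
# bandwidth a bare framebuffer workload would eat per frame at 60 fps.
# The Switch figure is a rumored TX1-class number, not a confirmed spec.

SYSTEMS_GBPS = {
    "Switch (rumored, TX1-class LPDDR4)": 25.6,
    "Xbox One (DDR3 main pool, ESRAM excluded)": 68.0,
    "PS4 (GDDR5)": 176.0,
}

def framebuffer_traffic_gbps(width, height, bytes_per_pixel=4, passes=4, fps=60):
    """Very rough estimate: colour reads/writes over `passes` full-screen passes
    per frame. Texturing, geometry, and any on-chip caching are ignored; on-chip
    caches (tile caching, ESRAM) are exactly what can absorb this traffic."""
    bytes_per_frame = width * height * bytes_per_pixel * passes
    return bytes_per_frame * fps / 1e9

for name, bw in SYSTEMS_GBPS.items():
    for width, height in [(1280, 720), (1920, 1080)]:
        traffic = framebuffer_traffic_gbps(width, height)
        print(f"{name}: {width}x{height} -> {traffic:.1f} GB/s "
              f"({100 * traffic / bw:.0f}% of {bw} GB/s DRAM bandwidth)")
```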

How the Nintendo API stacks up to DX12 and GNM/GNMX is an open question. The fact that Nvidia had low level OS access and assisted with the tool chain is promising.
I apologize for the last sentence, it wasn't warranted. If the leaks are legit and the dev kits were off-the-shelf TX1s, maybe the devs were more bandwidth-limited than the retail units will be? And I wonder how much the API is based on Metal.

And thinking about the docking situation, perhaps even Nintendo is still undecided, so it might not be wise for me to rule out the possibility. Cheers everyone!
 
The Xbone uses an 8-core Jaguar x86 processor. The Jaguar is a mess of a CPU: slow, weak per core, an all-around mess from AMD (and I LIKE their processors). It was a big complaint about the Xbone (and the PS4) when their specs were announced. A good enough ARM processor could be faster and more stable than the Jaguar, but we don't know the actual CPU in the Switch.

I thought Jaguar was considered a pretty solid, well-designed processor. Sebbbi talked pretty positively about it around the beginning of this generation.
 
I thought the Jaguar was considered a pretty solid processor. Sebbbi talked pretty positively about it around the beginning of this generation.

Single-threaded performance in Jaguar is pretty horrible. Developers have found ways around it by moving certain threads over to the GPU, but it usually comes at the price of resolution or performance in some way. The ARM cores in the Switch should outperform individual PS4 and XB1 cores by pretty much every metric clock for clock. The real question mark will be where it ends up clocked for both docked and handheld mode.
 
Single-threaded performance in Jaguar is pretty horrible. Developers have found ways around it by moving certain threads over to the GPU, but it usually comes at the price of resolution or performance in some way. The ARM cores in the Switch should outperform individual PS4 and XB1 cores by pretty much every metric clock for clock. The real question mark will be where it ends up clocked for both docked and handheld mode.

Right, but I'm mostly responding to it being called a mess. Everything I've read seems to suggest it was a well designed, balanced architecture.
 
Single-threaded performance in Jaguar is pretty horrible. Developers have found ways around it by moving certain threads over to the GPU, but it usually comes at the price of resolution or performance in some way. The ARM cores in the Switch should outperform individual PS4 and XB1 cores by pretty much every metric clock for clock. The real question mark will be where it ends up clocked for both docked and handheld mode.
A technical advantage over the other consoles? Oh boy.
 
Right, but I'm mostly responding to it being called a mess. Everything I've read seems to suggest it was a well designed, balanced architecture.

Do you have a source on that? I know there was a lot of optimism around the GPU power in the consoles, but I think the CPU grunt was always in question. The Jaguar cores are essentially the same thing found in the AMD Athlon 5150, and it was easily outperformed in single-threaded benchmarks by ultra-low-voltage Intel parts destined for ultrabooks the same year it was released in the PS4 and XB1. In fact, the quad-core variety released for PCs was handily defeated even in multi-threaded benchmarks by the Intel Atom chips of the same release period.

Here's a review of the desktop-class version of Kabini, which uses the quad-core Jaguar parts. Needless to say, it isn't great. The version reviewed here is the Athlon 5350, which is actually clocked pretty close to the updated cores in the PS4 Pro.

http://www.anandtech.com/show/7933/the-desktop-kabini-review-part-1-athlon-5350-am1/3
 
It seems you're in denial that handhelds have been on the decline for the past decade. I would put up a statistic, but I think you would just ignore it, so I won't bother anymore with this conversation.
I'd say that's preeeetty wrong. The DS was one of the biggest gaming machines of any kind ever, and 2006-2010 was its heyday while the PSP did alright beside it, so that was probably peak portable gaming. Definitely not actively in decline for the last 10 years. The 3DS hasn't done as well, but it's not like pre-DS was some time of handheld obscurity either.
 
I think the wrong questions are being asked concerning the possibility of a Pascal-based custom Tegra GPU in combination with whatever ARM-based CPU the Switch has.

Can it or will it match the Xbox One when it comes to overall scene complexity?

Will it simply fail to match the current-gen consoles' shader counts?

Doesn't the memory need to be high bandwidth in order to render at 1080p with high scene complexity and shaders?
 
I am trying to think of a popular device not made by Nvidia that has a Tegra chip inside it.

I had a Sony Tablet S which was terrible and never took off.

There was that Motorola phone that came out around 2010. It was, I believe, one of the first devices to have a fingerprint reader, and it was placed on the back. It wasn't as good as Touch ID or Nexus Imprint.

The current Pixel tablet has the X1.

I am sure there are a few more phones. But they were not successful. I am not sure how the Shield devices are doing.

The Switch is a big grab for them and their Tegra chips. However, I feel like they barely caught on, and I am wondering if Nvidia is going to keep making these.

You are forgetting the most successful one:

https://blogs.nvidia.com/blog/2012/06/27/tegra-3-brings-googles-nexus-7-tablet-to-the-masses-2/
 
Do you have a source on that? I know there was a lot of optimism around the GPU power in the consoles, but I think the CPU grunt was always in question. The Jaguar cores are essentially the same thing found in the AMD Athlon 5150, and it was easily outperformed in single-threaded benchmarks by ultra-low-voltage Intel parts destined for ultrabooks the same year it was released in the PS4 and XB1. In fact, the quad-core variety released for PCs was handily defeated even in multi-threaded benchmarks by the Intel Atom chips of the same release period.

Here's a review of the desktop-class version of Kabini, which uses the quad-core Jaguar parts. Needless to say, it isn't great. The version reviewed here is the Athlon 5350, which is actually clocked pretty close to the updated cores in the PS4 Pro.

I'm not saying it's super fast.

Anyway..

From Agner Fog's optimization guide
http://www.agner.org/optimize/microarchitecture.pdf

Bottlenecks in Bobcat and Jaguar
The Bobcat and Jaguar have a well balanced pipeline design with no obvious bottlenecks other than what is obvious from the low power design.
...
AMD Bobcat microarchitecture
The AMD Bobcat has an efficient out-of-order pipeline with good performance and no obvious bottlenecks.

Sebbbi said some nice things about it here.
https://forum.beyond3d.com/threads/...hnical-discussion.47227/page-322#post-1386073
 
Right, but I'm mostly responding to it being called a mess. Everything I've read seems to suggest it was a well designed, balanced architecture.

They are a mess. The 5350 struggles just running Windows, and that's with 8GB RAM and a dedicated GPU helping it out. (Source: I built a 5350 HTPC; it is absolutely horrible.)
 
So, is it more powerful than the Wii U? By how much?
Depending on whether you look at the GPU or the CPU and the specific performance metric you're interested in, I'd expect it will be anywhere between 2 and 5 times as fast.

Of course, this also depends on the exact clock rates Nintendo will allow for games. There's a pretty huge range available there from battery-focused to performance-focused.
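
To show how much the clock choice alone moves the needle, here's a minimal sketch using GPU FLOPs as a single metric. The 256-core figure assumes a TX1-class GPU and the clock values are made up for illustration, not confirmed Switch specs; the ~176 GFLOPS Wii U number is the commonly cited estimate. Architectural efficiency per FLOP and the CPU side are why the overall range can run higher than raw FLOPs suggest.

```python
# Minimal sketch: how the "how many times the Wii U" multiplier moves with clock
# choice, using GPU FLOPs as one metric. The 256-core count assumes a TX1-class
# GPU and the clock values are illustrative, not confirmed Switch specs.

WIIU_GFLOPS = 176.0            # commonly cited Wii U GPU estimate
SHADER_CORES = 256             # assumed TX1-class GPU
FLOPS_PER_CORE_PER_CLOCK = 2   # one fused multiply-add per core per clock

def gflops(clock_mhz):
    return SHADER_CORES * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000.0

for label, clock_mhz in [("battery-focused", 400), ("middle", 700), ("performance-focused", 1000)]:
    g = gflops(clock_mhz)
    print(f"{label:>20}: {clock_mhz} MHz -> {g:.0f} GFLOPS (~{g / WIIU_GFLOPS:.1f}x Wii U)")
```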
 
They are a mess. The 5350 struggles just running Windows, and that's with 8GB RAM and a dedicated GPU helping it out. (Source: I built a 5350 HTPC; it is absolutely horrible.)
They are shit. That's why most modern console games are CPU-bound and not GPU-bound, according to tests by Digital Foundry. It's also why in this generation of consoles there hasn't been any innovation in AI or physics. You would think that by now tech like machine learning would be a part of all modern game AI.
 
They are a mess. The 5350 struggles just running Windows, and that's with 8GB RAM and a dedicated GPU helping it out. (Source: I built a 5350 HTPC; it is absolutely horrible.)

When someone says mess, they usually mean messy, not slow. I never said it was fast, just that I'd never heard it described as a mess before.
 
Hehe. It took Nvidia 12 years to go from winning the Sony PS3 GPU deal to the Nintendo Switch SoC deal.

December 2004 ====> October 2016

http://www.nvidia.com/object/IO_17342.html

TOKYO and SANTA CLARA, CA—DECEMBER 7, 2004—Sony Computer Entertainment Inc. (SCEI) and NVIDIA Corporation (Nasdaq: NVDA) today announced that the companies have been collaborating on bringing advanced graphics technology and computer entertainment technology to SCEI’s highly anticipated next-generation computer entertainment system. Both companies are jointly developing a custom graphics processing unit (GPU) incorporating NVIDIA’s next-generation GeForce™ and SCEI’s system solutions for next-generation computer entertainment systems featuring the Cell* processor.

This collaboration is made under a broad, multi-year, royalty-bearing agreement. The powerful custom GPU will be the graphics and image processing foundation for a broad range of applications from computer entertainment to broadband applications. The agreement will encompass future Sony digital consumer electronics products.

“In the future, the experience of computer entertainment systems and broadband-ready PCs will be fused together to generate and transfer multi-streams of rich content simultaneously. In this sense, we have found the best way to integrate the state-of-the-art technologies from NVIDIA and SCEI,” said Ken Kutaragi, executive deputy president and COO, Sony Corporation, and president and Group CEO, Sony Computer Entertainment Inc. “Our collaboration includes not only the chip development but also a variety of graphics development tools and middleware, essential for efficient content creation.”

“We are thrilled to partner with Sony Computer Entertainment to build what will certainly be one of the most important computer entertainment and digital media platforms of the twenty-first century,” added Jen-Hsun Huang, president and CEO, NVIDIA. “Over the past two years NVIDIA has worked closely with Sony Computer Entertainment on their next-generation computer entertainment system. In parallel, we have been designing our next-generation GeForce GPU. The combination of the revolutionary Cell processor and NVIDIA’s graphics technologies will enable the creation of breathtaking imagery that will surprise and captivate consumers.”

The custom GPU will be manufactured at Sony Group’s Nagasaki Fab2 as well as OTSS (joint fabrication facility of Toshiba and Sony).

Note:
* “Cell” is the code-name for an advanced microprocessor under development by IBM, Toshiba and Sony Group.

https://blogs.nvidia.com/blog/2016/10/20/nintendo-switch/

OCTOBER 20, 2016

NVIDIA Technology Powers New Home Gaming System, Nintendo Switch
 
This is perhaps the most intriguing aspect of the Switch, to me.

PS4/XB1 levels of graphical power in a handheld device will do wonders for Nintendo games - especially after the 3DS.
 
PS4/XB1 levels of graphical power in a handheld device will do wonders for Nintendo games - especially after the 3DS.

You're not getting that. By virtue of being a portable, it's more like a Wii U 1.6.
 
Do you have a source on that? I know there was a lot of optimism around the GPU power in the consoles, but I think the CPU grunt was always in question. The Jaguar cores are essentially the same thing found in the AMD Athlon 5150, and it was easily outperformed in single-threaded benchmarks by ultra-low-voltage Intel parts destined for ultrabooks the same year it was released in the PS4 and XB1. In fact, the quad-core variety released for PCs was handily defeated even in multi-threaded benchmarks by the Intel Atom chips of the same release period.

Here's a review of the desktop-class version of Kabini, which uses the quad-core Jaguar parts. Needless to say, it isn't great. The version reviewed here is the Athlon 5350, which is actually clocked pretty close to the updated cores in the PS4 Pro.

http://www.anandtech.com/show/7933/the-desktop-kabini-review-part-1-athlon-5350-am1/3


I don't think anyone would dare think it's a screamer; its single-threaded performance is very low. But the other poster was saying people had called it 'balanced', and that I actually agree with... If you look at the integer performance, the floating-point performance, the memory performance, the SIMD it has on tap, it's 'balanced' for what it is; what it is is just low end. For its class, it's probably far better thought out than Bulldozer.

It's Microsoft's 2013 definition of 'balanced', in short :P



Hehe. It took Nvidia 12 years to go from winning the Sony PS3 GPU deal to the Nintendo Switch SoC deal.

December 2004 ====> October 2016

http://www.nvidia.com/object/IO_17342.html



https://blogs.nvidia.com/blog/2016/10/20/nintendo-switch/


Yowza. Every company that had Nvidia as a console partner dumped it the next generation. It feels like the Tegra department has eaten some humble pie, so we'll see if this trend is broken.
 
You're not getting that. By virtue of being a portable, it's more like a Wii U 1.6.
By virtue of being a portable it can demonstrate PS4/XB1 levels of 'graphical power' per its target resolutions.
 
Sony and MS were super disappointed in them, so...they ain't great.

One wonders why Sony didn't want to do something about that with the Pro.

Depending on whether you look at the GPU or the CPU and the specific performance metric you're interested in, I'd expect it will be anywhere between 2 and 5 times as fast.

Of course, this also depends on the exact clock rates Nintendo will allow for games. There's a pretty huge range available there from battery-focused to performance-focused.

We should ask Charles from SA. Ironically, he was the first one to get this right.

You're not getting that. By virtue of being a portable, it's more like a Wii U 1.6.

Nah.
 
I'm noticing a trend there, which is also why the video of all the people playing on their Switch in public instead of on their phones is just not going to happen.
The only use cases we saw for that in the video were target-age core gamer men taking their console games (Zelda, Skyrim) and continuing them on a plane or at the park, or people playing local multiplayer with multiple Joy-Cons (Mario Kart, Splatoon) and even multiple Switches in one instance (NBA 2K). Nintendo pretty consciously wasn't targeting the mobile marketplace in this ad, even if the platform ends up fully capable of servicing those sorts of games.
 
Yowza. Every company that had Nvidia as a console partner dumped it the next generation. It feels like the Tegra department has eaten some humble pie, so we'll see if this trend is broken.
The Tegra team had never landed a console contract before, except for Android microconsoles (Ouya, Project Mojo). They did lose out on the 3DS contract, though, due to Tegra 2 turning out to be such a battery hog.
 
This is perhaps the most intriguing aspect of the Switch, to me.

PS4/XB1 levels of graphical power in a handheld device will do wonders for Nintendo games - especially after the 3DS.


What in the world? Are people actually expecting XB1/PS4 level graphics on this machine?
 
By virtue of being a portable it can demonstrate PS4/XB1 levels of 'graphical power' per its target resolutions.

That's not really how anything works. If the system really has 4 GB of RAM like the rumors say, it won't be able to do most things modern games can. Performance doesn't scale linearly with resolution, and there are upper and lower limits to that scaling.
To be more precise, performance is a combination of how CPU/GPU/RAM interact with each other and how a game manages, displays and uses its assets. The fastest CPU in the world won't run BF1 if you have 1GB of RAM.
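
Here's a toy model of that non-linearity; every millisecond figure below is a made-up illustrative number, not a measurement of any real game. The point is just that part of each frame's cost doesn't shrink when you drop the pixel count, so quartering the resolution doesn't quadruple the framerate.

```python
# Toy model: frame cost = resolution-independent work (CPU/game logic, geometry)
# plus per-pixel work (shading, bandwidth). All numbers are illustrative.

FIXED_MS = 8.0        # assumed resolution-independent cost per frame
PER_MPIXEL_MS = 6.0   # assumed cost per megapixel shaded

def frame_ms(width, height):
    megapixels = width * height / 1e6
    return FIXED_MS + PER_MPIXEL_MS * megapixels

for width, height in [(1920, 1080), (1280, 720), (960, 540)]:
    ms = frame_ms(width, height)
    print(f"{width}x{height}: {ms:.1f} ms/frame (~{1000 / ms:.0f} fps)")
```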
 
Well, of course it's going to be less powerful than the Xbox One; 3 times less powerful seems likely.

It's a tablet that will be priced from $250 to $350, and it's also constrained by the battery, so why would it even be close to the Xbox One in power?
 
Here's a question:

Assuming the active fan cooling during docking is a thing, and the chip runs at full clocks in this scenario, what do you think the maximum amount of heat generated will be?

I ask because, way back in response to the EG rumors in July, we had theorized about actively cooling the console in the dock; but if the general concept of the Switch is to be able to undock it at a moment's notice, then it's possible that the unit gets excessively hot in the user's hands right after undocking. Notice in the video that the transition from console to handheld was presented as being seamless.

Could that heat be dissipated quickly enough due to the shape of the dock? Does anyone think this puts a theoretical limit on how much heat can be generated/how high the clocks can go?
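
One piece of that question can be sketched with the usual dynamic-power rule of thumb (P roughly proportional to V² × f): if undocking drops both voltage and frequency, the SoC's heat output falls off faster than the clock alone suggests. The wattage, clocks and voltages below are illustrative assumptions, not Switch figures.

```python
# Rough dynamic-power sketch (P ~ C * V^2 * f). If undocking drops both the clock
# and the voltage, heat output falls faster than the clock reduction alone suggests.
# Baseline wattage, clocks and voltages are illustrative assumptions.

DOCKED_WATTS = 10.0
DOCKED = {"freq_mhz": 1000, "volts": 1.00}
HANDHELD = {"freq_mhz": 600, "volts": 0.85}

def power_ratio(point, reference):
    """Dynamic power scales with frequency and the square of voltage."""
    return (point["freq_mhz"] / reference["freq_mhz"]) * (point["volts"] / reference["volts"]) ** 2

handheld_watts = DOCKED_WATTS * power_ratio(HANDHELD, DOCKED)
print(f"Docked: {DOCKED_WATTS:.1f} W -> handheld: {handheld_watts:.1f} W "
      f"({100 * power_ratio(HANDHELD, DOCKED):.0f}% of docked)")
```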
 
So I've been digging through some old Digital Foundry analysis of Shield games.

And even with the X1's bandwidth (which is half of Parker/X2's), almost all the titles ran at either 720p or 1080p. Claims of third-party devs locking games at 504p because of memory bandwidth are sounding more and more ridiculous.

Now on top of that the Shield TV is hampered by Android and is running what is presumably an older less powerful chip, with no active cooling, and with half the memory bandwidth available.
https://www.youtube.com/watch?v=je7-Ot4zyf0

Doom 3 BFG Edition is a real standout, doubling the framerate of the PS360 version while at the same time upping the resolution from 720p (on PS360) to 1080p. It's interesting to note that some of the performance issues noted in Half-Life 2 were apparently only present when running from a Micro-SD card. Hopefully the NX cartridges are a bit faster and more capable than that.
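
Taking the post's own numbers at face value, the implied pixel-throughput increase is easy to work out; only the ratios from the post (doubled framerate, 720p to 1080p) are used here, and no absolute framerates are assumed.

```python
# Pixel-throughput arithmetic using only the ratios from the post above:
# doubled framerate and a 720p -> 1080p resolution bump.

framerate_factor = 2.0                        # "doubling the framerate"
pixel_factor = (1920 * 1080) / (1280 * 720)   # 720p -> 1080p = 2.25x the pixels
print(f"Resolution factor: {pixel_factor:.2f}x")
print(f"Combined pixel throughput: {framerate_factor * pixel_factor:.2f}x the PS360 version")
```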
 
Does this mean that third-party publishers that develop for the Switch can relatively easily port their software to ARM-based devices (Android, iOS)?
 
This is perhaps the most intriguing aspect of the Switch, to me.

PS4/XB1 levels of graphical power in a handheld device will do wonders for Nintendo games - especially after the 3DS.

You're not getting that. By virtue of being a portable, it's more like a Wii U 1.6.
Shoot, even getting true 360/PS3 graphics and gameplay in a handheld will be amazing. I am really looking forward to seeing what a 2017 dedicated portable gaming device can achieve.
 
So I've been digging through some old Digital Foundry analysis of Shield games.

And even with the X1's bandwidth (which is half of Parker/X2's), almost all the titles ran at either 720p or 1080p. Claims of third-party devs locking games at 504p because of memory bandwidth are sounding more and more ridiculous.

Now on top of that the Shield TV is hampered by Android and is running what is presumably an older less powerful chip, with no active cooling, and with half the memory bandwidth available.
https://www.youtube.com/watch?v=je7-Ot4zyf0

Doom 3 BFG Edition is a real standout, doubling the framerate of the PS360 version while at the same time upping the resolution from 720p (on PS360) to 1080p. It's interesting to note that some of the performance issues noted in Half-Life 2 were apparently only present when running from a Micro-SD card. Hopefully the NX cartridges are a bit faster and more capable than that.

Half-Life and Doom 3 are pretty old games now. And wasn't Doom 3 BFG missing a bunch of effects from the PS360 versions? Modern games have a lot more effects going on; they will require more bandwidth.
 
Half-Life and Doom 3 are pretty old games now. And wasn't Doom 3 BFG missing a bunch of effects from the PS360 versions? Modern games have a lot more effects going on; they will require more bandwidth.

The analysis didn't note anything about the BFG edition missing effects.
 
So I've been digging through some old Digital Foundry analysis of Shield games.

And even with the X1's bandwidth (which is half of Parker/X2's), almost all the titles ran at either 720p or 1080p. Claims of third-party devs locking games at 504p because of memory bandwidth are sounding more and more ridiculous.

Now on top of that the Shield TV is hampered by Android and is running what is presumably an older less powerful chip, with no active cooling, and with half the memory bandwidth available.
https://www.youtube.com/watch?v=je7-Ot4zyf0

Doom 3 BFG Edition is a real standout, doubling the framerate of the PS360 version while at the same time upping the resolution from 720p (on PS360) to 1080p. It's interesting to note that some of the performance issues noted in Half-Life 2 were apparently only present when running from a Micro-SD card. Hopefully the NX cartridges are a bit faster and more capable than that.

Good catch. This dismantles Zlatan's "insider info" completely, in my opinion.

Half-Life and Doom 3 are pretty old games now. And wasn't Doom 3 BFG missing a bunch of effects from the PS360 versions? Modern games have a lot more effects going on; they will require more bandwidth.

Parker has double the bandwidth of the Tegra X1. What kind of effects require twice as much memory bandwidth?
 