Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

So in terms of visuals what can we expect? Slightly better looking Wii U level, nowhere near Xbox One?

Just in case my hype needed to be lower.

My guess would be a highly scientific right-in-between.
 
So in terms of visuals what can we expect? Slightly better looking Wii U level, nowhere near Xbox One?

Just in case my hype needed to be lower.

Anything that ran at 720p on WiiU would be able to run at 1080p with much higher res textures and noticeably better effects to boot. That's IF we assume the specs in the op are exact, which has never been claimed. I'm still expecting a bit more than that.
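A quick back-of-the-envelope check on that claim (my own arithmetic; the 176 GFLOPS Wii U number is just the commonly cited estimate, not something from this thread):

```python
# Pixel count and rough compute headroom for "720p Wii U game at 1080p",
# assuming cost scales roughly linearly with pixels shaded.
wiiu_pixels   = 1280 * 720
switch_pixels = 1920 * 1080
print(f"pixel ratio: {switch_pixels / wiiu_pixels:.2f}x")   # 2.25x

# Wii U's GPU is commonly estimated at ~176 GFLOPS; the rumoured spec here
# implies ~512 GFLOPS docked, i.e. roughly 2.9x, which would leave headroom
# for 2.25x the pixels plus somewhat better effects, if the rumour holds.
print(f"compute ratio: {512 / 176:.2f}x")
```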
 
Anything that ran at 720p on WiiU would be able to run at 1080p with much higher res textures and noticeably better effects to boot. That's IF we assume the specs in the op are exact, which has never been claimed. I'm still expecting a bit more than that.

Breath of the Wild in 1080p would look great.
 
I'm curious if anyone has any ideas about the Switch card specs. Looking at those huge contacts, I'm wondering if these have even higher read speeds than 3DS cards, and what that would mean for the RAM/bandwidth necessary for the Switch. If games are either running off of these game cards, SD cards, or internal flash storage, what kind of differences would we wind up seeing compared to HDD read speeds?

[Image: Nintendo Switch game card]
 
I know; Maxwell and Pascal desktop GPUs do the same thing. Still, 25.6GB/s is ridiculously low; it probably needs twice that to avoid bandwidth-related bottlenecks, even after considering Pascal's color compression, which reduces the bandwidth needed even more. A bit of SRAM is needed; not sure how much it would add to cost, but they used 32MB of eDRAM on Wii U so it shouldn't be an issue.

It's lower than I'd like to see, though not ridiculously so at all, not for this kind of GPU with 512 GFLOPS. I do agree 50GB/s would be great, and if bandwidth stays at 25.6GB/s then I think some extra embedded memory will probably be included. But it can't be eDRAM and certainly won't be close to 32MB; it'll have to be eSRAM and likely used for texture caching.
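For reference, here's where the 25.6GB/s figure comes from and what a wider bus would give, a minimal sketch assuming a TX1-style LPDDR4-3200 configuration:

```python
# Peak LPDDR4 bandwidth: transfers per second * bytes per transfer.
def lpddr4_bandwidth_gbps(bus_bits, mt_per_s):
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

print(lpddr4_bandwidth_gbps(64, 3200))    # 25.6 GB/s, TX1-style 64-bit bus
print(lpddr4_bandwidth_gbps(128, 3200))   # 51.2 GB/s, the hoped-for 128-bit bus
```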
 
I tell ya, what I'm concerned about is the OS having less than a gig to work with. The Wii U OS was an interminable slog and if the Switch one isn't nice and peppy and modern feeling it's gonna be a huge bummer.

How much RAM did the Xbox 360 have for its OS? Something like 80MB? Windows XP ran perfectly fine on 512MB; I'm not sure how far 800MB takes you in a Linux environment currently, but obviously you can't compare it to Android etc., which is very bloated. If anything, compare it to iOS, which also runs better on iPhones with much less RAM than Android phones with more RAM. I think the original iPhone 6 runs on 1GB? Nintendo will make a custom OS, so they can decide what they want to do with every MB of memory.

So in terms of visuals what can we expect? Slightly better looking Wii U level, nowhere near Xbox One?

Just in case my hype needed to be lower.

Performance-wise, it's somewhere in the middle, and I think (because it's a more modern architecture than the XBone and it's targeting a 720p screen) it will look closer to XBO games than Wii U games.
 
Kinda underpowered, but that's Nintendo we're talking about. It doesn't really matter much, though, especially when they go for style and not realism (which I also prefer).
Also, with those specs it can't be expensive, so that's a pro.
Looking forward to it.
 
It's lower than I'd like to see, though not ridiculously so at all, not for this kind of GPU with 512 GFLOPS. I do agree 50GB/s would be great, and if bandwidth stays at 25.6GB/s then I think some extra embedded memory will probably be included. But it can't be eDRAM and certainly won't be close to 32MB; it'll have to be eSRAM and likely used for texture caching.

Yeah, I was saying around 4MB; you don't need 32 with tile-based rasterization.

That's more or less the machine I was expecting, with lower clocks but 3 SMs for the GPU instead of 2, 4.5GB of RAM for games (with 1.5 for the OS) and 4MB of SRAM.

CPU: 4× A72 up to 1.9GHz when docked + 4× A53 at 1.2GHz (2 for games, 2 for OS)
GPU: 3 SMs up to 1GHz when docked (768 GFLOPS max)
RAM: 6GB LPDDR4 (128-bit bus, 51.2GB/s, 1-1.5GB for the OS) + 4MB SRAM
Storage: 32GB + microSD
Battery: 3-4 hours, USB-C charging
$249

If the specs really are pretty much those of the Jetson TX1, though, this is not gonna happen.
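The GFLOPS figures in that wishlist follow from the standard FP32 peak formula; a sketch assuming Maxwell/Pascal-style SMs with 128 CUDA cores each (as on the TX1):

```python
# Peak FP32 throughput: SMs * cores per SM * 2 FLOPs per FMA * clock.
def fp32_gflops(num_sm, cores_per_sm, clock_ghz):
    return num_sm * cores_per_sm * 2 * clock_ghz

print(fp32_gflops(2, 128, 1.0))   # 512 GFLOPS, TX1-like 2 SM @ 1GHz
print(fp32_gflops(3, 128, 1.0))   # 768 GFLOPS, the 3 SM @ 1GHz wishlist above
```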
 
The Parker chip has 4 A57s and 2 Denver 2 cores, right? If they're going with a custom chip, how feasible is it to swap the 2 Denvers for 2 A72s? That would automatically mean 16nm FinFET, a 128-bit bus, ~50GB/s of memory bandwidth...


Kinda underpowered, but that's Nintendo we're talking about. It doesn't really matter much, though, especially when they go for style and not realism (which I also prefer).
Also, with those specs it can't be expensive, so that's a pro.
Looking forward to it.

Considering it's also a portable... what did you expect exactly? 2 TFLOPS, with a 20-minute battery and a $500 price tag?
 
I'm amazed that Switch might only use 800MB of RAM for the OS compared to Wii U using 1GB. It must be far more efficient to need even less. The best part, of course, is that Switch would have literally 3.2x the amount of RAM for games that the Wii U had (3.2GB vs 1GB).

Edit: Now to compare:

Switch = 4GB LPDDR4 RAM (3.2GB for games, 800MB for OS)
Xbox One = 8GB DDR3 RAM (5GB for games, 3GB for OS)
PS4 = 8GB GDDR5 RAM (5GB for games, 3GB for OS)
Wii U = 2GB DDR3 RAM (1GB for games, 1GB for OS)

... if my research is correct plus Vern's Switch info here.
 
I'm amazed that Switch might only use 800MB of RAM for the OS compared to Wii U using 1GB. It must be far more efficient to need even less. The best part, of course, is that Switch would have literally 3.2x the amount of RAM for games that the Wii U had (3.2GB vs 1GB).

Edit: Now to compare:

Switch = 4GB LPDDR4 RAM (3.2GB for games, 800MB for OS)
Xbox One = 8GB DDR3 RAM (5GB for games, 3GB for OS)
PS4 = 8GB GDDR5 RAM (seemingly all 8GB for games, with a separate 256MB of DDR3 RAM for the OS)
Wii U = 2GB DDR3 RAM (1GB for games, 1GB for OS)

... if my research is correct plus Vern's Switch info here.

Source?
 
I'm amazed that Switch might only use 800MB of RAM for the OS compared to Wii U using 1GB. It must be far more efficient to need even less. The best part, of course, is that Switch would have literally 3.2x the amount of RAM for games that the Wii U had (3.2GB vs 1GB).

Edit: Now to compare:

Switch = 4GB LPDDR4 RAM (3.2GB for games, 800MB for OS)
Xbox One = 8GB DDR3 RAM (5GB for games, 3GB for OS)
PS4 = 8GB GDDR5 RAM (seemingly all 8GB for games, with a separate 256MB of DDR3 RAM for the OS)
Wii U = 2GB DDR3 RAM (1GB for games, 1GB for OS)

... if my research is correct plus Vern's Switch info here.

PS4 has about 5 or 5.5GB for games.
 
Are there any benchmarks or other tests showing whether the current Tegra X1 is severely bandwidth-limited?

I did a quick search for data on the relative bandwidth requirements of tile-based rendering versus traditional rendering and found this rather old paper from 2004 with results for 32x32px tiles:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.79.2488&rep=rep1&type=pdf

[Image: table of bandwidth results from the paper]


Since the paper is over ten years old, the benchmarks used in it are also rather old games, so I can't say how representative they are. It's also unclear how much bandwidth would be used for other parts of the pipeline aside from rasterization. Moreover, the Switch shares its memory bandwidth between CPU and GPU. Nevertheless, when looking at memory bandwidth requirements it's clear that the figure of 25.6 GB/s only tells part of the story.
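To illustrate why raw external bandwidth only tells part of the story, here's a crude sketch (my own assumed numbers, not figures from the paper) of framebuffer traffic at 1080p60 for an immediate-mode renderer versus a tiler that resolves each tile once from on-chip memory:

```python
WIDTH, HEIGHT, FPS = 1920, 1080, 60
BYTES_COLOR = 4        # RGBA8
BYTES_DEPTH = 4        # D24S8
OVERDRAW = 3.0         # assumed average overdraw per pixel

pixels = WIDTH * HEIGHT

# Immediate-mode: every shaded fragment may touch color and depth in DRAM.
imr_traffic = pixels * OVERDRAW * (BYTES_COLOR + BYTES_DEPTH) * 2 * FPS  # read + write

# Tile-based: overdraw is absorbed on-chip; DRAM sees roughly one final
# color/depth resolve per pixel per frame.
tbr_traffic = pixels * (BYTES_COLOR + BYTES_DEPTH) * FPS

print(f"immediate-mode framebuffer traffic: ~{imr_traffic / 1e9:.1f} GB/s")
print(f"tile-based framebuffer traffic:     ~{tbr_traffic / 1e9:.1f} GB/s")
```

Texture fetches and CPU traffic come on top of this, but it shows how tiling shifts a big chunk of framebuffer bandwidth off the external bus.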
 
For the record, I'm perfectly aware that Nvidia has been using tiled rasterization since Maxwell. It's also not a Tegra-specific feature, I was running the test program on my own GPU when this was discovered :P

Despite that, I still think that 25 GB/s of external bandwidth shared between the GPU and all the CPU cores could easily become a bottleneck. A larger texture cache would probably help, but even so it's just not a whole lot of bandwidth.

I would agree that 25GB/s would seem like a bottleneck, but it would be very difficult to say so without knowing precisely what changes they made to the cache hierarchy, and it would be very strange if Nintendo and Nvidia knowingly put out a piece of hardware so obviously bottlenecked.

In particular, I'm not sure if a larger L2 texture cache would be the way to go. I assume that, like most caches, Nvidia's L2s are configured with a focus on minimising latency rather than reducing bandwidth pressure on main memory. Installing a fully-associative victim cache as an L3, though, could have a larger effect on main memory bandwidth usage. This is what Intel (with Crystalwell L4) and Apple (with A8 onwards L3) have done, and seemingly for precisely the same reason (accommodating a TBR GPU on limited main memory bandwidth).
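For anyone unfamiliar with the term, a victim cache just catches lines evicted from the level above so a near-term re-reference doesn't have to go out to DRAM; a minimal, purely illustrative sketch (assumed structure, not any vendor's actual design):

```python
from collections import OrderedDict

class VictimCache:
    """Tiny fully-associative, LRU victim cache model."""
    def __init__(self, num_lines):
        self.lines = OrderedDict()   # tag -> data, in LRU order
        self.capacity = num_lines

    def insert(self, tag, data):
        # Called when the cache above (e.g. the GPU L2) evicts a line.
        self.lines[tag] = data
        self.lines.move_to_end(tag)
        if len(self.lines) > self.capacity:
            self.lines.popitem(last=False)   # oldest line falls through to DRAM

    def lookup(self, tag):
        # Called on a miss in the cache above; a hit here avoids a DRAM access.
        return self.lines.pop(tag, None)
```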

Of course this is getting slightly beyond my expertise when it comes to cache design (which consisted of about one or two lectures in college a good few years ago), but I do find it interesting how they might deal with limited LPDDR4 bandwidth, considering Nintendo has typically been willing to go to some expense to get what they want on the memory front for the past few generations.

The other possibility is that they're still fans of expensive, specialist RAM, and that the chip Nvidia has in production with a 4GB HBM2 stack is in fact the same Nvidia SoC with 4GB of memory we're talking about in this thread.

Note: This probably means nothing and is speculation. Nvidia uses TSMC for their GPUs.

After reading the poorly written article about TSMC from that thread, which I posted on the last page, I went to look at TSMC's earnings call transcripts for the last two quarters. I doubt this confirms anything, as it may be reading too much into things.

http://www.tsmc.com/english/investorRelations/quarterly_results.htm

I looked at the Earnings Conference Transcripts for 2Q and 3Q 2016. Any time I looked up "gaming" or "game", it referenced their 16nm FinFET. (This would include PC gaming.)

For example from the 3Q:

This next part was also interesting but, again, doesn't really mean anything. It just comes from noticing Thraktor speculate that 16nm FFC could be a possible process node for the GPU.

From the 2Q:

All of this probably doesn't mean anything. I bring it up because some people think the Switch GPU will be 20nm; unless a leak happens to show the actual GPU die size, we'll have to wait for the actual reveal of the Switch in January.
(TSMC holds its fourth-quarter earnings call on the 12th of January 2017, one day before the Switch is revealed.)

Regarding the first quote, I believe the Xbox One S is manufactured on TSMC's 16FF+ process (and possibly the PS4 Slim and PS4 Pro as well). I do still think their 16FFC is the most likely process for Switch, if for no other reason than that it's estimated to be 10-20% cheaper than 16FF+, and an improvement in power efficiency wouldn't hurt, either.

The 32MB of eDRAM gave it a higher "effective" bandwidth I thought. I'm not sure about it but that's what I recall being said.

Either way, it seems like Nvidia (and Tegra chips specifically) does get more out of the bandwidth it has, so maybe 25GB/s isn't as bad as I originally thought. Still, I'd be surprised if RAM isn't one of the areas where the custom Nvidia SoC differs from a standard TX1. It is essentially in Nintendo's DNA.

The 32MB of eDRAM in Wii U is directly-accessible memory with (as far as we can tell) 70GB/s of bandwidth to it. The idea is that all buffers would be kept in the eDRAM (as the considerable majority of GPU memory bandwidth is used in buffer accesses), and if developers make perfect use of both the eDRAM and the DDR3, then they could achieve a total of 83GB/s of memory bandwidth. (Although in practice doing so can be far from trivial)
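The ~83GB/s figure falls out of simple addition; a sketch assuming the quoted 70GB/s eDRAM figure plus Wii U's 2GB of DDR3-1600 on a 64-bit bus:

```python
edram_gbps = 70.0                    # figure quoted above
ddr3_gbps  = 1600e6 * 8 / 1e9        # 64-bit bus * 1600 MT/s = 12.8 GB/s
print(f"combined peak: ~{edram_gbps + ddr3_gbps:.1f} GB/s")   # ~82.8 GB/s
```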
 
I'm amazed that Switch might only use 800MB of RAM for the OS compared to Wii U using 1GB. It must be far more efficient to need even less. The best part, of course, is that Switch would have literally 3.2x the amount of RAM for games that the Wii U had (3.2GB vs 1GB).

Edit: Now to compare:

Switch = 4GB LPDDR4 RAM (3.2GB for games, 800MB for OS)
Xbox One = 8GB DDR3 RAM (5GB for games, 3GB for OS)
PS4 = 8GB GDDR5 RAM (5-5.5GB for games, 2.5-3GB for OS)
Wii U = 2GB DDR3 RAM (1GB for games, 1GB for OS)

... if my research is correct plus Vern's Switch info here.

PS4 has 5GB for games and PS4 Pro has 5.5GB for games

EDIT: Could it be possible that the Switch has more than 4GB, but that's just the amount available for games (and the reason the dev kits have that amount)? I doubt the dev kits run the OS.
 
You're also underestimating the power of advertising in and of itself.

...

Also, your comment of "N has chosen home and showed portable" is completely off base, as the initial trailer clearly showed both home console and portable, aka both playing on the TV and on the go. Again, the message was pretty clear, but it seems to you it wasn't, so maybe you're right; I mean, if you can't even understand what's going on, what hope do we have for everyone else, right?

Go look at that thread that asks 'how do you perceive the switch'. Now imagine that being asked to people walking down the game aisle at Tesco next spring.

You and M have made up your minds already, so there is no real point in arguing.
 
Aaah, that makes sense. I assume PS4 Pro has the same 8GB GDDR5 RAM then? What about Scorpio?

Yeah, it has the same amount, but with an extra slow 1GB DDR3 chip. On the standard PS4, 1GB of RAM is used for swapping out and storing apps (like Netflix) so that you can switch between game and app quickly. Sony realised that this is a waste of GDDR5, so they added the extra 1GB of DDR3 for the swapping/storing. That frees up half a GB of GDDR5 for games, and the other half a GB is used to run the OS at 4K.

Scorpio has either 12GB of GDDR5 RAM or 8GB of some faster RAM (to get the bandwidth stated).
 
I would agree that 25GB/s would seem like a bottleneck, but it would be very difficult to say so without knowing precisely what changes they made to the cache hierarchy, and it would be very strange if Nintendo and Nvidia knowingly put out a piece of hardware so obviously bottlenecked.

In particular, I'm not sure if a larger L2 texture cache would be the way to go. I assume that, like most caches, Nvidia's L2s are configured with a focus on minimising latency rather than reducing bandwidth pressure on main memory. Installing a fully-associative victim cache as an L3, though, could have a larger effect on main memory bandwidth usage. This is what Intel (with Crystalwell L4) and Apple (with A8 onwards L3) have done, and seemingly for precisely the same reason (accommodating a TBR GPU on limited main memory bandwidth).

Of course this is getting slightly beyond my expertise when it comes to cache design (which consisted of about one or two lectures in college a good few years ago), but I do find it interesting how they might deal with limited LPDDR4 bandwidth, considering Nintendo has typically been willing to go to some expense to get what they want on the memory front for the past few generations.

The other possibility is that they're still fans of expensive, specialist RAM, and that the chip Nvidia has in production with a 4GB HBM2 stack is in fact the same Nvidia SoC with 4GB of memory we're talking about in this thread.

Is HBM2 still prohibitively expensive? All I see when searching for info is a "high cost" and "price point" being a problem, but is there any cost comparison to other RAM types available?

I don't doubt that it's likely expensive and unlikely to be used, but it's just something I'm curious about.


Oh? Saw at least one say so, but it could be a classic case of broken telephone I suppose.

He had said he spoke with mods and asked them to anonymously post his info, and I guess we assumed from there that the mods had at least verified he would be in a position to know such info, but that never happened. Either way, he seems fairly trustworthy but as always with rumors and insiders, grain of salt and all that.
 
Go look at that thread that asks 'how do you perceive the switch'. Now imagine that being asked to people walking down the game aisle at Tesco next spring.

You and M have made up your minds already, so there is no real point in arguing.
You think a small group of the hardcore arguing about minutiae actually is representative of the greater populace in any meaningful way?

People will perceive the Switch as it is presented, and, again, we have already seen a fantastic example of that.

Hardcore gamers bitched about the Wii too, and it was sold in basically the same way to great success. Strong visual storytelling does a lot to sell an idea.
 
Dunno why people are complaining about 25 GB/sec being too slow. If your Internet connection was that fast, you'd be ecstatic.

Yes, this is a joke
 
I tell ya, what I'm concerned about is the OS having less than a gig to work with. The Wii U OS was an interminable slog and if the Switch one isn't nice and peppy and modern feeling it's gonna be a huge bummer.

I'm amazed that Switch might only use 800MB of RAM for the OS compared to Wii U using 1GB. It must be far more efficient to need even less. The best part, of course, is that Switch would have literally 3.2x the amount of RAM for games that the Wii U had (3.2GB vs 1GB).

Edit: Now to compare:

Switch = 4GB LPDDR4 RAM (3.2GB for games, 800MB for OS)
Xbox One = 8GB DDR3 RAM (5GB for games, 3GB for OS)
PS4 = 8GB GDDR5 RAM (5GB for games, 3GB for OS)
Wii U = 2GB DDR3 RAM (1GB for games, 1GB for OS)

... if my research is correct plus Vern's Switch info here.

I don't think the quantity of RAM allocated to the OS had anything to do with Wii U's sluggishness. As others have mentioned, the iPhone 6 ran a far more featureful OS on 1GB and was extremely responsive.

The amount of RAM required for a console OS depends on what it's expected to do while also keeping a game in memory, but also on other aspects of the system's hardware, most notably storage speed. On the first part, the PS4 (as far as I'm aware) keeps other apps like Netflix in memory in the background while you're playing a game, which from my perspective is a massive waste of GDDR5 (although perhaps other people are happy with it if they like constantly switching back and forth between games and Netflix). Wii U does this with the web browser and the eShop (or at least keeps the game in memory while accessing them, which is equivalent from a memory-requirements point of view), which frankly is still pretty unnecessary, as if we do need to browse the internet while playing a game we generally have at least one suitable device within reach which can do just that (and generally do a better job of it). By limiting the number of things users are expected to do while a game is still in memory, Nintendo could substantially reduce the memory footprint of the OS.

The second part, regarding storage speed, is one of the main reasons iPhones are able to get by with comparatively little RAM. Because Apple designs both the hardware and software, and uses very fast flash storage in all their iOS devices, they're able to be very aggressive about pushing apps from RAM to flash when necessary, as they know they'll be able to pull them back into RAM very quickly. Wii U had relatively slow flash storage (possibly compounded by an underpowered crypto co-processor), and moving to faster internal storage, ideally UFS, would allow Nintendo to drop background processes much more aggressively and reload them as needed while still keeping the system feeling snappy. Typical UFS read speeds are around 5x those of the hard drives used in PS4/XBO, which could give Nintendo a substantial advantage in terms of OS responsiveness, and even if they go with slower eMMC they should have a comfortably faster solution than their competitors.
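As a rough illustration of why storage speed matters for that strategy, here's a sketch with assumed (not measured) throughput numbers and a hypothetical 200MB suspended-app image:

```python
app_image_mb = 200   # hypothetical working set of a suspended app

storage_mb_per_s = {
    "5400rpm HDD (PS4/XBO class)": 100,
    "eMMC 5.x": 250,
    "UFS 2.0": 500,
}

for name, speed in storage_mb_per_s.items():
    print(f"{name:>28}: ~{app_image_mb / speed:.2f} s to reload")
```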

Edit:

Is HBM2 still prohibitively expensive? All I see when searching for info is a "high cost" and "price point" being a problem, but is there any cost comparison to other RAM types available?

I don't doubt that it's likely expensive and unlikely to be used, but it's just something I'm curious about.

It's "expensive", but Wii U's eDRAM was also "expensive", as was Gamecube's 1T-SRAM. How expensive it is relative to those, though, I have no idea.

There are a couple of factors which would work in Nintendo's favour, though, when it comes to cost. The first, obviously, is that it's a single stack, which means from a pure DRAM component cost point of view it would be one quarter of the four-stack configurations which are expected to be used in upcoming GPUs. The second is that, as far as I can tell, a large portion of the cost of HBM/HBM2 comes from the use of a silicon interposer, which is an extra die that the GPU and HBM stacks all sit on. This interposer obviously has a non-trivial cost in and of itself, but then it, along with the GPU and HBM dies, all need to be sent to another company to actually assemble everything together. By comparison, whatever Nvidia has planned with a single HBM stack doesn't use a silicon interposer, but rather uses TSMC's integrated fan-out (InFO) packaging, which is much cheaper and can be carried out entirely in-house by TSMC.
 
Is HBM2 still prohibitively expensive? All I see when searching for info is a "high cost" and "price point" being a problem, but is there any cost comparison to other RAM types available?

I don't doubt that it's likely expensive and unlikely to be used, but it's just something I'm curious about.

Samsung is working on low-cost HBM. I don't know if that'd be ready to ship in the NX, or exactly how cheap it is, though.

HBM2 is also still more power hungry than LPDDR4.
 
If this can run Breath of the Wild and/or Mario Kart 8 with better graphical fidelity than the Wii U, are we still going to care what the RAM bandwidth is or which chipset they chose?
 
I'm really intrigued about what the USB ports on the dock are for considering there's no external HDD support...

Keyboard, LAN adapter, GCN controller adapter, ???

Rock Band Instruments, Donkey Konga Bongos.

I've seen the "USB = external HDD" assumption all over this forum recently for the Switch. I have no idea when external HDDs became the only use for USB ports, lol. It's kind of crazy how many people have made that comparison since the Switch reveal. There are so many accessories that will benefit from USB ports. One of the main ones right off the bat would likely be charging the Switch Pro Controller, not to mention all the accessories MacTag and foltzie mentioned.
 
If this can run Breath of the Wild and/or Mario Kart 8 with better graphical fidelity than the Wii U, are we still going to care what the RAM bandwidth is or which chipset they chose?

Because hardware performance is the only thing that could prevent it from getting the same third-party support as PS4 or XB1, obviously.
 
If this can run Breath of the Wild and/or Mario Kart 8 with better graphical fidelity than the Wii U, are we still going to care what the RAM bandwidth is or which chipset they chose?

Well, I care how much better they are going to look and at what resolution, yes. Just being better than the Wii U is not particularly exciting.
 
Because hardware performance is the only thing that could prevent it from getting the same third-party support as PS4 or XB1, obviously.

Even if this thing used Pascal/Parker, it still wouldn't get the same third-party support as XB1 or PS4, because it isn't x86. It is a different architecture from the other two.

Let's be honest here: Nintendo hedged their bets this time around by combining their console and portable development teams onto one platform. They are going to work with their existing second parties across both their platforms to bring it to one.

Unless this thing sells at the pace of the original Wii or the PS4, it isn't going to get triple-A games from third parties, because regardless of which Tegra chipset they choose it is going to be underpowered and a different architecture. We need to temper expectations and realize it isn't going to garner as much third-party support as, let's say, the GameCube. This system is going to be a marginal upgrade from the Wii U that you will be able to take on the go with you and play at home.

We aren't going to get Wii -> PS4 level graphical upgrades here. People need to stop dreaming and come back to reality, lol.
 
Edit:

It's "expensive", but Wii U's eDRAM was also "expensive", as was Gamecube's 1T-SRAM. How expensive it is relative to those, though, I have no idea.

There are a couple of factors which would work in Nintendo's favour, though, when it comes to cost. The first, obviously, is that it's a single stack, which means from a pure DRAM component cost point of view it would be one quarter of the four-stack configurations which are expected to be used in upcoming GPUs. The second is that, as far as I can tell, a large portion of the cost of HBM/HBM2 comes from the use of a silicon interposer, which is an extra die that the GPU and HBM stacks all sit on. This interposer obviously has a non-trivial cost in and of itself, but then it, along with the GPU and HBM dies, all need to be sent to another company to actually assemble everything together. By comparison, whatever Nvidia has planned with a single HBM stack doesn't use a silicon interposer, but rather uses TSMC's integrated fan-out (InFO) packaging, which is much cheaper and can be carried out entirely in-house by TSMC.

Samsung is working on low-cost HBM. I don't know if that'd be ready to ship in the NX, or exactly how cheap it is, though.

HBM2 is also still more power hungry than LPDDR4.

Interesting. So it's not necessarily out of the question, and it is something Nintendo of old would do. But we're not necessarily dealing with Nintendo of old anymore, so who knows what's going on. Insiders who have seen the specs in the OP and "confirmed" that what we're getting is close to that may not be talking about the type of RAM, or may not be tech people who know or care about the distinction between types of RAM.

If this can run Breath of the Wild and/or Mario Kart 8 with better graphical fidelity than the Wii U, are we still going to care what the RAM bandwidth is or which chipset they chose?

It's interesting to discuss for a few reasons. One, some people have a genuine interest in the tech inside game consoles and like to discuss possibilities based on what's available in the market. Two, it's nice to know whether or not third parties will have an easy enough time porting their games over to the Switch, and what possible bottlenecks they may run into.

And three, there's really not much else about the Switch to discuss at this point, and I at least find it quite interesting and entertaining talking about the Switch.

Even if this thing used Pascal/Parker, it still wouldn't get the same third-party support as XB1 or PS4, because it isn't x86. It is a different architecture from the other two.

Let's be honest here: Nintendo hedged their bets this time around by combining their console and portable development teams onto one platform. They are going to work with their existing second parties across both their platforms to bring it to one.

Unless this thing sells at the pace of the original Wii or the PS4, it isn't going to get triple-A games from third parties, because regardless of which Tegra chipset they choose it is going to be underpowered and a different architecture. We need to temper expectations and realize it isn't going to garner as much third-party support as, let's say, the GameCube. This system is going to be a marginal upgrade from the Wii U that you will be able to take on the go with you and play at home.

We aren't going to get Wii -> PS4 level graphical upgrades here. People need to stop dreaming and come back to reality, lol.

To your first point, many developers have come and told us there is very little problem porting from x86 to ARM, so that is absolutely not a barrier for ports.

Also, it's possible that the Switch winds up closer in real world performance to XB1 than XB1 is to PS4, so power isn't likely to be a reason why ports wouldn't come. We also have some insider reports stating that ports should not be a technical problem, and that comes from very trustworthy insiders.

But you're absolutely right that we shouldn't expect every AAA multiplat to get a Switch port unless it really takes off in sales. Those games typically don't sell well enough on Nintendo hardware to justify an investment for a port, although lowering the cost/effort in making those ports as much as possible will go a long way in encouraging said ports.
 
I tell ya, what I'm concerned about is the OS having less than a gig to work with. The Wii U OS was an interminable slog and if the Switch one isn't nice and peppy and modern feeling it's gonna be a huge bummer.

It's possible that the OS is more 3DS-like, with less bloat than the Wii U's, but I hear you.
When you turn on the Wii U, you have to pass through three menus before the thing even starts up properly.
 
Interesting. So it's not necessarily out of the question, and it is something Nintendo of old would do. But we're not necessarily dealing with Nintendo of old anymore, so who knows what's going on. Insiders who have seen the specs in the OP and "confirmed" that what we're getting is close to that may not be talking about the type of RAM, or may not be tech people who know or care about the distinction between types of RAM.

Ordinarily I wouldn't say there was any chance of Nintendo using HBM2 in a portable device, but there are only a very small number of potential things that the Nvidia chip in question could be, and as the Switch SoC is one of them, I feel it's worth considering as an outside chance.

Seronei is quite right that HBM2 is more power-hungry than LPDDR4; however, it would likely be clocked down from the standard 2GHz (and possibly even further in portable mode), which could put it in a more reasonable position, certainly on a pJ/bit metric. The active cooling also points towards Nintendo targeting a much higher power draw than they usually would for a handheld system.

Regarding the information that has been leaked, if they were using HBM2 (which I should stress I don't expect, but I'm going along with the hypothetical here), then they would be the first company using TSMC's multi-chip InFO (unlike the Apple A10, which uses InFO for a package on package configuration), and would require perhaps an order of magnitude more TIVs than the A10, meaning final SoCs with HBM may simply not be ready for use in third-party dev-kits yet, and developers are still working on TX1 based kits.
 
Even if this thing used Pascal / Parker it still wouldn't get the same third party support as X1 or PS4 because it isnt x86.. It is a different architecture than the other two.

Different architectures are pretty much a non-issue these days for a variety of reasons. That's especially true of ARM, which is even more widely used than x86 these days and trivial to port to from x86. Whether the Switch is x86 or not has no bearing on porting ease.

Even when the Wii U used PowerPC, the whole architecture thing was kinda overblown.
 
A 2D game and last-gen or portable up-ports aren't really the best metric here. By that metric PS3 was totally 1080p-ready as well.
Should've been. But give people infinite processing power and they'll find a way to use it on more precise skin wrinkles instead of defaulting to 1080p60 games.
th3sickness said:
If this can run Breath of the Wild and/or Mario Kart 8 with better graphical fidelity than the Wii U, are we still going to care what the RAM bandwidth is or which chipset they chose?
It's if there's some interesting non-Nintendo game that can't easily be ported that we might care.
 
But you're absolutely right that we shouldn't expect every AAA multiplat to get a Switch port unless it really takes off in sales. Those games typically don't sell well enough on Nintendo hardware to justify an investment for a port, although lowering the cost/effort in making those ports as much as possible will go a long way in encouraging said ports.

I mean, it is nice to discuss the specs. But I think people have to understand the Switch is not going to get COD, Titanfall, or Battlefield because of how underpowered it is. Online communities drive the life of those games, and not enough Nintendo fans will buy them to justify making them (we don't even know what online looks like for this system). Nintendo is expecting those developers to make an experience for the Switch rather than bring over the same experience.

Most developers can't afford to make a special experience outside of their core product. This is why the Switch's potential is going to be limited.
 
Different architectures are pretty much a non-issue these days for a variety of reasons. That's especially true of ARM, which is even more widely used than x86 these days and trivial to port to from x86. Whether the Switch is x86 or not has no bearing on porting ease.

Even when the Wii U used PowerPC, the whole architecture thing was kinda overblown.

You say that, but almost every developer has said that the Wii U was a lot harder to work with because of PowerPC. You had to be much craftier to develop for it due to the limitations of the hardware, which this time around is going to be the exact same problem. ARM may be a lot easier, but the specs still aren't on par.
 
You say that, but almost every developer has said that the Wii U was a lot harder to work with because of PowerPC. You had to be much craftier to develop for it due to the limitations of the hardware, which this time around is going to be the exact same problem. ARM may be a lot easier, but the specs still aren't on par.

As far as I know, the difficulties do not come from different ISAs, but from the lack of availability of third-party libraries and frameworks for those ISAs, especially when they make use of ISA-specific features like one of the various vector instruction sets. I imagine that you can get any relevant third-party component for ARM these days. The same has very likely not been true for PowerPC.
 