Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

I think it's per 1GB stack. What I looked up was for at least 4GB of HBM.

Ah I see, that makes sense. Thanks.

So yeah, it looks like HBM wouldn't be possible in that form factor in any appreciable amount, without even getting into costs. I wonder if the cheaper low-power HBM might show up in a mid-gen revision or the next iteration.

Either way I'm not terribly worried about Switch RAM as Nintendo typically understands how to customize their RAM properly.
 
If it hasn't been offered already, another dumb pun:

Nintendo Switch |OT| The Legend of Bezelda


From Nintendo? We absolutely will not.

Was withholding specs something that more or less started when Iwata became the head honcho? Maybe with a new head, they'll be a bit more open about such details. It definitely feels like Nintendo's going in a new direction in other ways (in design strategy, partnerships, and marketing to varying degrees), so I wouldn't discount a change here, as well.


The only other major porting concern, I believe, is it seems we will be relying on the game carts for storing the game data. No idea if patches will be applied to those carts, or if you'd have to put them on a microSD or something like that.

While I don't really think it's something going on with the Switch, I really like the idea of patching cards. I vaguely recall that the Nintendo 64DD was partitioned or otherwise set up such that part of the media was read-only and the rest was writable. That way, a game on disk was protected but developers could save game data to the same physical storage device.
 
It's as if these arguments are shot down, but people forget pages later because they can only remember so many pages worth of information.

I think it's literally the same people arguing about the memory and deliberately discarding all the explanations about it. I think I'm going to post it one more time and just copy-paste or link it the next time this pops up.

1) Bandwidth: As long as the Tegra chip can get 50 GB/s like Tegra Parker, the bandwidth is really healthy in GB/s-per-teraflop terms. I'm not expecting as much, but even at 0.768 Tflops the memory bandwidth ratio is way above most of the GTX line, and while it's true that the bandwidth needs to be shared with the CPU cores, those need much less than the GPU, and as said above there is plenty to spare.

Code:
|  GPU          |  Compute    |  Bandwidth  |  GB/s per Tf  |
|  GTX Titan X  |  10.974 Tf  |  480 GB/s   |  43.74        |
|  GTX 1080     |   8.873 Tf  |  320 GB/s   |  36.06        |
|  GTX 1070     |   6.463 Tf  |  256 GB/s   |  39.61        |
|  GTX 1060     |   4.372 Tf  |  192 GB/s   |  43.91        |
|  Tegra max    |   0.768 Tf  |   50 GB/s   |  65.11        |
|  Tegra min    |   0.512 Tf  |   50 GB/s   |  97.65        |
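If anyone wants to sanity-check that last column, it's just bandwidth divided by compute. Quick Python sketch using the same assumed figures as the table (nothing official, obviously):

Code:
# GB/s of bandwidth per teraflop of compute, using the table's assumed figures.
gpus = {
    "GTX Titan X": (10.974, 480),
    "GTX 1080":    (8.873, 320),
    "GTX 1070":    (6.463, 256),
    "GTX 1060":    (4.372, 192),
    "Tegra max":   (0.768, 50),
    "Tegra min":   (0.512, 50),
}
for name, (tflops, gb_per_s) in gpus.items():
    print(f"{name:12s} {gb_per_s / tflops:6.2f} GB/s per Tflop")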

2) Memory size: Every modern game runs well at medium settings on 2GB GPUs, as long as the card is decently capable and you don't crank settings to very high/ultra. YouTube has plenty of examples that are easy to check in like 2 minutes.

Quick example: Battlefield 1 @ 1080p, medium settings, GeForce 960M 2GB
 
While I don't really think it's something going on with the Switch, I really like the idea of patching cards. I vaguely recall that the Nintendo 64DD was partitioned or otherwise set up such that part of the media was read-only and the rest was writable. That way, a game on disk was protected but developers could save game data to the same physical storage device.
I believe the Vita was actually set up this way; its cards use EEPROM for everything, partitioned between game data and rewritable space. Unfortunately they also maxed out at 4GB, so devs tended to use as much space as possible, which pushes saves and patches to system memory, and the cost of EEPROM likely made jumping to 8GB prohibitive.
 
It's as if these arguments are shot down, but people forget pages later because they can only remember so many pages worth of information.
That's what happens when you only have 3.2GB of RAM in your brain instead of 5.

I'll see myself out
 
Not sure if I missed this discussion, but did anyone talk about why the Switch has long vents on the back?

 
You won't be short on RAM over barely a 2GB difference when it's obvious you'll hit a wall first because of the lower GPU horsepower. That, and Maxwell and Pascal have better VRAM management.
Yep, I suppose 3GB is actually quite fitting for the portable system's capabilities; in that sense Nintendo might know what's up much better than we do...

Though I would still have liked 4GB for games; it sounds better on paper when considering ports and future-proofing.
 
Yep, I suppose 3GB is actually quite fitting for the portable system's capabilities; in that sense Nintendo might know what's up much better than we do...

Though I would still have liked 4GB for games; it sounds better on paper when considering ports and future-proofing.

I look at what was possible on the Wii U with the 1 gig of RAM for games: Bayonetta 2, Xenoblade X, Breath of the Wild, and Mario Kart 8. With 3 times the RAM and more CPU/GPU power they should be able to do a lot more.
 
That notch behind the USB port is new to me; maybe they have a proprietary dock connector in there?

My first thought is that it lets the USB port pivot as if it's on a hinge so that you can change the charger orientation to allow charging while the Switch is sitting on a surface in kickstand mode.

But that seems like an unnecessary addition of a point of failure, so I'm not really sure if it makes sense.
 
Just out of curiosity, why is HBM out of the question*? I think we discussed this ~20 pages back, but is the cost really that prohibitive? Like, do we know if it's something like 2-3x the cost of LPDDR4, or more like 20-30x?

*I don't expect it because everyone constantly says not to but I'm curious about the reasoning.

It's probably too expensive, and it probably consumes too much power, but it's hard to make definitive statements on either.

It's certainly unlikely, but we can't 100% rule it out, as we know that Nvidia has a part in production with one logic die and one four-die HBM stack using TSMC's InFO packaging, and the Switch SoC is one of only two plausible candidates for that chip (the other being Xavier, which shouldn't technically be "in production" yet, for that matter). Using InFO would potentially bring the cost down quite a bit, although again it's impossible to say by how much.

Besides cost, there's power. The theoretical low for current HBM is 14.6W from what I found, while LPDDR4 would be 1.02W. HBM would kill the battery faster. It would be nice to have, but there would probably only be 2GB at best inside the Switch in order to save as much battery as possible.

That's for four HBM stacks, whereas Nintendo would only be using one. They could also clock HBM2 quite a bit lower than the standard 2GHz (which gives 256GB/s per stack), potentially to 1GHz or below, which would reduce the power draw quite a bit (they could even clock it down in portable mode and up in docked mode alongside the GPU if they felt like it). Sub 2W power draw would in theory be possible, which is certainly still more than LPDDR4, but not impossibly so.
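For anyone wondering where the 256GB/s-per-stack figure and the "just clock it lower" idea come from, here's a back-of-envelope sketch. It only assumes the standard 1024-bit interface per HBM2 stack; the lower data rates are purely illustrative, nothing confirmed for Switch:

Code:
# Bandwidth of a single HBM2 stack = bus width (bits) * per-pin data rate (Gb/s) / 8.
# The 1024-bit interface per stack is standard HBM2; the rates below are illustrative only.
BUS_WIDTH_BITS = 1024

for rate_gbps in (2.0, 1.5, 1.0, 0.8):
    bandwidth_gb_s = BUS_WIDTH_BITS * rate_gbps / 8
    print(f"{rate_gbps:.1f} Gb/s per pin -> {bandwidth_gb_s:.0f} GB/s per stack")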

I thought HBM got into far lower power envelopes than that. The 7th slide at this link says 3.3W at 128GB/s bandwidth for HBM, which I think could be clocked lower to suit the Switch's needs. Unless I'm reading that slide wrong which I very well could be.

Edit: Ah that might be for the new LPHBM which isn't available yet?

That's first-gen HBM. As far as I'm aware HBM2 consumes slightly more power operating at full speed (2GHz compared to HBM1's 1GHz), but should consume quite a bit less when clocked down to give ~128GB/s bandwidth.

At manufacturer's prices and at large volume it shouldn't make an enormous difference if it's 2-3x more expensive for one component. I would say it would take the BoM from, say, $175 to $185-190 or something. Completely out of my ass though.

According to IHS, the 4GB of LPDDR4 used in the Galaxy S7 cost about $25 for a launch in March this year. A year later, and assuming Nintendo wouldn't use PoP, it would probably cost Nintendo somewhere from $15-$20.

HBM2 is much harder to price, and there's both the component cost and the packaging cost to consider. From the packaging costs point of view, the use of InFO should bring costs down substantially over the use of a silicon interposer (InFO is used in, for example, the Apple A10, although that implementation is a little different). Then, obviously the use of only one HBM stack (versus four in all existing HBM implementations) would bring the component cost down by 75%. The issue is that we don't really know how expensive existing HBM implementations are, how much of that is component cost versus packaging cost, or precisely how big the savings are on the packaging side.

Just for fun, though, let's make a first-order approximation. To do so, let's assume that the savings of InFO over a silicon substrate are between 50% and 75% (this is roughly plausible, but more importantly it means we can ignore the cost breakdown between components and packaging). Let's also assume that AMD launch their new Vega GPUs at around the same time as Switch, to use them for comparison purposes. Judging by current rumours, it seems likely that AMD will have a "smaller" version of Vega with 8GB of HBM2 (two stacks) and a higher-end version with 16GB of HBM2 (four stacks). The 8GB version seems to be competitive with the GTX 1070/1080, so let's say it launches at $500.

To give a benchmark of memory cost on AMD's ~$500 GPUs, we know that in late 2013, when PS4 launched, 8GB of GDDR5 cost about $88. AMD at the time were using 4GB of GDDR5 in their top-end GPUs, although they used a wider bus than PS4, so let's say it was costing them about $50 for memory for their ~$500 cards.

I don't think it's safe to assume that Vega will have the same memory cost as the R290/290X, as GDDR5 was a mature tech at the time, and HBM2 isn't now. Let's put an upper limit at double the cost though. So, at a maximum, we'd be looking at $100 for a two-stack HBM2 solution on a silicon substrate, and with a 50% saving with InFO, that would translate to a cost of $50 for Nintendo to use one HBM2 stack with InFO packaging.

For the lower limit, let's look at the "big" Vega with 16GB HBM2. Assuming it's going to be competing with Titan XP/1080Ti, it should be comfortably over $500 (how far over doesn't really matter too much). We'll assume, on a lower limit, that the cost of a four-stack HBM2 memory pool is $100 in this case. If we take a 75% cost reduction from moving from a silicon substrate to InFO, then that would put the lower-limit cost for a one-stack InFO HBM pool to $25 for Switch.

So, with some very rough approximations, we can put the cost of 4GB of HBM2 using InFO packaging at somewhere between $25 and $50. Compared to LPDDR4, this would be an increase of anywhere from $5 to $35. I should emphasise that these are very rough estimates, and I personally feel the higher end of that spectrum is more likely, but it's hard to say with any degree of certainty.
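If anyone wants to poke at those numbers, the whole approximation fits in a few lines of Python. The reason the component/packaging split can be ignored is that in each case the stack-count reduction and the assumed InFO saving are the same factor, so the total just scales by that factor. All the dollar figures are the rough guesses from above, nothing more:

Code:
# First-order sketch of the HBM2-with-InFO cost estimate above. All inputs are guesses.
def one_stack_info_cost(reference_total, remaining_fraction):
    # If dropping to one stack keeps `remaining_fraction` of the component cost and
    # InFO keeps the same fraction of the packaging cost, the total scales by it too.
    return reference_total * remaining_fraction

upper = one_stack_info_cost(100, 0.50)   # $100 two-stack on interposer, 50% kept -> $50
lower = one_stack_info_cost(100, 0.25)   # $100 four-stack on interposer, 25% kept -> $25
lpddr4_low, lpddr4_high = 15, 20         # assumed 4GB LPDDR4 cost range from above

print(f"4GB HBM2 (one stack, InFO): ${lower:.0f}-${upper:.0f}")
print(f"Premium over LPDDR4:        ${lower - lpddr4_high:.0f}-${upper - lpddr4_low:.0f}")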
 
-HBM discussion-

Thanks for that, very informative! At this point, now that we have more than one internal fan being rumored along with another fan in the dock, I'm starting to think this might have a bit of a bigger power envelope in portable mode than we previously thought. I wonder if HBM would essentially solve all of their present and future RAM issues while explaining some of the increased power draw.

Price seems like less of an issue than I previously thought, though as you said, those are very rough estimates. It'll certainly be interesting to see if they discuss any specs at the January event. I'm guessing if they did opt to use HBM then they'd want to publicly state such, as that could be an interesting selling point.

Of course I doubt they'd run into issues with 4GB of LPDDR4 like people have been saying here, but Nintendo has historically splurged on excessive RAM solutions. Who knows.
 
Here's a question. Do you think Nintendo could get away with trimming the fat in terms of OS features to use less RAM? I mean I still adore the GC's OS, it's simple, fast, and easy to use. A far better menu than the Wii U's, that's for sure.

Surely some features like screenshot saving can be done with far less than 800MB of RAM dedicated to the OS.

The only problem with that is that the GC didn't really HAVE an OS. There was no background anything; past the opening animation and the options menu there was nothing really for it to do. AFAIK with both the GameCube and the Wii, the system drivers were loaded from the game discs themselves to be run, not provided by an OS.

There's really no way to compare what the GameCube was doing versus what a modern OS needs to do in terms of loading, multitasking, overlays, etc.
 
I look at what was possible on the Wii U with the 1 gig of RAM for games: Bayonetta 2, Xenoblade X, Breath of the Wild, and Mario Kart 8. With 3 times the RAM and more CPU/GPU power they should be able to do a lot more.
I agree, I'm not worried about first-party games, exclusives, or collaborations at all; they will look and run awesome :)

Just a little bit worried about third-party ports, like the rumored Dark Souls III from From :O
 
The only problem with that is that the GC didn't really HAVE an OS. There was no background anything; past the opening animation and the options menu there was nothing really for it to do. AFAIK with both the GameCube and the Wii, the system drivers were loaded from the game discs themselves to be run, not provided by an OS.

There's really no way to compare what the GameCube was doing versus what a modern OS needs to do in terms of loading, multitasking, overlays, etc.

Yup, this is correct. Neither the GC nor the Wii had an operating system (too slow). Remember that "system menu" that would always pop up when you hit the Home button? That's a process that developers HAD to include in their game code.

So if you hit "Wii Menu", that just shuts off the game and starts the menu program.
 
Yup, this is correct. Neither the GC nor the Wii had an operating system (too slow). Remember that "system menu" that would always pop up when you hit the Home button? That's a process that developers HAD to include in their game code.

So if you hit "Wii Menu", that just shuts off the game and starts the menu program.
Why does a console primarily meant for playing games need an operating system again?

Sure, some of the features are neat, but I would drop most of them for having nearly 100% of the system resources running the game, like it used to be before the current gen.
 
The only problem with that is that the GC didn't really HAVE an OS. There was no background anything; past the opening animation and the options menu there was nothing really for it to do. AFAIK with both the GameCube and the Wii, the system drivers were loaded from the game discs themselves to be run, not provided by an OS.

There's really no way to compare what the GameCube was doing versus what a modern OS needs to do in terms of loading, multitasking, overlays, etc.

The Wii is actually a bit more complicated because of IOS. IOS does a smallish subset of what you'd expect from an operating system, but runs on a completely separate CPU from the actual games. And that's not even getting into the wide variety of IOS versions. The Wii is really a bizarre half measure between having an OS and not having one.

The Wii U also technically has 2 OSes since it inherited the IOS thing from the Wii.

Why does a console primarily meant for playing games need an operating system again?

Sure, some of the features are neat, but I would drop most of them for having nearly 100% of the system resources running the game, like it used to be before the current gen.

Patches, various online services, having literally anything happen in the background (like downloads), and security are all either much more difficult or impossible without an OS.
 
I think it's literally the same people arguing about the memory and deliberately discarding all the explanations about it. I think I'm going to post it one more time and just copy-paste or link it the next time this pops up.

Quick example: Battlefield 1 @ 1080p, medium settings, GeForce 960M 2GB

When it comes to third-party ports, it's almost as if some people take the very notion of what is essentially a tablet running these big-name games as an insult to their chosen home console hardware. Nintendo not having the vast majority of the big-name third-party games in the past two generations makes the very thought of some of these franchises appearing on Switch a joke to some people as well.

There are so many tweaks and optimisations developers can make to suit the hardware when developing for a fixed platform, yielding performance gains beyond even the massive framerate jumps we see in PC graphics-settings videos like the one above.

For some of the more demanding titles like the DICE BF games they could even target 30fps instead of 60fps or sub native resolutions. I'm sure Switch owners would rather have graphically intensive games like BF1 at 30fps than not at all. People got on just fine last gen playing BF mp at 30fps on console after all.

I personally think that there will be quite a few very big name, current gen third party games announced for Switch in January and the following months. The games won't look anything like as good as they do on PS4 or even XB1 but by lowering graphical settings and releasing on Switch developers will be able to market their games as 'on the go' experiences rather than marketing to graphical enthusiasts (who are mostly on PC anyway).

On the subject of the OS, I wouldn't be bothered if there was no access to web browsers or the eShop while a game is running, as long as it meant developers getting their hands on an extra half a gig of memory to play with. The fact that my PS4 uses almost half of its memory to constantly record gameplay for features I never use is extremely annoying to me.
 
I think Nintendo learned its lesson from how awful and non-scalable the Wii U and 3DS operating systems were. I doubt they go down the same path again; it will be something more fluid and flexible, including achievements, imo.
 
On the subject of the OS, I wouldn't be bothered if there was no access to web browsers or the eShop while a game is running, as long as it meant developers getting their hands on an extra half a gig of memory to play with. The fact that my PS4 uses almost half of its memory to constantly record gameplay for features I never use is extremely annoying to me.

I'd be VERY surprised if it uses half its memory to record gameplay, but certainly there's something going on to warrant 3GB of RAM usage for the OS on a console.

For me the things I'd want/use are:
Background downloading whilst I play and when in sleep mode.
Pause the game and launch another game/app from the menu. (The default on consoles, since there's no "Exit game" option.)
Ability to check the time while playing (the previous one should cover that).
Notifications from friends list to join a game and ability to join said game whilst I'm playing.

Lesser, would be opening the shop and starting a download of a game if I remembered whilst playing but it's not super important.

That's about all I can think of for now. Not exactly heavy resource usage. I'd hope they could get an efficient OS in 256MB or less.
 
Hmm, I really doubt there's more than one fan in the handheld. I don't think that would make any sense at all. The article does seem to say that there is, though, by repeatedly referring to the handheld as "the system" and then talking about "the system's inbuilt fans".

But I still think it's just a mistake in the way it's been written.
 
Been keeping an eye out for any more random Tegra benchmarks, but nothing recently, and the last set, which seemed to trace to a Pascal-based device, was removed.

The last benchmark seemed to be from about a 640 Gflop device. Problem is it had far more RAM than stated for the Switch.
 
Hmm, I really doubt there's more than one fan in the handheld. I don't think that would make any sense. Then again, that article does seem to say that there is more than one fan in the handheld.

When I first read it I thought "the system's inbuilt fans" referred to the handheld and the dock combined as "the system". But reading it again, it repeatedly refers to "the system" as something separate from the dock.

Not sure what to think about that article...

I just figured it meant there were two fans. One in the dock and one in the main unit itself. I guess with the wording it could be interpreted as multiple fans in the main system, but that would be kinda overkill and unnecessary, right?
 
Hmm, I really doubt there's more than one fan in the handheld. I don't think that would make any sense. Then again, that article does seem to say that there is more than one fan in the handheld.

When I first read it I thought "the system's inbuilt fans" referred to the handheld and the dock combined as "the system". But reading it again, it repeatedly refers to "the system" as something separate from the dock.

Not sure what to think about that article...

The console and the dock are two separate entities:
 
Been keeping an eye out for any more random Tegra benchmarks, but nothing recently, and the last set, which seemed to trace to a Pascal-based device, was removed.

The last benchmark seemed to be from about a 640 Gflop device. Problem is it had far more RAM than stated for the Switch.

Dev kit?
 
Been keeping an eye out for any more random Tegra benchmarks, but nothing recently, and the last set, which seemed to trace to a Pascal-based device, was removed.

The last benchmark seemed to be from about a 640 Gflop device. Problem is it had far more RAM than stated for the Switch.

What was the date, and how much RAM did it have?

We should keep an open mind with regards to the specs imo. Eurogamer, Emily and Laura have been great and bang on with the Switch leaks, but they're not infallible. Things like specs can be changed at the last minute before mass production: the 360's and PS4's memory was doubled in the months leading up to launch, and even the Wii U's CPU and GPU both got clock speed bumps not long before release. I imagine feedback from developers using the latest dev kits is instrumental in knowing which components need to be increased.

If the CPU leaks we've heard are true, then I'm expecting a decent CPU which can compete with the PS4/XB1's CPU, 4GB of RAM and a modern 600 Gflop GPU. If it's anything more than that, it will be a nice surprise.

I'm personally buying Switch for Nintendo games and they have shown time and time again that they can create incredible looking games with the most pathetic of specs. If they can create games as beautiful as MK8, Mario 3D World and Pikmin 3 on WiiU then the mind boggles at what they will be able to achieve with even the above modest specs.

Resolution, anti-aliasing and texture detail will hopefully be the biggest improvements over the Wii U, bringing them ever closer to that Pixar CGI look.
 
What was the date, and how much RAM did it have?

We should keep an open mind with regards to the specs imo. Eurogamer, Emily and Laura have been great and bang on with the Switch leaks, but they're not infallible. Things like specs can be changed at the last minute before mass production: the 360's and PS4's memory was doubled in the months leading up to launch, and even the Wii U's CPU and GPU both got clock speed bumps not long before release. I imagine feedback from developers using the latest dev kits is instrumental in knowing which components need to be increased.

If the CPU leaks we've heard are true, then I'm expecting a decent CPU which can compete with the PS4/XB1's CPU, 4GB of RAM and a modern 600 Gflop GPU. If it's anything more than that, it will be a nice surprise.

I'm personally buying Switch for Nintendo games and they have shown time and time again that they can create incredible looking games with the most pathetic of specs. If they can create games as beautiful as MK8, Mario 3D World and Pikmin 3 on WiiU then the mind boggles at what they will be able to achieve with even the above modest specs.

Resolution, anti-aliasing and texture detail will hopefully be the biggest improvements over the Wii U, bringing them ever closer to that Pixar CGI look.

I doubt Nintendo has the balls for 6GB or more. They should really take whatever price hit, but RAM has been pricey as of late in the component sector. This is one of the few areas they have sucked at in the past. If devs are saying it's another GC situation, they should speak up for themselves.

One of the GC's biggest flaws, the one that fucked it with multiplatform ports, was this area. Hey, we could put certain Xbox titles on GC; whoops, great fast RAM, just not enough of it for Doom 3. Dead serious: if devs, especially big AAA devs, are saying give us more RAM, just listen; 8GB is nothing special even for last-gen engines, and GTA V and Skyrim alone are example enough on PC. To be fair, Dolphin was supposed to have 64MB of embedded good stuff. That was fucking crazy for a 2001 console. Same for having the high-end version of the PowerPCs they wanted.
 
What was the date, and how much RAM did it have?

We should keep an open mind with regards to the specs imo. Eurogamer, Emily and Laura have been great and bang on with the Switch leaks, but they're not infallible. Things like specs can be changed at the last minute before mass production: the 360's and PS4's memory was doubled in the months leading up to launch, and even the Wii U's CPU and GPU both got clock speed bumps not long before release. I imagine feedback from developers using the latest dev kits is instrumental in knowing which components need to be increased.

If the CPU leaks we've heard are true, then I'm expecting a decent CPU which can compete with the PS4/XB1's CPU, 4GB of RAM and a modern 600 Gflop GPU. If it's anything more than that, it will be a nice surprise.

I'm personally buying Switch for Nintendo games and they have shown time and time again that they can create incredible looking games with the most pathetic of specs. If they can create games as beautiful as MK8, Mario 3D World and Pikmin 3 on WiiU then the mind boggles at what they will be able to achieve with even the above modest specs.

Resolution, anti-aliasing and texture detail will hopefully be the biggest improvements over the Wii U, bringing them ever closer to that Pixar CGI look.

The benchmark was from October. The device had 7.5GB of RAM listed in the benchmark. In the end it could simply be a next-gen Jetson board, like the X1 Jetson board in the OP.
 
Well, Wei Yen is doing his own thing. He runs a cloud hosting service which Nintendo uses. He worked with Nintendo to develop the iQue for China.

I'm trying to guess how Nintendo ended up switching partners from AMD to Nvidia that easily, considering the Dolphin architecture was co-developed with ATI (later AMD).
 
I doubt Nintendo has the balls for 6GB or more. They should really take whatever price hit, but RAM has been pricey as of late in the component sector. This is one of the few areas they have sucked at in the past. If devs are saying it's another GC situation, they should speak up for themselves.

One of the GC's biggest flaws, the one that fucked it with multiplatform ports, was this area. Hey, we could put certain Xbox titles on GC; whoops, great fast RAM, just not enough of it for Doom 3. Dead serious: if devs, especially big AAA devs, are saying give us more RAM, just listen; 8GB is nothing special even for last-gen engines, and GTA V and Skyrim alone are example enough on PC. To be fair, Dolphin was supposed to have 64MB of embedded good stuff. That was fucking crazy for a 2001 console. Same for having the high-end version of the PowerPCs they wanted.
I thought a Doom 3 port was unlikely to happen due to the GCN not having the DX9-style graphical features that allowed the Xbox to run the engine. As for the RAM, perhaps the GCN could have had reserved RAM that was much faster than the ARAM it used. Makes you wonder how powerful the system would have been if Nintendo had amped up the budget a little bit more.
 
The latest LPVG article said the fan in the dock is in addition to "small internal fans" which means more than one. Can't find the link now, on mobile.

Edit: http://letsplayvideogames.com/2016/12/report-nintendo-switch-dock-increases-performance-not-via-extra-hardware/

That fan comment is ambiguous. The fan could be in addition to what's in the dock already. I would be very surprised (and annoyed) to find a fan in a handheld. Inevitably they'd come loose and start to make noise.
 
I thought a Doom 3 port was unlikely to happen due to the GCN not having the DX9-style graphical features that allowed the Xbox to run the engine. As for the RAM, perhaps the GCN could have had reserved RAM that was much faster than the ARAM it used. Makes you wonder how powerful the system would have been if Nintendo had amped up the budget a little bit more.

Carmack stated in interviews that the graphical features wouldn't be a problem in a proper port. You couldn't use the slow RAM for what they wanted, though.

I've talked about the Wii/Dolphin/Cube differences a lot, as the Wii is a proper Dolphin far more than the Cube ever was. Shame it was never exploited. It's pathetic that Factor 5 probably used more advanced features in their Lair tech demo than most developers ever did in their actual games, and that Rebel Strike shits on just about 95% of the Wii catalog in pushing the architecture.

Doom on gamecube
Can Wii achieve Doom3?

Both threads go over a variety of things that will make you wonder. At the very least you'll agree they highlight why Nintendo going more standardized is a good thing. Ever since the SNES, even when Doom ports did get made (including the original Doom and the special case of Doom 64), making them was a pain and required sacrifices from the source material. Having an architecture that minimizes that kind of friction is something they literally haven't been able to claim since the NES, to me. I'm looking for cracks in the armor and I can barely find any compared to anything they have done since the N64.
 
But it doesn't only have VRAM

12GB of DDR3-1600 RAM

@Skittzo0413: More than one internal fan in the handheld? Is this new info? Must have missed that.

But you don't think BF uses those 12GB, right? I can't check it, but the system-side processes can fit in the other 1.2GB comfortably, and at those kinds of settings the GPU shouldn't even be using its full 2GB in the game, so the point remains.

P.S. - Browsing a bit, it seems like BF can use more than 1.2GB of system RAM on a Windows PC, but not to the point where it seems like an impossible wall to balance the memory use.

Hmm, I really doubt there's more than one fan in the handheld. I don't think that would make any sense at all. The article does seem to say that there is, though, by repeatedly referring to the handheld as "the system" and then talking about "the system's inbuilt fans".

But I still think it's just a mistake in the way it's been written.

This is what I think as well, people are reading too much into the later quote.

Just looking at the Pixel C running a Tegra X1 @ 850MHz on 20nm passively, why would the Switch need to be actively cooled? I'm expecting a lower clock speed undocked, and it may even be on 16nm.
 