I wouldn't discount a mechanical HDD. Yeah, it's moving parts, but Nintendo are changing, and the bottom line is that they need to keep this thing cheap while also offering a ton of storage so they can continue to push those digital downloads. Still, a PCIe SSD may make a certain sense under the "Nintendo Overcompensation Theory" (bgassassin, 2012). This was demonstrated most famously when Nintendo went from the N64's 4 KB of texture cache to the Gamecube's 1 MB. In this scenario, it would go something like, "You say our OS is slow, do you? How do you like this shit?"
I won't completely discount the possibility of a HDD, but Nintendo have never had a fondness for moving parts, and it would be an unusual point to start using them. I'd expect SSDs to have fully taken over consumer-level electronics by the end of the generation, so if they can get by with NAND in the NX they can skip the spinning-disk storage era altogether. Besides, with USB 3 hard drive support, people would have lots of options for large, cheap storage if they need it.
Overcompensating is definitely possible, but there's already plenty of scope for that with eMMC, with speeds going up to 400MB/s*. If you can't get a snappy OS working off 400MB/s flash, then the flash bandwidth isn't your problem. I haven't seen PCIe flash solutions pushing less than 1GB/s sequential reads, which would be straight up crazy coming from 60MB/s in the Wii U (although I'm sure open-world game designers would be happy with it). The only reason I could see them going with a PCIe SSD is if they decided:
(a) We want to offer storage tiers of 250GB+
and
(b) We still don't like hard drives
Nintendo wouldn't spend money on ultra-fast storage for the sake of ultra-fast storage, but I can see them spending money on it for the sake of not having a hard drive.
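Just to put those bandwidth figures in perspective, here's a rough back-of-the-envelope on what they mean for load times. The 2GB asset bundle is a made-up example, and real loads are rarely pure sequential reads, so treat these as best-case numbers:

```python
# Back-of-the-envelope load times for a hypothetical asset bundle.
# The 2 GB figure and the bandwidth tiers are just illustrative; real game
# loads involve plenty of non-sequential access, so these are best cases.

ASSET_SIZE_GB = 2  # made-up example size

storage_options = {
    "Wii U eMMC (~60 MB/s)": 60,
    "Fast eMMC (~400 MB/s)": 400,
    "PCIe SSD (~1 GB/s)": 1000,
}

for name, mb_per_s in storage_options.items():
    seconds = ASSET_SIZE_GB * 1024 / mb_per_s
    print(f"{name}: ~{seconds:.0f} s to read {ASSET_SIZE_GB} GB")
```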
That said, my money would still be on PCIe just being used in the dev kits.
*In an earlier post I stated that Apple's recent iPhones and iPad Pro used 400MB/s sequential read eMMC flash, but I may have been wrong in my assumption that it was eMMC (which is used in pretty much every other smartphone). Apple purchased a company which designed NAND controllers a while back, and as of the new MacBook, they seem to have started using their own flash controllers in their hardware. That being the case, it's possible that the A9(X) has a flash controller on-chip, and the flash modules used were just straight NAND chips as opposed to eMMC. That said, Toshiba (which make said NAND chips) do advertise eMMC modules which provide 400MB/s sequential read bandwidth, so it's still plausible on the upper end of the eMMC spectrum.
Just stop. As I posted earlier in this thread, I had a dream where the EXT port returned. Inexplicably, the handheld plugged into the bottom of the console, completely hiding the screen. Maybe player 2 would have to lay underneath a glass coffee table? My brain is trolling me.
I think your brain misses the good old days when you'd take your Gamecube out of its packaging and wonder what the hell those three mystery ports at the bottom were for.
Now, that's actually a really smart idea. Maybe it costs them a bit more up front, but the benefit to future hardware revisions is huge.
Well, historically Nintendo hasn't pushed hardware revisions nearly as much as their competitors, but I suppose they have talked about "increasing the number of form-factors", so maybe they are taking the pro-active approach this time around to account for possible revisions.
You assume that they would need to stick to 500mW or lower for the CPU, then point out the clocks for such a dual-core A72 with the assumption that they will use 28nm. Mine is based on 14nm, and it would be 800MHz on that chart with the old 16nm process. Even if we took that as a direct reflection of Samsung's 14nm, the 14LPP process reduces power consumption by 15%, so you'd get a 1GHz dual-core A72 at 500mW. It would also only take up about 4mm² of space at 14nm. Now, if they do happen to go with the 5 year old 28nm process, then yes, the A53 makes more sense, but "sense" isn't the word I would use for using such an old process.
Read further down my post, I wasn't working on the assumption that the CPU would have a budget of 500mW. On a 16nm process, and assuming that two A53 cores take up the same die space as one A72 (which I think is actually an underestimate), then Nintendo would have to dedicate at least 1W to the CPU in a situation where they're comparing 2 A72s to 4 A53s, or 1.6W if they're comparing 4 A72s to 8 A53s, to make A72s worth their while. The numbers would be similar for 14nm. Given that I'd expect around 2W for the entire SoC, and that would be heavily weighted towards the GPU, I would be extremely surprised if they were in a situation where A72s made sense.
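For anyone who wants to poke at the numbers themselves, here's the kind of comparison I mean, sketched out in Python. Every figure in it (per-core power scaling, clock caps, the A72-vs-A53 per-clock performance ratio) is a placeholder assumption for illustration, not anything from ARM's datasheets:

```python
# Crude iso-die-area CPU comparison: 4x A53 vs 2x A72 under a shared power
# budget. All figures below are placeholder assumptions, not ARM's numbers.

CONFIGS = {
    "4x A53": {"cores": 4, "mw_per_ghz": 120, "f_max": 1.8, "perf_per_ghz": 1.0},
    "2x A72": {"cores": 2, "mw_per_ghz": 350, "f_max": 2.0, "perf_per_ghz": 1.9},
}

def throughput(cfg, budget_mw):
    """Aggregate throughput (arbitrary units) for a total CPU power budget."""
    # Clock the cores as high as the budget allows, up to an assumed cap,
    # using a simple linear power-vs-frequency model.
    freq_ghz = min(cfg["f_max"], budget_mw / (cfg["cores"] * cfg["mw_per_ghz"]))
    return cfg["cores"] * freq_ghz * cfg["perf_per_ghz"]

for budget_mw in (500, 750, 1000, 1250, 1500):
    results = {name: throughput(cfg, budget_mw) for name, cfg in CONFIGS.items()}
    winner = max(results, key=results.get)
    scores = ", ".join(f"{name}: {score:.1f}" for name, score in results.items())
    print(f"{budget_mw} mW -> {scores} (better: {winner})")
```

With these made-up numbers the quad A53 setup doesn't get overtaken until well past the 1W mark, which is the general shape of the argument above; swap in your own figures and the crossover point moves, but it stays far above what I'd expect Nintendo to budget for the CPU.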
Regarding 10nm (although it's a bit off topic), our friends at Chipworks don't expect any 10nm products to ship this year. Samsung might squeeze out some NAND or DRAM by year's end, but indications are that we won't see any actual CPUs/GPUs/SoCs until 2017 at the earliest (Intel possibly excepted).
And 14LPE really isn't that mature. It's so far only been used in a small number of high-end mobile SoCs (which are both small dies and highly dependent on low power consumption). Qualcomm will start using 14LPP for the new Snapdragon 820, but all their mid-range chips (which are more likely to be representative of the kind of SoC you'd expect to find in a Nintendo handheld) are sticking with 28nm for a good while, perhaps even past the end of the year. AMD are another bit of evidence in favour of it being a slow-maturing node, as they're not shipping any APUs or high-end GPUs on 14LPP this year, which likely means you won't see anything over about 175mm² on it until 2017. In comparison, AMD were putting out 300mm²+ chips on 28nm pretty much immediately, even though it was widely considered to have very poor yields for the first year or two.
I don't think a 14nm SoC for the handheld is completely impossible (and as I said, I actually think Nintendo should go that route), but I don't think there's any reason to expect it.
Honestly, even the Wii U used a newer process when it was launched in 2012 (the 45nm process launched in 2008). The process was forced on Nintendo by NEC since that was the best they had. Nintendo isn't in that same position at all, so the logic of going with 28nm seems to be lacking; instead what I see is expectations based on pure assumption. If I'm wrong and there is real logic to it, I'm all ears.
Wii U's CPU (Espresso) was manufactured on IBM's 45nm process, which was first used in shipping products in 2010 (POWER7). IBM's first 32nm chip (POWER7+) didn't ship to customers until early 2013.
Wii U's GPU (Latte) was manufactured on Renesas's 40nm process. The first 40nm TSMC chips shipped in late 2009, although I can't find a date for Renesas's 40nm process (probably six months to a year after). While you're right that 28nm wasn't an option with Renesas, they did have the option of using a GCN 1.0-based GPU manufactured on TSMC's 28nm node, which they didn't take. The availability of eDRAM may have played a part in this, but it's hard to say.
Ultra high end APU doesn't mean a 6700K + 980 Ti. Even the PS4 APU is "ultra high end", considering that when it came out AMD stated it was the "most powerful" APU they had developed to date. I'm also pretty sure I read here that an ex-Qualcomm guy was in charge of the SoC design, so it's definitely NX related. Google AR and Steam Machines are not even consoles.
Keep in mind that LinkedIn profiles are effectively CVs, so I wouldn't expect modesty on the part of the people writing them. Phrases like "ultra high end" depend heavily on what you're comparing to, and if the comparison is to other Qualcomm products then maybe a device that competes with the Shield TV might be considered "ultra high end". The other possibility is that it's an internal product to offer as a turnkey solution to Android console makers, and they don't actually have a client for it yet.