I get that, but they could easily just use that gap (1.2 to 1.8) for improved lighting or nicer shadows, and it basically would not hamper the handheld downport. I mean, had the Wii U and New 3DS shared a similar architecture, I think MK8 would have been able to run on the handheld... with worse lighting, shadows, resolution and textures obviously, but the game would still offer the same gameplay.
Nicer shadows (in the sense of higher-resolution shadow maps) are something that scales extremely easily, but having "improved lighting" on the home console relative to the handheld generally means maintaining and optimising two different lighting systems, which is often the most challenging aspect of engine development. At the easiest end of the spectrum you can simply disable features of your engine on the handheld, but you risk having the handheld versions of games look like low-spec PC titles: flat and devoid of the game's aesthetic character. At the other end, if you're using deferred rendering on the home console and simply can't manage that on the handheld, then you have to rewrite huge chunks of your game engine under a completely different paradigm.
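A toy sketch of that difference, purely illustrative (the settings and pass names here are invented, not from any real engine): a shadow map resolution is just a number you turn down for the handheld, whereas forward vs. deferred lighting means two structurally different render paths that both have to be written, debugged and optimised.

```python
from dataclasses import dataclass

@dataclass
class QualitySettings:
    shadow_map_size: int   # scales almost for free: e.g. 2048 on console, 512 on handheld
    lighting_path: str     # "deferred" vs "forward" is not a dial, it's a different pipeline

def frame_passes(settings):
    """List the render passes a frame would need under these (made-up) settings."""
    # Shadow pass: same code on both platforms, only the resolution differs.
    passes = [f"shadow pass @ {settings.shadow_map_size}x{settings.shadow_map_size}"]
    if settings.lighting_path == "deferred":
        # G-buffer plus screen-space lighting: leans on bandwidth/memory the handheld may lack.
        passes += ["g-buffer pass", "deferred lighting pass", "forward pass for transparents"]
    else:
        # Forward shading: a structurally different path that has to be maintained separately.
        passes += ["forward opaque pass (lights evaluated per draw)", "forward transparent pass"]
    return passes

print(frame_passes(QualitySettings(2048, "deferred")))  # home console
print(frame_passes(QualitySettings(512, "forward")))    # handheld
```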
Obviously it is possible to develop games across highly divergent hardware capabilities; look at PS4/Vita cross-play games or even Smash 4. The further apart the two platforms are, though, the less you can share between them in terms of code, assets, and even areas like QA and optimisation. These are all things which cost money to duplicate, and which Nintendo have talked about not wanting to duplicate for future games.
Polaris 10 and 11 are set to release in Q2 of 2016 (fiscal year, so think the fall/back-to-school season in the US), and Vega 10 with HBM2 is off somewhere in 2017. For a winter release Nintendo would need to be in mass production sometime this summer, so it wouldn't be outside the realm of possibility that the semi-custom chips are 14nm. I think it's important to remember that AMD has had designs taped out for 14nm since late 2014, and it was the slow process of tooling up at GlobalFoundries that caused them to abandon 20nm entirely and go straight to 14. I know this because one of the parents at my daughter's daycare is an engineer at AMD for their server business, and I picked her brain once at a birthday party about the whole buyback disaster. If Nintendo has been working on NX since sometime in 2014 and mostly through 2015, they very well could have been offered 14nm as an option knowing their release would be late 2016.
Is it likely? I doubt it, knowing how conservative Nintendo is, but it's not impractical or impossible. If they went x86, I would think 14nm is completely off the table, since Zen would be their only option and it's completely unproven, whereas Polaris is an evolution of the proven GCN cores; betting on an unproven CPU core doesn't seem like Nintendo's style at all. But ARM opens up a lot of options and scalability for both the handheld and the console, and AMD has access to the entire portfolio of ARM reference designs.
Notice that, for the first time, AMD are releasing their entry-level and mid-range cards first, and not their high end. The reasons for this would be twofold. The first is that, on a low-yield process, smaller dies are much better value (the expected number of defects scales with die area, so yields fall off exponentially as dies get bigger). This is why we're seeing so many phone & tablet SoCs on 14nm & 16nm before much bigger CPU/GPU dies. The second reason is that low and mid-range chips are also used as laptop GPUs, where margins are far higher.
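To put some rough numbers on the die-size point: under the simple Poisson yield model (defect-free yield ≈ e^(−defect density × die area)), small dies hold up far better on an immature process. The defect densities and die areas below are illustrative guesses, not actual GlobalFoundries/TSMC figures.

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dies with zero defects under a simple Poisson yield model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# Illustrative defect densities only: an immature 14nm-class node vs a mature 28nm one.
for label, d0 in [("immature node, D0 ~ 0.4/cm^2", 0.4),
                  ("mature node,   D0 ~ 0.1/cm^2", 0.1)]:
    small = poisson_yield(100, d0)   # ~100 mm^2: phone/handheld-class SoC
    large = poisson_yield(350, d0)   # ~350 mm^2: big desktop GPU
    print(f"{label}: 100 mm^2 die yields {small:.0%}, 350 mm^2 die yields {large:.0%}")
```

With those made-up numbers, the small die still yields around two thirds of good parts on the immature node, while the big die drops to roughly a quarter, which is the economics behind leading with small chips.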
Traditionally, for desktop chips, you would want to move your production over to a new node when the cost per unit performance is lower than on your existing node (which factors in yields, dies per wafer, cost per wafer, clock increases, etc.). In this case, though, I don't believe cost per unit performance on 14nm or 16nm will be lower than on 28nm for desktops/consoles until next year, which is why we're not seeing Vega (and likely Nvidia's high-end chips) until then. However, with a likely big jump in energy efficiency on the new nodes, the cost per unit performance will be much better for laptops this year, when AMD launch their first new cards.
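And a back-of-the-envelope version of the cost-per-unit-performance comparison; every figure here (wafer costs, defect densities, die sizes, performance scaling) is a placeholder assumption rather than real foundry pricing, it's just to show how the terms interact.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order gross-die estimate: wafer area over die area, minus an edge-loss term."""
    return int(math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_unit_perf(wafer_cost, die_area_mm2, defects_per_cm2, relative_perf):
    yield_ = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)   # Poisson yield again
    good_dies = dies_per_wafer(die_area_mm2) * yield_
    return wafer_cost / (good_dies * relative_perf)

# Placeholder comparison: a cheap, mature 28nm wafer vs a pricier, lower-yield 14nm wafer,
# where the 14nm version of the chip is smaller and somewhat faster.
print("28nm:", round(cost_per_unit_perf(3000, 350, 0.10, 1.0), 2))
print("14nm:", round(cost_per_unit_perf(6000, 230, 0.40, 1.4), 2))
```

With those placeholders the old node still wins on cost per unit performance, and the comparison only flips once the new node's wafer prices and defect densities come down; that's the "next year" part. None of that stops the 14nm part winning on perf/W today, which is the laptop (and handheld) angle.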
AMD will probably launch the desktop 470/470X using Polaris 11 and the 480/480X using Polaris 10, but it's unlikely that they'll make much money off them initially. Like the Pitcairn chip used in the 370(X) and Tonga in the 380(X), though, they'll command far more money as laptop GPUs. In that sector they can, with relatively small 14nm dies, considerably outperform the existing options in terms of both maximum performance and performance/Watt, and there's scope for AMD to make a lot of high-margin sales and potentially grab a bigger chunk of the laptop market (or even lose what they have, if Nvidia beats them to market with 16nm chips).
I wouldn't be surprised if AMD have been sitting on Polaris for a long time, and I don't think it's the architecture that's the issue for Nintendo, but rather the maturity of the node. Particularly when you consider that they would have had to make this decision in late 2014, it would have been very risky for them to commit to 14nm at that stage purely on the basis of predicted yields. They probably would have been particularly keen to get a successor to Wii U out in a timely fashion, and I doubt they would have had much tolerance for risk in that regard.
All of the above, incidentally, feeds into why I believe it would be a possibility (although still unlikely) for Nintendo to go with a 14nm Polaris-based SoC for the handheld. The handheld is more heavily dependent on perf/W, so the cost per unit performance differential will favour the new node for a handheld well before it does for a home console. It's also a very small die, which means that even on a low-yield process you'll still get a relatively good number of well-functioning dies. On the risk-tolerance point, Nintendo would probably have been fairly confident about the 3DS's longevity in late 2014, so the risk of a handheld launch being pushed back to 2017 might not have worried them so much. Then, of course, there's the fact that a higher-performing handheld (and a GCN-based one at that) would save them money in development costs over its life by allowing them to share more code and assets with the home console.
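One last bit of placeholder arithmetic on the perf/W point (the efficiency factor and power budgets below are assumptions, not leaked specs): in a handheld the power budget is fixed and small, so achievable performance scales directly with perf/W, whereas a home console can keep buying performance on the old node by spending more watts and more silicon.

```python
# Placeholder figures: relative performance per watt on each node (arbitrary units)
# and rough, assumed power budgets for a handheld SoC and a console SoC.
PERF_PER_WATT = {"28nm": 1.0, "14nm": 2.0}   # assumed ~2x efficiency jump
HANDHELD_BUDGET_W = 3.0
CONSOLE_BUDGET_W = 35.0

for node, ppw in PERF_PER_WATT.items():
    print(f"{node}: handheld cap ~{ppw * HANDHELD_BUDGET_W:.1f} units, "
          f"console cap ~{ppw * CONSOLE_BUDGET_W:.1f} units")
# The console can also close the gap by raising its power budget or using a bigger die;
# the handheld can't, which is why the newer node becomes attractive there first.
```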