Targeting 4K on the home console is an appealing way to solve the problem of developing cross-device games that run on both handheld and home console, while still putting out a home console that can compete with Sony and Microsoft. Doing so doesn't necessarily mean that they have to be "portable games on a console", though (which is an assumption I see come up a lot about cross-play games on NX, regardless of resolutions). A game which is developed side-by-side for a 540p (let's say) handheld and a 4K console should be able to utilise high poly-count assets with high-res textures on the home console, and simply scale those assets down for the handheld. Rendering handheld games at high resolution (like your 3D Land shot) tends to highlight the low asset quality, but this shouldn't be an issue for a game developed with both targets in mind.
You will have to get by with relatively simple pixel shaders (or more accurately less computationally expensive pixel shaders) due to the sheer number of pixels you're pushing to the screen. The effect of this will vary a bit depending on the aesthetic of the game. For a game that strives towards realistic graphics, pushing half a billion pixels a second to achieve 4K/60fps is going to seriously hinder the ability to compete with the likes of Uncharted and Assassin's Creed, but for Nintendo games it might not be that harmful.
Let's consider a situation where Nintendo releases a 540p handheld with a 128 Gflop AMD GPU, which we'll call NX Portable, and a home console with a 2048 Gflop AMD GPU we'll call NX Home. We'll forget about RAM and so forth for the moment. Now, let's say that Nintendo primarily groups the games being made into two categories: games which are developed to run on both devices (e.g. Mario, Animal Crossing, Splatoon, Mario Kart, etc.) and games which are developed to run just on the home console (e.g. Xenoblade Y, Metroid Prime 4: The Reprimening, most western 3rd party games, etc.). For the first group, there's a neat 16x relationship between the GPU ALU performance of the two machines, so assuming the other aspects of NX Home's design can handle it, they can run at 540p on NX Portable and 4K on NX Home. The game logic will have to be limited to what can run on the handheld CPU (as that's not going to scale with resolution), but for these kinds of games it shouldn't be a big issue, and
going by Blu's benchmarks, even a lowly quad-core 800MHz A53 cluster would still outperform Wii U's CPU by about 35%. For the second group of titles, they can run at 1080p on NX Home and take quite a bit more time on each pixel, as well as make full use of the home console's CPU.
Where does that put the first group of games in terms of graphics on the NX Home? They shouldn't be inherently limited compared to the second group when it comes to asset quality (although they may have to use simpler texture filtering), but obviously they will have to use simpler lighting and shading techniques to run through those pixels quickly. Assuming Nintendo wants to keep putting out games which run at 60fps, they need to push 497,664,000 pixels through their shaders per second to do so at 4K, and with a 2048 Gflops GPU this gives them 2,058 operations per pixel to work with (taking FMA as one op, and of course this isn't going to be purely dedicated to pixel shaders, but I just want to keep the comparisons simple here). NX Portable, pushing 31,104,000 pixels per second through a 128 Gflops GPU, will also have 2,058 ops/pixel to work its magic with, so should be able to use precisely the same shader code as its big brother.
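For anyone who wants to check my arithmetic, here's the calculation as a quick Python sketch. The "NX Portable"/"NX Home" specs are the hypothetical figures from this post, not real hardware:

```python
# Hypothetical ops-per-pixel budget for the made-up NX specs above.
# An FMA is counted as a single op, hence the divide-by-two on flops.

def ops_per_pixel(gflops, width, height, fps):
    """Shader ops available per pixel per frame (FMA = 1 op)."""
    pixels_per_second = width * height * fps
    return gflops * 1e9 / 2 / pixels_per_second

nx_portable = ops_per_pixel(128, 960, 540, 60)    # 540p handheld
nx_home = ops_per_pixel(2048, 3840, 2160, 60)     # 4K console

print(round(nx_portable))  # -> 2058
print(round(nx_home))      # -> 2058
```

The 16x gap in Gflops exactly cancels the 16x gap in pixel count, which is why both devices end up with the same per-pixel budget.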
To put this in perspective let's first look at Wii U. The R700-era GPU used in the Wii U is about 6 years older than the GCN 1.2-era GPU that we'll assume is used in NX (or 8 years older than a Polaris-era GPU, however unlikely that may be). So, a Wii U flop is definitely not equal to an NX flop. The newer GPU will have a more advanced shader ISA allowing it to do more useful work in fewer cycles, and components like the thread scheduler will have been substantially upgraded to allow developers to get as close as possible to fully utilising the computational power available. That all being said, if we want to say how these games will look in comparison to Wii U games, it's helpful to have some kind of numerical data to work with, even if we can't make a like-for-like evaluation.
With 176 Gflops of GPU grunt, Nintendo has been able to dedicate the following amount of processing power to each pixel in various Wii U games:
1080p/60fps: 708 ops/pixel (Smash Bros)
720p+480p/60fps: 1,102 ops/pixel (El Capitan Todd, Nintendoland)
720p/60fps: 1,592 ops/pixel (3D World, Mario Kart 8, Tropical Freeze, Splatoon, etc.)
720p/30fps: 3,183 ops/pixel (Xenoblade, Zelda?)
And to compare it to a handful of PS4 and XBO games (where you should be able to make a much more like-for-like comparison):
Forza Motorsport 6 (1080p/60fps, XBO): 5,265 ops/pixel
Killzone: Shadow Fall (1080p/60fps, PS4): 7,406 ops/pixel
SW: Battlefront (900p/60fps, PS4): 10,666 ops/pixel
SW: Battlefront (720p/60fps, XBO): 11,846 ops/pixel
Uncharted 4 (1080p/30fps, PS4): 14,813 ops/pixel
Assassin's Creed: Syndicate (900p/30fps, XBO): 15,162 ops/pixel
Assassin's Creed: Syndicate (900p/30fps, PS4): 21,331 ops/pixel
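All of these figures fall out of the same ops-per-pixel formula. As a sanity check, here's a quick Python snippet; the Gflops numbers are my assumption of the commonly cited figures (Wii U ~176, XBO ~1,310, PS4 ~1,843), so expect tiny rounding differences against the tables:

```python
# Sanity-checking the ops/pixel tables above.
# Assumed GPU throughput: Wii U ~176 Gflops, XBO ~1,310 Gflops, PS4 ~1,843 Gflops.

def ops_per_pixel(gflops, width, height, fps):
    # FMA counted as a single op, hence the divide-by-two.
    return gflops * 1e9 / 2 / (width * height * fps)

print(round(ops_per_pixel(176, 1280, 720, 60)))    # Wii U 720p/60 -> ~1,591
print(round(ops_per_pixel(1310, 1920, 1080, 60)))  # Forza 6 -> ~5,265
print(round(ops_per_pixel(1843, 1600, 900, 60)))   # Battlefront PS4 -> ~10,666
```

These match the tables to within a flop or two of rounding, which is close enough for back-of-envelope purposes.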
So, you're obviously not going to be getting games which look like Uncharted or Assassin's Creed at 4K/60fps (duh!), but there's a big range there, and at 4K/30fps you'd be pretty close to pulling off the kind of shading you get in Forza 6.
It is worth considering, though, how valuable this kind of per-pixel computational grunt is to games like Mario and Animal Crossing versus the huge leap of image quality that you get when moving to 4K. As an example, Digital Foundry claims that Captain Toad "feels almost pre-rendered at points", despite it releasing in an era of games which have almost 20 times the computational resources to dedicate to each pixel. This isn't to say that Nintendo couldn't improve on this with more sophisticated pixel shaders, but most Nintendo games don't need, say, physically based material shaders to achieve the Pixar-like look they're going for, while for a game like Battlefront these more computationally intensive shaders are vital to achieving such realistic graphics. At the same time, the colourful, high-contrast worlds that many Nintendo games inhabit can really benefit from the crispness of a 4K image.
Take a look at Captain Toad:
Now, take a game which looks that good, give it a generational leap in polygon count and a generational leap in texture detail. Then give it whatever improvements in shading Nintendo can gain from 6 years of GPU architecture advances and around twice the raw computational grunt per pixel. Then, render it at 4K resolution at a super-smooth 60fps. I don't know about anyone else, but I'd be extremely happy if those are the kinds of games NX Home would be able to produce, and I certainly wouldn't consider them "low-spec games". Of course, games like Xenoblade would look over-stretched attempting to run at such resolutions, but at 1080p/30fps they would have the resources available to compete with the likes of Uncharted 4 when necessary.
As I said above, the idea of a 540p handheld/4K console is appealing, but I skipped by quite a bit when I said "assuming the other aspects of NX Home's design can handle it". Running games at 4K isn't simply a matter of being able to run pixel shaders for half a billion pixels a second; many other potential bottlenecks need to be removed, the chief of which is RAM bandwidth. RAM bandwidth usage is one of those things that scales pretty much linearly with resolution; if your buffer is four times the size, you'll need to use four times the bandwidth to access it, all other things being equal. For a 4K framebuffer that you have to evict every 16ms, that bandwidth becomes pretty crazy, and when you add in the intermediate buffers used in deferred rendering, which Nintendo now seems to favour, the bandwidth required becomes downright insane, even if the operations you're performing on all those buffers are very simple.
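To put very rough numbers on that scaling, here's a sketch. The bytes-per-pixel and buffer counts are illustrative assumptions on my part (a 32-bit colour buffer, and a hypothetical G-buffer of four 32-bit targets plus depth), and this deliberately ignores texture reads, overdraw and framebuffer compression, all of which push the real figure much higher:

```python
# Rough, hypothetical framebuffer bandwidth at 4K/60fps.
# Buffer formats and counts are assumptions, not measurements of any real engine.

def framebuffer_gbps(width, height, fps, bytes_per_pixel, accesses):
    """GB/s of traffic for one render target, touched `accesses` times per frame."""
    return width * height * fps * bytes_per_pixel * accesses / 1e9

# A single 32-bit 4K colour buffer, written once and read out once per frame:
simple = framebuffer_gbps(3840, 2160, 60, 4, 2)

# A deferred G-buffer: assume four 32-bit targets plus a 32-bit depth buffer,
# each written in the geometry pass and read back in the lighting pass:
deferred = framebuffer_gbps(3840, 2160, 60, 4 * 5, 2)

print(f"{simple:.1f} GB/s colour-only, {deferred:.1f} GB/s deferred")
```

Even this bare minimum of framebuffer traffic is roughly 20 GB/s before a single texture is sampled, which is why the memory subsystem, not the ALUs, is the real sticking point for 4K.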
You'd probably be looking at a case where Nintendo would need to use 4GB+ of HBM or 8GB+ of GDDR5X on an extremely wide bus to be able to accommodate games running at 4K/60fps. And that's where the problem with the plan comes in. I can see Nintendo releasing a ~2 Tflop NX Home. It's above my personal expectations, but within the realm of possibility. I could also see Nintendo going with maybe 1-2GB of HBM1 as a replacement for eDRAM in a less powerful console, as a more outside chance. The chances of Nintendo releasing a 2+ Tflop console with either 4GB+ of HBM or 8GB of top-of-the-line GDDR5X on a 256+ bit bus stretch somewhat beyond the bounds of credibility for me, though. I'd love it to be the case, and it would be a nice way for Nintendo to have their cake and eat it when it comes to cross-play and competing with MS and Sony, but I just don't see it happening.