If the SCD is meant to provide extra graphics processing, akin to an external GPU, there's nothing in the design of the Switch that suggests it has a PCIe port or NVLink equivalent for connecting an external GPU.
I don't even see how a new dock would work when the Switch has only one USB-C port, which already has to handle video output to the TV over a USB-C to HDMI connection.
Not that I think such an SCD will happen, but recall that the SCD patent did mention wireless connections between the console and the SCD for assisting with graphics processing. So it's possible, I guess, but again, very unlikely.
Two people have argued the Switch can be better than the Xbox One because the XB1 can't do FP16 calculations, only FP32. Even then, they both said it would be a game-by-game thing. I don't agree, but that's your FYI.
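For anyone wondering what "doing FP16 calcs" actually buys you: on Maxwell-class chips like the Tegra X1, two FP16 values can be packed into one 32-bit register and processed by a single instruction, which is where the "double the FLOPS" argument comes from. A minimal CUDA sketch of that packed path (purely illustrative example code, not anything from an actual Switch devkit; needs a recent toolkit and -arch=sm_53 or newer to compile):

```
#include <cuda_fp16.h>
#include <cstdio>

// Each thread works on two FP16 values packed into one 32-bit register, so a
// single instruction does two half-precision operations -- the "double-rate"
// FP16 path that Maxwell-class chips like the Tegra X1 expose (sm_53+).
__global__ void saxpy_fp16x2(int n, __half2 a, const __half2 *x, __half2 *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = __hfma2(a, x[i], y[i]);  // 2 multiplies + 2 adds per instruction
}

int main()
{
    const int n = 1 << 20;                     // 1M half2 elements = 2M FP16 values
    __half2 *x, *y;
    cudaMallocManaged(&x, n * sizeof(__half2));
    cudaMallocManaged(&y, n * sizeof(__half2));
    for (int i = 0; i < n; ++i) {
        x[i] = __float2half2_rn(1.0f);         // same value packed into both halves
        y[i] = __float2half2_rn(2.0f);
    }
    __half2 a = __float2half2_rn(0.5f);
    saxpy_fp16x2<<<(n + 255) / 256, 256>>>(n, a, x, y);
    cudaDeviceSynchronize();
    printf("y[0] = %f\n", __low2float(y[0]));  // expect 2.5
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

The catch, of course, is that only the parts of a shader that can tolerate half precision get that doubling, which is why it's a game-by-game (really a shader-by-shader) thing.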
Ha, I think I've been one of those people. I do think the Switch can outperform the XB1 in certain heavily optimized games, but overall it will be the weaker machine.
Did these specs get any further corroboration?
No, they're essentially a guess at what's in the devkit, based only on Eurogamer's reporting that the devkit uses a Tegra X1. In other words, the specs in the OP are for a standard Tegra X1, which Nvidia has 100% confirmed is not the final hardware in the Switch. Some insiders have said it's fairly similar to what we should expect, though.
So... it's true that the Switch should be able to handle Breath of the Wild better than the Wii U, right? ...Right?
Yes, definitely (assuming the TX1 rumors as a baseline are true). The Wii U is very, very outdated at this point. It's a miracle they have a game like that running on it to be honest.
EDIT:
It really can't. Maybe in these particular circumstances (heavily optimized UE4 and in-house games) it can get closer than the paper specs suggest, but performing better? No way.
If we wind up with the best-case-scenario SoC (768 GFLOPS docked) and a CPU better than what's in the OP, then heavy use of FP16 in a UE4 game, for instance (say a third of the code in FP16), should get real-world performance at least close to XB1 levels. Depending on how much better Nvidia's dev software works (i.e., whether "Nvidia flops > AMD flops" holds true for consoles, which is unknown as of now), that might push it over XB1 levels, again for a hypothetical, highly optimized game, especially if that game leans heavily on the CPU.
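To put rough numbers on that, here's the napkin math. Every input is an assumption from this post (the 768 GFLOPS docked figure, FP16 running at double rate, and a third of the GPU time going to FP16 work), not a confirmed spec:

```
#include <cstdio>

// Napkin math for the scenario above. Every input is an assumption from the
// post, not a confirmed Switch spec.
int main()
{
    const double fp32_gflops = 768.0;              // assumed docked FP32 rate
    const double fp16_gflops = 2.0 * fp32_gflops;  // double-rate FP16 (Maxwell-class)
    const double fp16_share  = 1.0 / 3.0;          // assumed share of GPU time on FP16 work

    // Weighted average of the two rates over the frame time.
    const double effective = (1.0 - fp16_share) * fp32_gflops + fp16_share * fp16_gflops;
    printf("FP32-equivalent throughput: %.0f GFLOPS\n", effective);  // ~1024 GFLOPS

    // The Xbox One GPU is roughly 1310 GFLOPS on paper, so this lands "close to
    // XB1 levels" rather than clearly past it -- which is why the rest depends
    // on tooling and how CPU-bound the game is.
    return 0;
}
```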
But that's all very hypothetical, and in general the Switch shouldn't reach XB1 levels for most (if not all) games.