I think he's serious, and he's technically not incorrect. No one at Nintendo's R&D, nor anyone actually holding this information, would care about whatever the hell we're going in circles about. The info she has is obviously going to be second-hand, just as 10k's info is second-hand.
The bigger question is who the developer is that fed 10k this information (and who is feeding Emily her information, is it another developer?). As we know, it was a dev who told him about the Polaris-based operations within the API/SDK, but that source never actually claimed Polaris silicon or 14nm FinFET. As has been noted, that information does not actually confirm, or even imply, a 14nm/Polaris GPU; that was a conclusion we jumped on in this thread and took to rather absurd heights.
Ruling out 14nm (which, I suppose, hasn't been ruled out in and of itself, but common sense would generally err on the side of ruling out tech that is barely in production) leads to a very bizarre situation with the GPU. In many ways, it leads to a conclusion that is effectively nonsensical unless AMD and Nintendo came up with a very bizarre but effective (in both cost and performance) amalgamation of feature sets between GCN 1.2 and GCN 1.3. From a basic point of view, that would be MORE expensive than just making a 14nm Polaris GPU.
Of course, the way Emily worded her response could also be construed as referring to the Polaris 10/11 rumors that had been circulating, since those are actual GPUs that one could get wrong... and are the only actual GPUs mentioned. Polaris in and of itself isn't a GPU, it's an architecture. Both rumors could end up being true: we could end up with 14nm and a very custom GPU that belongs to the Polaris family but sits considerably below Polaris 10/11 in performance and, as a consequence, price.
However, as the CPU rumor/leak hasn't been debunked and the GamerGirl information also remains unchallenged by Emily and others, one can still rather un-crazily conclude that we'll be looking at a CPU advantage over the PS4 of roughly 30-40% (or the equivalent smaller gain over the X1's CPU), and the GPU would need at least a 10-20% gain over the PS4's (not the X1's) to avoid introducing rather catastrophic bottlenecking in the opposite direction of what the twins are currently suffering from. Likewise, the RAM needs the throughput to make this feasible. All of this comes back around to Emily's notion of "specs are good".
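Just to put rough numbers on that, purely as a back-of-envelope illustration: this little sketch applies the rumored percentages to the publicly known PS4/X1 launch figures (8 Jaguar cores at 1.6 GHz, ~1.84 TFLOPS GPU on PS4, ~1.31 TFLOPS on X1). The linear scaling and the assumption of comparable cores/IPC are mine, not anything from the leaks.

```python
# Back-of-envelope only: assumes linear scaling and comparable core counts/IPC,
# which real hardware obviously doesn't guarantee.

PS4_CPU_CLOCK_GHZ = 1.6    # 8-core Jaguar, publicly known launch spec
PS4_GPU_TFLOPS = 1.84      # 18 CU GCN, publicly known
XB1_GPU_TFLOPS = 1.31      # 12 CU GCN, publicly known, for reference

def scaled_range(base, low_pct, high_pct):
    """Apply a rumored percentage gain and return the (low, high) result."""
    return base * (1 + low_pct / 100), base * (1 + high_pct / 100)

cpu_low, cpu_high = scaled_range(PS4_CPU_CLOCK_GHZ, 30, 40)   # rumored 30-40% CPU gain
gpu_low, gpu_high = scaled_range(PS4_GPU_TFLOPS, 10, 20)      # rumored 10-20% GPU gain

print(f"Implied CPU (same cores/IPC assumed): {cpu_low:.2f}-{cpu_high:.2f} GHz equivalent")
print(f"Implied GPU throughput: {gpu_low:.2f}-{gpu_high:.2f} TFLOPS")
print(f"...versus the X1's ~{XB1_GPU_TFLOPS} TFLOPS for comparison")
```

So the rumored gains, taken at face value, land around a ~2.1-2.2 GHz-equivalent CPU and ~2.0-2.2 TFLOPS of GPU throughput, which is the ballpark where the "specs are good" comment stops sounding crazy.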