To be precise, every downclock has a negative impact, even if it leads to just a 1% performance loss.
You can obviously argue that this "worst case" of the Xbox Series X only having an 18% TF advantage is the usual case, but it's simply not guaranteed that the PS5 GPU is running at 2.23 GHz all the time.
Since we don't have clock numbers, we can't tell whether current games already dip a bit below the 2.23 GHz mark here and there, and if not, how many next-gen games will push the clocks down.
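For reference, the 18% figure follows from the publicly stated specs (those numbers aren't in this post, so treat them as my assumption: PS5 with 36 CUs at up to 2.23 GHz, Series X with 52 CUs at a fixed 1.825 GHz) and the usual back-of-the-envelope FP32 throughput formula, sketched here in Python:

```python
# Rough sketch, not official math: peak FP32 TFLOPS = CUs * 64 shaders * 2 ops/clock * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # ~10.28 TF at the full boost clock
xsx = tflops(52, 1.825)   # ~12.15 TF at a fixed clock

print(f"PS5 {ps5:.2f} TF vs. Series X {xsx:.2f} TF")
print(f"Series X TF advantage: {(xsx / ps5 - 1) * 100:.0f}%")  # -> ~18%
```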
Mark Cerny stated that they expect the GPU to run at or "close" to that frequency most of the time, and that when downclocking does occur they expect it to be pretty "minor".
Another statement was that reducing power by 10% only takes a couple of percent lower clock speed.
However, all of that is of course not very precise and is based on "expectations", even if they come from Sony.
It's not like companies are right all the time or never inject a bit of overly optimistic marketing.
Now, based on those claims, I wouldn't expect major downclocking to occur, but next-gen games, which also stress the CPU, could cross the power threshold and consistently lower the clocks.
On average, the PS5 might run at 2.15 GHz in one game and 2.07 GHz in another.
Maybe the TF advantage grows from 18% to somewhere in the low-to-mid twenties; would that be a major difference?
Obviously not, but it ties back to my initial statement that 18% is the worst case: the gap can't be smaller than that, but it can be bigger, without claiming that it could be much bigger.
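Plugging those hypothetical average clocks into the same back-of-the-envelope formula (again just a sketch; the 2.15 and 2.07 GHz values are made up for illustration, and the Series X side stays at its fixed clock):

```python
xsx = 52 * 64 * 2 * 1.825 / 1000          # ~12.15 TF, fixed clock
for clock_ghz in (2.23, 2.15, 2.07):      # full boost vs. hypothetical averages
    ps5 = 36 * 64 * 2 * clock_ghz / 1000
    print(f"PS5 @ {clock_ghz} GHz: {ps5:.2f} TF, "
          f"Series X advantage: {(xsx / ps5 - 1) * 100:.0f}%")
# 2.23 GHz -> ~18%, 2.15 GHz -> ~23%, 2.07 GHz -> ~27%
```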
Just sharing my perspective here.