I think this new video might prove they were wrong. It's the same situation: higher res (though the gap is smaller this time) and worse performance.
Unity was taxing on the dog poo CPUs in the last gen consoles, though they also admitted they gimped the PS4 version (probably the res).
This team (Red) didn't fix the PS4 version of Witcher 3 for a looooong time.
Yes, but the point is that, deliberate decision or not, at the end of the day a game's performance can vary, especially when different development tools are involved.
If they used the same API, the most capable hardware would consistently come out as the best version, whether by a little or a lot. There are games that, for example, perform much better on Nvidia or AMD; there are even cases where, at equivalent settings, one vendor is 40% faster than the other (AMD or Nvidia), and then in other games the difference is reversed.
But PS5 and Series X do not use the same API. It's not like comparing a 6700 XT to a 6800 XT (the gap between PS5 and XSX isn't that big, but you get the point), where both share the same API and the more powerful card simply wins.
If a platform receives more optimization, it will come out ahead, at least at launch, because it's not a question of power; it's a question of being further along in the optimization process. We saw how The Callisto Protocol improved massively on XSX with each update. Before the day-one patch, it ran at around 20fps in quality mode while the PS5 was already at a stable 30fps. Then on XSX there was no ray tracing at all; it was completely broken, on top of having worse performance. Then ray tracing was added, but at lower quality than on PS5. Finally, today, the RT is practically the same. DF said it themselves: on PS5 it was already good almost from the beginning.
With this I don't mean that debating hardware isn't interesting. I simply believe that when you have two systems this similar to each other, but with different development tools, plus the enormous difference in sales and the handicap of Series S, what matters most is the software, and the hardware takes a backseat. I would firmly bet that the differences we see in 99.9% of games come down to the optimization process, not hardware limitations. For example, we often see games that stutter more on XSX than on PS5, and the discussion here turns to possible differences in SSD speed or the I/O system, yet many of those cases get resolved with patches. Then the same people who argued PS5 didn't stutter because it was better designed disappear, or say nothing about it, once the problem is fixed on XSX.
There may be some hardware differences that account for some things, but I personally think that in most cases, it's the software.
Additionally, CD Projekt seems to me to have very little consistency in the performance of its games; they vary a lot. The example you give is a good one: The Witcher 3 ran at 1080p on PS4 and 900p on One, but at launch it ran better on the One. Partly because it wasn't locked to 30fps, partly because it genuinely performed 2-3fps better; then, after some of the patches, it consistently ran better on PS4.
Something very similar happened to the same game on PS5/XSX. One update improved performance, then another worsened it... after one of those updates, both XSX and PS5 ran much worse than at launch, especially XSX. Then, after months, performance was massively improved on both systems, reaching a near-lock at 60fps in performance mode. The only consistent thing was the 30fps mode, which got better with updates and never got worse.
Also, on XSX there was a bug that applied the XSS graphics settings, and curiously, performance did not change compared to when it used the same settings as PS5.
Cyberpunk is a 4-year-old game, and honestly, it doesn't seem like a reliable benchmark to me today.