I don't have the source images, but the ones he posted, cropped to just 500-600 pixels wide and without identifying labels, do look virtually the same even to a trained eye. The underlying settings might be identical up to a point, but because the performance mode gets less time to make inferences from the PS5-specific motion vectors, saying they have the same settings is false.
You are saying that the sharpening filter has no performance cost, and the reason is that it isn't a sharpening filter applied on top of the game, but a sharpening bias value fed to the FSR algorithm: the algorithm reconstructs with different displacement biases, so whether it's set to minimum or maximum, the performance cost is already baked in. That's unlike a real sharpening filter, such as a Gaussian blur filter that uses a kernel to multi-sample and average at sub-pixel displacements, whose performance cost increases proportionally to the square of the kernel width/height.
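To make that distinction concrete, here's a minimal sketch in Python (illustrative only, not the game's or FSR's actual code; all function names are my own): a kernel-based blur/sharpen pass pays per-pixel cost proportional to the kernel area, while a sharpness bias is just a scalar in the final blend, so moving it between minimum and maximum adds no extra samples.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Build a normalized size x size Gaussian kernel."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def convolve(image, kernel):
    """Naive 2D convolution on a float image: every output pixel reads
    size*size taps, so cost grows with the square of the kernel width."""
    h, w = image.shape
    ks = kernel.shape[0]
    pad = ks // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + ks, x:x + ks] * kernel)
    return out

def sharpen_bias(image, blurred, sharpness):
    """Unsharp-mask style blend: 'sharpness' only scales the high-pass term,
    so changing it from 0.0 to 1.0 costs nothing extra per pixel."""
    return image + sharpness * (image - blurred)
```

The expensive part is `convolve` (quadratic in kernel width); `sharpen_bias` is the "baked-in" part where the slider value is free to change.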
And that's how we can tell that the cinematic image quality difference between the PS5 balanced mode and the RX 6700 is more than just an FSR sharpening bias value. The loss of quality in the PC version of the cinematic at 6:03 looks as if the internal resolution on PC for Unreal Engine 5, before UE5 scales it to 1080p and generates motion vectors for FSR reconstruction, is lower than 1080p, or as if the PC settings are lowering depth precision or cascades. That would make the objects beyond the foreground look like they're smeared in Vaseline (quincunx AA), make the fog equation crush from its start distance to its end distance very quickly so that it obscures all the Vaselined models almost immediately on PC, and make the fog pixels clump, as would happen at a lower source resolution. And it would be a problem FSR or DLSS couldn't repair, because the obscured geometry effectively deletes the data there is to infer from.
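Here's a hedged sketch of the fog point (a simple linear fog model for illustration, not UE5's actual fog code; names and values are my own): with a narrow start-to-end range, the fog factor saturates quickly, and once a pixel is fully fogged there's no surface detail left behind it for FSR or DLSS to reconstruct.

```python
def linear_fog_factor(depth, fog_start, fog_end):
    """Returns 0.0 (no fog) to 1.0 (fully fogged). A narrow
    [fog_start, fog_end] window pushes everything past fog_end
    entirely to the fog colour."""
    t = (depth - fog_start) / max(fog_end - fog_start, 1e-6)
    return min(max(t, 0.0), 1.0)

def apply_fog(color, fog_color, depth, fog_start, fog_end):
    f = linear_fog_factor(depth, fog_start, fog_end)
    # Once f hits 1.0 the original colour contributes nothing, so any
    # geometry behind that point is data the upscaler can never recover.
    return tuple((1.0 - f) * c + f * fc for c, fc in zip(color, fog_color))
```

With fog_start=10 and fog_end=15, anything at depth 15 or beyond comes out as pure fog colour, which is the "crush" described above.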