To answer the OP's question: currently, NO, 60fps is not killing overall graphical fidelity. First of all, with today's hardware the difference between 30 and 60fps is mostly resolution scaling, plus maybe some tweaking of a few features here and there that has minimal impact on the overall image. Shooting for 60fps doesn't change the overall ambition and design of the game or its assets. We can see that especially with the current gen (PS5/XBS), as many games ship multiple modes with 30 and 60fps output. Ratchet and Clank doesn't look fundamentally different in its 30 vs 60fps modes with RT enabled, just a slightly lower input resolution to the temporal injection scaler and some nips and tucks to the fidelity that are minimal and unnoticeable without side-by-side comparisons. The same can be said about games like Demon's Souls, ACV, and many others.
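To put some rough numbers on that, here's a back-of-envelope sketch (my own simplification, assuming GPU cost scales roughly linearly with shaded pixels and ignoring CPU and fixed per-frame costs):

```python
# Back-of-envelope: how much internal resolution you'd trade to go 30 -> 60fps,
# assuming GPU frame cost scales roughly linearly with shaded pixels (a simplification).

def frame_budget_ms(fps):
    return 1000.0 / fps

def scaled_resolution(width, height, budget_ratio):
    """Shrink both axes equally so the pixel count matches the budget ratio."""
    scale = budget_ratio ** 0.5
    return round(width * scale), round(height * scale)

budget_30 = frame_budget_ms(30)   # ~33.3 ms per frame
budget_60 = frame_budget_ms(60)   # ~16.7 ms per frame

# Going from 30 to 60fps halves the budget, so roughly half the pixels:
w, h = scaled_resolution(3840, 2160, budget_60 / budget_30)
print(f"30fps budget: {budget_30:.1f} ms, 60fps budget: {budget_60:.1f} ms")
print(f"4K at 30fps -> roughly {w}x{h} at 60fps under this naive model")
```

The real numbers obviously depend on CPU load and per-frame fixed costs, but it shows why the visible trade ends up being mostly resolution rather than assets or design.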
Obviously, 60fps games have been around for over 30 years and throughout the "3D" gaming era (i.e. since PS1). That never prevented new generations of games from looking like a huge leap over previous generations (GT1/2 -> GT3 on PS2, MGS1 -> MGS2 on PS2, Tekken 3 -> Tekken Tag on PS2, Call of Duty on Xbox -> Call of Duty 2 on X360, and the list goes on). Your example of GT7 not looking as good as Driveclub is A.) a bit premature, since we haven't seen GT7 up close yet, and B.) somewhat misguided, since parts of GT Sport already looked better than Driveclub on PS4. Hard to imagine they would take a huge step back on new hardware. Halo Infinite is a case of changed developer goals and a redesign of certain aspects of the game. If they targeted 30fps, it wouldn't look totally different, just perhaps run at a more stable framerate and a slightly higher resolution.
Of course, a lot of the 30 vs 60fps load discussion becomes moot with the wider adoption of temporal reconstruction tech (FSR, TSR and others on console), which lets a game drop its internal resolution far enough to go from a 30fps target to 60fps with minimal to no impact on the final image or the underlying game design.
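A quick illustration of the pixel math (the 1440p internal / 4K output figures are just example numbers, not any specific game's settings):

```python
# Rough illustration of why temporal reconstruction changes the 30 vs 60fps trade:
# per-frame shading cost is driven by the *internal* resolution, not the output.

def pixels(w, h):
    return w * h

output = pixels(3840, 2160)       # 4K presented image
internal = pixels(2560, 1440)     # example internal resolution fed to FSR/TSR

per_frame_fraction = internal / output
print(f"Each reconstructed frame shades ~{per_frame_fraction:.0%} of the output pixels")

# Two reconstructed 60fps frames vs one native-4K 30fps frame (pixel work only):
print(f"Two 60fps frames ~ {2 * per_frame_fraction:.2f}x the pixel work of one native 30fps frame")
```

Under that naive model, two reconstructed 60fps frames add up to less pixel work than a single native-4K 30fps frame, which is why the image holds up so well in the performance modes.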