I think ray tracing is an interesting feature that could help make games look better while reducing the workload for visual artists, but I think it's being pushed way too hard by hardware producers as the next big thing, while the actual tech simply isn't there yet to support it efficiently. I would much rather see this processing power used for things that can actually look impressive, like more advanced physics or high-resolution, high-framerate performance.
This is pretty much spot on.
RT pays off BIG once we hit overwhelmingly obvious diminishing returns on the multiple, increasingly complex precomputed lighting hacks used to approximate the same effects as RT. At that point, fully ray-traced lighting becomes a no-brainer, since the performance cost of RT and that of the pile of precomputed lighting approximations approach equality.
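To put some numbers on that crossover argument, here's a back-of-envelope sketch. Every figure below is made up purely for illustration (real per-pass costs vary wildly by engine, scene, and hardware); the point is just the shape of the argument, i.e. comparing the sum of the hacks against one unified RT pass.

```python
# Hypothetical per-frame costs in milliseconds for a typical stack of
# precomputed / screen-space lighting hacks. These numbers are invented
# for illustration, not measured from any real engine.
hack_costs_ms = {
    "baked lightmaps (runtime blend)": 0.4,
    "screen-space ambient occlusion": 0.9,
    "screen-space reflections": 1.1,
    "shadow map cascades": 1.6,
    "light probes / GI volumes": 0.7,
}

# Hypothetical cost of a single unified ray-traced lighting pass that
# replaces the whole stack above.
rt_unified_ms = 6.0

total_hacks_ms = sum(hack_costs_ms.values())
print(f"sum of hacks: {total_hacks_ms:.1f} ms vs unified RT: {rt_unified_ms:.1f} ms")

# Today the hacks win. But each hack tends to grow more complex over time
# as quality expectations rise; once the sum creeps toward the RT figure,
# swapping the whole pile for one ray-traced pass becomes the obvious move.
```

Again, just a sketch: the real calculus also includes artist time saved by not baking/tuning each hack, which is part of why RT becomes a no-brainer past the crossover even before raw frame-time parity.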
That's not where we are now, though. The baseline hardware target that directs the majority of games development, i.e. consoles, simply isn't powerful enough to support fully ray-traced lighting, and the various precomputed lighting approximations built to serve very specific VFX features, e.g. ambient occlusion, are currently good enough, and so much cheaper than a ray-traced alternative, that using RT for those applications isn't worth it---for the most part.
There are still edge cases today, like reflections, where RT with denoising at a reasonable sample rate is only marginally more performance-intensive than traditional methods like cube maps, and/or gives significantly better results.
For many games, however, the trade-offs required to enable limited RT effects in games on current-gen consoles are just too steep to justify the cost.