The whole point of buying the best of the best is to run stuff without compromise.
...and that applies to all consumer products.
they can technically up the shadow resolution by 36x, buckle the 4090 down to 5 fps, and call it an "ultra" setting.
who determines the compromise line?
ac valhalla has an ugly draw distance despite everything being set to "max/ultra". but sure, you can then simply say you get 4k ultra 60 fps on a 2080 ti (nearly). so now you think you didn't compromise on quality?
there's no universal way of labeling something low, med, high or ultra.
hogwarts legacy had 4 texture options at launch:
low: 3000 MB of budget
medium: 3500 MB of budget
high: 4100 MB of budget
ultra: 5000 MB of budget
then they patched it and made the old low the new high. now it's:
low: 1200 MB
medium: 1800 MB
high: 3000 MB
ultra: 5000 MB
for another user, say one with a 16 GB 4080, that 5 GB ultra texture budget is still a compromise. they could run a 10 GB texture cache and get extremely detailed textures even at extreme view distances. but that option simply isn't there.
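to make the label drift concrete, here's a minimal sketch using the budgets quoted above; the dictionary names and structure are made up for illustration, not the game's actual config:

```python
# hypothetical illustration of the launch vs. patched texture pool budgets (MB)
# quoted above; names and structure are made up, not the game's real settings
LAUNCH_BUDGETS_MB = {"low": 3000, "medium": 3500, "high": 4100, "ultra": 5000}
PATCHED_BUDGETS_MB = {"low": 1200, "medium": 1800, "high": 3000, "ultra": 5000}

for preset in LAUNCH_BUDGETS_MB:
    launch, patched = LAUNCH_BUDGETS_MB[preset], PATCHED_BUDGETS_MB[preset]
    print(f"{preset}: {launch} MB at launch -> {patched} MB after the patch "
          f"({100 * patched / launch:.0f}% of the original pool)")
```

same labels, very different budgets: "high" now buys you the pool that used to be called "low".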
or imagine Game A labeling 1/8-resolution dithered ray traced reflections "epic mega max ray tracing". will you simply be satisfied that you get a high framerate with their label?
then imagine Game B calling full-resolution ray traced reflections "medium" and 2x-resolution ray traced reflections "high". now what happens in relation to the game above?
so you didn't compromise on image quality in Game A because you could run the "epic mega max ray tracing" label at high framerates, but you did with Game B?
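just to put rough numbers on how different those labels are, a back-of-the-envelope sketch at 1080p (the resolutions are the hypothetical ones above, not measured from any real game, and "2x resolution" is treated as 2x the pixel count):

```python
# rough reflection-ray counts per frame at 1080p for the hypothetical labels above
PIXELS_1080P = 1920 * 1080  # 2,073,600 pixels

labels = {
    "Game A 'epic mega max' (1/8 dithered)": PIXELS_1080P / 8,
    "Game B 'medium' (full resolution)": PIXELS_1080P,
    "Game B 'high' (2x resolution)": PIXELS_1080P * 2,
}
for name, rays in labels.items():
    print(f"{name}: ~{rays / 1e6:.2f}M reflection rays per frame")
# Game A's top label traces roughly 16x fewer rays than Game B's "high"
```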
we don't even know what ultra settings entail. for all we know, there could be one crucial setting that hampers framerates by 40% with barely any image quality difference. there's a reason most gtx 1000 and midrange 2000 users don't bitch as much as 3000 and 4000 users do. they simply set their games to sane medium/high settings and get great performance across most titles.
this even includes the last of us part 1, which runs great at a med/high mix on a 1070/2060. but once you push ultra, even a 3070 buckles at 1080p. that's how it is.
spider-man remastered's pc version literally calls an obnoxiously bad ray tracing reflection resolution "high". from their perspective, it is high. so we have to take their word for it? and they labeled a setting "very high" because it makes reflections somewhat bearable. worse still, you could get even higher reflection quality, and there are cards that can handle it, but they won't offer it. why would they? run-of-the-mill users should be happy that they play at "very high" ray tracing without a compromise and get great performance while doing it.
so it is how it is, huh?
people have to stop thinking in "preset" terms. just focus on what is on the screen. if high and ultra look 99% the same and one is 50% more performant, then I hope devs ship "epic" settings that cost 250% more performance for a 0.5% improvement in image quality, just to mess with you guys who refuse to compromise on an image quality bar you somehow believe is determined by some random dev.
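for what it's worth, the frame-time math on that trade looks something like this (purely illustrative numbers, not benchmarks of any real game):

```python
# illustrative frame-time math for the high vs. ultra trade described above
high_ms = 10.0            # assume "high" renders a frame in 10 ms -> 100 fps
ultra_ms = high_ms * 1.5  # "ultra" costs 50% more frame time -> ~67 fps
epic_ms = high_ms * 3.5   # a hypothetical "epic" costing 250% more -> ~29 fps

for name, ms in [("high", high_ms), ("ultra", ultra_ms), ("epic", epic_ms)]:
    print(f"{name}: {ms:.1f} ms per frame = {1000 / ms:.0f} fps")
# if the visual difference is ~1% or less, that's a lot of fps traded for nothing
```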