I don't really think there is an agreement (a conspiracy?) between game developers and Nvidia to not put too much effort into optimizing, so that people will need to go out and buy new GPUs and resort to brute-forcing their games to get acceptable performance.
Maybe the game is adequately optimized, but it's just too big or uses too much RTX stuff. I dunno.
BTW I'm kinda in the same boat with the Switch; I've had it since launch and had barely touched it, but for the last few months I've been buying many multiplatform games that run well enough on it, and a few exclusive titles as well.
The Switch draws 17W while playing and charging, and with the electricity prices we have now, it just makes more sense for me to play on a 17W system than on my 750W PC. I've been using my PS5 more too, because 200W vs. 750W.
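Just to put rough numbers on that, here's a minimal sketch; the 0.30/kWh rate is a placeholder (plug in your own), and the wattages are the nominal figures above, not measured averages:

```python
# Rough per-hour electricity cost at an assumed rate of 0.30 per kWh.
RATE_PER_KWH = 0.30  # placeholder price; substitute your local rate

for name, watts in [("Switch", 17), ("PS5", 200), ("PC", 750)]:
    cost_per_hour = watts / 1000 * RATE_PER_KWH
    print(f"{name}: {watts} W -> {cost_per_hour:.3f} per hour")
```

At those figures the PC costs roughly 44x as much per hour as the Switch to run.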
there really is no conspiracy, just people who cannot come to terms with the fact that they bought low-VRAM devices.
the 3080 is 1.7x-2.5x faster than a ps5, and up to 3x faster in ray tracing applications combined with DLSS, but when all is said and done, it only has a 10 GB VRAM buffer, of which only 9.2-9.3 GB can be used by games, since the Windows compositor and Steam will use around 500-700 MB by default on any given day.
in the best case scenario, the PS5 allocates around 9-10 GB of VRAM for GPU-related operations, for settings that are sane, optimized, and tweaked. some 3080 users are unable to understand/accept that pushing settings above those "console" settings now requires more VRAM. they could get away with it in crossgen games, because those games did not fully utilize the 10 GB of VRAM the PS5 can devote purely to rasterization.
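as a back-of-the-envelope sketch (a rough illustration using the figures above, not measured numbers; you can check your own idle VRAM usage with nvidia-smi):

```python
# Rough VRAM headroom on a 10 GB 3080 vs. a PS5 running "console" settings.
# Overhead and allocation figures are the rough estimates quoted above.
total_gb = 10.0
desktop_overhead_gb = 0.7                    # Windows compositor + Steam, worst case
usable_gb = total_gb - desktop_overhead_gb   # ~9.3 GB left for the game

ps5_gpu_alloc_gb = 10.0                      # best-case PS5 allocation for GPU work
headroom_gb = usable_gb - ps5_gpu_alloc_gb
print(f"usable on 3080: {usable_gb:.1f} GB")
print(f"headroom over PS5 settings: {headroom_gb:+.1f} GB")
# negative headroom: no room to push textures or RT above console level
```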
this was not an issue until recently because most games were designed with the PS4's buffer in mind. most games used around 6-7 GB of VRAM even at 4K ultra settings, since baseline textures were tailored for the roughly 4 GB GPU-bound portion (out of 5.5 GB total) of the PS4's memory. that left the 3080 room to flex its power and enable ray tracing on top.
but now hogwarts legacy is a game that fills up 9-10 GB of VRAM on PS5 without ray tracing or higher graphical settings. there is practically no free VRAM left for ray tracing on top, not unless you sacrifice texture settings.
the 3080 could be 125x faster than a ps5; it would not change the fact above.
do notice how the PS5 also has to reduce texture quality significantly in its ray tracing mode. the game and its primary textures are designed around a 10 GB VRAM buffer devoted purely to rasterization.
really, the 3080 simply should have had a 16 GB VRAM buffer so it could stretch its legs. it can't in this game, and there's nothing to be done about it. the settings are there: you can reduce texture quality to medium and enable ray tracing, like the PS5 does.
a 3080 user simply has to adhere to whatever VRAM-related limitations a PS5 adheres to. this is purely a capacity issue. people were warned about this.
it is practically a design choice: the 3080's VRAM buffer only allowed ray tracing with high quality textures when said textures were from the PS4 era.
if you want PS5-era textures, even WITHOUT ray tracing, you're bound by that VRAM buffer.
take cyberpunk for example: it fills the entirety of a 10 GB buffer at 4K with ray tracing, and that's with lastgen textures.
now imagine cyberpunk with textures 1.5x the size.
how do you fit that into a 10 GB buffer? you cannot.
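here's the toy arithmetic (the 6 GB / 4 GB split between textures and everything else is an assumption for illustration, not a measured cyberpunk figure):

```python
# Toy budget: why 1.5x-larger textures overflow the same 10 GB buffer.
buffer_gb = 10.0
textures_gb = 6.0        # assumed share of the budget spent on textures
other_gb = 4.0           # render targets, geometry, BVH, etc. (assumed)

needed_gb = textures_gb * 1.5 + other_gb
print(f"needed: {needed_gb:.1f} GB vs buffer: {buffer_gb:.1f} GB")
# -> 13.0 GB needed vs 10.0 GB available: it simply doesn't fit
```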
this is practically what hogwarts legacy is doing: it has higher quality textures than most lastgen games.
the base game already fills the entirety of that small 10 GB buffer, just like it does on PS5.