I thought you were talking about how a 30fps target would allow devs to reach new heights, when in the case of Jedi, Spider-Man 2 and Horizon 2 they were able to do that and still offer 60fps modes, no problem.
Yeah, but I also said that the cost of those 60fps modes was too high. Horizon 2 was broken in performance mode for months. Literally months. A shimmering mess. Most people played in that shitty mode. Jedi didn't get that no-RT patch until September, like 5 months after launch; people were playing a 650p game that wasn't even a locked 60fps, constantly dropping to the mid 40s. FF16 has the same drops to the mid 40s and literally caps all combat scenarios at 720p to get a locked 60fps. Spider-Man 2 is the best of them all, but even that runs at 1080p internally. And that's another game people say doesn't look next gen. I wonder why.
I have seen so many people dismiss games like Alan Wake 2, Avatar, Jedi and FF16 because they played the shitty versions. That was my reason for saying devs should stick with 30fps modes: at least that way players will see the game in its best light. Again, going back to my original post, Uncharted 4 shipped at 30fps after they had revealed the game as a 60fps game. No one cared, because we all saw that Uncharted was no longer a corridor shooter. It had wide open areas, massive setpieces that went on forever, drivable vehicles and so on. Back then we understood the cost linear games pay for going open world.
I am sure the issues in the performance mode will eventually be fixed, like they were by GG and Respawn. It will take months, though, and I don't think it's sheer incompetence. GG is anything but. Respawn too. These games are just doing a lot of new things, and a lot of the time offering a 60fps mode means retooling the entire graphics pipeline, which takes time they don't have.
Most super CPU-heavy games are like that due to incompetence, not some actual CPU-heavy work that players can see. Jedi on PC is a fucking mess and still runs like shit with RT on the fastest CPUs on the planet. Starfield has less open world than Skyrim yet it murders fast CPUs; the joke is that it's not really utilizing those CPUs in any decent way.
Honestly I was shocked when I saw that graphic from DF, because my own CPU profiling was showing 70-80% usage on each thread.
That was not the case with any other CPU-heavy game. Star Wars Jedi is exceptionally poor, with threads just sitting there in the teens. Hell, 99% of games not named Cyberpunk or Starfield top out at like 15-20%.
Starfield, like most Bethesda games, uses the CPU more than most games. Yes, it's probably not as optimized as it could be, but it's one of those rare games where I could excuse it because of that usage.
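If anyone wants to sanity-check those per-thread numbers themselves, here's a rough sketch of one way to log per-core utilization while a game is running. It assumes the psutil Python package is installed; the sample count and interval are arbitrary placeholder values, not anything from DF's testing.

```python
# Rough sketch: log per-core CPU usage while a game runs.
# Assumes the third-party psutil package (pip install psutil);
# SAMPLES and INTERVAL_SECONDS are arbitrary choices.
import psutil

SAMPLES = 60           # how many snapshots to take
INTERVAL_SECONDS = 1.0  # how long each snapshot averages over

per_core_history = []
for _ in range(SAMPLES):
    # cpu_percent(percpu=True) blocks for the interval and returns one
    # utilization percentage per logical core (i.e. per hardware thread).
    usage = psutil.cpu_percent(interval=INTERVAL_SECONDS, percpu=True)
    per_core_history.append(usage)
    print(" ".join(f"{u:5.1f}" for u in usage))

# Average utilization per core over the whole capture, so you can see
# whether a game spreads work across threads or leaves most of them idle.
core_count = len(per_core_history[0])
averages = [
    sum(sample[i] for sample in per_core_history) / len(per_core_history)
    for i in range(core_count)
]
print("avg per core:", " ".join(f"{a:5.1f}" for a in averages))
```

Run that alongside a built-in benchmark or a heavy city area and the pattern is obvious pretty quickly: either most cores are doing real work or they aren't.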
P.S. I should mention that the PS3's Cell processor was a dedicated chip with its own cooling. The PS5's CPU is part of an APU that has to share its power and thermal budget with the GPU. If devs push it as hard as they did the Cell, or as hard as they push GPUs, the GPU might start to get throttled. My guess is that's probably why we don't see much heavy multithreading: the more cores you keep busy, the hotter and more power-hungry the chip gets, and the less headroom is left for the GPU. The PS3/360 CPUs didn't have to worry about that.
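To make that trade-off concrete, here's a toy model of a fixed budget split between the CPU and GPU, where every extra busy core leaves less headroom for GPU clocks. The numbers are completely made up and this is not how Sony's actual power management works; it's just to illustrate why lighting up more threads isn't free on an APU.

```python
# Toy illustration of a shared CPU/GPU budget, NOT how the PS5 actually
# allocates power; all numbers below are invented for the example.
TOTAL_BUDGET_WATTS = 200.0    # hypothetical combined CPU+GPU budget
WATTS_PER_ACTIVE_CORE = 8.0   # hypothetical cost of one busy CPU core
GPU_MAX_WATTS = 180.0         # hypothetical GPU draw at full clocks

def gpu_clock_scale(active_cpu_cores: int) -> float:
    """Fraction of full GPU clocks left after the CPU takes its share."""
    cpu_draw = active_cpu_cores * WATTS_PER_ACTIVE_CORE
    gpu_budget = max(TOTAL_BUDGET_WATTS - cpu_draw, 0.0)
    return min(gpu_budget / GPU_MAX_WATTS, 1.0)

for cores in (2, 4, 8, 16):
    print(f"{cores:2d} busy cores -> GPU at {gpu_clock_scale(cores):.0%} of max clocks")
```

Even with made-up numbers, the shape of the curve is the point: past a certain number of busy cores, the GPU starts paying for it.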