uncreativename
(I wanted to make this question include console gamers too, but on console either a game is optimised or not, you can't go out and buy better hardware, so I don't think this applies to consoles.)
I remember back in the day, I'd buy games I could barely run, partly because I just wanted to play them, even if I had to turn the settings all the way down, and partly because I knew they'd run well eventually once I upgraded. Even as recently as The Witcher 2, games shipped with settings that weren't meant to run well on the PCs most people had at the time; they were for future hardware.
I mean, even back then, people complained and moaned about this stuff and about games not including 112092910 different graphics settings, but I feel like these days PC gamers have gotten too used to being able to run every game on ultra. Apart from some PC exclusives, PC games, especially console ports, just aren't designed with future tech in mind anymore. PC gamers expect to be able to set everything to max if they have the most up to date hardware, and get extremely mad if it doesn't run absolutely perfectly when they do.
I feel like it's not a good thing, and that it's holding developers back from doing really ambitious things. What do you think?