
Should PC Games Go Back To Being Designed With Future Hardware In Mind?



(I wanted to make this question include console gamers too, but on console a game is either optimised or it isn't; you can't go out and buy better hardware, so I don't think this applies to consoles.)

I remember back in the day, I'd buy games that I could barely run, partly because I just wanted to play the game, even if I had to turn the settings all the way down, and partly because I knew it'd run well eventually once I upgraded. Even as kinda-sorta recently as The Witcher 2, games included a lot of settings that weren't intended to run well on the PCs most people had at the time; they were for future PCs.

I mean, even back then, people complained and moaned about this stuff and about games not including 112092910 different graphics settings, but I feel like these days PC gamers have gotten too used to being able to run every game on ultra, because apart from some PC exclusives, PC games, especially console ports, aren't designed with future tech in mind anymore. PC gamers expect to be able to just set everything to max if they have the most up-to-date hardware, and get extremely mad if it doesn't run absolutely perfectly when they do.

I feel like it's not a good thing, and that it's holding developers back from doing really ambitious things. What do you think?
 
Some games already get designed for future hardware. For example, Avatar has some hidden settings that will push your PC to its limits. The problem nowadays compared to previous generations is the diminishing returns. Even if they make games with some future tech, it won't look super amazing compared to what we have now, nor will many care. Also, considering how hardware is getting fucked, this will probably die.
 
Yes. I'm spending $4K on a machine to play fucking state-of-the-art games, not better versions of console ports. PC exclusives need to be a thing again.
 
I guess I don't really see the problem. Especially now that we have Steam and GOG, PC games aren't like console games, where you need to wait for them to be remastered for the next generation. You buy a PC game, you have it forever, and if it's a good game, you should want to go back and replay it once in a while, so why not allow for that?

Also, more importantly, it allows developers to include more ambitious gameplay features, because who cares if they run perfectly or not on current hardware? Obviously they'd need to communicate that effectively to their players, though.
 
As long as my 5070 Ti and 9800X3D can handle it with path tracing + DLAA + frame gen x2, or path tracing + DLSS Quality with frame gen off, on a 1440p monitor, because I'm not getting a new PC for close to another decade.
 
To some extent they already are. It's not like any of the current games that offer path tracing and all that are doing so at 4K at decent framerates without the use of framegen and upscaling.
 
No because it would just become an excuse not to optimise the game better.

Sure, implement some crazy optional feature, much like path tracing in Cyberpunk back in the day, but a game aiming to simply run like crap unless there's new hardware shouldn't be a thing, unless you're specifically targeting new paradigms.
 
Right, of course I'm not talking about games being unoptimised out of laziness. I'm absolutely talking about really ambitious things like path tracing. Actually, what I had in mind specifically was: why don't any of these multiplayer games support gigantic NPC armies fighting alongside you? Is it just because it'd run horribly on current hardware/internet connections? I don't think that should really matter.
 
Unfortunately, a feature like that will never be game-changing when nobody can currently use it; it will be relegated to optional things like graphics settings that have little impact on gameplay. At best you might get a traffic/enemy density slider that goes beyond what current hardware supports, or something like that. You're better off just adding the gameplay-changing features down the line, if the game is still alive.
 
Look at Crimson Desert's Cinematic settings and Nvidia Ray Reconstruction.
The game was never marketed with any of these settings; all the PC trailers used Ultra settings without Ray Reconstruction.

I'm all for future-proof games. One of the best aspects of PC is replaying your old games at higher res / max settings when you upgrade your GPU.

Currently playing the RE2 remake (on randomizer) on a 5080 at 4K + 180% resolution scaling, getting 80 to 100 fps. I still see new details in the textures compared to when I played it at 4K (100% base resolution) at 60 fps on my 3080 12GB.
 
I was into PC gaming back in the day for the cutting-edge games and the games that couldn't be played on console.

Games you needed a new GPU or PC to play, or that benefited greatly from one.

Now it's mostly console ports in a world of diminishing graphical returns.
 
We're still seeing some games with additional features that are difficult to run; most people are still going to struggle to get path tracing working well, and we're also going to see neural rendering added in the future as another visual step up that only the top high-end portion of consumers will get. For most everyone else it will be an upgrade later on, with mid-tier to budget GPUs.

It would be cool to get another Crysis that gobsmacks everything else, but that studio couldn't even sustain that for another game, and the series went right back into multi-platform restrictions. Platforms are blurring more together, so most of what you get that's different on PC are games that really only work with mouse/keyboard, and genres that have their audience there. Graphics in general are hitting diminishing returns, so it's getting harder for me to care about them versus performance.
 
Doesn't make sense to me.
Game budgets are already very high, a lot of games feel poorly optimized for the hardware we currently have, not to mention they often launch in a buggy/unfinished state. And we want devs to spend time implementing features almost no one can use?

Besides, isn't stuff like path tracing basically this? Sure, there are high-end GPUs that run it decently, but it's still very demanding for the average gaming PC, and it's not going to run well on any of the top 10 GPUs in the Steam hardware survey. Not to mention these high-end GPUs now cost more than a whole high-end PC used to cost back in the day.
 
Yes, but you'd have to hide it, or name it something other than ultra settings (like "future hardware" settings) so 5090 owners won't get butthurt.
 
Thing is, there aren't many graphics intensive PC exclusives any more, and tech advancement has slowed in relative terms.

I lean towards no. I'm not against developers including settings that really need future hardware to run well, but I don't think they should primarily be designed around anticipated future tech. Plus, I don't miss games running at 15fps.
 
I think RE9 has done it best: plan for the lowest and add for the highest. That seems to work a lot better than doing the opposite, which devs have been doing for years now.
 
Requiem did it best: make all the settings fairly scalable, but have one specific setting or feature that's designed with high-end or future PCs in mind.
 
No. This made sense back in the day, when the next iteration of hardware was just the same thing but faster. Then we went to multiple cores on CPUs, and to upscaling, ray tracing, temporal anti-aliasing, machine learning, etc. on GPUs. Now we have DLSS 5 AI filters.
 
No. I remember when Crysis did this back in the day; nobody who turned AA on could run the game at more than barely 20fps, so you had to wait for new graphics cards, and by then the game was no longer new. Making games run as well as they can on current hardware is my preference.
 
No, if anything because it tends to get that future wrong.

Case in point: Crysis.

Bingo.

Further, ambition has already outstripped current hardware capabilities so walking further down that road is the opposite of what I'd like to see happen.
 