Message for devs: graphical settings are the wrong direction for console games

Do you want graphics modes in console games?

  • No, if I wanted that I would play on PC (Votes: 101, 41.1%)
  • Sure, why not? (Votes: 89, 36.2%)
  • Yes, consoles should become more like PCs (Votes: 56, 22.8%)

Total voters: 246
Graphics settings are one of the reasons why I switched to PC. Now I can disable depth of field!
 
I personally think it's just a coward's way out. If you can't make a game run well, then you are overreaching.
Drop the settings and make everything look clean and run at 60 fps.
 
It's a really selfish take. One size doesn't fit all. If you don't like settings, don't click on the settings button.
 
This thread makes me think of gamers who play through an entire RPG/ARPG without ever opening the equipment and magic screens to use the new spells, classes, and weapons they pick up every 30 minutes, because it takes too much work.
 
Sorry for joining the conversation, but you're the one doing revisionism here: until the PS4 generation there were virtually no settings options for the user.
I'm just stating facts.
The PS2 era was actually 'setting rich' compared to the previous gen: well above 50% of the PS2/GC library (100% of the Xbox library) had progressive-scan options, 60%+ had 16:9, 50/60 fps selection was in the majority of PAL-region releases (it would have been common in NTSC too, but Sony actively blocked it), and FOV, HDTV, and AA options were in the single-digit percentages.
And there were a handful of games that had detail settings too.

The PS3/360 era upped the ante significantly; the only difference is that it was more 'stealthy' about it. Just because the settings were controlled mostly through the system menu doesn't mean the modes stop existing.
Close to 100% of games had at least two modes for HD/SD, which would equate to resolution/performance modes in modern terms (most games didn't take the opportunity to uncap the SD framerate, but since nothing ran at a stable fps that gen anyway, you still got a substantially better framerate). The 4:3 option was also usually hidden inside the SD setting.
Additionally, a sizeable number of games added further HD modes with different visual and performance settings (all GT games had two HD modes; many Sony and EA sports games had 1080p/30 and 720p/60 modes; etc.). Adjustable FOV and AA also made a return in a number of games, though still quite few overall.
 
I'm just stating facts.
The PS2 era was actually 'setting rich' compared to the previous gen: well above 50% of the PS2/GC library (100% of the Xbox library) had progressive-scan options, 60%+ had 16:9, 50/60 fps selection was in the majority of PAL-region releases (it would have been common in NTSC too, but Sony actively blocked it), and FOV, HDTV, and AA options were in the single-digit percentages.
And there were a handful of games that had detail settings too.
In my opinion, you make a mistake when you count progressive-scan activation as 'rich settings'. That makes our dialogue unfeasible, because it's like comparing a 'sound test' to 'settings'.
In this thread we are talking about 'graphics settings chosen by the user'. Yes, on the PS1 you could run GT at 60 fps, and Toshinden too, and on the Dreamcast you could do the same thing in Re-Volt, but that is not like the PS5 generation, where there are many settings when the ideal is just one.

If we have options, then let them be as complete as on a PC, not just 2 or 3. This halfway approach is absurd.
 
All I want in my games is a field-of-view adjustment and a performance mode that is locked at 60. Drop whatever else you need to achieve that.
 
You can blame the Xbox Series S.

If it were just the PS5, PS5 Pro, and Series X, you could create a far more even playing field to aim for.

That piece of shit means you have to develop for it first.
 
Developers should have their default settings for the casual plebs,

and add a cheat code for people like me that opens up a mostly complete graphics settings menu: refresh rate and framerate toggles, different quality settings for each effect, and a resolution target setting.

Because I know for a fact that I could configure the majority of AA and AAA games better than the devs do.

Apex Legends in its 120 fps mode, for example, has SSAO enabled: a setting that can in some scenes cost around 15% of performance, and in extreme cases even more than that.
Meanwhile, the dynamic resolution drops below 1080p at times and the framerate is unstable.
I'd lock it between 1080p and 1440p with everything on the lowest possible setting: AO off, TAA off, spot shadows off, etc.
And I bet it would look and run better than it currently does.
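
To put that rough 15% figure in perspective, here is a quick frame-time sketch. The SSAO cost is the poster's estimate, not a measured number, and real costs vary per scene:

```python
# Rough frame-budget arithmetic for a 120 fps target.
# The ~15% SSAO cost is the poster's estimate, used purely for illustration.
target_fps = 120
frame_budget_ms = 1000 / target_fps            # ~8.33 ms per frame
ssao_cost_ms = 0.15 * frame_budget_ms          # ~1.25 ms spent on SSAO

# Frame time with SSAO removed, and the headroom that frees up.
remaining_ms = frame_budget_ms - ssao_cost_ms  # ~7.08 ms
fps_without_ssao = 1000 / remaining_ms         # ~141 fps

print(f"budget: {frame_budget_ms:.2f} ms, SSAO share: {ssao_cost_ms:.2f} ms")
print(f"same GPU time without SSAO: ~{fps_without_ssao:.0f} fps")
```

In other words, at 120 fps even a ~1 ms effect eats a meaningful slice of the budget, which is why dropping it could stabilize an unstable mode.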
 
I like options, but the more graphics modes there are, the less polished each mode will be.

[slap GIF]

Ha! I walked right into that one.

I mean, it does not require me to interact with them before it lets me play the game. It's something PC players can ignore if they choose to.

I always take a look and make a few adjustments because I like the deep dive, but I get the impression there are console players who think you have to spend a bunch of time picking options before you're able to start playing. That's never been the case, AFAIK.
 
OLED has made 30 fps unbearable. The added input latency is also a killer.

Use 120 Hz output with an uncapped framerate under VRR and target 60 fps. No need for any other setting. People with TVs from 2018 and older that lack these features can settle for 60 fps with occasional frame drops.
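
The latency point follows from simple frame-time arithmetic. A minimal sketch; it counts only frame delivery time, not the TV's own processing latency, which varies by set:

```python
# One frame of delay is the minimum extra input latency a lower
# framerate adds, before any display processing is counted.
for fps in (30, 60, 120):
    print(f"{fps:3d} fps -> {1000 / fps:.2f} ms per frame")
# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms

# With 120 Hz output and VRR, a game targeting 60 fps that dips to
# 55 fps still presents each frame as soon as it is ready (~18.2 ms)
# instead of stalling for the next fixed 60 Hz vsync slot (33.3 ms).
print(f"55 fps under VRR -> {1000 / 55:.1f} ms per frame")
```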
 
At this point most games are being made to run on a whole range of devices: mobile, Switch, desktop PC, consoles, and of course the cloud.

Do you really think dropping one graphics mode would move the needle one bit?

I don't think so.
 
And as long as we're at it, stop wasting resources making games for the PC. Those fucking things are for the ladies at work to make PowerPoints or whatever.

Thanks in advance.
 
Why do you feel the need to tinker with something the author has already set up exactly the way he thinks it should be? On a fixed system?

I understand a quality/performance preference, but other than that, what's the point?
Because different people like and dislike different things. Just because From enables poor ray-traced shadows by default in one of their games doesn't mean that's best for me.
 
It always makes me chuckle how console gamers look down on PC graphics settings and think of them as a 'bad' thing, when in reality I find it incredibly fun, and even slightly addicting: getting a new game and then tinkering with the settings, trying to squeeze out a bit more juice, pushing the resolution that bit more, pushing the settings that bit more while keeping my desired 60 fps, etc.

It's part of the reason why benchmarking culture is such a big thing on PC.

I don't think I could ever go back to consoles with their very limited options, or, in the OP's case, zero options.
Why anyone would want that, I don't know.
 
Menu settings aren't really anything to be afraid of. But .ini files, hidden engine options, trial-and-error tweaks, and comparing pixels in screenshots to verify you have the best settings: that's the kind of OCD you might want to avoid as a console player.
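
For the curious, that kind of .ini tweaking often looks something like the sketch below for an Unreal Engine game. The path and values are illustrative assumptions; the cvar names are standard UE console variables, but which of them a given game actually respects varies per title:

```ini
; Hypothetical user override, typically placed somewhere like
; %LOCALAPPDATA%\<GameName>\Saved\Config\WindowsNoEditor\Engine.ini
[SystemSettings]
r.DepthOfFieldQuality=0   ; turn off depth of field
r.MotionBlurQuality=0     ; turn off motion blur
r.ScreenPercentage=85     ; render at 85% resolution and upscale to output
```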
 