Mr.Phoenix
Member
So from what you have said... I am getting that the primary reason upgrades are deemed OK in the PC space is because you get higher framerates and are not locked into some sort of visual presets. There are no restrictions on a PC: no removed options menus, no capping the framerate at 60 or 120, no being forced into dynamic resolution or a specific antialiasing method.
You can test how good a new graphics card is on a 10 year old game if you want.
Can still be an unnecessary upgrade of course if you're up at 400fps or whatever, but you can at least see the power difference.
On console you're stuck with the modes a developer predefines, like the ini-file presets here.
If developers don't push the new hardware enough because the old hardware is still the main target, you'll never see what it's capable of. You'll end up watching Digital Foundry comparison videos, zoomed in 300x, trying to spot dynamic resolution changes and frametime graph differences.
Eventually you'll look at the tech specs, wonder where all the power went, and wish they had started a new generation instead, where devs could target the new hardware.
Ok.
So if a PS5 Pro does give you higher framerates, more graphical features, and better resolutions, doesn't that meet the supposed requirement to justify it as an upgrade?
Or does this only count if you're taking a game up to 200fps+?
I would think it's generally known that consoles cater to the home TV market, so what you call restrictions are actually standards these consoles adhere to, given their primary market, which typically caps out at 120fps. We can also already see that current consoles run games anywhere between 30-120fps at varying resolutions and image quality. So if something does all of that while running 50% better, or looking 50% better, or both... even while operating within the 30-120fps window that is common for home consoles... how does that not warrant a $100 upgrade?