You may recall the "Watch Dogs" fiasco, where the demo shown at E3 2012 displayed much better graphics than the version that shipped at retail. I remember the explanation being that the consoles available at the time weren't equipped to handle the graphics shown at E3, but that never made much sense as a reason to affect PC users.
As it later turned out, Ubisoft wanted "parity" across platforms, which many users found ridiculous. It isn't fair to hold back PC users simply because top-end PCs performed significantly better than the consoles that were out at the time.
Are developers still trying to enforce "parity" by artificially lowering the quality of the PC versions of their games, or do the newer consoles have hardware that can compete with today's top-end PCs?