This is such a big gripe for me. Console devs ignore AF as if it were expensive to run... it makes almost zero performance difference, but the visual quality improvement is massive.
visual change is massive if you're primarily playing on a monitor, or if you sit closer to your TV than the average casual player does
there's a reason blurry TAA became so widespread: its blur is only really a problem up close or on a monitor (where you need extremely high resolutions to fix it), while a ps4/one s + TV user at medium/long distance doesn't even notice how blurry RDR 2 is at 1080p and 864p respectively
all graphics design choices are made for people who play from long distances at this point. and for that reason it's easy to see why devs would set AF to 4x and forget about it: realistically, their main target userbase won't notice or care
that is also why nvidia/amd etc. have dedicated AF override settings in their control panels: if 4x were the untweakable standard on PC, there would be a massive outcry.
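for anyone wondering why "just turn it up" is a reasonable ask: in engine terms, AF is a single sampler parameter per texture, and the driver override basically forces that value regardless of what the game sets. a rough OpenGL sketch (assuming a live GL context and a texture already bound; constants are the standard ones from GL_EXT_texture_filter_anisotropic, core since GL 4.6):

```c
/* sketch only: assumes an existing GL context and a texture bound
   to GL_TEXTURE_2D. */
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

GLfloat max_aniso = 1.0f;
glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso); /* typically 16.0 */

/* "set it to 4x and forget" is literally this one line with 4.0f;
   16x is the same line, and near-free on any modern GPU: */
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
```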
look at horizon forbidden west. it had ugly, overdone sharpening on console for 2 years and barely anyone complained. bring the same sharpening to the PC version and the forums fill up with complaints, because aggressive sharpening creates artifacts that are very noticeable when you're close to your screen.
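it's easy to show why overdone sharpening produces halo artifacts: unsharp-mask style sharpening pushes pixel values past the original edge contrast. a tiny C toy with a made-up 1-D "image" and strength value (not HFW's actual filter, just the general technique):

```c
/* unsharp mask on a 1-D step edge:
   sharpened = orig + amount * (orig - blurred).
   values overshooting past 50/200 are the dark/bright halos
   you see hugging edges on screen. */
#include <stdio.h>

int main(void) {
    float edge[8] = {50, 50, 50, 50, 200, 200, 200, 200};
    float amount = 0.8f; /* arbitrary, like an overdone in-game slider */
    for (int i = 1; i < 7; i++) {
        float blurred = (edge[i - 1] + edge[i] + edge[i + 1]) / 3.0f;
        float sharp = edge[i] + amount * (edge[i] - blurred);
        printf("pixel %d: %6.1f -> %6.1f\n", i, edge[i], sharp);
    }
    return 0;
}
```

run it and the pixels right at the edge come out around 10 and 240 instead of 50 and 200, i.e. a dark ring and a bright ring that were never in the source image.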