These are GCN GPUs, so just do the following: load up a PC game at settings that run well with AF disabled (30 fps, 60 fps, whatever you want, just keep everything the same apart from AF), run a benchmark, then run it again with 16x AF. Look at the difference; at most it will be ~2 FPS. I have tried to find performance benchmarks of various AF settings, but other than a Skyrim benchmark from Nvidia (which doesn't disclose the GPU) where going from 0x to 16x AF cost ~2 FPS, the only ones I can find are 5+ years old.
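The comparison above can be sketched as a quick script, assuming you have per-frame frametimes (in milliseconds) from whatever capture tool you use; the sample numbers here are made up for illustration:

```python
# Minimal sketch: compare average FPS between two benchmark runs given
# their per-frame frametimes in milliseconds. The frametime values below
# are hypothetical placeholders, not real measurements.

def avg_fps(frametimes_ms):
    """Average FPS over a run: total frames divided by total seconds."""
    total_seconds = sum(frametimes_ms) / 1000.0
    return len(frametimes_ms) / total_seconds

# Hypothetical captures: one run with AF off, one with 16x AF.
no_af = [16.6, 16.7, 16.5, 16.8, 16.6]
af_16x = [16.9, 17.0, 16.8, 17.1, 16.9]

fps_off = avg_fps(no_af)
fps_16x = avg_fps(af_16x)
print(f"AF off: {fps_off:.1f} FPS, 16x AF: {fps_16x:.1f} FPS, "
      f"delta: {fps_off - fps_16x:.1f} FPS")
```

Averaging frametimes rather than eyeballing an FPS counter keeps the comparison fair, since everything but the AF setting stays identical between runs.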
It really is that easy; why devs are not doing it is a total mystery to me.