Traditional MSAA is sometimes shoe-horned into recent titles, but even 2x MSAA can incur a good 20-30 per cent hit to frame-rates. Modern game engines based on deferred shading aren't really compatible with MSAA, to the point where many games now don't support it at all, while others struggle. Take Far Cry 4, for example. During our Face-Off, we ramped up MSAA in order to show the PC version at its best. What we discovered was that foliage aliasing was far worse than on the console versions (a state of affairs that persisted even with Nvidia's proprietary TXAA), and the best results actually came from post-process SMAA, which barely impacts frame-rate at all - unlike the multi-sampling alternatives.
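To see why deferred renderers struggle, consider the G-buffer: every render target has to store one value per sample, so memory and bandwidth scale directly with the MSAA factor. The sketch below is purely illustrative - the four-target layout and formats are hypothetical assumptions, not any specific engine's configuration - but it shows how quickly the footprint balloons.

```python
# Rough, illustrative calculation of why MSAA is costly under deferred
# shading: every G-buffer target stores one value per sample, so its
# footprint scales with the MSAA sample count.

WIDTH, HEIGHT = 1920, 1080  # 1080p

# Hypothetical G-buffer layout (bytes per pixel, per target) -
# an assumption for illustration, not a real engine's setup.
GBUFFER_TARGETS_BYTES_PER_PIXEL = [
    4,  # albedo          (RGBA8)
    8,  # normals + gloss (RGBA16F)
    4,  # material params (RGBA8)
    4,  # depth/stencil   (D24S8)
]

def gbuffer_megabytes(msaa_samples: int) -> float:
    """G-buffer footprint in MB for a given MSAA sample count."""
    bytes_per_pixel = sum(GBUFFER_TARGETS_BYTES_PER_PIXEL)
    total_bytes = WIDTH * HEIGHT * bytes_per_pixel * msaa_samples
    return total_bytes / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA: {gbuffer_megabytes(samples):7.1f} MB")
```

Under these assumptions, the G-buffer alone grows from around 40MB with no MSAA to over 300MB at 8x - before textures, shadow maps or any other allocation. Post-process techniques like SMAA and FXAA, by contrast, operate on the final resolved image and add no such per-sample multiplier, which is why their frame-rate impact is so small.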
Looking to another title supporting MSAA - Assassin's Creed Unity - the table below illustrates starkly why multi-sampling is on the way out in favour of post-process anti-aliasing alternatives. Here, we're using a GTX Titan to measure memory consumption and performance, the idea being to gauge VRAM utilisation in an environment where GPU memory is effectively limitless - only it isn't. Even at 1080p, ACU hits 4.6GB of memory utilisation at 8x MSAA, while the same settings at 1440p actually see the Titan's prodigious 6GB allocation totally tapped out. The performance figures speak for themselves - at 1440p, only post-process anti-aliasing provides playable frame-rates, but even then, performance can drop as low as 20fps in our benchmark sample. In contrast, a recent presentation on Far Cry 4's excellent HRAA technique - which combines a number of AA approaches, including SMAA and temporal super-sampling - shows it delivering stunning results with just 1.65ms of total rendering time at 1080p.
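That 1.65ms figure is worth putting into context against standard frame-rate targets - a quick bit of arithmetic on the number quoted above, assuming the usual 30fps and 60fps frame budgets:

```python
# Context for Far Cry 4's quoted 1.65ms HRAA cost: what share of a
# standard frame budget does it consume? Pure arithmetic on the
# figure cited above.

HRAA_COST_MS = 1.65  # total AA rendering time at 1080p, per the presentation

for fps in (30, 60):
    frame_budget_ms = 1000.0 / fps
    share = HRAA_COST_MS / frame_budget_ms * 100
    print(f"{fps}fps: {frame_budget_ms:.1f}ms budget, "
          f"HRAA takes {share:.1f}% of the frame")
```

Even against a 60fps target's 16.7ms budget, that's under ten per cent of the frame - a far cry from the crippling cost of 8x MSAA seen in ACU.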
An illustration of just how many GPU resources are sucked up by MSAA. As it is, only FXAA keeps ACU frame-rates mostly above the 30fps threshold, and our contention is that GPU resources are better spent on tasks other than multi-sampling.