Thanks for the reply, that's a real-world example, which is good. If the majority of that is assets, then Ultra settings at 1080p should run into the same problems on 4GB, right? If not, I'm still wondering why the requirement is so much higher for 4K: the assets are the same, and if the GPU can handle the workload, why does the memory requirement grow so much? As you say, the actual frame buffer hasn't been a problem for a while.
What I'd love to know is the breakdown of a game's VRAM usage, and, given the same settings (say Ultra), what actually changes between 1080p and 4K (or any resolution above 1080p). I know things like AA consume more frame buffer space, and so do double and triple buffering.
What I'm hoping this discussion gets us is a better understanding of whether 4GB is enough if a different memory architecture is used (HBM). For example, does increasing the resolution 2x or 4x consume an extra 1GB or 2GB just by itself?
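To put some rough numbers on the "just by itself" part, here's a back-of-envelope sketch. The target formats and counts are assumptions for a fairly typical deferred renderer, not from any particular engine, so treat the totals as ballpark only:

```python
# Rough estimate of the resolution-dependent part of VRAM.
# Assumed (not from any specific game/engine): a deferred renderer with a
# handful of G-buffer targets, 32-bit depth, HDR color, and triple buffering.
BYTES_PER_PIXEL = {
    "color (RGBA16F, HDR)": 8,
    "depth/stencil (D32S8)": 5,
    "G-buffer (4 x RGBA8)": 16,
    "motion vectors (RG16F)": 4,
}
SWAPCHAIN_IMAGES = 3  # triple-buffered RGBA8 backbuffer

def render_target_mb(width, height):
    pixels = width * height
    total = sum(bpp * pixels for bpp in BYTES_PER_PIXEL.values())
    total += SWAPCHAIN_IMAGES * 4 * pixels  # backbuffers
    return total / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of resolution-dependent buffers")
```

Under those assumptions the purely resolution-dependent buffers go from roughly 90MB at 1080p to roughly 360MB at 4K. That's a few hundred MB, not 1-2GB, so any bigger jump would have to come from the engine keeping higher-detail assets or larger caches resident at 4K, not from the frame buffer itself.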
I would like an explanation of this as well. Why does > 1080p necessarily require a lot more VRAM?
The assets aren't any different (?), so why is 4GB suddenly not enough?
Also, just because some games "use" more than 4GB doesn't mean they "need" more than 4GB.
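On the "use" vs "need" point, here's a toy sketch of why the two can differ. The class, names and sizes are made up purely for illustration: a streaming cache will happily fill whatever budget the card gives it, but a smaller budget just means more streaming, not a hard failure:

```python
# Toy illustration of "uses" vs "needs": a streaming cache fills whatever
# budget it is given, but still works (with more re-loads) on a smaller one.
import random
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB, LRU order
        self.used = 0
        self.loads_from_disk = 0

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:        # cache hit: cheap
            self.resident.move_to_end(tex_id)
            return
        self.loads_from_disk += 1          # cache miss: stream it in
        while self.used + size_mb > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[tex_id] = size_mb
        self.used += size_mb

# Same access pattern, two budgets: the bigger card "uses" more VRAM,
# but the smaller one still serves every request, just with more streaming.
random.seed(0)
frames = [random.sample(range(200), 40) for _ in range(100)]
for budget in (4096, 6144):
    cache = TextureCache(budget)
    for frame in frames:
        for tex in frame:
            cache.request(tex, 32)         # pretend every texture is 32 MB
    print(f"{budget} MB budget: {cache.used} MB resident, "
          f"{cache.loads_from_disk} streaming loads")
```

The larger budget shows higher "usage" in a monitoring tool, but both runs get every texture they ask for; the real question is at what point the extra streaming starts causing hitches.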