Yeah, it seems too expensive to keep up the 8x generational RAM increase. But there's no way you can shift data into DRAM fast enough on a frame-by-frame basis - even a small fraction (e.g. 1GB) at 30 fps means you need 30GB/s of bandwidth .. even Sony's custom solution isn't expected to reach that.
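For what it's worth, a quick sketch of that back-of-envelope math (the 1 GB/frame figure is just the illustrative assumption from above, not a measured number):

```python
# Rough streaming-bandwidth estimate: data moved per frame times frame rate.
data_per_frame_gb = 1.0   # GB shifted into DRAM each frame (illustrative assumption)
frames_per_second = 30

required_bandwidth = data_per_frame_gb * frames_per_second
print(f"Required streaming bandwidth: {required_bandwidth:.0f} GB/s")
# -> Required streaming bandwidth: 30 GB/s
```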
I'd be interested to hear what this means for games - as I understand it, a 4K texture should be 4x the size of a "1080p" one (double the resolution in each dimension).
.. but how does that actually work in practice? Are current textures actually detailed enough for 4K? Is there a way to cheat on textures? How much of an average game's RAM allocation is textures? etc. etc.
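As a rough illustration of that 4x figure (assuming uncompressed RGBA8 at 4 bytes per texel; real games use block compression like BC7, which changes the absolute sizes but not the ratio):

```python
# Uncompressed texture size: width * height * bytes per texel.
# 4 bytes/texel (RGBA8) is an assumption; compressed formats shrink
# the absolute numbers but the 4x ratio between the two stays.
def texture_bytes(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel

size_1080p = texture_bytes(1920, 1080)   # "1080p"-class texture
size_4k    = texture_bytes(3840, 2160)   # "4K"-class texture

print(f"1080p: {size_1080p / 2**20:.1f} MiB")   # ~7.9 MiB
print(f"4K:    {size_4k / 2**20:.1f} MiB")      # ~31.6 MiB
print(f"Ratio: {size_4k / size_1080p:.1f}x")    # 4.0x
```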
Games will always be bandwidth limited because of the closed architecture of console/PC GPUs. It's difficult to say whether 4K textures are enough. And any cheats to keep texture storage down would have been utilized a long time ago. The big issue with moving to 4K is that you need a 1:1 resolution match across all of the texture maps for a particular asset. For example, say you have an asset that requires the following components to render it:
Base color (which could have 2 or more textures blended together -- e.g. scratches, paint, scuffs, etc.)
Ambient color
Specular color
Emissive color
SSS color, etc..
Then, at the same resolution (in order to have the best quality), you need:
Base normal map (1 or more blended together)
Specular normal map
etc.. etc..
Then all of those textures have to be mip-mapped into other levels of detail (4k, 2k, 1k, 512x512, 256x256, etc.. etc.. down to 1x1)
Do you see the pattern? A HUGE amount of bandwidth is needed throughout the entire graphics pipeline.
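To put very rough numbers on that pattern, here's a sketch of how the per-asset cost stacks up once you count several maps at full resolution plus their mip chains (the map count and the 4-byte/texel uncompressed format are illustrative assumptions; a full mip chain adds roughly a third on top of the base level):

```python
# Rough per-asset texture memory: several maps at the same resolution,
# each mip-mapped down to 1x1. Uncompressed RGBA8 assumed; the map
# count of 7 is illustrative (base color, ambient, specular, emissive,
# SSS, normal maps, ...).
def mip_chain_texels(size):
    """Total texels for a square texture plus all mip levels down to 1x1."""
    total = 0
    while size >= 1:
        total += size * size
        size //= 2
    return total

maps_per_asset = 7
bytes_per_texel = 4   # RGBA8 (assumption)

for base in (2048, 4096):
    per_map = mip_chain_texels(base) * bytes_per_texel
    total = per_map * maps_per_asset
    print(f"{base}x{base}: {per_map / 2**20:.0f} MiB per map, "
          f"~{total / 2**20:.0f} MiB per asset")
# 2048x2048: ~21 MiB per map, ~149 MiB per asset
# 4096x4096: ~85 MiB per map, ~597 MiB per asset
```

Going from 2K to 4K roughly quadruples every one of those maps, which is where the storage and bandwidth pressure comes from.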