Need to keep in mind that a PC has separate memory pools, RAM + VRAM, whereas on consoles it's a single pool shared by the CPU, GPU, and OS.
I'll take Shadow of Mordor once again as an example: that game (IMO) does not look impressive at all at 4K, yet it eats nearly 7 GB of VRAM.
It wasn't created from the ground up for 4K, so I'd expect games that are designed with 4K in mind to require even more memory.
Sony bought the cheapest chips from Samsung for the original PS4 (512 MB x 16):
http://www.samsung.com/semiconductor/products/dram/graphic-dram/gddr5-component/K4G41325FC?ia=759
For the PS4 Pro they went with 1 GB chips:
http://www.samsung.com/semiconductor/products/dram/graphic-dram/gddr5-component/K4G80325FB?ia=759
They could very well go with 1 GB chips x 24, which would probably give the best result in terms of capacity and price.
You'd have the same 384-bit setup as the Xbox One X with its 12 GB, except wired in clamshell like the base PS4 is, for a total of 24 GB.
That way they can keep the bus width at 384-bit (12 x 32-bit channels) while doubling the memory available (someone correct me on this).
I can't find the article, but in clamshell mode two chips share each 32-bit channel, so you get the full capacity of all the chips without widening the bus; bandwidth stays whatever the bus itself delivers.
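Rough back-of-the-envelope check of that clamshell idea. The chip count, per-chip capacity, and channel pairing below are my assumptions based on the configuration described above, not anything Sony has confirmed:

```python
# Clamshell sketch: two chips share each 32-bit channel, so capacity
# doubles while the bus width stays the same as with half the chips.
chips = 24
gb_per_chip = 1            # 8 Gb (1 GB) GDDR parts, as discussed above
chips_per_channel = 2      # clamshell pairing
channel_width_bits = 32

total_capacity_gb = chips * gb_per_chip
bus_width_bits = (chips // chips_per_channel) * channel_width_bits

print(total_capacity_gb)   # 24 (GB total)
print(bus_width_bits)      # 384 (bit bus, same as 12 chips non-clamshell)
```

Which is the same bus width the Xbox One X gets from 12 chips, just with double the memory hanging off it.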
You'd have something like 576 GB/s to 768 GB/s depending on where GDDR6's final per-pin rate lands (12 to 16 Gbps) on a 384-bit bus, since clamshell adds capacity but not bandwidth.
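The bandwidth math is just bus width times per-pin data rate. The 12 to 16 Gbps range below is my assumption based on the rates that have been floated for GDDR6, not a final spec:

```python
# Peak bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 -> GB/s.
# Clamshell doesn't change this; it only doubles capacity per channel.
bus_width_bits = 384

def peak_bandwidth_gbps(per_pin_gbps):
    # total bits per second across the bus, divided by 8 for GB/s
    return bus_width_bits * per_pin_gbps / 8

for rate in (12, 14, 16):
    print(rate, peak_bandwidth_gbps(rate))
# 12 -> 576.0, 14 -> 672.0, 16 -> 768.0 GB/s
```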
On paper that all looks impressive, and it's easier to communicate to the general consumer who isn't as knowledgeable.
They'll see that it's X times better as a reason to upgrade, and that's the vast majority of who Sony is selling to.