After reading thuway's original post, I think there are a couple of points to consider: a game's resolution, frame rate, geometry complexity, and so on all affect how much memory it needs at any given time.
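To give a rough sense of how much resolution alone matters, here's a quick back-of-envelope sketch (my own illustrative numbers, not from either review) of how much memory just the colour buffers take up as the resolution grows; real games pile depth buffers, render targets, textures, and geometry on top of this:

    # Rough colour-buffer cost at a given resolution (illustrative only)
    def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
        # width * height * bytes per pixel, times the number of buffers in the swap chain
        return width * height * bytes_per_pixel * buffers / (1024 ** 2)

    print(f"1280x720:  {framebuffer_mb(1280, 720):.1f} MB")   # ~10.5 MB
    print(f"1920x1080: {framebuffer_mb(1920, 1080):.1f} MB")  # ~23.7 MB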
While there are no 8GB DDR3 graphics cards out there, I did find a review comparing two Nvidia GT 440 cards that were identical except for the RAM configuration: 1GB of DDR3 vs. 512MB of GDDR5.
Across the tests, the GDDR5 card performed anywhere from 4% to 13% faster than the DDR3 variant, which suggests that having less but faster RAM is preferable to having more but slower RAM.
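That gap makes sense once you look at peak bandwidth rather than capacity. As a rough sketch (the 128-bit bus and the clocks below are the GT 440 specs as I understand them, so treat them as assumptions), peak bandwidth is just bytes per transfer times transfers per second:

    # Peak memory bandwidth = (bus width in bytes) * (effective transfer rate)
    def peak_bandwidth_gbs(bus_width_bits, effective_rate_gtps):
        return (bus_width_bits / 8) * effective_rate_gtps

    ddr3  = peak_bandwidth_gbs(128, 1.8)  # 128-bit bus, ~1.8 GT/s -> ~28.8 GB/s
    gddr5 = peak_bandwidth_gbs(128, 3.2)  # 128-bit bus, ~3.2 GT/s -> ~51.2 GB/s
    print(f"DDR3: {ddr3:.1f} GB/s, GDDR5: {gddr5:.1f} GB/s ({gddr5 / ddr3:.2f}x)")

On paper that's close to double the bandwidth, which helps explain why the GDDR5 card pulls ahead even with half the capacity, though real games are never purely bandwidth-bound.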
This doesn't paint the whole picture, however, as none of the games were tested at 1920x1080; the highest resolution used was 1680x1050.
So where does this leave us with regard to playing games at full HD resolutions? I found another article comparing the Nvidia GTX 680 2GB card against its 4GB sibling. The conclusion is that even at 2560x1440 there is no measurable difference between the two cards; in other words, none of the games tested benefited from the additional 2GB of GDDR5 memory. 2GB is enough for today's games.
What does that mean for the next-generation Xbox and PlayStation? I'm not sure, as there are many more variables we aren't currently privy to: overall system architecture, memory architecture, cache, OS overhead, bus design, physics processing, and so on. Still, it does seem that while a smaller amount of faster RAM is generally preferable, 4GB is already overkill for today's most demanding PC games.
In other words, it seems to me that both the next-generation PlayStation and Xbox will be well equipped to handle the visual demands of today's most demanding games. Having said that, who knows what kind of games we will have five years down the line?