Surprised to see all the talk that 4K is impossible, needs alien tech, etc. People are putting 4K on some kind of $1,000s pedestal, and the thread is littered with it.
The GTX 970 and 980 from 2014 draw around 150W on average, and they are capable of 4K gaming. Sometimes at full settings, sometimes you need to drop a few settings down, and even then the settings can be above console quality or on par. Of course not all games; AC Unity at 4K is pretty tough, but I can get around 30 fps at 3200x1800. Considering many console games dip below 30 often, it's really not out of reach.
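For some rough numbers on how far 3200x1800 is from full 4K, here's a back-of-the-envelope sketch. It assumes a purely GPU-bound game where frame time scales roughly linearly with pixel count, which is a simplification, and the 30 fps figure is just my AC Unity observation above:

```python
# Back-of-the-envelope pixel-count scaling, assuming a GPU-bound game
# where frame time scales roughly linearly with the number of pixels.
def pixels(w, h):
    return w * h

res_1800p = pixels(3200, 1800)   # 5,760,000 pixels
res_4k    = pixels(3840, 2160)   # 8,294,400 pixels

ratio = res_4k / res_1800p       # ~1.44x more pixels at 4K
fps_at_1800p = 30                # observed in AC Unity on a 970/980
fps_at_4k_estimate = fps_at_1800p / ratio

print(f"4K has {ratio:.2f}x the pixels of 3200x1800")
print(f"Rough GPU-bound estimate at 4K: {fps_at_4k_estimate:.0f} fps")
# ~21 fps -- so 4K/30 needs roughly another 40-50% of GPU headroom,
# not an order-of-magnitude leap.
```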
AMD will need to step up though. The PS4 draws around 130W total; its APU roughly breaks down as 25-30W for the Jaguar cores, with the rest going to the 7850-ish spec GPU.
Let's not forget that the PS3 launched at around 170W total. I can see a 150W total machine doing around 30 fps at 4K in existing games. We just have to wait and see how well the 14nm jump goes.
If you want a good example, look at the 4870 vs the 5870: double the performance at the same power.
The recent 28nm jump was frankly a load of rubbish, especially the 7000 series from AMD, and sadly the PS4 and XO used that family of GPU. Shortly after, Nvidia showed the way with the 750 Ti and its incredibly low power consumption. Knocking a 150W 970/980 down to around 100-110W is not some quantum leap; it happens all the time, and we're only talking about around 50W here.
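To put the power argument in rough numbers, here's a simple perf-per-watt sketch. The 2x gain for a full node jump is an assumption based on the 4870-to-5870 example above (real gains vary), and the 30W CPU budget is the Jaguar-class estimate from the PS4 breakdown:

```python
# Rough perf-per-watt sketch. The 2x gain per full node jump is an
# assumption based on the 4870 -> 5870 example above; real gains vary.
gtx_970_power_w = 150        # approx. average gaming draw
node_jump_gain = 2.0         # assumed perf/W improvement, 28nm -> 14nm

# Power needed for the same 4K/30-class GPU performance after the shrink:
power_for_same_perf = gtx_970_power_w / node_jump_gain
print(f"~{power_for_same_perf:.0f}W for 970-class performance at 14nm")

# Console-style power budget check (the PS3 launched around 170W total):
cpu_budget_w = 30            # Jaguar-class CPU estimate from the PS4
total_budget_w = power_for_same_perf + cpu_budget_w
print(f"~{total_budget_w:.0f}W total for a 4K/30-capable console APU")
# ~105W total, comfortably inside a 150W console power envelope.
```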
We don't need aliens to come down and visit, or two 980 Tis to do 4K, or to shrink a $3,000 PC into an APU. A 150W GPU from 2014 can do it; Sony and MS missed the boat on performance and power.
A shitty AMD CPU can keep a game at 30-ish; people posting their 4790K rigs are missing the point. My shitty i7 930 from 2009 can do 60 fps. Many of the crappy A8s, A10s and low-end FX chips can hold 30 fps on the PC side, and all you need is a £240 GPU launched in 2014. Please stop with the silly rig examples.