I'd expect the PS4's APU to outperform the 750 Ti, since it has over twice the memory bandwidth and 4-8x more memory. In single-precision flops you're looking at 1840 vs. 1306 GFLOPS. The GTX 760 seems like a better comparison to me; it has a much higher TDP of 170 watts though.
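For what it's worth, those flops figures just fall out of shader count × clock × 2 (one fused multiply-add per shader per clock). The shader counts and clocks below are the commonly quoted specs for each part, so take them as assumptions rather than gospel:

```python
# Peak FP32 throughput = 2 ops (one FMA) x shader count x clock.
# Shader counts and clocks are the commonly quoted specs for each part (assumed).

def peak_gflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical single-precision throughput in GFLOPS."""
    return 2 * shaders * clock_mhz / 1000

print(f"PS4 APU    (1152 shaders @ 800 MHz):  {peak_gflops(1152, 800):.0f} GFLOPS")
print(f"GTX 750 Ti (640 cores   @ 1020 MHz):  {peak_gflops(640, 1020):.0f} GFLOPS")
# -> ~1843 and ~1306, matching the figures above
```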
Also, could you elaborate on how the GTX 970 is three times more powerful than the APU in the PS4?
You're right, the 750 Ti's bandwidth might cripple it; I didn't realise it was that low. I picked it because it's the first Maxwell GPU, and the 760 isn't as power efficient.
Comparing Nvidia and AMD teraflops is pointless btw, as is comparing teraflops across different architectures.
The HD 5870 was a 2.7 TFLOP GPU, yet the 7850 with 1.8 TF shits all over it; similarly, the Nvidia cards perform better despite lower TFLOP numbers on paper. It only makes sense to compare them within the same architecture.
I can't explain why (you'll have to ask Durante or something); that is just how it is.
The GTX 760 is a lot more powerful (a bigger difference than between PS4 and Xbox One; it's close to a 7970 GHz Edition, not a 7850).
As for the 970: it's supposed to be equivalent to the 780 according to leaked benchmarks (we'll know exactly in 8 days when it's officially "revealed").
And a 780 is 3x faster than an HD 7850 in benchmarks.
The GPU in the PS4 is a crippled 7850 (as in, they are almost identical spec- and architecture-wise, except apparently the PS4 version had the bits that allow for 'free' texture filtering cut out to save on die space).
Edit: looked it up and it's closer to a 2.5x difference (I went from memory earlier).
As always, the difference also depends on resolution. Higher-end GPUs with more memory bandwidth tend to scale better to very high resolutions: the difference between one GPU and a higher-end one might be 70 percent at 1080p, while at 1440p with MSAA the lower-end one might be memory-bandwidth bottlenecked and get crushed by the higher-end one. (The 70 percent is just a random number I picked to illustrate.)
In synthetic benchmarks like 3DMark the difference is 2.5x.
Anyhow, back to the point: Maxwell (980, 970, 750 Ti and any future cards) is way more power efficient than Kepler (the GTX 6xx cards, Titan, 760/770/780), and Kepler in turn was more power efficient than AMD's GCN, used in the AMD 7000 series and the PS4.
The 970 is supposed to have the same TDP as the 7850 while matching the performance of the GTX 780.
The only thing particularly power efficient about the PS4 is the CPU part of the APU. Jaguar cores consume little power (and run at low clock speeds) because they were originally designed for ultrabooks and tablets, which is also why they deliver ultrabook performance, not desktop PC performance.
An 80-watt i5 will shit on it from orbit performance-wise, though, and a 50-watt i3 will outperform it too.
The important thing to note with power usage is that the PS4 isn't going to represent substantial power savings over anything but the most grossly OP builds with triple SLI or something.
Compared to even a high-end build, you may be saving a dollar a month with a PS4, maybe. Maybe.
But compared to the Wii U, then yes, the savings are more substantial. That might be 2-3 dollars a month, and that matters.
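To put a rough, hedged number on that "dollar a month": the wattage gap, hours of play and electricity price below are all assumptions, just to show the arithmetic.

```python
# Back-of-the-envelope for the "dollar a month" claim. All inputs are assumptions:
# ~100 W extra at the wall for a high-end PC vs. a PS4 while gaming, ~3 hours of
# gaming a day, electricity at ~$0.13/kWh.

EXTRA_WATTS = 100        # assumed extra wall draw of the PC over the PS4
HOURS_PER_DAY = 3        # assumed gaming hours per day
PRICE_PER_KWH = 0.13     # assumed electricity price in $/kWh

kwh_per_month = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 30
print(f"~{kwh_per_month:.0f} kWh/month extra, about ${kwh_per_month * PRICE_PER_KWH:.2f}/month")
# -> ~9 kWh and ~$1.17/month with these numbers, i.e. "a dollar a month" territory
```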
Ehh, I think he has a point; he is overstating it, yes, and Maxwell will render it kind of moot even on the high end.
But a high-end AMD card will consume 250 watts, and if one makes the horrible mistake of buying an AMD FX 8-core CPU that's another 200 watts (whereas the high-end Intels only use about 80 watts peak).
In that way a higher-end PC does have a higher hidden cost than a midrange or low-end one, or a PS4.
But even then it's not going to add up to 50 bucks a year... (maybe if you play for 12 hours a day, 365 days a year).
You have to consider that TDP is not real power consumption; it's the theoretical maximum under 100 percent load for the full chip, and the only time you approach that is with something like a FurMark burn test.
Googling an example, I found that a midrange GTX 770 combined with a heavily overclocked i7, 4 sticks of 1.5 V RAM and a couple of hard drives drew about 220-240 watts peak at the socket (that is measured at the wall, so roughly 20 percent of that figure is power supply inefficiency on top of what the parts themselves draw).
The TDP listing for that system would be almost 400 watts.
So it's not nearly as power hungry as it looks (which is also why I scoff at people buying silly 700-watt power supplies for their midrange PCs; puhlease, people were running more power-hungry GTX 260s and Core 2 Quads on 350-watt power supplies back in the day, and their PCs did not explode).
So even an older midrange GPU + high-end CPU + 20-odd extra watts from those RAM sticks and an HDD RAID config doesn't amount to all that much.
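A quick sketch of that TDP-vs-actual-draw gap; the TDP values, PSU efficiency and wall figure below are assumed round numbers roughly matching that example system, not measurements of mine.

```python
# "TDP is not real power consumption": compare the on-paper TDP sum with what the
# parts actually pull, working back from a measured wall figure. All numbers assumed.

TDP_SUM_W = 230 + 130 + 30   # GTX 770 (~230 W) + overclocked i7 (~130 W) + RAM/drives (~30 W)
WALL_DRAW_W = 240            # measured peak at the power socket (includes PSU losses)
PSU_EFFICIENCY = 0.80        # assumed PSU efficiency

component_draw_w = WALL_DRAW_W * PSU_EFFICIENCY   # what the parts themselves are pulling
print(f"TDP sum:        {TDP_SUM_W} W")
print(f"Wall draw:      {WALL_DRAW_W} W")
print(f"Component draw: ~{component_draw_w:.0f} W (~{component_draw_w / TDP_SUM_W:.0%} of the TDP sum)")
# -> with these assumptions the parts pull roughly half of what the spec sheets add up to
```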
You'd literally have to play all day, every day to get to 50 bucks a year in extra power bills, even if you buy something a good deal more powerful than a PS4 (and again, Maxwell will negate the difference).
Still, power consumption is worth keeping in mind when buying anything.
E.g. I've seen people choose an AMD FX-8350 over a new i5 or i7 because it's 20 euros cheaper...
But that shitty AMD CPU is a ridonkulous power hog, especially when overclocked (almost 300 watts on its own when overclocked; not TDP, actual power consumption), and it also has a much higher idle power consumption (which is even more important).
Versus a 60-70 watt i5 or i7 that's a 230-watt difference or so on the CPU alone under load, and up to 50-odd watts at idle (which matters more than load consumption, since for many people the PC tends to be on but idle almost 24/7).
So over the 4-5 year lifetime of that CPU they're pissing away maybe 150-200 euros on power bills just to save 20 euros on the purchase price, for no actual performance gain.
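Rough numbers behind that 150-200 euro figure; the load/idle deltas are the ones above, but the hours per day, electricity price and lifetime are assumptions, so treat it as a sketch.

```python
# Back-of-the-envelope for the overclocked FX-8350 vs. i5/i7 power-bill gap.
# Wattage deltas are from the post above; hours, price and lifetime are assumptions.

LOAD_DELTA_W = 230        # extra CPU draw under load
IDLE_DELTA_W = 50         # extra draw at idle
LOAD_HOURS_PER_DAY = 2    # assumed hours of heavy load per day
IDLE_HOURS_PER_DAY = 10   # assumed hours on-but-idle per day
PRICE_PER_KWH = 0.12      # assumed electricity price in EUR/kWh

kwh_per_year = (LOAD_DELTA_W * LOAD_HOURS_PER_DAY
                + IDLE_DELTA_W * IDLE_HOURS_PER_DAY) / 1000 * 365
for years in (4, 5):
    cost = kwh_per_year * years * PRICE_PER_KWH
    print(f"{years} years: ~{kwh_per_year * years:.0f} kWh extra, ~EUR {cost:.0f}")
# -> roughly 350 kWh/year, i.e. around EUR 170-210 over 4-5 years with these numbers,
#    the same ballpark as the 150-200 euro figure above
```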