GTA V PS4: 1080@30, Core i3/750Ti: 1080@60. How is this possible?

I'm curious about API overhead. We all know that console overhead for things like an API is lower, but it doesn't really seem like we're seeing it here. Is the i3 that Digital Foundry tested with really that much faster than what the PS4 is working with?
That i3 clocks at 3.4 GHz and has 2 Haswell cores (and also runs the OS/background processes).
The PS4 has 6 cores (dedicated for games), each one a 1.6 GHz Jaguar.

A Haswell core is probably, clock for clock, about 80% faster to twice as fast as a Jaguar core. Of course this can vary massively depending on workload, but it's a decent rule of thumb. Using that rule, we can say that the i3 is a bit like 2 Jaguar cores clocked at ~6.4 GHz.

Now, if you simply sum that up you arrive at 2*6.4 = 12.8 "performance units" for the i3 and 6*1.6 = 9.6 "performance units" on PS4. However, parallel scaling is never completely linear in game workloads (though GTAV scales pretty damn well for a game), so you can assume that the actual advantage of the i3 setup is a bit higher. On the other hand, all this assumes that all OS/background workload (for which PS4 uses dedicated resources) is negligible on PC, so it balances out a bit -- though probably not fully.

If you divide 12.8 by 9.6 you get ~1.33. If you multiply 29 FPS (apparently the minimum seen on PS4 after the latest patch) by 1.33 you get ~39. The minimum on the PC is 42, which is a bit higher -- likely because fewer, faster cores are always preferable to more, slower ones -- but not off by too much.
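
For anyone who wants to poke at the numbers, here's the same arithmetic as a tiny Python sketch (the 6.4 GHz "Jaguar-equivalent" figure is just the rule of thumb from above, not a measurement):

```python
# Rule-of-thumb "performance units" -- all inputs are rough assumptions.
jaguar_equiv_ghz = 6.4            # one 3.4 GHz Haswell core ~ a 6.4 GHz Jaguar core
i3_units  = 2 * jaguar_equiv_ghz  # 12.8 units for the i3
ps4_units = 6 * 1.6               # 9.6 units for the PS4's game-visible cores

ratio = i3_units / ps4_units      # ~1.33
print(f"predicted PC minimum: ~{29 * ratio:.0f} fps")  # ~39, vs. the measured 42
```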

What we certainly see and what it all comes down to is that there is no magic thingamajig built into any given PC which prevents it from making full use of its available hardware when properly programmed!

DF please... Using that stupid benchmark for your tests. Also a flamebait PC thread by Alexandros, who would have thought.
Reporting facts and asking questions isn't flamebait. No, really, even if you dislike the facts or questions.
 
Sure you do, but you like playing dumb and innocent, not videogames.

This thread will degenerate into another childish PC vs. console argument in a little while, but I won't be around. Gonna continue my GTAV playthrough (on PC, for anyone interested).

If I had a dollar for every one of these, from the usual suspects, I could retire and live a life of luxury.
 
Here's your answer (besides the FPS fluctuating between 42 and 60fps on that config on PC):

Core i3 4130: $120
750 Ti 2GB: $130

That's $250 of your budget on just the CPU and GPU. You still need a motherboard, RAM, case, power supply, controller, WiFi (if not onboard), Bluetooth, and an OS, and that's not counting all the specific parts in the consoles' APUs.

So ... yeah.

I know it'll get lost on the PC Warrior "superior race" types, but in the end you get what you pay for, and even if consoles are far more optimized than your average PC, they're also all-in-one for only $399 (even less now, or pretty soon, I expect).

All in all, I think both consoles (more so the PS4, imo) did a pretty good job balancing cost against performance. Certainly a nudge more toward the CPU would have been nice, but at the price point they set for themselves (and, more specifically, that the -market- set for them), something else would have had to give.

The bolded is not true: the average framerate is 60 fps, so it goes well above that. Reading comprehension.

And then you try to derail the thread with the cost argument.
Which has been had many times, and every time someone points out that you CAN build a similarly specced PC for the same price, and that once you add in the console premium on game prices and online fees, the value proposition becomes even more lopsided.
At which point -- are you still reading this, elandyll, or have you already left to drive-by another thread? Because here comes the kicker -- like clockwork, the argument about console optimisation and coding to the metal is brought up to suggest that you can't compare the on-paper similar hardware if you want to actually play games at console settings.

Do you see how we have come full circle here?

Now I'm just waiting for you to move the goalposts to 'but the ps4 still has exclusives that the pc won't have'.
Because since you're already completely missing the point, you might as well go all the way, right?

All this endless goalpost moving, thread derailing, people getting offended that their new product is being criticized, etc. could be avoided if people did not feel the need to justify their purchases at all costs, and could just accept things the way they are.
 
If that's true, and we suspect that the game is mainly CPU-bound on PS4, and add in that the PC falls as low as 42fps, we get an answer.

I don't know where it could be dipping to 29 fps. I have the game, and it's one of the most stable 30 fps games I can remember.

All I know is that it does drop below 30. A bit of useful information for the people discussing the uncapped framerate of the PS4 version, when considering the averages.

[image: ps4framerate61uc7.jpg -- PS4 frame-rate capture]


I'm not sure if 1.09 is the latest patch or not though.
 
I hope some console players don't take this thread as an offense, but rather see it like this: if you are willing to spend $400 on a game console, you can build yourself a pretty decent gaming PC for the same price (and if you're willing to spend $50-$100 more, a very good gaming PC).
Without a Blu-ray player, of course.
 
... Surely that depends entirely on the settings?

Yes, but I am calling bullshit on the claim that the combo used by the OP is going to deliver that at PS4-equivalent settings. The benchmark is nonsense, just like it was in GTA4. Also, this is a 2GB card, and GTA5 is extremely memory-hungry.
 
I'm interested in having this explained in technical terms. This is clear from the OP. I want to know how a cheap dual-core CPU paired with an entry-level graphics card produces better results than an eight-core console CPU.

You don't really have an eight-core CPU; you have a six-core CPU for the game.
And by clock alone, the 2 i3 cores equal more than 4 of the PS4's cores.
Then add that an i3 core is much faster per clock than a Jaguar core, together with the fact that 2 cores will run closer to their peak than 6 cores, because it's much harder to share the workload equally between 6 cores than between 2.
And lastly, there's the fact that, according to people posting here, the PS4 version is rock solid.
So the fps is 30 at worst, compared to 42 on the PC.
That's only a 40% increase for the PC version instead of a 100% one.
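
As a toy illustration of that "harder to share the workload" point, here's a little Amdahl's-law sketch in Python (the serial fraction and the 2x-per-clock figure are made-up assumptions, just to show the direction of the effect):

```python
# Toy Amdahl's-law model: a fixed fraction of each frame runs on one thread,
# the rest scales across all available cores. All numbers are assumptions.
def cpu_throughput(cores, clock_ghz, per_clock, serial_frac=0.2):
    speedup = 1.0 / (serial_frac + (1.0 - serial_frac) / cores)
    return clock_ghz * per_clock * speedup

i3  = cpu_throughput(cores=2, clock_ghz=3.4, per_clock=2.0)  # Haswell ~2x Jaguar per clock
ps4 = cpu_throughput(cores=6, clock_ghz=1.6, per_clock=1.0)

print(f"linear sum only:   {(2 * 3.4 * 2) / (6 * 1.6) - 1:.0%} i3 advantage")  # ~42%
print(f"with serial chunk: {i3 / ps4 - 1:.0%} i3 advantage")                   # ~136%
```

GTAV reportedly scales across cores unusually well, which would explain why the real-world gap sits near the linear end of that range.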
 
Article says the AVERAGE frame rate on the 750ti is 60 FPS. Not the max.

Which is part of the reason those results don't mean all that much.
- 10s at 50fps and 10s at 70fps => average 60fps.

However, if you have a 60fps display then that becomes:
- 10s at 50fps and 10s at 60fps => average 55fps.

By allowing the game to render >60fps, the results don't mean what they appear to mean.

In truth, the average is probably nearer 50fps.
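
To make the capping effect concrete, here's the toy example above as a couple of lines of Python:

```python
# Two 10-second stretches: one below the 60 Hz cap, one above it.
uncapped = [50] * 10 + [70] * 10               # what the benchmark counts
capped   = [min(fps, 60) for fps in uncapped]  # what a 60 Hz display shows

print(sum(uncapped) / len(uncapped))  # 60.0 "average fps"
print(sum(capped) / len(capped))      # 55.0 actually delivered
```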
 
I'm not an expert, but Intel CPUs are not remotely comparable to AMD Jaguar CPUs. Although I don't think it's the only cause of the better performance, I'd guess the PC's CPU is about twice as powerful.

According to Sebbi on Beyond3D, a Haswell core is twice as powerful as a Jaguar core at the same clock speed.
 
Wow, I'm really impressed by those i3/750 Ti performance numbers in that video. Great job this time, Rockstar. It looks like this port was worth the delays.
 
A thread started by Alexandros about "weak consoles", followed by PC gamer circlejerking, "notebook CPU's" and stuff.

I honestly don't know what I expected.
This.

Why is the OP so confused about such a simple question? Or has he not seen the specs of PS4/XBO?

You want the answer, OP? Just check the specs of the PS4/XBO, especially their CPUs.
 
All I know is that it does drop below 30. A bit of useful information for the people discussing the uncapped framerate of the PS4 version, when considering the averages.

[image: ps4framerate61uc7.jpg -- PS4 frame-rate capture]


I'm not sure if 1.09 is the latest patch or not though.
Just to say: a person who works at DF replied a few posts ago to say that it's locked at 30 on PS4, from what he tested personally.
 
That i3 clocks at 3.4 GHz and has 2 Haswell cores (and also runs the OS/background processes).
The PS4 has 6 cores (dedicated for games), each one a 1.6 GHz Jaguar.

A Haswell core is probably, clock for clock, about 80% faster to twice as fast as a Jaguar core. Of course this can vary massively depending on workload, but it's a decent rule of thumb. Using that rule, we can say that the i3 is a bit like 2 Jaguar cores clocked at ~6.4 GHz.

Now, if you simply sum that up you arrive at 2*6.4 = 12.8 "performance units" for the i3 and 6*1.6 = 9.6 "performance units" on PS4. However, parallel scaling is never completely linear in game workloads (though GTAV scales pretty damn well for a game), so you can assume that the actual advantage of the i3 setup is a bit higher. On the other hand, all this assumes that all OS/background workload (for which PS4 uses dedicated resources) is negligible on PC, so it balances out a bit -- though probably not fully.

If you divide 12.8 by 9.6 you get ~1.33. If you multiply 29 FPS (apparently the minimum seen on PS4 after the latest patch) by 1.33 you get ~39. The minimum on the PC is 42, which is a bit higher -- likely because fewer, faster cores are always preferable to more, slower ones -- but not off by too much.

Excellent, thanks for this! I'm sure it's simplified to make it easier for us non-techies to understand, but it still gives me some insight into how that performance gap took shape.
 
Reporting facts and asking questions isn't flamebait. No, really, even if you dislike the facts or questions.

I own the PS3, PS4 and PC version, so I would say I am objective enough on the topic.
The only fact that I am pointing out is that the in-game benchmark is absolutely not realistic, so this thread is kind of stupid. Unless there is proof from DF that the combo (i3+750) delivers a locked 60 @ PS4 settings?
 
No, I honestly don't have the technical knowledge to adequately explain it. I'm a lawyer, not a programmer. That's why I addressed my question to TechGAF.

The i3 4130 runs at 3.4 GHz, has Hyper-Threading, and far more transistors. IIRC, a Haswell core is 15-20mm² at 22nm, while a Jaguar core is only ~3.1mm² at 28nm. If we assume (without hard evidence) that the game is bottlenecked by single-thread performance rather than total throughput, the gap makes a lot of sense.
 
I can't believe people are bothered by this..

If the game runs great on PC, better than on PS4, then great.
The game still runs great on the PS4, it's not like it's dropping frames left and right.

Besides, have you seen Bloodborne, Driveclub or The Order on PS4? Those games should put to rest any worries about the power of the PS4.
Haven't seen a PC game that looks as good as those games.

The PS4 is great, the PC is great, we're all great. Just choose the platform that has the games that you like, or if you're fucking rich get both.

(although I think buying a £300 PC is a huge, huge mistake.)
 
CPU trolling? Why are you baiting for an answer when you already know the answer?

You want a modern console box for 300 or 400 bucks that does a lot? You have to use an APU to make that financially viable these days. Current APUs sacrifice transistor real estate for the GPU portion of the chip, which entails the cost/benefit tradeoff of having a weaker CPU. Sony and MS know that this is the best cost/benefit trade-off, so they do it this way. They are also hoping GPGPU mitigates it somewhat.

They are trying to future proof the consoles with a graphical feature set akin to that found in true DX12 GPUs. The only way they could do that and stay within a mass market budget window of $300 to $400 retail was designing the APUs the way they did.

So much this. The PS4's framerate is a solid 30, unlike the variable framerate in the example given. Custom-built computers will surpass consoles, and eventually, given the iterative nature of annually released hardware, a computer can and will outperform a "similarly" priced console. This was, is, and always will be the case.
 
better and more expensive specs.


no shit, sherlock. building a pc with those cpu+gpu will need more than $399.


oh, sorry. you apparently have to scour the whole internet to find the cheapest deals possible, and do research to min/max your setup.
 
The bolded is not true, the average framerate is 60 fps, so it goes well above that, reading comprehension.

I quote from the article:
The situation changes slightly with an i3 4130 CPU. As expected, the move to a dual-core processor clocked at 3.4GHz is an issue for rendering city areas, and driving at pace through busy streets produces bigger dips to 50fps. Our 60fps read-out is steadied a little by paring back draw distances and population density to 20 per cent, but little else helps. Even dropping all settings to 'normal' only claws a few frames per second back for these stress points. Overall, it's not worth the sacrifice to the visuals, and left at high settings the 750 Ti still produces 60fps for the most part, if not the perfect lock we had hoped for.

On the plus side, the GTX 750 Ti fares much better when paired with a budget CPU than the rivalling AMD R9 280. Despite costing £30-40 more, the card is a write-off for 60fps performance at 1080p, even with all settings and sliders at their lowest. Left at high settings, spikes down to 35fps are common, again pointing to an issue with AMD cards when paired with weaker CPUs. Unlike the Nvidia 750 Ti, a 30fps lock is needed here when targeting 1080p and anything close to current-gen console settings.

Dips to 42fps with the Nvidia card, dips to 35fps with the AMD card (also pointing toward a problem with AMD gear, related to drivers or something else).

And then you try to derail the thread with the cost argument.
Which has been had many times, and every time someone points out that you CAN build a similarly specced PC for the same price, and that once you add in the console premium on game prices and online fees, the value proposition becomes even more lopsided.
At which point -- are you still reading this, elandyll, or have you already left to drive-by another thread? Because here comes the kicker -- like clockwork, the argument about console optimisation and coding to the metal is brought up to suggest that you can't compare the on-paper similar hardware if you want to actually play games at console settings.

Do you see how we have come full circle here?

Now I'm just waiting for you to move the goalposts to 'but the ps4 still has exclusives that the pc won't have'.
Because since you're already completely missing the point, you might as well go all the way, right?

All this endless goalpost moving, thread derailing, people getting offended that their new product is being criticized, etc. could be avoided if people did not feel the need to justify their purchases at all costs, and could just accept things the way they are.

Not sure if you're talking to yourself or to me, given that I'm mostly a PC gamer myself.
But when I read BS from the PC warrior "Superior Race" types, I feel like it's better to be honest about things (and certainly not to include things like "one-time promos" or "refurb/off-market parts").

As for cost being "derailing the thread": just NO. Cost will always be a factor in console tech, period. Or we might as well compare $5000 PCs to consoles.
Also, my dear "Sneaky" Stephan, when you say "you CAN build a similarly specced pc for the same price and that once you add in the console premium price on games and online fees the value proposition becomes even more lopsided", do you mean with -every- component included, and everything new and at retail price?
Otherwise, the one derailing is you, I'm afraid, as you're comparing apples to oranges.

But keep up the good fight dear Stephan, I'll continue enjoying both PC and console gaming and keep on shaking my head at some of the stuff I read.
 
Also, why is the 750 Ti the go-to budget GPU? For $120 I just don't see the value. An R9 260X is roughly the same (5% better/worse than the Ti in benchmarks) and can be had for $90. And the R9 270X can be had for $140, and it crushes that card.
 
better and more expensive specs.


no shit, sherlock. building a pc with those cpu+gpu will need more than $399.


oh, sorry. you apparently have to scour the whole internet to find the cheapest deals possible, and do research to min/max your setup.

Not only that, but a setup like this will be very outdated in a year or two, unlike the consoles.
If you buy a PC, you'd better spend more money on it.
 
Nobody should be shocked by any "Video game performs better on PC" thread.

It doesn't matter if they perform better. It doesn't matter if they perform better for a lower TCO. Games like GTA V exist because of consoles, and without consoles they would disappear or would change so much as to be unrecognizable. Compare the Google results for "bittorrent gta v" with "bittorrent bloodborne" from the perspective of a publisher who's thinking of investing hundreds of millions in a AAA project, and I think the conclusion is pretty straightforward.

Indie games, on the other hand, are in the healthy state they are in thanks to the PC ecosystem. Console gamers benefit from stuff like Don't Starve, Trine 2, and a lot of other titles, and from the breeding ground of new dev studios and game operations that just wouldn't be able to get a start going straight to console. So when THQ or Konami or whoever closes up, there are others ready to step into the void created in the market.

PC and console need each other. They're both bastions of the "Premium" gaming space that is so different from the Casual/Mobile/Free 2 Play offerings that are becoming so prevalent.
 
I'm interested in having this explained in technical terms. This is clear from the OP. I want to know how a cheap dual-core CPU paired with an entry-level graphics card produces better results than an eight-core console CPU.

Those Jaguar cores are really, really weak compared to Intel's Core iX CPUs.

A few Benchmarks as an example: http://www.anandtech.com/show/6974/amd-kabini-review/3

Single-threaded performance (normalized for clock speed) is about 2x as good on the Intel in these cases.

Just a small calculation (take it with a grain of salt):
Only 6 of the 8 cores are used for games on the PS4, and they run at 1.6 GHz: 6 * 1.6 = 9.6.
The 2 i3 cores run at 3.4 GHz and offer about 2x the performance per clock: 2 * 3.4 * 2 = 13.6.
That'd make the i3 more than 40% faster.

Of course you could still say that the API overhead on PC should be higher, but maybe it's just not a big factor in this specific game, or multithreaded scaling isn't perfect, or..


edit:
Sorry, I'm awfully late; Durante already did the math.
 
The benchmark seems to have a lower framerate than general gameplay from what I've seen in videos, but then that's really what benchmarks are about: stress testing.

Do you have the game? I do. The benchmark can be rock solid, but as soon as you venture into the countryside (in-game), your fps will go down the toilet. Guaranteed.

I am also pretty sure that 2gb cards will run into VRAM issues on PS4 equivalent settings.

But we could have saved ourselves this discussion if DF had done proper tests instead of using a benchmark that does not reflect real-world results. I haven't seen any respected PC sites use it, for exactly that reason. As I said, GTA4 was exactly the same in that regard: the benchmark ran much better than the game.
 
disappointing.gif

And I'm a pc gamer....

Sony and MS cheaped out this generation, and everyone loses.

except for the consumer,
well, of the PS4 mostly :P

Agree, but what can you do? The PS4 seems like a well-designed machine with good performance, without being an energy-consuming monster, or too expensive, or a huge, heavy thing.

It's about "balance" more than anything; unfortunately the CPU is a shame, but with how the market is going, you can't blame them.
 
If the latest patch on PS4 is 1.09, it still dips below 30.



You don't need to take my word for it. A video was posted earlier demonstrating it. That is, of course, if 1.09 is the performance-improvement patch.
It's not. It's 1.11 now.

Still, where did you encounter dips? Online, perhaps? I've only tested the story mode and didn't encounter a single drop over a couple of hours of play.
 
Also, why is the 750 Ti the go-to budget GPU? An R9 260X is roughly the same (5% better/worse than the Ti in benchmarks) and can be had for $90. And the R9 270X can be had for $140, and it crushes that card.
One reason could be that when you go with a "budget" GPU, you also often go with a "budget" CPU. As this benchmark (along with many others) shows, for CPU-heavy DX11 (and DX9) games (the vast majority of all releases), you are then much better off with NV hardware.

Of course you could still say that the API overhead on PC should be higher, but maybe it's just not a big factor in this specific game.
Or it's simply not as big as it is purported to be in general ;)
 
If you follow the performance thread, framerates are all over the place, especially at night with lighting hitting the grass, so I'm going to call b******* on that. Also, do we know what settings the PS4 and Xbox One are set at? And the consoles could definitely go over 30 in certain scenes; they are capped.

How's that setup doing out in the countryside with some heavy foliage?

People whose frame rates are going all over the place are using settings their systems cannot handle when those effects come into play. With High Resolution Shadows and/or Shadow Quality at Very High, you can be at 60 fps at noon in-game, when shadows are really short, but then have it tank to 30 fps at dusk, when shadows are really long (especially if Long Shadows are enabled). The settings that can go to ultra are also insanely taxing and can bring an otherwise powerful machine to its knees, things such as Ultra Grass and Post FX, but only when they're in a scene (as Deepo points out). The effect of draw distance and population/car density on frame rate will be highly dependent on time of day, since GTA V somewhat realistically populates the world based on the hour of day: you could be fine at night when no one's around, but then have your system crumble during rush hour.

Basically, the benchmark tool is actually really good in that respect: it tests every time of day setting. If you ever see your frame rate dip in the benchmark, even for a fraction of a second, you should lower some settings. Personally, I found it dipping usually around dusk and dawn where shadows were longest, thus I lowered some shadow details (I recommend turning down the Shadow Softness setting first, especially turning off the AMD/Nvidia special shadows) until I had a buttery smooth 60 fps even in the grassy sections of the benchmark and in grass-laden hills in-game.
 
This.

Why is the OP so confused about such a simple question? Or has he not seen the specs of PS4/XBO?

You want the answer, OP? Just check the specs of the PS4/XBO, especially their CPUs.

I think the point was that the consoles should be getting twice the performance out of their hardware compared to the PC equivalent.
 
They should have added 1-2 ARM A7 or A53 cores to the APU to handle the OS and background tasks. Reserving 2 of 8 cores, when you already have comparatively little CPU performance, seems excessive. These ARM cores have a very small footprint and are strong enough for all the stuff the PS4 might do in the background. AMD was probably just not ready for that yet.
 
It's not. Its 1.11 now.

Still, where did you encounter dips? Online perhaps? I've only tested the story mode and didn't encounter a single drop over a couple hours of play.

In the Digital Foundry analysis video demonstrating patch 1.09, which was the subject of a performance improvement.
There's a screen capture of it above.
 
I think the point was that the consoles should be getting twice the performance out of their hardware compared to the PC equivalent.

That idea is largely based on information and references from last generation, with its complex hardware setups and DX9-like overhead. Times have changed.
 
A 750 Ti is equivalent to an R9 270X, right? That's what I've got. My processor is also a Core i3 4130. You think I can manage 60?
 