Face-Off: Grand Theft Auto 5 on PC

The GTX 750 Ti and i3 combo also performs much closer to what I expected, which is very similar to the PS4's frame rate if it was uncapped. Whelp, I wonder how Alexandros feels about that other thread now.
 
Well looking at the vids and screen shots I'm glad I got the game on X1 & PS4. Really is good enough for me.
Nice to see a well optimised PC version and I'm sure I'll buy the game for a fourth time when it's on a Steam sale.
 
The GTX 750 Ti and i3 combo also performs much closer to what I expected, which is very similar to the PS4's frame rate if it was uncapped. Whelp, I wonder how Alexandros feels about that other thread now.
Either way, the i3/750Ti even matching the PS4 version is still highly impressive stuff considering it's budget PC hardware.
 
Get the PS4 version. Cheaper right now, and you'll probably get better performance as that GPU is pretty weak.

Thanks. About what I thought. Laptop is a few years old now and one thing I can't bear with PC gaming is having to turn down/off settings. Genuinely upsets me. It's the knowledge that I'm not getting the best experience; it niggles at the back of my brain. On PS4, even though I know it's looking/performing worse than on PC, I have no option to fiddle with settings, so I can just forget about it all, secure in the knowledge that what I'm playing is as good as it can be at that moment in time.
 
Either way, the i3/750Ti matching the PS4 version is still highly impressive stuff considering it's budget PC hardware.

Indeed. The fact that you can match consoles with such low end hardware is fantastic.
I believe the PS4 could do more but it does not matter in this case since the framerate will never be unlocked.

There is also a 4GB version of the 750 Ti.
 
Those using graphics cards with a 2GB memory buffer will be best off using the 'normal textures' with FXAA.
Worst suggestion ever.

I have a 2GB card and using very high with no problems at 60fps/1080p.

Seriously though, suggesting normal? Not even high? Normal looks worse than the last gen versions.
 
Bit of extra foliage noticed on PC at 1:10 and better AA
That better AA can make a pretty huge difference, though. The game at 1080p is rough looking in terms of aliasing. However, if you've got a decent PC, you can use downsampling quite effectively and it makes a world of difference when actually playing (rather than watching a YouTube vid):



There's a ton of shimmering during gameplay that doesn't show in a screenshot, and being able to clean that up is pretty huge.
 
Either way, the i3/750Ti even matching the PS4 version is still highly impressive stuff considering it's budget PC hardware.
Of course, but it will be interesting to see how things turn out in a few years. (I'm not hinting that the 750 Ti won't keep up, BTW. It's impossible to tell now.)
 
Of course, but it will be interesting to see how things turn out in a few years. (I'm not hinting that the 750 Ti won't keep up, BTW. It's impossible to tell now.)

I don't think it will keep up. Specs inflation is bound to happen and the hardware is not always at fault.
 
I don't think it will keep up. Specs inflation is bound to happen and the hardware is not always at fault.
Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU computing units effectively, though.
 
Another excellent showing for the Core i3/750Ti combo. It effortlessly matched the consoles at the same quality level and there's seemingly enough performance available to push some settings higher without compromising the 30fps frame rate.

So, in the video the i3/750Ti is not 60fps. How is this possible?

Still, with something like G-Sync/FreeSync it would be a nice improvement at ~40fps.
 
So, in the video the i3/750Ti is not 60fps. How is this possible?

Still, with something like G-Sync/FreeSync it would be a nice improvement at ~40fps.
 
A really good show for PC owners; they get the best version of the game, as they should with more powerful HW.

I think it reflects better on Rockstar that even though it's a cross-gen game, every version so far has been an impressive showpiece. The 360 and PS3 versions were impressive for how far they pushed the consoles at that stage of their life in particular.
 
Here are the PS4/XBO settings as per DF

Textures = Very High
Post-FX = Ultra

Some textures on PC Very High are higher quality than on X1/PS4. I doubt it's down to AF either, because it's the same case when viewed head on.

Post-FX is not on ultra either, because DoF on PC has noticeably finer detail.
You can see it in some scenes in this video which is a great comparison.
 
This game is a good example of how good a PC version of a console game can turn out when properly optimized. Especially considering that it isn't even using DX12, it's amazing how well performance scales on PC hardware.
 
Even more impressive is the fact it's really an 18 month old game, made on technology released in 2005/2006. Imagine what a ground-up, next gen GTA VI could look like, using PS4/Xbone as a starting point and then utilizing PC for maximum visual fidelity.

Sure, they updated a lot of the lighting and textures for the next gen versions, but in the end they're just prettied up last gen versions. Rockstar is perhaps my favorite developer today, the one company whose games I will buy day one, sight unseen, because I know they put their all into what they give. Can't wait to see what they do next. I only wish they didn't focus so much on Online and had released single player DLC as well. 18 months since the last gen versions, 5+ months since the current gen versions came out.

Either way, Rockstar seems like they proved their critics wrong. So many expected the PC port to be shoddy, when it's honestly one of the best ever done.
 
Worst suggestion ever.

I have a 2GB card and using very high with no problems at 60fps/1080p.

Seriously though, suggesting normal? Not even high? Normal looks worse than the last gen versions.

I guess DF, just like some people over at the performance thread, take the Rockstar graphics settings memory bar as an absolute fact and not what it actually is: an estimation.

Been playing for ~28 hours on a 2GB 670 with Very High textures without a single stutter; the in-game bar shows I'm ~300MB over the limit, while the highest VRAM usage MSI Afterburner actually reported in game was ~1900MB.

Good AA is completely integral to this game's visuals. So many artefacts without it.

FXAA is in no way adequate.

FXAA will have to do for now, shimmering and all.

60FPS > jaggies
 
Absolutely awesome job by R*! I hardly ever see major games release where the benchmarks actually list my HD 7870. I still doubt I could run it with a slightly overclocked Core 2 Quad Q9400.

So, in the video the i3/750Ti is not 60fps. How is this possible?

lolol
 
Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU computing units effectively, though.

What makes you think they aren't using the CPU effectively? Are you aware that a launch title has near max CPU utilization?
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/

I think better use of compute is what will give the console some legs, and not its pitiful CPU cores.

I wonder what PC will be required to match or outclass consoles with DX12. I suspect a modest Core i5 will obliterate the 7 cores featured on Xbox One/PS4 easily enough. On the GPU side I'm not quite sure, but it's safe to say it won't take much.
I see why DX12 is such an anticipated API for both publishers and ISVs, it lowers the entry barrier for their PC versions.
 
Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU computing units effectively, though.

And they probably will be using dx12 and compute on pc.
So it's not as clear-cut as it was last gen.
 
What makes you think they aren't using the CPU effectively? Are you aware that a launch title has near max CPU utilization?
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/

I think better use of compute is what will give the console some legs, and not its pitiful CPU cores.

I wonder what PC will be required to match or outclass consoles with DX12. I suspect a modest Core i5 will obliterate the 7 cores featured on Xbox One/PS4 easily enough. On the GPU side I'm not quite sure, but it's safe to say it won't take much.
I see why DX12 is such an anticipated API for both publishers and ISVs, it lowers the entry barrier for their PC versions.
When I said use the CPU cores effectively, I meant writing code that is built around the strengths and weaknesses of those cores. Of course devs won't get much from those weak cores, but in closed platforms every bit of power counts. And I agree regarding compute. It should really benefit the performance of consoles.

And they probably will be using dx12 and compute on pc.
So it's not as clear-cut as it was last gen.
Possibly. Like I said, as of right now, it's hard to tell.
 
When I said use the CPU cores effectively, I meant writing code that is built around the strengths and weaknesses of those cores. Of course devs won't get much from those cores, but in closed platforms every bit of power counts.
I see no reason to assume they aren't already. Unless the Jaguar cores are some sort of elusive technology, a paradigm shift in the history of CPUs.

And I agree regarding compute. It should really benefit the performance of consoles.
And PC as well. DX12 supports all the latest GPU compute improvements unlike DirectCompute.
It's about time, really.

By the way, GPU compute has been used on PC versions of multiplatform games since 2009. It's not new at all in the PC gaming world.
 
I see no reason to assume they aren't already. Unless the Jaguar cores are some sort of elusive technology, a paradigm shift in the history of CPUs.


And PC as well. DX12 supports all the latest GPU compute improvements unlike DirectCompute.
It's about time, really.

By the way, GPU compute has been used on PC versions of multiplatform games since 2009. It's not new at all in the PC gaming world.
To be clear, it's quite obvious that GPU compute will be highly beneficial to PCs and consoles alike. But only time will tell if devs can fully utilize the compute units to create amazing looking games on consoles, which they can easily brute force on PC due to the much better CPUs.
 
Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU computing units effectively, though.

Frostbite already uses ACEs, just like Infamous and The Tomorrow Children. If Battlefront uses DX12 on PC we will have a good measure for the upcoming years.
 
To be clear, it's quite obvious that GPU compute will be highly beneficial to PCs and consoles alike. But only time will tell if devs can fully utilize the compute units to create amazing looking games on consoles, which they can easily brute force on PC due to the much better CPUs.

They already are using GPU compute for post-processing, like it's the case on PC as well. What the hell are you talking about?

They use compute for the same things on PC and consoles; see AC Unity, COD Advanced Warfare, Ryse, etc.

Your post made it sound like they would be using GPU compute on consoles and not on PC where they would be using the CPUs, that makes no sense. Compute is used across all platforms for the same things in multiplatform games.
Obviously, DX11's compute is less efficient because it's very old (2009) and has not been updated (DX11.1/DX11.2 are for graphics).

Much to the dismay of AMD/Nvidia, which have improved their architectures over time.
 
"Nearly the same" is pushing it a lot, but there's no surprise that the PC shots don't look 5x better than the consoles, as its hardware would suggest. DF themselves said it: the game was made around consoles, the PC stuff is merely (delicious) frosting.
 
Pretty spectacular difference there and no doubt there are other spots where similar disparities can be observed.


Playing this at 60fps is godly, by the way. I can't get it locked 100.00% of the time unfortunately, but it's absolutely right that high framerates make a world of difference in this game. I wish everyone could experience it, maybe when the PS5/Xbox 4 remaster hits the shelves.
Come on Kezen, the console versions are more than fine at 30fps with mostly very high settings. This is not Street Fighter at 30fps; I don't think I'd rebuy GTA5 on PS5 just for 60fps.

I'm glad that the 750Ti runs it fine for those who have that, great job from Rockstar, but these comparisons make little sense since the console versions are capped/locked and we can't know for sure how high they run (PS4 in particular). I'm looking at certain scenes where there are lots of explosions, especially during firefights, and it seems this is where the 750Ti is stressed most; it never really goes below 30fps in DF's video but it's mostly in the 30-33fps range in those types of scenes. I imagine the PS4's better GPU is running these scenes much higher than the 750Ti, but we will never have concrete evidence.

If these comparisons are to make sense, devs should offer an unlocked mode in all games on consoles so we can see how well they run. It would be interesting to get a look at Dying Light and Alien: Isolation unlocked against the 750Ti, amongst many others. Till then these comparisons against the consoles are worthless outside of the improvements made a year later.
 
The GTX 750 Ti and i3 combo also performs much closer to what I expected, which is very similar to the PS4's frame rate if it was uncapped. Whelp, I wonder how Alexandros feels about that other thread now.

Pretty good I imagine, considering he set out to show, yet again, that the incredibly stupid meme of 2x performance from coding to the METAL!!! bs console gamers love to spew is exactly that: bs.
 
Frostbite already uses ACEs just like Infamous and The Tomorrow Children. If Battlefront uses DX12 in PC we will have a good measure for the upcoming years.
Yeah, DX12 should bring some nice changes.

They already are using GPU compute for post-processing, like it's the case on PC as well. What the hell are you talking about?

They use compute for the same things on PC and consoles; see AC Unity, COD Advanced Warfare, Ryse, etc.
I hope you're not suggesting that compute won't get better in the upcoming years. What I meant was simply that exclusive games on consoles, especially the PS4, could be doing some incredible things with compute that multiplatforms don't.
 
Pretty good I imagine, considering he set out to show, yet again, that the incredibly stupid meme of 2x performance from coding to the METAL!!! bs console gamers love to spew is exactly that: bs.

you really think the PS4 version of GTAV was optimized or "coded to the metal"?

How about you glance over at DriveClub or The Order or Infamous: SS to see what happens when you actually optimize and code "to the metal".

best looking games of this generation belong on the PS4 atm.
 
Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU computing units effectively, though.

By that time DX12 and OpenGL Next will be available on PC though. Optimizations done on Jaguar will carry over to PC architecture. In fact, there will be much more performance there to play with, should developers choose to utilize it.
 
Pretty good I imagine, considering he set out to show, yet again that the incredibly stupid meme of 2 times performance coding to the METAL!!! bs console gamers love to spew is exactly that: bs.
If you think console gamers on NeoGAF who know about tech actually believe this bullshit, you should seek help.

By that time DX12 and OpenGL Next will be available on PC though. Optimizations done on Jaguar will carry over to PC architecture. In fact, there will be much more performance there to play with, should developers choose to utilize it.
No one's denying that. What's your point?
 
you really think the PS4 version of GTAV was optimized or "coded to the metal"?

How about you glance over at DriveClub or The Order or Infamous: SS to see what happens when you actually optimize and code "to the metal".

best looking games of this generation belong on the PS4 atm.

You mean 900p corridors everywhere and of course, plenty of cinematic blur, with AI from 2005?

ok.

DriveClub does look good though, but it's not something that can't be done on similar PC GPU hardware.
 
If you think console gamers on NeoGAF who know about tech actually believe this bullshit, you should seek help.

Of course they aren't tech savvy, or they wouldn't have said it. That doesn't change the fact that a LOT of console gamers on GAF keep on saying it.

No one's denying that. What's your point?

You implied that this wouldn't be the case. Or did I misread your statement?
 
Wasn't it John Carmack that specifically said that though? Go send him an angry email or something.

Carmack was talking about very specific situations (DX9, CPU overhead). Console gamers love to take that tweet completely out of context. Kind of like you're doing. Either because they are ignorant, or because they are trolling.

Anytime Alexandros or anyone else shows actual data that disproves this notion, console GAF immediately starts the salt train, gets all indignant about the "elitist" PC gamers and starts hurling ad hominems at whoever they feel is an easy target.

Kind of sick and tired of that crap.
 
Of course they aren't tech savvy, or they wouldn't have said it. That doesn't change the fact that a LOT of console gamers on GAF keep on saying it.



You implied that this wouldn't be the case. Or did I misread your statement?

You're generalizing the hell out of console gamers. I see people repeat the 2x console quote sometimes on GAF, but not nearly as much as you imply. That's like me saying /r/pcmasterrace is an accurate representation of PC gamers on GAF.
 
Carmack was talking about very specific situations (DX9, CPU overhead). Console gamers love to take that tweet completely out of context.
I think this kind of thing USED to be the norm when we were talking about specialized hardware for gaming versus a general purpose PC. We've reached a point where consoles ARE just PCs and PC hardware options are much stronger than ever.

Those days are pretty much over, it seems.
 
Can the generalizations and strawmen, for fuck's sake.
They are not strawmen. The "2x" myth is alive and well, or at least it was a few scant months ago.

The fact that every single data point, including this one, shows it to be wrong is something that can't be reiterated often enough.
 
Carmack was talking about very specific situations (DX9, CPU overhead). Console gamers love to take that tweet completely out of context. Kind of like you're doing. Either because they are ignorant, or because they are trolling.

He also said that focusing on a single spec is important in this:

Developers can't pour all their resources into a single PC setup; meanwhile, every console is the same.
 
Come on Kezen, the console versions are more than fine at 30fps with mostly very high settings, this is not street fighter at 30fps, I don't think I'd rebuy GTA5 on PS5 just for 60fps.
Maybe others would; I see the thirst for 60fps on consoles is strong, rightly so. I played GTA 5 on PS4 and while it looked great, the low framerate didn't allow me to fully enjoy the game. I don't consider myself a 60fps snob, but I have the intellectual honesty to acknowledge that 60fps truly brings action games to life.

I'm glad that the 750Ti runs it fine for those who have that, great job from Rockstar, but these comparisons make little sense since the console versions are capped/locked and we can't know for sure how high they run (PS4 in particular). I'm looking at certain scenes where there are lots of explosions, especially during firefights, and it seems this is where the 750Ti is stressed most; it never really goes below 30fps in DF's video but it's mostly in the 30-33fps range in those types of scenes. I imagine the PS4's better GPU is running these scenes much higher than the 750Ti, but we will never have concrete evidence.
Which is why it's hardly worth discussing; what matters is what we have now: a modest combo performing similarly to a console environment. That is, even modest PCs can have a good enough experience. I don't think 750Ti owners value 60fps all that much and they'll be pretty happy to get the game running as well as it does.
Ultimately, there is little doubt developers will hit the ceiling on the 750Ti sooner than on the consoles (both of them, actually) if we are looking at paper specs. I was never, ever, expecting a 750Ti to trade blows with either console. Job well fucking done.
I believe a budget AMD card (R7 265) paired with a recent (Sandy/Ivy/Haswell) Core i5 stands a decent chance as well.

If these comparisons are to make sense, devs should offer an unlocked mode in all games on consoles so we can see how well they run. It would be interesting to get a look at Dying Light and Alien: Isolation unlocked against the 750Ti, amongst many others. Till then these comparisons against the consoles are worthless outside of the improvements made a year later.
I'm sure owners of budget PC cards are very interested in those comparisons. What they want to know is what kind of hardware is necessary for a console-like experience, not whether X or Y game pushed either console to its limits.
I have the feeling you don't really like those comparisons because you don't like the results. However much you may criticize them, the fact is it does not take anywhere near high-end PC hardware to match the consoles.

Ah, I remember the 2012-2013 predictions threads, and I was one of those who believed the consoles would outshine PCs for at least a whole year. That is absolutely not what happened and I'm glad.

I hope you're not suggesting that compute won't get better in the upcoming years. What I meant was simply that exclusive games on consoles, especially the PS4, could be doing some incredible things with compute that multiplatforms don't.
PC versions will benefit from that as well with DX12. It won't be exclusive to consoles.
I expect DX12 to significantly reduce the efficiency gap between PC and consoles.
 
With the minuscule difference (60fps and higher IQ is barely noticeable) in the big games and Steam now charging a subscription fee, the reasons to game on PC are shrinking. Is it going to be 2006 all over again?
 
Besides the framerate I see no dramatic differences.
Nope, but 60fps, less shimmering, less visible jaggies, higher res (if possible), etc. are pretty dramatic differences in general, I think.
Then there is more small stuff that is less noticeable maybe, but it will all add up to more immersion.
 