Get the PS4 version. Cheaper right now, and you'll probably get better performance as that GPU is pretty weak.
Other than 60 FPS they look almost the same.
Either way, the i3/750Ti matching the PS4 version is still highly impressive stuff considering it's budget PC hardware.
Those using graphics cards with a 2GB memory buffer will be best off using the 'normal textures' with FXAA.
"Bit of extra foliage noticed on PC at 1:10 and better AA"

That better AA can make a pretty huge difference, though. The game at 1080p is rough-looking in terms of aliasing. However, if you've got a decent PC, you can use downsampling quite effectively, and it makes a world of difference when actually playing (rather than watching through a YouTube vid).
Of course, but it will be interesting to see how things turn out in a few years. (I'm not hinting that the 750 Ti won't keep up, BTW. It's impossible to tell now.)
"I don't think it will keep up. Specs inflation is bound to happen, and the hardware is not always at fault."

Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU compute units effectively, though.
Another excellent showing for the Core i3/750Ti combo. It effortlessly matched the consoles at the same quality level, and there's seemingly enough performance available to push some settings higher without compromising the 30fps refresh rate.
So, in the video the i3/750Ti is not 60fps. How is this possible?
Still, with something like G-Sync/FreeSync it would be a nice improvement at ~40fps.
Here are the PS4/XBO settings as per DF:
Textures = Very High
Post-FX = Ultra
Worst suggestion ever.
I have a 2GB card and I'm using Very High with no problems at 60fps/1080p.
Seriously though, suggesting Normal? Not even High? Normal looks worse than the last-gen versions.
Good AA is completely integral to this game's visuals. So many artefacts without it.
FXAA is in no way adequate.
"What makes you think they aren't using the CPU effectively? Are you aware that a launch title has near max CPU utilization?
http://bf4central.com/2013/11/battlefield-4-uses-95-cpu-power-found-ps4-xbox-one/"

When I said use the CPU cores effectively, I meant writing code that is built around the strengths and weaknesses of those cores. Of course devs won't get much from those weak cores, but in closed platforms every bit of power counts. And I agree regarding compute: it should really benefit the performance of consoles.

I think better use of compute is what will give the console some legs, not its pitiful CPU cores.
I wonder what PC will be required to match or outclass the consoles with DX12. I suspect a modest Core i5 will obliterate the seven cores available on Xbox One/PS4 easily enough. On the GPU side I'm not quite sure, but it's safe to say it won't take much.
I see why DX12 is such an anticipated API for both publishers and ISVs: it lowers the entry barrier for their PC versions.
Possibly. Like I said, as of right now, it's hard to tell. And they probably will be using DX12 and compute on PC.
So it's not a given like it was in previous generations.
I see no reason to assume they aren't already. Unless the Jaguar cores are some sort of elusive technology, a paradigm shift in the history of CPUs.
And PC as well. DX12 supports all the latest GPU compute improvements unlike DirectCompute.
It's about time, really.
By the way, GPU compute has been used on PC versions of multiplatform games since 2009. It's not new at all in the PC gaming world.
To be clear, it's quite obvious that GPU compute will be highly beneficial to PCs and consoles alike. But only time will tell if devs can fully utilize the compute units to create amazing-looking games on consoles, which they can easily brute-force on PC due to the much better CPUs.
"Pretty spectacular difference there, and no doubt there are other spots where similar disparities can be observed."

Come on Kezen, the console versions are more than fine at 30fps with mostly Very High settings; this is not Street Fighter at 30fps. I don't think I'd rebuy GTA5 on PS5 just for 60fps.
Playing this at 60fps is godly, by the way. I can't get it locked 100% of the time, unfortunately, but it's absolutely right that high framerates make a world of difference in this game. I wish everyone could experience it, maybe when the PS5/Xbox 4 remaster hits the shelves.
The GTX 750 Ti and i3 combo also performs much closer to what I expected, which is very similar to what the PS4's frame rate would be if it were uncapped. Whelp, I wonder how Alexandros feels about that other thread now.
Yeah, DX12 should bring some nice changes. Frostbite already uses ACEs, just like Infamous and The Tomorrow Children. If Battlefront uses DX12 on PC, we will have a good benchmark for the upcoming years.
"They already are using GPU compute for post-processing, like it's the case on PC as well. What the hell are you talking about?"

I hope you're not suggesting that compute won't get better in the upcoming years. What I meant was simply that exclusive games on consoles, especially the PS4, could be doing some incredible things with compute that multiplatform games don't.
They use compute for the same things on PC and consoles; see AC Unity, COD Advanced Warfare, Ryse, etc.
So, in the video the i3/750Ti is not 60fps. How is this possible?
Pretty good, I imagine, considering he set out to show, yet again, that the incredibly stupid "2x performance from coding to the METAL!!!" meme console gamers love to spew is exactly that: BS.
Agreed, but right now, it's very hard to tell. I think the PS4 will be ahead when devs start using the Jaguar cores and GPU computing units effectively, though.
Glorious post.
Can the generalizations and strawmen, for fuck's sake.
"By that time DX12 and OpenGL Next will be available on PC, though. Optimizations done on Jaguar will carry over to PC architecture. In fact, there will be much more performance there to play with, should developers choose to utilize it."

No one's denying that. What's your point?
You really think the PS4 version of GTAV was optimized or "coded to the metal"?
How about you glance over at DriveClub, The Order or Infamous: SS to see what happens when you actually optimize and code "to the metal"?
The best-looking games of this generation belong on the PS4 atm.
If you think console gamers on NeoGAF who know about tech actually believe this bullshit, you should seek help.
Wasn't it John Carmack that specifically said that though? Go send him an angry email or something.
Of course they aren't tech-savvy, or they wouldn't have said it. That doesn't change the fact that a LOT of console gamers on GAF keep on saying it.
You implied that this wouldn't be the case. Or did I misread your statement?
I think this kind of thing USED to be the norm when we were talking about specialized hardware for gaming versus a general-purpose PC. We've reached a point where consoles ARE just PCs, and PC hardware options are much stronger than ever.
They are not strawmen. The "2x" myth is alive and well, or at least it was a few scant months ago.
Carmack was talking about very specific situations (DX9, CPU overhead). Console gamers love to take that tweet completely out of context. Kind of like you're doing. Either because they are ignorant, or because they are trolling.
Maybe others would; I see the thirst for 60fps on consoles is strong, and rightly so. I played GTA 5 on PS4 and while it looked great, the low framerate didn't allow me to fully enjoy the game. I don't consider myself a 60fps snob, but I have the intellectual honesty to acknowledge that 60fps can truly bring action games to life.
"I'm glad that the 750 Ti runs it fine for those who have it, great job from Rockstar, but these comparisons make little sense since the console versions are capped/locked and we can't know for sure how high they run (the PS4 in particular). I'm looking at certain scenes where there are lots of explosions, especially during firefights, and it seems this is where the 750 Ti is stressed most; it never really goes below 30fps in DF's video, but it's mostly in the 30-33fps range in those types of scenes. I imagine the PS4's better GPU is running these scenes much higher than the 750 Ti, but we will never have concrete evidence."

Which is why it's hardly worth discussing; what matters is what we have now: a modest combo performing similarly to a console environment. That is, even modest PCs can have a good enough experience. I don't think 750 Ti owners value 60fps all that much, and they'll be pretty happy to get the game running as well as it does.
"If these comparisons are to make sense, devs should offer an unlocked mode in all games on consoles so we can see how well they run. It would be interesting to get a look at Dying Light and Alien: Isolation unlocked against the 750 Ti, amongst many others. Till then, these comparisons against the consoles are worthless outside of the improvements made a year later."

I'm sure owners of budget PC cards are very interested in those comparisons. What they want to know is what kind of hardware is necessary for a console-like experience, not whether X or Y game pushed either console to its limits.
PC versions will benefit from that as well with DX12. It won't be exclusive to consoles.
"Besides the framerate I see no dramatic differences."

Nope, but 60fps, less shimmering, less visible jaggies, higher res (if possible), etc. are pretty dramatic differences in general, I think.
With the minuscule difference on the big games (60fps and higher IQ are barely noticeable) and Steam now charging a subscription fee, the reasons to game on PC are shrinking. Is it going to be 2006 all over again?