GTA V PS4: 1080@30, Core i3/750Ti: 1080@60. How is this possible?

Just a few I could think of:

General technological stagnation. Not enough improvements in how games are made. Games continue to increase in size, but older tool chains can't support the weight, and they can't improve much because the hardware can't handle the tech improvements. Devs spend more time and money squeezing blood from the stone, and less time on the actual game.

Playerbase loses interest in the product, due to the platforms becoming old hat or too commonplace. Consumer interest moves on to newer, more exciting frontiers. This happened a bit last gen when some people got tired of PS3/360 being old and busted, so they moved to PC.

Some aspects of hardware could actually begin to cost manufacturers more to produce than newer alternatives. See stuff like DDR3/GDDR5 vs future DDR4 improvements and upcoming stacked/HBM chips.

Marketing deals and stuff start to dry up because those with the money won't see any potential growth in the platform they are being asked to invest in.

There are also some reasons why they can't move to a PS5/Xbox Two yet.

Releasing a big platform is a huge risk and investment. Sony is in the best position to release a PS5, but why would they? They're selling well, they're on top right now, only a few of their games have been released, with more to come, and the system doesn't even have its first ND game yet!

And that's another thing: developing games is taking longer, fewer "true" next-generation games are released, and more HD ports are made for fans. This is because new games become too much of a risk, and sometimes they are even delayed.

People getting bored of the tech isn't always true either; the 360 got its best years thanks to Kinect, and the system got a second life, even if a short one. New systems take a long time to develop and market, which is why the 360 and PS3 weren't replaced earlier.

Now, Microsoft and Nintendo have more of a reason to upgrade. MS struggled with the One at first, but they are aggressive enough to stay competitive, and the Xbox brand is still important to them.

Nintendo has the best reason to upgrade: the Wii U was a disaster and sales are poor. For a time people were actually demanding that they drop their system, but they didn't. They can't abandon a system that was a huge project and release another, more powerful console, because that would simply take years to develop. The Wii U is going to have a "normal" console cycle.

The PS4 is just getting started. I'm not sure a short cycle is even realistic anymore; if anything, the era of short dev cycles could be over.
 
Indeed.
Even the i3 is already below the minimum requirements for Witcher 3. I believe the 750Ti is comparable to the 660, which is the minimum GPU. I wonder how the combo will fare in Witcher 3. 1080p/30fps on high settings is my guess, about the same performance as the PS4.
Crap... you are right, I didn't notice.

http://www.pcgamer.com/the-witcher-3-system-requirements-announced/

Well, that won't stop PC benchmarkers from modding the restriction out (à la FC4) to test.
 
I'm new to PC tech, so I admit I'm pretty ignorant about this sort of thing. But looking at benchmarks for the 700 and 900 series, I'm wondering how consoles will even be able to run future games like Elder Scrolls 6, which I'm guessing would give even the 970 a run for its money at 1080p/60fps. I know console games are highly optimized, and I know they'll find a way, but it's hard not to imagine them running on fumes at that point.

Well, there's always the option of lowering the resolution. Consoles can continue on for years that way, at 900p or 800p, etc.
 
I'm new to PC tech, so I admit I'm pretty ignorant about this sort of thing. But looking at benchmarks for the 700 and 900 series, I'm wondering how consoles will even be able to run future games like Elder Scrolls 6, which I'm guessing would give even the 970 a run for its money at 1080p/60fps. I know console games are highly optimized, and I know they'll find a way, but it's hard not to imagine them running on fumes at that point.
Don't worry. Multiplat games are often tied to the most profitable hardware rather than the most powerful one. So rather than a TES6 that stresses even high-end PCs, we'll more likely get a game designed to run on consoles first and foremost and then ported to PC. Naturally, PC players will play the game at higher settings/resolution/fps, but multiplats themselves will be designed for consoles first in almost all cases. Alien Isolation would be one exception to this rule.
 
Indeed.
Even the i3 is already below the minimum requirements for Witcher 3. I believe the 750Ti is comparable to the 660, which is the minimum GPU. I wonder how the combo will fare in Witcher 3. 1080p/30fps on high settings is my guess, about the same performance as the PS4.

Below the 2500k, but is it below the Phenom X4? Legit question, I have no idea.
 
After the last gen lasting 8 years, we got a roughly normal 8x increase in GPU power, but a less-than-normal increase in CPU power. CPU and GPU tech has been stagnating to some extent over the last few years; AMD CPUs have been practically DOA half the time compared to Intel's offerings. 2011 Intel CPUs are outperforming 2013-14 AMD CPUs, which is depressing as fuck. But what's also depressing is that 2011 Intel CPUs are competitive with 2014 Intel CPUs too, with new chips getting <10% yearly gains in performance.

At 10% annual gains, five years of tech progression gives us... a 1.6x increase in power compared to 5 years ago. Now, it's possible Skylake will be a meaningful jump, and that whatever AMD comes out with next will finally make some real gains. But it's also possible that we'll have CPUs five years down the line that are merely 2-2.5x more powerful for a given TDP.

In the GPU space things are less depressing, yet still a far cry from the glory days of GPU progress, when we were getting crazy increases every year or two. To get an 8-fold increase in power over an 8-year period (the last generation's length), we need 30% annual performance gains. And suffice it to say, we haven't been getting that. If we generously assume 25% annual gains, after five years on the market the PS4 could be replaced by a new console that is roughly... 3x more powerful.
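For anyone who wants to sanity-check those figures, here's a minimal sketch (assuming a constant annual gain, which is of course a simplification):

```python
# Back-of-the-envelope check of the compound-growth figures above.
# Assumes a constant annual performance gain, which is a simplification.

def total_gain(annual_gain: float, years: int) -> float:
    """Cumulative speedup after `years` of compounding `annual_gain`."""
    return (1 + annual_gain) ** years

print(f"10%/yr over 5 years: {total_gain(0.10, 5):.2f}x")  # ~1.61x
print(f"25%/yr over 5 years: {total_gain(0.25, 5):.2f}x")  # ~3.05x
print(f"30%/yr over 8 years: {total_gain(0.30, 8):.2f}x")  # ~8.16x
```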

Oh dear.

It's time to leave silicon-based transistors behind.
For now we have another 2-3 iterations of real gains ahead of us (at least for GPUs; RIP the CPU market with zero competition :() but after 10nm, silicon is gonna hit a wall.

We have the AMD 390 series to look forward to, then a 16nm successor (probably a refresh) late next year alongside Pascal, plus huge gains in memory bandwidth.

Then after that, Volta, maybe on 14 or 10nm, and whatever is up next for AMD, but then...

Short term, things are good in GPU land (well, the awful fucking price gouging aside :\) but in as little as 3-4 years we NEED something new.
Band-aids like FinFET (and future iterations) will only go so far. Silicon is a dead end, and we are so close we can feel its breath on our faces.

It's kind of funny that the sheer size of the computer chip industry is what's causing stagnation on a technological level.
They don't want to move on from silicon lithography because of the massive infrastructure investment worldwide in all those wafer plants... so they're going to suck every last drop out of it before moving on and before some exponential progress can happen again.

I wonder how different things could be if it were a smaller industry. Think of all the R&D money wasted on squeezing blood out of a stone that could have gone into new technology paths instead, say, materials that can operate at hundreds of gigahertz, instead of trying to force-shrink silicon, which is capped at a much lower clock speed, leaky as shit, and very power inefficient. (And then also think of the massive environmental impact of all those technologically backward (:p) data centers worldwide, while the industry behind them is too focused on short-term profit to move on.)
From a long-term view, the continued effort poured into silicon transistors is a classic case of the sunk cost fallacy by corporations too invested in short-term revenue.

It's kind of similar to how NAND flash is being treated, with ever more complicated memory controllers, multi-bit-per-cell memory, etc., basically pouring water into a wicker basket when resistive RAM and phase-change memory are just around the corner and will make NAND look like the archaic tech it is. When CDs became a thing, who cared about efforts to make a floppy store 20 percent more data or read 20 percent faster? No one.
 
They also bring out Javin98 and his tired catchphrases.
Catchphrases? What catchphrases? I know I used this in the other thread as well, but both these threads are very similar so it seems logical to use it, no?

Edit: Just to clarify, I was referring to the minority of PC gamers in this thread. Most of them make good points.
 
Crap... you are right, I didn't notice.

http://www.pcgamer.com/the-witcher-3-system-requirements-announced/

Well, that won't stop PC benchmarkers from modding the restriction out (à la FC4) to test.

The i3 isn't that bad, actually; while it's dual core, it has 4 threads.

Below the 2500k, but is it below the Phenom X4? Legit question, I have no idea.

According to this GTA V benchmark, the Phenom X4 is indeed below the 2500k (light green vs. dark green). So I guess it's safe to assume that the i3 is still in the minimum-requirements zone.
[Image: GTA V CPU benchmark chart]

However, it's interesting to see that the 750Ti is actually in the yellow zone. I guess it's not that accurate, then.
 
More importantly, this thread has been going on for a while, but no one except one guy has made a comparison between PC high settings and the PS4 version? I doubt the 750 Ti can run this at much higher frame rates. To clarify, I'm not some salty console warrior; I'm genuinely curious about this.
 
I'm mildly surprised that alexandros was juniored for this thread, but considering all the thread whining I'm extremely surprised that his was the only punishment dished out.

Hah, very true! After the complaining started I went to bed fully expecting to wake up banned.

Edit: Oh, now I understand what you meant. Mods make those calls, it is what it is.

Alexandros, man... was it really worth your thread-making privileges just for an opportunity to troll console gamers?

In a sense, although the intent was never to troll anyone. I honestly don't know why the performance disparity exists. It didn't make sense to me how those PC parts could outmatch fixed console hardware that also has the benefit of lower overhead. I would expect them to match PS4 performance but not beat it. Alien Isolation produced the same bizarre result, but that was explained by the fact that Creative Assembly is a PC developer. But Rockstar Games? Very weird.

I got some reasonable answers to my question that helped explain things to me and others so yeah, in that sense it was worth it. I knew my thread would attract platform warriors. Comparison threads always do, no matter the title or the perceived intent. I did my best to just state the facts in my title and post, I avoided making inflammatory remarks about platform power levels and I just asked a question. I don't see how I could have asked that question differently and it seems to me the answer is supposed to be "don't ask it at all because some people don't like it".
 
I'm new to PC tech, so I admit I'm pretty ignorant about this sort of thing. But looking at benchmarks for the 700 and 900 series, I'm wondering how consoles will even be able to run future games like Elder Scrolls 6, which I'm guessing would give even the 970 a run for its money at 1080p/60fps. I know console games are highly optimized, and I know they'll find a way, but it's hard not to imagine them running on fumes at that point.

So, games aren't made to be optimized for the highest-end environment; they are made for wherever the most money can be made (the PS4 right now). In two-ish years, the PS4 will be holding back the PC game market just like the entire last gen did for 4+ years.
 
I'm new to PC tech, so I admit I'm pretty ignorant about this sort of thing. But looking at benchmarks for the 700 and 900 series, I'm wondering how consoles will even be able to run future games like Elder Scrolls 6, which I'm guessing would give even the 970 a run for its money at 1080p/60fps. I know console games are highly optimized, and I know they'll find a way, but it's hard not to imagine them running on fumes at that point.

Any good game engine can scale up or down a LOT.

Crysis still gives a modern PC a run for its money, but it scales down incredibly well. It will run on a potato, and it will run well.
On the lowest settings (a bunch of stuff like POM, depth of field, motion blur, etc. is disabled) and with low textures, it will run on a potato PC like an old Intel dual core (not even a Core 2 Duo, the thing before it) and a GPU from 2004.
Hell, if you really wanted to, you could run Crysis on a Pentium 4 and a Radeon 9800 Pro (I had a PC like that, bought in 2003).

They even made it run on consoles... all it takes is low enough settings (though in Crysis's case it also meant drops to 10 fps and having to cut entire levels out of the game :p)

So to answer your question: if Elder Scrolls 6 is a graphically fancy game for high-end PCs when it comes out, then on PS4 and Xbox One it will simply run at much lower draw distance/effects/shadow quality etc. to retain a playable framerate.
It'll be the same game, just with lower-precision effects, a worse draw distance, and so on.

Though tbh, it is very possible that a lot of future multiplats (and even current exclusives) are cutting features, mechanics, and ideas due to performance limitations (especially on the CPU side).
It happens all the time. Hell, it happened in Battlefield 3, where they cut the realtime GI they initially showcased for Frostbite and just precalculated it during the loading screen instead.

Edit: this is what Frostbite was originally supposed to be:
https://www.youtube.com/watch?v=O8730SR1POk
Fancy dynamic lighting; in the end, all Frostbite games have had 90 percent static baked lighting and no day/night cycles.
If they hadn't had to cut down their engine (i.e. if the hardware were better), Battlefield 3 would surely have had maps with day and night cycles.

So maybe, possibly, Elder Scrolls won't be the game it could have been, if the CPU was less shit.

Edit: normally multiplat games are also held back by lower-end PC hardware, but at this point the Jaguar CPU in the PS4/Xbox One is SO weak that it is well below what is considered low-end even for entry-level gaming PCs. And that is DESPITE 4 years of stagnation in the PC CPU industry... imagine if AMD had been able to compete and the i3 were a quad core (which everyone back in 2011 assumed it would be by now) and the i5 were equivalent to the Intel Extreme 8-cores.
 
It's time to leave silicon-based transistors behind.
For now we have another 2-3 iterations of real gains ahead of us (at least for GPUs; RIP the CPU market with zero competition :() but after 10nm, silicon is gonna hit a wall.

We have the AMD 390 series to look forward to, then a 16nm successor (probably a refresh) late next year alongside Pascal, plus huge gains in memory bandwidth.

Then after that, Volta, maybe on 14 or 10nm, and whatever is up next for AMD, but then...

Short term, things are good in GPU land (well, the awful fucking price gouging aside :\) but in as little as 3-4 years we NEED something new.
Band-aids like FinFET (and future iterations) will only go so far. Silicon is a dead end, and we are so close we can feel its breath on our faces.

It's kind of funny that the sheer size of the computer chip industry is what's causing stagnation on a technological level.
They don't want to move on from silicon lithography because of the massive infrastructure investment worldwide in all those wafer plants... so they're going to suck every last drop out of it before moving on and before some exponential progress can happen again.

I wonder how different things could be if it were a smaller industry. Think of all the R&D money wasted on squeezing blood out of a stone that could have gone into new technology paths instead, say, materials that can operate at hundreds of gigahertz, instead of trying to force-shrink silicon, which is capped at a much lower clock speed, leaky as shit, and very power inefficient. (And then also think of the massive environmental impact of all those technologically backward (:p) data centers worldwide, while the industry behind them is too focused on short-term profit to move on.)
From a long-term view, the continued effort poured into silicon transistors is a classic case of the sunk cost fallacy by corporations too invested in short-term revenue.

It's kind of similar to how NAND flash is being treated, with ever more complicated memory controllers, multi-bit-per-cell memory, etc., basically pouring water into a wicker basket when resistive RAM and phase-change memory are just around the corner and will make NAND look like the archaic tech it is. When CDs became a thing, who cared about efforts to make a floppy store 20 percent more data or read 20 percent faster? No one.

That's true about silicon losing steam, but it's also true for any competing technology. Transistor scaling has a definite end point: you can't, for example, make something smaller than 1 atom, and the reality is that you need at least 10-20 atoms to transmit electrical signals.
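To put rough numbers on that, here's an illustrative sketch using silicon's lattice constant of ~0.543 nm (note that modern node names are marketing labels and don't map directly onto physical feature sizes):

```python
# Rough atom-scale arithmetic for silicon feature sizes.
# Assumes silicon's lattice constant of ~0.543 nm; real process node
# names (e.g. "10nm") no longer correspond to a physical feature size.

SI_LATTICE_NM = 0.543  # silicon lattice constant, in nanometres

for feature_nm in (22, 14, 10, 7, 5):
    cells = feature_nm / SI_LATTICE_NM
    print(f"{feature_nm:>2} nm feature ~ {cells:4.1f} lattice cells across")
```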
 
I avoided making inflammatory remarks about platform power levels and I just asked a question. I don't see how I could have asked that question differently and it seems to me the answer is supposed to be "don't ask it at all because some people don't like it".
This may work for those who don't know your post history....

So, games aren't made to be optimized for the highest-end environment; they are made for wherever the most money can be made (the PS4 right now). In two-ish years, the PS4 will be holding back the PC game market just like the entire last gen did for 4+ years.
You know, there is a bit of light at the end of the tunnel, though. When DX12 arrives, PCs will gain lower-level hardware access and the ability to issue vastly more draw calls, just like on consoles, so it creates a parity there (just not in power) that will allow developers to do more with their multiplatform games. This could have benefits for everyone.
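To illustrate the draw-call point with made-up numbers (the per-call overheads below are assumptions for the sake of the example, not measured driver figures):

```python
# Illustrative arithmetic: how per-draw-call driver overhead limits
# the number of draw calls per frame. Costs are assumed, not measured
# from any real driver.

FRAME_BUDGET_MS = 16.7  # one frame at 60 fps

for api, per_call_us in (("high-overhead API", 50), ("low-level API", 5)):
    max_calls = (FRAME_BUDGET_MS * 1000) / per_call_us
    print(f"{api}: ~{max_calls:,.0f} draw calls/frame at 60 fps")
```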
 
That's true about silicon losing steam, but it's also true for any competing technology. Transistor scaling has a definite end point: you can't, for example, make something smaller than 1 atom, and the reality is that you need at least 10-20 atoms to transmit electrical signals.

Nope, but as I said, there are other materials that will allow MUCH higher clock frequencies.
There are still exponential gains to be made performance-wise from sheer clock speed increases, which means you don't need to fit ever more transistors on a die (no need to shrink).

Other materials will also have better leakage properties and will be able to shrink below what is possible with silicon (though, as you said, they'll also hit a wall soon after).

But the main thing for the medium-term future is clock speed.
(I know people are cynical about that at the moment, mainly because Intel looked stupid with their 10GHz claims in the Pentium 4 days, but that was only because of the limitations of silicon as a semiconductor.)

If I can make an analogy, we are only in the Iron Age as far as computer technology goes.
We work with silicon because, like iron, it's an easy material to manipulate and work with, but as time moves on we will learn to work with better materials that don't share the same limitations.

Whether it's graphene, carbon nanotubes, or whatever other sci-fi-sounding technology ends up replacing silicon, it'll be a massive leap.
Who knows, maybe the next transistors (or the ones after that) won't even be semiconductor-based at all.
 
I'm mildly surprised that alexandros was juniored for this thread, but considering all the thread whining I'm extremely surprised that his was the only punishment dished out.
You can say that again. The amount of deliberate thread shitting in here is pretty staggering.
 
I think they should hurry up and make a few more gigs of memory and some more CPU power available for games. 3 gigs and 2 cores dedicated to the OS is just ridiculous. What is the OS doing that is so much more demanding than the PS3 OS, which used 20MB or something?
 
I wonder how different things could be if it were a smaller industry. Think of all the R&D money wasted on squeezing blood out of a stone that could have gone into new technology paths instead, say, materials that can operate at hundreds of gigahertz
So... silicon? And especially silicon alloys like SiGe (which have been demoed switching at almost 800GHz)? Heck, you can buy an 85GHz SiGe BJT for like 70 cents on Digikey.

There's an astronomical difference between the switching speed of a discrete transistor and the clock speed of a complex integrated circuit, both because of process considerations and because transistors in a processor have to smoothly propagate logic down long chains of gates in between clock edges (and thus the raw switching speed must tremendously exceed the clock speed).
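A toy illustration of that gap, with made-up but plausible numbers (both the per-gate delay and the logic depth here are assumptions, not figures from a real process):

```python
# Toy illustration: why achievable clock speed sits far below raw
# transistor switching speed. Numbers are illustrative only.

GATE_DELAY_PS = 10  # assumed per-gate propagation delay (picoseconds)
LOGIC_DEPTH = 25    # assumed number of gates on the critical path

critical_path_ps = GATE_DELAY_PS * LOGIC_DEPTH
max_clock_ghz = 1000 / critical_path_ps  # 1000 ps per ns; GHz = 1/ns

# A lone gate switching in 10 ps corresponds to ~100 GHz on its own,
# yet rippling through 25 of them each cycle caps the clock at ~4 GHz.
print(f"Critical path: {critical_path_ps} ps -> max clock ~{max_clock_ghz:.0f} GHz")
```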

Even ignoring that, alternatives are incredibly immature in VLSI.
As an example, none of the high-speed graphene transistors that people rave about have a bandgap, which is a huge problem for digital circuits. No graphene transistor in the world had a bandgap until a breakthrough last year. It has yet to be seen whether it's possible/practical to make a graphene transistor with the necessary properties that also outperforms silicon in the relevant ways.

There *is* research going into potential alternatives, it's simply too early (and there's still enough improvement happening in silicon) to ditch silicon process development, particularly given how competitive and lucrative the market is.
 
I think they should hurry up and make a few more gigs of memory and some more CPU power available for games. 3 gigs and 2 cores dedicated to the OS is just ridiculous. What is the OS doing that is so much more demanding than the PS3 OS, which used 20MB or something?

Running the game? Doesn't seem too outlandish to me. Do you remember how great the OS was on the PS3? Me neither.
 
Loving the anticipated Witcher 3 comparisons/PC concern trolling. The minimum CPU requirement on PC is a Phenom II X4 940?

[Image: CPU benchmark chart]


The i3 destroys it. The only way the i3 doesn't work is if the developer intentionally blocks it. The i3/750 Ti will keep up with or surpass the consoles for the generation, barring developers intentionally refusing to support the hardware. I almost bought a Phenom II to save some cash when I got my i5. Thank god I didn't.
 
I think they should hurry up and make a few more gigs of memory and some more CPU power available for games. 3 gigs and 2 cores dedicated to the OS is just ridiculous. What is the OS doing that is so much more demanding than the PS3 OS, which used 20MB or something?

Texture quality doesn't seem to be an issue even with 5 gigs, so releasing more would likely just increase loading times to transfer more assets from HDD to RAM. Maybe they will in the future, but there's no immediate performance gain from giving devs extra RAM.
 
I'm new to PC tech, so I admit I'm pretty ignorant about this sort of thing. But looking at benchmarks for the 700 and 900 series, I'm wondering how consoles will even be able to run future games like Elder Scrolls 6, which I'm guessing would give even the 970 a run for its money at 1080p/60fps. I know console games are highly optimized, and I know they'll find a way, but it's hard not to imagine them running on fumes at that point.
I can almost guarantee that by the end of this generation we'll be back to 720p and sub-30fps.
 
More importantly, this thread has been going on for a while, but no one except one guy has made a comparison between PC high settings and the PS4 version? I doubt the 750 Ti can run this at much higher frame rates. To clarify, I'm not some salty console warrior; I'm genuinely curious about this.
Post a few screenshots from the PS4 version for us to compare to high and other PC settings. I just tried playing on high and it doesn't seem that far off from what I've seen on the consoles.

I'd say that DF's high settings estimate isn't that far off.
 
The i3/750 Ti will keep up with or surpass the consoles for the generation, barring developers intentionally refusing to support the hardware.
I don't think it will work out that way.

The i3/750Ti certainly keeps up with a fair number of *multiplatform* titles on consoles right now, but as cross-gen trickles out and devs get better acquainted with the console hardware (both XB1 and PS4), I suspect that, at least GPU-wise, this budget PC pairing will fall behind. The PS4 can leverage a large amount of GDDR5, and even the XB1's inferior memory setup is competitive with what's in the 750Ti. Both consoles already have first-party titles that more than likely couldn't be done on that sort of budget PC at equivalent settings/performance, and it's only a matter of time before multiplatform games catch up.

I think going forward, the i3/750Ti will fare well in certain CPU-heavy multiplatform games (which will be few and far between on consoles), but it will probably be surpassed in terms of visuals overall, particularly by the PS4.
 
There are also some reasons why they can't move to a PS5/Xbox Two yet.

Releasing a big platform is a huge risk and investment. Sony is in the best position to release a PS5, but why would they? They're selling well, they're on top right now, only a few of their games have been released, with more to come, and the system doesn't even have its first ND game yet!

And that's another thing: developing games is taking longer, fewer "true" next-generation games are released, and more HD ports are made for fans. This is because new games become too much of a risk, and sometimes they are even delayed.

People getting bored of the tech isn't always true either; the 360 got its best years thanks to Kinect, and the system got a second life, even if a short one. New systems take a long time to develop and market, which is why the 360 and PS3 weren't replaced earlier.

Now, Microsoft and Nintendo have more of a reason to upgrade. MS struggled with the One at first, but they are aggressive enough to stay competitive, and the Xbox brand is still important to them.

Nintendo has the best reason to upgrade: the Wii U was a disaster and sales are poor. For a time people were actually demanding that they drop their system, but they didn't. They can't abandon a system that was a huge project and release another, more powerful console, because that would simply take years to develop. The Wii U is going to have a "normal" console cycle.

The PS4 is just getting started. I'm not sure a short cycle is even realistic anymore; if anything, the era of short dev cycles could be over.

You're thinking too short term. I was speaking in relation to the '5-year cycle', which we are already 18 months into, and giving reasons why 7-8 year cycles are not a good idea. I think I thought of some more.

For one, I think you may be forgetting about publishers and their role in the cycle. They have a decent amount of clout when dealing with the big three (just ask Nintendo). They have to keep one-upping themselves every year to suck people into their hype machines. What generates a hype machine after a gen's cycle goes on for the normal time period? New consoles. What puts a damper on a hype machine? A stagnating generation ("It's the same thing as last year."). Their customers feel the same way. Look at how many people on GAF have decried cross-gen games for the past year and are now happy to see them disappearing. How do you think they'll feel in a couple of years? Sure, the games may ultimately be the same, but they're on new boxes, man, that's totally different! HYPE! Sony and Microsoft have to sell their own games for many of the same reasons.

Nintendo is stuck with the Wii U, which is reason enough for 5 years, as you mentioned.

The PS4 won't be selling gangbusters indefinitely, which I covered. Even the Wii was basically done (in more ways than one, for many reasons) by 2010.

Kinect did not give new life to the 360. It brought nothing to the table in terms of solid, 'core' games support. The camera was shackled to the 360, its design brimming with compromises due to the 360's age and the limitations of the tech in general. It sold a lot of units, but it was simply a casual-focused life-support system sold on false promises that customers eventually saw through (see: Kinect for Xbox One).

And, as previously brought up in this thread, unlike their predecessors, the One and PS4 are already outpaced by alternative hardware like the 750Ti, a card that came out for $150 a year ago. Its mobile analogue, the very same 860M found in the favorably console-comparable (and in a smaller form factor to boot!) Alienware Alpha, launched at that exact same time.

The big takeaway is this: both the industry and we gamers just went through/are going through this stuff right now. Again. It's the 8th gen; we've done this before. "Time is a flat circle". Only this time the merry-go-round appears to be running at a faster pace.

OT: a better CPU plus Maxwell's architectural bandwidth optimizations and good drivers. GTA 5 being a cross-gen title probably helps, but many current-gen titles seem to perform similarly when benchmarked.
 
The i3 destroys it. The only way the i3 doesn't work is if the developer intentionally blocks it. The i3/750 Ti will keep up with or surpass the consoles for the generation, barring developers intentionally refusing to support the hardware. I almost bought a Phenom II to save some cash when I got my i5. Thank god I didn't.

When I was setting up my PC build, I tried everything to justify an AMD CPU but was never able to. There's a small increase in highly threaded performance per dollar, but it's very small and gets eaten up by the extra PSU/cooling costs.
 
I find it hard to believe this setup gives a smooth experience. I am having so many problems with my i7 and 780, not just at 1080p but at 768p too.

GTA is such a huge game that I think you will never get a perfect experience.
 
More importantly, this thread has been going on for a while, but no one except one guy has made a comparison between PC high settings and the PS4 version? I doubt the 750 Ti can run this at much higher frame rates. To clarify, I'm not some salty console warrior; I'm genuinely curious about this.

I'll try and do a comparison using the settings closest to the PS4's.
My GTX 460 is a bit shit but it's fine as long as I don't go crazy with the sliders.

As for performance, there's no doubt in my mind that a 750 Ti can deliver the performance DF claims, just going by my experience with the 1GB GTX 460.

I find it hard to believe this setup gives a smooth experience. I am having so many problems with my i7 and 780, not just at 1080p but at 768p too.

GTA is such a huge game that I think you will never get a perfect experience.

Check out videos on YouTube too.
There are plenty of people demonstrating the performance with an overlay.
 
Crap... you are right, I didn't notice.

http://www.pcgamer.com/the-witcher-3-system-requirements-announced/

Well, that won't stop PC benchmarkers from modding the restriction out (à la FC4) to test.

This is not Ubisoft; there's a 99% chance there will be no restrictions on what hardware (unless it's absolutely ancient) will be able to launch Witcher 3.

Far Cry 4 and COD: AW are the two games I can remember that had some sort of restriction. AW needed 6GB of RAM, but I think they patched that out themselves. FC4 needed a four-threaded CPU, and modders patched that out.

Just because the i3 is not listed in the min spec does not mean that you can't have a playable experience.
 
Post a few screenshots from the PS4 version for us to compare to high and other PC settings. I just tried playing on high and it doesn't seem that far off from what I've seen on the consoles.

I'd say that DF's high settings estimate isn't that far off.
Dude, I don't know how to tell you this, but I don't have a PS4..... yet :P
On a more serious note, some of the settings may be on high while others are on very high in the PS4 version, IMO.

I'll try and do a comparison using the settings closest to the PS4's.
My GTX 460 is a bit shit but it's fine as long as I don't go crazy with the sliders.

As for performance, there's no doubt in my mind that a 750 Ti can deliver the performance DF claims, just going by my experience with the 1GB GTX 460.
It was you who did it earlier, right? I appreciate it; it's contributing far more than the fruitless debates in this thread. Anyway, I don't doubt the GTX 750 Ti can pull off this performance. I just doubt the PS4 version is fully on high.
 
When favourably compared with its rivals, mistress PS4 is Mark Cerny's engineering masterpiece.

When unfavourably compared with a PC, it's those AMD hacks holding it back.
 
The PS4 has a weak-ass netbook processor and a low-to-mid-range GPU from 2012.

They really cut corners this time around.

Can't believe some people still write this... Another first-reply fail.

The same goes for all the other 14-year-old-style replies that have little to no knowledge behind them but act as if they were in the know.

Anyway, there are a lot of reasons, and it's a blend of hardware, software (the RAGE engine), quality of the port, optimization, development time, etc.
In the end, this is not a game that can be used as a comparison between a specific PC setup and the consoles, and I think the real good news is that a good game like GTA V performs pretty well on PC.
 
Can't believe some people still write this... Another first-reply fail.

The same goes for all the other 14-year-old-style replies that have little to no knowledge behind them but act as if they were in the know.

Anyway, there are a lot of reasons, and it's a blend of hardware, software (the RAGE engine), quality of the port, optimization, development time, etc.
In the end, this is not a game that can be used as a comparison between a specific PC setup and the consoles, and I think the real good news is that a good game like GTA V performs pretty well on PC.

Which games can be, then? Because basically every multiplatform port has shown results like this.

What is so different and incomparable? Likewise, your claim about this game not scaling is kind of silly. CPU and GPU tests show this game scales quite linearly with hardware, core count, and frequency.
 
Can't believe some people still write this... Another first-reply fail.

The same goes for all the other 14-year-old-style replies that have little to no knowledge behind them but act as if they were in the know.

Nothing he wrote was wrong though. Please tell me exactly where he was wrong in his statement.
 
Didn't read the article. Does it take into consideration controller, chassis, Blu-ray, etc. costs?
Can't you just read the article and the thread, or is that too much to ask these days?
"The GTX 750 Ti paired with the i3-3410 is a real success, for example - a combo that forms part of a circa-£300 PC, where 1080p60 at high settings is very doable."
There you go. They used 'circa'.
 
I'll try and do a comparison using the settings closest to the PS4's.
My GTX 460 is a bit shit but it's fine as long as I don't go crazy with the sliders.

As for performance, there's no doubt in my mind that a 750 Ti can deliver the performance DF claims, just going by my experience with the 1GB GTX 460.

The 460 is a beast; I can't believe it's still being used for games today. Cleaned my old one out and it runs great again. The card will live forever, just with my younger sibling now, after my 970 upgrade! :D
 
Can't believe some people still write this... Another first-reply fail.

The same goes for all the other 14-year-old-style replies that have little to no knowledge behind them but act as if they were in the know.

Anyway, there are a lot of reasons, and it's a blend of hardware, software (the RAGE engine), quality of the port, optimization, development time, etc.
In the end, this is not a game that can be used as a comparison between a specific PC setup and the consoles, and I think the real good news is that a good game like GTA V performs pretty well on PC.

Every multiplatform game performs better on a PC with those specs. What is your problem? If The Order or Bloodborne had been released on PC, they would ALSO perform better on a PC with those specs. Is it that hard to understand?
 