Shimmering is an aliasing issue, but it's not caused by AA; it's caused by a lack of AA. Post-processing AA solutions are usually pretty helpless at eliminating it, so how bad the shimmering is just depends on how well-sampled the source image is... in this case, the PS4 version was constructed from roughly 40% more samples than the XB1 version, so details in the scene are more precisely captured and there's less erroneous shimmering.
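The "40% more samples" claim is just the ratio of pixel counts between the two framebuffers. A quick sketch of that arithmetic, assuming 1920x1080 for PS4 and 1600x900 for XB1 (the exact XB1 resolution is an assumption here, which is why the figure is approximate):

```python
# Rough sample-count comparison between the two versions.
# The XB1 resolution below is an assumption (1600x900).
ps4_samples = 1920 * 1080   # samples (pixels) per frame on PS4
xb1_samples = 1600 * 900    # samples per frame on XB1 (assumed 900p)

ratio = ps4_samples / xb1_samples
print(f"PS4 renders {ratio:.2f}x the samples, i.e. {(ratio - 1) * 100:.0f}% more")
```

With these assumed resolutions the ratio comes out slightly above 40%; a slightly taller XB1 framebuffer (e.g. 912 lines) would land closer to the quoted figure.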
The first is less forgiving but has a better atmosphere, imo. Last Light was really good as well, but it plays more like modern shooters, much more forgiving.
From the video they showed, the console versions do not have any of the volumetric lighting the PC version has, or the tessellation. Hope that stays in the PC version!
I really like that it maintains a flat 60, they deserve much praise for that.
The resolution difference (~900p vs 1080p) became very noticeable to me after switching between the Destiny beta on the X1/PS4. If you only own the Xbox, it's something where you really struggle to spot the difference because you don't have a frame of reference for the clarity the 1080p version of the game offers.
It's subjective, and I still don't find the difference that huge a deal, but it is something you definitely spot if you can see the two side by side, and I think everyone would obviously prefer the 1080p version. That's why I'm going to get the PS version over the X1 (most likely), despite having purchased games like AC for X1 instead of PS4.
Yeah, I'm sure you're right. I'm gonna buy it anyway though; I watched the Gamersyde gameplay and it looks amazing! =) And two such immersive games running like that for 50 dollars is a given.
Keep in mind there will be "Spartan" and "Survival" modes in both Metro:LL and 2033 Redux. "Spartan" plays more like original Last Light, "Survival" plays more like original 2033.
Not a fan of the controller gimmicks. I didn't like how they used it in Thief either. It's more distracting than immersive. Glad they achieved 60 fps on both though.
How can you tell that PS4 has new dev tools with different default settings judging off two games (one of which IMO looks great, and for the other DF say the PS4 version's display is closer to what they were aiming for), but you can't tell when you're looking at crushed blacks and an increased sharpening filter?
I don't mean crushed blacks and sharpening filters.
Again: when I started playing TLOU R I immediately noticed that the picture balance (contrast/brightness/gamma/colours) was different to all other PS4 games I'd played to that point. All the other games (Killzone, inFamous, Wolfenstein, AC, TR DE) had a similar picture and looked really nice and sharp. They looked right.
But TLOU R was different: colours looked slightly desaturated, the contrast-to-gamma ratio was very weird, and the game wasn't as crisp as inFamous or TR DE (and that one even uses FXAA = additional blur).
Now with the Redux media I see exactly the same thing. It looks slightly soft for a 1080p output, colours look desaturated, the contrast/gamma ratio isn't optimal... that's why I said it's like Sony released a new dev tool update with a different default picture setting for new PS4 games.
Maybe it's just the compression, we will see, but the picture balance looks damn close to TLOU R, and I was not a fan of that colours/contrast/gamma mix.
Paired with an i7, though, which is leaps and bounds better than the PowerPC cores of the 360/PS3.
It's obvious you won't achieve similar results with exactly the same specs as a console; "more" is necessary, but I'm interested to know how much.
It would be great if we could get precise information about the graphical presets used on PS4/XBO and try to run the game as close to that as possible on PC.
For some reason I suspect 4A Games will not shed light on this. The screenshots in the DF gallery aren't flattering; it's definitely not Very High, even without tessellation.
I would guess at High/Very High without any tessellation. Metro LL scales quite well on single GPUs.
Who cares, as long as they play well? To be honest, if you're that worried about fps and resolutions you would get yourself a monster PC, because in reality the PS4 and X1 can both barely hold a full 60fps at 1080p on the games that do... and yes, I included the X1 because it even has a few that do.
FYI I own both systems so I have no preference, but the worry over 900p vs 1080p should be shifted to the gameplay, the story, the replayability, microtransactions, etc.
It's 60fps on a 1.84 TFLOPS GPU; there is only so much magic they can make. I would not be disappointed. 60fps is worth it and extremely commendable.
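The 1.84 TFLOPS figure can be sanity-checked from the commonly cited PS4 GPU layout. The CU count and clock below are assumptions from public specs, not from this thread:

```python
# Back-of-the-envelope check of the "1.84 TFLOPS" figure using the
# commonly cited PS4 GPU layout (assumed, not from this thread).
compute_units = 18        # GCN compute units
alus_per_cu = 64          # shader ALUs per compute unit
flops_per_alu_cycle = 2   # a fused multiply-add counts as 2 FLOPs
clock_ghz = 0.8           # 800 MHz

gflops = compute_units * alus_per_cu * flops_per_alu_cycle * clock_ghz
print(f"Peak throughput: {gflops / 1000:.2f} TFLOPS")  # ~1.84 TFLOPS
```

That is peak theoretical throughput, of course; sustained game workloads land well below it.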
BTW, the flashlight does not mess up the volumetrics. They are just gone.
It is 60fps @ 1080p; how it currently looks is quite awesome for the PS4's specs. IMO, get it. You will still be constantly impressed with how it looks, and especially how it runs.
Oh, now I am no longer certain. Is the VSV the non-silenced version of this or this? CONFUSION.
This is false. I own both systems and played Destiny on both; the difference is not very noticeable in actual gameplay. Most things are not very noticeable during gameplay... which is why DF has tools and other ways to pick games apart.
You are also comparing Destiny before the patch that takes advantage of the power freed up by the resources taken from Kinect. That's like everyone comparing the PS4 and X1 versions of Diablo now... there is a patch that will fix an issue with the PS4 version. Just another fanboy argument, because if everyone preferred 1080p then you would be a PC gamer... how many PS4 games have a solid 1080p and 60fps? Exactly. And I think people would obviously prefer a solid frame rate over 1080p.
An HD 3000 combined with an i7 is in a similar performance bracket to the past-gen consoles. It's maybe 50-70 GFLOPS better, but that's about it. It's not two times.
And that automatically disproves your quote.
That is surprising to me; why would he make such claims then?
What about Carmack, who believes likewise?
I'm primarily a PC gamer, but to my mind there is no question that consoles are more efficient. That's why I think a solid mid-range machine (i5 3470/GTX 760) will be necessary to be on par with the PS4 on games built from the ground up for it.
That makes sense for last-generation DX9 games (and only in certain scenarios), but with DX11 and low draw-call scenarios... it should be pretty much hardware parity, I imagine.
May be a joke, but I'm thinking third-party games will be nearly identical going into the 2nd and 3rd year of this gen. Only first-party Sony games will take full advantage of what's under the hood. Trust me, I don't like this, as I'm only a PS4 owner.
Come on guys, this 'coding to the metal' thing again? The consoles and their games have been out for some time and that mythical console advantage is nowhere to be found. Let's not go into this again.
Hard to tell when it's a multi-plat game, as surely they will not optimise as much as you would for a single-platform exclusive. Just a guess though; this "2x as powerful" claim is silly.
It's not a magical excuse, but I believe it is reasonable to argue that developing for one closed platform is a benefit to optimization and can allow for unique ways to improve performance.
We have known for a while from dev documentation and PDFs that the to-the-metal advantage this gen is quite different from last gen, especially now that Mantle, OpenGL, and DX11.2+ exist.
Of course, but how far this goes is the question. This gen, it does not go as far. There is also the common confusion between "optimization" and "quality degradation."
Over-brightening can make things look soft, but why did DF only apply it to the PS4 version? As you can see, the contrast in the X1 version is very similar to the older PS4 shot. Regardless of what's being done, everything in the newer PS4 shot looks blurrier than the older one.
Because they are talking about specific implementations of algorithms and DX9, not full games.
There is not a single game in the last 10 years that required two times more performance on PC compared to a console at similar settings. Not one.
So, according to you, what PC are we looking at for PS4-equivalent settings in 8th-gen games?
Am I just too pessimistic with my earlier estimation?
This gen has a ways to go... but it has been shown that a 7850-7870 can run similarly at PS4-like settings.
Especially when Ryse comes out we will know more about the X1 equivalents. But the PC release of Dead Rising 3 has shown that 720p at lower settings (X1 settings) requires a pretty low-end GPU.