But that doesn't explain why poor filtering is also an issue in Driveclub?
Aren't there indie devs in here who have access to the SDK who can answer this? Or are you bound by some NDA about what's in there?
While I do find the lack of AF in certain games confusing, please don't start that shit.
Unless you're talking about a more general agreement about how much they're allowed to say regarding the PS4 SDK.
I play everything on a monitor, but is AF less noticeable on a big TV?
I pretty much answered in my post last page.
I really wish the idea that AF has no performance impact would die because it DOES have an effect on performance.
The main resource used by AF is texture bandwidth. AF works by taking many more samples than trilinear filtering, so that's more bandwidth and potential cache thrashing for bigger textures.
When a shader samples a texture, there is latency before the shader can actually use the result. To avoid having the GPU do nothing while waiting for a texture sample, shader compilers will fill the gap with arithmetic operations from independent code branches if there are any available.
What this means is that if there are no arithmetic operations to process in between texture samples, the GPU will have to wait for them. That's what you would call being "texture bound". In those cases, if you add more samples (like AF does), GPU time will increase.
What people have to understand is that this is highly engine/shader dependent. In an engine where most shaders are ALU bound, adding more texture samples won't impact performance at all because there will always be stuff to do during the wait, but in cases where you are texture bound, it will.
Another thing to note is that PC shader compilers are pretty shitty, resulting in much less ALU optimization, so that's also more arithmetic operations to hide latency with.
That being said, there is nothing magic about implementing AF on PS4. It's only a flag to set up on the samplers, just like on DirectX. So if it's missing in a game, that's purely a dev decision or mistake.
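To illustrate that last point: on the PC/DirectX side (which is public, unlike the PS4 SDK), enabling AF really is just a sampler-state setting. This is a minimal D3D11 sketch of the idea, not anything console-specific:

```cpp
// Minimal sketch: anisotropic filtering is just a sampler-state setting.
// D3D11 shown because the console APIs are under NDA; the concept is the same.
#include <d3d11.h>

// Assumes `device` is a valid ID3D11Device* created elsewhere.
ID3D11SamplerState* CreateAnisotropicSampler(ID3D11Device* device, UINT maxAnisotropy = 16)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;    // the flag that turns AF on
    desc.MaxAnisotropy  = maxAnisotropy;                // 2x/4x/8x/16x, the number you see in face-offs
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);        // check the HRESULT in real code
    return sampler;
}
```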
Look at literally any other face-off except the games from the OP to see identical AF levels between platforms.
Here are the most recent face-offs:
- BF: Hardline beta
- The Crew
- Dragon Age: Inquisition
- Far Cry 4
- GTA V
- AC Unity
- COD: Advanced Warfare
Probably going to be overlooked. >.>;
I gave 3 criteria that I thought were reasonable; I didn't say they were.
We keep circling the question, and we have an ICE team member being pretty explicit that the capability exists. Some games have it, some don't.
I don't think Strider is pushing the PS4 to the point that they couldn't optimize it any further for AF.
So what is it?
No AF on both consoles is the best case then?
They really don't. I've seen posts similar to yours before; some people will read them and be informed, but the next time a Digital Foundry article comes up without AF on the PS4 there will be a new set of people picking up pitchforks. It's almost like people don't want the answer they are looking for.
I guess this answers the question.
Are there games that DO have AF on PS4? Because if there are, then that means there isn't an issue and this is more of a question for the developers.
AF was apparently hyper terrible on last-gen consoles, but I never noticed it. I think it's just us console gamers' lower standards; we don't really know what to look for with this type of thing. We were able to game at 720p or sub-720p with horrible AA and sub-30fps framerates... AF on top of that was never really something we considered.
That's because texture quality was already ass, so it was just blurring blurredness.
Ppffwhaha, what? Wasn't the PS4 supposed to be the most powerful?
And people call Molyneux a pathological liar.
I appreciate this but don't really understand. What is the difference between the PS4 and Xbox One that makes it simpler to do on Xbox? Why would there ever be a game on Xbox One with AF while the same game on PS4 is lacking it?
So why do some games have it and some don't? Performance thresholds exist with finite hardware, but the PS4 is more powerful than the Xbone, right? I get that they have different SDKs, but the solution exists on both.
A game like Strider isn't pushing the PS4 to its limit, so it is just weird to me.
The point is that it's not simpler on the Xbox, like I said in my last sentence. A game lacking AF is only down to a developer decision or mistake.
And for those wondering if there is a performance advantage on the XBone due to ESRAM, there should be none. Material textures won't ever fit in ESRAM for your typical game (we're talking hundreds of MBs of textures here, versus the 32MB of ESRAM).
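To put rough numbers on that (assuming BC7/BC3-compressed material textures, which is typical; these are ballpark figures, not from any specific game):

```cpp
// Back-of-the-envelope: why material textures can't just live in 32 MB of ESRAM.
// BC7/BC3 block compression is 1 byte per texel; a full mip chain adds roughly a third.
constexpr double kBytesPerTexel = 1.0;         // BC7/BC3: 16 bytes per 4x4 block
constexpr double kMipOverhead   = 4.0 / 3.0;   // full mip chain
constexpr double kMiB           = 1024.0 * 1024.0;

constexpr double texture2k = 2048 * 2048 * kBytesPerTexel * kMipOverhead / kMiB; // ~5.3 MiB
// A single material often needs several 2K maps (albedo, normal, roughness, ...),
// so a handful of materials already blows past 32 MiB, never mind a whole scene.
```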
Devs are lazy? Lol
OK, so what are the possible explanations for it being a decision or a mistake? Why would a dev make that decision for the PS4 version of a game but not the Xbox One version? Or what would the mistake be? Sorry, I'm just trying to understand what exactly could be going on.
Edit: I guess if you can't make it make sense to me, does it at least make sense to you? Does it seem like a normal thing to you that it may happen, or does it seem strange/odd/wrong to you in some way?
...how likely is it that the omission of AF is down to a mistake/oversight?
I think this may be more true than people think. With previous generations, AF simply didn't matter because you couldn't see it anyway due to the low resolution. I think a lot of devs simply don't think about it. If the Xbox SDK really is applying some sort of "always-on" AF solution, that would make it even worse, because it fails to draw the devs' attention to the issue.
The reason is that it is a rather small effect that does little for the overall look compared to other visual effects, but it can use resources that could be utilized elsewhere.
It is an effect that has the ability to make things look a little better, but it also adds latency to the buffers that show the on-screen image.
All visual effects add latency to the screen, and as they add up, they mean things like screen tearing, slow frame rates, or even input lag (button press to action showing up on screen).
These are the reasons a smart dev would opt to leave it low, or off.
They could still optimize a game to include it, given enough time.
I completely disagree. Small effect that does little for the overall look? A smart dev doing this?
So, if it's not due to performance...
Great, but obviously this is not a Sony issue. This is a developer issue. What console-of-choice championing do you see when everyone has literally posted receipts of other games that don't have it, and ICE programmers have already shown that it wasn't a system issue?
I've been following this thread and keep seeing this line passed around as if y'all are being ignorant of the obvious answer.
I did not say that, quite the opposite actually.
It can be a matter of performance, especially in games where 60fps is the target and every millisecond is huge. I merely stated that there is no difficulty implementing AF on PS4 AND that there is an actual performance hit depending on the scenario.
Also, don't forget that running the same game at 900p versus 1080p makes a huge difference. That's way more pixels to shade and way more texture samples to take.
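The pixel-count arithmetic behind that, just for scale (nothing engine-specific):

```cpp
// 1080p shades roughly 44% more pixels per frame than 900p.
constexpr int    px1080 = 1920 * 1080;              // 2,073,600 pixels
constexpr int    px900  = 1600 * 900;               // 1,440,000 pixels
constexpr double ratio  = double(px1080) / px900;   // ~1.44x more shading and texture work
```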
Ah ok that I can understand (I think).
I wonder if some of these games with AF differences are 900p on Xbox One vs 1080p on PS4, or 30fps vs 60fps, and whether that would explain the AF situation.
Interesting, I thought latency would just make it "pop in". I always thought bandwidth was the constraint on visual effects.
I wonder if there is a push, or even an agreement, from Sony to hit 1080p that is causing the issue. I haven't really looked into it, but I wonder if it is happening with 1080p titles only?
The latency we are talking about has nothing to do with "pop in". Texture sampling latency is a matter of nanoseconds within the computation of one frame that lasts 16 to 33 milliseconds (for 60 or 30fps). It's completely invisible on the final image. It will only add to the overall time it takes for the image to be computed.
Texture pop-in is usually due to texture streaming issues, which depend on hard-drive access times that are several orders of magnitude higher than memory access times (it can be a matter of seconds when there are tons of textures to stream in and out).
These notions are not related in any way.
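For a sense of the orders of magnitude being compared (ballpark figures, not measurements from any particular console):

```cpp
// Rough scales: sampling latency vs. frame budget vs. streaming from disk.
constexpr double ns = 1e-9, ms = 1e-3;                // seconds

constexpr double textureSampleLatency = 300.0 * ns;   // one memory fetch inside a draw, hidden by other work
constexpr double frameBudget60fps     = 16.7 * ms;    // whole frame at 60 fps
constexpr double frameBudget30fps     = 33.3 * ms;    // whole frame at 30 fps
constexpr double hddAccess            = 10.0 * ms;    // per seek; streaming many textures can take seconds
```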
Really shocked at the amount of "lazy developer" comments in this thread. Game developers, especially in the AAA space, are incredibly hardworking people. Most, if not all, AAA games go through months of crunch, if not more. And during these periods, many programmers just stay at the studio overnight to get more work done and go long stretches without spending a meaningful amount of time with their families. And we all know how strict most publishers are with deadlines and release dates. The people who made Assassin's Creed: Unity weren't incompetent and lazy; they just weren't given the time they needed to fully polish the game. Similarly, the AF situation on the PS4 probably isn't due to developers just not doing the required work. It probably has more to do with "we need to ship this game now, so we have to prioritize certain things over others." This also explains why lots of games on PS4 do not have this issue.
I'm sorry, but this is simply not true. In ANY industry, be it film, medical, or the US Government, there will ALWAYS be lazy people and people not putting in effort. Why do you think so many people end up leaving or getting laid off and replaced? I'm not talking about mass layoffs, either.