PS4's AF issue: we need answers!

Could it be as simple as:

XB1 SDK: AF on by default
PS4 SDK: AF off by default

Some devs change it, others leave it at the default?

Judging by the state of games launching this gen, Devs have far more to worry about than changing a default option.

Seriously? You really think devs are so bad that they would do that? People here make it sound like devs don't even know what AF is... AF has been around forever and is pretty much just a matter of adding an enumeration value somewhere in the code. Hell we used it extensively on PS3 ourselves...

So if a game does not have AF everywhere it just means that it was part of a decision made by the devs somewhere along production and that's it. People should stop trying to find some reason seemingly only to fuel the console war (at least that's what it looks like in my opinion).
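For anyone wondering what "adding an enumeration value" looks like in practice, here's roughly how it looks in Direct3D 11 terms (the console SDKs aren't public, so this is only illustrative of how small the change is; 'device' is assumed to be an existing ID3D11Device*):

#include <d3d11.h>

// Turning on 16x anisotropic filtering is one enum plus one integer on the sampler state.
D3D11_SAMPLER_DESC desc = {};
desc.Filter         = D3D11_FILTER_ANISOTROPIC;    // the "enumeration value"
desc.MaxAnisotropy  = 16;                          // valid range is 1-16
desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
desc.MinLOD         = 0.0f;
desc.MaxLOD         = D3D11_FLOAT32_MAX;

ID3D11SamplerState* sampler = nullptr;
HRESULT hr = device->CreateSamplerState(&desc, &sampler);  // bind 'sampler' wherever the material is sampled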
 
Could be that the Xbone having those two stages (main memory and ESRAM) on separate buses gives it the edge to have more "free" AF. Everything on the PS4 goes through that same memory-to-CPU/GPU path.


The lazy dev excuse just doesn't cut it. Why would they be lazy on one console but not the other? Even more so when that platform is the market leader.

We are still in the "bets made" timeframe with these releases. I'm betting a lot of the multi-format devs bet on Xbox continuing to be the leader and went Xbox-first. Games take years to make and this gen is only a year and a bit old, so we're still dealing with bets made on SKU leaders etc.
 
Great game, can't wait to play it on my PS4. Oh by the way, can you put textures in ESRAM?

Yes! In compressed formats.
32MB of ESRAM is not a hard limit for render targets.
Render targets can be passed to and from ESRAM and DDR3.
http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

Your move engines can move and decompress textures to and from the ESRAM without using any of the GPU cycles to do so.

[diagram of the Xbox One move engines]


Speed?

Is the PS4's 8 GB of GDDR5 on a 256-bit bus faster than the XB1's 32 MB of ESRAM on a 1024-bit bus?
In this case the ESRAM wins on width.
For every 256 bits the GDDR5 sends to the GPU, the ESRAM can send 4x.

That's why ESRAM is in XB1.
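For what it's worth, bus width alone doesn't settle that comparison; peak bandwidth is width times transfer rate. A rough back-of-the-envelope using the commonly quoted specs (approximate paper figures, not measured real-world throughput):

#include <cstdio>

// Peak bandwidth in GB/s = (bus width in bits / 8) * transfer rate in GT/s.
double peak_gbs(double bus_bits, double gt_per_s) { return bus_bits / 8.0 * gt_per_s; }

int main() {
    double ps4_gddr5 = peak_gbs(256.0, 5.5);    // ~176 GB/s for the 8 GB GDDR5 pool
    double xb1_ddr3  = peak_gbs(256.0, 2.133);  // ~68 GB/s for the 8 GB DDR3 pool
    // The 32 MB ESRAM is usually quoted at ~109 GB/s one way (~204 GB/s peak with
    // simultaneous read/write) rather than derived from a DRAM-style transfer rate.
    printf("PS4 GDDR5 ~%.0f GB/s, XB1 DDR3 ~%.0f GB/s, XB1 ESRAM ~109-204 GB/s\n",
           ps4_gddr5, xb1_ddr3);
    return 0;
}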

Wow that's incredible! ESRAM is the future!
 
Seriously? You really think devs are so bad that they would do that? People here make it sound like devs don't even know what AF is... AF has been around forever and is pretty much just a matter of adding an enumeration value somewhere in the code. Hell we used it extensively on PS3 ourselves...

So if a game does not have AF everywhere it just means that it was part of a decision made by the devs somewhere along production and that's it. People should stop trying to find some reason seemingly only to fuel the console war (at least that's what it looks like in my opinion).

It was a simple question. Don't get your knickers in a twist.
 
Barely more than a handful, among hundreds of games.

The prevalent narrative in these threads seems to be that there is a problem with the PS4, either in the SDK or the hardware, in spite of the overwhelming evidence that this is not the case.

How is pointing the finger at developers and saying that the answer should be found there "not a satisfactory answer"?

The guys from DICE already answered what they could: the PS4 and its SDK have no known AF problem and are perfectly capable of 16x AF. Now we need a dev that either chose not to implement it, or ended up not implementing it due to deadlines or omission, to step up and explain.

Until then, it's only going to be speculation.

Given their focus, and how quick they have been to point out missing or poor AF, why would Eurogamer/DF not go to these (7?) devs and ask questions about this?

You sound personally angry about the fact that this issue is brought to light. Do you own Sony stock by any chance?
 
Hopefully someone from Sony or a multiplat dev will read this and shed some light on what the issue is.

I don't really care for any of the games listed with this issue, but if Batman: Arkham Knight has it I will be pissed.
 
Could be that the Xbone having those two stages (main memory and ESRAM) on separate buses gives it the edge to have more "free" AF. Everything on the PS4 goes through that same memory-to-CPU/GPU path.


The lazy dev excuse just doesn't cut it. Why would they be lazy on one console but not the other? Even more so when that platform is the market leader.

Because, assuming the game is being ported from a PC engine, AF maybe just comes along in a form the Xbox can use without changes? And maybe there isn't time (or priority) to make the changes to bring it to PS4.

'Lazy devs' usually just means devs don't have the time to get that done before the game needs to ship, as other things have higher priority.


Is there any pattern in these games? Are they all DirectX on PC? Are they all using similar middleware?
 
It was a simple question. Don't get your knickers in a twist.

Sorry if it sounded harsh; it wasn't meant to be and it wasn't aimed at you specifically. It's just that the overall idea that there must be some hidden, strange reason why there is no AF in some games is tiring. I wish people would just accept that the simpler (and often correct) reason is enough.
 
Because, assuming the game is being ported from a PC engine, AF maybe just comes along in a form the Xbox can use without changes? And maybe there isn't time (or priority) to make the changes to bring it to PS4.

'Lazy devs' usually just means devs don't have the time to get that done before the game needs to ship, as other things have higher priority.


Is there any pattern in these games? Are they all DirectX on PC? Are they all using similar middleware?

Once again, there is absolutely no difference in setting up AF on PS4 versus XBone/PC. If it isn't there, then it's the developer's choice based on performance/tradeoffs etc. (Not that it cannot be a mistake from time to time, but that's not a technical reason anyway; it's just an artist forgetting a flag on a texture and someone not paying enough attention during reviews. Most likely, in the games shown here, it's a choice.)
 
There is no problem. The majority of those games run at a higher framerate and resolution on the PS4. There is a misconception that AF is free, but when there is AA and a higher resolution there is usually a hit to performance. Devs seem to have made a call for higher res and better performance in this small selection of games.


The higher res and frame rate should be naturally balanced by the increased performance of the PS4. 900p vs 1080p matches pretty nicely with the different GPU setups in the respective consoles, so I don't think that is a likely explanation. Also, if that were the case, why does it only apply to a handful of games?


Once again, there is absolutely no difference in setting up AF on PS4 versus XBone/PC. If it isn't there, then it's the developer's choice based on performance/tradeoffs etc. (Not that it cannot be a mistake from time to time, but that's not a technical reason anyway; it's just an artist forgetting a flag on a texture and someone not paying enough attention during reviews. Most likely, in the games shown here, it's a choice.)


The point being, with the Xbox One being DirectX based, maybe it is more forgiving of PC engines being ported to it, whereas the PS4 needs a little more sanity checking for some pipelines. And so AF may be slightly more likely to fall through the cracks on PS4.
 
Dev from Tripwire

[screenshot of the Tripwire dev's post]

Only the "some engine ports are shit" part makes sense, because artists don't decide games AF settings and I am pretty sure that artists usually want a game to display their work 1:1 without any degradation (from any camera angle).
 
The higher res and frame rate should be naturally balanced by the increased performance of the PS4. 900p vs 1080p matches pretty nicely with the different GPU setups in the respective consoles, so I don't think that is a likely explanation. Also, if that were the case, why does it only apply to a handful of games?





The point being, with the Xbox One being DirectX based, maybe it is more forgiving of PC engines being ported to it, whereas the PS4 needs a little more sanity checking for some pipelines. And so AF may be slightly more likely to fall through the cracks on PS4.

It has nothing to do with being ported from DirectX at that point; it's only a matter of GPU architecture/power. The graphics API, given the same feature set (which is pretty close here between the PS4 and Xbox One), will only affect the CPU side of performance. AF only affects GPU performance.
Now the only differences between the PS4 and XBone regarding texture sampling rate are the memory, which IS different between the two consoles (even though that should be somewhat hidden by caches anyway), and the fact that, pretty often, the resolution at which those games are rendered is different (and YES, resolution has an impact on texture sampling rate). That's the only explanation you need for the differences between the two consoles.
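To put a rough number on the resolution point (a back-of-the-envelope estimate that ignores overdraw, MSAA and per-material differences):

#include <cstdio>

int main() {
    // Every shaded fragment drives texture fetches, so the fragment count is a
    // first-order proxy for the texture sampling load at a given frame rate.
    double fragments_900p  = 1600.0 * 900.0;    // 1,440,000
    double fragments_1080p = 1920.0 * 1080.0;   // 2,073,600
    printf("1080p shades %.2fx the fragments of 900p\n",
           fragments_1080p / fragments_900p);   // ~1.44x
    return 0;
}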
 
Only the "some engine ports are shit" part makes sense, because artists don't decide games AF settings and I am pretty sure that artists usually want a game to display their work 1:1 without any degradation of their work (from any camera angle).

Yes they do.
It's pretty stupid to put AF on everything if you don't need to; hell, a lot of special effects need point sampling for that matter. So yes, artists choose what kind of filtering they want, until someone asks them to optimize their stuff, at which point they can cut corners based on pertinent choices.
 
Not for AF, since it has been optimized at the hardware level to work with GDDR5 for generations now.

I just wanted to say that latency never gets mentioned when the number-throwing starts again.
For AF it's not important. The question still remains in the context of the thread, though.
 
Have we ever gotten any real-world usage speeds for the GDDR5 in the PS4?

I'm going to say this is simply caused by the one pool of unified RAM. On average, isn't the required memory bandwidth around 15 GB/s for 1080p with 16x AF?

On the X1, can the DMAs filter the textures from DDR3 to ESRAM without using any GPU cycles? If so, that could explain the differences.

Also, I feel like the only PS4 games we've seen this on are the open-world games, which will require higher memory bandwidth due to all the asset streaming.

But then there's Strider. That doesn't make sense.
 
Have we ever gotten any real-world usage speeds for the GDDR5 in the PS4?

I'm going to say this is simply caused by the one pool of unified RAM. On average, isn't the required memory bandwidth around 15 GB/s for 1080p with 16x AF?

On the X1, can the DMAs filter the textures from DDR3 to ESRAM without using any GPU cycles? If so, that could explain the differences.

Also, I feel like the only PS4 games we've seen this on are the open-world games, which will require higher memory bandwidth due to all the asset streaming.

But then there's Strider. That doesn't make sense.

GPUs have been using GDDR5 for years; we know what it can do.
 
Only the "some engine ports are shit" part makes sense, because artists don't decide games AF settings and I am pretty sure that artists usually want a game to display their work 1:1 without any degradation (from any camera angle).

So, we ask for a dev's take on this "issue", a dev posts, GAF refuses to believe.

smh
 
Yes they do.
It's pretty stupid to put AF on everything if you don't need to; hell, a lot of special effects need point sampling for that matter. So yes, artists choose what kind of filtering they want, until someone asks them to optimize their stuff, at which point they can cut corners based on pertinent choices.

Which begs the question: why have it on the Xbone then, but not on the PS4?
 
On the X1, can the DMAs filter the textures from DDR3 to ESRAM without using any GPU cycles? If so, that could explain the differences.

What does this actually mean?

Moving data from DDR3 to ESRAM? How is that better than not moving it at all in unified memory? I'm assuming that's the point of unified memory, by layman's logic?
 
But latency should be way better, which can make a difference.

The more cache misses you have, the more the lower latency of the ESRAM will come into play, unless developers find workarounds in the shaders for PS4 (optimizations which would benefit Xbox One too, but might not be needed there == lower performance improvement).
 

What does this actually mean?

Moving data from DDR3 to ESRAM? How is that better than not moving it at all in unified memory? I'm assuming that's the point of unified memory, by layman's logic?
Well, you're simply saving the GPU from performing texture filtering and letting the DMAs take the load instead. I suppose it saves you vital cycles which you could be using elsewhere.

To me, though, it still doesn't explain why some games are lacking AF on the PS4. It could just mean that it's more 'free' to perform on the X1.
 
Only the "some engine ports are shit" part makes sense, because artists don't decide games AF settings and I am pretty sure that artists usually want a game to display their work 1:1 without any degradation (from any camera angle).

Your assumptions are not correct, and I find it weird that you would assume a developer is talking nonsense instead of trying to explain something.

It's possible some workflows don't involve artists at all in determining filtering, but I'd say it's far more common that they are involved.
 
Dev from Tripwire

[screenshot of the Tripwire dev's post]

OK, this answers the question on AF from an independent dev; "some of the engines ported to PS4 by some devs are shit" is the way I read it.

So the next game by Turtle Rock with their engine will also have no AF, unless they can be bothered to polish it.
 
OK, this answers the question on AF from an independent dev; "some of the engines ported to PS4 by some devs are shit" is the way I read it.

So the next game by Turtle Rock with their engine will also have no AF, unless they can be bothered to polish it.

"Shit" to the extent the gulf in rendering power between the two consoles means AF can't be done to the same level as XB1?
 
Seriously? You really think devs are so bad that they would do that? People here make it sound like devs don't even know what AF is... AF has been around forever and is pretty much just a matter of adding an enumeration value somewhere in the code. Hell we used it extensively on PS3 ourselves...

So if a game does not have AF everywhere it just means that it was part of a decision made by the devs somewhere along production and that's it. People should stop trying to find some reason seemingly only to fuel the console war (at least that's what it looks like in my opinion).

So you actually don't know.

You're just assuming.

And this has nothing to do with the console war. Devs could just happen to "forget"; what's so wrong with that idea?

I am still waiting for a list of Xbone games that don't have it. It is possible that it is just forced or put on by default.
 
I feel some people are on a quest in this topic; it's not about the "PS4 AF issue" anymore but about proving the Xbox One has some kind of advantage now. Well, stay sane, guys.
 
"Shit" to the extent the gulf in rendering power between the two consoles means AF can't be done to the same level as XB1?

No idea what you are going on about; you're sounding like MrX media with console wars and hidden GPU power.

It's just a dev's shit engine, in his opinion.

Read what the guy said

Gulf in power? What? The PS4 GPU is low-end and the XB1's is worse, the Wii U is a toaster. There are no gulfs; they are what they are, and there are devs that miss stuff.
 
Jux has answered the questions several times now but it is just being ignored.

Also, I learned in this thread that many people are essentially blind. Poorly implemented texture filtering looks like absolute ass to me and is very important to visuals. I guess that goes with having every PC game in the last 10 years set to 16x (when possible).
 
No idea what you are going on about; you're sounding like MrX media with console wars and hidden GPU power.
How exactly am I sounding like MrX?
It's just a dev's shit engine, in his opinion.

Read what the guy said

Gulf in power? What? The PS4 GPU is low-end and the XB1's is worse, the Wii U is a toaster. There are no gulfs; they are what they are, and there are devs that miss stuff.
Which is his opinion, upon which you then formed your own.

Again, there is a gulf in power between the PS4's and XB1's rendering capabilities. I find it strange that, even with a "shit" port, the extra PS4 grunt can't apply something as basic as AF to at least match the XB1. Unless, of course, it would be debilitating to performance.

O/T: I think you need to take a break as a lot of your posts have been pretty angry.
 
Jux has answered the questions several times now but it is just being ignored.

Also, I learned in this thread that many people are essentially blind. Poorly implemented texture filtering looks like absolute ass to me and is very important to visuals. I guess that goes with having every PC game in the last 10 years set to 16x (when possible).

Jux is going on about GPU and memory bandwidth differences, but we see well-optimised games like TLOU chucking everything out at 60 FPS and 1080p; it's well optimised and polished to death.

Then a dev from Tripwire says, referring to the AF discussion, that it's just a shit engine port.

Considering Evolve is struggling to do 30 FPS at 900p on both the PS4 and XB1, I am favouring the response from the dev at Tripwire, which makes more sense. It's a shit engine.
 
How exactly am I sounding like MrX?
Which is his opinion, upon which you then formed your own.

Again, there is a gulf in power between the PS4's and XB1's rendering capabilities. I find it strange that, even with a "shit" port, the extra PS4 grunt can't apply something as basic as AF to at least match the XB1. Unless, of course, it would be debilitating to performance.

O/T: I think you need to take a break as a lot of your posts have been pretty angry.

OK, sorry about the MrX thing, it was tongue in cheek. I think you misunderstand: the dev just basically said some engines miss stuff like this because they are shit.

Really that simple.
 
The way some folks have made the leap from a dev explaining how textures are set up (while adding an addendum that some engine ports are just bad) to now claiming a dev confirmed it was a bad port is why developers avoid interfacing with communities like this.

There are a million reasons why any game might have any setting the way it is, including AF, if you want a definitive answer then ask Turtle Rock about Evolve or ask Double Helix about Strider.
 
Jux is going on about GPU and memory bandwidth differences, but we see well-optimised games like TLOU chucking everything out at 60 FPS and 1080p; it's well optimised and polished to death.

Then a dev from Tripwire says, referring to the AF discussion, that it's just a shit engine port.

Considering Evolve is struggling to do 30 FPS at 900p on both the PS4 and XB1, I am favouring the response from the dev at Tripwire, which makes more sense. It's a shit engine.
Well, Jux did highlight the differences between various engines and their techniques for texture handling. Most engines seem to handle aniso fine, with the odd one choking under the increased demand 1080p puts on them. One could ask, though: shouldn't the devs have just made those games 900p on PS4 as well, like with BF4?
 
I am no technical expert, but I am surprised when people say that AF requires no power.

A game without AF can look substantially more blurry. The difference is just as palpable as a resolution difference.
 
It's absolutely true that for many games artists set AF levels per material.

Personally, I think that's a needless layer of complexity for modern platforms, and as we can see it introduces a point of failure, but that's the state of the art. (Which is in fact why we have the whole "force 16xAF" driver-level setting on PC)
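As an illustration of the per-material setting plus a global override, here is a hypothetical sketch (the struct and function are invented for the example; real engines and drivers each do this their own way):

#include <algorithm>
#include <cstdint>

// Hypothetical per-material filtering settings an artist might author.
struct MaterialSampling {
    bool     anisotropicEnabled = false;  // the easily-forgotten flag
    uint32_t maxAnisotropy      = 1;      // 1 = effectively no AF
};

// A "force 16x AF" style override: clamp every material up to the forced level.
// forcedMin == 0 means no override, i.e. trust whatever the artist set.
uint32_t effectiveAnisotropy(const MaterialSampling& m, uint32_t forcedMin) {
    uint32_t level = m.anisotropicEnabled ? m.maxAnisotropy : 1;
    return std::max(level, forcedMin);
}

With forcedMin set to 16, the per-material value stops mattering, which is exactly what the PC driver toggle papers over.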
 
Dev from Tripwire

[screenshot of the Tripwire dev's post]

Taking this into account, I don't think a screenshot can be representative of the AF levels in a whole game, as AF might actually vary depending on the scene. That would make all the thread's arguments baseless (for or against).

To prove the point that the PS4 has difficulty with high AF, there need to be more instances of low AF than those shown in these captured screenshots.
 