PS4's AF issue: we need answers!

It's never as simple as a global switch in a typical game scenario. Even disregarding textures you want to sample with point or bilinear filtering for various valid reasons, there are cases where you want to disable AF purely for performance.
I'll give an example that happened to me to illustrate: in our hair rendering pipeline on PS3, we had to render hair meshes several times for various passes. The problem with those meshes is that they often have a ton of overdraw because of the number of layers. Tons of overdraw means a lot of texture samples. In this particular case, AF proved to be much too costly for the increase in quality (especially since most hair strands are usually close to parallel to the camera plane, where AF isn't as needed), so once we agreed with the artists that the tradeoff was acceptable, we just disabled it explicitly for some of the hair passes and got a significant performance boost.
What I'm trying to say (as I've said several times already in this thread) is that, apart from a mistake made by some artist, there can be instances where, due to lacking time to optimize properly, cuts are made where it's easiest. Now whether a dev should cut AF or another feature is always debatable, and the final decision always lies with the dev.
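To make that last point concrete: the actual PS4 graphics API (GNM/GNMX) is under NDA, but the mechanism is the same as in any modern PC API: you simply bind a different sampler state for the passes where you opt out. A rough D3D11-style sketch, with the function name and the "hair pass" split being my own illustration rather than anything from the post above:

```cpp
// Rough D3D11 sketch (names are illustrative): regular materials get an
// anisotropic sampler, the high-overdraw hair passes get plain trilinear.
#include <d3d11.h>

void CreateFilteringSamplers(ID3D11Device* device,
                             ID3D11SamplerState** outAniso,
                             ID3D11SamplerState** outHairTrilinear)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD = 0.0f;
    desc.MaxLOD = D3D11_FLOAT32_MAX;

    // Normal scene materials: full 16x anisotropic filtering.
    desc.Filter = D3D11_FILTER_ANISOTROPIC;
    desc.MaxAnisotropy = 16;
    device->CreateSamplerState(&desc, outAniso);

    // Hair passes: heavy layered overdraw means a lot of texture fetches,
    // so drop to trilinear where the quality win from AF is small.
    desc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
    desc.MaxAnisotropy = 1;
    device->CreateSamplerState(&desc, outHairTrilinear);
}
```

The point being that the filtering mode lives in a sampler object, not in the texture data itself, so turning AF off for one pass doesn't touch anything else.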

AF can be costly, but these are all multiplatform games. If a dev develops a game like Strider for all 5 platforms, gets to PS4, notices that AF isn't working in their engine, and removes it, that's a problem. But it shouldn't be one: the GPU in the PS4 should be perfectly capable of using AF in a similar way. If it isn't, and the PS4 needs a special method, then Sony should help devs with it.

Otherwise that means we will be seeing this a heck of a lot more. We know devs leave things out, but routinely leaving out AF is a problem.

Unless what you are really trying to say is that AF is just so effortless on Xbone in comparison.
 
I agree, but all of this does point to the fact that Sony should either offer a very simple API to developers which enables AF globally (especially for small independent games which don't need fine-grained management for performance reasons), or, if such an API is already available, communicate its existence better.
You think the issue is that Sony's SDK only offers a per-texture flag for AF, so the developer would need to go in and manually flag every single texture with the desired amount of AF (and some would forget to do it)? That doesn't sound impossible, and it would explain some oddities at least.
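If it really were just a per-texture/per-sampler flag that people forget to set, the engine-side mitigation would be simple: funnel every sampler creation through one helper that applies a project-wide default. A purely hypothetical sketch in the same D3D11-style terms as above, not anything from Sony's actual SDK:

```cpp
// Hypothetical engine helper: if every material's sampler is created through
// this, a texture nobody explicitly flagged still gets a sensible project-wide
// AF default instead of silently falling back to trilinear.
#include <d3d11.h>

static const UINT kDefaultMaxAnisotropy = 8;  // assumed project-wide setting

HRESULT CreateSamplerWithAfDefault(ID3D11Device* device,
                                   D3D11_SAMPLER_DESC desc,  // by value so we can patch it
                                   ID3D11SamplerState** outSampler)
{
    // The asset didn't request anything special: promote it to the default AF level.
    if (desc.Filter == D3D11_FILTER_MIN_MAG_MIP_LINEAR && desc.MaxAnisotropy <= 1)
    {
        desc.Filter = D3D11_FILTER_ANISOTROPIC;
        desc.MaxAnisotropy = kDefaultMaxAnisotropy;
    }
    return device->CreateSamplerState(&desc, outSampler);
}
```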
 
You think the issue is that Sony's SDK only offers a per-texture flag for AF, so the developer would need to go in and manually flag every single texture with the desired amount of AF (and some would forget to do it)? That doesn't sound impossible, and it would explain some oddities at least.

That would be incredibly silly...
 
I really can't imagine that it's too costly for some of these indie games in any realistic scenario. I realize that consoles are different from PCs, but we aren't talking "AAA optimized-to-the-metal" first party titles here, and globally forcing 16xAF in such indie (or even "AA") multiplatform games on PC generally has a negligible performance impact.

If it was about rendering/performance cost, you wouldn't expect to have XB1 (and even PS3!) titles featuring superior filtering.

I can agree with that. To clarify, I, too, find it weird that it's missing in some of those games on PS4. I'm just trying to make people understand that there is no voodoo magic in using AF on PS4, and to be frank, it's quite depressing to see so many ignorant posts trying to pull theories out of thin air when things have been explained several times already in this thread.
More to the point, though, one thing that should not be disregarded is that the main advantage of the PS4 over the XBone lies in the number of arithmetic units, not so much in bandwidth. The ALU/bandwidth ratio is pretty bad for both consoles (compared to last gen, ALUs have increased much more than bandwidth), so when you port a game to XBone and PS4 and choose to target 900p on XBone with an engine using pretty simple shaders (which is most likely the case in those "small" titles), resolution makes a HUGE difference: that's many more pixels to be written and many (many, many in the case of AF) texture samples to be done, with very little ALU work to hide the latency behind. If your game is a few milliseconds short of outputting a stable 30 or 60fps, that extra time can be meaningful.
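Quick back-of-the-envelope on the resolution point: 1920 × 1080 = 2,073,600 pixels versus 1600 × 900 = 1,440,000, i.e. 44% more pixels to shade and write out. On top of that, 16x AF can take up to 16 filtered probes per texture fetch in the worst case (it's adaptive, so the average is lower), so worst-case texture traffic scales roughly with pixels × probes, and with simple shaders there isn't much ALU work to hide that extra latency behind.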
 
Do people believe that a company spends millions to make a game and test it, but then forgets to set a flag in the SDK? Also, if it were that simple, couldn't the fix, once caught, go out in an update?
 
Maybe Xbox One is the lead platform for many games just because it takes much more time to get it to the level of the PS4 and PC, so by the time they get to the PS4 version they've run out of time to optimize everything.
 
Do people believe that a company spends millions to make a game and test it, but then forgets to set a flag in the SDK?
You would be surprised at what happens in projects people spend millions on -- not just in games.

I'm not saying that is what is happening here (because I don't know), just that the fact that some of these games are relatively large projects does not necessarily confirm that this isn't what happened.
 
95% of them have the same amount of AF on both consoles, with all the other usual things being better on PS4.

This game has:
Better resolution on PS4
Better framerate
Better AO

It's inconceivable to me that they'd consciously implement something as power-hungry as HBAO while skipping something as simple as AF, on the ground textures at least. It has to be an accidental omission, much like Fafalada suggested.

It is even more inconceivable that something as simplistic-looking as Strider or Unfinished Swan has better AF on PS3 than on PS4, while something like Far Cry 4 has good AF on PS4 - if this were some hardware power issue, I mean.
I can agree for games like Unfinished Swan or Strider, but something like Evolve? Not really.
I mean, both consoles have comparable bandwidth when they use their resources right, so when you increase resolution you generally need more bandwidth, and that could be the point where you start to be starved and simply reduce AF.

Personally, if I were designing the budgets for different features, I would reduce AF last.
 
Just no. As I said before, your typical game scene will use at the very least several hundred MB of textures. 32MB of ESRAM is way too small to be of any significant use for textures.

Digital Foundry: There's been a lot of controversy about the design of the Xbox One. On the Beyond 3D Forum, you mentioned that 32MB has been the magic number for optimising render targets on your engine. Can you go into depth on how you approached ESRAM?

Sebastian Aaltonen: Actually that 32MB discussion wasn't about our engine; it was about an unoptimised prototype of another developer. The general consensus in that discussion thread was that it is not possible to fit a fully featured G-buffer to a 32MB memory pool. I, of course, had to disagree, and formulate a buffer layout that had all the same data tightly packed and encoded to fit to the target size.

Sebastian Aaltonen: Both competing consoles are now closer to each other than ever. While the last-generation consoles required a lot of custom console-specific optimisations, now most of the optimisations help both of them.

Optimising the render target size to fit it better into the fast ESRAM scratchpad reduces bandwidth cost, and that boosts performance on PS4 and PC GPUs as well. Optimising for data locality helps all GPUs with caches. Intel has quite big L3 (and even L4) caches in their GPUs, and Nvidia's new Maxwell GPUs have 8x bigger L2 caches than their older (mainstream) Kepler GPUs. Writing memory/cache-optimised code has become really important for GPUs as well, and the trend seems to be continuing.
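For a sense of how tight 32MB actually is at 1080p: 32MiB ÷ (1920 × 1080 pixels) ≈ 16 bytes per pixel. Four RGBA8 render targets already cost 4 × 4 = 16 bytes per pixel before depth is stored, which is exactly why a "fully featured" G-buffer only fits if the data is tightly packed and encoded, as he describes.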

You forget that the XB360 only had 10MB. ESRAM is like an on-chip Level 3 cache. Now, that's two times that I have posted a link that says you can use ESRAM for render targets; can you show me just one that says you can't?

It's just one game out of 20+ that is not as good as on XB1; we are still on top...
 
You forget that the XB360 only had 10MB. ESRAM is like an on-chip Level 3 cache. Now, that's two times that I have posted a link that says you can use ESRAM for render targets; can you show me just one that says you can't?

It's just one game out of 20+ that is not as good as on XB1; we are still on top...

I have a really hard time making sense of what you are saying.

ESRAM is mainly meant to be used as render target memory.
Yes, it can technically be used to hold textures for your scene materials. And yes, I maintain that it's still too small to be used that way, because we are talking about potentially thousands of different textures across thousands of draw calls. You can't shuffle that much memory in and out of ESRAM between draw calls without completely negating its usefulness.
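Some rough numbers to back that up: a single 2048×2048 texture in a 1-byte-per-texel block-compressed format (BC3/BC7 class) is about 4MB for the top mip and roughly 5.3MB with the full mip chain, so half a dozen of those already fill the entire 32MB. A scene touching hundreds of unique material textures easily adds up to several hundred MB, which is why ESRAM makes sense as render target memory but not as a general texture pool.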
 
Strider - the Xbox One and PS4 versions are both 1080p/60fps. Yet AF is completely lacking in the PS4 version.

Edit: Nevermind Dying Light is 900p on XBox One. Removed that example from post.
 
Why can't they just give us toggles like they do on PC?

They're coding for a fixed target. Ship the game with the settings they feel are best and let us choose how we want our game to run using the toggles.

If we want a stuttering sub 30FPS mess with major eye candy let us do it.

If we want a silky smooth 60FPS and a lower resolution, let us do it.

I can't think of any reason other than ego to not allow us this option when it can be built into their engines...
 
Why can't they just give us toggles like they do on PC?

They're coding for a fixed target. Ship the game with the settings they feel are best and let us choose how we want our game to run using the toggles.

If we want a stuttering sub 30FPS mess with major eye candy let us do it.

If we want a silky smooth 60FPS and a lower resolution, let us do it.

I can't think of any reason other than ego to not allow us this option when it can be built into their engines...

The entire point of doing that is to facilitate computers with largely varying components. That doesn't exist with console. Making a game on console implies shipping a product you are developing with one set pipeline.

Your suggestion is essentially telling devs to make games optimized for two separate situations, which doesn't make sense: lowering resolution doesn't raise FPS (resolution is a GPU matter) in a CPU-limited game, and there's no way to bridge the difference between the two.
 
I like to consider myself pretty knowledgeable on the technical aspects of things, but admittedly, when it comes to the intricate workings of APIs and game engines, I am no expert...

However, the only thing that makes real sense is some sort of API glitch, or a flag of sorts that is either buried in the API or doesn't need to be worried about on the PC/Xbone side of things and just gets overlooked when it comes to the PS4... because nothing on the hardware front seems like it would be any sort of problem in at least MATCHING the AF levels of the other consoles (especially LAST GEN consoles lol)...

I don't expect consoles to hit 16x AF like a PC, but there just doesn't seem to be anything I can logically think of that would cause the PS4 to suffer here...
 
Unfortunately many console gamers don't like to have a choice. It seems to be confusing them...
http://www.neogaf.com/forum/showthread.php?t=469496
It's not confusing, but it's precisely the one thing I can't stand about PC gaming.

I'm not against having some simple option in console games, preferably a 30/60fps switch like in TLOU, but my two memorable experiences of playing AAA games on PC (Tomb Raider and BF4) went like this:
- spend 15 minutes messing with all the settings ("Wow, I can run this at a playable framerate even with TressFX enabled, this is awesome!")
- playing for 5-10 minutes, I get to the first slightly more open area and the framerate goes to hell.
- another 10-15 minutes in settings, replaying this new section over and over until I get it right, with TressFX as a wildcard; I want to keep it if I can.
- playing another 5-10 minutes, I get out of the cave and the framerate tanks again.
- goodbye TressFX, lower shadows, replay, replay, lower post-fx, play again...
- play some more, get into the first firefight, framerate tanks again.
- back into settings, replay, replay, replay.
- bigger firefight later in the game... yep, framerate tanks again. At that point I'm completely sick of the game through no fault of the game itself, as I'm sick of replaying everything over and over just to confirm whether disabling this or that option had enough of an effect.

Basically, for me it's the best hardware available or nothing when it comes to PC. And I'm not spending the money on that hardware - especially as I know that once in a while there will be a game that will just glitch out regardless, and I'll spend a dumb amount of time battling that glitch. It's just not worth it for me.
 
It's not confusing, but it's precisely the one thing I can't stand about PC gaming.

I'm not against having some simple option in console games, preferably a 30/60fps switch like in TLOU, but my two memorable experiences of playing AAA games on PC (Tomb Raider and BF4) went like this:
- spend 15 minutes messing with all the settings ("Wow, I can run this at a playable framerate even with TressFX enabled, this is awesome!")
- playing for 5-10 minutes, I get to the first slightly more open area and the framerate goes to hell.
- another 10-15 minutes in settings, replaying this new section over and over until I get it right, with TressFX as a wildcard; I want to keep it if I can.
- playing another 5-10 minutes, I get out of the cave and the framerate tanks again.
- goodbye TressFX, lower shadows, replay, replay, lower post-fx, play again...
- play some more, get into the first firefight, framerate tanks again.
- back into settings, replay, replay, replay.
- bigger firefight later in the game... yep, framerate tanks again. At that point I'm completely sick of the game through no fault of the game itself, as I'm sick of replaying everything over and over just to confirm whether disabling this or that option had enough of an effect.

Basically, for me it's the best hardware available or nothing when it comes to PC. And I'm not spending the money on that hardware - especially as I know that once in a while there will be a game that will just glitch out regardless, and I'll spend a dumb amount of time battling that glitch. It's just not worth it for me.
Raptr game settings or GeForce Experience automatically calibrate your game settings to an FPS target based on your hardware. Sounds like something you would enjoy.
 
Raptr game settings or GeForce Experience automatically calibrate your game settings to an FPS target based on your hardware. Sounds like something you would enjoy.
I've heard of those things, and of people having mixed results with them. I attribute that possibly to people having different priorities for what they actually want from graphics (like my insistence on keeping TressFX until it became painfully obvious that I couldn't), or possibly to games having intense spikes in processing requirements, just like I experienced with TR.

I realize the biggest problem is that my PC is outdated as hell; there's no denying that. It's just that I misled myself into thinking it could do more than it actually could, and I wasted a lot of time before realizing it can't. I'd need a whole new PC at this point, and I wouldn't want to get something that's just going to match or slightly exceed the PS4. For another costly gaming machine to make sense, I'd need something that can absolutely blow the PS4 out of the water, but I can't justify paying for that - especially as my primary work machine is a laptop that I also need to replace at some point.

I can agree for games like Unfinished Swan or Strider, but something like Evolve? Not really.
I mean, both consoles have comparable bandwidth when they use their resources right, so when you increase resolution you generally need more bandwidth, and that could be the point where you start to be starved and simply reduce AF.

Personally, if I were designing the budgets for different features, I would reduce AF last.
Well, in the case of Evolve at least, the game at some point was running at 1080p on PS4 and 900p on XB1, and the PS4 version clearly had the better framerate. So they go and enable a pretty intensive HBAO that doesn't even make that much of a visual difference in a dark game like this? I guess they did that, and the framerate still remained better on PS4. So they try to enable AF on just the ground textures, but that somehow pulls the performance down too much? But let's keep HBAO anyway...? Somehow I doubt the process went like what I just described. It's more likely that no one ever even bothered trying to set per-texture AF in the game, much like obviously happened in Strider and Unfinished Swan.
 
Anyone got the 500-some-odd post summary? Seems like every time this is brought up, people just dredge up that old-ass ICE Twitter claiming it's not a PS4/SDK issue, and the armchair developers weigh in about it not being a "free" effect, conveniently ignoring how little sense that makes for many of the examples pointed out. Has there been anything beyond that, or has it been the same circle of stupid for 12 pages?
 
Anyone got the 500-some-odd post summary? Seems like every time this is brought up, people just dredge up that old-ass ICE Twitter claiming it's not a PS4/SDK issue, and the armchair developers weigh in about it not being a "free" effect, conveniently ignoring how little sense that makes for many of the examples pointed out. Has there been anything beyond that, or has it been the same circle of stupid for 12 pages?

It's implemented if possible; if it's not, it's down to performance decisions.
 
It's implemented if possible; if it's not, it's down to performance decisions.

Problem is, that doesn't explain away stuff like Strider, Unfinished Swan, or the Xbone parity ports. Logically, it just doesn't make sense that developers who demonstrably understand texture filtering, and actively enable it on other consoles, would be "forgetting" about it by "mistake" on PS4, or running into performance issues with it.
 
Did the Naughty Gods implement AF in Last of Us on PS4?

Yup.

[Screenshots of The Last of Us Remastered showing anisotropic filtering.]
 
Well, in the case of Evolve at least, the game at some point was running at 1080p on PS4 and 900p on XB1, and the PS4 version clearly had the better framerate. So they go and enable a pretty intensive HBAO that doesn't even make that much of a visual difference in a dark game like this? I guess they did that, and the framerate still remained better on PS4. So they try to enable AF on just the ground textures, but that somehow pulls the performance down too much? But let's keep HBAO anyway...? Somehow I doubt the process went like what I just described. It's more likely that no one ever even bothered trying to set per-texture AF in the game, much like obviously happened in Strider and Unfinished Swan.
CryEngine doesn't have HBAO; it has SSDO.

SSDO is a compute-bound operation, not a bandwidth-bound one.
 
I don't notice it much during gameplay, but looking at screenshot comparisons, it does look quite ugly.

I wonder why it's never in PS4 games. From what I've heard, it generally doesn't demand much extra power from the hardware.

It's present in most games actually.
 
Anyone got the 500-some-odd post summary? Seems like every time this is brought up, people just dredge up that old-ass ICE Twitter claiming it's not a PS4/SDK issue, and the armchair developers weigh in about it not being a "free" effect, conveniently ignoring how little sense that makes for many of the examples pointed out. Has there been anything beyond that, or has it been the same circle of stupid for 12 pages?
It's pretty much been a circle of stupid, really.

The summary seems to be that most games have it, ergo the hardware and SDK must support it, so the issue is down to isolated games/engines/code/optimization.

There's a bunch of people conveniently skipping all the info about the majority of games having it and just snatching tales from their ass as to "why" the PS4 can't support it (when it clearly can).

There were some good posts on the use of AF and various ideas for why some games might not have it - from developer mistakes to optimization decisions.

But in short, all that seems to have been confirmed is that it's not widespread (the majority of games have it), that it's not limited to ports/smaller titles (as some bigger titles don't have it), and that the hardware can definitely support it.

The takeaway for me is that either it's somehow easy to miss activating it on PS4 (although it seems odd that the majority of developers/titles have no issues, in that case), or we're seeing developers drop it as an easy way to gain some resources quickly when finalizing optimization to stabilize fps at 1080p (reasonable, if a fairly weak option for developers to take versus more focused optimization), combined with some developers flat-out failing to activate it and leaving it unpatched because the majority don't seem to care.

Due to the rinse/repeat of stupid stuff, I've skimmed here and there, so I may have missed something. Ideally a more definitive statement can be delivered by enough people "in the know" to put this to bed. Personally, though, I think it's just "one of those things" per title/developer, like the odd missing feature on XB1 or even PC at times, that some are reading way too much "concern" into.
 
I agree, but all of this does point to the fact that Sony should either offer a very simple API to developers which enables AF globally (especially for small independent games which don't need fine-grained management for performance reasons), or, if such an API is already available, communicate its existence better.

I am thinking this could be the case. I am not a developer, so I can't speak to the APIs available for the PS4.

To those who keep saying it's because 1080p is used: you'd think the same compromises would apply when the decision is made to downscale Xbone titles to 900p in order to provide AF. That is why it is bizarre. No conspiracy, just bizarre without context.
 
CryEngine doesn't have HBAO; it has SSDO.

SSDO is a compute-bound operation, not a bandwidth-bound one.
OK, even if SSDO has zero impact on bandwidth usage (hard to believe, but I'll take your word on it), the fact still remains that the game in its published state runs at a higher framerate on PS4, and I find it really hard to believe that enabling higher AF on just the ground textures and nowhere else would somehow tank it. I find it easier to believe that, much like other completely dumb omissions such as patching the god rays out of FC4, this was just an oversight, or a case of running out of time to enable something that requires manual effort and then extensive framerate re-testing.

To some degree maybe, but even if implemented, it never looks like proper 16x AF.
Looks like 16xAF in TLOU:R, and that's a 1080p/60 game that looks far better than many others where this effect is lower or missing. I think it might similarly be 8x or 16x in FC4.
 
Xbox ESRAM is still only about as fast as PS4 memory, generally. So that shouldn't be an issue.

It's about as fast but that's bandwidth that's wholly available only to the GPU, without any contention from the CPU.

I'm not sure it's that, though. If that were the cause, then in games where the Xbone has AF and the PS4 doesn't, the Xbone would also have a performance advantage in heavy alpha-blending scenarios, and that doesn't seem to happen...
 
or we're seeing developers drop it as an easy way to gain some resources quickly when finalizing optimization to stabilize fps at 1080p (reasonable, if a fairly weak option for developers to take versus more focused optimization)

I think that makes sense at face value if you have a situation where, at 1080p, the PS4 version has a less stable or lower overall framerate... so you could say "hey, let's cut AF and gain a couple of FPS."

But in a situation where it's the Xbone with the lower framerate, wouldn't they cut AF there too, to bring the framerates closer together?
 
It's possible some workflows have artists not involved at all in determining filtering but I'd say it's far more common that they are involved.

This!

And it gets convoluted when crossing different platforms?
 
It's not confusing, but it's precisely the one thing I can't stand about PC gaming.

I'm not against having some simple option in console games, preferably a 30/60fps switch like in TLOU, but my two memorable experiences of playing AAA games on PC (Tomb Raider and BF4) went like this:
- spend 15 minutes messing with all the settings ("Wow, I can run this at a playable framerate even with TressFX enabled, this is awesome!")
- playing for 5-10 minutes, I get to the first slightly more open area and the framerate goes to hell.
- another 10-15 minutes in settings, replaying this new section over and over until I get it right, with TressFX as a wildcard; I want to keep it if I can.
- playing another 5-10 minutes, I get out of the cave and the framerate tanks again.
- goodbye TressFX, lower shadows, replay, replay, lower post-fx, play again...
- play some more, get into the first firefight, framerate tanks again.
- back into settings, replay, replay, replay.
- bigger firefight later in the game... yep, framerate tanks again. At that point I'm completely sick of the game through no fault of the game itself, as I'm sick of replaying everything over and over just to confirm whether disabling this or that option had enough of an effect.

Basically, for me it's the best hardware available or nothing when it comes to PC. And I'm not spending the money on that hardware - especially as I know that once in a while there will be a game that will just glitch out regardless, and I'll spend a dumb amount of time battling that glitch. It's just not worth it for me.

Crysis was the worst game for this scenario for me, lol. I think I pretty much got to the end of the game before I stopped going back to the settings or changing the cfg every 10 minutes.
 
OK, even if SSDO has zero impact on bandwidth usage (hard to believe, but I'll take your word on it), the fact still remains that the game in its published state runs at a higher framerate on PS4, and I find it really hard to believe that enabling higher AF on just the ground textures and nowhere else would somehow tank it.

I haven't said it has zero impact on bandwidth; it definitely has some, but probably quite low.
Also, I don't think enabling AF only for the ground textures would impact framerate either. Enabling it for everything, though, could affect performance.
As I said earlier, personally I would sacrifice pretty much everything else before cutting or greatly reducing AF.
I wouldn't say the higher framerate is the most important factor in determining whether it could be done. Maybe higher AF is causing fps spikes or some streaming problems, etc.

---
Looks like 16xAF in TLOU:R, and that's a 1080p/60 game that looks far better than many others where this effect is lower or missing. I think it might similarly be 8x or 16x in FC4.

It is really hard to guess whether a texture is using 8x or 16x AF, but TLOU looks to have at least 8x for sure.
 
I didn't know what the hell I was looking at, till it was pointed out. Now I spot everything right away.

Thanks GAF, I am no longer blind.
 
pixlexic said:
Were those games going from the previous gen to the current gen, like Unfinished Swan?
Yes, there were some PS1 ports that would revert to point-sampling across "some" surfaces. Likewise for some native titles.
Unlike AF, there was also no additional cost associated with bilinear on any PS2-era hardware (although that didn't stop the internet from claiming otherwise).

Panajev2001a said:
Stop, maybe I am reading too much into it.
Artistic arguments for 3D polygons could work, but not so much when they are used inconsistently like that. Worth noting, certain post-process operations required you to disable bilinear to get the math working correctly, but that's not somewhere people could actually see the filter used on screen.

Jux said:
What I'm trying to say (as I've said several times already in this thread) is that, apart from a mistake made by some artist, there can be instances where, due to lacking time to optimize properly, cuts are made where it's easiest.
Problem is, the examples in this thread showcase AF on ground textures, which is just about the last surface one would choose for optimization because it has the most obvious visual tradeoffs.
 
I think that makes sense at face value if you have a situation where, at 1080p, the PS4 version has a less stable or lower overall framerate... so you could say "hey, let's cut AF and gain a couple of FPS."

But in a situation where it's the Xbone with the lower framerate, wouldn't they cut AF there too, to bring the framerates closer together?

No idea, to be honest. What I'm pretty sure of, because the majority of PS4 games have AF with no issues, is that it's not a consistent effect driven by the console/SDK.

That means it's per game/developer, and that means you're likely talking about different causes too. Often a developer has split teams working on PS4/XB1, I believe, so again each team could be making different choices: 900p with AF on XB1 and an acceptable frame rate, and 1080p with no or lesser AF on PS4 and an acceptable frame rate.

The main thing for me (and why I'm probably dropping out of this thread, although I am interested) is that it's on a minority of games and it's clearly individual to those games. However, any focus on this is getting lost in the rinse/repeat stupid "concern" posts desperately framing this as some sort of PS4-centric issue, despite the majority of evidence indicating otherwise, and in the volume of clearly technically uninformed posts containing "guesses".

Given that the OP is also merely updating his original post with every GIF he can find showing the AF issue (to increase the sense that it's widespread), seems to miss or ignore GIFs showing AF present, and can't be bothered to update his OP with the good, potentially positive information from the thread, I'm now of the view that this is merely another "concern" thread from someone who's fundamentally "concerned" that the PS4 is dominating globally this gen.

I base this on actions that clearly focus on making the problem seem bigger and more definite, while magically ignoring the even larger amount of GIFs/evidence he should be updating the OP with, which shows AF present and makes clear it is supported.
 
It's probably their multiplatform engines having different AF defaults for each console. Not sure why that would be the case, but that's the only thing that makes some sense to me. I'm guessing certain rushed devs don't bother checking the AF settings.
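For what it's worth, here's a purely hypothetical illustration of how a multiplatform engine could end up shipping with different AF defaults per console: a per-platform defaults table that nobody revisits late in the project (all names invented for the example):

```cpp
// Hypothetical per-platform render defaults; a placeholder PS4 value that
// never gets updated would produce exactly the kind of mismatch speculated
// about above.
enum class Platform { PC, XboxOne, PS4 };

struct RenderDefaults {
    unsigned maxAnisotropy;
};

constexpr RenderDefaults GetRenderDefaults(Platform p) {
    switch (p) {
        case Platform::PC:      return { 16 };
        case Platform::XboxOne: return { 8 };
        case Platform::PS4:     return { 1 };  // e.g. a forgotten placeholder
    }
    return { 1 };
}
```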
 
That Strider pic still confuses me. It doesn't look like a lack of AF; rather, the texture doesn't seem to be there at all. I know it probably is missing AF, but it seems way off, rather than just being a non-AF'd texture.
 