PS4's AF issue: we need answers!

I have no idea what causes it, but it's really distracting when I play a game on PS4 that has poor AF. Dying Light's ground textures look absolutely awful on PS4.

Well, concerning Dying Light, the ground textures might look blurry, but those wire-mesh fences simply disappear a few meters in front of you.
 
These games were probably ported to PS4 using GNMX, the DirectX 11-style wrapper, and not purpose-built using GNM. Why take additional steps, or spend extra time, on something that very few will notice? Especially when the game is likely running just as smoothly and at a higher resolution. Just my thoughts.
 
I can only imagine that in some cases the PS4 version of a game is barely hitting its frame rate target, so they axe AF to get that extra 1 fps they needed. They just prioritize it lower than other effects.

Just a guess.
 
But what about games like The Order then? I don't think there's any reason for them to implement variable AF if there were no issue at all with having 16x AF on PS4.

There's clearly something PS4 related, hardware or software, I don't know, but it's not that simple.

It's not just about AF completely lacking or being worse than on Xbox One; even when it's better it's still not quite there. On PC it's almost free, and while I do understand that consoles and PCs are different, if it's not free on PS4, then it's also a PS4 issue, not a dev issue.

I find it hard to believe that they would just choose not to use 16x AF "because reasons."

Of course there is a reason. AF isn't FREE. It does hit your FPS, but it's only like 2-4 fps or so. Which is fine in the PC world, because in the PC world you are used to your framerate bouncing all over the place anyway, unless you can brute-force a game, of course, and get it to like 100 fps. People think PC = 60 fps, but in my 20 years of playing PC games, it is quite rare for a game to have a smooth framerate in general, let alone 60 fps and smooth. It almost never happens, unless you can afford those high-end systems that brute-force everything into submission. And even then, you'll see people complaining about stuttering and such in the OTs.
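
To put that fps cost in perspective (back-of-envelope arithmetic only, nothing measured), the same "X fps" drop means very different amounts of actual frame time depending on where you start:

```cpp
// Rough arithmetic only: convert an "X fps drop" into milliseconds of frame time.
#include <cstdio>

int main()
{
    auto cost_ms = [](double base_fps, double fps_drop) {
        return 1000.0 / (base_fps - fps_drop) - 1000.0 / base_fps;
    };
    std::printf("4 fps drop at 100 fps: %.2f ms\n", cost_ms(100.0, 4.0)); // ~0.4 ms
    std::printf("4 fps drop at  30 fps: %.2f ms\n", cost_ms(30.0, 4.0));  // ~5.1 ms
}
```

So a "4 fps" hit near 30 fps is roughly 5 ms of GPU time per frame, while the same number at 100 fps is under half a millisecond.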

If I were really going to hammer in on it: some people are saying AF is CPU-bound? Which, if it is, means the bandwidth being used is only 20 GB/s of the 176 GB/s available (the Garlic/Onion architectural split). So perhaps they are bandwidth-capped, and then they begin to implement the various post-processing effects and notice that the CPU can't handle an additional texture filter.

First-party teams can of course optimize and optimize and optimize some more for the single platform, so they will get the best end result. In the case of The Order, that game is pushing the envelope in so many areas. They probably saw variable AF as the best way to handle the bandwidth cap. When the action is high and the calls are high, the AF is lowered to compensate. When not much is going on, the AF is kicked up to sharpen the IQ. When the action is high, you are not going to be worrying about texture filtering. But when exploring, you'll notice, if you know what to look for anyway.

That's all assuming, of course, that AF is CPU-bound. If AF can be done on the GPU, then I can't see 156 GB/s of available bandwidth having any sort of bottleneck due to a simple texture filter being implemented. It's understandable only if the filtering is done at a very late stage of development and teams don't have time to optimize around the performance hit, and instead choose to just lower the effect and be done with it.
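
If that's roughly how a variable AF scheme works, a toy version (purely hypothetical, I have no idea how Ready at Dawn actually implements it; the function and thresholds below are made up) could be as simple as:

```cpp
// Hypothetical sketch: halve the sampler's max anisotropy when the GPU is over
// its frame budget, raise it again when there's headroom. Not from any real engine.
#include <algorithm>

int ChooseMaxAnisotropy(float gpuFrameMillis, int currentAniso)
{
    const float budgetMillis = 33.3f;                 // 30 fps target
    if (gpuFrameMillis > budgetMillis)                // over budget: 16 -> 8 -> 4 ...
        return std::max(1, currentAniso / 2);
    if (gpuFrameMillis < budgetMillis * 0.9f)         // comfortable headroom: step back up
        return std::min(16, currentAniso * 2);
    return currentAniso;                              // close to budget: leave it alone
}
```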
 
[screenshot showing the missing AF]


cerny plz

From DmC thread.
 
Of course there is a reason. AF isn't FREE. It does hit your FPS, but it's only like 2-4 fps or so. Which is fine in the PC world, because in the PC world you are used to your framerate bouncing all over the place anyway, unless you can brute-force a game, of course, and get it to like 100 fps. People think PC = 60 fps, but in my 20 years of playing PC games, it is quite rare for a game to have a smooth framerate in general, let alone 60 fps and smooth.


That's all assuming, of course, that AF is CPU-bound.

Are you sure you know what you are talking about? Because these sections read like someone who doesn't know anything about frame capping or how AF even works.
 
Do you think it could be the memory bandwidth wearing thin, since it's the one and only bus for everything?
Maybe, in cases like The Order, but for most of these games it doesn't look as if that should be the case. I still think it's some kind of API mismatch for ports.
 
Which it never, ever, in any case is. I'm trying to say that AF is 100% GPU-bound. It's as GPU-bound as increasing rendering resolution.

Yeah, then who knows. 156 GB/s of memory bandwidth, 32 ROPs, GPU compute capability. I can't see anything hardware-wise that could cause a simple texture filter to be an issue.

So it all falls on software and development.

I mean, we hate the term "lazy devs" of course, and not all of them are. But no doubt some are. Just look at FFXIII for a quick example. It took you, what, a day to add all those resolution-enhancement features to PC FFXIII? How is it possible SE was incapable of doing that on their own?

Or maybe not so much lazy as time-constrained. Or a simple case of the ported PC/DirectX code not matching up well enough with the PS4's own API and devs just not wanting to spend much time optimizing it. Which again would most likely be due to time constraints. Who knows why they don't go back, optimize, and then release the optimized code in a PS4 patch a week later or whatever, though.

Durante, you would know: when is AF implemented during the engine development process? Is it nearer the end of the process, or would everything post-processing/filter-wise be implemented long before a game reaches crunch time?
 
Maybe, in cases like The Order, but for most of these games it doesn't look as if that should be the case. I still think it's some kind of API mismatch for ports.

I agree, it has to be something the API does differently than others. It is not a hardware issue, as GCN has no issues; it is not bandwidth, as it works fine on Xbox; so the only remaining reason is API-related. Obviously it can do it, but it seems to be overlooked in enough games that there is something going on.
 
It's not confusing, but it's precisely the one thing I can't stand about PC gaming.

I'm not against having some simple options in console games, preferably a 30/60 fps switch like in TLOU, but my two memorable experiences of playing AAA games on PC (Tomb Raider and BF4) went like this:
- spend 15 minutes messing with all the settings ("Wow, I can run this at a playable framerate even with TressFX enabled, this is awesome!")
- play for 5-10 minutes, reach the first slightly more open area, and the framerate goes to hell.
- spend another 10-15 minutes in settings, replaying this new section over and over until I get it right, with TressFX as a wildcard; I want to keep it if I can.
- play another 5-10 minutes, get out of the cave, framerate tanks again.
- goodbye TressFX, lower shadows, replay, replay, lower post-FX, play again...
- play some more, get into the first firefight, framerate tanks again.
- back into settings, replay, replay, replay.
- bigger firefight later in the game... yep, framerate tanks again. At that point I'm completely sick of the game through no fault of the game itself, because I'm sick of replaying everything over and over just to confirm whether disabling this or that option had enough of an effect.

Basically, for me it's the best hardware available or nothing when it comes to PC. And I'm not spending the money on that hardware, especially since I know that once in a while there will be a game that will just glitch out regardless, and I'll spend a dumb amount of time battling that glitch. It's just not worth it for me.

1000x this.

This is the worst aspect of PC gaming, and why I'm mainly on console now. I want the developer to push the limits of a fixed, closed environment... and give me the absolute best results. I don't want to see amazing graphics that work great for 10 minutes then make the game drop to 20 fps after that.

It is especially horrible when playing multiplayer on PC.

When playing Battlefield 3, I was constantly tweaking the graphics. I had to constantly decide between having an awesome looking game that ran badly or a shitty looking game that ran well. In multiplayer, frame rate is everything, so I always opted for the shitty looking game.

On consoles, the game looks great, and everyone has the exact same performance.
 
It's not only compared to PC: the PS4 has worse texture filtering than the PS3. It actually negates the resolution improvement of the new PS4 version.

Hrmm, that is definitely odd.
1000x this.

This is the worst aspect of PC gaming, and why I'm mainly on console now. I want the developer to push the limits of a fixed, closed environment... and give me the absolute best results. I don't want to see amazing graphics that work great for 10 minutes then make the game drop to 20 fps after that.

It is especially horrible when playing multiplayer on PC.

When playing Battlefield 3, I was constantly tweaking the graphics. I had to constantly decide between having an awesome looking game that ran badly or a shitty looking game that ran well. In multiplayer, frame rate is everything, so I always opted for the shitty looking game.

On consoles, the game looks great, and everyone has the exact same performance.

So on PC you are unhappy with having medium settings, but are totally fine with medium settings on console? Sounds like an expectations problem (your hardware just wasn't very good).
 
Why bother asking devs when you can just blame the PS4 hardware and then pull out the "You guys are taking this too personally" card when people say it's not the hardware.

I'm having GameFAQs flashbacks here.
 
I have heard of this issue popping up a few times in reviews but haven't noticed it myself.

I was also under the impression that it was an issue in most or at least lots of PS4 games, so I was curious as to what the deal was. It seems it only affects some games and comes down to developer decisions.

Of course it would be great if it's sorted out, but it's not a big deal to me.

Edit: This thread won't end well.
 
Can't be. PC GPUs with much lower bandwidth can handle 16x AF.

PCs have dedicated VRAM. The actual bandwidth isn't hurt by system memory tasks.

On the PS4, everything (CPU, GPU, recording video in the background, the OS running in the background) goes through the same memory bus, no matter how small.

Think of it like an HDD. 1000 1 KB files will take longer to transfer than one 1000 KB file because of read/write constraints.

Like an HDD, GDDR cannot read and write at the same time.
 
PCs have dedicated VRAM. The actual bandwidth isn't hurt by system memory tasks.

On the PS4, everything (CPU, GPU, recording video in the background, the OS running in the background) goes through the same memory bus, no matter how small.

Think of it like an HDD. 1000 1 KB files will take longer to transfer than one 1000 KB file because of read/write constraints.

Like an HDD, GDDR cannot read and write at the same time.

CPUs don't require high bandwidth. The latest Core i7s have ~25 GB/s of bandwidth. The OS doesn't run on the GPU; it runs on the CPU.

For example, a PC with 20 GB/s of CPU bandwidth and 100 GB/s of GPU bandwidth can run a lot of the titles lacking AF on PS4 with 16x AF. The PS4 has 176 GB/s total.

Edit: Also see how Intel integrated GPUs are very good at encoding video. Encoding video won't take away much bandwidth.
 
CPUs don't require high bandwidth. The latest Core i7s have ~25 GB/s of bandwidth. The OS doesn't run on the GPU; it runs on the CPU.

The GPU uses the same memory as the CPU on the PS4.

They can't both access it at the same time.

It does make a difference.


Whereas a PC with a dedicated GPU can be rendering a frame while the CPU is starting the next set of tasks from whatever process needs it.
 
This thread is depressing, it's like the first 10 pages never happened...

So I gather that you are a dev with hands-on experience with both PS4 and X1. How much more work is a GNMX conversion over GNM code? I.e., when porting the DX11 codebase, do you have to change a lot more or not?
 
Here is what I got from speaking to someone who is learning this as a student, so he is not an industry person yet.

The Xbox One, for example, has AF there because it is a negligible hit and its games are not focused on 1080p; having AF at 1080p is a bigger performance hit because 1080p is hard and AF just adds to that. As long as the resolution is lower, these things stay in play.

The PS4 has a much better chance of reaching 1080p, but AF might cause a slight performance issue where the game does get to 1080p but the fps suffers by a point or two. Whereas if they remove AF (something which MOST gamers don't notice, compared to resolution or effects), developers afford themselves the opportunity to hit 1080p/30 fps without AF, as opposed to 1080p/26 fps with AF.

The X1 can reach 900p almost as easily as the PS4, but when you aim for 1080p you have to remove one item for performance; whereas on the X1, which struggles past 900p, you can keep AF in.
 
PCs have dedicated VRAM. The actual bandwidth isn't hurt by system memory tasks.

On the PS4, everything (CPU, GPU, recording video in the background, the OS running in the background) goes through the same memory bus, no matter how small.

Think of it like an HDD. 1000 1 KB files will take longer to transfer than one 1000 KB file because of read/write constraints.

Like an HDD, GDDR cannot read and write at the same time.

The CPU in the PS4 has a separate bandwidth allocation, limited to 20 GB/s, which would cover everything in your list short of the GPU and video streaming. For comparison on the video streaming, I work on a video server that delivers 12-bit-per-color uncompressed RGB at 4K (24 fps) and it uses under 1 GB/s.

And your example of the HDD is at least partly due to the mechanical limitation of the drive head moving back and forth.
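
For what it's worth, that figure checks out with simple arithmetic (assuming DCI 4K at 4096x2160; UHD comes out slightly lower):

```cpp
// Sanity check on the "under 1 GB/s" figure for 12-bit-per-channel uncompressed RGB at 4K/24.
#include <cstdio>

int main()
{
    const double pixels        = 4096.0 * 2160.0;   // DCI 4K frame
    const double bytesPerPixel = 3 * 12 / 8.0;      // RGB, 12 bits per channel = 4.5 bytes
    const double bytesPerSec   = pixels * bytesPerPixel * 24.0;
    std::printf("%.2f GB/s\n", bytesPerSec / 1e9);  // ~0.96 GB/s
}
```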
 
Here is what I got from speaking to someone who is learning this as a student, so he is not an industry person yet.

The Xbox One, for example, has AF there because it is a negligible hit and its games are not focused on 1080p; having AF at 1080p is a bigger performance hit because 1080p is hard and AF just adds to that. As long as the resolution is lower, these things stay in play.

The PS4 has a much better chance of reaching 1080p, but AF might cause a slight performance issue where the game does get to 1080p but the fps suffers by a point or two. Whereas if they remove AF (something which MOST gamers don't notice, compared to resolution or effects), developers afford themselves the opportunity to hit 1080p/30 fps without AF, as opposed to 1080p/26 fps with AF.

This makes no sense for games that are 1080p on both the XB1 and PS4 but have AF on the X1 and not the PS4, like DmC:DE. Or something like Strider, which is really simplistic graphics-wise and is 1080p on both, but has no AF on PS4, and there should be more than enough power left over to use AF on it.

Also, no AF is one of the most easily noticeable things out there to *anyone*. It stands out far more than resolution or a missing effect.
 
So I gather that you are a dev with hands-on experience with both PS4 and X1. How much more work is a GNMX conversion over GNM code? I.e., when porting the DX11 codebase, do you have to change a lot more or not?

I won't go into details again because I've already made more lengthy posts in this same thread to explain all this, so I'll just say this: AF is as trivial to implement on PS4 as on XBone and has absolutely nothing to do with a game being ported from DX11.
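
For context on how small the change is on the DX11 side, 16x AF is basically one field in the sampler description (a PC-side illustration only, shown to back up "trivial"; the PS4's own API isn't public, but it exposes the same kind of per-sampler setting):

```cpp
// Minimal D3D11 example: creating a sampler state with 16x anisotropic filtering.
#include <d3d11.h>

ID3D11SamplerState* CreateAniso16Sampler(ID3D11Device* device)
{
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;   // anisotropic min/mag/mip filtering
    desc.MaxAnisotropy  = 16;                         // the "16x" in question
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;

    ID3D11SamplerState* sampler = nullptr;
    device->CreateSamplerState(&desc, &sampler);      // bind with PSSetSamplers afterwards
    return sampler;
}
```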
 
The GPU uses the same memory as the CPU on the PS4.

They can't both access it at the same time.

It does make a difference.


Whereas a PC with a dedicated GPU can be rendering a frame while the CPU is starting the next set of tasks from whatever process needs it.

A system with the latest CPU and a GPU with 120 GB/s of bandwidth (a mid-range mobile GPU, for example) has less total bandwidth than the PS4, and it can easily do 16x AF with the same or better fps on many of the PS4 ports with the AF problem, while running more tasks in the background. And AF is completely dependent on the GPU.

So it's not the bandwidth that's the problem. It's the software.
 
This makes no sense for games that are 1080p on both the XB1 and PS4 but have AF on the X1 and not the PS4, like DmC:DE. Or something like Strider, which is really simplistic graphics-wise and is 1080p on both, but has no AF on PS4, and there should be more than enough power left over to use AF on it.

That's why he is a student and speculating.
 
Isn't this a dev problem? Why don't you guys, the ones who are really concerned, go and ask them?
 
Hrmm, that is definitely odd.


So on PC you are unhappy with having medium settings, but are totally fine with medium settings on console? Sounds like an expectations problem (your hardware just wasn't very good).

For me it isn't being unhappy with medium settings on PC; it's not knowing which settings are the best balance for my specific setup, the ones that will look the best while never (or only rarely) dropping below what I consider a decent framerate.

I have a 970; I could set everything to medium and it'd play most games just fine. But then I'm worried that I'm leaving stuff on the table and the game could look better while still keeping a good framerate. That's when the pain starts.
 
PCs have dedicated VRAM. The actual bandwidth isn't hurt by system memory tasks.

On the PS4, everything (CPU, GPU, recording video in the background, the OS running in the background) goes through the same memory bus, no matter how small.

Think of it like an HDD. 1000 1 KB files will take longer to transfer than one 1000 KB file because of read/write constraints.

Like an HDD, GDDR cannot read and write at the same time.

That is a terrible analogy... my god!

Think of it like this: you have to read and write within a 16.7 ms frame across ~256x256 textures using the ~150 GB/s of bandwidth available to the GPU (if that is all available to the GPU). For DmC, or let's say Unfinished Swan or Strider, this should be more than sufficient to apply a minimum of 8x.
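
As a rough illustration of that budget (back-of-envelope only, ignoring contention and overheads):

```cpp
// How much data can move per frame at ~150 GB/s, at 60 fps and at 30 fps.
#include <cstdio>

int main()
{
    const double bandwidth = 150e9;                                       // bytes per second
    std::printf("per 16.7 ms frame: %.1f GB\n", bandwidth / 60.0 / 1e9);  // ~2.5 GB
    std::printf("per 33.3 ms frame: %.1f GB\n", bandwidth / 30.0 / 1e9);  // ~5.0 GB
}
```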

What we are seeing has to be an engine/API compatibility issue, or overhead, or a combination of both.
 
Games like The Order prove this must be a weird software issue. The hardware is clearly capable, but even older games like DmC have this weird shit happening.
 
The CPU in the PS4 has a separate bandwidth allocation, limited to 20 GB/s, which would cover everything in your list short of the GPU and video streaming. For comparison on the video streaming, I work on a video server that delivers 12-bit-per-color uncompressed RGB at 4K (24 fps) and it uses under 1 GB/s.

And your example of the HDD is at least partly due to the mechanical limitation of the drive head moving back and forth.

Allocation isn't the same as actual bandwidth usage. That is just saying 20 GB/s is allocated and that portion is unusable.

The HDD explanation is totally correct, even if it ends up not being the case here.
It's not the speed of the transfer, it's the seek time and call stack. The HDD isn't just being accessed for the task at hand; it has a queue, just like DDR memory.
 
Do you people actually see the difference during gameplay, or do you just obsess over it in static screenshots?

Depends.

If you are in an action sequence or something, you won't notice much of anything.

When you are just wandering around and exploring, if you know what to look for, it is noticeable.

My brother has been gaming for as long as me, but he's not nearly as into the finer details of things. I often have to optimize his PC games because the auto-config prevalent in so many games butchers everything, turning AF off completely and things like that.

That auto-config turning off AF sounds awfully familiar, lol.

He doesn't notice any differences until I point them out, though. Which is normal for the regular Joe. What they see is what they see; they aren't looking for what's not there.
 
Games like The Order prove this must be a weird software issue. The hardware is clearly capable, but even older games like DmC have this weird shit happening.


Like I said before, it may just come down to some games juggling memory better than others.

The Order, one game on one platform, would probably have better memory management than DBZ.
 
But The Order also has low AF on many surfaces.

I mean, the complete lack of any AF is bonkers, but AF is an issue on those consoles, because almost all games operate at low values.

I might be wrong, but I think that's his point, and by software, he means the PS4 software.

If even an exclusive considered by most to be the best graphics ever made on consoles suffers from low/variable AF, then it's clearly not a dev issue.
 
I might be wrong, but I think that's his point, and by software, he means the PS4 software.

If even an exclusive considered by most to be the best graphics ever made on consoles suffers from low/variable AF, then it's clearly not a dev issue.

DriveClub is another game that has low AF, but it's barely noticeable in motion. Still the most visually impressive game I've seen on consoles.
 
1000x this.

This is the worst aspect of PC gaming, and why I'm mainly on console now. I want the developer to push the limits of a fixed, closed environment... and give me the absolute best results. I don't want to see amazing graphics that work great for 10 minutes then make the game drop to 20 fps after that.

It is especially horrible when playing multiplayer on PC.

When playing Battlefield 3, I was constantly tweaking the graphics. I had to constantly decide between having an awesome looking game that ran badly or a shitty looking game that ran well. In multiplayer, frame rate is everything, so I always opted for the shitty looking game.

On consoles, the game looks great, and everyone has the exact same performance.



As someone who has always self-identified as a PC gamer, who prefers PC gaming and will continue to prefer PC gaming, I do have to admit this can be true sometimes. There have been times where, after tweaking and finagling with a game so much, I burn myself out on the game, whereas when playing a game on a console, where all those options aren't a factor, you enjoy the game more. In the end the perks of PC gaming completely outweigh the cons, but it is a valid concern.
 