Alcoholikaust
Member
I understand the effect after reading some of this thread, but what does AF actually stand for? It could be useful to define the acronym in the OP for those of us who are less informed.
Here you go: AF stands for anisotropic filtering, the texture filtering technique that keeps textures from going blurry at oblique viewing angles (the "16x" in 16xAF is the maximum number of filtering taps).
I have no idea what causes it, but it's really distracting when I play a game on PS4 that has poor AF. Dying Light's ground textures look absolutely awful on PS4.
But what about games like The Order then? I don't think there's any reason for them to implement variable AF if there are no issues at all with having 16xAF on PS4.
There's clearly something PS4 related, hardware or software, I don't know, but it's not that simple.
It's not just about AF completely lacking or being worse than on Xbox One, even when it's better it's still not quite there. On PC it's almost free, and while I do understand that consoles and PC are different, if it's not free on PS4, then it's also a PS4 issue, not a dev issue.
I find it hard to believe that they would just choose not to use 16xAF because reasons.
cerny plz
Of course there is a reason. AF isn't FREE. It does hit your FPS. But it's only like 2-4 fps or so. Which is fine in the PC world, because in the PC world you are used to your framerate bouncing all over the place anyway. Unless you can brute force a game, of course, and get it to like 100 fps. People think PC = 60 fps, but in my 20 years of playing PC games, it is quite rare for a game to have a smooth framerate in general, let alone 60 fps and smooth.
That's all assuming of course that AF is CPU bound.
Which it never ever at all in any case is. I'm trying to say that AF is 100% GPU-bound. It's as GPU-bound as increasing rendering resolution.
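For anyone wondering what "AF on PC" actually looks like in code: it is basically one sampler-state call, which is part of why it's treated as nearly free there. A minimal sketch in C/OpenGL, assuming you already have a GL context, a bound texture, and a driver that exposes EXT_texture_filter_anisotropic (the console-side GNM/DX11 equivalents set the same kind of sampler state through their own APIs):

```c
/* Minimal sketch: requesting 16x anisotropic filtering in OpenGL (C).
 * Assumes an existing GL context, a bound texture, and driver support for
 * EXT_texture_filter_anisotropic (exposed by virtually every PC driver). */
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

/* Clamp the request to what the hardware reports and set it on the texture. */
void set_anisotropy(GLenum target, float requested)
{
    float max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported); /* usually 16 */
    if (requested > max_supported)
        requested = max_supported;

    /* A single sampler-state call; the actual cost is paid per texture fetch
     * by the GPU's texture units, which is why it scales with resolution. */
    glTexParameterf(target, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
}

/* e.g. after glBindTexture(GL_TEXTURE_2D, tex):
 *     set_anisotropy(GL_TEXTURE_2D, 16.0f);   // "16xAF"
 */
```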
Do you think it could be the memory bandwidth wearing thin, since it's the one and only bus for everything?
Maybe, in cases like The Order, but for most of these games it doesn't look as if that should be the case. I still think it's some kind of API mismatch for ports.
From DmC thread.
The floating platform turns to soup in the ps4 version. Playing pc games for years at 16xAF really makes this shit noticeable.
It's not confusing, but it's precisely the one thing I can't stand about PC gaming.
I'm not against having some simple options in console games, preferably a 30/60 fps switch like in TLOU, but my two memorable examples of playing AAA games on PC (Tomb Raider and BF4) went like this:
- spend 15 minutes messing with all the settings ("Wow, I can run this at playable framerate even with TressFX enabled, this is awesome!")
- playing for 5-10 minutes, I get to a first slightly more open area and framerate goes to hell.
- another 10-15m in settings replaying this new section over and over until I get it right, with tressfx as a wildcard, I want to keep it if I can.
- playing another 5-10 minutes, get out of the cave, framerate again tanks
- goodbye tressfx, lower shadows, replay, replay, lower postfx, play again...
- play some more, get into first firefight, framerate tanks again.
- back in to settings, replay replay replay.
- bigger firefight later in the game... Yep, framerate tanks again. At that point I'm completely sick of the game through no fault of the game itself, as I'm sick of replaying everything over and over just to confirm whether disabling this or that option had enough of an effect.
Basically, for me it's the best hardware available or nothing when it comes to PC. And I'm not spending the money on that hardware - especially as I know that once in a while there will be a game that will just glitch out regardless, and I'll spend a dumb amount of time battling that glitch. It's just not worth it for me.
Not only compared to PC. The PS4 version has worse texture filtering than the PS3 version, which actually negates the resolution improvement of the new PS4 version.
1000x this.
This is the worst aspect of PC gaming, and why I'm mainly on console now. I want the developer to push the limits of a fixed, closed environment... and give me the absolute best results. I don't want to see amazing graphics that work great for 10 minutes then make the game drop to 20 fps after that.
It is especially horrible when playing multiplayer on PC.
When playing Battlefield 3, I was constantly tweaking the graphics. I had to constantly decide between having an awesome looking game that ran badly or a shitty looking game that ran well. In multiplayer, frame rate is everything, so I always opted for the shitty looking game.
On consoles, the game looks great, and everyone has the exact same performance.
Do you people actually see the difference during gameplay, or do you just obsess over it in static screenshots?
This thread is depressing, it's like the first 10 pages never happened...
Can't be. PC GPUs with much lower bandwidth can handle 16x AF.
PCs have dedicated VRAM. The actual bandwidth isn't hurt by system memory tasks.
On the PS4, everything (CPU, GPU, recording video in the background, the OS running in the background) goes through the same memory bus. No matter how small.
Think of it like an HDD: 1000 1k files will take longer to transfer than one 1000k file because of read/write constraints.
Like an HDD, GDDR can not read and write at the same time.
CPUs don't require high bandwidth. The latest Core i7s have ~25 GB/s of memory bandwidth. The OS doesn't run on the GPU, it runs on the CPU.
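That ~25 GB/s lines up with simple dual-channel DDR3 arithmetic; a quick sketch, assuming a typical DDR3-1600 desktop configuration:

```c
/* Back-of-the-envelope peak bandwidth for dual-channel DDR3-1600,
 * the kind of memory a desktop Core i7 of that era typically pairs with. */
#include <stdio.h>

int main(void)
{
    const double transfers_per_sec = 1600e6; /* DDR3-1600: 1600 MT/s */
    const double bytes_per_transfer = 8.0;   /* one 64-bit channel = 8 bytes */
    const int channels = 2;                  /* dual channel */

    double peak_bytes = transfers_per_sec * bytes_per_transfer * channels;
    printf("Peak DRAM bandwidth: %.1f GB/s\n", peak_bytes / 1e9); /* ~25.6 GB/s */
    return 0;
}
```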
Here is what I got from someone who is learning this art as a student, so he is not an industry person yet.
The Xbox One, for example, has AF there because it is a negligible hit and its games are not focused on 1080p; having AF at 1080p is a much bigger performance hit, because 1080p is already hard and AF just adds to that. Since the resolution is lower, there is headroom to keep AF in play.
The PS4 has a much higher chance of reaching 1080p, but AF might then cause a slight performance issue where the game does get to 1080p but the fps might suffer by 1-2 points. Whereas if they remove AF (something which MOST gamers don't notice compared to resolution or effects), developers are affording themselves the opportunity to hit 1080p at 30 fps without AF, as opposed to 1080p at 26 fps with AF.
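To put those hypothetical 30 vs 26 fps numbers in the frame-budget terms a developer would use (illustrative figures taken from the post above, not measurements):

```c
/* Frame-time view of "1080p/30 without AF vs 1080p/26 with AF",
 * using the hypothetical figures from the post above. */
#include <stdio.h>

int main(void)
{
    const double fps_no_af   = 30.0;
    const double fps_with_af = 26.0;

    double ms_no_af   = 1000.0 / fps_no_af;   /* 33.3 ms per frame */
    double ms_with_af = 1000.0 / fps_with_af; /* 38.5 ms per frame */

    printf("Frame time without AF: %.1f ms\n", ms_no_af);
    printf("Frame time with AF:    %.1f ms\n", ms_with_af);
    printf("Implied AF cost:       %.1f ms\n", ms_with_af - ms_no_af); /* ~5.1 ms */
    /* ~5 ms is enough to blow a 33.3 ms (30 fps) budget and miss vsync. */
    return 0;
}
```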
So I get that you are a dev with hands-on experience with both PS4 and X1. How much more work is a GNMX conversion compared to GNM code, i.e. when porting a DX11 codebase, do you have to change a lot more from it or not?
The GPU uses the same memory as the CPU on the PS4.
They both can't access it at the same time.
It does make a difference.
Whereas a PC with a dedicated GPU can be rendering a frame while the CPU is starting the next set of tasks for whatever process needs it.
This makes no sense for games that are 1080p on both the XB1 and PS4 but have AF on the X1 and not the PS4. Like DmCE. Or something like Strider, which is really simplistic graphics-wise and is 1080p on both, but has no AF on PS4, even though there should be more than enough power left over to use AF on it.
Hrmm. that is definitely odd.
So on PC you are unhappy with having medium settings, but are totally fine with medium settings on console? Sounds like an expectations problem (your hardware just wasn't very good).
That is a terrible analogy... my god!
The CPU in the PS4 has a separate bandwidth allocation, limited to 20 GB/s, which would cover everything in your list short of the GPU and video streaming. For comparison on the video streaming: I work on a video server that delivers 12-bit per color uncompressed RGB at 4K (24 fps), and it uses under 1 GB/s.
And your example of the HDD is at least partly due to the mechanical limitation of the drive head moving back and forth.
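Rough numbers behind that "under 1 GB/s" claim, set against the PS4's commonly quoted memory figures (176 GB/s GDDR5 peak, ~20 GB/s CPU-coherent bus). A back-of-the-envelope sketch, not a measurement:

```c
/* Bandwidth of a 12-bit-per-channel uncompressed RGB stream at 4K / 24 fps,
 * compared against the PS4's commonly quoted memory figures. */
#include <stdio.h>

int main(void)
{
    const double width = 4096.0, height = 2160.0; /* DCI 4K */
    const double bits_per_pixel = 3 * 12;         /* RGB, 12 bits per channel */
    const double fps = 24.0;

    double stream = width * height * bits_per_pixel / 8.0 * fps; /* bytes/s */

    const double gddr5_peak = 176e9; /* PS4 GDDR5 peak bandwidth */
    const double cpu_bus    = 20e9;  /* CPU-coherent bus limit quoted above */

    printf("Uncompressed 4K stream:      %.2f GB/s\n", stream / 1e9);              /* ~0.96 */
    printf("Share of GDDR5 peak:         %.2f %%\n", 100.0 * stream / gddr5_peak); /* ~0.54 */
    printf("Share of CPU-side bandwidth: %.2f %%\n", 100.0 * stream / cpu_bus);    /* ~4.78 */
    return 0;
}
```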
Games like The Order prove this must be a weird software issue. The hardware is clearly capable, but even older games like DmC have this weird shit happening.
But The Order also has low AF on many surfaces.
I mean, the complete lack of any AF is bonkers, but AF is an issue on these consoles in general, because almost all games operate on low values.
I might be wrong, but I think that's his point, and by software, he means the PS4 software.
If even an exclusive considered by most to have the best graphics ever made on consoles suffers from low/variable AF, then it's clearly not a dev issue.